We have time series data divided into fixed-length windows (a set number of days per window).
Each window carries the probability of an event occurring. The event is categorized as +ve or -ve, and the two probabilities sum to 1.
The time series is divided into two-week windows:
time    +ve event    -ve event
W1      0.7          0.3
W2      0.4          0.6
W3      0.2          0.8
W4      ....         ....
Now, we want to predict the probability of the +ve event (and the -ve event) occurring in window 4.
How can we use Naive Bayes, or other algorithms that have probability at their core?
Are there any other machine learning models suitable for this class of problem?
I would appreciate sample code.
Here are some ideas (not implemented, though):
You can use the naive method, where the predicted probability for the next window is simply the probability observed in the previous window (W3 in this case).
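This naive (persistence) baseline is a few lines of code. A minimal sketch, with the W1–W3 probabilities from the question hard-coded:

```python
# Naive (persistence) forecast: the prediction for W4 is whatever W3 was.
history = [(0.7, 0.3), (0.4, 0.6), (0.2, 0.8)]  # (P(+ve), P(-ve)) for W1..W3

def naive_forecast(history):
    """Return the last observed (p_pos, p_neg) pair as the next-window forecast."""
    return history[-1]

p_pos, p_neg = naive_forecast(history)
print(p_pos, p_neg)  # prints: 0.2 0.8
```

Despite its simplicity, persistence is a standard baseline in forecasting: any fancier model should beat it before being trusted.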
You can train a model on the time and +ve event columns, then predict the +ve event values for W4, W5, and so on. Fill the -ve event column using the formula 1 - [+ve event] (make sure the predicted +ve values stay between 0 and 1).
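One minimal instance of this idea, using a least-squares line over the window index as the stand-in model (any regressor or time-series model could replace `fit_line`; the clipping step enforces the 0–1 constraint mentioned above):

```python
# Fit a least-squares line to the +ve column over window index, extrapolate
# to future windows, clip to [0, 1], and derive the -ve column as 1 - P(+ve).
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

weeks = [1, 2, 3]
p_pos = [0.7, 0.4, 0.2]                 # +ve event column for W1..W3
slope, intercept = fit_line(weeks, p_pos)

for w in (4, 5):
    pred = min(1.0, max(0.0, slope * w + intercept))  # clip to [0, 1]
    print(f"W{w}: +ve={pred:.2f}, -ve={1 - pred:.2f}")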
You can use a Hidden Markov Model.
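A full HMM is easiest with a library such as hmmlearn. As a stepping stone, here is a minimal fully observed (non-hidden) Markov chain sketch: bin P(+ve) into discrete states, count transitions between consecutive windows, and read off the predicted state distribution for the next window. The three-state binning and the windows past W3 are made up for illustration, since three real windows give the chain almost nothing to count:

```python
# Discretize P(+ve) into states, estimate a transition matrix from
# consecutive windows, and predict the state distribution for the next window.
from collections import defaultdict

def to_state(p):  # illustrative 3-state binning
    return "low" if p < 1/3 else "mid" if p < 2/3 else "high"

def transition_probs(series):
    counts = defaultdict(lambda: defaultdict(int))
    states = [to_state(p) for p in series]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    return {s: {t: c / sum(nxt.values()) for t, c in nxt.items()}
            for s, nxt in counts.items()}

p_pos = [0.7, 0.4, 0.2, 0.5, 0.2, 0.3]   # W1..W3 real, rest illustrative
probs = transition_probs(p_pos)
last = to_state(p_pos[-1])               # state of the latest window
print(probs.get(last, {}))               # predicted state distribution for W4+
```

A true HMM adds hidden states behind these observations; the transition-counting above is the observable core of the idea.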
Is it possible for you to share a pointer to a rough implementation of this in the public domain?
Specifically, I am not clear on how you would train on a week-by-week basis.
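On the week-by-week question: the usual pattern is walk-forward (expanding-window) evaluation, where at each step you fit on all windows seen so far and predict the next one. A sketch, using persistence as a stand-in for whatever model is chosen (values after W3 are made up for illustration):

```python
# Walk-forward (expanding window) evaluation: at each step, "train" on
# weeks 1..i only, predict week i+1, then score that prediction.
p_pos = [0.7, 0.4, 0.2, 0.3, 0.5]   # W1..W5; values after W3 are illustrative

def fit_and_predict(train):
    # Stand-in model: persistence. Replace with any model fit on `train`.
    return train[-1]

errors = []
for i in range(1, len(p_pos)):
    pred = fit_and_predict(p_pos[:i])   # model never sees week i+1 or later
    errors.append(abs(pred - p_pos[i]))

mae = sum(errors) / len(errors)
print(f"MAE over walk-forward steps: {mae:.3f}")  # prints: MAE over walk-forward steps: 0.200
```

The key point is that each prediction uses only data available before that week, mimicking how the model would be deployed.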
You can follow the course mentioned below, which covers most forecasting techniques in Python:
In that course, we originally had hourly data, converted it to daily data, and forecasted the daily series. You can forecast weekly data the same way.
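The hourly-to-daily conversion described above is just aggregation; with pandas you would typically use `resample('D').mean()`, but the same idea fits in a few lines of plain Python (the hourly values here are made up):

```python
# Aggregate hourly readings into daily means, the same idea as pandas'
# series.resample('D').mean(); weekly aggregation works identically with
# a larger bucket size.
hourly = [0.5, 0.7, 0.6, 0.4] * 12      # 48 illustrative hourly values = 2 days

def to_daily(values, per_day=24):
    return [sum(values[i:i + per_day]) / per_day
            for i in range(0, len(values), per_day)]

print(to_daily(hourly))  # two daily averages
```

Once the data is at the desired granularity (daily or weekly), any of the forecasting approaches above can be applied to it.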