matlab - Lag in time series regression using LibSVM
I am using LibSVM in MATLAB to examine the utility of SVM regression for time series prediction. I use the following code sample:
t = -10:0.1:10;
x = 2*sin(10*t) + 0.5*t.^2 + 4;
x = (x - min(x)) / (max(x) - min(x));   % normalize x to [0, 1]
x = x';

data = x(1:end-1);                      % input: value at time k
datalabels = x(2:end);                  % target: value at time k+1

traindatalength = round(length(data)*70/100);   % 70/30 train/test split
trainingset = data(1:traindatalength);
trainingsetlabels = datalabels(1:traindatalength);
testset = data(traindatalength+1:end);
testsetlabels = datalabels(traindatalength+1:end);

options = ' -s 3 -t 2 -c 100 -p 0.001 -h 0';    % epsilon-SVR with RBF kernel
model = svmtrain(trainingsetlabels, trainingset, options);
[predicted_label, accuracy, decision_values] = svmpredict(testsetlabels, testset, model);

figure(2);
plot(1:length(testsetlabels), testsetlabels, '-b');    % actual values
hold on;
plot(1:length(testsetlabels), predicted_label, '-r');  % predicted values
hold off;
The resulting figure (actual test values in blue, predicted values in red) shows a lag between the predicted values and the actual values. I don't know whether the lag comes from a bug in my code, from a bug in the LibSVM code, or whether it is natural and I simply cannot expect to predict the one-step-ahead value of a time series.
What you ask for in the line

model = svmtrain(trainingsetlabels, trainingset, options);

is to estimate y = trainingsetlabels from the features contained in x = trainingset.
Given your code, there is a one-timestep lag between x and y, so this behavior is normal. However, you can improve the estimation. x can be a matrix, with one column per feature. You can add the following columns:
- x with a 1-timestep lag (you already have it)
- x with an n-timestep lag (where n corresponds to the period of the sinusoid)
- a column vector such as (1:1:length(x))', used to estimate the trend.
This way (mostly thanks to the n-timestep-lag column), you will be able to anticipate the incoming values; see the sketch below.
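For concreteness, here is a minimal sketch of such a feature matrix, reusing t, x, the LibSVM options, and the 70/30 split from the question. The value of n (about 6 samples, derived from the period of 2*sin(10*t) at a 0.1 step) and the scaling of the trend column to [0, 1] are choices made here for illustration, not part of the original answer:

% Same signal as in the question
t = -10:0.1:10;
x = 2*sin(10*t) + 0.5*t.^2 + 4;
x = (x - min(x)) / (max(x) - min(x));
x = x';

% n-step lag: the sinusoid period is 2*pi/10 in t, sampled every 0.1 -> ~6 samples
n = round((2*pi/10) / 0.1);

% Targets and feature columns, aligned so every row has all three features
y          = x(n+1:end);                 % value to predict (time k+1)
feat_lag1  = x(n:end-1);                 % value one step before the target
feat_lagn  = x(1:end-n);                 % value n steps before the target
feat_trend = (1:length(y))' / length(y); % trend column, scaled to [0, 1]

features = [feat_lag1, feat_lagn, feat_trend];

% Same 70/30 split and LibSVM options as in the question
ntrain  = round(length(y)*70/100);
options = ' -s 3 -t 2 -c 100 -p 0.001 -h 0';
model   = svmtrain(y(1:ntrain), features(1:ntrain, :), options);
[predicted_label, accuracy, decision_values] = ...
    svmpredict(y(ntrain+1:end), features(ntrain+1:end, :), model);

figure(3);
plot(y(ntrain+1:end), '-b');        % actual values
hold on;
plot(predicted_label, '-r');        % predicted values
hold off;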
Cheers