hilomotDemo5 Demo 5: Static process. Global estimation and ridge regression after training.
HiLoMoT - Nonlinear System Identification Toolbox
Benjamin Hartmann, 25-January-2013
Institute of Mechanics & Automatic Control, University of Siegen, Germany
Copyright (c) 2013 by Prof. Dr.-Ing. Oliver Nelles
% Initialize the HILOMOT local model network object
LMN = hilomot;

% Generate training data: two local regimes blended by a sigmoid
N = 50;
u = linspace(0,1,N)';
sigmoide = 1./(1+exp((-0.65+u)/60));
y1 = 1 + sin(4*pi*u - pi/8);
y2 = 0.1./(0.1+u);
y  = y1.*sigmoide + y2.*(1-sigmoide);

% Assign noisy training data; local polynomials of degree 2
LMN.input  = u;
LMN.output = y + 0.1*randn(N,1);
LMN.xRegressorDegree = 2;

% Train with the default local (weighted least squares) estimation
LMN = LMN.train;
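During training, each local model's parameters are estimated separately by weighted least squares, with the weights given by that model's validity function. A minimal sketch of this idea (plain MATLAB, not the toolbox internals; the data, the Gaussian-shaped validity function, and all variable names here are illustrative assumptions):

```matlab
% Sketch: locally weighted least squares for one local model (illustrative).
N   = 50;
u   = linspace(0,1,N)';
y   = sin(2*pi*u) + 0.05*randn(N,1);   % some example data
X   = [ones(N,1) u u.^2];              % quadratic regressors (degree 2)
phi = exp(-((u-0.3).^2)/0.02);         % assumed validity function of this model
W   = diag(phi);                       % weighting matrix
theta = (X'*W*X) \ (X'*W*y);           % weighted LS parameters of this model
```

Because each local model only has to fit the data where its validity is large, the individual regression problems stay small and well conditioned.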
% Plot the trained model with locally estimated parameters
figure
LMN.plotModel
title('least squares local estimation')

% Re-estimate all local model parameters globally with least squares
figure
LMNglob = LMN;
LMNglob = LMNglob.estimateParametersGlobal('LS');
LMNglob.plotModel
title('least squares global estimation')

% Re-estimate globally with ridge regression for regularization
figure
LMNridge = LMN;
LMNridge = LMNridge.estimateParametersGlobal('RIDGE');
LMNridge.plotModel
title('least squares global estimation & ridge regression')
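Global estimation stacks the validity-weighted regressors of all local models into one large regression problem; the 'RIDGE' variant additionally penalizes the parameter norm, which counteracts the ill-conditioning that global estimation can introduce. A sketch of the two solutions via the normal equations (plain MATLAB, not the toolbox internals; the data, regressor choice, and the value of lambda are illustrative assumptions):

```matlab
% Sketch: plain global least squares vs. ridge regression (illustrative).
N  = 50;
u  = linspace(0,1,N)';
y  = 1./(0.1+u) + 0.1*randn(N,1);      % some example data
X  = [ones(N,1) u u.^2 u.^3];          % assumed stacked regressor matrix
lambda  = 1e-3;                        % assumed regularization strength
thetaLS = (X'*X) \ (X'*y);                            % ordinary least squares
thetaRR = (X'*X + lambda*eye(size(X,2))) \ (X'*y);    % ridge solution
```

The ridge term lambda*eye(...) shrinks the estimated parameters toward zero; larger lambda means stronger regularization and a smoother, more biased model.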
xInputDelay is empty, defaults are used: xInputDelay(1:p) = {0}
zInputDelay is empty, defaults are used: zInputDelay(1:p) = {0}
xOutputDelay is empty, defaults are used: xOutputDelay(1:p) = {[]}
zOutputDelay is empty, defaults are used: zOutputDelay(1:p) = {[]}
Initial net has 1 local model(s): J = 0.869099
1. Iteration. Number of local models = 1. Checking for split of model 1 ...
Testing split in dimension 1: J = 0.578801
Axes-oblique splitting: J = 0.466309
-> SPLITTING RESULT: J = 0.466309
2. Iteration. Number of local models = 2. Checking for split of model 3 ...
Testing split in dimension 1: J = 0.235739
Axes-oblique splitting: J = 0.229819
-> SPLITTING RESULT: J = 0.229819
3. Iteration. Number of local models = 3. Checking for split of model 4 ...
Testing split in dimension 1: J = 0.207973
Axes-oblique splitting: J = 0.207454
-> SPLITTING RESULT: J = 0.207454
4. Iteration. Number of local models = 4. Checking for split of model 2 ...
Testing split in dimension 1: J = 0.204787
Axes-oblique splitting: J = 0.198832
-> SPLITTING RESULT: J = 0.198832
Estimated model complexity limit reached. The improvement of
the loss function (penaltyLossFunction) was 2 times less than
1.000000e-12 on TRAINING data.
Final net has 5 local models and 19 parameters: J = 0.198832
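The log above shows HILOMOT first testing an axis-orthogonal split in each dimension and then optimizing an axes-oblique split direction. The oblique partitions come from sigmoidal splitting functions; a sketch of one such function (illustrative only, not the toolbox code; the split parameters in v are assumed values):

```matlab
% Sketch: sigmoidal splitting function for an oblique split (illustrative).
u   = linspace(0,1,50)';
v   = [20; -13];                       % assumed split parameters: slope, offset
psi = 1./(1 + exp(-(v(1)*u + v(2))));  % validity of one child model
% The sibling child model receives the complementary validity 1 - psi.
```

Optimizing v instead of only testing coordinate-axis cuts is what lets the splits align obliquely with the process nonlinearity.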
Net 3 with 3 LMs and 11 parameters is suggested as the model with the best complexity trade-off.
Plot model for dimension 1.
Plot model for dimension 1.
Plot model for dimension 1.