hilomotDemo1 Demo 1: Static process with 1 input and 1 output.

HiLoMoT - Nonlinear System Identification Toolbox
Benjamin Hartmann, 04-April-2012
Institute of Mechanics & Automatic Control, University of Siegen, Germany
Copyright (c) 2012 by Prof. Dr.-Ing. Oliver Nelles

% Initialize new hilomot object as local model network (LMN)
LMN = hilomot;

% Example process
N = 40;
u = [linspace(0,0.3,20) linspace(0.5,1,20)]';
sigmoide = 1./(1+exp((-0.65+u)/60));
y1 = 1+sin(4*pi*u-pi/8);
y2 = 0.1./(0.1+u);
y = y1.*sigmoide + y2.*(1-sigmoide);

% Set input and output for training
LMN.input  = u;
LMN.output = y + 0.05*randn(N,1);

% Set the polynomial order of the local models to 2
LMN.xRegressorDegree = 2;
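
% With xRegressorDegree = 2, each local model is a quadratic polynomial in u,
% i.e. it estimates three parameters w0, w1, w2 such that
% yhat_local = w0 + w1*u + w2*u.^2. The next line only sketches the implied
% local regressor matrix for illustration; it is not required for training
% and is not part of the toolbox workflow.
Xlocal = [ones(N,1) u u.^2];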

% Train hilomot object
LMN = LMN.train;

% Generalization data (noise-free, generated by the same process as the
% training data; uG extends beyond the training range to show extrapolation)
uG = linspace(-0.2,1.2,100)';
sigmoideG = 1./(1+exp((-0.65+uG)/60));
yG = (1+sin(4*pi*uG-pi/8)).*sigmoideG + (0.1./(0.1+uG)).*(1-sigmoideG);

% Calculate model output and global loss function
yGModel = calculateModelOutput(LMN, uG, yG);
JG = calcGlobalLossFunction(LMN, yG, yGModel);
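
% Optional cross-check: assuming the default global loss function is the
% normalized root mean squared error (NRMSE) -- an assumption about the
% toolbox default, not something stated in this demo -- it can be
% reproduced directly from the data:
JGcheck = sqrt( sum((yG - yGModel).^2) / sum((yG - mean(yG)).^2) );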

% Calculate error bars (confidence intervals) for the model output
alpha = 0.05; % Significance level for the error bars (95% confidence).
errorbar = calcErrorbar(LMN, uG, [], alpha);

% Visualization
figure
LMN.plotModel
hold on
plot(uG,yGModel+errorbar,'r')
plot(uG,yGModel-errorbar,'r')
legend('model output','data','model +/- errorbar')
axis([-0.2 1.2 -1 2])
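
% Quick residual check on the generalization data, restricted to the training
% input range (the process has a pole near u = -0.1, so residuals outside
% [0,1] are not meaningful). This uses only variables computed above; the
% plot is a suggestion and not part of the original demo.
figure
idx = (uG >= 0) & (uG <= 1);
plot(uG(idx), yG(idx) - yGModel(idx), 'k.-')
xlabel('u'), ylabel('residual'), title('Generalization residuals on [0,1]')
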
xInputDelay is empty, defaults are used: xInputDelay(1:p) = {0}
zInputDelay is empty, defaults are used: zInputDelay(1:p) = {0}
xOutputDelay is empty, defaults are used: xOutputDelay(1:p) = {[]}
zOutputDelay is empty, defaults are used: zOutputDelay(1:p) = {[]}
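
The four messages above show that no delays were configured, so the toolbox
falls back to its defaults, which is exactly what is wanted for a static
process. The defaults could also be set explicitly before training; the
property names and values below are taken directly from these messages
(p denotes the number of physical inputs, here 1):

LMN.xInputDelay  = {0};    % input delays for the local model (x) regressors
LMN.zInputDelay  = {0};    % input delays for the partition (z) regressors
LMN.xOutputDelay = {[]};   % no output feedback -> static model
LMN.zOutputDelay = {[]};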

Initial net has 1 local model(s): J = 0.650108


1. Iteration. Number of local models = 1. Checking for split of model 1 ... 

   Testing split in dimension 1:         J = 0.536771
   Axes-oblique splitting:               J = 0.495625
-> SPLITTING RESULT:                     J = 0.495625

2. Iteration. Number of local models = 2. Checking for split of model 3 ... 

   Testing split in dimension 1:         J = 0.318336
   Axes-oblique splitting:               J = 0.236574
-> SPLITTING RESULT:                     J = 0.236574

3. Iteration. Number of local models = 3. Checking for split of model 2 ... 

   Testing split in dimension 1:         J = 0.218785
   Axes-oblique splitting:               J = 0.197711
-> SPLITTING RESULT:                     J = 0.197711

4. Iteration. Number of local models = 4. Checking for split of model 4 ... 

   Testing split in dimension 1:         J = 0.163493
   Axes-oblique splitting:               J = 0.163359
-> SPLITTING RESULT:                     J = 0.163359

5. Iteration. Number of local models = 5. Checking for split of model 5 ... 

   Testing split in dimension 1:         J = 0.135563
   Axes-oblique splitting:               J = 0.133815
-> SPLITTING RESULT:                     J = 0.133815

6. Iteration. Number of local models = 6. Checking for split of model 6 ... 

   Testing split in dimension 1:         J = 0.127121
   Axes-oblique splitting:               J = 0.126375
-> SPLITTING RESULT:                     J = 0.126375
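
As the log above indicates, every iteration first tests an axis-orthogonal
split in each input dimension and then performs an axes-oblique split whose
parameters are numerically optimized. Each split replaces one local model by
two children weighted with complementary sigmoidal validity functions. The
snippet below is only an illustrative 1-D sketch of such a sigmoidal split;
the parameter values are made up, whereas HiLoMoT optimizes them during
training:

v0 = -0.5; v1 = 1; kappa = 20;               % illustrative split parameters
Phi1 = 1 ./ (1 + exp(-kappa*(v0 + v1*u)));   % validity of the first child
Phi2 = 1 - Phi1;                             % validity of the second child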

Estimated model complexity limit reached. The improvement of 
the loss function (penaltyLossFunction) was 2 times less than 
1.000000e-12 on TRAINING data.


Final net has 7 local models and 27 parameters: J = 0.126375

Net 5 with 5 LMs and 19 parameters is suggested as the model with the best complexity trade-off.

Plot model for dimension 1.