% Create a LOLIMOT object and define the training data
lolimotObject = lolimot;
u = linspace(0,1,300)';                   % training input
lolimotObject.input = u;
lolimotObject.output = 1 ./ (0.1 + u);    % hyperbolic process output
% Set the training options
lolimotObject.maxNumberOfLM = 6;          % stop after 6 local linear models at the latest
lolimotObject.minError = 0.005;           % ... or once the loss function falls below this value
lolimotObject.kStepPrediction = 0;        % static model (no k-step-ahead simulation)
lolimotObject.smoothness = 0.8;           % overlap of the validity functions
lolimotObject.complexityPenalty = 1;      % penalty weight for model complexity
lolimotObject.history.displayMode = true; % print the training progress
lolimotObject.LOOCV = true;               % use leave-one-out cross-validation
% Train the local model network
lolimotObject = lolimotObject.train;
% Pick the net with the smallest penalized loss function from the training history
[~, idxBest] = min(lolimotObject.history.penaltyLossFunction);
lolimotObject.leafModels = lolimotObject.history.leafModelIter{idxBest};
% Evaluate on generalization data that extends beyond the training range [0, 1]
uG = linspace(-0.05,1.2,270)';
yG = 1 ./ (0.1 + uG);
yGModel = calculateModelOutput(lolimotObject, uG, yG);
JG = calcGlobalLossFunction(lolimotObject, yG, yGModel);
% Visualize the model output and the partitioning of the input space
figure
lolimotObject.plotModel
figure
lolimotObject.plotPartition
Input scaling complete.
Initial net has 1 local linear model(s): J = 0.558870.
1. Iteration. Number of local linear models = 1. Checking for split of model 1 ...
Testing split in dimension 1 with ratio 0.50: J = 0.338520.
-> Splitting in dimension 1 with ratio 0.50: J = 0.338520.
2. Iteration. Number of local linear models = 2. Checking for split of model 2 ...
Testing split in dimension 1 with ratio 0.50: J = 0.168858.
-> Splitting in dimension 1 with ratio 0.50: J = 0.168858.
3. Iteration. Number of local linear models = 3. Checking for split of model 4 ...
Testing split in dimension 1 with ratio 0.50: J = 0.070777.
-> Splitting in dimension 1 with ratio 0.50: J = 0.070777.
4. Iteration. Number of local linear models = 4. Checking for split of model 6 ...
Testing split in dimension 1 with ratio 0.50: J = 0.032529.
-> Splitting in dimension 1 with ratio 0.50: J = 0.032529.
5. Iteration. Number of local linear models = 5. Checking for split of model 8 ...
Testing split in dimension 1 with ratio 0.50: J = 0.025310.
-> Splitting in dimension 1 with ratio 0.50: J = 0.025310.
Maximum number of local models reached.
Final net has 6 local linear models: J = 0.025310.
Net 6 with 6 LLMs is suggested as the model with the best complexity trade-off.
Plot model for dimension 1.