
>> input = [1 1 0 0; 1 0 1 0];

net = newff(minmax(input), [2 1], {'logsig', 'logsig'});
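In this older (now obsolete) calling form, newff takes the per-row min/max ranges of the input, the layer sizes, and the transfer function of each layer. A minimal sketch of the same construction under that assumption; the random initial weights, and therefore the untrained outputs, will differ on every run:

% Two binary inputs, a 2-neuron hidden layer and a 1-neuron output layer,
% both layers using the log-sigmoid transfer function.
input = [1 1 0 0; 1 0 1 0];            % each column is one input pattern
net = newff(minmax(input), ...         % min/max range of every input row
            [2 1], ...                 % layer sizes: 2 hidden, 1 output
            {'logsig', 'logsig'});     % transfer function of each layer
output = sim(net, input)               % response of the untrained network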


output = sim(net, input)
output =
0.5598 0.3949 0.6096 0.6230
0.7448 0.5821 0.7938 0.8071
net.iw{1,1}
ans =
-3.4331 1.9733
3.9158 0.5885
net.lw{2,1}
ans =
4.9819 -2.5574
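net.iw{i,j} holds the weight matrix from network input j to layer i, net.lw{i,j} holds the weights from layer j to layer i, and net.b{i} holds the biases of layer i. A short sketch of reading all of them and reproducing sim() by hand for one input pattern; this should match here because the old newff form adds no input/output preprocessing (an assumption worth checking on other toolbox versions):

W1 = net.iw{1,1};    % 2x2 hidden-layer weights (input 1 -> layer 1)
W2 = net.lw{2,1};    % 1x2 output-layer weights (layer 1 -> layer 2)
b1 = net.b{1};       % 2x1 hidden-layer biases
b2 = net.b{2};       % 1x1 output-layer bias
x  = input(:, 1);                          % first input pattern
y  = logsig(W2 * logsig(W1 * x + b1) + b2) % manual forward pass, compare with sim(net, input)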
target = [0 1 1 0]
target =
0 1 1 0
plot(output, 'r')
plot(target, 'b')
plot(output, 'r')
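Each plot call above reuses the current axes and replaces what was drawn before, so the red output curve and the blue target curve are never visible at the same time. A sketch of overlaying them with hold on (plain MATLAB graphics, nothing toolbox-specific):

figure;
plot(target, 'b');      % desired XOR outputs for the four patterns
hold on;                % keep this curve while the next one is added
plot(output', 'r');     % one red curve per output row of the network
legend('target', 'output');
hold off;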
net.iw{1,1}(1,2) = 5;
net.iw{1,1}
ans =
-3.4331 5.0000
3.9158 0.5885
net.lw{1,1}
ans =
[]
output = sim(net, input)
output =
0.5778 0.3866 0.6096 0.6165
0.7625 0.5739 0.7939 0.8007
plot(output, 'g')
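Assigning into net.iw{1,1} changes the stored weight permanently. Because network objects are plain value objects, a copy can be kept before hand-editing so the original response stays available for comparison; a minimal sketch (the name net_backup is only illustrative):

net_backup = net;                    % keep the untouched network
net.iw{1,1}(1,2) = 5;                % hand-edit one hidden-layer weight
modified = sim(net, input);          % response of the edited network
original = sim(net_backup, input);   % response of the saved copy
disp(modified - original)            % effect of the single weight change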
output =
0.0870 0.0789 0.1024 0.0789
0.5570 0.4520 0.7559 0.4527
??? output =
|
Error: Expression or statement is incomplete or incorrect.

>> plot (output, 'g')
>> plot (output, 'g')
>> input = [1 1 0 0 ; 1 0 1 0];
>> net = newff([0 1; 0 1], [2 1], {'logsig', 'logsig'})
Warning: NEWFF used in an obsolete way.
> In nntobsu at 18
In newff at 86
See help for NEWFF to update calls to the new argument list.
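The warning only means that the pre-update argument list was used; the network is still created. Under the updated signature the input ranges and the output layer size are taken from sample data, so an equivalent call would look roughly like the sketch below (assuming the newff(P, T, hiddenSizes, transferFcns) form; this form also adds default input/output preprocessing, so untrained outputs may not match the old-style network):

input  = [1 1 0 0; 1 0 1 0];
target = [0 1 1 0];
net = newff(input, target, 2, {'logsig', 'logsig'});   % updated argument list, no warning
output = sim(net, input)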

net =
Neural Network object:
architecture:
numInputs: 1
numLayers: 2
biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]
numOutputs: 1 (read-only)
numInputDelays: 0 (read-only)
numLayerDelays: 0 (read-only)
subobject structures:
inputs: {1x1 cell} of inputs
layers: {2x1 cell} of layers
outputs: {1x2 cell} containing 1 output
biases: {2x1 cell} containing 2 biases
inputWeights: {2x1 cell} containing 1 input weight
layerWeights: {2x2 cell} containing 1 layer weight
functions:
adaptFcn: 'trains'
divideFcn: (none)
gradientFcn: 'gdefaults'
initFcn: 'initlay'
performFcn: 'mse'
plotFcns: {'plotperform','plottrainstate','plotregression'}
trainFcn: 'trainlm'
parameters:
adaptParam: .passes
divideParam: (none)
gradientParam: (none)
initParam: (none)
performParam: (none)
trainParam: .show, .showWindow, .showCommandLine, .epochs,
.time, .goal, .max_fail, .mem_reduc,
.min_grad, .mu, .mu_dec, .mu_inc,
.mu_max
weight and bias values:
IW: {2x1 cell} containing 1 input weight matrix
LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors
other:
name: ''
userdata: (user information)
>> output = sim(net, input)
output =
0.7124 0.9800 0.9713 0.6221
>> target = [0 1 1 0]
target =
0 1 1 0
>> net.iw{1, 1}
ans =
-4.9248 -6.2021
6.8054 4.0504
>> net.iw{1,1}(1,2) = 3
net =
Neural Network object:
architecture:
numInputs: 1
numLayers: 2
biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]
numOutputs: 1 (read-only)
numInputDelays: 0 (read-only)
numLayerDelays: 0 (read-only)
subobject structures:
inputs: {1x1 cell} of inputs
layers: {2x1 cell} of layers
outputs: {1x2 cell} containing 1 output
biases: {2x1 cell} containing 2 biases
inputWeights: {2x1 cell} containing 1 input weight
layerWeights: {2x2 cell} containing 1 layer weight
functions:
adaptFcn: 'trains'
divideFcn: (none)
gradientFcn: 'gdefaults'
initFcn: 'initlay'
performFcn: 'mse'
plotFcns: {'plotperform','plottrainstate','plotregression'}
trainFcn: 'trainlm'
parameters:
adaptParam: .passes
divideParam: (none)
gradientParam: (none)
initParam: (none)
performParam: (none)
trainParam: .show, .showWindow, .showCommandLine, .epochs,
.time, .goal, .max_fail, .mem_reduc,
.min_grad, .mu, .mu_dec, .mu_inc,
.mu_max
weight and bias values:
IW: {2x1 cell} containing 1 input weight matrix
LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors
other:
name: ''
userdata: (user information)
>> net.lw{2, 1}
ans =
3.6529 4.2446
>> net = train(net, input, target)
net =
Neural Network object:
architecture:
numInputs: 1
numLayers: 2
biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]
numOutputs: 1 (read-only)
numInputDelays: 0 (read-only)
numLayerDelays: 0 (read-only)
subobject structures:
inputs: {1x1 cell} of inputs
layers: {2x1 cell} of layers
outputs: {1x2 cell} containing 1 output
biases: {2x1 cell} containing 2 biases
inputWeights: {2x1 cell} containing 1 input weight
layerWeights: {2x2 cell} containing 1 layer weight
functions:
adaptFcn: 'trains'
divideFcn: (none)
gradientFcn: 'gdefaults'
initFcn: 'initlay'
performFcn: 'mse'
plotFcns: {'plotperform','plottrainstate','plotregression'}
trainFcn: 'trainlm'
parameters:
adaptParam: .passes
divideParam: (none)
gradientParam: (none)
initParam: (none)
performParam: (none)
trainParam: .show, .showWindow, .showCommandLine, .epochs,
.time, .goal, .max_fail, .mem_reduc,
.min_grad, .mu, .mu_dec, .mu_inc,
.mu_max
weight and bias values:
IW: {2x1 cell} containing 1 input weight matrix
LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors
other:
name: ''
userdata: (user information)
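train stops as soon as one of the trainParam criteria is met; with the default trainFcn 'trainlm' this is Levenberg-Marquardt backpropagation. The relevant fields are listed in the object dump above, and they can be tightened before training; a short sketch with purely illustrative values:

net.trainParam.epochs = 1000;       % maximum number of training epochs
net.trainParam.goal   = 1e-6;       % stop once the mse performance reaches this value
net.trainParam.show   = 25;         % progress report interval
net = train(net, input, target);    % retrain on the four XOR patterns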
>> output = sim(net, input)
output =
0.5000 1.0000 0.5000 0.0000
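To see how close the trained network is to the XOR target, the error can be summarized directly in plain MATLAB, without any toolbox-specific performance call:

err = output - target;        % per-pattern error after training
mse_value = mean(err.^2)      % mean squared error over the four patterns
round(output)                 % hard 0/1 decisions, to compare against target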
-- On the 22nd: final exam:
-- On the 26th: make-up exam:
-- On the 19th: graded practical: presentation work
