See discussions, stats, and author profiles for this publication at: https://www.researchgate.net/publication/292970981
Author: Aymen Ammari, Université de Jendouba
ARTICLE INFO
Keywords: Artificial Neural Network
1. Introduction
A neural network is a massively parallel
distributed processor made up of simple
processing units that has a natural propensity
for storing experiential knowledge and making
it available for use.
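As a minimal sketch of such a processing unit (with made-up inputs, weights, and bias, not taken from the original code), one neuron computes a weighted sum of its inputs and passes it through an activation function:

```matlab
% One processing unit (neuron): weighted sum of inputs plus bias,
% passed through a tanh activation. All values are illustrative.
x = [0.5; -1.2; 2.0];     % inputs (3x1)
w = [0.1  0.4 -0.3];      % synaptic weights (1x3)
b = 0.2;                  % bias
y = tanh(w*x + b)         % neuron output, bounded in (-1,1)
```

A network chains many such units in layers, which is what `newff` builds later in the code.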
***************************************
%% Data Input and Preparation
clc; clear; close all;
in = xlsread('input');    % Input file
out = xlsread('output');  % Output file
data = [in out];
corrplot(data)            % Pairwise correlation plot of inputs and outputs
%%
***************************************
% Here p holds the network inputs and t the targets (e.g. p = in', t = out').
t = log(t+1);             % Log-transform the targets (inverse: exp(t)-1)
% Defining Validation Dataset
trainRatio1 = .6;
valRatio1 = .2;
testRatio1 = .2;
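Since the targets are transformed with log(t+1), the exact back-transform is exp(.)-1, not exp alone. A quick round-trip check (illustrative values, not part of the original code):

```matlab
% Verify that exp(.)-1 inverts the log(t+1) target transform.
t0 = [0 1 10 100];        % example raw target values
tl = log(t0 + 1);         % forward transform, as used above
tb = exp(tl) - 1;         % back-transform
max(abs(tb - t0))         % ~0, up to floating-point error
```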
***************************************
Network Definition
***************************************
%% Network Definition
nnn1 = 5;    % First number of neurons in hidden layer
nnnj = 5;    % Jump in number of neurons in hidden layer
nnnf = 20;   % Last number of neurons in hidden layer
% net1.trainParam.lr = 0.1;
% net1.trainParam.epochs = 500;
% Training Network
it = 20;     % Max number of training iterations (restarts) per architecture
ii = 0;
for nnn = nnn1:nnnj:nnnf
    ii = ii+1; nnn
    net1 = newff(p,t,[nnn nnn]); % For more functions see 'Function Reference' in the Neural Network Toolbox section of the MATLAB help
    net1.divideParam.trainRatio = trainRatio1;
    net1.divideParam.valRatio = valRatio1;
    net1.divideParam.testRatio = testRatio1;
    evalopt(ii) = 100;           % Initialize best validation MSE with a large value
    for i = 1:it
        [net1,tr,y,et] = train(net1,p,t);    % Training
        estval = sim(net1,p(:,tr.valInd));   % Simulate on validation set
        eval = mse(estval - t(:,tr.valInd)); % Validation MSE
        if eval < evalopt(ii)                % Keep the best run for this size
            netopt{ii} = net1;
            tropt(ii) = tr; evalopt(ii) = eval;
        end
    end
end
plot(nnn1:nnnj:nnnf,evalopt)     % Validation MSE vs. hidden-layer size
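Once the sweep finishes, the best hidden-layer size can be read off `evalopt` directly (a small helper sketch using the variable names above; not part of the original code):

```matlab
% Select the architecture with the lowest validation MSE.
[bestMSE, bestIdx] = min(evalopt);
bestSize = nnn1 + (bestIdx-1)*nnnj;   % neurons per hidden layer
fprintf('Best: %d neurons per layer, validation MSE = %.4g\n', bestSize, bestMSE);
```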
***************************************
Network Output
***************************************
%% Output
clear; close all;
load('run_log_2');
nn = 4;   % Index of the selected network in netopt
ptrain = p(:,tropt(nn).trainInd);
ttrain = t(:,tropt(nn).trainInd);
esttrain = sim(netopt{nn},ptrain);
ptest = p(:,tropt(nn).testInd);
ttest = t(:,tropt(nn).testInd);
esttest = sim(netopt{nn},ptest);
pval = p(:,tropt(nn).valInd);
tval = t(:,tropt(nn).valInd);
estval = sim(netopt{nn},pval);
estwhole = sim(netopt{nn},p);
% Undo the log(t+1) transform if it was applied (its inverse is exp(.)-1):
% ttrain=exp(ttrain)-1; ttest=exp(ttest)-1; tval=exp(tval)-1; t=exp(t)-1;
% esttrain=exp(esttrain)-1; esttest=exp(esttest)-1;
% estval=exp(estval)-1; estwhole=exp(estwhole)-1;
figure; plot(ttrain,esttrain,'.b');  % train data: ttrain real, esttrain estimated
figure; plot(tval,estval,'.g');      % validation
figure; plot(ttest,esttest,'.r');    % test
figure; plot(t,estwhole,'.k');       % whole dataset
figure;
plotregression(ttrain,esttrain,'Train',tval,estval,'Validation',ttest,esttest,'Test',t,estwhole,'Whole Data');
***************************************
Conclusion:
Based on a chapter of my Ph.D. dissertation,
which analyses French corporate governance in
relation to firm performance, this paper
presents the MATLAB code used and explains
step by step the implementation of artificial
neural network estimation. The code was written
for flexibility, so that it can easily be adapted
to many other applications for educational purposes.
Acknowledgement:
I acknowledge Verlinden, B. et al. (2007), "Cost
estimation for sheet metal parts using multiple
regression and artificial neural networks: A case
study", for its background support.