
BP Neural Network Lab Report

Shenzhen University Lab Report

Course name: Artificial Neural Network Technology
Project name: Classifying iris flowers with a BP neural network
College:
Major: Software Engineering
Reported by:
Student ID:
Class:
Group members: none
Instructor:
Experiment date:
Report submission date:
Form provided by the Office of Academic Affairs

I. Experiment Objective

Become initially familiar with the MATLAB working environment and its command window, and learn to use the Help window to look up help information.
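For example, the standard MATLAB help facilities can be tried directly from the command window; logsig, which is used later in this experiment, makes a convenient test case:

help logsig      % one-line description of the log-sigmoid transfer function
doc logsig       % open the full documentation page in the Help window
lookfor sigmoid  % search all function help texts for the keyword "sigmoid"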

II. Experiment Content

1. Network design, including the number of nodes in the input layer, hidden layer, and output layer.

2. Algorithm steps (the update rules are summarized after this list).

3. Programming, paying attention to keeping the raw-data handling general and to randomizing the order of the data input (see the sketch after this list).

4. Network training, taking care to keep the training data separate from the validation data.

5. Network validation.

6. Result analysis: the effect on the results of changing the number of hidden-layer nodes and of changing the learning rate.
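For item 2, the algorithm steps implemented by the code in Section III can be summarized as follows (the notation matches the variable names in the code: $d$ and $e$ are the output-layer and hidden-layer error terms, $\alpha$ and $\beta$ are the learning rates, and $\gamma$ is the momentum constant):

$$b_i = f\Big(\sum_h a_h V_{hi} + \pi_i\Big), \qquad c_j = f\Big(\sum_i b_i W_{ij} + \tau_j\Big), \qquad f(x) = \frac{1}{1+e^{-x}}$$

$$d_j = c_j(1-c_j)(ck_j - c_j), \qquad e_i = b_i(1-b_i)\sum_j d_j W_{ij}$$

$$\Delta W_{ij} = \alpha\, b_i d_j + \gamma\, \Delta W_{ij}^{\mathrm{old}}, \qquad \Delta V_{hi} = \beta\, a_h e_i, \qquad \Delta\pi_i = \beta\, e_i + \gamma\, \Delta\pi_i^{\mathrm{old}}, \qquad \Delta\tau_j = \alpha\, d_j + \gamma\, \Delta\tau_j^{\mathrm{old}}$$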
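For items 3 and 4, note that the script in Section III simply takes the first 115 rows of data.dat for training and the last 35 for testing. A minimal sketch of how the input order could instead be randomized before the split (an illustrative assumption; the script below does not do this):

load data.dat                  % 150 rows: column 2 = class label (0/1/2), columns 3:6 = features
idx  = randperm(size(data,1)); % random permutation of the 150 row indices
data = data(idx,:);            % shuffle the rows so all three classes appear in both sets
% the first 115 shuffled rows then form the training set, the remaining 35 the test set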

III. Experiment Steps

Run the following code directly in the MATLAB Editor (in the original report, the parts of the code completed by the author were marked in red):

% li.m
% A BP NN with momentum to solve Fisher's Iris Flower problem
% by lixiujuan, Nov 13, 2011
%
% the NN architecture
% it is a three-layer neural network: 4-3-3.
%
% parameter description
% h=4            the node number of the input layer
% i=3            the node number of the hidden layer
% j=3            the node number of the output layer
% V[h,i]         the weights between the input and hidden layers
% W[i,j]         the weights between the hidden and output layers
% Pi[i]          the thresholds of the hidden layer nodes
% Tau[j]         the thresholds of the output layer nodes
% a[h]           the input values
% b[i]           the hidden layer node activations
% c[j]           the output layer node activations
% ck[j]          the desired output of the output layer nodes
% d[j]           the error in the output layer nodes
% e[i]           the error in the hidden layer nodes
% DeltaW[i,j]    the amount of change for the weights W[i,j]
% DeltaV[h,i]    the amount of change for the weights V[h,i]
% DeltaPi[i]     the amount of change for the thresholds Pi[i]
% DeltaTau[j]    the amount of change for the thresholds Tau[j]
% Alpha=0.1      the learning rate of the output layer
% Beta=0.1       the learning rate of the hidden layer
% Gamma=0.85     the constant that determines the effect of past weight changes
% Tor=0.0005     the tolerance that determines when to stop training
% Maxepoch=2000  the maximum number of iterations
%
% other parameters
% Ntrain=115     the number of training sets
% Ntest=35       the number of test sets
% Otrain[115]    the output of the training sets
% Otest[35]      the output of the test sets
% Odesired[150]  the desired output of the training and test sets
%
% function description
% f(x)=logsig(x)
% f(x)=1/(1+exp(-x))
%
% data file
% input file: data.dat

close all; clc; clf; clear all;

% parameters for the NN structure
h=4; i=3; j=3;
Alpha=0.1; Beta=0.1; Gamma=0.85;
Tor=0.0005; Maxepoch=2000;
Accuracy=0;
Ntrain=115; Ntest=35;

% assign random initial weights and thresholds in the range [-1, +1]
V=2*(rand(h,i)-0.5);
W=2*(rand(i,j)-0.5);
Pi=2*(rand(1,i)-0.5);
Tau=2*(rand(1,j)-0.5);
DeltaWOld=zeros(i,j);    % set the previous deltas of W[i,j] to 0
DeltaVOld=zeros(h,i);    % set the previous deltas of V[h,i] to 0
DeltaPiOld=zeros(1,i);   % set the previous deltas of Pi[i] to 0
DeltaTauOld=zeros(1,j);  % set the previous deltas of Tau[j] to 0

% the learning process
Epoch=1;
Error=10;

% load the training set and test set data
load data.dat
Odesired=data(:,2);      % the desired output of the 150 data sets

% normalize the input data to the range [-1, +1]
datanew=data(:,3:6);
maxv=max(max(datanew));
minv=min(min(datanew));
datanorm=2*((datanew-minv)/(maxv-minv)-0.5);

while Error>Tor
    Err(Epoch)=0;
    for k=1:Ntrain                    % k = the index of the training set
        a=datanorm(k,:);              % get the input
        % set the desired output ck[j]
        if data(k,2)==0
            ck=[1 0 0];
        elseif data(k,2)==1
            ck=[0 1 0];
        else
            ck=[0 0 1];
        end
        % calculate the hidden node activations
        for ki=1:i
            b(ki)=logsig(a*V(:,ki)+Pi(ki));
        end
        % calculate the output node activations
        for kj=1:j
            c(kj)=logsig(b*W(:,kj)+Tau(kj));
        end
        % calculate the error in the output layer FC
        d=c.*(1-c).*(ck-c);
        % calculate the error in the hidden layer FB
        e=b.*(1-b).*(d*W');
        % adjust the weights W[i,j] between FB and FC, with momentum
        for ki=1:i
            for kj=1:j
                DeltaW(ki,kj)=Alpha*b(ki)*d(kj)+Gamma*DeltaWOld(ki,kj);
            end
        end
        W=W+DeltaW;
        DeltaWOld=DeltaW;
        % adjust the weights V[h,i] between FA and FB
        % (unlike W, this update carries no momentum term, so DeltaVOld is unused)
        for kh=1:h
            for ki=1:i
                DeltaV(kh,ki)=Beta*a(kh)*e(ki);
            end
        end
        V=V+DeltaV;
        DeltaVOld=DeltaV;
        % adjust the thresholds Pi and Tau, with momentum
        DeltaPi=Beta*e+Gamma*DeltaPiOld;
        Pi=Pi+DeltaPi;
        DeltaPiOld=DeltaPi;
        DeltaTau=Alpha*d+Gamma*DeltaTauOld;
        Tau=Tau+DeltaTau;
        DeltaTauOld=DeltaTau;
        % accumulate half the sum of squared output errors d(1),d(2),d(3)
        Err(Epoch)=Err(Epoch)+0.5*(d(1)*d(1)+d(2)*d(2)+d(3)*d(3));
    end % for k=1:Ntrain
    Err(Epoch)=Err(Epoch)/Ntrain;
    Error=Err(Epoch);
    % the training also stops when the iteration count grows too large
    if Epoch > Maxepoch
        break;
    end
    Epoch = Epoch + 1;               % update the iteration count
end

% test data
for k=1:Ntest                        % k = the index of the test set
    a=datanorm(Ntrain+k,:);          % get the input of the test sets
    % calculate the hidden node activations
    for ki=1:i
        b(ki)=logsig(a*V(:,ki)+Pi(ki));
    end
    % calculate the output of the test sets
    for kj=1:j
        c(kj)=logsig(b*W(:,kj)+Tau(kj));
    end
    % transfer the output to one-field format (3 = no confident classification)
    if (c(1)>0.9)
        Otest(k)=0;
    elseif (c(2)>0.9)
        Otest(k)=1;
    elseif (c(3)>0.9)
        Otest(k)=2;
    else
        Otest(k)=3;
    end
    % count the correct classifications for the test-set accuracy
    if Otest(k)==Odesired(Ntrain+k)
        Accuracy=Accuracy+1;
    end
end % k=1:Ntest

% plot the training error
plot(Err);

% plot the NN output and the desired output during the test
N=1:Ntest;
figure; plot(N,Otest,'b-',N,Odesired(116:150),'r-');

% display the accuracy
Accuracy = 100*Accuracy/Ntest;
t=['TESTING RESULT, the accuracy of test sets is: ' num2str(Accuracy) '%'];
disp(t);

Results were recorded for the following settings; the corresponding training-error and test-output figures from the original report are omitted here:

- hidden-layer nodes i = 3 (the default configuration above)
- Alpha = 0.08, Beta = 0.08
- Alpha = 0.2, Beta = 0.2
- Alpha = 0.2, Beta = 0.3
- hidden-layer nodes i = 4
- hidden-layer nodes i = 2
- Alpha = 0.05, Beta = 0.05
- hidden-layer nodes i = 1

IV. Summary and Analysis

Because I was unfamiliar with MATLAB, at first I did not know how to complete this experiment. With the help of classmates I gradually learned how to import the data and implement the code, and in doing so came to understand how to use a BP neural network to classify iris flowers.
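To make the parameter study above repeatable, the whole script could be wrapped in a function and swept automatically. This is a hypothetical sketch: train_bp is not part of the report's code, and stands for the script above refactored into a function taking the hidden-layer size and the two learning rates and returning the test accuracy in percent.

% hypothetical wrapper: train_bp(nHidden, Alpha, Beta) runs one full
% training/test cycle of the script above and returns the accuracy (%)
for nHidden = [1 2 3 4]                 % vary the hidden-layer size
    acc = train_bp(nHidden, 0.1, 0.1);
    fprintf('hidden nodes = %d, accuracy = %.1f%%\n', nHidden, acc);
end
for rate = [0.05 0.08 0.2]              % vary both learning rates together
    acc = train_bp(3, rate, rate);
    fprintf('Alpha = Beta = %.2f, accuracy = %.1f%%\n', rate, acc);
end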
