
Artificial Intelligence: Recognizing the 26 Letters with a BP Network (MATLAB)


```matlab
>> clear;                                 % clear the workspace
[alphabet,targets] = prprob;              % load the 26 letter bitmaps and their targets
[R,Q] = size(alphabet);                   % input dimensions (35 pixels x 26 letters)
[S2,Q] = size(targets);                   % target dimensions: 26 binary vectors, one per letter A-Z
S1 = 10;                                  % number of hidden neurons
net = newff(minmax(alphabet),[S1 S2],{'logsig' 'logsig'},'traingdx');
                                          % two-layer network, logsig in both layers
net.LW{2,1} = net.LW{2,1}*0.01;           % scale down the output-layer weights
net.b{2} = net.b{2}*0.01;                 % scale down the output-layer biases
net.performFcn = 'sse';                   % sum-squared error performance function
net.trainParam.goal = 0.1;                % training goal: SSE of 0.1
net.trainParam.epochs = 5000;             % at most 5000 epochs
net.trainParam.mc = 0.95;                 % momentum constant of 0.95
P = alphabet;                             % training inputs
T = targets;                              % training targets
[net,tr] = train(net,P,T);                % train the noise-free network

netn = net;                               % copy of the network, to be retrained with noise
netn.trainParam.goal = 0.6;               % relaxed goal: SSE of 0.6
netn.trainParam.epochs = 300;             % at most 300 epochs
T = [targets targets targets targets];    % targets replicated to match the four input copies
for pass = 1:10
    fprintf('Pass = %.0f\n',pass);
    P = [alphabet, alphabet, ...
         (alphabet + randn(R,Q)*0.1), ...
         (alphabet + randn(R,Q)*0.2)];    % two clean and two noisy copies of the letters
    [netn,tr] = train(netn,P,T);          % train the network on the noisy set
end
netn.trainParam.goal = 0.1;               % tighten the goal back to 0.1
netn.trainParam.epochs = 500;             % at most 500 epochs
P = alphabet;                             % reset to the clean inputs
T = targets;                              % reset the targets
[netn,tr] = train(netn,P,T);              % retrain so the noisy network still fits clean letters

noise_range = 0:.05:.5;                   % noise levels to test
max_test = 100;                           % test trials per noise level
network1 = [];
network2 = [];
T = targets;
for noiselevel = noise_range
    fprintf('Testing networks with noise level of %.2f.\n',noiselevel);
    errors1 = 0;
    errors2 = 0;
```
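The construction of the noisy training set above, `[alphabet, alphabet, alphabet+0.1*randn, alphabet+0.2*randn]` with the targets replicated four times, can be sketched in NumPy. The random `alphabet` below is a hypothetical stand-in for the 35x26 letter bitmaps that `prprob` returns, just to show the shapes involved:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for prprob's output: a binary 35 x 26 "alphabet"
# (35 pixels per letter, 26 letters), with one-hot targets.
R, Q = 35, 26
alphabet = (rng.random((R, Q)) > 0.5).astype(float)
targets = np.eye(Q)

# Two clean copies plus two noisy copies, as in the training loop
P = np.hstack([alphabet,
               alphabet,
               alphabet + rng.standard_normal((R, Q)) * 0.1,
               alphabet + rng.standard_normal((R, Q)) * 0.2])
T = np.hstack([targets, targets, targets, targets])

print(P.shape)   # (35, 104)
print(T.shape)   # (26, 104)
```

Replicating the targets keeps each noisy column paired with the same one-hot label as its clean original, so the network learns to ignore the noise.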

```matlab
    for i = 1:max_test
        P = alphabet + randn(35,26)*noiselevel;    % add noise to the test inputs
        A = sim(net,P);                            % simulate the noise-free-trained network
        AA = compet(A);                            % the 1 marks the winning output vector
        errors1 = errors1 + sum(sum(abs(AA-T)))/2; % count its misclassifications
        An = sim(netn,P);                          % simulate the noise-trained network
        AAn = compet(An);
        errors2 = errors2 + sum(sum(abs(AAn-T)))/2;
    end
    network1 = [network1 errors1/26/100];          % error rate of network 1
    network2 = [network2 errors2/26/100];          % error rate of network 2
end
plot(noise_range,network1*100,'--',noise_range,network2*100,'*'); % plot the comparison
title('Percentage of recognition errors');
xlabel('Noise level');
ylabel('Network 1 - -  Network 2 *');

noisyF = alphabet(:,6) + randn(35,1)*0.2;  % the letter F with added noise
figure; plotchar(noisyF);                  % show the noisy input
A1 = sim(net,noisyF);  A1 = compet(A1);    % response of the noise-free-trained network
A2 = sim(netn,noisyF); A2 = compet(A2);    % response of the noise-trained network
figure; plotchar(A1);                      % letter recognized by network 1
figure; plotchar(A2);                      % letter recognized by network 2
```

Training output:

```
TRAINGDX, Epoch 0/5000, SSE 168.447/0.1, Gradient 46.0382/1e-006
TRAINGDX, Epoch 25/5000, SSE 24.9713/0.1, Gradient 0.563004/1e-006
TRAINGDX, Epoch 50/5000, SSE 25.4485/0.1, Gradient 0.363459/1e-006
TRAINGDX, Epoch 75/5000, SSE 25.532/0.1, Gradient 0.333604/1e-006
TRAINGDX, Epoch 100/5000, SSE 25.4204/0.1, Gradient 0.407/1e-006
TRAINGDX, Epoch 125/5000, SSE 24.5179/0.1, Gradient 0.590354/1e-006
TRAINGDX, Epoch 150/5000, SSE 20.1153/0.1, Gradient 1.03697/1e-006
```

```
TRAINGDX, Epoch 175/5000, SSE 3.3111/0.1, Gradient 0.741009/1e-006
TRAINGDX, Epoch 199/5000, SSE 0.0903872/0.1, Gradient 0.0562929/1e-006
TRAINGDX, Performance goal met.
Pass = 1
TRAINGDX, Epoch 0/300, SSE 4.38727/0.6, Gradient 3.91707/1e-006
TRAINGDX, Epoch 25/300, SSE 2.34149/0.6, Gradient 1.61314/1e-006
TRAINGDX, Epoch 50/300, SSE 1.66179/0.6, Gradient 0.87181/1e-006
TRAINGDX, Epoch 75/300, SSE 0.860046/0.6, Gradient 0.390595/1e-006
TRAINGDX, Epoch 89/300, SSE 0.595819/0.6, Gradient 0.201143/1e-006
TRAINGDX, Performance goal met.
Pass = 2
TRAINGDX, Epoch 0/300, SSE 1.69802/0.6, Gradient 2.60629/1e-006
TRAINGDX, Epoch 25/300, SSE 1.0446/0.6, Gradient 0.670983/1e-006
TRAINGDX, Epoch 50/300, SSE 0.809773/0.6, Gradient 0.387906/1e-006
TRAINGDX, Epoch 73/300, SSE 0.597462/0.6, Gradient 0.196257/1e-006
TRAINGDX, Performance goal met.
Pass = 3
TRAINGDX, Epoch 0/300, SSE 1.84056/0.6, Gradient 3.45909/1e-006
TRAINGDX, Epoch 25/300, SSE 1.03318/0.6, Gradient 0.911047/1e-006
TRAINGDX, Epoch 50/300, SSE 0.706745/0.6, Gradient 0.415866/1e-006
TRAINGDX, Epoch 60/300, SSE 0.599226/0.6, Gradient 0.295716/1e-006
TRAINGDX, Performance goal met.
Pass = 4
TRAINGDX, Epoch 0/300, SSE 1.60648/0.6, Gradient 2.3488/1e-006
TRAINGDX, Epoch 25/300, SSE 1.01923/0.6, Gradient 0.815083/1e-006
TRAINGDX, Epoch 50/300, SSE 0.784938/0.6, Gradient 0.396484/1e-006
TRAINGDX, Epoch 74/300, SSE 0.592602/0.6, Gradient 0.188876/1e-006
TRAINGDX, Performance goal met.
Pass = 5
TRAINGDX, Epoch 0/300, SSE 3.77602/0.6, Gradient 4.17101/1e-006
TRAINGDX, Epoch 25/300, SSE 1.53422/0.6, Gradient 2.16367/1e-006
TRAINGDX, Epoch 50/300, SSE 0.825853/0.6, Gradient 0.63716/1e-006
TRAINGDX, Epoch 71/300, SSE 0.594703/0.6, Gradient 0.226427/1e-006
TRAINGDX, Performance goal met.
Pass = 6
TRAINGDX, Epoch 0/300, SSE 1.42439/0.6, Gradient 2.20268/1e-006
TRAINGDX, Epoch 25/300, SSE 0.834845/0.6, Gradient 0.700766/1e-006
TRAINGDX, Epoch 50/300, SSE 0.625815/0.6, Gradient 0.35906/1e-006
```

```
TRAINGDX, Epoch 54/300, SSE 0.594342/0.6, Gradient 0.318662/1e-006
TRAINGDX, Performance goal met.
Pass = 7
TRAINGDX, Epoch 0/300, SSE 2.09651/0.6, Gradient 3.81564/1e-006
TRAINGDX, Epoch 25/300, SSE 0.897211/0.6, Gradient 1.60762/1e-006
TRAINGDX, Epoch 49/300, SSE 0.597725/0.6, Gradient 0.519457/1e-006
TRAINGDX, Performance goal met.
Pass = 8
TRAINGDX, Epoch 0/300, SSE 0.923138/0.6, Gradient 1.46303/1e-006
TRAINGDX, Epoch 25/300, SSE 0.66187/0.6, Gradient 0.533742/1e-006
TRAINGDX, Epoch 36/300, SSE 0.597901/0.6, Gradient 0.448717/1e-006
TRAINGDX, Performance goal met.
Pass = 9
TRAINGDX, Epoch 0/300, SSE 3.00343/0.6, Gradient 2.66202/1e-006
TRAINGDX, Epoch 25/300, SSE 1.97339/0.6, Gradient 1.36688/1e-006
TRAINGDX, Epoch 50/300, SSE 1.05923/0.6, Gradient 0.830243/1e-006
TRAINGDX, Epoch 75/300, SSE 0.612867/0.6, Gradient 0.277627/1e-006
TRAINGDX, Epoch 76/300, SSE 0.599021/0.6, Gradient 0.247759/1e-006
TRAINGDX, Performance goal met.
Pass = 10
TRAINGDX, Epoch 0/300, SSE 3.92135/0.6, Gradient 4.29976/1e-006
TRAINGDX, Epoch 25/300, SSE 2.54259/0.6, Gradient 1.46508/1e-006
TRAINGDX, Epoch 50/300, SSE 1.95622/0.6, Gradient 0.593645/1e-006
TRAINGDX, Epoch 75/300, SSE 1.54477/0.6, Gradient 0.266081/1e-006
TRAINGDX, Epoch 100/300, SSE 1.32874/0.6, Gradient 0.103221/1e-006
TRAINGDX, Epoch 125/300, SSE 1.18632/0.6, Gradient 0.0412091/1e-006
TRAINGDX, Epoch 150/300, SSE 1.00956/0.6, Gradient 1.05356/1e-006
TRAINGDX, Epoch 160/300, SSE 0.348182/0.6, Gradient 0.958822/1e-006
TRAINGDX, Performance goal met.
TRAINGDX, Epoch 0/500, SSE 0.0340377/0.1, Gradient 0.0559476/1e-006
TRAINGDX, Performance goal met.
Testing networks with noise level of 0.00.
Testing networks with noise level of 0.05.
Testing networks with noise level of 0.10.
Testing networks with noise level of 0.15.
Testing networks with noise level of 0.20.
Testing networks with noise level of 0.25.
Testing networks with noise level of 0.30.
```

```
Testing networks with noise level of 0.35.
Testing networks with noise level of 0.40.
Testing networks with noise level of 0.45.
Testing networks with noise level of 0.50.
```

Finally, a single noisy letter G is recognized with the noise-trained network:

```matlab
>> noisyG = alphabet(:,7) + randn(35,1)*0.2;  % the letter G with added noise
figure; plotchar(noisyG);                     % show the noisy input
G = sim(netn,noisyG);                         % simulate the noise-trained network
G = compet(G);                                % winner-take-all
answer = find(G);                             % index of the winning output neuron
figure; plotchar(alphabet(:,answer));         % display the recognized letter
```
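The winner-take-all decoding that `compet` performs, the reason `sum(sum(abs(AA-T)))/2` counts misclassified letters, and the final index-to-letter lookup can all be illustrated with a small NumPy sketch (an illustrative stand-in, not the toolbox implementation):

```python
import numpy as np

def compet(A):
    """Winner-take-all, like MATLAB's compet: put a 1 at each
    column's maximum and zeros elsewhere."""
    out = np.zeros_like(A)
    out[np.argmax(A, axis=0), np.arange(A.shape[1])] = 1.0
    return out

# Toy outputs of a 4-class network for 3 inputs (columns are samples)
A = np.array([[0.90, 0.10, 0.20],
              [0.05, 0.70, 0.10],
              [0.03, 0.10, 0.60],
              [0.02, 0.10, 0.10]])
T = np.eye(4)[:, [0, 1, 3]]          # true classes: 0, 1, 3

AA = compet(A)
# Each wrong column contributes |1-0| + |0-1| = 2, hence the division by 2
errors = np.abs(AA - T).sum() / 2
print(errors)                         # 1.0: the third sample is misclassified

# find(compet(G)) in the script maps the winning neuron to a letter;
# with 26 outputs, the 0-based index i corresponds to chr(ord('A') + i)
winner = int(np.argmax(AA[:, 0]))
print(chr(ord('A') + winner))         # A
```

Dividing the accumulated error by 26 letters and 100 trials, as the script does, then gives the per-letter misclassification rate at each noise level.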


