
An engineering example of MATLAB neural networks (super detailed)

(2016-06-02 16:41:05)

This post presents an example of applying a neural-network algorithm to the optimization of a mechanical structure.

(To apply this to your own problem, just replace the input and output variables with your own data. If you still have questions after reading, leave me a message on my Weibo, "极南师兄", and we can improve together.)


http://s5/small/002Wb5Sdgy72ao4eWuEa4&690

Eight dimensional parameters of a structure are taken as the design variables, as shown in the figure above, and the corresponding mass, temperature difference, and area are the outputs. A neural network is used to fit the mathematical model between the variables and the outputs. First we need a data source: here I construct the design points with the central composite design (CCD) method, which for eight variables yields 81 design points. The 81 simulations are then run in ANSYS Workbench (build the model in Pro/E and define the variables, import it into Workbench and configure it accordingly, and the 81 runs complete automatically; finally export the results to an Excel file).
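The 81-point count follows from standard CCD bookkeeping: a 2^(8-2) = 64-point fractional-factorial core, 2×8 = 16 axial points, and 1 center point. A quick check in Python (this is only the point count, not the exact point set Workbench generates):

```python
# Central composite design (CCD) point count for k factors with a
# 2^(k - p) fractional-factorial core: core + 2*k axial points + 1 center.
def ccd_point_count(k, p=0):
    factorial = 2 ** (k - p)   # corner points of the (fractional) factorial core
    axial = 2 * k              # one low and one high axial point per factor
    center = 1                 # single center point
    return factorial + axial + center

# 8 design variables with a quarter-fraction core -> 81 points,
# matching the 81 ANSYS Workbench simulations in the text.
print(ccd_point_count(8, p=2))  # -> 81
```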

 

The MATLAB program is as follows.

 

% The first 17 rows are the center and axial points of the CCD. Some single-digit
% entries were dropped when the program was pasted into the blog; they are restored
% here from the one-factor-at-a-time pattern (the 6s and 2s), and the two axial
% levels of variable 2 (1 and 4) are inferred from the design's bound ratio.
P=[20    2.5   6     14.9  16.5  6     14.9  16.5

15    2.5   6     14.9  16.5  6     14.9  16.5

25    2.5   6     14.9  16.5  6     14.9  16.5

20    1     6     14.9  16.5  6     14.9  16.5

20    4     6     14.9  16.5  6     14.9  16.5

20    2.5   2     14.9  16.5  6     14.9  16.5

20    2.5   10    14.9  16.5  6     14.9  16.5

20    2.5   6     10    16.5  6     14.9  16.5

20    2.5   6     19.8  16.5  6     14.9  16.5

20    2.5   6     14.9  10    6     14.9  16.5

20    2.5   6     14.9  23    6     14.9  16.5

20    2.5   6     14.9  16.5  2     14.9  16.5

20    2.5   6     14.9  16.5  10    14.9  16.5

20    2.5   6     14.9  16.5  6     10    16.5

20    2.5   6     14.9  16.5  6     19.8  16.5

20    2.5   6     14.9  16.5  6     14.9  10

20    2.5   6     14.9  16.5  6     14.9  23

17.51238947   1.75371684      4.009911573   12.46214168   13.26610631   4.009911573   12.46214168   19.73389369

22.48761053   1.75371684      4.009911573   12.46214168   13.26610631   4.009911573   12.46214168   13.26610631

17.51238947   3.24628316      4.009911573   12.46214168   13.26610631   4.009911573   17.33785832   19.73389369

22.48761053   3.24628316      4.009911573   12.46214168   13.26610631   4.009911573   17.33785832   13.26610631

17.51238947   1.75371684      7.990088427   12.46214168   13.26610631   4.009911573   17.33785832   19.73389369

22.48761053   1.75371684      7.990088427   12.46214168   13.26610631   4.009911573   17.33785832   13.26610631

17.51238947   3.24628316      7.990088427   12.46214168   13.26610631   4.009911573   12.46214168   19.73389369

22.48761053   3.24628316      7.990088427   12.46214168   13.26610631   4.009911573   12.46214168   13.26610631

17.51238947   1.75371684      4.009911573   17.33785832   13.26610631   4.009911573   17.33785832   13.26610631

22.48761053   1.75371684      4.009911573   17.33785832   13.26610631   4.009911573   17.33785832   19.73389369

17.51238947   3.24628316      4.009911573   17.33785832   13.26610631   4.009911573   12.46214168   13.26610631

22.48761053   3.24628316      4.009911573   17.33785832   13.26610631   4.009911573   12.46214168   19.73389369

17.51238947   1.75371684      7.990088427   17.33785832   13.26610631   4.009911573   12.46214168   13.26610631

22.48761053   1.75371684      7.990088427   17.33785832   13.26610631   4.009911573   12.46214168   19.73389369

17.51238947   3.24628316      7.990088427   17.33785832   13.26610631   4.009911573   17.33785832   13.26610631

22.48761053   3.24628316      7.990088427   17.33785832   13.26610631   4.009911573   17.33785832   19.73389369

17.51238947   1.75371684      4.009911573   12.46214168   19.73389369   4.009911573   17.33785832   13.26610631

22.48761053   1.75371684      4.009911573   12.46214168   19.73389369   4.009911573   17.33785832   19.73389369

17.51238947   3.24628316      4.009911573   12.46214168   19.73389369   4.009911573   12.46214168   13.26610631

22.48761053   3.24628316      4.009911573   12.46214168   19.73389369   4.009911573   12.46214168   19.73389369

17.51238947   1.75371684      7.990088427   12.46214168   19.73389369   4.009911573   12.46214168   13.26610631

22.48761053   1.75371684      7.990088427   12.46214168   19.73389369   4.009911573   12.46214168   19.73389369

17.51238947   3.24628316      7.990088427   12.46214168   19.73389369   4.009911573   17.33785832   13.26610631

22.48761053   3.24628316      7.990088427   12.46214168   19.73389369   4.009911573   17.33785832   19.73389369

17.51238947   1.75371684      4.009911573   17.33785832   19.73389369   4.009911573   12.46214168   19.73389369

22.48761053   1.75371684      4.009911573   17.33785832   19.73389369   4.009911573   12.46214168   13.26610631

17.51238947   3.24628316      4.009911573   17.33785832   19.73389369   4.009911573   17.33785832   19.73389369

22.48761053   3.24628316      4.009911573   17.33785832   19.73389369   4.009911573   17.33785832   13.26610631

17.51238947   1.75371684      7.990088427   17.33785832   19.73389369   4.009911573   17.33785832   19.73389369

22.48761053   1.75371684      7.990088427   17.33785832   19.73389369   4.009911573   17.33785832   13.26610631

17.51238947   3.24628316      7.990088427   17.33785832   19.73389369   4.009911573   12.46214168   19.73389369

22.48761053   3.24628316      7.990088427   17.33785832   19.73389369   4.009911573   12.46214168   13.26610631

17.51238947   1.75371684      4.009911573   12.46214168   13.26610631   7.990088427   17.33785832   13.26610631

22.48761053   1.75371684      4.009911573   12.46214168   13.26610631   7.990088427   17.33785832   19.73389369

17.51238947   3.24628316      4.009911573   12.46214168   13.26610631   7.990088427   12.46214168   13.26610631

22.48761053   3.24628316      4.009911573   12.46214168   13.26610631   7.990088427   12.46214168   19.73389369

17.51238947   1.75371684      7.990088427   12.46214168   13.26610631   7.990088427   12.46214168   13.26610631

22.48761053   1.75371684      7.990088427   12.46214168   13.26610631   7.990088427   12.46214168   19.73389369

17.51238947   3.24628316      7.990088427   12.46214168   13.26610631   7.990088427   17.33785832   13.26610631

22.48761053   3.24628316      7.990088427   12.46214168   13.26610631   7.990088427   17.33785832   19.73389369

17.51238947   1.75371684      4.009911573   17.33785832   13.26610631   7.990088427   12.46214168   19.73389369

22.48761053   1.75371684      4.009911573   17.33785832   13.26610631   7.990088427   12.46214168   13.26610631

17.51238947   3.24628316      4.009911573   17.33785832   13.26610631   7.990088427   17.33785832   19.73389369

22.48761053   3.24628316      4.009911573   17.33785832   13.26610631   7.990088427   17.33785832   13.26610631

17.51238947   1.75371684      7.990088427   17.33785832   13.26610631   7.990088427   17.33785832   19.73389369

22.48761053   1.75371684      7.990088427   17.33785832   13.26610631   7.990088427   17.33785832   13.26610631

17.51238947   3.24628316      7.990088427   17.33785832   13.26610631   7.990088427   12.46214168   19.73389369

22.48761053   3.24628316      7.990088427   17.33785832   13.26610631   7.990088427   12.46214168   13.26610631

17.51238947   1.75371684      4.009911573   12.46214168   19.73389369   7.990088427   12.46214168   19.73389369

22.48761053   1.75371684      4.009911573   12.46214168   19.73389369   7.990088427   12.46214168   13.26610631

17.51238947   3.24628316      4.009911573   12.46214168   19.73389369   7.990088427   17.33785832   19.73389369

22.48761053   3.24628316      4.009911573   12.46214168   19.73389369   7.990088427   17.33785832   13.26610631

17.51238947   1.75371684      7.990088427   12.46214168   19.73389369   7.990088427   17.33785832   19.73389369

22.48761053   1.75371684      7.990088427   12.46214168   19.73389369   7.990088427   17.33785832   13.26610631

17.51238947   3.24628316      7.990088427   12.46214168   19.73389369   7.990088427   12.46214168   19.73389369

22.48761053   3.24628316      7.990088427   12.46214168   19.73389369   7.990088427   12.46214168   13.26610631

17.51238947   1.75371684      4.009911573   17.33785832   19.73389369   7.990088427   17.33785832   13.26610631

22.48761053   1.75371684      4.009911573   17.33785832   19.73389369   7.990088427   17.33785832   19.73389369

17.51238947   3.24628316      4.009911573   17.33785832   19.73389369   7.990088427   12.46214168   13.26610631

22.48761053   3.24628316      4.009911573   17.33785832   19.73389369   7.990088427   12.46214168   19.73389369

17.51238947   1.75371684      7.990088427   17.33785832   19.73389369   7.990088427   12.46214168   13.26610631

22.48761053   1.75371684      7.990088427   17.33785832   19.73389369   7.990088427   12.46214168   19.73389369

17.51238947   3.24628316      7.990088427   17.33785832   19.73389369   7.990088427   17.33785832   13.26610631

22.48761053   3.24628316      7.990088427   17.33785832   19.73389369   7.990088427   17.33785832   19.73389369

]'; % Note the transpose: since I ran 81 simulation trials, the input P fed to the neural network is an 8x81 matrix (the formatting was lost when the program was pasted in, so realign it yourself), and the corresponding output T below is a 3x81 matrix.

T=[150.749       2.28499   13.466

165.148   2.64021   9.6525

138.061   1.92976   17.2795

149.446   2.25704   13.766

151.642   2.31293   13.166

147.146   2.22947   14.062

154.131   2.3405      12.87

144.164   2.2576      13.76

155.889   2.31237   13.172

150.646   2.28499   13.466

150.621   2.28499   13.466

147.091   2.22947   14.062

154.166   2.3405      12.87

144.289   2.2576      13.76

155.553   2.31237   13.172

150.653   2.28499   13.466

150.704   2.28499   13.466

148.424   2.37609   12.4879

134.952   2.01917   16.3197

154.264   2.41865   12.0311

141.207   2.06864   15.7885

156.492   2.44051   11.7964

142.671   2.08358   15.6282

152.473   2.44664   11.7306

138.329   2.09663   15.488

159.696   2.41252   12.0969

145.947   2.05559   15.9287

155.401   2.41865   12.0311

141.73      2.06864   15.7885

157.408   2.45858   11.6024

144.1        2.10166   15.4341

163.483   2.50114   11.1455

150.483   2.15114   14.9029

154.111   2.3943      12.2924

140.418   2.03738   16.1242

149.253   2.40044   12.2266

135.997   2.05043   15.984

151.518   2.4223      11.9919

137.257   2.06537   15.8237

158.05      2.46485   11.535

143.739   2.11485   15.2925

153.641   2.3943      12.2924

140.723   2.03738   16.1242

158.956   2.43686   11.8355

146.933   2.08685   15.593

160.731   2.4768      11.4068

149.315   2.11987   15.2386

156.842   2.48293   11.341

145.17      2.13292   15.0984

156.942   2.45858   11.6024

143.948   2.10166   15.4341

152.503   2.44664   11.7306

138.486   2.09663   15.488

154.84      2.4685      11.4959

139.795   2.11157   15.3276

161.574   2.52914   10.845

147.502   2.17913   14.6024

156.975   2.44051   11.7964

143.06      2.08358   15.6282

162.688   2.50114   11.1455

150.483   2.15114   14.9029

164.588   2.54108   10.7168

153.024   2.18415   14.5485

160.908   2.52914   10.845

147.794   2.17913   14.6024

151.437   2.4223      11.9919

137.386   2.06537   15.8237

156.979   2.48293   11.341

144.915   2.13292   15.0984

159.167   2.50479   11.1063

146.229   2.14786   14.9381

155.699   2.49285   11.2345

140.767   2.14284   14.992

161.782   2.4768      11.4068

149.124   2.11987   15.2386

157.819   2.46485   11.535

143.8        2.11485   15.2925

159.553   2.50479   11.1063

146.186   2.14786   14.9381

166.512   2.56542   10.4554

153.896   2.21542   14.2129

]'; % T is the target matrix

[PP,ps]=mapminmax(P,-1,1); % normalize P into PP, with each row scaled to (-1,1)

% Normalize T into TT over (-1,1). Normalization mainly removes the effect of the
% variables' different scales/units on the result. Note that this second call
% overwrites ps, so the stored settings are T's — which is exactly what the
% 'reverse' step at the end needs.
[TT,ps]=mapminmax(T,-1,1);
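mapminmax rescales each row linearly to [-1, 1] and records the settings needed to undo the scaling. An equivalent NumPy sketch (hypothetical helper names, not the toolbox implementation):

```python
import numpy as np

def map_minmax(x, ymin=-1.0, ymax=1.0):
    """Rescale each row of x linearly to [ymin, ymax], like MATLAB's mapminmax."""
    xmin = x.min(axis=1, keepdims=True)
    xmax = x.max(axis=1, keepdims=True)
    y = (ymax - ymin) * (x - xmin) / (xmax - xmin) + ymin
    return y, (xmin, xmax, ymin, ymax)   # the settings tuple plays the role of 'ps'

def map_minmax_reverse(y, settings):
    """Undo the scaling, like mapminmax('reverse', y, ps)."""
    xmin, xmax, ymin, ymax = settings
    return (y - ymin) / (ymax - ymin) * (xmax - xmin) + xmin

P = np.array([[10.0, 20.0, 30.0],
              [1.0, 2.0, 3.0]])
PP, ps = map_minmax(P)
print(PP)                                          # each row now spans [-1, 1]
print(np.allclose(map_minmax_reverse(PP, ps), P))  # True: scaling is invertible
```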

% Create a three-layer feed-forward network: 15 hidden neurons, 3 output neurons

net=newff(minmax(PP),[15,3],{'tansig','purelin'},'traingdm')

% ---------------------------------------------------------------
% Training function: traingdm — updates the network's weights and biases
% using gradient-descent backpropagation with momentum.
% Its main training parameters are:
% epochs:   number of training epochs, default 100
% goal:     performance (error) goal, default 0
% lr:       learning rate, default 0.01
% max_fail: maximum number of validation failures, default 5
% mc:       momentum constant, default 0.9
% min_grad: minimum gradient, default 1e-10
% show:     epochs between status displays, default 25
% time:     maximum training time, default inf
% ---------------------------------------------------------------
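The momentum rule behind traingdm combines the plain gradient step (scaled by lr) with a fraction mc of the previous weight change, which damps oscillation. A one-dimensional illustrative sketch (not the toolbox code):

```python
# Gradient descent with momentum:
#   dW_new = mc * dW_old - lr * gradient;   W = W + dW_new
def gdm_step(w, dw_prev, grad, lr=0.05, mc=0.9):
    dw = mc * dw_prev - lr * grad
    return w + dw, dw

# Minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
w, dw = 10.0, 0.0
for _ in range(200):
    w, dw = gdm_step(w, dw, 2.0 * (w - 3.0))
print(w)  # converges toward the minimum at w = 3
```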

inputWeights=net.IW{1,1}  % current input-layer weights
inputbias=net.b{1}        % current input-layer biases

% current network-layer (hidden-to-output) weights and biases
layerWeights=net.LW{2,1} 
layerbias=net.b{2} 
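These four arrays are all of the network's parameters: the output is purelin(LW·tansig(IW·p + b{1}) + b{2}). A NumPy sketch of that forward pass, using made-up toy dimensions (the article's trained net is 8-15-3):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 4, 2              # toy sizes; the article's net is 8-15-3
IW = rng.standard_normal((n_hid, n_in))   # input-layer weights,  net.IW{1,1}
b1 = rng.standard_normal((n_hid, 1))      # hidden biases,        net.b{1}
LW = rng.standard_normal((n_out, n_hid))  # layer weights,        net.LW{2,1}
b2 = rng.standard_normal((n_out, 1))      # output biases,        net.b{2}

def forward(p):
    """Forward pass of a tansig/purelin feed-forward net, like sim(net, p)."""
    a1 = np.tanh(IW @ p + b1)   # 'tansig' is exactly tanh
    return LW @ a1 + b2         # 'purelin' is the identity (linear output layer)

p = rng.standard_normal((n_in, 5))   # 5 sample columns, like columns of PP
print(forward(p).shape)              # (2, 5): one output column per input column
```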

% Set the network's training parameters

net.trainParam.show = 2; 

net.trainParam.lr = 0.05; 

net.trainParam.mc = 0.9; 

net.trainParam.epochs =10000; 

net.trainParam.goal = 1e-3;  

% Train the BP network with the TRAINGDM algorithm (described above where net was created)

[net,tr]=train(net,PP,TT); 

A = sim(net,PP) ; % simulate the trained BP network

A=mapminmax('reverse',A,ps) ; % de-normalize A back to the original output scale

% Compute the simulation error

E = T - A 

MSE=mse(E) 

echo off

 

Running the code above produces the results shown in the figure.

http://s5/small/002Wb5Sdgy72aob8LLC34&690


If the outputs matched the targets exactly, R would equal 1; here it is very close to 1, so the fit is quite good. The plot on the right shows how the sum-of-squared error falls during training; training stops once it reaches the specified goal of 0.001.
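The R reported by the regression plot is the linear correlation coefficient between network outputs and targets. A sketch of computing it (assumption: values are flattened over all outputs, as in the "overall" regression panel; the output values below are hypothetical):

```python
import numpy as np

def regression_r(targets, outputs):
    """Pearson correlation between targets and outputs; R = 1 means a perfect fit."""
    t, a = np.ravel(targets), np.ravel(outputs)
    return np.corrcoef(t, a)[0, 1]

T = np.array([150.749, 165.148, 138.061, 149.446])  # a few target values from the article
A = T + np.array([0.2, -0.3, 0.1, -0.1])            # hypothetical network outputs
print(regression_r(T, A))  # close to 1 for a good fit
```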
