I get an error when running this simple code. Thanks.

Here is my code:

input_training_data = HandWritingData(random_list,:)'; %size = [256x788]
target_training_data = Category(random_list,:)'; %size = [10x788]
net = newff(minmax(input_training_data), [15 10], {'logsig','logsig'} , 'hardlim' );



Error in ==> Neural_Network_code at 19
net = newff(minmax(input_training_data), [15 10],
{'logsig','logsig'} , 'hardlim' ); %create a network

Best answer

For several reasons you don't want to use newff anyway; RTFM:

newff creates a feed-forward backpropagation network.

Obsoleted in R2010b NNET 7.0. Last used in R2010a NNET 6.0.4.
The recommended function is feedforwardnet.

Syntax

 net = newff(P,T,S)
 net = newff(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF)


Description

 newff(P,T,S) takes,
   P  - RxQ1 matrix of Q1 representative R-element input vectors.
   T  - SNxQ2 matrix of Q2 representative SN-element target vectors.
   Si  - Sizes of N-1 hidden layers, S1 to S(N-1), default = [].
         (Output layer size SN is determined from T.)
 and returns an N layer feed-forward backprop network.

 newff(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF) takes optional inputs,
   TFi - Transfer function of ith layer. Default is 'tansig' for
         hidden layers, and 'purelin' for output layer.
   BTF - Backprop network training function, default = 'trainlm'.
   BLF - Backprop weight/bias learning function, default = 'learngdm'.
   PF  - Performance function, default = 'mse'.
   IPF - Row cell array of input processing functions.
         Default is {'fixunknowns','remconstantrows','mapminmax'}.
   OPF - Row cell array of output processing functions.
         Default is {'remconstantrows','mapminmax'}.
   DDF - Data division function, default = 'dividerand'.
 and returns an N layer feed-forward backprop network.


hardlim is a transfer function, not a training function, so it cannot be passed as the fourth argument (BTF). Use a training function such as trainlm (or trainbfg) there instead:

net = newff(minmax(input_training_data), [15 10], {'logsig','logsig'} , 'trainlm' );
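Since newff is obsolete, the same network can be sketched with the recommended feedforwardnet. This is a minimal sketch, assuming the Neural Network Toolbox is available and reusing the variable names from the question; note that in the old newff calling convention [15 10] meant a 15-neuron hidden layer plus a 10-neuron output layer, and with feedforwardnet the output layer size is inferred from the targets:

```matlab
% One hidden layer of 15 neurons, trained with Levenberg-Marquardt;
% the 10-neuron output layer size is taken from target_training_data.
net = feedforwardnet(15, 'trainlm');
net.layers{1}.transferFcn = 'logsig';  % hidden layer
net.layers{2}.transferFcn = 'logsig';  % output layer
net = train(net, input_training_data, target_training_data);
outputs = net(input_training_data);    % network predictions
```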
