
Neural network train matlab parfor problems

I am trying to figure out where the error is. I would be really glad if you could help me.

Here is my problem:

When trained serially, the `train` function from the Neural Network Toolbox behaves one way, but when I put it inside a parfor loop everything goes crazy.

>> version

ans =

8.3.0.532 (R2014a)

Here is the function:

function per = neuralTr(tSet,Y,CrossVal,Ycv)

hiddenLayerSize = 94;
redeT = patternnet(hiddenLayerSize);
redeT.input.processFcns = {'removeconstantrows','mapminmax'};
redeT.output.processFcns = {'removeconstantrows','mapminmax'};
redeT.divideFcn = 'dividerand';  % Divide data randomly
redeT.divideMode = 'sample';  % Divide up every sample
redeT.divideParam.trainRatio = 80/100;
redeT.divideParam.valRatio = 10/100;
redeT.divideParam.testRatio = 10/100;
redeT.trainFcn = 'trainscg';  % Scaled conjugate gradient
redeT.performFcn = 'crossentropy';  % Cross-entropy
redeT.trainParam.showWindow = 0;  % default is 1
redeT = train(redeT,tSet,Y);    
outputs = sim(redeT,CrossVal);
per = perform(redeT,Ycv,outputs);

end

And here is the code I typed:

Data loaded in workspace
whos
        Name            Size              Bytes  Class     Attributes

        CrossVal      282x157            354192  double
        Y               2x363              5808  double
        Ycv             2x157              2512  double
        per             1x1                   8  double
        tSet          282x363            818928  double

Executing the function serially:

per = neuralTr(tSet,Y,CrossVal,Ycv)

        per =

        0.90

Starting the parallel pool:

>> parpool local
Starting parallel pool (parpool) using the 'local' profile ... connected to 12 workers.

ans = 

 Pool with properties: 

            Connected: true
           NumWorkers: 12
              Cluster: local
        AttachedFiles: {}
          IdleTimeout: Inf (no automatic shut down)
          SpmdEnabled: true

Initializing and executing the function 12 times in parallel:

per = cell(12,1);
parfor ii = 1 : 12
per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end        

per

per = 

    [0.96]
    [0.83]
    [0.92]
    [1.08]
    [0.85]
    [0.89]
    [1.06]
    [0.83]
    [0.90]
    [0.93]
    [0.95]
    [0.81]

Executing again to see whether random initialization yields different values:

per = cell(12,1);
parfor ii = 1 : 12
per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
per

per = 

    [0.96]
    [0.83]
    [0.92]
    [1.08]
    [0.85]
    [0.89]
    [1.06]
    [0.83]
    [0.90]
    [0.93]
    [0.95]
    [0.81]
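The spread across the twelve cells is expected: each parfor iteration runs on one of the pool workers, and every worker draws from its own independent random stream, so the random weight initialization and the `'dividerand'` split differ from iteration to iteration even though the inputs are identical. A toolbox-free sketch of the same effect:

```matlab
% Each pool worker owns an independent random stream, so random draws
% differ across parfor iterations even with identical inputs.
r = zeros(12,1);
parfor ii = 1:12
    r(ii) = rand;  % drawn from the stream of whichever worker runs iteration ii
end
disp(r)  % twelve generally distinct values
```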

EDIT 1: with a plain serial for loop:

per = cell(12,1);
for ii = 1 : 12
    per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
    per

    per =

    [0.90]
    [0.90]
    [0.90]
    [0.90]
    [0.90]
    [0.90]
    [0.90]
    [0.90]
    [0.90]
    [0.90]
    [0.90]
    [0.90]

EDIT 2: I modified my function and now everything works fine. Maybe the problem lies in splitting the data in parallel, so I split the data first and then send it to the workers. Thanks a lot!

function per = neuralTr(tSet,Y,CrossVal,Ycv)
indt = 1:round(size(tSet,2) * 0.8) ;
indv = round(size(tSet,2) * 0.8):round(size(tSet,2) * 0.9);
indte = round(size(tSet,2) * 0.9):size(tSet,2);
hiddenLayerSize = 94;
redeT = patternnet(hiddenLayerSize);
redeT.input.processFcns = {'removeconstantrows','mapminmax'};
redeT.output.processFcns = {'removeconstantrows','mapminmax'};
redeT.divideFcn = 'dividerand';  % Divide data randomly
redeT.divideMode = 'sample';  % Divide up every sample
redeT.divideParam.trainRatio = 80/100;
redeT.divideParam.valRatio =  10/100;
redeT.divideParam.testRatio = 10/100;

redeT.trainFcn = 'trainscg';  % Scaled conjugate gradient
redeT.performFcn = 'crossentropy';  % Cross-entropy
redeT.trainParam.showWindow = 0;  % default is 1
redeT = train(redeT,tSet,Y);    
outputs = sim(redeT,CrossVal);
per = zeros(12,1);
parfor ii = 1 : 12
    redes = train(redeT,tSet,Y);
    per(ii) = perform(redes,Ycv,outputs);
end
end
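One detail worth noting in this version as posted: `outputs` is computed once from the first trained network, and the loop reuses it for every retrained `redes`, which is consistent with all twelve performance values coming out identical. If the intent is to score each retrained network on its own predictions, a hedged sketch (an assumption about the intent, not tested on this toolbox version) would be:

```matlab
per = zeros(12,1);
parfor ii = 1:12
    redes = train(redeT,tSet,Y);     % retrain starting from the same net
    outs  = sim(redes,CrossVal);     % predictions of this retrained network
    per(ii) = perform(redes,Ycv,outs);
end
```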

Result:

>> per = neuralTr(tSet,Y,CrossVal,Ycv)

per =

          0.90
          0.90
          0.90
          0.90
          0.90
          0.90
          0.90
          0.90
          0.90
          0.90
          0.90
          0.90

Oh! I think I found it, but I cannot test it.

In your code you have:

redeT.divideFcn = 'dividerand';  % Divide data randomly

If every worker picks the data randomly, wouldn't you expect them to get different results?

Try the following:

per = cell(12,1);
parfor ii = 1 : 12
   rng(1); % set the seed for random number generation, so every time the number generated will be the same
   per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
per

I don't know whether neuralTr actually sets the seed internally as well, but you can give it a try.
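For completeness, the seeding could also live inside the function itself, so both serial and parfor calls start from the same generator state before `'dividerand'` and the weight initialization run. A sketch, where the seed value and its placement are assumptions:

```matlab
function per = neuralTr(tSet,Y,CrossVal,Ycv)
rng(1);  % assumed fix: reset the RNG before the data split and weight init
hiddenLayerSize = 94;
redeT = patternnet(hiddenLayerSize);
% ... remaining setup as in the original function ...
redeT = train(redeT,tSet,Y);
outputs = sim(redeT,CrossVal);
per = perform(redeT,Ycv,outputs);
end
```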

