
Optimize parameter in nnet

I use the following R code to train my data

    > mygrid = expand.grid(.decay = seq(0.01, 0.1, 0.01), .size = c(10:20))
    > set.seed(25000)
    > nnetfit = train(logprice ~ ., 
                      data = traindata, 
                      method = "nnet",
                      maxit = 5000,    # max number of iterations
                      linout = TRUE,   # FALSE would give logistic output
                      tuneGrid = mygrid,
                      trace = FALSE)

The result is as follows:

    > print(nnetfit)
    
    Neural Network 
    
    10639 samples
       12 predictor
    
    No pre-processing
    Resampling: Bootstrapped (25 reps) 
    Summary of sample sizes: 10639, 10639, 10639, 10639, 10639, 10639, ... 
    Resampling results across tuning parameters:
    
      decay  size  RMSE       Rsquared   MAE      
      0.01   10    0.3026250  0.6158591  0.2379521
    ...

My question is: what do these two rows mean?

    Resampling: Bootstrapped (25 reps) 
    Summary of sample sizes: 10639, 10639, 10639, 10639, 10639, 10639, ... 

I set up 10 × 11 = 110 combinations of decay and size, so the "25 reps" is confusing.

To get a more realistic estimate of RMSE and Rsquared, the train function (by default) uses a bootstrap resampling approach with 25 repetitions. This is separate from the tuning grid: all 110 decay/size combinations are evaluated, and each one is assessed over the 25 bootstrap resamples.

For each repetition, your observations are randomly resampled with replacement (so duplicate observations can occur, and that's OK); that is why each "sample size" reported is 10639, the full size of your training set. An nnet is fit only on that bootstrap sample. Whatever observations are left out of each resample are used to assess how well the nnet predicts logprice. The RMSE and Rsquared you see are the average values across all 25 resamples.
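To see what one such repetition looks like, here is a minimal base-R sketch (illustrative only; caret does this internally, and the variable names here are made up):

    # One bootstrap repetition: resample n rows with replacement,
    # then treat the rows that were never drawn as the hold-out set.
    set.seed(1)
    n <- 10639                                 # rows in traindata
    in_bag  <- sample(n, n, replace = TRUE)    # duplicates allowed
    out_bag <- setdiff(seq_len(n), in_bag)     # "out-of-bag" rows, used for RMSE/Rsquared
    
    length(in_bag)          # always 10639 -- the "sample size" caret prints
    length(unique(in_bag))  # roughly 63% of n; the remainder is out-of-bag

Because sampling is with replacement, each resample contains only about 63% of the distinct observations on average, leaving the rest available for evaluation.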

To change the number of bootstrap samples, use trainControl in the caret package ( https://topepo.github.io/caret/model-training-and-tuning.html#control ), e.g. to do bootstrap resampling with 50 samples:

    tc <- trainControl(method = "boot", number = 50)
    nnetfit = train(logprice ~ ., 
        ... all your other hyperparameters ...
        trControl = tc)
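If you would rather avoid bootstrapping altogether, trainControl also supports other resampling schemes documented in ?trainControl, such as k-fold cross-validation. A sketch (the tc_cv name is made up; the train call mirrors your original one):

    # 10-fold cross-validation instead of 25 bootstrap resamples
    library(caret)
    
    tc_cv <- trainControl(method = "cv", number = 10)
    nnetfit_cv <- train(logprice ~ ., data = traindata, method = "nnet",
                        maxit = 5000, linout = TRUE, tuneGrid = mygrid,
                        trace = FALSE, trControl = tc_cv)

With "cv", each observation is held out exactly once, which is often faster than 25 bootstrap repetitions for a grid this large.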
