
Kernel restarting in Jupyter Notebook

I have run into an issue while developing the following code:

import nlopt
import numpy as np
import time

def Rosenbrock(x):
    N = len(x)
    x1 = x[0:N-1]
    x2 = x[1:N]
    return sum(100*np.square(x2 - np.square(x1)) + np.square(np.ones(N-1) - x1))

def myfunc1(x, grad):
    if grad.size > 0:
        grad[:] = NULL
    return Rosenbrock(x)

def myfunc2(x, grad):
    if grad.size > 0:
        grad[:] = Rosen_grad(x)
    return Rosenbrock(x)

names = ["LN_SBPLX", "LN_NELDERMEAD", "LN_PRAXIS", "LN_NEWUOA", "LN_BOBYQA", "LN_COBYLA", "LD_MMA", "LD_LBFGS"]

j = 2
for i in range(len(names)):
    ini = time.time()
    print('entra en el primer loop')
    while time.time() - ini < 180:
        x0 = np.repeat(0, j)
        print(names[i])
        a = 'nlopt.' + names[i]
        opt = nlopt.opt(a, len(x0))
        print(a)
        if i == "LD_MMA" or i == "LD_LBFGS":  # Depending on the method we have to change the function to optimize
            opt.set_min_objective(myfunc2)
        else:
            opt.set_min_objective(myfunc1)
        opt.set_lower_bounds(np.repeat(-10, len(x0)))
        opt.set_upper_bounds(np.repeat(10, len(x0)))
        opt.set_xtol_rel(0)
        opt.set_stopval(1e-8)
        start = time.time()
        x = opt.optimize(x0)
        end = time.time()
        with open('results' + i, 'w') as f:
            f.write([i, end-start, opt.last_optimize_result()])
        j += 1

As you can see, I'm using nlopt to run several optimizations of the Rosenbrock function and then save each case to a different file. When I run this code in Jupyter I get an error message, as shown in the image. (Error message in Jupyter)

I'm not sure if the problem is in the loop that calls nlopt.opt() or just a compatibility problem with the environment.

Thanks for the help :)

Your assumption is correct. The list names must be defined differently:

names = [nlopt.LN_NELDERMEAD, nlopt.LN_NEWUOA, nlopt.LN_BOBYQA, nlopt.LN_COBYLA]

For testing I reduced the number of algorithms and the dimension of the function to 2.

j = 2

with open('results_dim_' + str(j) + '.txt', 'w') as f:
    for i in range(len(names)):
        ini = time.time()
        print('entra en el primer loop')
        #while time.time()-ini < 180:
        x0 = np.repeat(0, j)
        print(names[i])
        opt = nlopt.opt(names[i], len(x0))
        if names[i] == nlopt.LD_MMA or names[i] == nlopt.LD_LBFGS:  # Depending on the method we have to change the function to optimize
            opt.set_min_objective(myfunc2)
        else:
            opt.set_min_objective(myfunc1)
        opt.set_lower_bounds(np.repeat(-10, len(x0)))
        opt.set_upper_bounds(np.repeat(10, len(x0)))
        opt.set_xtol_rel(1e-8)
        opt.set_stopval(0)
        start = time.time()
        x = opt.optimize(x0)
        end = time.time()
        f.write(str(names[i]) + '\t' + str(end-start) + '\t' + str(x) + '\t' + str(opt.last_optimum_value()) + '\n')

I created one file per dimension containing the results of all algorithms. The entries of the algorithm list are named integer constants, so it's not possible to print their names directly. You can use opt.get_algorithm_name() to get a string description, but it's not the enumeration name; for LN_NELDERMEAD you get 'Nelder-Mead simplex algorithm (local, no-derivative)'. I also corrected the if statement for the LD algorithms that need a gradient, but I could not test it because you did not provide the gradient function Rosen_grad(x).
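For completeness, the gradient of this Rosenbrock sum has a simple closed form. Here is a sketch of what Rosen_grad(x) could look like (an assumption — the asker's version was not shown), which myfunc2 could hand to the gradient-based LD_MMA and LD_LBFGS runs:

```python
import numpy as np

def Rosenbrock(x):
    # Same objective as in the question, written with numpy slices.
    N = len(x)
    x1 = x[0:N-1]
    x2 = x[1:N]
    return sum(100*np.square(x2 - np.square(x1)) + np.square(np.ones(N-1) - x1))

def Rosen_grad(x):
    # Analytic gradient: each x[k] with k < N-1 appears in the term
    # 100*(x[k+1]-x[k]**2)**2 + (1-x[k])**2, and each x[k] with k > 0
    # also appears in the previous term through 100*(x[k]-x[k-1]**2)**2.
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    g[:-1] += -400*x[:-1]*(x[1:] - x[:-1]**2) - 2*(1 - x[:-1])
    g[1:] += 200*(x[1:] - x[:-1]**2)
    return g
```

A quick finite-difference check against Rosenbrock at a random point is an easy way to validate any hand-written gradient before handing it to a gradient-based optimizer.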
