
Is scipy.optimize.least_squares deterministic?

I am using scipy's optimize.least_squares algorithm with fixed initial conditions and always get the same result on my computer. However, if I try this on any other computer (all with the latest SciPy, Python, and NumPy packages and the same 64-bit Ubuntu Linux), I get different results on each PC. Why is this?

Thank you.

The answer is yes.

As you can find in the documentation, there are three methods implemented in optimize.least_squares:

  1. Trust Region Reflective algorithm
  2. Dogleg algorithm with rectangular trust regions
  3. Levenberg-Marquardt algorithm

All of these are iterative methods that start from an initial value (or vector) and step toward a minimum. How each step is determined differs from method to method, but it is deterministic in every method.
For more detail, you can read this blog written by the developer of this function.
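
For example, here is a minimal sketch (the exponential model, data, and residuals function are hypothetical, not taken from the question) showing that repeated calls with the same initial vector x0 return identical results on a given machine:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical data for the model y = a * exp(b * x); seeded so it is reproducible.
rng = np.random.default_rng(0)
xdata = np.linspace(0, 1, 50)
ydata = 2.0 * np.exp(1.3 * xdata) + 0.05 * rng.standard_normal(xdata.size)

def residuals(params):
    a, b = params
    return a * np.exp(b * xdata) - ydata

x0 = np.array([1.0, 1.0])  # fixed initial conditions

# Two independent runs from the same starting point.
res1 = least_squares(residuals, x0, method="trf")
res2 = least_squares(residuals, x0, method="trf")

print(res1.x)
print(np.array_equal(res1.x, res2.x))  # True: on one machine the solver is deterministic
```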

I'm not sure why you get different results at each PC, sorry.

By default (if diff_step=None), optimize.least_squares uses a machine-dependent step size, based on machine precision, for the finite-difference approximation of the Jacobian [1]. On a particular computer the result should be deterministic, but it can differ from one computer to another.

Unless there are other machine-dependent parameters, setting diff_step manually should yield the same results on different machines.
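
As a sketch (again with a hypothetical model and residuals function, not code from the question), passing an explicit diff_step replaces the machine-dependent default relative step, so the numerical Jacobian, and hence the result, should be reproducible across machines:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical model/data, only to illustrate the diff_step argument.
xdata = np.linspace(0, 1, 50)
ydata = 2.0 * np.exp(1.3 * xdata)

def residuals(params):
    a, b = params
    return a * np.exp(b * xdata) - ydata

x0 = np.array([1.0, 1.0])

# diff_step is the relative step for the finite-difference Jacobian;
# fixing it avoids the machine-dependent default (diff_step=None).
res = least_squares(residuals, x0, method="trf", diff_step=1e-8)
print(res.x)
```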

[1] See the SciPy documentation for scipy.optimize.least_squares.
