How to fix scipy.optimize.minimize with L-BFGS-B returning ABNORMAL_TERMINATION_IN_LNSRCH
I am using scipy.optimize.minimize to solve for 3768 variables (314 affine transformations) that map one point cloud onto another via eval_func. At first I tried scipy.optimize.fmin_l_bfgs_b with approx_grad to avoid implementing the gradient, but it kept failing with ABNORMAL_TERMINATION_IN_LNSRCH. I then implemented the gradient function and switched to minimize, but it still fails with the same error.
res = minimize(eval_func, x0=np.array(M), method='L-BFGS-B',
               args=(scan_pts, scan_fce, scan_nrm, scan_mar,
                     temp_pts, temp_fce, temp_nrm, temp_mar,
                     alfa, beta, gama, edges),
               options={'iprint': 99, 'maxiter': 100}, jac=True)
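Since the solver's own diagnostic lists "error in function or gradient evaluation" as the first possible cause, one thing worth doing before anything else is comparing the analytic gradient against a finite-difference estimate with scipy.optimize.check_grad. The functions below are hypothetical stand-ins (the real eval_func and its arguments are not shown here); the point is only the checking pattern:

```python
import numpy as np
from scipy.optimize import check_grad

# Hypothetical stand-ins: split the objective into a value function f
# and a gradient function g so check_grad can compare them.
def f(x):
    return np.sum((x - 1.0) ** 2)

def g(x):
    return 2.0 * (x - 1.0)

x0 = np.zeros(5)
# check_grad returns the 2-norm of the difference between the analytic
# gradient and a finite-difference estimate at x0; a large value points
# to a bug in the gradient code, which in turn breaks the line search.
err = check_grad(f, g, x0)
print(err)
```

For a correct gradient the reported error should be on the order of the finite-difference step (roughly 1e-6 or smaller here); anything large suggests the gradient, not the line search, is the real problem.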
Here is the full output:
This problem is unconstrained.
Line search cannot locate an adequate point after 20 function
and gradient evaluations. Previous x, f and g restored.
Possible causes: 1 error in function or gradient evaluation;
2 rounding errors dominate computation.
RUNNING THE L-BFGS-B CODE
* * *
Machine precision = 2.220D-16
N = 3768 M = 10
At X0 0 variables are exactly at the bounds
At iterate 0 f= 1.88129D-01 |proj g|= 9.98119D-01
ITERATION 1
---------------- CAUCHY entered-------------------
There are 0 breakpoints
GCP found in this segment
Piece 1 --f1,f2 at start point -9.3847D+02 9.3847D+02
distance to the stationary point = 1.0000D+00
---------------- exit CAUCHY----------------------
3768 variables are free at GCP 1
* * *
Tit = total number of iterations
Tnf = total number of function evaluations
Tnint = total number of segments explored during Cauchy searches
Skip = number of BFGS updates skipped
Nact = number of active bounds at final generalized Cauchy point
Projg = norm of the final projected gradient
F = final function value
* * *
N Tit Tnf Tnint Skip Nact Projg F
3768 1 21 1 0 0 9.981D-01 1.881D-01
F = 0.18812870968000006
ABNORMAL_TERMINATION_IN_LNSRCH
I tried changing all the values as described in scipy.optimize.fmin_l_bfgs_b returns 'ABNORMAL_TERMINATION_IN_LNSRCH', but none of it worked.
What can I do?