How to fix SCS and MOSEK solvers that keep running forever
My application has used the ECOS solver for a long time. Suddenly we started getting infeasible solutions, which end in a solver error. After searching online threads and suggestions, I found recommendations to use the MOSEK and SCS solvers instead.
I tried replacing ECOS with the SCS and MOSEK solvers, but now my runs never finish. A run normally completes within 2 hours; after the switch it had been going for about 8 hours with no end in sight. Please advise.
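As a general workaround for a solver call that never returns (this is a sketch, not from the original code; `solve_with_timeout` and `_run` are illustrative names), the solve can be run in a child process that gets killed after a deadline, so the Spark task fails fast instead of hanging for hours. This assumes a Unix host where the "fork" start method is available (as on a typical Spark executor):

```python
import multiprocessing as mp
import queue

# Use fork so the child inherits the already-built problem object.
_ctx = mp.get_context("fork")

def _run(solve_fn, q):
    # Run the solve in the child and ship back either a result or an error.
    try:
        q.put(("ok", solve_fn()))
    except Exception as exc:  # e.g. cvxpy.error.SolverError
        q.put(("err", repr(exc)))

def solve_with_timeout(solve_fn, timeout_s):
    """Return solve_fn()'s result, or raise TimeoutError after timeout_s."""
    q = _ctx.Queue()
    p = _ctx.Process(target=_run, args=(solve_fn, q))
    p.start()
    try:
        status, payload = q.get(timeout=timeout_s)
    except queue.Empty:
        p.terminate()  # kill the stuck solver process
        p.join()
        raise TimeoutError("solver exceeded %ss" % timeout_s)
    p.join()
    if status == "err":
        raise RuntimeError(payload)
    return payload
```

On a TimeoutError the caller could retry once with the backup solver before giving up, which mirrors the primary/backup structure in the config below.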
Here are the parameters:
'solver': {'name': 'MOSEK', 'backup_name': 'SCS', 'verbose': True, 'max_iters': 3505}
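One thing worth noting about this config: `max_iters` is an SCS option, and MOSEK does not accept it, so the MOSEK run is effectively uncapped. A minimal sketch (the helper name `solve_kwargs` and the 3600-second limit are illustrative assumptions, not from the original code) of splitting the config into per-solver keyword arguments for `Problem.solve()`:

```python
# The config as posted in the question.
SOLVER_CONFIG = {
    "name": "MOSEK",
    "backup_name": "SCS",
    "verbose": True,
    "max_iters": 3505,
}

def solve_kwargs(config, solver_name):
    """Build solver-specific kwargs for cvxpy's Problem.solve()."""
    kwargs = {"solver": solver_name, "verbose": config["verbose"]}
    if solver_name == "SCS":
        # max_iters is honoured by SCS.
        kwargs["max_iters"] = config["max_iters"]
    elif solver_name == "MOSEK":
        # MOSEK ignores max_iters; cap wall-clock time instead
        # (assumed limit of one hour, in seconds).
        kwargs["mosek_params"] = {"MSK_DPAR_OPTIMIZER_MAX_TIME": 3600.0}
    return kwargs
```

With a time limit set this way, MOSEK returns its best point at the deadline rather than running indefinitely.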
Please help.
Error log:
Job aborted due to stage failure: Task 1934 in stage 6.0 failed 4 times, most recent failure: Lost task 1934.3 in stage 6.0 (TID 5028, ip-10-219-208-218.ec2.internal, executor 1): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/model.py", line 262, in fit
    raise SolverError
cvxpy.error.SolverError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/expressions/constants/constant.py", line ..., in extremal_eig_near_ref
    ev = SA_eigsh(sigma)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/expressions/constants/constant.py", line ..., in SA_eigsh
    return eigsh(A, k=1, sigma=sigma, return_eigenvectors=False)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line ..., in eigsh
    params.iterate()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line ..., in iterate
    self._raise_no_convergence()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line ..., in _raise_no_convergence
    raise ArpackNoConvergence(msg % (num_iter, k_ok, self.k), ev, vec)
scipy.sparse.linalg.eigen.arpack.arpack.ArpackNoConvergence: ARPACK error -1: No convergence (361 iterations, 0/1 eigenvectors converged)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/pyspark.zip/pyspark/worker.py", line 377, in main
    process()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/pyspark.zip/pyspark/worker.py", line 372, in process
    serializer.dump_stream(func(split_index, iterator), outfile)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/pyspark.zip/pyspark/serializers.py", line 400, in dump_stream
    vs = list(itertools.islice(iterator, batch))
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/pyspark.zip/pyspark/util.py", line 113, in wrapper
    return f(*args, **kwargs)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000001/py_dependencies.zip/pyspark_scripts/spark_tf_pipeline.py", line 49, in ...
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/tf_get_from_smu_records.py", line ..., in ...
    data_points, current_date_str, params)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/tf_get_from_smu_records.py", line ..., in ...
    model_output, _ = fit_model(ts_wrapper, params)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/fit_model.py", line 13, in fit_model
    machine_model.fit()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/machine_model.py", line 62, in fit
    self._fit()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/machine_model.py", line 120, in _fit
    self.model.fit()
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/model.py", line 267, in fit
    self._fit(self.solver_params['backup_name'])
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/py_dependencies.zip/cat/tf/tf_model/model.py", line 245, in _fit
    feastol_inacc=tols['feastol_inacc'])
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/problems/problem.py", line ..., in solve
    return solve_func(self, *args, **kwargs)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/problems/problem.py", line ..., in _solve
    self, data, warm_start, verbose, kwargs)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/reductions/solvers/solving_chain.py", line ..., in solve_via_data
    solver_opts, problem._solver_cache)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/reductions/solvers/conic_solvers/conic_solver.py", line ..., in ...
    if self.remove_redundant_rows(data) == s.INFEASIBLE:
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/reductions/solvers/conic_solvers/conic_solver.py", line ..., in remove_redundant_rows
    eig = extremal_eig_near_ref(gram, ref=TOL)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/expressions/constants/constant.py", line ..., in extremal_eig_near_ref
    ev = SA_eigsh(sigma)
  File "envpath/appcache/application_1618545751422_0044/container_1618545751422_0044_02_000002/miniconda/envs/project/lib/python3.6/site-packages/cvxpy/expressions/constants/constant.py", line ..., in SA_eigsh
    return eigsh(A, k=1, sigma=sigma, return_eigenvectors=False)
scipy.sparse.linalg.eigen.arpack.arpack.ArpackNoConvergence: ARPACK error -1: No convergence (361 iterations, 0/1 eigenvectors converged)
at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:456)
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRunner.scala:592)
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRunner.scala:575)
at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:410)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:227)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$3.apply(ShuffleExchangeExec.scala:283)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$3.apply(ShuffleExchangeExec.scala:252)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:858)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:858)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace: