How to handle a Py4JJavaError that won't be caught
I am trying to handle the Py4JJavaError raised when invalid SQL is passed to spark.sql(). My function is as follows:
---------------------------------------------------------------------------
Py4JJavaError Traceback (most recent call last)
/databricks/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
     62     try:
---> 63         return f(*a, **kw)
     64     except py4j.protocol.Py4JJavaError as e:
/databricks/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    327                 "An error occurred while calling {0}{1}{2}.\n".
--> 328                 format(target_id, ".", name), value)
    329             else:
Py4JJavaError: An error occurred while calling o213.sql.
: org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input 'sd' expecting {'(','SELECT','FROM','ADD','DESC','WITH','VALUES','CREATE','TABLE','INSERT','DELETE','DESCRIBE','EXPLAIN','SHOW','USE','DROP','ALTER','MAP','SET','RESET','START','COMMIT','ROLLBACK','MERGE','UPDATE','CONVERT','REDUCE','REFRESH','CLEAR','CACHE','UNCACHE','DFS','TRUNCATE','ANALYZE','LIST','REVOKE','GRANT','LOCK','UNLOCK','MSCK','EXPORT','IMPORT','LOAD','OPTIMIZE','COPY'}(line 1, pos 0)
== sql ==
sd
^^^
When I run the function with invalid SQL in the sql_string parameter, the error is not handled: it still raises the same error instead of my `Exception(f'Invalid sql code passed by {sql_string}.')`. I would appreciate it if anyone could figure out why this isn't being handled properly :)
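A likely cause, sketched below with stand-in classes (the real ones live in `py4j.protocol` and `pyspark.sql.utils`): pyspark wraps every Java-side call in a decorator (`deco` in `pyspark/sql/utils.py`, visible in the traceback above) that catches the `Py4JJavaError` and re-raises it as a Python-side exception such as `ParseException`. By the time control returns to your code, the `Py4JJavaError` no longer exists, so an `except Py4JJavaError` clause never matches. The function names `fake_spark_sql` and `run_sql` are hypothetical, for illustration only:

```python
# Minimal sketch of why `except Py4JJavaError` never fires.
# Stand-in exception classes; the real ones are py4j.protocol.Py4JJavaError
# and pyspark.sql.utils.ParseException.

class Py4JJavaError(Exception):
    pass

class ParseException(Exception):
    pass

def deco(f):
    """Mimics pyspark's wrapper: the Java-side error is re-raised
    as a Python-side ParseException before it reaches user code."""
    def wrapper(*a, **kw):
        try:
            return f(*a, **kw)
        except Py4JJavaError as e:
            raise ParseException(str(e)) from None
    return wrapper

@deco
def fake_spark_sql(query):
    # Stands in for the JVM call that fails on invalid SQL.
    raise Py4JJavaError("mismatched input 'sd'")

def run_sql(query):
    try:
        return fake_spark_sql(query)
    except Py4JJavaError:
        # Never reached: deco already converted the exception.
        return "caught Py4JJavaError"
    except ParseException:
        return "caught ParseException"

print(run_sql("sd"))  # -> caught ParseException
```

So in the real function, catching `pyspark.sql.utils.ParseException` (or the broader `pyspark.sql.utils.CapturedException` on recent Spark versions) instead of `py4j.protocol.Py4JJavaError` should let the custom `Exception(f'Invalid sql code passed by {sql_string}.')` fire as intended.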