How to solve FileNotFoundError: [WinError 2] The system cannot find the file specified - PySpark
Operating system - Windows; development environment - Eclipse PyDev
PySpark code -
import os
import sys

import sparknlp
from pyspark.sql import SparkSession
from pyspark.sql.types import StringType, IntegerType
from sparknlp.base import *
from sparknlp.annotator import *

sys.path.append("D:/Python/")

spark = SparkSession.builder \
    .appName("Spark NLP") \
    .master("local[4]") \
    .config("spark.driver.memory", "16G") \
    .config("spark.driver.maxResultSize", "0") \
    .config("spark.jars.packages", "com.johnsnowlabs.nlp:spark-nlp_2.11:2.7.4") \
    .config("spark.kryoserializer.buffer.max", "1000M") \
    .getOrCreate()
spark = sparknlp.start()
Error traceback -
Traceback (most recent call last):
  File "D:\Workspace\NLP-Parser\src\Parser.py", line 15, in <module>
    spark = SparkSession.builder \
  File "D:\Python\lib\site-packages\pyspark\sql\session.py", line 186, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "D:\Python\lib\site-packages\pyspark\context.py", line 376, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "D:\Python\lib\site-packages\pyspark\context.py", line 133, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "D:\Python\lib\site-packages\pyspark\context.py", line 325, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "D:\Python\lib\site-packages\pyspark\java_gateway.py", line 99, in launch_gateway
    proc = Popen(command, **popen_kwargs)
  File "D:\Python\lib\subprocess.py", line 947, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "D:\Python\lib\subprocess.py", line 1416, in _execute_child
    hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
FileNotFoundError: [WinError 2] The system cannot find the file specified