For loop to replace a word in a string with names from a list

I need a for loop that replaces "name" in the HTML string below with each of the names in name_lst. The expected output is a rendered map with a name at each of the coordinates in the list below.

[error] (run-main-0) org.apache.spark.SparkException: Job aborted due to stage failure: Task 35 in stage 3.0 failed 1 times, most recent failure: Lost task 35.0 in stage 3.0 (TID 399, localhost, executor driver): java.lang.RuntimeException: Unsupported literal type class org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema [1591772400000]
[error]     at org.apache.spark.sql.catalyst.expressions.Literal$.apply(literals.scala:75)
[error]     at org.apache.spark.sql.functions$.lit(functions.scala:101)
[error]     at org.apache.spark.sql.Column.$eq$eq$eq(Column.scala:267)
[error]     at spark_pkg.SparkMain$$anonfun$main$1.apply(SparkMain.scala:880)
[error]     at spark_pkg.SparkMain$$anonfun$main$1.apply(SparkMain.scala:878)
[error]     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
[error]     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
[error]     at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$28.apply(RDD.scala:917)
[error]     at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$28.apply(RDD.scala:917)
[error]     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
[error]     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
[error]     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
[error]     at org.apache.spark.scheduler.Task.run(Task.scala:99)
[error]     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
[error]     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error]     at java.lang.Thread.run(Thread.java:748)
[error] 
[error] Driver stacktrace:
[error] org.apache.spark.SparkException: Job aborted due to stage failure: Task 35 in stage 3.0 failed 1 times, most recent failure: Lost task 35.0 in stage 3.0 (TID 399, localhost, executor driver): java.lang.RuntimeException: Unsupported literal type class org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema [1591772400000]
[error]     at org.apache.spark.sql.catalyst.expressions.Literal$.apply(literals.scala:75)
[error]     at org.apache.spark.sql.functions$.lit(functions.scala:101)
[error]     at org.apache.spark.sql.Column.$eq$eq$eq(Column.scala:267)
[error]     at spark_pkg.SparkMain$$anonfun$main$1.apply(SparkMain.scala:880)
[error]     at spark_pkg.SparkMain$$anonfun$main$1.apply(SparkMain.scala:878)
[error]     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
[error]     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
[error]     at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$28.apply(RDD.scala:917)
[error]     at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$28.apply(RDD.scala:917)
[error]     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
[error]     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
[error]     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
[error]     at org.apache.spark.scheduler.Task.run(Task.scala:99)
[error]     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
[error]     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error]     at java.lang.Thread.run(Thread.java:748)
[error] 
[error] Driver stacktrace:
[error]     at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)
[error]     at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423)
[error]     at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422)
[error]     at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[error]     at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
[error]     at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422)
[error]     at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
[error]     at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
[error]     at scala.Option.foreach(Option.scala:257)
[error]     at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
[error]     at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650)
[error]     at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605)
[error]     at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594)
[error]     at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
[error]     at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
[error]     at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
[error]     at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931)
[error]     at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944)
[error]     at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958)
[error]     at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:917)
[error]     at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:915)
[error]     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[error]     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
[error]     at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
[error]     at org.apache.spark.rdd.RDD.foreach(RDD.scala:915)
[error]     at org.apache.spark.sql.Dataset$$anonfun$foreach$1.apply$mcV$sp(Dataset.scala:2286)
[error]     at org.apache.spark.sql.Dataset$$anonfun$foreach$1.apply(Dataset.scala:2286)
[error]     at org.apache.spark.sql.Dataset$$anonfun$foreach$1.apply(Dataset.scala:2286)
[error]     at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
[error]     at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2765)
[error]     at org.apache.spark.sql.Dataset.foreach(Dataset.scala:2285)
[error]     at spark_pkg.SparkMain$.main(SparkMain.scala:878)
[error]     at spark_pkg.SparkMain.main(SparkMain.scala)
[error]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error]     at java.lang.reflect.Method.invoke(Method.java:498)
[error] Caused by: java.lang.RuntimeException: Unsupported literal type class org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema [1591772400000]
[error]     at org.apache.spark.sql.catalyst.expressions.Literal$.apply(literals.scala:75)
[error]     at org.apache.spark.sql.functions$.lit(functions.scala:101)
[error]     at org.apache.spark.sql.Column.$eq$eq$eq(Column.scala:267)
[error]     at spark_pkg.SparkMain$$anonfun$main$1.apply(SparkMain.scala:880)
[error]     at spark_pkg.SparkMain$$anonfun$main$1.apply(SparkMain.scala:878)
[error]     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
[error]     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
[error]     at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$28.apply(RDD.scala:917)
[error]     at org.apache.spark.rdd.RDD$$anonfun$foreach$1$$anonfun$apply$28.apply(RDD.scala:917)
[error]     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
[error]     at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1944)
[error]     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
[error]     at org.apache.spark.scheduler.Task.run(Task.scala:99)
[error]     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
[error]     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error]     at java.lang.Thread.run(Thread.java:748)
[error] stack trace is suppressed; run 'last Compile / bgRun' for the full output
[error] Nonzero exit code: 1
[error] (Compile / run) Nonzero exit code: 1
[error] Total time: 137 s (02:17), completed Aug 20, 2020 1:16:02 PM

Getting the error: TypeError: zip argument #3 must support iteration
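
For reference, this TypeError means the third object handed to zip() is not iterable. The asker's exact call is not shown in the post, so the snippet below is only a minimal way to reproduce the message; the variable names and values are assumptions:

lat_lst = [48.6064556, 52.9399159]
lng_lst = [-56.3330408, -106.4508639]
name_lst = None   # assumed cause: the list was never built, or was overwritten with a non-iterable value

for lat, lng, name in zip(lat_lst, lng_lst, name_lst):
    print(lat, lng, name)
# TypeError: zip argument #3 must support iteration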

If someone could help me create a tooltip for this, so that hovering over a coordinate shows its name, that would be great.

Solution

Check the indentation of html.replace("name", name_lst, 1).
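
The asker's HTML template and full loop are not included in the post, so the following is only a minimal sketch of the pattern this answer points at, with the template, variable names, and indentation assumed: the html.replace(...) call sits inside the loop and substitutes one name per iteration.

html = '<div class="label">name</div>'   # hypothetical template containing the placeholder word "name"
name_lst = ["Nova Scotia", "Saskatchewan", "Prince Edward Island", "Ontario"]

labelled = []
for name in name_lst:
    # replace the first occurrence of the placeholder with this iteration's name
    labelled.append(html.replace("name", name, 1))

print(labelled)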

Running the for loop with zip works without issue. See below.

for lat, lng, name in zip(lat_lst, lng_lst, name_lst):
    print(lat, lng, name)

48.6064556 -56.3330408 Nova Scotia
52.9399159 -106.4508639 Saskatchewan
46.510712 -63.4168136 Prince Edward Island
51.253775 -85.3232139 Ontario
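
The question also asks for a tooltip so that hovering over each coordinate shows its name. The post never states which mapping library is used, so the sketch below assumes folium; the map centre and zoom level are arbitrary:

import folium

lat_lst = [48.6064556, 52.9399159, 46.510712, 51.253775]
lng_lst = [-56.3330408, -106.4508639, -63.4168136, -85.3232139]
name_lst = ["Nova Scotia", "Saskatchewan", "Prince Edward Island", "Ontario"]

# assumed: folium is the mapping library; centre and zoom chosen arbitrarily
m = folium.Map(location=[52.0, -85.0], zoom_start=4)
for lat, lng, name in zip(lat_lst, lng_lst, name_lst):
    # tooltip= shows the name on hover; popup= would show it on click instead
    folium.Marker(location=[lat, lng], tooltip=name).add_to(m)

m.save("map_with_names.html")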

Update: this can be solved with concatenation:

import React, { useState } from "react";
import classes from "./AddElement.module.css"; // hypothetical path; the original answer does not show where `classes` comes from

const AddElement = (props) => {

  // dynamically created elements are kept in component state
  const [dynamicCompList, setDynamicCompList] = useState([]);

  const addElement = () => {
    const dynamicEl = React.createElement("p", {}, "This is paragraph");
    // concat returns a new array, so React detects the state change and re-renders
    setDynamicCompList(dynamicCompList.concat(dynamicEl));
  };

  return (
    <div>
      <button onClick={() => addElement()}>Click here</button>
      <div className={classes.elements}>
        {dynamicCompList}
      </div>
    </div>
  );
};

export default AddElement;

This produces the correct names on the map: [screenshot]
