Spark errors: Maven packaging and spark-shell startup failures


Maven reports an error when packaging the project.

Error message:

"D:\Program Files\Java\jdk1.8.0_131\bin\java" -Dmaven.multiModuleProjectDirectory=D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore "-Dmaven.home=D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\plugins\maven\lib\maven3" "-Dclassworlds.conf=D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\plugins\maven\lib\maven3\bin\m2.conf" "-javaagent:D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\lib\idea_rt.jar=61000:D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\bin" -Dfile.encoding=UTF-8 -classpath "D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\plugins\maven\lib\maven3\boot\plexus-classworlds-2.5.2.jar" org.codehaus.classworlds.Launcher -Didea.version=2017.3.1 -DskipTests=true package
[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building sparkCore 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ sparkCore ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 0 resource
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ sparkCore ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ sparkCore ---
[WARNING]  Expected all dependencies to require Scala version: 2.11.8
[WARNING]  com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.7
[WARNING] Multiple versions of scala libraries detected!
[INFO] D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore\src\main\java:-1: info: compiling
[INFO] D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore\src\main\scala:-1: info: compiling
[INFO] Compiling 1 source files to D:\Workspace\IDEA_work\Spark_Work\spark01\sparkCore\target\classes at 1562322123123
[ERROR] error: error while loading <root>, Error accessing C:\Users\67001\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar
[ERROR] error: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
[ERROR]     at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17)
[ERROR]     at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394)
[ERROR]     at scala.tools.nsc.Global$Run.<init>(Global.scala:1215)
[ERROR]     at scala.tools.nsc.Driver.doCompile(Driver.scala:31)
[ERROR]     at scala.tools.nsc.MainClass.doCompile(Main.scala:23)
[ERROR]     at scala.tools.nsc.Driver.process(Driver.scala:51)
[ERROR]     at scala.tools.nsc.Driver.main(Driver.scala:64)
[ERROR]     at scala.tools.nsc.Main.main(Main.scala)
[ERROR]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[ERROR]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR]     at java.lang.reflect.Method.invoke(Method.java:498)
[ERROR]     at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
[ERROR]     at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.995 s
[INFO] Finished at: 2019-07-05T18:22:03+08:00
[INFO] Final Memory: 26M/698M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default) on project sparkCore: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

Process finished with exit code 1
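A side note before the fix: the compile step also prints "Multiple versions of scala libraries detected!" because com.twitter:chill_2.11:0.8.0 pulls in Scala 2.11.7 while the project declares 2.11.8. That warning is not what fails the build (the [ERROR] lines are), but it can be silenced by pinning scala-library in the POM. A sketch of one common way to do that (the 2.11.8 version is assumed from the warning above; it must match the project's actual scala.version):

<!-- force a single scala-library version for all transitive dependencies;
     2.11.8 is assumed here from the plugin warning -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.11.8</version>
        </dependency>
    </dependencies>
</dependencyManagement>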


"D:\Program Files\Java\jdk1.8.0_131\bin\java" -Dmaven.multiModuleProjectDirectory=D:\Workspace\IDEA_work\Spark_Work\spark02\sparkCore 
"-Dmaven.home=D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\plugins\maven\lib\maven3"
"-Dclassworlds.conf=D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\plugins\maven\lib\maven3\bin\m2.conf"
"-javaagent:D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\lib\idea_rt.jar=64675:D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\bin"
-Dfile.encoding=UTF-8 -classpath "D:\Program Files\JetBrains\IntelliJ IDEA 2017.3.1\plugins\maven\lib\maven3\boot\plexus-classworlds-2.5.2.jar"
org.codehaus.classworlds.Launcher -Didea.version=2017.3.1 -DskipTests=true package [INFO] Scanning for projects... [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building sparkCore 1.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ sparkCore --- [WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent! [INFO] Copying 0 resource [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ sparkCore --- [INFO] Nothing to compile - all classes are up to date [INFO] [INFO] --- scala-maven-plugin:3.2.2:compile (default) @ sparkCore --- [WARNING] Expected all dependencies to require Scala version: 2.11.8 [WARNING] com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.7 [WARNING] Multiple versions of scala libraries detected! [INFO] D:\Workspace\IDEA_work\Spark_Work\spark02\sparkCore\src\main\java:-1: info: compiling [INFO] D:\Workspace\IDEA_work\Spark_Work\spark02\sparkCore\src\main\scala:-1: info: compiling [INFO] Compiling 1 source files to D:\Workspace\IDEA_work\Spark_Work\spark02\sparkCore\target\classes at 1562341347300 [ERROR] error: error while loading <root>, Error accessing C:\Users\67001\.m2\repository\commons-configuration\commons-configuration\1.6\commons-configuration-1.6.jar [ERROR] error: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found. [ERROR] at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:17) [ERROR] at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:18) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:53) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:45) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102) [ERROR] at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105) [ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257) [ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257) [ERROR] at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394) [ERROR] at scala.tools.nsc.Global$Run.<init>(Global.scala:1215) [ERROR] at scala.tools.nsc.Driver.doCompile(Driver.scala:31) [ERROR] at scala.tools.nsc.MainClass.doCompile(Main.scala:23) [ERROR] at scala.tools.nsc.Driver.process(Driver.scala:51) [ERROR] at scala.tools.nsc.Driver.main(Driver.scala:64) [ERROR] at scala.tools.nsc.Main.main(Main.scala) [ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) [ERROR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [ERROR] at java.lang.reflect.Method.invoke(Method.java:498) [ERROR] at scala_maven_executions.MainHelper.runMain(MainHelper.java:164) [ERROR] at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26) [INFO] ------------------------------------------------------------------------ [INFO] BUILD FAILURE [INFO] 
------------------------------------------------------------------------ [INFO] Total time: 5.117 s [INFO] Finished at: 2019-07-05T23:42:28+08:00 [INFO] Final Memory: 26M/698M [INFO] ------------------------------------------------------------------------ [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default) on project
sparkCore: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException Process finished with exit code 1

Solution:

Both runs fail on the same pair of lines: scalac first reports "error while loading <root>, Error accessing ..." for a jar in the local repository (jackson-core-asl-1.9.13.jar in the first run, commons-configuration-1.6.jar in the second), and the MissingRequirementError about java.lang.Object is just the fallout: once a classpath entry cannot be opened, the compiler cannot assemble its root mirror at all. The jar named in the message is almost certainly corrupt or truncated, typically from an interrupted download. Delete it from the local repository and rebuild so that Maven fetches a fresh copy, as sketched below.
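A minimal cleanup sketch, assuming the repository paths named in the two logs above (Windows cmd; any other jar the compiler reports as unreadable gets the same treatment):

rem remove the artifacts scalac could not read; Maven will re-download them
rd /s /q "C:\Users\67001\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13"
rd /s /q "C:\Users\67001\.m2\repository\commons-configuration\commons-configuration\1.6"

rem rebuild with the same goals IDEA was running
mvn -DskipTests=true clean package

If another jar is also corrupt, the next run reports it in a fresh "error while loading <root>" line; repeat the deletion for each path the compiler names.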


spark-shell fails on startup with:

ERROR spark.SparkContext: Error initializing SparkContext.
java.net.ConnectException: Call From hadoop102/192.168.192.102 to hadoop102:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
    at org.apache.hadoop.ipc.Client.call(Client.java:1479)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
    at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:93)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:531)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2320)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:96)
    [... REPL and SparkSubmit frames omitted ...]
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
    at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
    at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    ... 71 more

The REPL then reports the same java.net.ConnectException a second time ("... 47 elided") and, since no SparkSession was created, the startup imports fail:

<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql

Solution:

Because JobHistoryServer is configured (Spark event logging writes to HDFS), YARN and HDFS must be started before launching spark-shell.
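The trace shows exactly where it breaks: org.apache.spark.scheduler.EventLoggingListener.start calls getFileStatus on the event-log directory, which lives on HDFS behind hadoop102:9000, and nothing is listening there. A spark-defaults.conf for this kind of setup typically contains something like the sketch below (the hdfs://hadoop102:9000/directory path is an assumption for illustration; use the log directory the cluster actually configures):

# event logs go to HDFS, so the NameNode must be up before spark-shell starts
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://hadoop102:9000/directory
spark.history.fs.logDirectory    hdfs://hadoop102:9000/directory

Start the Hadoop daemons first and verify them, then relaunch spark-shell (standard Hadoop 2.x sbin scripts, run on the cluster):

start-dfs.sh     # starts NameNode, DataNodes, SecondaryNameNode
start-yarn.sh    # starts ResourceManager, NodeManagers
jps              # NameNode must appear in the list before retrying spark-shell

Once the NameNode is answering on hadoop102:9000, SparkContext initializes normally and the follow-on "not found: value spark" import errors disappear as well.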

Reposted from: https://www.cnblogs.com/LXL616/p/11140975.html
