
I just downloaded the pre-built version of Spark 1.6 with Hadoop 2.6+ to my desktop on Ubuntu 14.04, and "./bin/spark-shell" does not work with it.

I navigated into the Spark directory and launched the Spark shell as described in the Quick Start guide linked below, using

./bin/spark-shell 

I am getting the following errors. I have seen a similar question for Mac OS X here.

[email protected]:~/Desktop/spark-1.6.0-bin-hadoop2.6$ ./bin/spark-shell 
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory). 
log4j:WARN Please initialize the log4j system properly. 
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. 
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties 
To adjust logging level use sc.setLogLevel("INFO") 
Welcome to 
      ____              __ 
     / __/__  ___ _____/ /__ 
    _\ \/ _ \/ _ `/ __/ '_/ 
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0 
      /_/ 

Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_91) 
Type in expressions to have them evaluated. 
Type :help for more information. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1. 
16/01/05 12:36:25 ERROR SparkContext: Error initializing SparkContext. 
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! 
    at sun.nio.ch.Net.bind0(Native Method) 
    at sun.nio.ch.Net.bind(Net.java:463) 
    at sun.nio.ch.Net.bind(Net.java:455) 
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) 
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) 
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125) 
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485) 
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089) 
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430) 
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415) 
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903) 
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198) 
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348) 
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357) 
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357) 
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) 
    at java.lang.Thread.run(Thread.java:745) 
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! 
    at sun.nio.ch.Net.bind0(Native Method) 
    at sun.nio.ch.Net.bind(Net.java:463) 
    at sun.nio.ch.Net.bind(Net.java:455) 
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) 
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) 
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125) 
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485) 
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089) 
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430) 
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415) 
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903) 
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198) 
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348) 
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357) 
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357) 
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) 
    at java.lang.Thread.run(Thread.java:745) 

java.lang.NullPointerException 
    at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367) 
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028) 
    at $iwC$$iwC.<init>(<console>:15) 
    at $iwC.<init>(<console>:24) 
    at <init>(<console>:26) 
    at .<init>(<console>:30) 
    at .<clinit>(<console>) 
    at .<init>(<console>:7) 
    at .<clinit>(<console>) 
    at $print(<console>) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065) 
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346) 
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840) 
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871) 
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819) 
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857) 
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902) 
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814) 
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132) 
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124) 
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324) 
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124) 
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974) 
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159) 
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64) 
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108) 
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) 
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945) 
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135) 
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945) 
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059) 
    at org.apache.spark.repl.Main$.main(Main.scala:31) 
    at org.apache.spark.repl.Main.main(Main.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 

<console>:16: error: not found: value sqlContext 
     import sqlContext.implicits._ 
       ^
<console>:16: error: not found: value sqlContext 
     import sqlContext.sql 
       ^

Any help?


Have you tried running it with sudo? Maybe it requires root privileges. – kometen


@kometen Nope, sudo isn't helping either. Same error. –


http://stackoverflow.com/a/40523061/2777965 – 030

Answers


I ran into a similar problem where my master failed to start with that exception when spinning up a cluster.

To fix it, I changed a property I was setting in the $SPARK_HOME/conf/spark-env.sh file.

Previously I had set SPARK_MASTER_IP to the IP address of my master node. Changing it to the public DNS name of the box seems to have resolved the problem.
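
For reference, a minimal sketch of the kind of change described above, assuming the usual spark-env.sh layout (the IP address and DNS name below are placeholders, not values from the original post):

# $SPARK_HOME/conf/spark-env.sh

# Before: master bound to the node's raw IP address (placeholder value)
# export SPARK_MASTER_IP=192.168.1.10

# After: use the box's public DNS name instead (placeholder value)
export SPARK_MASTER_IP=master.example.com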


Thanks for the suggestion, Thomas. I'm having the same problem and this didn't fix it for me. – ammills01


A likely cause of the problem is that you are trying to bind to an invalid IP address. In $SPARK_HOME/conf/spark-env.sh there is a variable named SPARK_LOCAL_IP. If it is set, make sure it actually refers to the machine you are running the Spark shell on, or try commenting it out. If it is not set, you can try setting it to, for example, 127.0.0.1.
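
As an illustration, a minimal sketch of what that part of spark-env.sh might look like (the commented-out address is a placeholder):

# $SPARK_HOME/conf/spark-env.sh

# If SPARK_LOCAL_IP points at an address this machine no longer has,
# comment it out (placeholder value shown) ...
# export SPARK_LOCAL_IP=192.168.1.10

# ... or set it explicitly, e.g. to the loopback address
export SPARK_LOCAL_IP=127.0.0.1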


the problem persists – 030
