
I am trying to run a Spark application written in Scala in IntelliJ 14.1.3. The Scala SDK is scala-sdk-2.11.6. I get the following error when I run my code:

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet; 
at akka.actor.ActorCell$.<init>(ActorCell.scala:336) 
at akka.actor.ActorCell$.<clinit>(ActorCell.scala) 
at akka.actor.RootActorPath.$div(ActorPath.scala:159) 
at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464) 
at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124) 
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
at java.lang.reflect.Constructor.newInstance(Constructor.java:422) 
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78) 
at scala.util.Try$.apply(Try.scala:191) 
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73) 
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84) 
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84) 
at scala.util.Success.flatMap(Try.scala:230) 
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84) 
at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:584) 
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:577) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:141) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:118) 
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122) 
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55) 
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54) 
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837) 
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:166) 
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828) 
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57) 
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223) 
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163) 
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269) 
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272) 
at LRParquetProcess$.main(LRParquetProcess.scala:9) 
at LRParquetProcess.main(LRParquetProcess.scala) 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:497) 
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140) 

Process finished with exit code 1

My pom.xml is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>ParquetGeneration</groupId>
    <artifactId>ParquetGeneration</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <hadoop.version>2.7.0</hadoop.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.3.1</version>
            <exclusions>
                <exclusion>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-client</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.eclipse.jetty</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-app</artifactId>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.10.5</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-compiler</artifactId>
            <version>2.10.5</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>com.typesafe.akka</groupId>
            <artifactId>akka-actor_2.10</artifactId>
            <version>2.3.11</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.3.1</version>
        </dependency>
    </dependencies>
</project>


Spark is not compatible with Scala 2.11. – abalcerek

Answer


Go with Scala 2.10; it is the better choice at the moment.


Thanks Thomas. Can you tell me how I can switch to 2.10? That would be really helpful. – Bharath


As suggested, you should try 2.10.x.

Install 2.10.x and set the relevant environment variables so it is used. Since you already have a project, go to File -> Project Structure -> Global Libraries and remove 2.11.x. Then add 2.10.x by pressing '+' -> Scala SDK -> Browse and selecting the 2.10.x folder you installed earlier.

The Scala version requirements are specified in the documentation.
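
For reference, here is a minimal sketch of the version-related parts of the pom once everything is aligned on Scala 2.10. The versions are illustrative, assuming you stay on Spark 1.3.1; note that the original pom also mixes spark-sql 1.2.1 with spark-core and spark-hive 1.3.1, and all Spark modules should share a single version:

    <properties>
        <scala.version>2.10.5</scala.version>
        <spark.version>1.3.1</spark.version>
    </properties>
    <dependencies>
        <!-- The scala-library version must match the Scala SDK configured in IntelliJ. -->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <!-- All Spark modules use the _2.10 suffix and one shared version. -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <!-- The explicit akka-actor_2.10 dependency is omitted here: Spark 1.3.1
             ships with its own Akka build, and a second Akka on the classpath is
             a common source of exactly this kind of NoSuchMethodError. -->
    </dependencies>

After updating the pom, re-import the Maven project so IntelliJ resolves the new artifacts. With both the pom and the project SDK on 2.10.x, the NoSuchMethodError should go away.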
