scala - Activator UI, java.lang.NoClassDefFoundError


I am trying to create a basic Scala project in IntelliJ using the Activator UI. I imported the project into the IDE and it compiles fine,

but when I try to run the code I get the following exception:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
    at akka.util.Collections$EmptyImmutableSeq$.<init>(Collections.scala:15)
    at akka.util.Collections$EmptyImmutableSeq$.<clinit>(Collections.scala)
    at akka.japi.Util$.immutableSeq(JavaAPI.scala:209)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at reactivemongo.api.MongoDriver$.reactivemongo$api$MongoDriver$$defaultSystem(api.scala:378)
    at reactivemongo.api.MongoDriver$$anonfun$3.apply(api.scala:305)
    at reactivemongo.api.MongoDriver$$anonfun$3.apply(api.scala:305)
    at scala.Option.getOrElse(Option.scala:120)
    at reactivemongo.api.MongoDriver.<init>(api.scala:305)
    at example.App$.main(App.scala:10)
    at example.App.main(App.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:500)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)

When the project loads, there is also an error in the Project Structure dialog: "sbt: scala 2.11.2 not in use".

What went wrong with the Activator UI's IntelliJ project generation?

Thanks

Miki

I ran into this when trying to run Spark. The error was caused by an inconsistency between the Scala version used to compile the dependency and the Scala version used to run the project.

Removing my Scala version specification was a hacky way to solve the problem:

    // build.sbt
    name := "SparkTest"

    version := "1.0"

    scalaVersion := "2.11.4"  // <- Remove this line

    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"
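A less fragile alternative to deleting the setting is to make the declared Scala version agree with the dependency's binary suffix. This is a sketch only, assuming the same project name and Spark version as above; the key idea is to either pin `scalaVersion` to a 2.10.x release that matches the `spark-core_2.10` artifact, or use the `%%` operator so sbt appends the Scala binary version to the artifact name automatically:

```scala
// build.sbt -- illustrative sketch, not the asker's original file
name := "SparkTest"

version := "1.0"

// Match the Scala binary version of the published artifact (_2.10)
scalaVersion := "2.10.4"

// With %%, sbt resolves this to "spark-core_2.10" because
// scalaVersion is a 2.10.x release, keeping the two in sync.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
```

With `%%` there is a single source of truth for the Scala binary version, so bumping `scalaVersion` later cannot silently reintroduce the mismatch (it would fail at resolution time instead of at runtime with `NoClassDefFoundError`).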
