
[Ubuntu, Java 7] No configuration setting found for key 'akka.version' #52

Open
davidonlaptop opened this issue Nov 2, 2015 · 1 comment


@davidonlaptop (Member)

Compilation works on a fresh Linux machine (Ubuntu 14.04), but execution does not:

docker run --rm -ti ubuntu:14.04

apt-get update

export JAVA_VERSION="7u85"
export JAVA_HOME="/usr/lib/jvm/java-7-openjdk-amd64"
DEBIAN_FRONTEND=noninteractive apt-get install -y openjdk-7-jdk=$JAVA_VERSION\*

apt-get install -y git
git clone https://github.com/GELOG/adam-ibs.git
cd adam-ibs/

apt-get install -y maven

mvn package

root@7d5f110a4f03:/adam-ibs# java -jar adam-ibs-core/target/adam-ibs-core-0.1.0-jar-with-dependencies.jar --help
23:33:40.516 [main] [INFO ] [c.e.m.c.Main$] : Begin with arguments : --help
      --file  <name>   Specify .ped + .map filename prefix (default 'plink')
      --genome         Calculate IBS distances between all individuals [needs
                       --file and --out]
      --make-bed       Create a new binary fileset. Specify .ped and .map files
                       [needs --file and --out]
      --out  <name>    Specify the output filename
      --show-parquet   Show schema and data sample stored in a parquet file [needs
                       --file]
  -h, --help  <arg>    Show help message

root@7d5f110a4f03:/adam-ibs# java -jar adam-ibs-core/target/adam-ibs-core-0.1.0-jar-with-dependencies.jar --file DATA/
test --out output --make-bed
23:22:23.977 [main] [INFO ] [c.e.m.c.Main$] : Begin with arguments : --file DATA/test --out output --make-bed
23:22:25.346 [main] [ERROR] [o.a.s.SparkContext] : Error initializing SparkContext.
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
    at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:124) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:145) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:151) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:159) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:164) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:206) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:168) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:504) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:141) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:118) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:142) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424) ~[adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.ets.mgl804.core.AppContext$.<init>(AppContext.scala:16) [adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.ets.mgl804.core.AppContext$.<clinit>(AppContext.scala) [adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.ets.mgl804.core.cli.PlinkMethod$.<init>(PlinkMethod.scala:19) [adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.ets.mgl804.core.cli.PlinkMethod$.<clinit>(PlinkMethod.scala) [adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.ets.mgl804.core.Main$$anonfun$main$2.apply(Main.scala:26) [adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.ets.mgl804.core.Main$$anonfun$main$2.apply(Main.scala:23) [adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) [adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34) [adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.ets.mgl804.core.Main$.main(Main.scala:22) [adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
    at com.ets.mgl804.core.Main.main(Main.scala) [adam-ibs-core-0.1.0-jar-with-dependencies.jar:na]
Exception in thread "main" java.lang.ExceptionInInitializerError
    at com.ets.mgl804.core.cli.PlinkMethod$.<init>(PlinkMethod.scala:19)
    at com.ets.mgl804.core.cli.PlinkMethod$.<clinit>(PlinkMethod.scala)
    at com.ets.mgl804.core.Main$$anonfun$main$2.apply(Main.scala:26)
    at com.ets.mgl804.core.Main$$anonfun$main$2.apply(Main.scala:23)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
    at com.ets.mgl804.core.Main$.main(Main.scala:22)
    at com.ets.mgl804.core.Main.main(Main.scala)
Caused by: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
    at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:124)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:145)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:151)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:159)
    at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:164)
    at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:206)
    at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:168)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:504)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
    at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
    at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:142)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
    at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
    at com.ets.mgl804.core.AppContext$.<init>(AppContext.scala:16)
    at com.ets.mgl804.core.AppContext$.<clinit>(AppContext.scala)
    ... 8 more
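
This error usually indicates that Akka's reference.conf was not merged into the fat jar (akka.version is read from that file). A quick check against the jar built above — a minimal sketch, assuming unzip is available in the container:

# Look for the Akka settings inside the packaged jar; if nothing (or only
# non-Akka entries) comes back, Akka's reference.conf was dropped at packaging time.
apt-get install -y unzip
unzip -p adam-ibs-core/target/adam-ibs-core-0.1.0-jar-with-dependencies.jar reference.conf | grep -n "akka"

If it is missing, the usual remedy is to configure the shade/assembly step to append all reference.conf files instead of keeping only one (for example maven-shade-plugin's AppendingTransformer on reference.conf), though the exact change depends on how the pom is set up.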
@iki-v (Contributor) commented Nov 4, 2015

The jar must be run on Spark.
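
For illustration, a minimal sketch of that, assuming a local Spark 1.x installation with spark-submit on the PATH and using the main class com.ets.mgl804.core.Main shown in the stack trace above:

# Hypothetical invocation: spark-submit supplies the Spark/Akka runtime configuration,
# so the application no longer relies on the fat jar's own merged reference.conf.
spark-submit \
  --class com.ets.mgl804.core.Main \
  --master local[*] \
  adam-ibs-core/target/adam-ibs-core-0.1.0-jar-with-dependencies.jar \
  --file DATA/test --out output --make-bed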
