# MSD - Configuring Zeppelin (0.10.1) / SANSA (0.8-RC3)

## Install Java / SANSA-compatible JVM

### Info on using another JVM with Spark

https://github.com/LucaCanali/Miscellaneous/blob/master/Spark_Notes/Spark_Set_Java_Home_Howto.md

### Download Java (on ALL machines!) => /usr/local/openjdk/

```shell=
wget https://github.com/AdoptOpenJDK/openjdk12-binaries/releases/download/jdk-12.0.2%2B10/OpenJDK12U-jdk_x64_linux_hotspot_12.0.2_10.tar.gz
tar xvf OpenJDK12U-jdk_x64_linux_hotspot_12.0.2_10.tar.gz
```

#### Usage

```shell=
export JAVA_HOME=/usr/local/openjdk/jdk-12.0.2+10/
export PATH=$JAVA_HOME/bin:$PATH
```

## Spark configuration

### On Elrond/Sarouman (where spark-submit is run)

1) sudo vi /usr/local/spark/conf/spark-defaults.conf

```shell=
spark.executorEnv.JAVA_HOME=/usr/local/openjdk/jdk-12.0.2+10
spark.yarn.appMasterEnv.JAVA_HOME=/usr/local/openjdk/jdk-12.0.2+10
```

### On the datanodes

1) sudo vi /usr/local/spark/conf/spark-defaults.conf

```shell=
spark.executorEnv.JAVA_HOME=/usr/local/openjdk/jdk-12.0.2+10
```

## Compile / install SANSA on ELROND/SAROUMAN

### Compile SANSA (builds a single fat jar!)

The jar we need is produced in `./sansa-stack/sansa-stack-spark/target`:

```shell=
git clone https://github.com/SANSA-Stack/SANSA-Stack.git
cd SANSA-Stack
sh ./dev/make_spark_dist.sh
```

### Install SANSA

```shell=
cp ./sansa-stack/sansa-stack-spark/target/sansa-stack-spark_2.12-0.8.0-RC3-SNAPSHOT-jar-with-dependencies.jar /usr/share/java/
```

## Install Zeppelin on Elrond - /usr/local/zeppelin-0.10.1-bin-all

ZEPPELIN = /usr/local/zeppelin-0.10.1-bin-all

### $ZEPPELIN/conf/zeppelin-env.sh

```shell=
export JAVA_HOME=/usr/local/openjdk/jdk-12.0.2+10/
export PATH=$JAVA_HOME/bin:/usr/local/hadoop/bin:$PATH
export USE_HADOOP=true         # whether to include Hadoop jars in the Zeppelin server process (true or false)
export SPARK_MASTER="yarn"     # Spark master URL, e.g. spark://master_addr:7077; leave empty for local mode
export ZEPPELIN_ADDR=147.100.175.224   # bind address (default 127.0.0.1)
export ZEPPELIN_PORT=8080              # port to listen on (default 8080)
export ZEPPELIN_ALLOWED_ORIGINS="*"
export ZEPPELIN_NOTEBOOK_DIR=/zeppelin/notebook
export SPARK_HOME=/usr/local/spark
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
```

### Run Zeppelin *as the zeppelin user*

From /usr/local/zeppelin-0.10.1-bin-all:

```
bin/zeppelin-daemon.sh start
```

### Zeppelin / Spark interpreter configuration

- spark.master : yarn
- spark.submit.deployMode : client
- spark.jars : /usr/share/java/sansa-stack-spark_2.12-0.8.0-RC3-SNAPSHOT-jar-with-dependencies.jar
- zeppelin.spark.uiWebUrl : https://yarn.metabolomics-datalake.ara.inrae.fr/cluster/apps/{{applicationId}}

### Useful when jobs get stuck on the cluster

```
yarn application -list
yarn application -kill application_1635934656628_0091
```

### URLs

- https://zeppelin.metabolomics-datalake.ara.inrae.fr/
- https://yarn.metabolomics-datalake.ara.inrae.fr/
- https://spark.metabolomics-datalake.ara.inrae.fr/
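To sanity-check the same setup outside Zeppelin, a spark-submit can reference the installed SANSA jar directly. This is only a sketch: `your.main.Class` and `my-app.jar` are placeholders for your own application's entry point and jar; the master, deploy mode, and jar path mirror the interpreter settings above.

```shell=
# Sketch only: --class and my-app.jar are placeholders; substitute your
# own application. The --jars path matches the SANSA install step above.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --jars /usr/share/java/sansa-stack-spark_2.12-0.8.0-RC3-SNAPSHOT-jar-with-dependencies.jar \
  --class your.main.Class \
  my-app.jar
```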
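When many jobs are stuck, killing them one by one with `yarn application -kill` is tedious. A small helper sketch, assuming `yarn application -list` prints one application per row with an ID of the form `application_<cluster>_<seq>`: it reads that listing on stdin and emits just the IDs, ready to pipe into the kill command.

```shell
# Extract YARN application IDs (application_<cluster>_<seq>) from a
# `yarn application -list` style listing read on stdin; header lines
# without an ID are simply skipped.
yarn_app_ids() {
  grep -o 'application_[0-9]*_[0-9]*'
}

# Example (destructive, so left commented out):
# yarn application -list | yarn_app_ids | xargs -n1 yarn application -kill
```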