spark standalone mode


1.  Installing Spark Standalone to a Cluster
    To install Spark Standalone mode, you simply place a compiled version of Spark on each node on the cluster.

See also these blog posts:
http://blog.chinaunix.net/uid-29454152-id-5148300.html
http://blog.chinaunix.net/uid-29454152-id-5148347.html
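Since installing Standalone mode is just placing the same compiled build on every node, the copy can be scripted. A minimal sketch, assuming hypothetical hostnames `worker1`/`worker2`, tarball name, and target directory (adjust all three for your cluster); it only echoes the commands as a dry run:

```shell
#!/bin/sh
# Sketch: distribute a prebuilt Spark tarball to each worker node.
# Hostnames, tarball name, and destination are illustrative assumptions.
WORKERS="worker1 worker2"
TARBALL="spark-1.4.0-bin-hadoop2.6.tgz"
DEST=/opt

distribute() {
    for host in $WORKERS; do
        # Dry run: echo the copy/extract commands instead of executing them.
        echo "scp $TARBALL $host:$DEST/"
        echo "ssh $host tar xzf $DEST/$TARBALL -C $DEST"
    done
}
distribute
```

Removing the `echo`s (and having passwordless ssh set up) would perform the actual copy.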

2. Starting a Cluster Manually

   1) At the master

         Command to start the Spark master:

sudo ./sbin/start-master.sh

         The master URL (spark://HOST:PORT) can be found in the web UI at http://localhost:8080
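Instead of relying on defaults, the master's address and ports can be set in `conf/spark-env.sh`, which the launch scripts source at startup. The variable names below are Spark's standalone settings; the values are illustrative:

```shell
# conf/spark-env.sh -- example standalone master settings.
# Variable names are Spark's; the values here are illustrative.
export SPARK_MASTER_IP=192.168.1.10       # address the master binds to
export SPARK_MASTER_PORT=7077             # port workers connect to (default 7077)
export SPARK_MASTER_WEBUI_PORT=8080       # web UI port (default 8080)
```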

    2) At each worker

           Command to connect the worker to the master:

./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT
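Rather than launching every worker by hand, Spark's bundled launch scripts can start the whole cluster: list one worker hostname per line in `conf/slaves`, then run `./sbin/start-all.sh` on the master (passwordless ssh to each worker is assumed). A sketch with illustrative hostnames, written to a temporary directory here just so the snippet is self-contained — in a real install the file is `$SPARK_HOME/conf/slaves`:

```shell
#!/bin/sh
# Sketch: conf/slaves lists the worker hosts (one per line, names illustrative).
# Using mktemp only to keep this snippet self-contained.
SPARK_CONF_DIR=$(mktemp -d)
cat > "$SPARK_CONF_DIR/slaves" <<'EOF'
worker1
worker2
EOF
# ./sbin/start-all.sh   # run on the master; starts it plus one worker per host
```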

3. Connecting an Application to the Cluster

   Command to start an application (here, the interactive shell) on the cluster:

sudo ./bin/spark-shell --master spark://IP:PORT --total-executor-cores <numCores>
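The `--total-executor-cores` flag caps how many cores the application claims across the whole cluster; without it, a standalone application takes every available core. A dry-run sketch that only assembles and prints the command (master address and core count are assumed values):

```shell
#!/bin/sh
# Sketch: build the spark-shell invocation; MASTER_URL and CORES are
# illustrative -- substitute the URL shown in your master's web UI.
MASTER_URL="spark://192.168.1.10:7077"
CORES=4
CMD="./bin/spark-shell --master $MASTER_URL --total-executor-cores $CORES"
echo "$CMD"    # dry run; run the command itself from the Spark directory
```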

4. Submitting a jar

   General form (run from the Spark installation directory):

        ./bin/spark-submit --class path.to.your.class [options] <app jar>

   Example (standalone mode):

        ./bin/spark-submit \
        --class my.main.classname \
        --master spark://127.0.0.1:7077 \
        --executor-memory 2G \
        --total-executor-cores 4 \
        /home/warrior/IdeaProjects/sparkTest/out/artifacts/sparkTest_jar/sparkTest.jar
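For repeated submissions it helps to wrap the command above in a small script. A dry-run sketch using the same (illustrative) class name, master address, and jar path as the example; dropping the `echo` would actually submit:

```shell
#!/bin/sh
# Sketch: reusable spark-submit wrapper; class, master, and jar path
# are the illustrative values from the example above.
MASTER="spark://127.0.0.1:7077"
MAIN_CLASS="my.main.classname"
APP_JAR="/home/warrior/IdeaProjects/sparkTest/out/artifacts/sparkTest_jar/sparkTest.jar"

submit_cmd() {
    # Dry run: prints the full command on one line; remove echo to execute.
    echo ./bin/spark-submit \
        --class "$MAIN_CLASS" \
        --master "$MASTER" \
        --executor-memory 2G \
        --total-executor-cores 4 \
        "$APP_JAR"
}
submit_cmd
```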
        

