Hadoop + HBase + Spark Pseudo-Distributed Deployment (macOS)


(Linux and Windows users can also use this as a reference.)

First, set the hostname (recommended):

sudo scutil --set HostName localhost
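One prerequisite the original steps assume: start-dfs.sh connects to localhost over SSH, so passwordless SSH is usually needed as well (on macOS this also means enabling Remote Login under System Preferences → Sharing). A minimal sketch of the standard key setup, assuming no key exists yet:

```shell
# Ensure the .ssh directory exists with safe permissions.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
# Generate a key only if one does not already exist.
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
# Authorize the key for logins to this machine.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# BatchMode avoids interactive prompts; a failure here usually means
# Remote Login is not enabled.
ssh -o BatchMode=yes -o StrictHostKeyChecking=no -o ConnectTimeout=5 localhost exit \
  && echo "passwordless ssh ok" \
  || echo "ssh to localhost failed; check Remote Login"
```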


Download and install Hadoop:

brew install hadoop

Change into the Hadoop configuration directory:

cd /usr/local/Cellar/hadoop/2.7.3/libexec/etc/hadoop

Edit core-site.xml:

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>file:/usr/local/Cellar/hadoop/2.7.3/libexec/tmp</value>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>

Edit hdfs-site.xml:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/local/Cellar/hadoop/2.7.3/libexec/tmp/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/usr/local/Cellar/hadoop/2.7.3/libexec/tmp/dfs/data</value>
  </property>
</configuration>


Add the following to /etc/profile:

# Hadoop environment configs
export HADOOP_HOME=/usr/local/Cellar/hadoop/2.7.3/libexec
export PATH=$PATH:${HADOOP_HOME}/bin
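Changes to /etc/profile only take effect in new login shells; to apply them to the current shell, the same exports can be run directly (equivalent to `source /etc/profile` after editing). A quick sanity check might look like:

```shell
# Apply the profile additions in the current shell.
export HADOOP_HOME=/usr/local/Cellar/hadoop/2.7.3/libexec
export PATH=$PATH:${HADOOP_HOME}/bin
# Confirm the Hadoop bin directory is now on PATH.
echo "$PATH" | grep -q "${HADOOP_HOME}/bin" && echo "PATH ok"
```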

Format HDFS:

cd /usr/local/Cellar/hadoop/2.7.3/bin
./hdfs namenode -format

Start HDFS:

cd /usr/local/Cellar/hadoop/2.7.3/sbin
./start-dfs.sh

If the startup succeeded, run

jps

and you should see output similar to:

1206 DataNode
1114 NameNode
1323 SecondaryNameNode
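The PIDs will differ on each run; what matters is that all three daemon names appear. That check can be scripted as a small sketch (shown here against the sample output above; on a live system replace the sample with `jps_out="$(jps)"`):

```shell
# Sample jps output for illustration; on a real system use: jps_out="$(jps)"
jps_out='1206 DataNode
1114 NameNode
1323 SecondaryNameNode'
# -w matches whole words, so "NameNode" does not match "SecondaryNameNode".
for d in NameNode DataNode SecondaryNameNode; do
  if printf '%s\n' "$jps_out" | grep -qw "$d"; then
    echo "$d: running"
  else
    echo "$d: MISSING"
  fi
done
```

On a live cluster, `hdfs dfs -ls /` is another quick smoke test.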


Download and install HBase:

brew install hbase


Change into the HBase configuration directory:

cd /usr/local/Cellar/hbase/1.2.2/libexec/conf

Edit hbase-env.sh:

export HBASE_CLASSPATH=/usr/local/Cellar/hadoop/2.7.3/libexec/etc/hadoop
export HBASE_MANAGES_ZK=true
export HBASE_HOME=/usr/local/Cellar/hbase/1.2.2/libexec
export HBASE_LOG_DIR=${HBASE_HOME}/logs
export HBASE_REGIONSERVERS=${HBASE_HOME}/conf/regionservers


Edit hbase-site.xml (note that hbase.rootdir must use the same address and port as fs.defaultFS in core-site.xml):

<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:8020/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

Add the following to the regionservers file:

localhost


Add the following to /etc/profile:

# HBase environment configs
export HBASE_HOME=/usr/local/Cellar/hbase/1.2.2/libexec
export PATH=$PATH:${HBASE_HOME}/bin



Start HBase:

cd /usr/local/Cellar/hbase/1.2.2/bin
./start-hbase.sh

If it started successfully, run

jps

and you should see output similar to:

30465 HRegionServer
30354 HMaster
1605 HQuorumPeer
1206 DataNode
30534 Jps
1114 NameNode
1323 SecondaryNameNode
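Beyond checking daemons, a quick way to exercise HBase end-to-end is to create, write to, scan, and drop a throwaway table. A sketch that prints such a command script (the table name `smoke` is arbitrary, not from the original); feed its output to `hbase shell` to actually run it:

```shell
# HBase shell commands for a quick end-to-end check; 'smoke' is an
# arbitrary throwaway table name. Run them with:  echo "$cmds" | hbase shell
cmds="create 'smoke', 'cf'
put 'smoke', 'row1', 'cf:a', 'value1'
scan 'smoke'
disable 'smoke'
drop 'smoke'"
printf '%s\n' "$cmds"
```

A table must be disabled before it can be dropped, hence the `disable` step before `drop`.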

Download and install Spark:

brew install apache-spark


Change into the Spark configuration directory:

cd /usr/local/Cellar/apache-spark/1.6.0/libexec/conf

Edit spark-env.sh (create it from the template first):

cp spark-env.sh.template spark-env.sh

export SPARK_HOME=/usr/local/Cellar/apache-spark/1.6.0/libexec
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_102.jdk/Contents/Home


Add the following to /etc/profile:

# Spark environment configs
export SPARK_HOME=/usr/local/Cellar/apache-spark/1.6.0/libexec
export PATH=$PATH:${SPARK_HOME}/bin

Start spark-shell:

cd /usr/local/Cellar/apache-spark/1.6.0/bin
./spark-shell
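Once the shell comes up, a tiny job confirms the whole stack works; `sc` is the SparkContext that spark-shell creates for you. A sketch that prints a one-liner to paste at the `scala>` prompt (the snippet itself is an illustration, not from the original):

```shell
# A line to paste into spark-shell; sc is provided by the shell itself.
# Summing 1..100 should yield 5050 if the executors are working.
snippet='sc.parallelize(1 to 100).reduce(_ + _)'
printf '%s\n' "$snippet"
```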


