Using the Sqoop tool to move data between Hive and MySQL in both directions

Reference:
Sqoop中文手册 (the Sqoop manual in Chinese)

1. List the MySQL databases:
sqoop list-databases --connect jdbc:mysql://192.168.100.13:3306 --username hive --password hive

[root@master sqoop]# sqoop list-databases --connect jdbc:mysql://192.168.100.13:3306/ --username hive --password hive
Warning: /opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p0.8/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/12/08 12:44:24 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.5.0
15/12/08 12:44:24 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/12/08 12:44:25 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
hive
mysql
test
[root@master sqoop]#
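Note the warning in the log above: putting the password on the command line is insecure. Two alternatives Sqoop supports (a sketch, reusing the same connection details; the password-file path is illustrative):

```shell
# Prompt for the password interactively instead of leaving it in shell history
sqoop list-databases --connect jdbc:mysql://192.168.100.13:3306/ --username hive -P

# Or read it from a protected file; the file must contain only the password,
# with no trailing newline (hence echo -n)
echo -n 'hive' > /root/.mysql.pw
chmod 400 /root/.mysql.pw
sqoop list-databases --connect jdbc:mysql://192.168.100.13:3306/ --username hive \
  --password-file file:///root/.mysql.pw
```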

#################

2. List the tables in the MySQL database hive:

sqoop list-tables --connect jdbc:mysql://192.168.100.13:3306/hive --username hive --password hive

##################

3. Copy a table structure:
Copy the structure of a MySQL table into Hive. Only the table definition is copied; the rows are not.

sqoop create-hive-table --connect jdbc:mysql://192.168.100.13:3306/hive --username hive --password hive \
--table tbl_6005  --hive-table sqoop_test

[root@master sqoop]# sqoop create-hive-table --connect jdbc:mysql://192.168.100.13:3306/hive --username hive --password hive \
> --table tbl_6005  --hive-table sqoop_test
Warning: /opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p0.8/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/12/08 13:29:33 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.5.0
15/12/08 13:29:33 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/12/08 13:29:33 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
15/12/08 13:29:33 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
15/12/08 13:29:34 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/12/08 13:29:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tbl_6005` AS t LIMIT 1
15/12/08 13:29:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tbl_6005` AS t LIMIT 1
15/12/08 13:29:46 INFO hive.HiveImport: Loading uploaded data into Hive

Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p0.8/jars/hive-common-1.1.0-cdh5.5.0.jar!/hive-log4j.properties
OK
Time taken: 12.528 seconds
[root@master sqoop]#
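As a quick sanity check after create-hive-table finishes, you can inspect the copied schema from the Hive CLI (sqoop_test is the Hive table created above; since only the structure was copied, it should contain no rows yet):

```shell
# Show the schema Sqoop generated in Hive (structure only at this point)
hive -e "DESCRIBE default.sqoop_test"

# The table should be empty until an import is run
hive -e "SELECT COUNT(*) FROM default.sqoop_test"
```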

###########################

4. Import data from the MySQL database into Hive:

sqoop import --connect jdbc:mysql://192.168.100.13:3306/hive --username hive --password hive \
--table tbl_6005 --hive-import --hive-table sqoop_test -m 1

15/12/08 13:35:01 INFO mapreduce.ImportJobBase: Transferred 11.4287 KB in 66.9003 seconds (174.9319 bytes/sec)
15/12/08 13:35:01 INFO mapreduce.ImportJobBase: Retrieved 72 records.
15/12/08 13:35:01 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tbl_6005` AS t LIMIT 1
15/12/08 13:35:01 INFO hive.HiveImport: Loading uploaded data into Hive

Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p0.8/jars/hive-common-1.1.0-cdh5.5.0.jar!/hive-log4j.properties
OK
Time taken: 1.49 seconds
Loading data to table default.sqoop_test
chgrp: changing ownership of 'hdfs://mycluster/user/hive/warehouse/sqoop_test/part-m-00000': User does not belong to hive
Table default.sqoop_test stats: [numFiles=1, totalSize=11703]
OK
Time taken: 11.517 seconds
[root@master sqoop]#
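The transfer also works in the other direction. A sketch of exporting the imported Hive data back into MySQL with sqoop export (the target table, here called tbl_6005_copy, is hypothetical and must already exist in MySQL with a matching schema; the export directory is the warehouse path visible in the log above):

```shell
# Export the files Hive stores for sqoop_test back into a MySQL table.
# Hive's default field delimiter is \001, so Sqoop must be told to use it
# when parsing the input files.
sqoop export --connect jdbc:mysql://192.168.100.13:3306/hive \
  --username hive --password hive \
  --table tbl_6005_copy \
  --export-dir /user/hive/warehouse/sqoop_test \
  --input-fields-terminated-by '\001' -m 1
```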

##############################
# JDBC driver version bug: MySQL Connector/J 5.1.17 has a bug that causes the error below; switching to 5.1.32 (or a newer version) makes it work.
http://blog.csdn.net/wangtao6791842/article/details/41041677

15/12/08 13:09:24 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@6438b6d is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@6438b6d is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:934)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
        at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2735)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1899)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
        at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1524)
        at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)
        at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)
        at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:445)
        at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
        at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
        at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
        at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
        at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1834)
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1646)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
15/12/08 13:09:24 ERROR tool.ExportTool: Encountered IOException running export job: java.io.IOException: No columns to generate for ClassWriter
[root@master sqoop]#
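One way to apply the driver fix is to swap the Connector/J jar in Sqoop's lib directory (the path follows the CDH parcel layout shown in the logs above; the exact jar file names are assumptions to verify on your own cluster):

```shell
# Locate the bundled MySQL driver jar under the Sqoop lib directory
SQOOP_LIB=/opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p0.8/lib/sqoop/lib
ls "$SQOOP_LIB" | grep mysql-connector

# Remove the buggy 5.1.17 jar and drop in a newer one (file names illustrative)
rm "$SQOOP_LIB/mysql-connector-java-5.1.17.jar"
cp mysql-connector-java-5.1.32-bin.jar "$SQOOP_LIB/"
```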