Sqoop Data Migration

Posted by shanhua-fu


Prerequisite: Java and Hadoop must already be installed.
1. Download and extract
Release download location: http://ftp.wayne.edu/apache/sqoop/1.4.6/


2. Edit the configuration file
$ cd $SQOOP_HOME/conf
$ mv sqoop-env-template.sh sqoop-env.sh
Open sqoop-env.sh and edit the following lines:
export HADOOP_COMMON_HOME=/home/hadoop/apps/hadoop-2.6.1/ 
export HADOOP_MAPRED_HOME=/home/hadoop/apps/hadoop-2.6.1/
export HIVE_HOME=/home/hadoop/apps/hive-1.2.1


3. Add the MySQL JDBC driver
cp ~/app/hive/lib/mysql-connector-java-5.1.28.jar $SQOOP_HOME/lib/
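Before moving on, it is worth confirming the driver actually landed in Sqoop's lib directory, since a missing connector only surfaces later as a cryptic ClassNotFoundException. A minimal sanity-check sketch (the SQOOP_HOME default below matches this article's install path; both are assumptions for your environment):

```shell
# Check that a MySQL JDBC driver jar is present in $SQOOP_HOME/lib.
SQOOP_HOME=${SQOOP_HOME:-/usr/local/src/sqoop-1.4.6}
found=no
for jar in "$SQOOP_HOME"/lib/mysql-connector-java-*.jar; do
  # If the glob matched nothing, $jar is the literal pattern and -e is false.
  [ -e "$jar" ] && { echo "driver found: $jar"; found=yes; }
done
[ "$found" = yes ] || echo "no MySQL JDBC driver in $SQOOP_HOME/lib -- copy one in before importing"
```

The glob-plus-`-e` test is the portable way to detect "no match" without enabling shell-specific options like `nullglob`.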

4. Verify the installation

$ cd $SQOOP_HOME/bin

$ sqoop-version

Expected output:

15/12/17 14:52:32 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6

Sqoop 1.4.6 git commit id 5b34accaca7de251fc91161733f906af2eddbe83

Compiled by abe on Fri Aug 1 11:19:26 PDT 2015

At this point the Sqoop installation is complete.

Common commands:

Available commands:
codegen    Generate code to interact with database records
create-hive-table    Import a table definition into Hive
eval    Evaluate a SQL statement and display the results
export    Export an HDFS directory to a database table
help    List available commands
import    Import a table from a database to HDFS
import-all-tables    Import tables from a database to HDFS
import-mainframe    Import datasets from a mainframe server to HDFS
job    Work with saved jobs
list-databases    List available databases on a server
list-tables    List available tables in a database
merge    Merge results of incremental imports
metastore    Run a standalone Sqoop metastore
version    Display version information
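Two of the commands above are useful as connectivity smoke tests before attempting a full import. A sketch using this article's connection details (host `slave1`, database `test`, table `tmp` come from the examples below; treat these as typical invocations, not captured output):

```shell
# List the databases visible to the JDBC user -- confirms host, port, and credentials.
bin/sqoop list-databases \
  --connect jdbc:mysql://slave1:3306/ \
  --username root --password 123456

# Run an ad-hoc query through Sqoop without launching a MapReduce job.
bin/sqoop eval \
  --connect jdbc:mysql://slave1:3306/test \
  --username root --password 123456 \
  --query "SELECT COUNT(*) FROM tmp"
```

If `list-databases` fails, fix connectivity or grants first; the import itself will fail the same way but only after job setup.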

Test example:

Import table data from MySQL into HDFS:

mysql> select * from tmp;               
+------+------+
| id   | name |
+------+------+
|    1 | f    |
|    2 | a    |
|    3 | b    |
|    4 | c    |
+------+------+
4 rows in set (0.00 sec)

First test from the Sqoop machine whether MySQL is reachable:
[root@master sqoop-1.4.6]# mysql -h slave1 -uroot -p123456
ERROR 1045 (28000): Access denied for user 'root'@'master' (using password: YES)
The user needs to be granted remote access:
mysql> show grants;
+----------------------------------------------------------------------------------------------------------------------------------------+
| Grants for root@localhost                                                                                                              |
+----------------------------------------------------------------------------------------------------------------------------------------+
| GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost' IDENTIFIED BY PASSWORD '*6BB4837EB74329105EE4568DDA7DC67ED2CA2AD9' WITH GRANT OPTION |
+----------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)

mysql> grant all privileges on *.* to 'root'@'%' identified by '123456';
Query OK, 0 rows affected (0.00 sec)

mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)
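Granting ALL on *.* to root from any host works for a test cluster but is broader than Sqoop needs. A narrower alternative, run non-interactively with the mysql client (the dedicated `sqoop` account name is a suggestion, not from the original; host, database, and password are this article's values):

```shell
# Scoped grant: a dedicated account with read-only access to the test database
# is enough for Sqoop imports (exports would additionally need INSERT).
mysql -h slave1 -uroot -p123456 -e \
  "GRANT SELECT ON test.* TO 'sqoop'@'%' IDENTIFIED BY '123456'; FLUSH PRIVILEGES;"
```

This uses MySQL 5.x syntax, matching the server output shown above; MySQL 8 splits this into CREATE USER plus GRANT.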

#################### Example ####################

[root@master sqoop-1.4.6]# pwd
/usr/local/src/sqoop-1.4.6

[root@master sqoop-1.4.6]# bin/sqoop import --connect jdbc:mysql://slave1:3306/test --username root --password 123456 --table tmp --m 1

...

18/01/18 02:20:47 INFO mapreduce.ImportJobBase: Transferred 16 bytes in 146.6658 seconds (0.1091 bytes/sec)
18/01/18 02:20:47 INFO mapreduce.ImportJobBase: Retrieved 4 records.
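The "Transferred 16 bytes" figure lines up exactly with the four rows: each record serializes as `id,name` plus a newline, 4 bytes per row. A quick local check, no Sqoop required:

```shell
# Reproduce the imported file's contents and count bytes:
# four rows of "N,x\n" at 4 bytes each = 16 bytes total.
printf '1,f\n2,a\n3,b\n4,c\n' | wc -c
```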

Success!

Note: Sqoop runs as a MapReduce job, so every MapReduce node must be able to resolve and reach the MySQL host (i.e. each node should be able to ping the MySQL machine).
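A quick way to verify that resolution works is to check each node before launching the job. A small sketch (`slave1` is this article's MySQL hostname; substitute your own, and add an /etc/hosts entry or DNS record on any node where the check fails):

```shell
# Run on each MapReduce node: verify the MySQL host resolves before importing.
check_host() {
  if getent hosts "$1" >/dev/null 2>&1; then
    echo "$1 resolves"
  else
    echo "$1 does not resolve; add it to /etc/hosts or DNS on this node"
  fi
}
check_host slave1
```

`getent hosts` consults the same resolver order (files, then DNS) that the JVM's socket code will use, so it is a closer match than `nslookup`, which queries DNS only.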

 

When no target path is specified, the data lands in the default location:

[root@master sqoop-1.4.6]# hadoop fs -ls /user/root/tmp/
Found 2 items
-rw-r--r-- 3 root supergroup 0 2018-01-18 02:20 /user/root/tmp/_SUCCESS
-rw-r--r-- 3 root supergroup 16 2018-01-18 02:20 /user/root/tmp/part-m-00000
[root@master sqoop-1.4.6]# hadoop fs -cat /user/root/tmp/part-m-00000
1,f
2,a
3,b
4,c
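Because the part file is plain comma-delimited text, ordinary shell tools can post-process it directly. A minimal sketch (on the cluster the input would come from `hadoop fs -cat /user/root/tmp/part-m-00000`; here the same four rows are fed in inline):

```shell
# Extract the name column from Sqoop's comma-delimited output.
printf '1,f\n2,a\n3,b\n4,c\n' | awk -F, '{ print $2 }'
# one name per line: f, a, b, c
```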

 
