HDFS JAVA API

Posted by 代号菜鸟


Hadoop 2.5.2 API reference: http://hadoop.apache.org/docs/r2.5.2/api/index.html

The JUnit test class below walks through the common HDFS operations exposed by org.apache.hadoop.fs.FileSystem: creating a directory, listing and globbing paths, uploading and downloading files with IOUtils, and creating, appending to, and deleting a file.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.IOUtils;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import java.io.*;

/**
 * Created by Administrator on 2017/7/22.
 * JUnit tests for common HDFS operations via the FileSystem API.
 */
public class TestHDFS {

    private FileSystem fs;

    @Before
    public void setUp() throws IOException {
        // Loads core-site.xml / hdfs-site.xml from the classpath to locate the NameNode.
        Configuration conf = new Configuration();
        fs = FileSystem.get(conf);
    }

    @After
    public void tearDown() throws IOException {
        fs.close();
    }

    @Test
    public void testMakedir() throws IOException {
        // Create a directory and print its metadata.
        Path path = new Path("/user/jason/TestMkdir");
        fs.mkdirs(path);
        FileStatus fileStatus = fs.getFileStatus(path);
        System.out.println(fileStatus.isDirectory());
        System.out.println(fileStatus.getOwner());
        System.out.println(fileStatus.getGroup());
        System.out.println(fileStatus.getLen());
    }

    @Test
    public void testListStatus() throws IOException {
        // List the direct children of a directory.
        Path path = new Path("/user/jason");
        FileStatus[] statuses = fs.listStatus(path);
        for (FileStatus status : statuses) {
            System.out.println(status.getPath());
        }
    }

    @Test
    public void testGlobStatus() throws IOException {
        // Match paths against a glob pattern.
        Path path = new Path("/user/jason/test*");
        FileStatus[] statuses = fs.globStatus(path);
        for (FileStatus status : statuses) {
            System.out.println(status.getPath());
        }
    }

    @Test
    public void testFileUpload() throws IOException {
        // Copy a local file into HDFS by streaming it into fs.create().
        String putFileName = "/user/jason/RHDSetup_hdfs.log";
        Path writePath = new Path(putFileName);
        FSDataOutputStream outputStream = fs.create(writePath);
        FileInputStream inputStream = new FileInputStream(new File("C:\\RHDSetup.log"));
        IOUtils.copyBytes(inputStream, outputStream, 4096, false);
        IOUtils.closeStream(inputStream);
        IOUtils.closeStream(outputStream);
    }

    @Test
    public void testFileDownload() throws IOException {
        // Copy an HDFS file to the local file system by streaming from fs.open().
        String downloadFileName = "/user/jason/wcinput";
        Path readPath = new Path(downloadFileName);
        FSDataInputStream inputStream = fs.open(readPath);
        FileOutputStream fileOutputStream = new FileOutputStream(new File("F:\\TestSpace\\wcinput.txt"));
        IOUtils.copyBytes(inputStream, fileOutputStream, 4096, false);
        IOUtils.closeStream(inputStream);
        IOUtils.closeStream(fileOutputStream);
    }

    @Test
    public void testCreateFile() throws IOException {
        // Create a new file and write a line of text into it.
        String newFileName = "/user/jason/newfile.txt";
        Path path = new Path(newFileName);
        FSDataOutputStream fsDataOutputStream = fs.create(path);
        fsDataOutputStream.write("This is a new file".getBytes());
        fsDataOutputStream.close();
    }

    @Test
    public void testModifyFile() throws IOException {
        // Append a line to an existing file; write() appends raw bytes,
        // whereas writeUTF() would also prepend a 2-byte length header.
        String modifyFileName = "/user/jason/newfile.txt";
        Path path = new Path(modifyFileName);
        FSDataOutputStream fsDataOutputStream = fs.append(path);
        fsDataOutputStream.write("add one more row into it\r\n".getBytes());
        fsDataOutputStream.close();
    }

    @Test
    public void testDeleteFile() throws IOException {
        // Delete the path; the second argument enables recursive deletion for directories.
        String modifyFileName = "/user/jason/newfile.txt";
        Path path = new Path(modifyFileName);
        fs.delete(path, true);
    }

}
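The setUp() above obtains the FileSystem from whatever core-site.xml / hdfs-site.xml it finds on the classpath. When running from a client machine without those files, a common alternative is to pass the NameNode URI and user explicitly. A minimal sketch, assuming a NameNode at hdfs://namenode:9000 and the user jason (both placeholders, not values from the original post):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

public class HdfsConnectSketch {
    public static void main(String[] args)
            throws IOException, URISyntaxException, InterruptedException {
        Configuration conf = new Configuration();
        // Placeholder NameNode address and user name; replace with your cluster's values.
        URI nameNode = new URI("hdfs://namenode:9000");
        FileSystem fs = FileSystem.get(nameNode, conf, "jason");
        System.out.println(fs.exists(new Path("/user/jason")));
        fs.close();
    }
}

For the upload and download tests, FileSystem also provides copyFromLocalFile(Path src, Path dst) and copyToLocalFile(Path src, Path dst), which perform the same copies without managing the streams through IOUtils.copyBytes manually.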

 
