Configuring and Using libhdfs

Posted by cqdxwjd

This article introduces how to configure and use libhdfs, the C API for HDFS, and hopefully serves as a useful reference.

Test environment: CentOS 6.10, Hadoop 2.7.3, JDK 1.8

Test code: HDFSCSample.c

#include "hdfs.h"
#include <string.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {

    /* Connect to the default HDFS instance configured in core-site.xml (fs.defaultFS) */
    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) {
        fprintf(stderr, "Failed to connect to HDFS!\n");
        exit(-1);
    }

    const char* writePath = "/tmp/testfile.txt";
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY | O_CREAT, 0, 0, 0);
    if (!writeFile) {
        fprintf(stderr, "Failed to open %s for writing!\n", writePath);
        exit(-1);
    }

    char* buffer = "Hello, World!";
    /* strlen(buffer)+1 also writes the terminating '\0' to the file */
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer) + 1);
    if (num_written_bytes < 0) {
        fprintf(stderr, "Failed to write %s!\n", writePath);
        exit(-1);
    }
    if (hdfsFlush(fs, writeFile)) {
        fprintf(stderr, "Failed to 'flush' %s\n", writePath);
        exit(-1);
    }

    hdfsCloseFile(fs, writeFile);
    hdfsDisconnect(fs);
    return 0;
}
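
As a complement to the write test, a minimal read-back sketch along the same lines can confirm the file contents programmatically. It is not part of the original sample; it reuses the same connection calls plus hdfsRead and hdfsDisconnect from the libhdfs API, and the file name HDFSReadSample.c is just a hypothetical choice (compile and run it the same way as below):

#include "hdfs.h"
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {

    hdfsFS fs = hdfsConnect("default", 0);
    const char* readPath = "/tmp/testfile.txt";
    hdfsFile readFile = hdfsOpenFile(fs, readPath, O_RDONLY, 0, 0, 0);
    if (!readFile) {
        fprintf(stderr, "Failed to open %s for reading!\n", readPath);
        exit(-1);
    }

    char buffer[256];
    /* Read up to sizeof(buffer)-1 bytes from the start of the file */
    tSize num_read = hdfsRead(fs, readFile, (void*)buffer, sizeof(buffer) - 1);
    if (num_read < 0) {
        fprintf(stderr, "Failed to read %s!\n", readPath);
        exit(-1);
    }
    buffer[num_read] = '\0';
    printf("Read %d bytes: %s\n", (int)num_read, buffer);

    hdfsCloseFile(fs, readFile);
    hdfsDisconnect(fs);
    return 0;
}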

Compile script:

  compile.sh

#!/bin/bash
export JAVA_HOME=/root/softs/jdk1.8.0_172
export HADOOP_HOME=/root/softs/hadoop-2.7.3/

gcc HDFSCSample.c -I$HADOOP_HOME/include -L$HADOOP_HOME/lib/native -lhdfs -L$JAVA_HOME/jre/lib/amd64/server -ljvm

Run:

  # chmod +x compile.sh

  # ./compile.sh
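
If the link succeeds, gcc writes the binary to ./a.out (no -o option is given). The program is linked dynamically against libhdfs.so and libjvm.so, which is why the run script below puts both library directories on LD_LIBRARY_PATH; ldd can be used to confirm that the two libraries resolve:

  # ldd ./a.out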

Execution script:

  execute.sh

#!/bin/bash
export JAVA_HOME=/root/softs/jdk1.8.0_172
export HADOOP_HOME=/root/softs/hadoop-2.7.3/
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_HOME/jre/lib/amd64/server/

CLASSPATH=./
for f in $HADOOP_HOME/share/hadoop/common/*.jar; do
  CLASSPATH=${CLASSPATH}:$f;
done
for f in $HADOOP_HOME/share/hadoop/common/lib/*.jar; do
  CLASSPATH=${CLASSPATH}:$f;
done
for f in $HADOOP_HOME/share/hadoop/hdfs/*.jar; do
  CLASSPATH=${CLASSPATH}:$f;
done
for f in $HADOOP_HOME/share/hadoop/hdfs/lib/*.jar; do
  CLASSPATH=${CLASSPATH}:$f;
done
export CLASSPATH=$CLASSPATH

./a.out
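
The four loops above simply append every jar under share/hadoop/common and share/hadoop/hdfs (and their lib directories) to CLASSPATH, because libhdfs starts a JVM that needs the Hadoop classes. On Hadoop 2.x builds that provide the hadoop classpath --glob option, the same expanded classpath can usually be produced in one line instead of the loops (verify against your installation):

export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath --glob)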

Run:

  # chmod +x execute.sh

  # ./execute.sh

Open /tmp/testfile.txt in HDFS and you will see that a single line, Hello, World!, has been written.
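
The same check can also be made from the command line, assuming the hdfs client is on the PATH:

  # hdfs dfs -cat /tmp/testfile.txt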

  
