
Copy file to Hadoop HDFS using Scala?

I'm trying to copy a file on my local machine to HDFS. However, I'm not sure how to do this in Scala, since the script I'm writing currently writes to a local CSV file. How can I move this file to HDFS using Scala?

Edit: here is what I have so far:

    val hiveServer = new HiveJDBC
    val file = new File(TMP_DIR, fileName)
    val firstRow = getFirstRow(tableName, hiveServer)
    val restData = getRestData(tableName, hiveServer)
    withPrintWriter(file) { printWriter =>
      printWriter.write(firstRow)
      printWriter.write("\n")
      printWriter.write(restData)
    }

I now want to store "file" in HDFS.

Scala can invoke the Hadoop API directly. For example:

    import java.io.PrintWriter

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    val conf = new Configuration()
    val fs = FileSystem.get(conf)

    // Create (or overwrite) the file on HDFS and write to it directly
    val output = fs.create(new Path("/your/path"))
    val writer = new PrintWriter(output)
    try {
      writer.write(firstRow)
      writer.write("\n")
      writer.write(restData)
    } finally {
      writer.close()
    }
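Alternatively, since the file in the question has already been written to the local disk, the upload can be done with a single FileSystem.copyFromLocalFile call instead of re-writing the contents. A minimal sketch, assuming placeholder paths for the local CSV and the HDFS destination:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    val conf = new Configuration()
    val fs = FileSystem.get(conf)

    // Placeholder paths for illustration: the CSV written locally
    // and the desired location on HDFS
    val localFile = new Path("/tmp/output.csv")
    val hdfsDest  = new Path("/user/me/output.csv")

    // Copies the local file to HDFS; the local source is kept
    fs.copyFromLocalFile(localFile, hdfsDest)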

Add the following code to the run method (here getConf() returns the Hadoop Configuration, as in a Tool implementation).

    import java.io.{File, FileInputStream, FileNotFoundException, IOException, InputStream}

    import org.apache.hadoop.fs.{FSDataOutputStream, FileSystem, Path}
    import org.apache.hadoop.io.IOUtils

    val conf = getConf()
    val hdfs = FileSystem.get(conf)
    val localInputFilePath = args(0)
    val inputFileName = getFileName(localInputFilePath)

    val hdfsDestinationPath = args(1)
    val hdfsDestFilePath = new Path(hdfsDestinationPath + File.separator + inputFileName)

    try {
      // Stream the local file into a newly created HDFS file
      val inputStream: InputStream = new FileInputStream(localInputFilePath)
      val fsdos: FSDataOutputStream = hdfs.create(hdfsDestFilePath)
      // The trailing 'true' closes both streams once the copy completes
      IOUtils.copyBytes(inputStream, fsdos, conf, true)
    } catch {
      case fnfe: FileNotFoundException => fnfe.printStackTrace()
      case ioe: IOException            => ioe.printStackTrace()
    }
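For completeness, here is a minimal sketch of the driver class such a run method would sit in. The object name CopyToHdfs is hypothetical; extending Configured is what supplies getConf():

    import org.apache.hadoop.conf.{Configuration, Configured}
    import org.apache.hadoop.util.{Tool, ToolRunner}

    // Hypothetical driver object for illustration
    object CopyToHdfs extends Configured with Tool {
      override def run(args: Array[String]): Int = {
        // ... copy logic from above goes here ...
        0
      }

      def main(cliArgs: Array[String]): Unit = {
        // ToolRunner parses generic Hadoop options (e.g. -D key=value)
        // and hands the resulting Configuration to the Tool via setConf
        val exitCode = ToolRunner.run(new Configuration(), CopyToHdfs, cliArgs)
        System.exit(exitCode)
      }
    }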
