
Azure Storage Blob: Uploaded CSV file shows zero bytes

The problem in the title is very similar to this earlier question (Azure storage: Uploaded files with size zero bytes), but that one was for .NET. In my Java scenario I am uploading small CSV files on a daily basis (under about 5 KB per file). In addition, my code uses the latest version of the Azure API, in contrast to the 2010 version used in the other question.

I can't figure out what I have missed. An alternative would be to do it in File Storage, but several of my peers recommended the blob approach.

So far, I have mostly based my code on uploading a file as a block blob, following the sample shown on the Azure Samples git page (https://github.com/Azure-Samples/storage-blob-java-getting-started/blob/master/src/BlobBasics.java). The container setup and file-renaming steps work fine, but after uploading, the file in the blob storage container on my Azure domain shows a size of 0 bytes.

I have also tried converting the file into a FileInputStream and uploading it as a stream, but it behaves the same way.

fileName = event.getFilename(); // fileName is e.g. eod1234.csv
String tempdir = System.getProperty("java.io.tmpdir");
file = new File(tempdir + File.separator + fileName);
try {
    PipedOutputStream pos = new PipedOutputStream();
    stream = new PipedInputStream(pos);
    buffer = new byte[stream.available()];
    stream.read(buffer);
    FileInputStream fils = new FileInputStream(file);
    int content = 0;
    while ((content = fils.read()) != -1) {
        System.out.println((char) content);
    }
    // OutputStream was written as a test previously but didn't work
    OutputStream outStream = new FileOutputStream(file);
    outStream.write(buffer);
    outStream.close();

    // container name is "testing1"
    CloudBlockBlob blob = container.getBlockBlobReference(fileName);
    if (fileName.length() > 0) {
        blob.upload(fils, file.length()); // this is testing with FileInputStream
        blob.uploadFromFile(fileName);    // preferred: just upload from the file
    }
}
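One side note on the snippet above: the while-loop reads the FileInputStream `fils` all the way to EOF before it is handed to `blob.upload(fils, ...)`, so by the time the upload runs the stream has nothing left to deliver. That behavior can be seen with the standard library alone (file name and contents here are illustrative):

```java
import java.io.*;
import java.nio.file.*;

public class ExhaustedStreamDemo {
    public static void main(String[] args) throws IOException {
        // Create a small temp file with some CSV-like content
        Path tmp = Files.createTempFile("eod", ".csv");
        Files.write(tmp, "a,b,c\n1,2,3\n".getBytes());

        try (FileInputStream in = new FileInputStream(tmp.toFile())) {
            // Drain the stream, as the question's while-loop does
            while (in.read() != -1) { /* consumed byte by byte */ }
            // The stream is now at EOF: nothing remains for a later upload
            System.out.println("bytes remaining: " + in.available()); // prints 0
        }
        Files.delete(tmp);
    }
}
```

Any API that consumes the stream afterwards would therefore see zero bytes; a fresh stream (or an upload straight from the file path) avoids this.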

No error messages are shown; the file reaches blob storage but shows a size of 0 bytes. It is a one-way process that only uploads CSV-format files. The blob container should be showing each uploaded file at a size of 1-5 KB.

Instead of blob.uploadFromFile(fileName); you should use blob.uploadFromFile(file.getAbsolutePath()); because the uploadFromFile method requires an absolute path. And you don't need the blob.upload(fils, file.length()); call.
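The difference can be seen without the Azure SDK at all: a File built from the bare file name resolves against the process's working directory, not the temp directory the file was actually written to. A small sketch (the file name is illustrative):

```java
import java.io.File;

public class PathDemo {
    public static void main(String[] args) {
        String fileName = "eod1234.csv"; // example name, as in the question
        String tempdir = System.getProperty("java.io.tmpdir");
        File file = new File(tempdir + File.separator + fileName);

        // uploadFromFile(fileName) would resolve this against the working directory:
        System.out.println(new File(fileName).getAbsolutePath());
        // whereas the file was actually written here:
        System.out.println(file.getAbsolutePath());
    }
}
```

Passing file.getAbsolutePath() guarantees the SDK opens the same file that was written, regardless of where the JVM was launched from.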

Refer to the Microsoft Docs: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-java#upload-blobs-to-the-container

The Azure team replied to the same query I sent by mail, and I have confirmed that the problem was not in the API but in the Upload component in Vaadin, which behaves differently than usual (https://vaadin.com/blog/uploads-and-downloads-inputs-and-outputs). Either the CloudBlockBlob or the BlobContainerUrl approach works.

The out-of-the-box Upload component requires manually implementing a FileOutputStream to a temporary object, unlike the usual servlet objects seen everywhere. Since time was limited, I used one of their add-ons, EasyUpload, because it has Viritin's UploadFileHandler incorporated into it, instead of figuring out how to stream the object from scratch. Had there been more time, I would definitely have tried out the MultiFileUpload add-on, which has additional interesting features, in my sandbox workspace.
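The essential work such a receiver has to do is copy the uploaded stream into a temporary file before anything is handed to the blob API. A minimal stdlib-only sketch of that copy step (the method name is hypothetical; in Vaadin this logic would live behind the Receiver that hands back the OutputStream):

```java
import java.io.*;

public class UploadReceiverSketch {
    // Copy an uploaded stream into a temp file, the way an upload
    // receiver must before blob.uploadFromFile(...) can see real bytes.
    static File receiveToTempFile(InputStream in, String fileName) throws IOException {
        File target = File.createTempFile("upload-", "-" + fileName);
        try (OutputStream out = new FileOutputStream(target)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n); // write exactly what was read
            }
        }
        return target; // a non-empty file, ready to upload
    }

    public static void main(String[] args) throws IOException {
        byte[] csv = "a,b\n1,2\n".getBytes();
        File f = receiveToTempFile(new ByteArrayInputStream(csv), "eod1234.csv");
        System.out.println(f.length()); // prints 8
        f.delete();
    }
}
```

If the receiver never writes the incoming bytes to the file, the later upload faithfully transfers an empty file, which matches the 0-byte symptom described in the question.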

I had this same problem with .png files (copied from multipart files). I was doing this:

File file = new File(multipartFile.getOriginalFilename());

and the blobs on Azure were 0 bytes, but when I changed it to this:

File file = new File("C://uploads//"+multipartFile.getOriginalFilename());

it started saving the files properly.
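A portable variant of the same fix, assuming nothing about a C://uploads directory existing, is to write the multipart bytes into a directory that is known to be writable before uploading (the file name and byte contents below stand in for multipartFile.getOriginalFilename() and multipartFile.getBytes()):

```java
import java.io.IOException;
import java.nio.file.*;

public class WritableTargetDemo {
    public static void main(String[] args) throws IOException {
        String originalName = "picture.png";  // e.g. multipartFile.getOriginalFilename()
        byte[] bytes = new byte[]{1, 2, 3};   // e.g. multipartFile.getBytes()

        // new File(originalName) alone resolves against the working directory,
        // which may not be writable; a freshly created temp directory always is.
        Path dir = Files.createTempDirectory("uploads");
        Path file = dir.resolve(originalName);
        Files.write(file, bytes);

        System.out.println(Files.size(file)); // prints 3, not 0
    }
}
```

The resulting path can then be handed to blob.uploadFromFile(file.toString()) with the actual content in place.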

