
Memory issues in a Cloud Storage Function

I have deployed a storage-triggered Cloud Function that needs more memory. While deploying the GCF, I passed the appropriate flags:

gcloud functions deploy GCF_name --runtime python37 --trigger-resource bucket_name --trigger-event google.storage.object.finalize --timeout 540s --memory 8192MB

But in the Google Cloud console I observed that the memory utilization graph never goes beyond 2 GB. And in the logs I get this error, which happens because of the memory shortage: Function execution took 34566 ms, finished with status: 'connection error'. Can I get some help with this?

[Image: memory utilization graph]

Edit:

The application uploads text files containing a certain number of samples to the storage bucket. Each file is read when it is uploaded, and its data is appended to a pre-existing file. The total number of samples will be at most 75600002; that's why I need 8 GB of memory. The connection error occurs while appending the data to the file.

def write_to_file(filename, data, write_meta=False, metadata=None):
    # Append an optional metadata header and the samples to a file in /tmp.
    # (metadata defaults to None rather than a mutable [] default.)
    with open('/tmp/' + filename, 'a+') as file1:
        if write_meta:
            file1.write(':'.join(metadata or []))
            file1.write('\n')
        file1.write(','.join(data.astype(str)))
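For context, a minimal local run of this function (assuming `data` is a NumPy array, which the `astype(str)` call suggests; the filename and values here are made up) would look like:

```python
import os
import numpy as np

def write_to_file(filename, data, write_meta=False, metadata=None):
    # Append an optional metadata header and the samples as comma-separated text.
    with open('/tmp/' + filename, 'a+') as f:
        if write_meta:
            f.write(':'.join(metadata or []))
            f.write('\n')
        f.write(','.join(data.astype(str)))

# Hypothetical usage: start fresh, then append three samples with a header line.
if os.path.exists('/tmp/samples.txt'):
    os.remove('/tmp/samples.txt')
samples = np.array([1.5, 2.5, 3.5])
write_to_file('samples.txt', samples, write_meta=True, metadata=['sensor', 'A'])
print(open('/tmp/samples.txt').read())  # sensor:A on one line, 1.5,2.5,3.5 on the next
```

Note that because the mode is `'a+'`, repeated invocations keep growing the same `/tmp` file, which matters for the memory behavior discussed below.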

The memory utilization graph was the same after every upload.

You are writing a file to /tmp, which is an in-memory filesystem. So start by deleting that file when you finish uploading it. In fact, the docs say:

Files that you write consume memory available to your function, and sometimes persist between invocations. Failing to explicitly delete these files may eventually lead to an out-of-memory error and a subsequent cold start.

Ref: https://cloud.google.com/functions/docs/bestpractices/tips#always_delete_temporary_files
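The cleanup that tip describes can be sketched as follows; `upload` stands in for whatever client call actually pushes the file to the bucket (for example `blob.upload_from_filename` from the google-cloud-storage library), which is an assumption here:

```python
import os

def process_file(filename, data_line, upload):
    """Append data to a /tmp file, upload it, then free the in-memory copy."""
    tmp_path = os.path.join('/tmp', filename)
    with open(tmp_path, 'a+') as f:
        f.write(data_line)
    try:
        upload(tmp_path)  # hypothetical callable that uploads the file
    finally:
        # /tmp is an in-memory filesystem on Cloud Functions, so the file
        # must be deleted explicitly or it keeps consuming function memory
        # across invocations.
        if os.path.exists(tmp_path):
            os.remove(tmp_path)

# Usage with a stand-in upload callable:
process_file('demo.txt', '1,2,3\n', upload=lambda path: None)
```

The `try`/`finally` ensures the temporary file is removed even if the upload raises, so a failed invocation does not leak memory into the next one.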
