
Unable to change memory allocation in Cloud Function

I have been trying to increase the memory allocation for some of my Cloud Functions for the past few hours. Whenever I change it and deploy, the memory allocation stays at 512 MiB. It was working when I tried it a few days ago.

This is what I am doing:

1. Click Edit on the function.
2. Change the allocated memory to 2 GiB, then click Next and Deploy.
3. The allocated memory remains at 512 MiB after deploying.

What am I doing wrong? Can someone help me out on this please?

I'm unable to reproduce your experience using the Cloud Console and gcloud.

gcloud functions describe ${NAME} \
--region=${REGION} \
--project=${PROJECT} \
--format="value(availableMemoryMb)"
256

Then I revised it in the Console to 512MB and:

gcloud functions describe ${NAME} \
--region=${REGION} \
--project=${PROJECT} \
--format="value(availableMemoryMb)"
512

Then I revised it to 1024MB using gcloud functions deploy:

gcloud functions deploy ${NAME} \
--trigger-http \
--entry-point=${FUNCTION} \
--region=${REGION} \
--project=${PROJECT} \
--memory=1024MB

gcloud functions describe ${NAME} \
--region=${REGION} \
--project=${PROJECT} \
--format="value(availableMemoryMb)"
1024
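
To reach the 2 GiB the question is targeting, the same approach should work with --memory=2048MB (a sketch assuming a 1st-gen, HTTP-triggered function; NAME, FUNCTION, REGION and PROJECT are placeholders for your own values):

# Redeploy the existing function with 2 GiB (2048 MB) of memory
gcloud functions deploy ${NAME} \
--trigger-http \
--entry-point=${FUNCTION} \
--region=${REGION} \
--project=${PROJECT} \
--memory=2048MB

# Verify the change; availableMemoryMb should now report 2048
gcloud functions describe ${NAME} \
--region=${REGION} \
--project=${PROJECT} \
--format="value(availableMemoryMb)"

Whatever the Console shows, the describe output reflects the memory that is actually deployed, so it is worth checking there if the UI appears stuck at 512 MiB.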
