
Google Cloud SQL import - ERROR: HTTPError 403: The client is not authorized to make this request

I am trying to import a database stored in Cloud Storage using the command:

gcloud sql instances import instance-name gs://connect-to-the-cloud-sql.appspot.com/my-cloud-sql-instance-backup

But I am getting this error:

ERROR: (gcloud.sql.instances.import) HTTPError 403: The client is not authorized to make this request.

I've already logged in using:

gcloud auth login

Make sure the instance name is correct. I had the same error, and it went away as soon as I corrected the instance name.
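As a quick sanity check (assuming gcloud is already pointed at the right project), you can list the instances your account can see and compare the NAME column against what you pass to the import command:

gcloud sql instances list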

I had this problem and my instance name was correct. It turned out I was in the wrong GCP project. Make sure you switch to the correct target project or pass the --project flag:

gcloud sql instances export my-cloud-sql-instance gs://connect-to-the-cloud-sql.appspot.com/my-cloud-sql-instance-backup --project=<your target project>
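A quick way to check and correct the active project (the project ID below is a placeholder):

gcloud config get-value project

# if it prints the wrong project:
gcloud config set project your-target-project-id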

In my case it was because the cloud sql instance service account didn't have the correct permissions on the storage bucket I was trying to import from.

From the docs:

  1. Describe the instance you are importing to:

     gcloud sql instances describe [INSTANCE_NAME]
  2. Copy the serviceAccountEmailAddress field.

  3. Use gsutil iam to grant the legacyBucketWriter and objectViewer Cloud IAM roles to the service account for the bucket.

  4. Import the database:

     gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET_NAME]/[IMPORT_FILE_NAME] \
         --database=[DATABASE_NAME]

It might sound too obvious, but the account you are running the import with really may be missing the access rights needed to import data. Check on the IAM & Admin page that it has a role which includes the cloudsql.instances.import permission.
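For example (the project ID and e-mail below are placeholders), you can inspect the current bindings and, if needed, grant a role that contains that permission, such as roles/cloudsql.admin:

gcloud projects get-iam-policy your-project-id

gcloud projects add-iam-policy-binding your-project-id \
    --member=user:you@example.com \
    --role=roles/cloudsql.admin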

A newer step-by-step approach:

gcloud sql instances describe name-instance | grep serviceAccountEmailAddress

# output: serviceAccountEmailAddress: account@gcp-sa.com

gsutil iam ch serviceAccount:account@gcp-sa.com:roles/storage.legacyBucketWriter gs://bucket-destino
gsutil iam ch serviceAccount:account@gcp-sa.com:roles/storage.objectViewer gs://bucket-destino
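To confirm the bindings were applied, you can dump the bucket's IAM policy:

gsutil iam get gs://bucket-destino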

# ----------- on a Linux VM in GCP --------------------------------------------------------------------------------
gcloud init   # make id-project-bucket-destino (the project of the destination bucket where the dump will be stored) the default project on the VM
gcloud config set project id-project-bucket-destino

gcloud sql export sql --project=id-project-instance name-instance gs://bucket-destino/sqldump.sql \
--database=name-database \
--offload

# ----------cron job in linux------------------------------------------------------------------------------------
#!/bin/sh

# "Create" a dated directory in Cloud Storage (GCS has no real directories,
# so uploading a placeholder object under the prefix is enough)

datedirect=$(date '+%d-%m-%Y')

echo "$datedirect"

touch file5

gsutil cp -r ./file5 gs://bucket-destino/$datedirect/

# switch to the project that owns the destination bucket
gcloud config set project id-project-bucket-destino

# export the database from the instance's project into the dated folder
gcloud sql export sql --project=id-project-instance name-instance gs://bucket-destino/$datedirect/sqldump.sql \
--database=name-database \
--offload
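To run this automatically, the script can be scheduled with cron (the path and schedule below are placeholders):

# crontab -e
0 2 * * * /home/youruser/sql-backup.sh >> /var/log/sql-backup.log 2>&1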
