
GCP Project Migration - Maintain IDs

I'm migrating a project from no organization to a new organization. Will the project ID, dataset IDs, and any other IDs remain the same? Are there any potential disruptions besides those mentioned in this document?

According to the official documentation, the project ID can only be set when you create the project; it cannot be modified afterwards. So when you migrate your project to a new organization, the project ID will not change.
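The migration itself is a single `gcloud projects move` command. The sketch below is a dry run (it only prints the command); the project and organization IDs are placeholders, not values from the question:

```shell
# Placeholders - substitute your own values.
PROJECT_ID="my-project"   # assumption: your existing project ID (unchanged by the move)
ORG_ID="123456789012"     # assumption: the destination organization's numeric ID

# Dry-run helper: prints each command instead of running it.
# Replace the body with "$@" to actually execute.
run() { echo "+ $*"; }

run gcloud projects move "$PROJECT_ID" --organization="$ORG_ID"
```

Running the move requires `resourcemanager.projects.create` on the destination organization and `resourcemanager.projects.update` on the project.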

Regarding your question about the dataset IDs: once you create a dataset it can't be relocated, but you can copy a dataset, either manually or using the BigQuery Data Transfer Service. When you copy your dataset to a new location, you can choose to create the new dataset with the same ID or use a different one.
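With the BigQuery Data Transfer Service, a dataset copy is created as a transfer config using the `cross_region_copy` data source. The dry-run sketch below only prints the command; all project and dataset names are placeholders, and the target dataset must already exist before the transfer runs:

```shell
# Placeholders - substitute your own values.
SRC_PROJECT="my-project"  # assumption: project holding the source dataset
SRC_DATASET="analytics"   # assumption: source dataset ID
DST_DATASET="analytics"   # the same ID is allowed; a different ID also works

# Dry-run helper: prints each command instead of running it.
run() { echo "+ $*"; }

run bq mk --transfer_config \
  --data_source=cross_region_copy \
  --target_dataset="$DST_DATASET" \
  --display_name="copy-analytics" \
  --params="{\"source_dataset_id\":\"$SRC_DATASET\",\"source_project_id\":\"$SRC_PROJECT\"}"
```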

To manually move a dataset from one location to another, follow this process:

1. Export the data from your BigQuery tables to a Cloud Storage bucket in either the same location as your dataset or in a location contained within your dataset's location. For example, if your dataset is in the EU multi-region location, you could export your data to the europe-west1 (Belgium) location, which is part of the EU.

2. Copy or move the data from your export Cloud Storage bucket to a new bucket you created in the destination location. For example, if you are moving your data from the US multi-region to the asia-northeast1 Tokyo region, you would transfer the data to a bucket you created in Tokyo. For information on transferring Cloud Storage objects, see Copying, renaming, and moving objects in the Cloud Storage documentation.

3. After you transfer the data to a Cloud Storage bucket in the new location, create a new BigQuery dataset in the new location. Then, load your data from the Cloud Storage bucket into BigQuery.
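The three manual steps above can be sketched with the `bq` and `gsutil` tools. This is a dry run (it only prints the commands); the table, dataset, and bucket names are placeholders, and the example assumes an export to Avro and a US-to-Tokyo move:

```shell
# Placeholders - substitute your own values.
PROJECT="my-project"
TABLE="analytics.events"                  # assumption: source table to move
SRC_BUCKET="gs://my-export-bucket-us"     # bucket in (or within) the source location
DST_BUCKET="gs://my-export-bucket-tokyo"  # bucket created in the destination location
NEW_DATASET="analytics_tokyo"             # assumption: new dataset ID in the new location

# Dry-run helper: prints each command instead of running it.
# Replace the body with "$@" to actually execute.
run() { echo "+ $*"; }

# 1. Export the table to Cloud Storage in the dataset's location.
run bq extract --destination_format=AVRO "$TABLE" "$SRC_BUCKET/events-*.avro"

# 2. Copy the exported files to the bucket in the destination location.
run gsutil cp -r "$SRC_BUCKET/events-*.avro" "$DST_BUCKET/"

# 3. Create the new dataset in the new location, then load the data into it.
run bq mk --location=asia-northeast1 --dataset "$PROJECT:$NEW_DATASET"
run bq load --source_format=AVRO "$NEW_DATASET.events" "$DST_BUCKET/events-*.avro"
```

Remember that export, copy, and load are billed operations, and that the old dataset is untouched: delete it yourself once you have verified the data in the new location.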

