I searched around for a way to batch-delete Bigtable tables and BigQuery datasets (using Python's client libraries), without any luck so far.
Is anyone aware of an efficient way to do that?
I looked into these links, but found nothing promising.
I'm looking for something similar to this snippet from the Datastore documentation:
from google.cloud import datastore
# For help authenticating your client, visit
# https://cloud.google.com/docs/authentication/getting-started
client = datastore.Client()
keys = [client.key("Task", 1), client.key("Task", 2)]
client.delete_multi(keys)
I think it's not possible natively; you have to develop your own script.
For example, you can configure the list of tables to delete, and then there are several solutions:
Develop a Python script that loops over the tables to delete, using the Python BigQuery and Bigtable clients:
https://cloud.google.com/bigquery/docs/samples/bigquery-delete-dataset
https://cloud.google.com/bigtable/docs/samples/bigtable-hw-delete-table
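For the Python route, a minimal sketch could look like the following. The dataset, instance, and table ids are placeholders, and the functions take already-constructed clients so the deletion logic stays testable; this assumes the google-cloud-bigquery and google-cloud-bigtable packages:

```python
# Sketch (untested against a real project): batch-delete helpers for
# BigQuery datasets and Bigtable tables. All ids below are placeholders.

def delete_bigquery_datasets(bq_client, dataset_ids):
    """Drop each dataset, including any tables it still contains."""
    for dataset_id in dataset_ids:
        # delete_contents=True also removes the tables inside the dataset;
        # not_found_ok=True makes re-runs idempotent.
        bq_client.delete_dataset(dataset_id,
                                 delete_contents=True,
                                 not_found_ok=True)

def delete_bigtable_tables(instance, table_ids):
    """Drop each table in the given Bigtable instance."""
    for table_id in table_ids:
        instance.table(table_id).delete()

# Example wiring with the real clients (needs google-cloud-bigquery and
# google-cloud-bigtable installed, plus application credentials):
#
#   from google.cloud import bigquery, bigtable
#   delete_bigquery_datasets(bigquery.Client(), ["stale_ds_1", "stale_ds_2"])
#   bt_client = bigtable.Client(project="my-project", admin=True)
#   delete_bigtable_tables(bt_client.instance("my-instance"),
#                          ["stale_table_1", "stale_table_2"])
```

Passing the clients in (rather than constructing them inside the loop) also makes it easy to dry-run the script against stubs before pointing it at a real project.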
Develop a shell script that loops over the tables to delete, using the bq and cbt command-line tools (from the gcloud SDK):
https://cloud.google.com/bigquery/docs/managing-tables?hl=en#deleting_a_table
https://cloud.google.com/bigtable/docs/cbt-reference?hl=fr
If it's possible on your side, you can also use Terraform to delete multiple BigQuery and Bigtable tables, but it's better suited when you need to manage a state for your infrastructure:
https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/bigquery_table
https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/bigtable_table
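As a rough sketch of the Terraform route (all ids are placeholders): you would first terraform import the existing tables into state, then delete them either by removing these blocks and applying, or by running terraform destroy on them.

```hcl
# Hypothetical ids; import the existing tables into state first.
resource "google_bigquery_table" "doomed" {
  for_each            = toset(["old_table_1", "old_table_2"])
  dataset_id          = "my_dataset"
  table_id            = each.key
  deletion_protection = false # must be false before Terraform may delete
}

resource "google_bigtable_table" "doomed" {
  for_each      = toset(["bt_table_1", "bt_table_2"])
  instance_name = "my-instance"
  name          = each.key
}
```

As the answer notes, this only pays off if you already manage (or want to manage) these resources as infrastructure state; for a one-off cleanup, the script options above are simpler.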