
Batch delete BigTable tables and BigQuery datasets

I have searched for a way to batch-delete BigTable tables and BigQuery datasets (using the Python client libraries) without any luck so far.

Is anyone aware of an efficient way to do that?

I looked into these links but nothing promising:

  1. BigQuery
  2. BigTable

I'm looking for something similar to this example from the Datastore documentation:

from google.cloud import datastore

# For help authenticating your client, visit
# https://cloud.google.com/docs/authentication/getting-started
client = datastore.Client()

keys = [client.key("Task", 1), client.key("Task", 2)]
client.delete_multi(keys)

Batch delete

I don't think it's possible natively; you have to write your own script.

For example, you can collect the names of all the tables and datasets you want to delete, and then there are several ways to script the deletion (the simplest being a loop over the client libraries' delete calls).
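A minimal sketch of the looping approach, assuming the `google-cloud-bigquery` and `google-cloud-bigtable` libraries are installed and the clients are authenticated. The helpers take already-constructed client objects; `BigQueryClient.delete_dataset` and `Table.delete` are the documented admin calls, while the instance, dataset, and table names in the usage comment are placeholders.

```python
def delete_bigquery_datasets(bq_client, dataset_ids):
    """Delete each BigQuery dataset and everything in it.

    delete_contents=True removes the dataset's tables as well;
    not_found_ok=True makes the loop idempotent if a dataset is already gone.
    """
    for dataset_id in dataset_ids:
        bq_client.delete_dataset(dataset_id, delete_contents=True, not_found_ok=True)


def delete_bigtable_tables(bt_instance, table_ids):
    """Delete each table on the given Bigtable instance."""
    for table_id in table_ids:
        bt_instance.table(table_id).delete()


# Usage (requires authenticated clients; names below are placeholders):
#   from google.cloud import bigquery, bigtable
#   delete_bigquery_datasets(bigquery.Client(), ["old_dataset_1", "old_dataset_2"])
#   instance = bigtable.Client(admin=True).instance("my-instance")
#   delete_bigtable_tables(instance, ["old_table_1", "old_table_2"])
```

Unlike Datastore's `delete_multi`, these are one API call per dataset/table; for large lists you could submit the same calls from a thread pool, since each delete is independent.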
