
Python BigQuery Client Caching does not work

It seems like caching for the Python BigQuery library does not work. The example below always prints None. How can I fix this issue?

from google.cloud import bigquery

# Client for the default project, using application default credentials
bq_client = bigquery.Client()

query = """
SELECT *
FROM (SELECT 1)
"""

# Explicitly enable the query result cache (it is on by default)
job_config = bigquery.QueryJobConfig()
job_config.use_query_cache = True

results = bq_client.query(query, job_config=job_config)
print(results.cache_hit)

It turns out you have to consume the results object first, for example by calling results.to_dataframe().
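
A minimal sketch of that fix, assuming the same bq_client, query, and job_config as in the question:

results = bq_client.query(query, job_config=job_config)

# Consuming the rows forces the query job to run to completion,
# after which cache_hit is populated
df = results.to_dataframe()

print(results.cache_hit)  # True when a repeated query is served from cache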

Thanks to VictorGGI and William Funks: cache_hit will return "None if job is not yet complete" (cache_hit). You would need to call done() to verify the job has completed, or use any other method that verifies this.
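
A sketch of that approach, assuming the same client and query as above; result() blocks until the job finishes, after which done() returns True and cache_hit is populated:

job = bq_client.query(query, job_config=job_config)

# Wait for the query job to finish before inspecting cache_hit
job.result()

print(job.done())      # True once the job is complete
print(job.cache_hit)   # now a boolean instead of None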
