Profiling a Python function reading documents from CosmosDB seems to take more time than expected
I am profiling a Python function which loads 3 very small documents from CosmosDB (using 'container.read_item') and seems to take more time than (subjectively) expected.
I am using the 'pyinstrument' profiler and the results are:
1.065 get_user_projects  main.py:147
└─ 1.065 get_user_projects  services/project_data_store_service.py:13
   ├─ 0.918 get_project  services/project_data_store_service.py:7
   │  └─ 0.918 wrapper_use_tracer  azure/core/tracing/decorator.py:75
   │     [124 frames hidden]  azure, requests, urllib3, http, socke...
   │        0.662 _SSLSocket.read  <built-in>:0
   │  0.220 _SSLSocket.read  <built-in>:0
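For context, a profile like the one above can be captured with pyinstrument along these lines (a minimal sketch; the call to get_user_projects(user_id) is a stand-in for however the function is actually invoked):

from pyinstrument import Profiler

profiler = Profiler()
profiler.start()
get_user_projects(user_id)  # stand-in for the call being profiled
profiler.stop()

# Print the call tree with per-frame timings, like the output above
print(profiler.output_text(unicode=True, color=False))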
Is my suspicion right, and is there some tracing enabled for CosmosDB which takes a lot of time?
Upon creating my CosmosDB client, I disabled logging and lowered the log level using:
import logging
from azure.cosmos import CosmosClient

client = CosmosClient(endpoint, credential=key, logging_enable=False)
logger = logging.getLogger("azure.core.pipeline.policies.http_logging_policy")
logger.setLevel(logging.WARNING)
Of course, my Azure CosmosDB container setup could be the cause, but I want to rule out tracing issues.
Thank you, Matias Quaranta. Posting your suggestion as an answer to help other community members.
"Normally when doing performance testing we do not look at 1 operation, but rather, what is the P99 or P95 or PXX latency during a period of time, and the reason is we will always have initial latency when connections are established." “通常在进行性能测试时,我们不会看 1 个操作,而是看一段时间内的P99或P95或 PXX 延迟是多少,原因是我们在建立连接时始终会有初始延迟。”
For more information, please refer to this SO thread: Slow performance on Azure DocumentDB
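In the same spirit, creating the CosmosClient once and reusing it across calls keeps connections warm, so only the first read pays the TLS handshake that shows up as _SSLSocket.read in the profile above. A minimal sketch (the database and container names are placeholders, and the partition key is assumed to equal the item id):

from azure.cosmos import CosmosClient

# Module-level client: the SDK pools HTTPS connections, so repeated
# read_item calls reuse them instead of re-handshaking
client = CosmosClient(endpoint, credential=key, logging_enable=False)
container = client.get_database_client("projects-db").get_container_client("projects")

def get_project(project_id: str):
    # Point read by id and partition key (assumed equal here)
    return container.read_item(item=project_id, partition_key=project_id)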