
AWS DAX Performance issues with table scan

Hi, I am working on a project that requires bringing all DynamoDB documents into memory. I am using the boto3 table.scan() method, which takes nearly 33 seconds for all 10k records.

I have configured DAX and am using it for the table scan, which takes nearly 42 seconds for the same 10k records with the same Lambda configuration. I tried multiple times; the results are the same.

I tried the following code:

import time
import amazondax

daxclient = amazondax.AmazonDaxClient.resource(endpoint_url="...")
table = daxclient.Table('table_name')

start_time = time.perf_counter()
scan_args = {}
while True:
    response = table.scan(**scan_args)
    if 'LastEvaluatedKey' not in response:
        break
    scan_args['ExclusiveStartKey'] = response['LastEvaluatedKey']
print(time.perf_counter() - start_time)
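For reference, the pagination loop can be wrapped into a reusable helper that also collects the items. This is a minimal sketch; the `FakeTable` stub below is a hypothetical stand-in for the real boto3/DAX table resource so the loop can run without an AWS connection:

```python
# Hypothetical helper: paginate table.scan() until LastEvaluatedKey
# disappears, accumulating every item in memory.
def scan_all(table, **scan_kwargs):
    items = []
    while True:
        response = table.scan(**scan_kwargs)
        items.extend(response.get('Items', []))
        last_key = response.get('LastEvaluatedKey')
        if last_key is None:
            break
        scan_kwargs['ExclusiveStartKey'] = last_key
    return items

# Stub standing in for a real Table: serves two "pages" of results,
# the first of which carries a LastEvaluatedKey.
class FakeTable:
    def __init__(self):
        self._pages = [
            {'Items': [{'id': 1}, {'id': 2}], 'LastEvaluatedKey': {'id': 2}},
            {'Items': [{'id': 3}]},
        ]

    def scan(self, **kwargs):
        return self._pages.pop(0)

items = scan_all(FakeTable())
print(len(items))  # 3 items gathered across both pages
```

With a real table you would call `scan_all(table)` directly; the kwargs pass through to `table.scan()` unchanged.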

I tried the boto3 get_item() method and that becomes faster: the first call takes about 0.4 seconds, and calls after that take about 0.01 seconds.

I am not sure why it is not working with the table scan method.

Please suggest.

DAX doesn't cache scan results. You therefore shouldn't expect a performance boost and, since you're bouncing through an extra server on the way to the database, can expect a performance penalty.

You must have very large items to see these performance numbers. And are you running scans frequently? You might want to double-check that DynamoDB is the right fit.
