I see regular (every ~7 hours) unexpected waiting times before receiving a response from my Custom Authorizer.
My setup:
The lambda-test function calls the API Gateway endpoints:
response1 = requests.get(api1, auth=AUTH, timeout=4)
response2 = requests.get(api2, auth=AUTH, timeout=4)
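For completeness, here is roughly what the handler looks like (a minimal sketch; the URLs, the AUTH value and the return shape are placeholders, not my exact code):

import requests

API1_URL = "https://<host>/dev/api1"  # placeholder for the real api1 endpoint
API2_URL = "https://<host>/dev/api2"  # placeholder for the real api2 endpoint
AUTH = ("user", "secret")             # placeholder for the real auth object

def lambda_handler(event, context):
    results = {}
    for name, url in (("api1", API1_URL), ("api2", API2_URL)):
        try:
            r = requests.get(url, auth=AUTH, timeout=4)
            results[name] = r.status_code
        except requests.exceptions.Timeout as e:
            # These are the "Read timed out" lines visible in CloudWatch.
            print(e)
            results[name] = "timeout"
    return results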
Every ~7 hours, both requests time out, as shown in the CloudWatch logs:
07:22:11 START RequestId: beabb449-a41d-11e7-8469-93a8731ae2d8 Version: $LATEST
07:22:16 HTTPSConnectionPool(host='<host>', port=443): Read timed out. (read timeout=4)
07:22:20 HTTPSConnectionPool(host='<host>', port=443): Read timed out. (read timeout=4)
07:22:20 END RequestId: beabb449-a41d-11e7-8469-93a8731ae2d8
07:22:20 REPORT RequestId: beabb449-a41d-11e7-8469-93a8731ae2d8 Duration: 8407.03 ms Billed Duration: 8500 ms Memory Size: 128 MB Max Memory Used: 36 MB
The CloudWatch metric for lambda-test duration shows a peak every ~7 h (the peak height changed because I raised the timeout from 2 s to 4 s a few days ago).
Timeline for the requests that happened at 07:22:11:
07:22:11 start lambda-test
07:22:11 try to connect to api1
07:22:12 start authorizer for api1's call
07:22:16 lambda-test: api1 timeout
07:22:16 try to connect to api2
07:22:16 start authorizer for api2's call
07:22:19 start lambda-auth for api1's call
07:22:19 end lambda-auth for api1's call
07:22:19 authorizer successful for api1's call
07:22:19 start lambda-auth for api2's call
07:22:20 end lambda-auth for api2's call
07:22:20 authorizer successful for api2's call
07:22:20 lambda-test: api2 timeout
07:22:20 end lambda-test
If anyone has hints about where this authorizer latency could come from, it would be great!
Thank you for your time.
Here are all the corresponding logs for each part of the system:
lambda-test:
07:22:11 START RequestId: beabb449-a41d-11e7-8469-93a8731ae2d8 Version: $LATEST
07:22:16 HTTPSConnectionPool(host='<host>', port=443): Read timed out. (read timeout=4)
07:22:20 HTTPSConnectionPool(host='<host>', port=443): Read timed out. (read timeout=4)
07:22:20 END RequestId: beabb449-a41d-11e7-8469-93a8731ae2d8
07:22:20 REPORT RequestId: beabb449-a41d-11e7-8469-93a8731ae2d8 Duration: 8407.03 ms Billed Duration: 8500 ms Memory Size: 128 MB Max Memory Used: 36 MB
api gateway for api1:
07:22:12 Starting authorizer: 2szewn for request: bee365d6-a41d-11e7-9709-8d6614596919
07:22:12 Incoming identity: ********************************************************YzNw==
07:22:19 Using valid authorizer policy for principal: ****E_1
07:22:19 Successfully completed authorizer execution
07:22:19 Verifying Usage Plan for request: bee365d6-a41d-11e7-9709-8d6614596919. API Key: API Stage: 41clweydfc/dev
07:22:19 API Key authorized because method 'GET /api1' does not require API Key. Request will not contribute to throttle or quota limits
07:22:19 Usage Plan check succeeded for API Key and API Stage 41clweydfc/dev
07:22:19 Starting execution for request: bee365d6-a41d-11e7-9709-8d6614596919
07:22:19 HTTP Method: GET, Resource Path: /api1
07:22:20 Successfully completed execution
07:22:20 (bee365d6-a41d-11e7-9709-8d6614596919) Method completed with status: 200
api gateway for api2:
07:22:16 Starting authorizer: 2szewn for request: c15724e7-a41d-11e7-811a-6dd1376e9475
07:22:16 Incoming identity: ********************************************************YzNw==
07:22:20 Using valid authorizer policy for principal: ****E_1
07:22:20 Successfully completed authorizer execution
07:22:20 Verifying Usage Plan for request: c15724e7-a41d-11e7-811a-6dd1376e9475. API Key: API Stage: 41clweydfc/dev
07:22:20 API Key authorized because method 'GET /api2' does not require API Key. Request will not contribute to throttle or quota limits
07:22:20 Usage Plan check succeeded for API Key and API Stage 41clweydfc/dev
07:22:20 Starting execution for request: c15724e7-a41d-11e7-811a-6dd1376e9475
07:22:20 HTTP Method: GET, Resource Path: /api2
07:22:20 Successfully completed execution
07:22:20 Method completed with status: 200
lambda-auth for api1's call:
07:22:19 START RequestId: beeadfbb-a41d-11e7-82fd-cf842bd93e85 Version: $LATEST
07:22:19 END RequestId: beeadfbb-a41d-11e7-82fd-cf842bd93e85
07:22:19 REPORT RequestId: beeadfbb-a41d-11e7-82fd-cf842bd93e85 Duration: 195.75 ms Billed Duration: 200 ms Memory Size: 128 MB Max Memory Used: 25 MB
lambda-auth for api2's call:
07:22:19 START RequestId: c15db514-a41d-11e7-88e3-1f6800c6e34e Version: $LATEST
07:22:20 END RequestId: c15db514-a41d-11e7-88e3-1f6800c6e34e
07:22:20 REPORT RequestId: c15db514-a41d-11e7-88e3-1f6800c6e34e Duration: 78.51 ms Billed Duration: 100 ms Memory Size: 128 MB Max Memory Used: 25 MB
It could be due to AWS Lambda tearing down the function's idle VPC resources (ENIs) and the time needed to bring them up again (cf. the Amazon AWS forums).
Increasing the timeouts solved the problem.
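For example, in lambda-test (a sketch; 10 s is only an illustrative value that stays above the ~8 s worst case seen in the logs, and it must remain below the Lambda function's own timeout):

# Same calls as before, only with a larger read timeout so a VPC/ENI
# cold start on the authorizer side does not abort the request.
response1 = requests.get(api1, auth=AUTH, timeout=10)
response2 = requests.get(api2, auth=AUTH, timeout=10)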