
AWS Lambda trigger on S3 bucket to Neptune "Failed to start new load for the source"

I wrote a Lambda function that triggers Python code when a create event happens in S3; the Python script is supposed to read files from S3 and post them to the Neptune server.

When I test it, I get the following error.

{
 "requestId":"xxxxxxxx-1234-5678-9012-xxxxxxxxxxxxx",  
 "code":"ThrottlingException",  
 "detailedMessage":"Failed to start new load for the source s3://my-s3-url/file.ttl. 
    Max concurrent load limit breached. Limit is 1"
}

Code:

import asyncio
import json

import aiohttp

def lambda_handler(event, context):
    file_names = ["a.ttl", "b.ttl", "c.ttl"]
    source_url = "s3://my-s3.aws.com/"
    role = "my-role"
    neptune_url = "https://my-neptune-server.aws.com/loader"
    headers = {"Content-Type": "application/json"}

    for name in file_names:
        file = source_url + name
        data = {"source": file, "iamRoleArn": role, "region": "region-1",
                "failOnError": "FALSE", "format": "turtle"}
        # asyncio.run creates and closes a fresh event loop for each request
        resp = asyncio.run(post_async(neptune_url, json.dumps(data), headers))
        print(resp)

async def post_async(neptune_url, data, headers):
    async with aiohttp.ClientSession() as session:
        async with session.post(neptune_url, data=data, headers=headers) as response:
            return await response.text()

I tried both synchronous and asynchronous approaches. I'm finding only limited documentation on the web. Can someone point me in the right direction?

According to the documentation, the max concurrent load limit is one.

So you probably need to introduce some kind of queue into your upload process. It could be:

  • SQS
  • keep a loading queue of file names in Parameter Store and recursively invoke the Lambda function to load the files one by one
  • or your own idea... :)
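A third option that avoids extra infrastructure is to poll the loader's Get-Status endpoint and only submit the next file once the previous job has left the active states. A minimal sketch of that control flow, assuming the status strings from the Neptune loader Get-Status API; `submit` and `get_status` are hypothetical callables (in practice they would POST to `/loader` and GET `/loader/{loadId}`), injected here so the loop can be tested without a cluster:

```python
import time

def load_sequentially(files, submit, get_status, poll_interval=5):
    """Run Neptune bulk-load jobs one at a time.

    submit(file) -> loadId; get_status(loadId) -> overall status string.
    Blocks until each job leaves the queued/running states before
    submitting the next one, so the concurrent-load limit of 1 is
    never breached.
    """
    results = {}
    for file in files:
        load_id = submit(file)
        status = get_status(load_id)
        while status in ("LOAD_NOT_STARTED", "LOAD_IN_QUEUE", "LOAD_IN_PROGRESS"):
            time.sleep(poll_interval)
            status = get_status(load_id)
        results[file] = status  # e.g. LOAD_COMPLETED or a failure status
    return results
```

Note the Lambda timeout: for large files this polling loop may outlive the function's maximum execution time, which is where the SQS or recursive-invocation options above become more robust.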

According to the documentation https://docs.aws.amazon.com/neptune/latest/userguide/load-api-reference-load.html ,

Max concurrent load limit breached  (HTTP 400)

If a load request is submitted without "queueRequest" : "TRUE", 
and a load job is currently running, the request will fail with this error

you can add the following field to your payload:

data = {
  "source": file, 
  "iamRoleArn": role, 
  "region": "region-1", 
  "failOnError": "FALSE", 
  "format": "turtle", 

  "queueRequest" : "TRUE"
}

