
Error while using confluent-kafka python library with AWS lambda

I am trying to use the confluent-kafka Python library to administer my cluster via a Lambda function, but the function fails with the error:

"Unable to import module 'Test': No module named 'confluent_kafka.cimpl'"

My requirements.txt:

requests
confluent-kafka

To create the zip file, I moved my code to the site-packages location of the virtual env and zipped everything.

Python Code:

import confluent_kafka.admin
import requests
def lambda_handler(event, context):
    print("Hello World")

I am using macOS 10.X. On Linux, I noticed that pip install creates a separate confluent_kafka.libs directory, which does not get created on macOS.
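The confluent_kafka.cimpl module is a compiled C extension (a wrapper around librdkafka), so the shared object pip installs is tagged for the platform it was built on (e.g. cimpl.cpython-38-darwin.so on macOS), and the Amazon Linux importer used by Lambda will not match it. As a minimal local check, assuming confluent-kafka is installed in the current environment:

import confluent_kafka.cimpl

# The compiled extension lives in a platform-tagged shared object; a deployment
# zip built on macOS ships a darwin-tagged binary that the Amazon Linux Lambda
# runtime cannot find, which produces the "No module named" error above.
print(confluent_kafka.cimpl.__file__)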

I created the required layer and can verify that it works.

The technique uses the Docker tool described in a recent AWS blog post.

Thus, for this question, I verified it as follows:

  1. Create an empty folder, e.g. mylayer.

  2. Go to the folder and create a requirements.txt file with the following content:

echo requests > requirements.txt
echo confluent-kafka >> requirements.txt
  3. Run the following docker command:
docker run -v "$PWD":/var/task "lambci/lambda:build-python3.8" /bin/sh -c "pip install -r requirements.txt -t python/lib/python3.8/site-packages/; exit"
  4. Create the layer as a zip:
zip -r mylayer.zip python > /dev/null
  5. Create a Lambda layer based on mylayer.zip in the AWS Console (a boto3 alternative is sketched after the test output below). Don't forget to specify Compatible runtimes as python3.8.

  6. Test the layer in Lambda using the following Lambda function:

import confluent_kafka.admin
import requests

def lambda_handler(event, context):
    print(dir(confluent_kafka.admin))
    print(dir(requests))
    print("Hello World")

The function executes correctly:

['AdminClient', 'BrokerMetadata', 'CONFIG_SOURCE_DEFAULT_CONFIG', 'CONFIG_SOURCE_DYNAMIC_BROKER_CONFIG', 'CONFIG_SOURCE_DYNAMIC_DEFAULT_BROKER_CONFIG', 'CONFIG_SOURCE_DYNAMIC_TOPIC_CONFIG', 'CONFIG_SOURCE_STATIC_BROKER_CONFIG', 'CONFIG_SOURCE_UNKNOWN_CONFIG', 'ClusterMetadata', 'ConfigEntry', 'ConfigResource', 'ConfigSource', 'Enum', 'KafkaException', 'NewPartitions', 'NewTopic', 'PartitionMetadata', 'RESOURCE_ANY', 'RESOURCE_BROKER', 'RESOURCE_GROUP', 'RESOURCE_TOPIC', 'RESOURCE_UNKNOWN', 'TopicMetadata', '_AdminClientImpl', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', 'concurrent', 'functools']
['ConnectTimeout', 'ConnectionError', 'DependencyWarning', 'FileModeWarning', 'HTTPError', 'NullHandler', 'PreparedRequest', 'ReadTimeout', 'Request', 'RequestException', 'RequestsDependencyWarning', 'Response', 'Session', 'Timeout', 'TooManyRedirects', 'URLRequired', '__author__', '__author_email__', '__build__', '__builtins__', '__cached__', '__cake__', '__copyright__', '__description__', '__doc__', '__file__', '__license__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '__title__', '__url__', '__version__', '_check_cryptography', '_internal_utils', 'adapters', 'api', 'auth', 'certs', 'chardet', 'check_compatibility', 'codes', 'compat', 'cookies', 'delete', 'exceptions', 'get', 'head', 'hooks', 'logging', 'models', 'options', 'packages', 'patch', 'post', 'put', 'request', 'session', 'sessions', 'ssl', 'status_codes', 'structures', 'urllib3', 'utils', 'warnings']
Hello World
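As an alternative to the console step above, the layer can also be published and attached with boto3. This is only a minimal sketch under assumed names (mylayer.zip from the steps above, a layer called confluent-kafka-layer, and a function called Test); adjust them to your setup:

import boto3

lambda_client = boto3.client("lambda")

# Publish mylayer.zip (built in the steps above) as a new layer version.
with open("mylayer.zip", "rb") as f:
    layer = lambda_client.publish_layer_version(
        LayerName="confluent-kafka-layer",  # assumed layer name
        Description="requests + confluent-kafka built on Amazon Linux",
        Content={"ZipFile": f.read()},
        CompatibleRuntimes=["python3.8"],
    )

# Attach the new layer version to the function (assumed to be named "Test").
lambda_client.update_function_configuration(
    FunctionName="Test",
    Layers=[layer["LayerVersionArn"]],
)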
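With the layer working, the handler can use the AdminClient for the cluster administration mentioned in the question. A minimal sketch, assuming the broker address is passed through an environment variable named BOOTSTRAP_SERVERS and that the Lambda function has network access to the cluster:

import os

from confluent_kafka.admin import AdminClient, NewTopic

def lambda_handler(event, context):
    # BOOTSTRAP_SERVERS is an assumed environment variable, e.g. "broker1:9092".
    admin = AdminClient({"bootstrap.servers": os.environ["BOOTSTRAP_SERVERS"]})

    # List existing topics.
    metadata = admin.list_topics(timeout=10)
    print("Topics:", list(metadata.topics))

    # Create a topic; create_topics() returns a dict of topic -> future.
    futures = admin.create_topics(
        [NewTopic("example-topic", num_partitions=1, replication_factor=1)]
    )
    for topic, future in futures.items():
        try:
            future.result()  # raises on failure (e.g. topic already exists)
            print(f"Created topic {topic}")
        except Exception as exc:
            print(f"Failed to create topic {topic}: {exc}")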
