
Specifying MSK credentials in an AWS CDK stack

I have code that seems to "almost" deploy. It will fail with the following error:

10:55:25 AM | CREATE_FAILED        | AWS::Lambda::EventSourceMapping | QFDSKafkaEventSour...iltynotifyEFE73996
Resource handler returned message: "Invalid request provided: The secret provided in 'sourceAccessConfigurations' is not associated with cluster some-valid-an. Please provide a secret associated with the cluster. (Service: Lambda, Status Code: 400, Request ID: some-uuid )" (RequestToken: some-uuid, HandlerErrorCode: InvalidRequest)

I've cobbled together the CDK stack from multiple tutorials while trying to learn CDK. I've gotten it to the point where I can deploy a Lambda, specify one (or more) layers for it, and even specify any of several different trigger sources. But our production Kafka requires credentials... and I can't for the life of me figure out how to supply those so that this will deploy correctly.

Obviously, those credentials shouldn't be included in the git repo of my codebase. I assume I will have to set up a Secrets Manager secret with part or all of the values. We're using SCRAM-SHA-512, and it involves a user/pass pair. The 'secret_name' value to Secret() is probably the name/path of the Secrets Manager secret. I have no idea what the second, unnamed param is for, and I'm having trouble figuring it out. Can anyone point me in the right direction?

Stack code follows:

#!/usr/bin/env python3
from aws_cdk import (
    aws_lambda as lambda_,
    App, Duration, Stack
)
from aws_cdk.aws_lambda_event_sources import ManagedKafkaEventSource
from aws_cdk.aws_secretsmanager import Secret

class ExternalRestEndpoint(Stack):

    def __init__(self, app: App, id: str) -> None:
        super().__init__(app, id)

        secret = Secret(self, "Secret", secret_name="integrations/msk/creds")

        msk_arn = "some valid and confirmed arn"

        # Lambda layer.
        lambdaLayer = lambda_.LayerVersion(self, 'lambda-layer',
            code = lambda_.AssetCode('utils/lambda-deployment-packages/lambda-layer.zip'),
            compatible_runtimes = [lambda_.Runtime.PYTHON_3_7],
        )  

        # Source for the lambda.
        with open("src/path/to/sourcefile.py", encoding="utf8") as fp:
            mysource_code = fp.read()

        # Config for it.
        lambdaFn = lambda_.Function(
            self, "QFDS",
            code=lambda_.InlineCode(mysource_code),
            handler="lambda_handler",
            timeout=Duration.seconds(300),
            runtime=lambda_.Runtime.PYTHON_3_7,
            layers=[lambdaLayer],
        )

        # Set up the event (managed Kafka). 
        lambdaFn.add_event_source(ManagedKafkaEventSource(
            cluster_arn=msk_arn,
            topic="foreign.endpoint.availabilty.notify", 
            secret=secret,
            batch_size=100,                # default
            starting_position=lambda_.StartingPosition.TRIM_HORIZON
        ))

Looking at the code sample, I understand that you are working with Amazon MSK as an event source, not self-managed (cross-account) Kafka.

I assume I will have to set up a Secrets Manager secret with part or all of the values

You don't need to set up new credentials. If you use MSK with SASL/SCRAM, you already have credentials, and they must be associated with the MSK cluster.
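If the credentials already live in Secrets Manager but the secret has not yet been attached to the cluster, the association is a one-off call outside the CDK stack. A minimal boto3 sketch (both ARNs are placeholders for your own values):

import boto3

# Attach an existing SASL/SCRAM secret to the MSK cluster (placeholder ARNs).
kafka = boto3.client("kafka")
kafka.batch_associate_scram_secret(
    ClusterArn="arn:aws:kafka:<region>:<account>:cluster/<name>/<uuid>",
    SecretArnList=[
        "arn:aws:secretsmanager:<region>:<account>:secret:AmazonMSK_LambdaSecret-<suffix>",
    ],
)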

As you can see from the docs, your secret name should start with AmazonMSK_, for example AmazonMSK_LambdaSecret.
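For reference, here is a minimal sketch of creating such a secret in the same stack (assuming CDK v2; the construct ids, key, and username are placeholders). MSK requires SASL/SCRAM secrets to be encrypted with a customer-managed KMS key rather than the default aws/secretsmanager key, and the secret still has to be associated with the cluster as shown above:

import json
from aws_cdk import aws_kms as kms, aws_secretsmanager as sm

# Inside the stack's __init__: an AmazonMSK_-prefixed secret, encrypted with a
# customer-managed KMS key, holding a {"username": ..., "password": ...} pair.
key = kms.Key(self, "MskSecretKey")
scram_secret = sm.Secret(
    self, "ScramSecret",
    secret_name="AmazonMSK_LambdaSecret",
    encryption_key=key,
    generate_secret_string=sm.SecretStringGenerator(
        secret_string_template=json.dumps({"username": "my-msk-user"}),  # placeholder user
        generate_string_key="password",
    ),
)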

So, in the code above, you will need to fix this line:

secret = Secret(self, "Secret", secret_name="AmazonMSK_LambdaSecret")
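Alternatively, if an AmazonMSK_-prefixed secret already exists and is already associated with the cluster, you could import it instead of creating a new (and therefore unassociated) Secret resource; a sketch:

from aws_cdk.aws_secretsmanager import Secret

# Reference the existing, cluster-associated secret by name.
secret = Secret.from_secret_name_v2(self, "Secret", "AmazonMSK_LambdaSecret")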

I assume you are already aware of the CDK Python docs, but I will add them here for reference.
