
Terraform handle multiple lambda functions

I have a requirement to create AWS Lambda functions dynamically based on input parameters such as name, Docker image, etc.

I have been able to build this using terraform (triggered using gitlab pipelines).

Now the problem is that for every unique name I want a new Lambda function to be created or updated; i.e., if I trigger the pipeline 5 times with 5 different names, there should be 5 Lambda functions. Instead, the older function is destroyed and a new one is created.

How do I achieve this?

I am using the resource aws_lambda_function.

Terraform code

resource "aws_lambda_function" "executable" {
  function_name = var.RUNNER_NAME
  image_uri     = var.DOCKER_PATH
  package_type  = "Image"
  # assumes an aws_iam_role resource named "role" is defined elsewhere
  role          = aws_iam_role.role.arn
  architectures = ["x86_64"]
}

I think there is a misunderstanding of how Terraform works.

Terraform maps one resource block to one item in state, and the state file is used to manage all created resources.

The reason your function keeps getting destroyed and recreated with the new values is that you have only one resource in your Terraform configuration. This is the correct and expected behavior for Terraform.

Now, as mentioned by others above, you could use count or for_each to add new Lambda functions without destroying the previous ones, as long as you keep track of all previously passed values (always appending the new values to the list).
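A minimal sketch of the for_each approach, assuming you maintain a map of function names to image URIs (the variable name and the IAM role reference here are illustrative, not from the original code):

```hcl
variable "runners" {
  description = "Map of function name => Docker image URI; append a new entry on each pipeline run"
  type        = map(string)
}

resource "aws_lambda_function" "executable" {
  for_each      = var.runners
  function_name = each.key
  image_uri     = each.value
  package_type  = "Image"
  role          = aws_iam_role.role.arn # assumes this IAM role is defined elsewhere
  architectures = ["x86_64"]
}
```

Each map entry gets its own address in state (e.g. aws_lambda_function.executable["runner-1"]), so adding a new key creates a new function without touching the existing ones.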

Or, if there is no need to keep track of the state of the Lambda functions you have created, Terraform may not be the best solution for your needs. The result you are looking for can easily be implemented in Python or even in shell with AWS CLI commands.
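For example, a create-or-update step with the AWS CLI might look like the sketch below (the function name, image URI, and role ARN are placeholder arguments, and the script assumes AWS credentials are configured):

```shell
#!/bin/sh
# Create the Lambda function if it does not exist, otherwise update its image.
FUNCTION_NAME="$1"
IMAGE_URI="$2"
ROLE_ARN="$3" # execution role the function assumes; must already exist

if aws lambda get-function --function-name "$FUNCTION_NAME" >/dev/null 2>&1; then
  # Function exists: point it at the new container image.
  aws lambda update-function-code \
    --function-name "$FUNCTION_NAME" \
    --image-uri "$IMAGE_URI"
else
  # Function does not exist yet: create it from the container image.
  aws lambda create-function \
    --function-name "$FUNCTION_NAME" \
    --package-type Image \
    --code ImageUri="$IMAGE_URI" \
    --role "$ROLE_ARN"
fi
```

Because each pipeline run passes a different name, every run either creates a new function or updates the matching one, with no shared state file to manage.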
