Terraform. Data, archive_file error or bug?

I have a problem with the archive_file data source. I have a condition which creates a lambda layer:

resource "aws_lambda_layer_version" "layers" {
   for_each = { for layers, name in local.lambdas.options.layers : layers => name if local.lambdas.init.self && local.lambdas.extras.layers }
   filename = "not_working" #data.archive_file.layers[each.key].output_path
   source_code_hash = "not_working" #data.archive_file.layers[each.key].output_base64sha256
   
   ...
}

Resource "aws_lambda_layer_version" "layers" working as expected, and creates a lambda layer, if filename and source_code_hash are defined as shown above (Without using data source).资源"aws_lambda_layer_version" "layers"按预期工作,并创建一个 lambda 层,如果filenamesource_code_hash如上所示定义(不使用数据源)。

And I have a data source:

data "archive_file"  "layers"  {
   for_each = { for layers, name in local.lambdas.options.layers : layers => name if local.lambdas.init.self && local.lambdas.extras.layers }
   type = lookup(each.value.archive, "type", null)
   
   ...
}

As you can see, the for_each block is the same as in the resource block, but I get an error:

each.value is object with 2 attributes

Finally, here is my module block where I call these resources:

layers  = [
   {
      name        = "test-layer"
      runtimes    = [ "nodejs12.x" ]
      archive     = {
         type        = "zip"
      
         ...
      }
   }
]
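
For reference, assuming local.lambdas.options.layers is the layers list above and both guard conditions (local.lambdas.init.self and local.lambdas.extras.layers) are true, I would expect the for_each expression to evaluate to roughly this map, with the list indices as keys, so each.value.archive should be available in the data source:

{
   "0" = {
      name     = "test-layer"
      runtimes = [ "nodejs12.x" ]
      archive  = {
         type = "zip"

         ...
      }
   }
}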

So my question is: why does the same for_each block behave differently?

I've reproduced your code, and for me it is working:

resource "null_resource" "cmd" {
   for_each        = { for lambda, name in local.lambdas.init.mods : lambda => name if local.lambdas.init.self && local.lambdas.trigger.laps }
   provisioner "local-exec" {
       command     = lookup(each.value.archive, "command", null)
       working_dir = lookup(each.value.archive, "modules", null)
   }
}
data "archive_file"  "layers"  {
   for_each        = { for lambda, name in local.lambdas.init.mods : lambda => name if local.lambdas.init.self && local.lambdas.trigger.laps }
   depends_on      = [ null_resource.cmd ]
   type            = lookup(each.value.archive,    "type",    null)
   source_dir      = lookup(each.value.archive,    "modules", null)
   output_path     = "${lookup(each.value.archive, "outputs", null)}/${lookup(each.value, "function", null)}.${lookup(each.value.archive, "type", null)}"
   output_file_mode  = lookup(each.value.archive,  "mode",    null)
}
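
For completeness, the archive outputs can then be wired back into the layer resource along the lines of the commented-out references in your question. This is only a sketch: layer_name is my assumption, reusing the "function" attribute from output_path above.

resource "aws_lambda_layer_version" "layers" {
   for_each         = { for lambda, name in local.lambdas.init.mods : lambda => name if local.lambdas.init.self && local.lambdas.trigger.laps }
   layer_name       = lookup(each.value, "function", null)  # assumed; mirrors the "function" attribute used in output_path
   filename         = data.archive_file.layers[each.key].output_path
   source_code_hash = data.archive_file.layers[each.key].output_base64sha256
}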
