
How to create multiple glue jobs as one glue job in terraform

I am new to Terraform scripting and I want to create multiple Glue jobs, each with its own name and its own script. Is it possible to create all of these jobs from a single job resource with the help of variables?

For example, variables.tf:

variable "db2jobnames" {
  description = "db2 glue job names"
  type        = list
  default     = ["sql_db_job", "sql_db_job2"]
}

variable "script_location" {
  description = "db2 glue job scripts"
  type        = list
  default     = ["s3://s3_buget/sql_db_job.py", "s3://s3_buget/sql_db_job.py"]
}

glue_connection.tf:

resource "aws_glue_connection" "conn_db2" {
  count           = var.created_CR ? 1 : 0
  connection_type = "JDBC"
  connection_properties = {
    JDBC_CONNECTION_URL = "jdbc:db2://lkidjhyft:50000/ZXHAG006G"
    PASSWORD            = "acfg3"
    USERNAME            = "ndhygsf"
  }

  name = "${var.department}-${var.application}-connection"

  physical_connection_requirements {
    availability_zone      = var.connection_availability_zone
    security_group_id_list = data.aws_security_groups.AWS_Public_Services.ids
    subnet_id              = data.aws_subnet.selected.id
  }
}

And my Glue job, main.tf:

resource "aws_glue_job" "etl_jobs" {
  count    = var.created_GL ? 1 : 0
  count    = "${length(var.db2jobnames)}"
  count    = "${length(var.script_location)}"
  name     = "${var.db2jobnames[count.index]}_db2etljobs"
  role_arn = aws_iam_role.glue_role.arn

  command {
    python_version  = var.python_version
    script_location = "${var.script_location[count.index]}"
  }
  default_arguments = {
    "--extra-jars"                       = "${var.JarDir}"
    "--TempDir"                          = "${var.TempDir}"
    "--class"                            = "GlueApp"
    "--enable-continuous-cloudwatch-log" = "${var.enable-continuous-cloudwatch_log}"
    "--enable-glue-datacatalog"          = "${var.enable-glue-datacatalog}"
    "--enable-metrics"                   = "${var.enable-metrics}"
    "--enable-spark-ui"                  = "${var.enable-spark-ui}"
    "--job-bookmark-option"              = "${var.job-bookmark-option}"
    "--job-language"                     = "python"
    "--env"                              = "${var.paramregion}"
    "--spark-event-logs-path"            = "${var.sparkeventlogpath}"
  }
  execution_property {
    max_concurrent_runs = var.max_concurrent_runs
  }
  connections = [
    "${aws_glue_connection.conn_db2[count.index].name}"
  ]
  glue_version      = var.glue_version
  max_retries       = 0
  worker_type       = var.worker_type
  number_of_workers = 20
  timeout           = 2880
  tags              = local.common_tags
}

I tried adding two count arguments, but I got an error. How can we create two jobs from one job resource, so that the first job picks up the first job name and the first script location, and so on, like this:

job1--> sql_db_job - s3://s3_buget/sql_db_job.py
job2--> sql_db_job2 - s3://s3_buget/sql_db_job2.py

Any reply would be greatly appreciated. Thank you.

Based on the variables and code you provided, you have to change count so that it uses the length of one of the lists. For example:

resource "aws_glue_job" "etl_jobs" {
  count    = var.created_GL ? length(var.db2jobnames) : 0
  name     = "${var.db2jobnames[count.index]}_db2etljobs"
  role_arn = aws_iam_role.glue_role.arn

  command {
    python_version  = var.python_version
    script_location = var.script_location[count.index]
  }
  default_arguments = {
    "--extra-jars"                       = var.JarDir
    "--TempDir"                          = var.TempDir
    "--class"                            = "GlueApp"
    "--enable-continuous-cloudwatch-log" = var.enable-continuous-cloudwatch_log
    "--enable-glue-datacatalog"          = var.enable-glue-datacatalog
    "--enable-metrics"                   = var.enable-metrics
    "--enable-spark-ui"                  = var.enable-spark-ui
    "--job-bookmark-option"              = var.job-bookmark-option
    "--job-language"                     = "python"
    "--env"                              = var.paramregion
    "--spark-event-logs-path"            = var.sparkeventlogpath
  }
  execution_property {
    max_concurrent_runs = var.max_concurrent_runs
  }
  connections = [
    aws_glue_connection.conn_db2[0].name
  ]
  glue_version      = var.glue_version
  max_retries       = 0
  worker_type       = var.worker_type
  number_of_workers = 20
  timeout           = 2880
  tags              = local.common_tags
}
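As a side note, if you are on Terraform 0.12.6 or later, the same idea can also be expressed with for_each over a map built with zipmap. The sketch below is only an illustration, assuming both lists always have the same length and that the surrounding variables and resources are the ones from the question; default_arguments and execution_property are left out because they would be identical to the count-based version above:

resource "aws_glue_job" "etl_jobs" {
  # Build a map of job name => script location when the jobs are enabled,
  # or an empty map otherwise. Each instance is then keyed by the job name
  # instead of a list index.
  for_each = var.created_GL ? zipmap(var.db2jobnames, var.script_location) : {}

  name     = "${each.key}_db2etljobs"
  role_arn = aws_iam_role.glue_role.arn

  command {
    python_version  = var.python_version
    script_location = each.value
  }

  # default_arguments and execution_property omitted; they are the same as
  # in the count-based example above.

  connections = [
    aws_glue_connection.conn_db2[0].name
  ]

  glue_version      = var.glue_version
  max_retries       = 0
  worker_type       = var.worker_type
  number_of_workers = 20
  timeout           = 2880
  tags              = local.common_tags
}

With for_each, the instances are addressed by job name (for example aws_glue_job.etl_jobs["sql_db_job"]) rather than by position, so reordering the lists does not force Terraform to destroy and recreate the jobs.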
