
How to run aws bash commands consecutively?

How can I execute the following bash commands consecutively?

aws logs create-export-task --task-name "cloudwatch-log-group-export1" \
  --log-group-name "/my/log/group1" \
  --from 1488708419000 --to 1614938819000 \
  --destination "my-s3-bucket" \
  --destination-prefix "my-log-group1"

aws logs create-export-task --task-name "cloudwatch-log-group-export" \
  --log-group-name "/my/log/group2" \
  --from 1488708419000 --to 1614938819000 \
  --destination "my-s3-bucket" \
  --destination-prefix "my-log-group2"

The problem I have with the above commands is that after the first command completes, the script gets stuck at the following output, so the second command is never reached:

{
    "taskId": "0e3cdd4e-1e95-4b98-bd8b-3291ee69f9ae"
}

It seems that I should find a way to wait for the cloudwatch-log-group-export1 task to complete.

By default, the AWS CLI sends its output to a pager (less, which has vi-style navigation), so the script sits waiting for you to quit the pager after each command.

You can avoid this by setting the AWS_PAGER environment variable to an empty string before executing the aws command (on AWS CLI v2, passing --no-cli-pager to the command also works):

export AWS_PAGER=""
aws logs create-export-task...

Or, you can disable it permanently in the AWS config file ( ~/.aws/config ):

[default]
cli_pager=

You could create a waiter function that uses describe-export-tasks to poll the current status of an export task.

An example of such a function:

wait_for_export() {

  local task_id=$1
  local sleep_time=${2:-10}
  local job_status

  while true; do

    job_status=$(aws logs describe-export-tasks \
                    --task-id "$task_id" \
                    --query "exportTasks[0].status.code" \
                    --output text)

    echo "$job_status"

    # Stop on any terminal state, not just COMPLETED,
    # so a FAILED or CANCELLED task does not loop forever.
    case $job_status in
      COMPLETED|FAILED|CANCELLED) break ;;
    esac

    sleep "$sleep_time"

  done
}

Then you use it:

task_id1=$(aws logs create-export-task \
  --task-name "cloudwatch-log-group-export1" \
  --log-group-name "/my/log/group1" \
  --from 1488708419000 --to 1614938819000 \
  --destination "my-s3-bucket" \
  --destination-prefix "my-log-group1" \
  --query 'taskId' --output text)


wait_for_export ${task_id1}

# second export 
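Putting the two answers together, the whole job can be wrapped in one script. This is a sketch only: the bucket, log-group names, and timestamps are the ones from the question, and export_group is a made-up helper name, not an AWS command.

```shell
#!/usr/bin/env bash
# Sketch: disable the pager, then run each export and wait for it
# to finish before starting the next one.
export AWS_PAGER=""

# Poll describe-export-tasks until the task reaches a terminal state.
wait_for_export() {
  local task_id=$1
  local sleep_time=${2:-10}
  local status
  while true; do
    status=$(aws logs describe-export-tasks \
               --task-id "$task_id" \
               --query "exportTasks[0].status.code" \
               --output text)
    echo "$task_id: $status"
    case $status in
      COMPLETED)        return 0 ;;
      FAILED|CANCELLED) echo "export ended as $status" >&2; return 1 ;;
    esac
    sleep "$sleep_time"
  done
}

# export_group <log-group-name> <s3-prefix>: start one export task
# and block until it completes (hypothetical helper, names assumed).
export_group() {
  local task_id
  task_id=$(aws logs create-export-task \
    --task-name "cloudwatch-log-group-export-$2" \
    --log-group-name "$1" \
    --from 1488708419000 --to 1614938819000 \
    --destination "my-s3-bucket" \
    --destination-prefix "$2" \
    --query 'taskId' --output text)
  wait_for_export "$task_id"
}
```

Calling export_group "/my/log/group1" "my-log-group1" and then export_group "/my/log/group2" "my-log-group2" runs the exports strictly one after the other, since each call only returns once its task is in a terminal state.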
