My .ebextensions/00.commands.config looks like:
container_commands:
  00_download_models:
    command: "./download.py"
My download.py has:
#!/usr/bin/env python3
print('now')
But in /var/log/cfn-init.log, I have:
2020-06-25 17:19:34,933 [ERROR] -----------------------BUILD FAILED!------------------------
2020-06-25 17:19:34,933 [ERROR] Unhandled exception during build: Command 00_download_models failed
Traceback (most recent call last):
  File "/opt/aws/bin/cfn-init", line 171, in <module>
    worklog.build(metadata, configSets)
  File "/usr/lib/python2.7/site-packages/cfnbootstrap/construction.py", line 129, in build
    Contractor(metadata).build(configSets, self)
  File "/usr/lib/python2.7/site-packages/cfnbootstrap/construction.py", line 530, in build
    self.run_config(config, worklog)
  File "/usr/lib/python2.7/site-packages/cfnbootstrap/construction.py", line 542, in run_config
    CloudFormationCarpenter(config, self._auth_config).build(worklog)
  File "/usr/lib/python2.7/site-packages/cfnbootstrap/construction.py", line 260, in build
    changes['commands'] = CommandTool().apply(self._config.commands)
  File "/usr/lib/python2.7/site-packages/cfnbootstrap/command_tool.py", line 117, in apply
    raise ToolError(u"Command %s failed" % name)
ToolError: Command 00_download_models failed
This seems pretty straightforward, so I don't know what I'm doing wrong.
Based on what you described, there is nothing wrong with your script or your technique.
I verified it using my own EB Environment (64bit Amazon Linux 2 v3.0.3 running Python 3.7; single instance).
To confirm, I used the following:
.ebextensions/50_commands.config:

container_commands:
  00_download_models:
    command: "./download.py"
./download.py was located in the root of my zip package (not in the .ebextensions folder):
#!/usr/bin/env python3
# easier to find the /tmp/test.txt file than to search through the logs for "now"
with open('/tmp/test.txt', 'w') as f:
    f.write('from python script')
I also made sure that ./download.py had execute permission before creating the EB environment, by running the following on my local workstation:

chmod +x ./download.py
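One thing worth verifying is that the zip bundle you upload actually preserves that execute bit, since some archiving tools drop Unix permissions. As a quick local check, here is a stdlib-only sketch (the helper name exec_bit_preserved is my own, not part of any EB tooling) that reads the Unix mode stored for a member of the archive:

```python
import zipfile


def exec_bit_preserved(zip_path: str, member: str) -> bool:
    """Return True if `member` inside the zip keeps any execute bit."""
    with zipfile.ZipFile(zip_path) as zf:
        info = zf.getinfo(member)
    # The upper 16 bits of external_attr hold the Unix file mode;
    # 0o111 masks the owner/group/other execute bits.
    return bool((info.external_attr >> 16) & 0o111)


if __name__ == "__main__":
    print(exec_bit_preserved("bundle.zip", "download.py"))
```

If this prints False for your bundle, the chmod on your workstation never made it into the package, and the instance cannot run ./download.py directly.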
My download.py is:
#!/usr/bin/env python3
import datetime

import torch
import torch.nn.functional as F
from transformers import (
    CTRLLMHeadModel,
    CTRLTokenizer,
    GPT2LMHeadModel,
    GPT2Tokenizer,
    TransfoXLLMHeadModel,
    TransfoXLTokenizer,
    XLMTokenizer,
    XLMWithLMHeadModel,
    XLNetLMHeadModel,
    XLNetTokenizer,
)

if __name__ == "__main__":
    with open('/tmp/test.txt', 'w') as f:
        # f.write() takes a single string, so format the timestamp in
        f.write(f"Starting download {datetime.datetime.now().time()}\n")
        GPT2LMHeadModel.from_pretrained('distilgpt2')
        f.write(f"DistilModel {datetime.datetime.now().time()}\n")
        GPT2LMHeadModel.from_pretrained('gpt2-xl')
        f.write(f"GPT2-XL {datetime.datetime.now().time()}\n")
        GPT2LMHeadModel.from_pretrained('gpt2-medium')
        f.write(f"GPT2-medium {datetime.datetime.now().time()}\n")
        CTRLLMHeadModel.from_pretrained('ctrl')
        f.write(f"CTRL {datetime.datetime.now().time()}\n")
        GPT2Tokenizer.from_pretrained('distilgpt2')
        GPT2Tokenizer.from_pretrained('gpt2-xl')
        GPT2Tokenizer.from_pretrained('gpt2-medium')
        CTRLTokenizer.from_pretrained('ctrl')
        f.write("Finished download\n")
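Note that f.write() accepts only a single string argument, so multi-argument calls like f.write('CTRL', ..., '\n') raise a TypeError. A small helper keeps the timestamped progress lines correct, and flushing after each write means partial progress survives if a later download crashes (a stdlib-only sketch; log_step is a hypothetical name):

```python
import datetime


def log_step(f, message: str) -> None:
    """Write one timestamped progress line to an open text file."""
    # Format the timestamp into a single string before writing.
    f.write(f"{message} {datetime.datetime.now().time()}\n")
    # Flush so the line is on disk even if the next step fails.
    f.flush()
```

With this, each step becomes e.g. log_step(f, "GPT2-XL") right after the corresponding from_pretrained call.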