
Check if a Python script is running on an AWS instance

I'm trying to set up a Python logger that sends error emails when it logs an error, but only if the instance has a particular tag set. I quickly ran into the problem of local dev machines that aren't on AWS. Is there an easy and fast way to check whether the script is running on AWS?

I was loading the instance data with:

import boto.utils
from boto.ec2.connection import EC2Connection

# Fetch this instance's metadata, then look the instance up to read its tags
metadata = boto.utils.get_instance_metadata()
conn = EC2Connection()
# get_only_instances expects a list of instance IDs
instance = conn.get_only_instances(instance_ids=[metadata['instance-id']])[0]

I can of course use a timeout on get_instance_metadata, but there is then a tension between how long to make developers wait versus the possibility of not sending an error email in production.
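One way to bound that wait (a sketch, not part of the original question): probe the metadata endpoint at the TCP level with a sub-second timeout before making any boto calls. On an EC2 instance the service answers in milliseconds, while off-EC2 the link-local address fails fast, so developers never wait longer than the timeout.

```python
import socket

def metadata_service_reachable(timeout=0.25):
    """Probe the EC2 metadata service (169.254.169.254:80) with a short timeout.

    On an EC2 instance this connects almost instantly; on a dev machine the
    link-local address is unreachable, so the wait is bounded by `timeout`.
    """
    try:
        with socket.create_connection(("169.254.169.254", 80), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns True, the full `get_instance_metadata()` call can then proceed with a generous timeout, since you already know the service is there.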

Can anyone think of a nice solution?

Alternatively, you can check whether your environment contains any AWS environment variables:

import os

is_aws = bool(os.environ.get("AWS_DEFAULT_REGION"))

I chose to check whether my user name looks like an EC2 default user (not a great solution, but it works for me):

import os

my_user = os.environ.get("USER", "")  # default to "" so the check never crashes
is_aws = "ec2" in my_user

AWS instances have a metadata service, so you can make a call to it and check the response: if the response is valid, you are on AWS; otherwise you are not.

for example:

# Python 2
import urllib2

meta = 'http://169.254.169.254/latest/meta-data/ami-id'
req = urllib2.Request(meta)
try:
    # A short timeout keeps the check fast on machines outside AWS
    response = urllib2.urlopen(req, timeout=1).read()
    if 'ami' in response:
        _msg = 'I am in AWS running on {}'.format(response)
    else:
        _msg = 'I am in dev - no AWS AMI'
except Exception:
    _msg = 'no metadata, not in AWS'

print _msg

This is just a stab - there are likely better checks, but this one will give you a feel for it, and you can improve upon it as you see fit. If you are using OpenStack or another cloud service locally, you will of course also get a metadata response, so you will have to adjust your check accordingly...

(You could also do this with cloud-init if you are using some kind of launch tool or manager like Chef, Puppet, or something homegrown: drop an /ec2 marker file on the file system if the box is in AWS, or better yet, drop a /DEV marker if it's local.)

Similar to @cgseller's answer, but assuming Python 3, one could do something like:

from urllib.request import urlopen
from urllib.error import URLError

def is_ec2_instance():
    """Check if this host is running on AWS EC2."""
    meta = 'http://169.254.169.254/latest/meta-data/public-ipv4'
    try:
        # A short timeout keeps the check fast outside AWS;
        # urlopen raises URLError (not ConnectionError) on failure
        return urlopen(meta, timeout=1).status == 200
    except URLError:
        return False

I don't think this is really a boto issue. EC2 (and boto in turn) don't care or know anything about what scripts you're running on your servers.

If your script has a particular signature -- e.g. listening on a port -- that would be the best way to check, but short of that, you can just look for its signature in the OS -- its process.

Use the subprocess module and your preferred shell magic to check whether it's running:

import subprocess

# Substitute your own user, server, and scriptname for the placeholders;
# `log` is your application's logger
command = ["ssh", "{user}@{server}", "pgrep", "-fl", "{scriptname}"]
try:
    is_running = bool(subprocess.check_output(command))
except subprocess.CalledProcessError:
    log.exception("Checking for script failed")
    is_running = False
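The "listening on a port" signature mentioned above can also be checked without ssh at all. A minimal sketch, where the host and port defaults are placeholders you would fill in for your own service:

```python
import socket

def script_listening(host="localhost", port=8080, timeout=1.0):
    """Return True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

This tells you the port is open, not that it is your script holding it, so it is only as good as the uniqueness of the port you choose.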

A simple solution would be to check the contents of the file /var/lib/cloud/instance/datasource, which on an EC2 instance contains DataSourceEc2Local:

$ sudo cat /var/lib/cloud/instance/datasource
DataSourceEc2Local: DataSourceEc2Local

so in Python:

datasource_file = "/var/lib/cloud/instance/datasource"
try:
    with open(datasource_file) as f:
        first_line = f.readline()
        if "DataSourceEc2Local" in first_line:
            print("I'm running on EC2!")
except FileNotFoundError:
    print(f"{datasource_file} not found")
