
How do I abort a set of fabric tasks running in parallel on multiple hosts, if one of those tasks fails?

Currently, I can deploy to multiple hosts quickly using a top-level task decorated with @runs_once, which executes a task decorated with @parallel that picks up all the hosts defined in the role.

I can run a deploy task:

fab -R production deploy:myApp.tgz

Currently, this is what happens if one of the tasks fails:

[host2] Executing task 'deploy'
[host1] Executing task '_deploy'
[host2] deploy failed on host 2
[host3] Executing task '_deploy'
[host1] deployed b69488da3e on host 1 
[host3] deployed b69488da3e on host 3 
Fatal error: deploy failed on host 2

Aborting.

As you can see, the fabric session does not abort until all the parallel tasks have completed. If I ran this in production, it could easily take down all hosts even after the first one has failed.

Ideally, I would like to abort my session as soon as a task fails, but this does not seem possible. I have tried the fabric functions abort('failed task') and local("exit 1"), as well as the more traditional sys.exit(1) and os._exit(1). The only way I have been able to stop a running fabric session of parallel tasks is a keyboard interrupt.

Consequently, I want to first run things against a single host, and abort if that fails, rather than proceeding to deploy to all hosts. This would give me most of the speed advantage parallel execution has over serial execution (checking whether the app came back up involves a lot of waiting) while preventing the disaster scenario of blithely breaking all the prod hosts. Something like a "canary" pattern. I have been unable to modify the hosts list at runtime, either by passing a list to execute() or by using a settings context.

Using: Python 2, Fabric 1.14.1, Paramiko 2.6.0

You can implement this canary pattern in the following way:

from fabric.api import execute, parallel, runs_once, task


def _one_off_generator(fn, one_off=True):
    """Wrap fn to run once (the canary) or in parallel on every host."""
    if one_off:
        @runs_once
        def once(*args):
            return fn(*args)
        return once
    else:
        @parallel
        def many(*args):
            return fn(*args)
        return many


@task
@runs_once
def deploy(fn):
    # Canary: @runs_once limits this to a single host, and because it
    # runs serially, a failure here aborts the session before any of
    # the remaining hosts are touched.
    canary_deploy = _one_off_generator(_deploy)
    execute(canary_deploy, fn)

    # The canary succeeded, so roll out to the rest in parallel.
    parallel_deploy = _one_off_generator(_deploy, one_off=False)
    execute(parallel_deploy, fn)


@task
def _deploy(fn):
    # deploy things
    pass
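The helper works by choosing a decorator at call time rather than at definition time. Stripped of Fabric, the same pattern looks like this (the labels and names here are purely illustrative):

```python
import functools


def pick_wrapper(fn, canary=True):
    """Return fn wrapped by one of two decorators chosen at runtime."""
    def tagged(label):
        def decorator(f):
            @functools.wraps(f)
            def wrapper(*args, **kwargs):
                # Record which variant ran, alongside the real result.
                return label, f(*args, **kwargs)
            return wrapper
        return decorator
    if canary:
        return tagged("once")(fn)
    return tagged("parallel")(fn)


def greet(name):
    return "hello " + name


canary_greet = pick_wrapper(greet)                # "once" variant
fleet_greet = pick_wrapper(greet, canary=False)   # "parallel" variant
```

In the Fabric version, tagged("once") and tagged("parallel") correspond to @runs_once and @parallel; wrapping the same underlying _deploy twice lets the two execute() calls share one implementation.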
