
Ansible AWX: Behavior of Job Slicing

I am running a task against 3 servers and I have set the job slicing option to 3.

After execution completes, I see the following behavior:

Job 1: task executed for host-A
Job 2: task executed for host-B and host-C
Job 3: no hosts matched.

My understanding is that Job 3 should have executed the task for host-B or host-C. Is there a specific reason for this behavior?

I tried setting job slicing = 4, but Job 3 and Job 4 still result in "no hosts matched".

Regarding

My understanding is that Job 3 should have executed the task for host-B or host-C.

According to the documentation, Job Slicing - Considerations, that does not seem to be the case:

... a sliced job splits each inventory into a number of “slice size” chunks. It then queues jobs of ansible-playbook runs on each chunk of the appropriate inventory. The inventory fed into ansible-playbook is a pared-down version of the original inventory that only contains the hosts in that particular slice.
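To make the pared-down inventory concrete, here is a minimal Python sketch of splitting a host list into near-equal contiguous chunks, one per sliced job. This is an illustration of the concept only, not AWX's actual implementation (its real distribution logic may differ); the host names are the ones from the question.

```python
# Minimal sketch: split an inventory's host list into `slice_count`
# near-equal chunks, one per sliced job. Illustrative only -- AWX's
# real slicing logic may distribute hosts differently.
def slice_inventory(hosts, slice_count):
    base, extra = divmod(len(hosts), slice_count)
    chunks, start = [], 0
    for i in range(slice_count):
        size = base + (1 if i < extra else 0)  # first `extra` chunks get one more host
        chunks.append(hosts[start:start + size])
        start += size
    return chunks

hosts = ["host-A", "host-B", "host-C"]
for n, chunk in enumerate(slice_inventory(hosts, 3), start=1):
    print(f"Job {n}: {chunk or 'no hosts'}")
# Under an even split like this, each of the 3 jobs gets exactly one
# host, which is why the observed 1/2/0 distribution is surprising.
```

Note that with slice_count=4 and only 3 hosts, one chunk is necessarily empty under any near-equal split, so at least one "no hosts matched" job is expected in that case.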

Usually, a single job is executed on a single node in your Ansible Tower cluster. Job slicing is meant to distribute one job across multiple Ansible Tower nodes.

... sliced jobs follow normal scheduling behavior (number of forks, queuing due to capacity, assignation to instance groups based on inventory mapping)

If you have ~90 hosts in your inventory and configure 3 slices, a workflow with 3 jobs, each covering 30 hosts, will be created and executed in parallel. Even in your example, running all slices in parallel requires at least 3 Ansible Tower nodes (maybe your cluster has only 2?).
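To verify how AWX actually distributed the slices, you can inspect the sliced workflow through the REST API. The following is a hedged sketch: the URL, token, and workflow job ID are placeholders, and the field names (job_slice_number, job_slice_count) are what recent AWX/Tower versions report on sliced jobs; check them against your API version.

```python
# Hedged sketch: list the jobs spawned by a sliced workflow and print
# each job's slice metadata. URL, token, and workflow ID are
# placeholders -- adjust for your installation.
import requests

AWX_URL = "https://awx.example.com"                   # assumption: your AWX/Tower base URL
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # assumption: a valid OAuth2 token
WORKFLOW_JOB_ID = 42                                  # assumption: ID of the sliced workflow job

nodes = requests.get(
    f"{AWX_URL}/api/v2/workflow_jobs/{WORKFLOW_JOB_ID}/workflow_nodes/",
    headers=HEADERS,
).json()

for node in nodes["results"]:
    job_ref = node["summary_fields"].get("job")
    if not job_ref:  # node never spawned a job
        continue
    job = requests.get(
        f"{AWX_URL}/api/v2/jobs/{job_ref['id']}/", headers=HEADERS
    ).json()
    print(job["id"], job.get("job_slice_number"), job.get("job_slice_count"))
```

Comparing each slice's job against the hosts it reports on should show whether the uneven 1/2/0 split came from the slicing itself or from how the jobs were scheduled across your cluster.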

Further Documentation
