
Best Approach - Monitoring Data Collector scripts (Python)

We maintain 15 data collectors, written in Python. Each of them collects data, performs some manipulation, and sends the data (via a REST API) to some data source.

These collectors run as cron jobs in our environment, and we are looking for a way to monitor their runs.

What do you think would be the best approach for monitoring here? We want to know whether each and every collector has finished its run successfully, and if not, what the reason is.

Should we use a Jenkins server and write pipelines for these scripts? Should we use GitHub Actions?

Any other suggestions would be great.

I would set up the cron jobs to log their output into /var/your-project-name/collector-xy.log.
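One way to get those log files is to let cron redirect each collector's stdout/stderr into the file (e.g. appending >> /var/your-project-name/collector-xy.log 2>&1 to the crontab line) and have the script print an explicit outcome line at the end. Here is a minimal sketch of the collector side, assuming marker strings of your own choosing:

    import sys
    import traceback

    def main():
        # ... collect the data, transform it, and send it via the REST API here ...
        pass

    if __name__ == "__main__":
        try:
            main()
            print("run is successful")       # marker the checker will look for
        except Exception:
            print("run failed because:")     # failure marker, followed by the reason
            traceback.print_exc(file=sys.stdout)
            sys.exit(1)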
Then create a script which connects to the computer(s) where the data collectors run and grabs the ~15 files (this can be as simple as a single scp).
Now that the files are available locally, check their content for "run is successful" or "run failed because...", or whatever markers you decide to use to indicate the outcome of your data collectors' jobs (this can be as simple as a single grep).
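Put together, the checker can itself be a short Python script. This is only a sketch; the host name, remote log directory, and marker strings are assumptions you would replace with your own:

    import glob
    import os
    import subprocess
    from pathlib import Path

    # Assumed values; replace with your real host, paths, and marker strings.
    REMOTE_LOGS = "user@collector-host:/var/your-project-name/*.log"
    LOCAL_DIR = "collector-logs"
    SUCCESS_MARKER = "run is successful"
    FAILURE_MARKER = "run failed because"

    def fetch_logs():
        """Copy the ~15 log files locally in a single scp call."""
        os.makedirs(LOCAL_DIR, exist_ok=True)
        subprocess.run(["scp", REMOTE_LOGS, LOCAL_DIR], check=True)

    def check_logs():
        """Report, per collector, whether the last run succeeded and, if not, why."""
        for path in sorted(glob.glob(f"{LOCAL_DIR}/*.log")):
            text = Path(path).read_text()
            if SUCCESS_MARKER in text:
                print(f"{path}: OK")
            elif FAILURE_MARKER in text:
                # Print the failure line so the reason is visible at a glance.
                reason = [l for l in text.splitlines() if FAILURE_MARKER in l][-1]
                print(f"{path}: FAILED - {reason}")
            else:
                print(f"{path}: no outcome marker found")

    if __name__ == "__main__":
        fetch_logs()
        check_logs()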

You can run this script yourself on your own machine, or create a Jenkins job to run it and act on the result; the choice is up to you.
