
Patching Linux systems with Python modules installed via pip

There probably isn't one "right answer" to this question; I'm interested in thoughts and opinions. We have a couple hundred RHEL 7/CentOS 7/Rocky 8 nodes, many of which have Python modules installed via pip/pip3.

I've been searching for best practices on routine/monthly patching of these modules... so far I haven't found any. Obviously, things installed with rpm/yum/dnf are pretty easy to deal with.

From the pip man page:

pip install --upgrade SomePackage

Great! But how do you update all of them?

Sure, it is possible to do a "pip list" or "pip freeze" and pipe that to awk, etc., but surely there's a better way, ideally one that captures things like "boto3 v1.2 replaced with boto3 v1.3". Right now it feels like I'm the only one thinking about this. Maybe I am and it is a stupid question; I'm OK with that response as well (but please tell me why).
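For what it's worth, one way to script that is shown below. This is a minimal sketch, assuming pip3 is on the PATH and reasonably recent (pip >= 9 provides the "--outdated" listing), and that upgrading everything at once is acceptable on the node; the /tmp paths are just for illustration:

    # Record the installed versions before touching anything.
    pip3 freeze > /tmp/pip-before.txt

    # List outdated packages, skip the two header lines, upgrade each by name.
    pip3 list --outdated --format=columns | awk 'NR > 2 { print $1 }' \
        | xargs -r -n1 pip3 install --upgrade

    # The diff records every change, e.g. "boto3==1.2.0" -> "boto3==1.3.0".
    pip3 freeze > /tmp/pip-after.txt
    diff /tmp/pip-before.txt /tmp/pip-after.txt

That is essentially the "pipe it to awk" approach, but the before/after freeze files give you the audit trail of which version replaced which.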

A common solution is to deploy the application code inside a Docker container - the container image contains its own version of Python and all the dependency modules, so you don't have to update each module on every host machine individually. It also means that the combination of OS, Python and modules that you deploy can be tested and then "frozen" into an immutable image which is deployed identically everywhere.
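As a rough illustration of that pattern (everything here is hypothetical: the base image, the "myapp" tag and the "main.py" entry point are placeholders), you pin the tested module versions with pip freeze and bake them into an image:

    # Capture the tested, working module set as exact version pins.
    pip3 freeze > requirements.txt

    # A minimal Dockerfile, written inline for illustration.
    cat > Dockerfile <<'EOF'
    FROM python:3.9-slim
    WORKDIR /app
    # Install the exact versions captured by "pip freeze" above.
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt
    COPY . .
    CMD ["python", "main.py"]
    EOF

    # Patching becomes "refresh the pins, rebuild, redeploy the image".
    docker build -t myapp:2024-06 .

Module patching then happens once, at build time, instead of on every node.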

"Right now it feels like I'm the only one thinking about this."

I realise the above answer is probably not helpful in your situation as you already have a fairly large system deployed... but it might help to explain why not many people are developing solutions to your problem!
