
Deploying Python and Dependencies to Elastic Beanstalk

I have two Python projects that share some common libraries, all organized into three git repositories: project1, project2, and common-lib. The two projects are each meant to be deployed to Elastic Beanstalk bundled with common-lib.

I'm trying to find the most idiomatic way to structure these projects so that it's easy both to develop locally and to build a zip file for deployment with eb deploy.

Setting everything up for local development is easy: just check out each repo and run python setup.py develop in common-lib to make the common libraries available in the virtualenv.
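A minimal sketch of that local setup, using a throwaway stand-in for common-lib (the package name, module contents, and greet() function are all illustrative; pip install -e is the modern equivalent of python setup.py develop):

```shell
set -e
cd "$(mktemp -d)"

# Stand-in for the common-lib checkout (names are illustrative)
mkdir -p common-lib/common_lib
printf 'def greet():\n    return "hello from common-lib"\n' \
    > common-lib/common_lib/__init__.py
cat > common-lib/setup.py <<'EOF'
from setuptools import setup, find_packages

setup(name="common-lib", version="0.1", packages=find_packages())
EOF

# Editable install: the package resolves to the checkout itself,
# so edits are picked up without reinstalling
python3 -m pip install -q -e ./common-lib --no-build-isolation

python3 -c "import common_lib; print(common_lib.greet())"
```

After this, both project1 and project2 can import common_lib in the same environment while you edit it in place.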

For EB deployment, it would be nice to have a single setup.py command that produces an EB-compatible zip file containing the project and common-lib, along with a requirements.txt listing the pip dependencies for both. I have not yet found an easy way to do this, which is a bit surprising because I imagine this is a fairly common scenario.

I can't specify the git repository for common-lib in either project1 or project2's requirements.txt file because the repository won't be reachable from AWS.

For me, the proper way would be to build a Python package from common-lib, publish it to a private PyPI server such as https://gemfury.com/l/pypi-server , and then reference it in requirements.txt like any other package.
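With a private index, the requirements.txt inside the EB bundle could look something like this (the index URL, account name, and version pin are illustrative placeholders, not real values):

```
# requirements.txt
# Extra index where the privately published common-lib lives
--extra-index-url https://pypi.fury.io/your-account/

common-lib==0.1.0
```

pip consults the extra index in addition to PyPI, so common-lib resolves from the private server while public dependencies still come from PyPI.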

Another solution is to include common-lib as a git submodule ( https://git-scm.com/docs/git-submodule ). That preserves the separation, because common-lib still lives in its own repository, while each project carries a simple submodule reference to it.
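A sketch of the submodule approach, using local throwaway repositories in place of real remotes (all paths, file names, and commit messages are illustrative; the protocol.file.allow override is only needed because the "remote" here is a local path):

```shell
set -e
ROOT="$(mktemp -d)"
cd "$ROOT"

# Stand-in for the common-lib repository
git init -q common-lib
echo "shared code" > common-lib/util.py
git -C common-lib add util.py
git -C common-lib -c user.email=demo@example.com -c user.name=demo \
    commit -q -m "add shared util"

# project1 references common-lib as a submodule; .gitmodules records the
# URL so `git clone --recurse-submodules` restores the whole checkout
git init -q project1
cd project1
git -c protocol.file.allow=always \
    submodule add -q "$ROOT/common-lib" common-lib
git -c user.email=demo@example.com -c user.name=demo \
    commit -q -m "vendor common-lib as submodule"

git submodule status
```

Because the submodule's files sit inside the project's working tree, zipping the project directory for eb deploy naturally bundles common-lib with it.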
