
Structuring a Google Cloud Platform project

I'm working on a project that has many small tasks. Some of these tasks are related and require overlapping APIs.

task_1/
    main.py
task_2/
    main.py
apis/
    api_1/
    api_2/
    api_3/
test/
    test_api_1.py
    test_api_2.py
    test_task_1.py
    test_task_2.py
    test_task_3.py

For example, task_1 needs api_1 and api_3, while task_2 needs api_1 and api_2. At first I tried using Google Cloud Functions to execute these tasks, but I ran into the issue that GCF requires local dependencies to live in the same folder as the function being deployed. This would mean duplicating the code from api_1 into task_1. Further, local testing would become more complicated because of the way GCF does imports (as opposed to .mylocalpackage.myscript):

You can then use code from the local dependency, mylocalpackage:

from mylocalpackage.myscript import foo

Is there a way to structure my codebase to enable easier deployment of GCF? Due to my requirements, I cannot deploy each API as its own GCF. Will Google Cloud Run remedy my issues?

Thanks!

To use Cloud Functions for this, you will need to arrange your code so that all the code a function depends on is present within that function's directory at deployment time. This is typically done with a custom build/packaging step that copies files into place before deploying.
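One way to sketch such a packaging step, assuming the directory layout from the question (the dependency mapping and the helper name `vendor_dependencies` are illustrative, not part of any GCF tooling):

```python
# Hypothetical build step: copy each task's API dependencies into that
# task's directory so `gcloud functions deploy` uploads them together.
import shutil
from pathlib import Path

# Assumed mapping of tasks to the APIs they need (from the question).
TASK_DEPS = {
    "task_1": ["api_1", "api_3"],
    "task_2": ["api_1", "api_2"],
}

def vendor_dependencies(root: Path, task: str) -> None:
    """Copy the APIs a task depends on into the task's directory."""
    for api in TASK_DEPS[task]:
        src = root / "apis" / api
        dst = root / task / api
        # dirs_exist_ok lets the step be re-run safely (Python 3.8+).
        shutil.copytree(src, dst, dirs_exist_ok=True)
```

Run this before each deploy; inside task_1/main.py the vendored code is then importable as, e.g., `from api_1.myscript import foo`, matching how GCF resolves local packages.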

To use Cloud Run for this, you need to create a minimal HTTP webserver to route requests to each of your "functions". This might be best done by creating a path for each function you want to support. At that point, you've recreated a traditional web service with multiple resources.
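A minimal sketch of that routing idea, using only the stdlib WSGI interface (in practice you would likely use Flask or FastAPI; the task handlers here are placeholders for the real entry points):

```python
# One HTTP service with a path per "function", suitable for Cloud Run.

def run_task_1():
    # Placeholder for the real task_1 logic (uses api_1 and api_3).
    return "task_1 done"

def run_task_2():
    # Placeholder for the real task_2 logic (uses api_1 and api_2).
    return "task_2 done"

ROUTES = {
    "/task_1": run_task_1,
    "/task_2": run_task_2,
}

def app(environ, start_response):
    """WSGI app dispatching each request path to its task handler."""
    handler = ROUTES.get(environ.get("PATH_INFO", ""))
    if handler is None:
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"not found"]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [handler().encode("utf-8")]
```

Because everything runs in one container, task_1 and task_2 can simply import from a shared apis/ package with no code duplication; a server such as gunicorn would serve `app` on the port Cloud Run provides.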

If these tasks were meant to run as Background Functions, you can wire up a Pub/Sub push integration instead.
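With push delivery, Pub/Sub POSTs a JSON envelope to your endpoint in which the message payload is base64-encoded. A small sketch of decoding it (the helper name is illustrative; the envelope shape is the documented Pub/Sub push format):

```python
# Decode the data field of a Pub/Sub push delivery envelope.
import base64
import json

def parse_push_envelope(body: bytes) -> str:
    """Return the decoded message data from a Pub/Sub push request body."""
    envelope = json.loads(body)
    data = envelope["message"].get("data", "")
    return base64.b64decode(data).decode("utf-8")
```

Your Cloud Run route handler would call something like this on the request body, then dispatch to the appropriate task.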
