
How can two Python applications hosted on different servers communicate?

I'm having a hard time figuring out how to solve a problem in a little project.

Basically, I have a Django application, and separately I have an external Python script running. I would like to create a system where, each time a form in my Django app is submitted, the submitted data is sent to the external Python application.

The external Python service should receive the data, read it, and, depending on who the user is and what they submitted, perform some tasks and then send a response.

Here is what I thought of:

1) Connect the external Python app to the same database that Django is using, so that when the form is submitted it is saved to the database and the data can be 'shared' with the second Python service. The problem with this solution is that the second app would need to poll the database every second and run a lot of queries, which would hurt performance.

2) Create an API endpoint, so that the external Python app connects to the endpoint and fetches the data saved in the database from there. The problem is the same as with the first solution.

Would a service like Redis or RabbitMQ help in this case?

Importing the external Python process into my Django app is not a solution; it needs to stay separate from the Django app. An important requirement here is speed: when new data is submitted, it needs to reach the second Python app as quickly as possible.

That said, I'm open to any advice or possible solution to this problem. Thanks in advance :)

You could use a microservices architecture to build this. Instead of sharing a database between the two applications, you have them communicate with each other through web requests: Django sends a request to your other app with the relevant data, and the other server responds with the results.

Usually one would use something like Flask (a synchronous server) or Sanic (an asynchronous server) to receive and reply, but you can also look into something like Nameko. I would also recommend looking into Docker, as you'll eventually need it once you set up more of these microservices.


The idea (here using Flask) is to create an endpoint that does some computation on your data and returns the result to the Django server.

computation.py
from flask import Flask
from flask import request

app = Flask(__name__)

# Accept a JSON POST from the Django app, do your computation here,
# and send the result back in the response body.
@app.route("/", methods=["POST"])
def computation():
    data = request.get_json()
    print(data)

    return f"Hey! {data}"

app.run(host="0.0.0.0", port=8090)

The Django server then simply sends a request to your server application.

django_mock.py
import requests

# Send the form data as JSON to the computation service and print the reply.
req = requests.post('http://0.0.0.0:8090/', json={"data": "Hello"})
print(req.text)

The above will print out on the computation.py app:

{'data': 'Hello'}

and will print out on the django_mock.py example:

Hey! {'data': 'Hello'}

You should build an API. The 2nd app becomes an application server, and the 1st app, when it receives a form submission from the user, persists the data to the DB and then makes an API call to the 2nd app. You would include key information in the API request that identifies the record in the DB (see the sketch below).
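For illustration, a minimal sketch of what the 1st app's side could look like; the ModelForm, host, port and URL path are assumptions, not something from the question:

import requests

def handle_form(form):
    # Persist the submission first (assumes a Django ModelForm).
    submission = form.save()
    # Notify the 2nd app, sending only the id that identifies the record;
    # host, port and path are placeholders.
    resp = requests.post(
        "http://app-server:8090/api/tasks/",
        json={"record_id": submission.pk},
        timeout=5,
    )
    return resp.json()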

You can use Django (e.g. DRF, the Django REST Framework) or Flask to implement a simple API server in Python.
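On the 2nd app's side, a rough sketch with DRF (the view name, route and payload shape are again assumptions):

from rest_framework.views import APIView
from rest_framework.response import Response

class TaskView(APIView):
    def post(self, request):
        # The 1st app sends the id of the DB record to process.
        record_id = request.data.get("record_id")
        # Look the record up in the DB, perform the task here,
        # then reply with the result.
        return Response({"status": "done", "record_id": record_id})

# urls.py: path("api/tasks/", TaskView.as_view())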

Now, this requires your app server to be up and running all the time. What if it's down? What should the 1st app do? If you need this level of flexibility, then you need to decouple the apps in some way: either the 1st app implements some kind of backoff/retry when it can't reach the 2nd app, or you use a reliable queueing mechanism (something like Amazon SQS).
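If you go the queueing route, a minimal sketch with SQS via boto3 might look like this (the queue URL and payload are placeholders, and AWS credentials are assumed to be configured):

import json
import boto3

sqs = boto3.client("sqs")
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/form-events"  # placeholder

# 1st app: enqueue the record id after saving the form.
sqs.send_message(QueueUrl=queue_url, MessageBody=json.dumps({"record_id": 42}))

# 2nd app: long-poll for new messages and process them.
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=20)
for msg in resp.get("Messages", []):
    data = json.loads(msg["Body"])
    # ... perform the task for data["record_id"] ...
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])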
