I'm trying to set up a Django project inside a 1 GB DigitalOcean droplet. One critical function of the project is importing data from .ods files.
When testing the project with the Django development server (manage.py runserver,
already inside the droplet) I can import quite a large number of rows - up to 20,000 - with no problems and without exhausting the server's memory, since the loop that iterates over the .ods file sleeps 2 seconds every 100 rows.
But when using Gunicorn (behind nginx), only 20 to 30 rows are processed on average. After that I get 504 Gateway Time-out errors.
I have already tried, with no result:
- eventlet, via --worker-class eventlet
- more workers: --workers 6
This is a simplified version of the code:
import time

table = read(path_to_ods_file)  # parse the .ods file into rows
stop_every_rows = 100
rows_done = 0
stop_seconds = 1

for row in table:
    Profile.objects.create(
        first_name=row[0],
        last_name=row[1],  # assuming column 1 holds the last name
    )
    rows_done += 1
    if rows_done >= stop_every_rows:
        # Pause periodically to keep memory and CPU pressure down
        rows_done = 0
        time.sleep(stop_seconds)
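As an aside, the loop above issues one INSERT per row; grouping rows into batches (and handing each batch to Django's bulk_create) would shorten the request considerably. A minimal, framework-free sketch of the batching itself - the helper name `batched` is mine, not from the question:

```python
from itertools import islice

def batched(rows, size):
    """Yield consecutive lists of up to `size` rows."""
    it = iter(rows)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Group rows the same way the loop does (e.g. 100 at a time); inside a
# Django view each chunk could become one Profile.objects.bulk_create(...)
# call instead of 100 separate INSERTs.
rows = [("Ada", "Lovelace"), ("Alan", "Turing"), ("Grace", "Hopper")]
print([len(chunk) for chunk in batched(rows, 2)])  # -> [2, 1]
```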
Something really weird: most of the time not even a single user is created.
This is my gunicorn systemd service:
[Unit]
Description=gunicorn daemon
After=network.target
[Service]
User=app_4
Group=www-data
WorkingDirectory=/home/app_4/backend
ExecStart=/home/app_4/backend/venv/bin/gunicorn --worker-class eventlet --access-logfile - --workers 3 --bind unix:/home/app_4/backend/project/backend$
[Install]
WantedBy=multi-user.target
Since you have already tried tuning Gunicorn, I suggest adjusting the Nginx timeouts instead. Add this to your configuration:
proxy_connect_timeout 600;
proxy_send_timeout 600;
proxy_read_timeout 600;
send_timeout 600;
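For reference, these directives belong in the server or location block that proxies to Gunicorn; a sketch, where the socket path is a placeholder (the real path in the unit file above is truncated):

```nginx
location / {
    # Forward requests to the Gunicorn Unix socket (placeholder path)
    proxy_pass http://unix:/path/to/gunicorn.sock;
    proxy_connect_timeout 600;
    proxy_send_timeout 600;
    proxy_read_timeout 600;
    send_timeout 600;
}
```

Note that Gunicorn also enforces its own worker timeout, 30 seconds by default, which matches the symptom of the import dying after a few dozen rows. A matching --timeout 600 on the gunicorn command line is likely needed as well, otherwise the worker is killed long before Nginx gives up.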