I have a server.js file that uses child_process to call a python file:
const express = require('express');
const app = express();
const path = require('path');
const fs = require('fs');

app.get('/call_script', function (req, res) {
    const function_choice = req.query.function_choice;
    const args_choice = req.query.args_choice;
    // function_args is built from function_choice and args_choice (details omitted)
    // call python scripts
    var spawn = require("child_process").spawn;
    var process = spawn("python", ["test.py", function_args]);
    process.stdout.on("data", function (data) {
        var from_python = data.toString("utf8");
        from_python = from_python.replace(/\n/g, " ").replace(/\\n/g, " ");
        res.json({
            "answer": from_python
        });
    });
});
The python file:
import sys

function_args = sys.argv[1]
# ..... retrieve function_choice and arguments (arr_n) from function_args
result = locals()[function_choice](*arr_n)
print(result)
sys.stdout.flush()
If I run node server.js in the terminal, everything works perfectly. I can call the express endpoint via a URL with Ajax; the python file gets called and returns the output correctly. No problems.
If I now place server.js into a Docker container, the output from python is not returned. There are no errors or crashes, just nothing: the Ajax call simply hangs and returns nothing.
The Dockerfile and package.json contain all the necessary commands and dependencies, and the container runs with everything working EXCEPT the python output.
I know there are known issues with Docker/node and stdout, but I have not been able to get the python output out of the container.
If I put a console.log inside process.stdout.on, neither the terminal nor the Docker logs show anything.
Is it something to do with buffering or flushing?
EDIT
It seems to have something to do with modules not being made available. For example, matplotlib is not found, even though it installs during the docker build.
Here is my Dockerfile:
FROM node:9
# Create app directory
WORKDIR /usr/src/app
# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
COPY package*.json ./
RUN npm install
RUN apt-get update -y && \
    apt-get install -y python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*
# Bundle app source
COPY . .
RUN pip3 install -r requirements.txt
EXPOSE 8899
CMD ["npm","start"]
And this is my requirements.txt
matplotlib
EDIT2
pip3 show matplotlib
Name: matplotlib
Version: 2.1.1
Location: /usr/local/lib/python3.4/dist-packages
Requires: numpy, six, python-dateutil, pytz, cycler, pyparsing
import sys
for p in sys.path: print(p)
/usr/lib/python3.4
/usr/lib/python3.4/plat-x86_64-linux-gnu
/usr/lib/python3.4/lib-dynload
/usr/local/lib/python3.4/dist-packages
/usr/lib/python3/dist-packages
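Since the Dockerfile installs python3 and pip3 but server.js spawns "python", it can be worth confirming which interpreter the spawned process actually runs; a quick diagnostic sketch:

```python
import sys

# Print the interpreter's version, path, and module search path; if
# pip3-installed packages are "not found", the spawned "python" may be a
# different interpreter than the python3 that pip3 installed into.
print(sys.version)
print(sys.executable)
for p in sys.path:
    print(p)
```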
tl;dr: add the -u option when you call the python executable.
If I put a console.log inside process.stdout.on, neither the terminal nor the Docker logs show anything.
Is it something to do with buffering or flushing?
Yes. During my testing, it seems that when you use the default node Docker image there are some issues with the buffering of the stdout stream, as described in the linked issue https://github.com/nodejs/docker-node/issues/453 .
From the python documentation, this is documented in the sys.stdout section: https://docs.python.org/3/library/sys.html#sys.stdout
When interactive, stdout and stderr streams are line-buffered. Otherwise, they are block-buffered like regular text files. You can override this value with the -u command-line option.
I solved this issue by calling the python executable with the additional -u option, which forces the stdout and stderr streams to be unbuffered ( https://docs.python.org/3/using/cmdline.html#cmdoption-u ).
In your case, the following would probably fix your issue:
var process = spawn("python", ["-u", "test.py", function_args]);
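If the spawn call cannot easily be changed, the script itself can flush its own output per write instead. A small sketch using an in-memory stream to show the flush=True keyword on print (available in Python 3):

```python
import io
import sys

# print(..., flush=True) forces each write out immediately, which has the
# same per-call effect as running the interpreter with -u
buf = io.StringIO()
print("result", file=buf, flush=True)
captured = buf.getvalue()

# after raw writes to the real stdout, an explicit flush does the same
sys.stdout.write("")
sys.stdout.flush()
```

Setting the environment variable PYTHONUNBUFFERED=1 for the child process is another documented equivalent of the -u option.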