
QProcess, Cannot Create Pipe

I am running a QProcess in a timer slot at 1 Hz. The process is designed to invoke a Linux command and parse its output.

The problem is this: after the program runs for about 20 minutes, I get this error:

QProcessPrivate::createPipe: Cannot create pipe 0x104c0a8: Too many open files
QSocketNotifier: Invalid socket specified

Ideally, this program would run for the entire uptime of the system, which may be days or weeks.

I think I've been careful with process control by following the examples, but maybe I missed something. The examples on the Qt website use the same code I've written, but they are designed for a single invocation, not thousands. Here is a minimal example:

class UsageStatistics : public QObject {
    Q_OBJECT
public:
    UsageStatistics() : process(new QProcess) {
        timer = new QTimer(this);
        connect(timer, SIGNAL(timeout()), this, SLOT(getMemoryUsage()));
        timer->start(1000); // one second
    }

    virtual ~UsageStatistics() {}

public slots:

    void getMemoryUsage() {
        process->start("/usr/bin/free");
        if (!process->waitForFinished()) {
            // error processing
        }

        QByteArray result = process->readAll();
        // parse result

        // edit: I added these
        process->closeReadChannel(QProcess::StandardOutput);
        process->closeReadChannel(QProcess::StandardError);
        process->closeWriteChannel();
        process->close();
    }

private:
    QProcess *process;
    QTimer *timer;
};

I've also tried manually deleting the process pointer at the end of the function and creating a new one at the beginning. It was worth a try, I suppose.

Free beer for whoever answers this :)

QProcess is derived from QIODevice, so I would say that calling close() should close the file handles and solve your problem.

I cannot see the issue; however, one thing that concerns me is a possible invocation overlap in getMemoryUsage(), where the slot is invoked again before the previous run has finished.

How about restructuring this so that a new QProcess object is used within getMemoryUsage() (on the stack, not new'd), rather than being an instance variable of the top-level class? This would ensure clean-up (with the QProcess object going out of scope) and would avoid any possible invocation overlap.
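A minimal sketch of that restructuring, assuming the timer setup from the question stays the same and only the slot body changes:

void UsageStatistics::getMemoryUsage() {
    QProcess process;                   // local object, destroyed when the slot returns
    process.start("/usr/bin/free");
    if (!process.waitForFinished()) {
        // error processing
        return;
    }
    QByteArray result = process.readAllStandardOutput();
    // parse result
}   // the pipes are released here, together with the QProcess object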

Alternatively, rather than invoking /usr/bin/free as a process and parsing its output, why not read /proc/meminfo directly yourself? This will be much more efficient.
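For example, a rough sketch of reading /proc/meminfo with a QFile; the field names are the standard /proc/meminfo keys, and the parsing is only illustrative:

#include <QFile>
#include <QTextStream>
#include <QDebug>

void getMemoryUsage() {
    QFile meminfo("/proc/meminfo");
    if (!meminfo.open(QIODevice::ReadOnly | QIODevice::Text)) {
        qWarning() << "Cannot open /proc/meminfo";
        return;
    }
    QTextStream in(&meminfo);
    while (!in.atEnd()) {
        QString line = in.readLine();
        if (line.startsWith("MemTotal:") || line.startsWith("MemFree:"))
            qDebug() << line.simplified();  // lines look like "MemTotal: 16340588 kB"
    }
}   // meminfo is closed automatically when it goes out of scope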

First, I was in the same situation as you and got the same results. I think that QProcess cannot handle its opened pipes correctly.

Then, instead of QProcess, I decided to use popen() + QFile.

class UsageStatistics : public QObject {
    Q_OBJECT
public:
    UsageStatistics() {
        timer = new QTimer(this);
        connect(timer, SIGNAL(timeout()), this, SLOT(getMemoryUsage()));
        timer->start(1000); // one second
    }

    virtual ~UsageStatistics() {}

private:
    QTimer *timer;
    QFile freePipe;
    FILE *in;

public slots:

    void getMemoryUsage() {
        if (!(in = popen("/usr/bin/free", "r"))) {
            qDebug() << "UsageStatistics::getMemoryUsage() <<" << "Cannot execute free command.";
            return;
        }

        freePipe.open(in, QIODevice::ReadOnly);
        connect(&freePipe, SIGNAL(readyRead()), this, SLOT(parseResult()));
        // OR waitForReadyRead() and parse here.
    }

    void parseResult() {
        // Parse your stuff
        freePipe.close();
        pclose(in); // You can also get the exit code by dividing the return value by 256.
    }
};
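About the "dividing by 256" comment: pclose() returns a wait()-style status, so the portable way to extract the command's exit code is the WEXITSTATUS macro. A small sketch (not part of the original answer, plain popen() without Qt):

#include <cstdio>
#include <sys/wait.h>

int runAndGetExitCode(const char *cmd) {
    FILE *p = popen(cmd, "r");
    if (!p)
        return -1;                      // popen itself failed
    char buf[256];
    while (fgets(buf, sizeof buf, p))
        ;                               // consume (or parse) the output
    int status = pclose(p);
    if (status == -1 || !WIFEXITED(status))
        return -1;                      // wait failed, or the command did not exit normally
    return WEXITSTATUS(status);         // same value as status / 256 on Linux
}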

tl;dr:
This occurs because your application wants to use more resources (here, open file descriptors) than the system's resource limits allow. You might be able to solve it by raising the limit with the ulimit command described in [2] if you have a genuinely large application, but it is more likely caused by a programming error.
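If you want to check, from inside the application, the per-process limit this error runs into, a getrlimit() sketch like the following works on Linux (RLIMIT_NOFILE is the open-file-descriptor limit, the same value ulimit -n reports):

#include <sys/resource.h>
#include <cstdio>

int main() {
    struct rlimit rl;
    if (getrlimit(RLIMIT_NOFILE, &rl) == 0)
        std::printf("open files: soft limit %llu, hard limit %llu\n",
                    (unsigned long long)rl.rlim_cur,
                    (unsigned long long)rl.rlim_max);
    return 0;
}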

Long:
I just solved a similar problem myself. I use a QThread to log the exit codes of QProcesses. The QThread uses curl to connect to an FTP server and upload the log. Since I was testing the software, I hadn't connected the FTP server, and curl_easy_perform apparently keeps waiting for a connection. As a result, my resource limit was reached and I got this error. After a while my program started complaining, which was the main clue for figuring out what was going wrong.

[..]
QProcessPrivate::createPipe: Cannot create pipe 0x7fbda8002f28: Too many open files
QProcessPrivate::createPipe: Cannot create pipe 0x7fbdb0003128: Too many open files
QProcessPrivate::createPipe: Cannot create pipe 0x7fbdb4003128: Too many open files
QProcessPrivate::createPipe: Cannot create pipe 0x7fbdb4003128: Too many open files
[...]
curl_easy_perform() failed for curl_easy_perform() failed for disk.log
[...]

I verified this by connecting the machine to the FTP server after the error occurred; that solved my problem.

Read:
[1] https://linux.die.net/man/3/ulimit
[2] https://ss64.com/bash/ulimit.html
[3] https://bbs.archlinux.org/viewtopic.php?id=234915
