
Will Java Buffer size be a bottleneck?

I have created a Java application in which a GUI interacts with an external C++ executable.

I am using ProcessBuilder to create the process and attach the in/out/error streams of the C++ executable to the GUI. I am using a buffer of size 1024. What if the C++ program runs at very high speed with lots of printf statements while the Java GUI reads only 1024 characters at a time? Will this create a bottleneck?

public void run()
{
    try
    {
        char[] buffer = new char[1024];
        // Read up to 1024 chars per call until the stream is exhausted (read returns -1)
        for (int n = reader.read(buffer); n != -1; n = reader.read(buffer))
        {
            writeBytes2Text(buffer, 0, n);
        }
    }
    catch (Exception x)
    {
        //some exception
    }
}
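For reference, the reader in that loop would typically come from a ProcessBuilder setup along these lines. This is only a minimal sketch: the executable name "./native_app", the merged error stream, and the console printing are illustrative assumptions, not code from the question.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class NativeProcessLauncher
{
    public static void main(String[] args) throws IOException
    {
        // "./native_app" is a placeholder for the external C++ executable.
        ProcessBuilder pb = new ProcessBuilder("./native_app");
        pb.redirectErrorStream(true); // merge stderr into stdout so one reader sees both

        Process process = pb.start();

        // Character reader over the child's stdout; this plays the role of "reader" above.
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()));

        char[] buffer = new char[1024];
        for (int n = reader.read(buffer); n != -1; n = reader.read(buffer))
        {
            System.out.print(new String(buffer, 0, n)); // stand-in for writeBytes2Text
        }
    }
}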

No matter what the buffer size is, the bottleneck will stay. The only real way out is to optimize the Java side as much as you can: for example, update the GUI at well-spaced intervals, buffering as much as you need to bridge the time gap. Less frequent, coarser-grained updates will usually improve throughput.
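One way to do this, assuming the GUI is Swing-based (the class and method names below are illustrative, not from the answer): accumulate output on the reader thread and flush it to the text component on a timer.

import javax.swing.JTextArea;
import javax.swing.Timer;

public class CoalescingOutputPane
{
    private final JTextArea textArea = new JTextArea();
    private final StringBuilder pending = new StringBuilder();

    public CoalescingOutputPane()
    {
        // Flush the accumulated output to the GUI about 10 times per second,
        // instead of touching the GUI once per read() call.
        Timer flushTimer = new Timer(100, e -> flush());
        flushTimer.start();
    }

    // Called from the background reader thread for every chunk read,
    // e.g. in place of writeBytes2Text(buffer, 0, n).
    public synchronized void append(char[] buffer, int off, int len)
    {
        pending.append(buffer, off, len);
    }

    // javax.swing.Timer fires its callback on the Event Dispatch Thread,
    // so it is safe to update the JTextArea here.
    private synchronized void flush()
    {
        if (pending.length() > 0)
        {
            textArea.append(pending.toString());
            pending.setLength(0);
        }
    }

    public JTextArea getTextArea()
    {
        return textArea;
    }
}

Batching like this turns a flood of tiny updates into one append per timer tick, so the reader thread never waits on the GUI and the Event Dispatch Thread does far less work.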

A 1024-byte buffer is fine, but 4096 might be slightly better.

Most of the delays will come from context switching and from sleeping and waking up threads. Unless the Java-side buffer is much smaller than the C++ side's output buffer, the buffer size is not going to introduce throughput issues.

The C++ output streams will probably be working with a default 4096-byte output buffer. If the C++ process is printing output slowly, you won't see any of it on the Java side until about 4096 bytes have accumulated (or the stream is flushed, or the process exits).
