Can one thread handle multiple requests simultaneously?

This question is specific to Tomcat, but answers that are generally applicable to other application servers / servlet containers would be interesting too.

From my understanding, it is guaranteed that each request is handled by a single thread from a request-processing thread pool (let's ignore the situation where the application's request-handling code executes some work asynchronously).

But what I'd like to know is: is it guaranteed that a single thread will only serve one request at a time?

In other words, is it possible that the work of request R1, which is executing on thread T1, is pre-empted, that T1 is then used to process request R2, and that the processing of R1 afterwards continues on T1?

This question can probably be generalized to: can the execution of a Runnable R1 on a thread T1 be 'pre-empted' in favor of the execution of another Runnable R2 on that same thread T1?
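To make that concrete, here is roughly the scenario I have in mind, sketched with a plain single-thread executor (the class and variable names are just illustrative):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class PreemptionQuestion {
    public static void main(String[] args) {
        ExecutorService singleThread = Executors.newSingleThreadExecutor();

        Runnable r1 = () -> System.out.println("R1 on " + Thread.currentThread().getName());
        Runnable r2 = () -> System.out.println("R2 on " + Thread.currentThread().getName());

        singleThread.submit(r1);
        singleThread.submit(r2);   // could this ever start before r1 has returned?

        singleThread.shutdown();
    }
}
```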

I can't get rid of the nagging feeling that I'm just overlooking some fundamental principle of multi-threading in Java, so please enlighten me!

No. The principle behind multithreading is that a processor can appear to be running multiple threads at the same time by switching between them in given time quanta.

But that is the processor's doing. A thread does not switch between units of work; that switching is the processor's job.

Of course, with asynchronous servlets this isn't entirely true. The idea is that a request which performs a long waiting operation (a call to a 3rd-party server, etc.) can free the service thread while it is waiting for the answer, so that a new client request can be handled. However, this is not "regular" thread operation and is handled by the application server.
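As an illustration only, here is a minimal sketch of such an asynchronous servlet using the Servlet 3.0 javax.servlet API (the class name, URL pattern and pool size are made up; newer containers use the jakarta.servlet namespace instead):

```java
import javax.servlet.AsyncContext;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

@WebServlet(urlPatterns = "/slow", asyncSupported = true)
public class SlowBackendServlet extends HttpServlet {

    // Separate worker pool for the long-running call (hypothetical sizing).
    private final ExecutorService workers = Executors.newFixedThreadPool(10);

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // startAsync() tells the container that the response will be completed
        // later, so the current request-processing thread can serve other requests.
        AsyncContext ctx = req.startAsync();

        workers.submit(() -> {
            try {
                String result = callThirdPartyServer();       // long blocking wait (stub)
                ctx.getResponse().getWriter().write(result);
            } catch (IOException e) {
                // real code would log/handle this
            } finally {
                ctx.complete();                                // hand the response back
            }
        });
        // doGet returns immediately; the container's worker thread is free again.
    }

    private String callThirdPartyServer() {
        // placeholder for an expensive remote call
        return "done";
    }
}
```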

In short, the answer is no.

In detail: when you say a thread running a 'Runnable', I think you mean a processor (CPU) running a thread. A thread in Java is a kind of representation of the work being done, and the processor is the actual executor. Multi-threaded execution may happen on a single-core CPU, where the processor switches among multiple threads (each of which has its work already defined) very quickly to give the illusion of simultaneous execution, but it really runs them one at a time.

The same holds for multi-core systems: if the number of threads to be run is greater than the number of cores, the threads are again split among the cores, and each core executes its threads one by one, stopping one and switching to another.

I agree; the short answer is no.

But of course: any "application server" that deserves that name will use a pool of threads into which all "work packages" go.

So, while R1 on T1 will not be interrupted but will run to its completion, T1 will for sure run some R2 afterwards, and so on.
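A small, self-contained sketch (a plain ExecutorService, not Tomcat itself; all names are illustrative) that shows this behaviour: each submitted task runs to completion on one worker thread before that thread picks up the next one.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PoolDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(2);

        for (int i = 1; i <= 6; i++) {
            final int requestId = i;
            pool.submit(() -> {
                String worker = Thread.currentThread().getName();
                System.out.println(worker + " starts R" + requestId);
                try {
                    Thread.sleep(200);          // simulate request handling
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                System.out.println(worker + " finishes R" + requestId);
            });
        }

        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        // In the output, every "starts R<n>" printed by a given worker thread is
        // followed by its matching "finishes R<n>" before that thread starts
        // another task: the pool never interleaves two tasks on one thread.
    }
}
```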
