
Duplex streaming in Java EE

I'm looking for a full duplex streaming solution with Java EE.

The situation: client applications (JavaFX) read data from a peripheral device. This data needs to be transferred in near real time to a server for processing, and the client needs to receive the response asynchronously, all while it keeps sending new data for processing.

Communication with the server needs to have as low an overhead as possible. The incoming data is basically sensor data, and after processing it is turned into what can be described as a set of commands.

What I've looked into:

  1. A TCP/IP server (this is a non-Java EE approach). This would be the obvious solution: two connections opened in parallel from each client app, one for upstream data and one for downstream data.
  2. Remote & stateless EJBs. This would mean that there's no streaming involved and that I pack sensor data into smaller windows (1-2 seconds' worth of sensor data), which I then send to the server for processing, getting the processing result as a response. While this approach is scalable, I am not sure how fast it will be, considering I have to make a request every 1-2 seconds. I still need to test this, but I have my doubts (see the sketch after this list).
  3. RMI. Is this any different than EJBs, technically?
  4. Two servlets (up/down) with long polling. I've not done this before, so it's something to be tested.
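
For reference, here is a minimal sketch of what approach #2 could look like, assuming a remote stateless bean and a deliberately simple payload. The interface, bean, and types below are illustrative, not an existing API:

    // SensorProcessor.java -- remote business interface (names are illustrative)
    import java.util.List;
    import javax.ejb.Remote;

    @Remote
    public interface SensorProcessor {
        // One round trip per 1-2 second window of raw sensor readings.
        List<String> process(List<double[]> window);
    }

    // SensorProcessorBean.java -- server-side stateless session bean
    import java.util.ArrayList;
    import java.util.List;
    import javax.ejb.Stateless;

    @Stateless
    public class SensorProcessorBean implements SensorProcessor {
        @Override
        public List<String> process(List<double[]> window) {
            List<String> commands = new ArrayList<>();
            // ... turn the buffered readings into commands here ...
            return commands;
        }
    }

The client would look up the remote interface via JNDI and call process() once per window; each call is a full request/response round trip, which is exactly the latency cost that needs measuring.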

For now I would like to test the performance of approach #2. The first solution will work for sure, but I'm not too fond of having a separate server (next to Tomcat, where I already have something running).

In the meantime, it would be worth knowing whether there are any other Java-specific technologies (EE or not) that could easily solve this. If anyone has an idea, please share it.

This looks like a good place for using JMS. Instead of stateless EJBs, you will probably be using Message-Driven Beans.

This gives you an approach similar to your first solution, using two message queues instead of TCP/IP connections. JMS makes your communications fully asynchronous and is low-overhead in the sense that your clients can send messages as fast as they can regardless of how fast your server can consume them. You also get delivery guarantees and other JMS goodness.
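
As a rough server-side sketch, assuming a Java EE 7 / JMS 2.0 container and two queues (the queue names and the plain-text payload below are assumptions for illustration), an MDB consumes each sensor message and pushes the resulting commands to a reply queue:

    // Server side: MDB consuming sensor messages, replying on a command queue.
    import javax.annotation.Resource;
    import javax.ejb.ActivationConfigProperty;
    import javax.ejb.MessageDriven;
    import javax.inject.Inject;
    import javax.jms.JMSContext;
    import javax.jms.Message;
    import javax.jms.MessageListener;
    import javax.jms.Queue;

    @MessageDriven(activationConfig = {
        @ActivationConfigProperty(propertyName = "destinationLookup",
                                  propertyValue = "jms/SensorQueue"),
        @ActivationConfigProperty(propertyName = "destinationType",
                                  propertyValue = "javax.jms.Queue")
    })
    public class SensorDataMDB implements MessageListener {

        @Inject
        private JMSContext jms;

        @Resource(lookup = "jms/CommandQueue")
        private Queue commandQueue;

        @Override
        public void onMessage(Message message) {
            try {
                String sensorData = message.getBody(String.class);
                String commands = process(sensorData);         // your processing logic
                jms.createProducer().send(commandQueue, commands);
            } catch (Exception e) {
                throw new RuntimeException(e);                  // let the container redeliver
            }
        }

        private String process(String sensorData) {
            // ... turn the raw readings into a set of commands ...
            return "commands";
        }
    }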

Tomcat does not come with JMS, however. You might try TomEE or integrate your existing Tomcat with a JMS implementation like ActiveMQ.
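
On the JavaFX client no container is needed; a plain JMS client talking to the broker is enough. A sketch against ActiveMQ 5.x (the broker URL, queue names, and payload are assumptions and must match whatever the server side is configured with):

    // Client side: sends sensor readings, receives commands asynchronously.
    import javax.jms.Connection;
    import javax.jms.MessageConsumer;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import org.apache.activemq.ActiveMQConnectionFactory;

    public class SensorClient {
        public static void main(String[] args) throws Exception {
            Connection connection =
                new ActiveMQConnectionFactory("tcp://server:61616").createConnection();
            connection.start();

            // Separate sessions: a JMS session is single-threaded, and the
            // command listener runs on the provider's thread.
            Session sendSession = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Session receiveSession = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

            Queue sensorQueue = sendSession.createQueue("SensorQueue");
            Queue commandQueue = receiveSession.createQueue("CommandQueue");

            // Downstream: commands arrive asynchronously, independent of sends.
            MessageConsumer consumer = receiveSession.createConsumer(commandQueue);
            consumer.setMessageListener(msg -> System.out.println("command: " + msg));

            // Upstream: keep sending readings as fast as the device produces them.
            MessageProducer producer = sendSession.createProducer(sensorQueue);
            producer.send(sendSession.createTextMessage("sensor reading ..."));
        }
    }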

There are numerous options you could try. The appropriate solution depends on the nature of your application, the communication protocol, the type of data being transferred, how much control you have over the client and server, and any firewall restrictions on the client-server route.

There's not much info on this in your question, but given what you have provided, you may like to look at Netty, as it is quite general-purpose and flexible and seems to fit your requirements. Netty also includes a duplex WebSocket implementation. Note that a Netty-based solution may be more complex to implement and require more background study than some other solutions (such as JMS).
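
To make that concrete, a bare-bones Netty 4 duplex server might look roughly like the following; the port, line-based framing, and handler logic are assumptions, and a real setup would use a codec matching your sensor data format:

    // Minimal Netty 4 duplex server: reads sensor lines, writes command lines back.
    import io.netty.bootstrap.ServerBootstrap;
    import io.netty.channel.ChannelHandlerContext;
    import io.netty.channel.ChannelInitializer;
    import io.netty.channel.EventLoopGroup;
    import io.netty.channel.SimpleChannelInboundHandler;
    import io.netty.channel.nio.NioEventLoopGroup;
    import io.netty.channel.socket.SocketChannel;
    import io.netty.channel.socket.nio.NioServerSocketChannel;
    import io.netty.handler.codec.LineBasedFrameDecoder;
    import io.netty.handler.codec.string.StringDecoder;
    import io.netty.handler.codec.string.StringEncoder;

    public class SensorServer {
        public static void main(String[] args) throws Exception {
            EventLoopGroup boss = new NioEventLoopGroup(1);
            EventLoopGroup workers = new NioEventLoopGroup();
            try {
                new ServerBootstrap()
                    .group(boss, workers)
                    .channel(NioServerSocketChannel.class)
                    .childHandler(new ChannelInitializer<SocketChannel>() {
                        @Override
                        protected void initChannel(SocketChannel ch) {
                            ch.pipeline().addLast(
                                new LineBasedFrameDecoder(8192),
                                new StringDecoder(),
                                new StringEncoder(),
                                new SimpleChannelInboundHandler<String>() {
                                    @Override
                                    protected void channelRead0(ChannelHandlerContext ctx,
                                                                String sensorLine) {
                                        // Push commands back on the same connection,
                                        // asynchronously, as readings come in.
                                        ctx.writeAndFlush("command for: " + sensorLine + "\n");
                                    }
                                });
                        }
                    })
                    .bind(9090).sync().channel().closeFuture().sync();
            } finally {
                boss.shutdownGracefully();
                workers.shutdownGracefully();
            }
        }
    }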

Yet another possible solution is GraniteDS, which advertises a JavaFX client integration and multiple server integrations for full duplex client/server communication, though I have not used it. GraniteDS uses Comet (your two asynchronous servlets with long polling model) with the Action Message Format (AMF) for data, which you may be familiar with from Flex/Flash.

Have you looked at WebSockets as a solution? They keep a single persistent, full-duplex connection open, so asynchronous responses can be pushed back to the client with very little overhead.
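
A rough server-side sketch using the Java WebSocket API (JSR 356, supported by Tomcat 7.0.47+ and Tomcat 8); the endpoint path and text payload are assumptions:

    // JSR 356 endpoint: receives sensor messages, pushes commands back
    // asynchronously over the same connection.
    import javax.websocket.OnMessage;
    import javax.websocket.Session;
    import javax.websocket.server.ServerEndpoint;

    @ServerEndpoint("/sensors")
    public class SensorEndpoint {

        @OnMessage
        public void onSensorData(String sensorData, Session session) {
            String commands = process(sensorData);        // your processing logic
            // Non-blocking send; the client keeps streaming while this goes out.
            session.getAsyncRemote().sendText(commands);
        }

        private String process(String sensorData) {
            // ... turn the raw readings into a set of commands ...
            return "commands";
        }
    }

On the JavaFX side, a javax.websocket client container (e.g. Tyrus) can open the same connection and register a message handler for the incoming commands.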
