
Java Application on Virtual Machine Performance

I implemented a Java application on my personal machine, where it takes about 20 seconds to execute. I migrated it to a server machine that is 10x as powerful as my personal machine. Unfortunately, on the server the application takes twice the execution time. I am fairly sure the JVM version and settings are the same on both machines, both run in server mode (not client), and both run Windows 7.

The only difference I can point to is that the OS on my machine runs natively, while the server's OS runs in a virtual machine with a large amount of dedicated memory and 2 physical processors. Am I right? Does running a Java application in a Windows virtual machine impact performance despite the 10x more powerful hardware?
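(As a sanity check that the two JVMs really are identical, one can print the identifying system properties on both machines and diff the output; this is a generic sketch, not specific to my setup:)

```java
// Print the properties that identify the JVM and OS, for comparing two machines.
public class JvmInfo {
    public static void main(String[] args) {
        String[] keys = {
            "java.vm.name",       // e.g. "Java HotSpot(TM) 64-Bit Server VM"
            "java.vm.version",
            "java.version",
            "os.name",
            "os.version",
            "os.arch",
            "sun.arch.data.model" // 32-bit vs 64-bit JVM
        };
        for (String key : keys) {
            System.out.println(key + " = " + System.getProperty(key));
        }
        Runtime rt = Runtime.getRuntime();
        System.out.println("availableProcessors = " + rt.availableProcessors());
        System.out.println("maxMemory = " + rt.maxMemory() / (1024 * 1024) + " MB");
    }
}
```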

I have had first-hand experience with massive server systems (from Sun) which had enormous concurrent I/O throughput, but where each individual CPU core was less powerful than a high-end desktop machine's. It is quite possible that the task you are running loads that system in a way it has not been optimized for.

On the other hand, there are many other factors to consider, such as what the rest of the system was doing while you made your measurements, what exactly you were measuring, and whether the JIT compiler had warmed up properly.
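On the JIT point: if you time a single cold run, you are measuring compilation and class loading as much as the application itself. A minimal sketch of warm-up-aware timing (here `runWorkload()` is a placeholder for whatever your application actually does):

```java
// Minimal warm-up-aware timing sketch. runWorkload() is a stand-in
// for the real task being measured.
public class WarmupTimer {
    static void runWorkload() {
        // Placeholder work; replace with the actual application logic.
        long sum = 0;
        for (int i = 0; i < 50_000_000; i++) {
            sum += i;
        }
        if (sum == 42) System.out.println(); // defeat dead-code elimination
    }

    public static void main(String[] args) {
        // Warm-up runs: give the JIT compiler a chance to compile hot code.
        for (int i = 0; i < 5; i++) {
            runWorkload();
        }
        // Measured runs: report each iteration so run-to-run variance is visible.
        for (int i = 0; i < 5; i++) {
            long start = System.nanoTime();
            runWorkload();
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("run " + i + ": " + elapsedMs + " ms");
        }
    }
}
```

If the measured runs on the server are still twice as slow as on your desktop after warm-up, the difference is in the hardware or virtualization layer rather than in JIT compilation overhead.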
