
Java Server Side Client Screen Resolution Detection

I'll start by saying that this question has been answered in some depth at: How can I get the screen resolution in java?

My question is not so much how to get the client's screen resolution, but rather to understand the values that Toolkit, GraphicsDevice, and JavaScript screen detection return, and more importantly why they aren't consistent. Through that I hope to arrive at a better way to detect a user's (client-side) screen resolution.

I am testing on my own machine and have set up multiple ways of detecting the client-side screen resolution. I have a dual-monitor setup, but I'm only running the browser on one of the monitors. The first monitor is 1920x1080, the second 1680x1050.

Toolkit toolkit = Toolkit.getDefaultToolkit();
Dimension dimension = toolkit.getScreenSize();
System.out.println(dimension.width + "/" + dimension.height); // prints 3600/1093

This method seems the most precise since it's actually returning the combined resolution of both monitors. It returns a value of 3600x1093.

GraphicsDevice gd = GraphicsEnvironment.getLocalGraphicsEnvironment().getDefaultScreenDevice();
int width = gd.getDisplayMode().getWidth();
int height = gd.getDisplayMode().getHeight();
System.out.println(width + "/" + height); // prints 1920/1080

This method seems to return the value of the first monitor. I'm assuming I can get the second monitor by changing the default screen device, but there is no way for me to know which monitor the user is looking at, so that's a moot point. Nonetheless, it returns a value of 1920x1080.
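For what it's worth, every monitor attached to the machine the JVM is running on can be enumerated via GraphicsEnvironment.getScreenDevices(). Below is a minimal sketch; the ListScreens class name is just for illustration, and on a headless server the call throws a HeadlessException because there are no displays at all:

import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;

public class ListScreens {
    public static void main(String[] args) {
        // Lists every display attached to the machine executing this JVM;
        // on a server that is the server's own hardware, not the client's.
        GraphicsDevice[] devices =
                GraphicsEnvironment.getLocalGraphicsEnvironment().getScreenDevices();
        for (GraphicsDevice device : devices) {
            int w = device.getDisplayMode().getWidth();
            int h = device.getDisplayMode().getHeight();
            System.out.println(device.getIDstring() + ": " + w + "/" + h);
        }
    }
}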

var width = screen.width;
var height = screen.height;
console.log(width + "/" + height); // prints 1680/1050

It appears JavaScript is the most precise (obviously...) because it reports the resolution of the screen the browser is actually running on. I understand I can't achieve this server side, but I included the JavaScript because I am comparing the server-side values against it.

First and foremost, the better choice seems to be the Toolkit method. Does it actually detect the client's resolution, or is it returning the value of the monitors that the server is currently running on? If it's returning the client's values, how is it getting them?

The primary reason for client-side resolution detection on the server is that I want to build an adaptive images server: detect the maximum screen size and return appropriately sized images.
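Whichever detection method is used, the width ultimately has to reach the server from the client (see the answer below); once it does, choosing an image variant is simple. A minimal sketch, assuming hypothetical pre-rendered variants at 480, 960 and 1920 pixels wide:

// Hypothetical helper: maps a client-reported screen width to the
// smallest pre-rendered variant that is at least that wide.
public class ImageVariantSelector {
    // Assumed variant widths; adjust to whatever sizes are actually rendered.
    private static final int[] VARIANT_WIDTHS = {480, 960, 1920};

    public static int pickVariant(int clientScreenWidth) {
        for (int variant : VARIANT_WIDTHS) {
            if (variant >= clientScreenWidth) {
                return variant;
            }
        }
        // The screen is wider than the largest variant; serve the largest.
        return VARIANT_WIDTHS[VARIANT_WIDTHS.length - 1];
    }

    public static void main(String[] args) {
        System.out.println(pickVariant(1680)); // 1920
        System.out.println(pickVariant(800));  // 960
    }
}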

Can anyone explain the difference between the methods and which one (or perhaps another approach) is best for detecting the client's resolution?

The Java version will only return the server's values, because it runs purely server side. The only way to have client-side Java is by using an applet.

Therefore, I'd recommend a JavaScript solution: detect the resolution in the browser and pass it back to the server.
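One common pattern, sketched below under several assumptions: a small JavaScript snippet writes screen.width into a cookie (for example document.cookie = "screenWidth=" + screen.width), and a servlet reads that cookie and redirects to an appropriately sized image. The cookie name, servlet class, and image naming scheme are all illustrative, not part of any standard API:

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.Cookie;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical servlet; map it to an image URL pattern in web.xml or with @WebServlet.
public class AdaptiveImageServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        int screenWidth = 1920; // fallback when the cookie is missing
        Cookie[] cookies = req.getCookies();
        if (cookies != null) {
            for (Cookie cookie : cookies) {
                if ("screenWidth".equals(cookie.getName())) {
                    try {
                        screenWidth = Integer.parseInt(cookie.getValue());
                    } catch (NumberFormatException ignored) {
                        // malformed cookie: keep the fallback
                    }
                }
            }
        }
        // Assumed pre-rendered variants: hero-480.jpg, hero-960.jpg, hero-1920.jpg
        int variant = screenWidth <= 480 ? 480 : (screenWidth <= 960 ? 960 : 1920);
        resp.sendRedirect("/images/hero-" + variant + ".jpg");
    }
}

Note that screen.width reports the full screen, not the viewport; if the images should adapt to the browser window instead, window.innerWidth is the value to send.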
