
Java DecimalFormat class and E-notation

I'm attempting to learn Java (I have novice-level programming experience in other languages) and am currently reading Absolute Java, 5th Edition. Everything has been smooth sailing so far except for a small section on the DecimalFormat class as it relates to E-notation. I understand the basics, but some of the logic I just can't seem to "get".

For example, the number 12345 formatted with ##0.##E0 ends up as 12.3E3 according to the book. Why did it determine that there are two digits before the decimal instead of, say, one or three? I know the # is an optional digit, but after playing with different formatting patterns on different numbers, it almost seems like the formatting is somewhat arbitrary (although I know it can't be). I've searched for a good explanation outside of the book and have come up short. If someone could "dumb it down" for me, I would be really appreciative.

Also, how often is this type of formatting used in real-world applications?

Thanks a bunch.

It is driven by the integer part of the pattern. When the maximum number of integer digits (the # and 0 symbols before the decimal point) is greater than the minimum number and greater than 1, DecimalFormat forces the exponent to be a multiple of that maximum. With ##0.##E0 the maximum is 3, so the exponent must be a multiple of 3; 12345 therefore becomes 12.3E3, with two digits before the decimal point. The mantissa keeps minimum-integer (1) plus maximum-fraction (2) = 3 significant digits. If the pattern were #0.##E0 instead, the maximum would be 2, the exponent would have to be even, and 12345 would format as 1.23E4.

import java.text.DecimalFormat;

DecimalFormat df = new DecimalFormat("#0.##E0");
System.out.println(df.format(12345)); // prints 1.23E4
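
For comparison, here is a minimal runnable sketch (the class name ENotationDemo is just for illustration) showing how the maximum integer digit count controls the exponent grouping; the expected outputs follow the examples in the DecimalFormat Javadoc:

import java.text.DecimalFormat;

public class ENotationDemo {
    public static void main(String[] args) {
        // "##0" allows up to 3 integer digits, so the exponent is forced to a
        // multiple of 3 (engineering notation). Significant digits = minimum
        // integer digits (1) + maximum fraction digits (2) = 3.
        System.out.println(new DecimalFormat("##0.##E0").format(12345));     // 12.3E3
        // Same grouping, but more fraction digits means more significant digits.
        System.out.println(new DecimalFormat("##0.#####E0").format(12345));  // 12.345E3
        System.out.println(new DecimalFormat("##0.#####E0").format(123456)); // 123.456E3
        // "0" allows exactly 1 integer digit, so no grouping is forced and the
        // mantissa always has one digit before the decimal point.
        System.out.println(new DecimalFormat("0.###E0").format(12345));      // 1.234E4
    }
}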

The best way to learn a new language is to write little test programs like this.

Also, read more in the DecimalFormat documentation on the Oracle site.
