
Declaring a string from characters produces an unexpected result. Why is this so?

I spent half a day looking for this bug. Why does the third case give an unexpected result?

        // case 1
        string value1 = "a" + "a" + "A";  
        byte[] asciiBytes1 = Encoding.ASCII.GetBytes(value1); // expected: 97 - 97 - 65
        Console.WriteLine(string.Join(" - ", asciiBytes1));   //   result: 97 - 97 - 65

        // case 2
        string value21 = 'a' + "A"; 
        byte[] asciiBytes21 = Encoding.ASCII.GetBytes(value21); // expected: 97 - 65
        Console.WriteLine(string.Join(" - ", asciiBytes21));    //   result: 97 - 65 

        // case 3
        string value22 = 'a' + 'a' + "A"; 
        byte[] asciiBytes22 = Encoding.ASCII.GetBytes(value22); // expected: 97 - 97 - 65
        Console.WriteLine(string.Join(" - ", asciiBytes22));    //   result: 49 - 57 - 52 - 65

It's the order of operations. In all of the other examples you add a char to a string. However, in the third example you add a char to a char; both chars are promoted to int and it does integer addition.

That integer is then concatenated with the string "A",

so 'a' + 'a' = 194 and 194 + "A" = "194A",

and that's the result you are seeing.
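One way to get the expected 97 - 97 - 65 is to make the first operand a string, so every + is a string concatenation rather than integer addition. A minimal sketch, reusing Encoding.ASCII from the question (the variable names here are illustrative):

        // force string concatenation by converting the first char to a string,
        // or by starting the expression with a string literal
        string fixedValue1 = 'a'.ToString() + 'a' + "A";  // "aaA"
        string fixedValue2 = "" + 'a' + 'a' + "A";         // also "aaA"
        byte[] fixedBytes = Encoding.ASCII.GetBytes(fixedValue1);
        Console.WriteLine(string.Join(" - ", fixedBytes));  // 97 - 97 - 65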

You are mixing chars and strings. This: 'a' + 'a' results in the integer addition of the ASCII values of the chars.
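For example, the intermediate result of 'a' + 'a' is an int, not a char or a string, which you can verify directly (an illustrative snippet, not from the original post):

        var sum = 'a' + 'a';               // char + char: both operands are promoted to int
        Console.WriteLine(sum.GetType());  // System.Int32
        Console.WriteLine(sum);            // 194
        Console.WriteLine(sum + "A");      // 194A -> bytes 49 - 57 - 52 - 65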

