Sum of Most Significant digits of two large numbers of char type

I have two numbers stored as char strings, say

char* N1;
char* N2;

N1 = "92345610172222";
N2 = "12351097654671";

I need to add the MSDs (most significant digits) of the two numbers, i.e.

9 + 1 = 10

I attempted this by typecasting and repeatedly dividing each number until only a single digit remains, using two loops, then summing the two digits and returning the result.

int sumMsd(char *N1, char *N2) {
    int one = (int)*N1;
    int two = (int)*N2;
    while (one >= 10)
        one /= 10;
    while (two >= 10)
        two /= 10;
    return one+two;
}

The logic fails when the numbers are too large to fit in an int. So, I need to fix my solution without using library functions. I am looking for a solution in C.
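
For illustration (this check is an added sketch, not part of the original post, and assumes an ASCII character set): the cast never sees the whole number, because (int)*N1 dereferences only the first character and yields that character's code.

    #include <stdio.h>

    int main(void) {
        char *N1 = "92345610172222";
        /* The cast sees only the first character: '9' has code 57 in ASCII */
        printf("%d\n", (int)*N1);   /* prints 57, not 9 and not the whole number */
        return 0;
    }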

This should work.

int sumMsd(char *N1, char *N2) {
    return (N1[0] - '0') + (N2[0] - '0');
}

Let the first digit of N1 be '3'. Then '3' - '0' is 3, the difference between their character codes. The C standard guarantees that the digit characters '0' through '9' are contiguous, so this conversion works regardless of the character set.
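
A short usage sketch (the main function below is an illustrative addition, not part of the original answer):

    #include <stdio.h>

    /* Adds the most significant (first) digits of two decimal numbers given as strings. */
    int sumMsd(char *N1, char *N2) {
        return (N1[0] - '0') + (N2[0] - '0');
    }

    int main(void) {
        char *N1 = "92345610172222";
        char *N2 = "12351097654671";
        printf("%d\n", sumMsd(N1, N2));   /* prints 10, i.e. 9 + 1 */
        return 0;
    }

Because only the first character of each string is read, this works no matter how long the digit strings are, and no library parsing is needed.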
