
Find the time complexity of a recursive function

Why is the time complexity of this function O(n1*n2), where n1 is the length of s1 and n2 is the length of s2?

I tried to build an equation but failed.

#include <stdio.h>

int f(char* s1, int i1, char* s2, int i2) {
    if (s2[i2] == '\0')
        return 1;
    if (s1[i1] == '\0')
        return 0;
    if (s1[i1] != s2[i2])
        return f(s1, i1 + 1, s2, 0);
    return f(s1, i1+1, s2, i2+1);
}

int main(void) {
    printf("%d", f("hello", 0, "he", 0));
    return 0;
}

The function increments i1 by 1 in both tail calls, hence the time complexity is O(n1).

Note that this function does not implement a variant of strstr(), as it fails to find a match for f("aab", 0, "ab", 0).

Let a be the first integer index where:

- s1[a] != s2[a]
- s1[a] != '\0'
- s2[a] != '\0'

If a exists, then your function is O(n1).

Else, it is O(min{n1, n2}).


In simpler terms, if one of your strings is a right-trimmed version of the other one, then the function will be O(n), with n the length of the shorter string.

Else, it will be O(n1).


To understand this, I would advise you to try some iterations of the function yourself, with pen and paper, at first ignoring the last if.
