
Why am I getting slightly off outputs for this DP problem?

The Question:
Given an array A of N distinct integers and an array B of integers (not necessarily distinct), find the minimum number of values that must be added to B so that A becomes a subsequence of B.

My Strategy:
Quite simple: find the length of the longest common subsequence of A and B, lcs. The elements of A that belong to that subsequence already appear in B in the right order, so every remaining element of A must be inserted, and the answer is sizeof(A) - lcs.

My Code:

int lcs(vector<int>A, vector<int>B, int n, int m)
{
    int L[m + 1][n + 1];
    int i, j;

    /* Build L[m+1][n+1] bottom-up. Note that L[i][j]
       contains the length of the LCS of B[0..i-1]
       and A[0..j-1]. */
    for (i = 0; i <= m; i++)
    {
        for (j = 0; j <= n; j++)
        {
            if (i == 0 || j == 0)
                L[i][j] = 0;
            else if (B[i - 1] == A[j - 1])
                L[i][j] = L[i - 1][j - 1] + 1;
            else
                L[i][j] = max(L[i - 1][j], L[i][j - 1]);
        }
    }

    /* L[m][n] contains the LCS length of A[0..n-1] and
       B[0..m-1]; despite the function's name, the value
       returned is the number of insertions needed. */
    return (n - L[m][n]);
}
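For reference, here is a minimal sketch of the same strategy with the two most likely culprits removed: the vectors are taken by const reference (copying them on every call is a plausible source of the TLE), and the variable-length array int L[m+1][n+1] (not standard C++, and a stack-overflow risk for large inputs) is replaced by a heap-allocated table. The name minInsertions is mine, not from the original post:

#include <bits/stdc++.h>
using namespace std;

// Returns the minimum number of insertions into B that make A a
// subsequence of B, i.e. A.size() minus the LCS length of A and B.
int minInsertions(const vector<int>& A, const vector<int>& B)
{
    int n = A.size(), m = B.size();
    // L[i][j] = LCS length of B[0..i-1] and A[0..j-1]
    vector<vector<int>> L(m + 1, vector<int>(n + 1, 0));

    for (int i = 1; i <= m; i++)
    {
        for (int j = 1; j <= n; j++)
        {
            if (B[i - 1] == A[j - 1])
                L[i][j] = L[i - 1][j - 1] + 1;
            else
                L[i][j] = max(L[i - 1][j], L[i][j - 1]);
        }
    }

    // Every element of A outside the LCS has to be inserted.
    return n - L[m][n];
}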

My Output:
I am getting wrong output (differing mostly by 1), and I was also getting TLE on some test cases.
Can someone locate where I am going wrong, in the logic or in the code?

If A == [1, 2, 3, 4, 5] and B == [1, 2, 4, 5], then the longest common subsequence is [1, 2, 4, 5], of length 4, so your formula gives 5 - 4 = 1, and indeed you only need to add the single number 3 to B to meet the requirement. If your program prints 3 on this input, it effectively computed an LCS of length 2, so something is off in the code (perhaps the indexing, or the order of the n and m arguments) rather than in the formula itself.
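To double-check that arithmetic, here is a self-contained run of the example above, assuming the standard LCS definition; the rolling one-row table is just a memory optimization of the same DP:

#include <bits/stdc++.h>
using namespace std;

int main()
{
    vector<int> A = {1, 2, 3, 4, 5}, B = {1, 2, 4, 5};
    int n = A.size(), m = B.size();
    // prev = finished DP row for B[0..i-2], cur = row being filled
    vector<int> prev(n + 1, 0), cur(n + 1, 0);
    for (int i = 1; i <= m; i++)
    {
        for (int j = 1; j <= n; j++)
        {
            if (B[i - 1] == A[j - 1])
                cur[j] = prev[j - 1] + 1;
            else
                cur[j] = max(prev[j], cur[j - 1]);
        }
        swap(prev, cur);
    }
    // Prints "LCS = 4, insertions = 1": only the 3 must be added to B.
    cout << "LCS = " << prev[n] << ", insertions = " << n - prev[n] << endl;
}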
