
Python: “IndexError: string index out of range” Beginner

I know, I know, this question has been asked plenty of times before. But I can't figure out how to fix it here, in this particular instance. When I subtract 2, which is what was recommended, I still get the same error inside the if statement. Thanks. The code should (at least it's meant to) take a string s, measure it against the alphabet order, and then output the longest substring of s that is in alphabetical order.

order = "abcdefghijklmnopqrstuvwxyz"
s = 'abcbcdabc'
match = ""

for i in range(len(s)):
    for j in range(len(order)):
        if ((i + j) - 2) < len(order) and order[i] == s[j]:
            match += s[i]

print("Longest substring in alphabetical order is: " + match)

That is because you are using the index j, which iterates over the order string, to access the s string. Since len(order) is 26 while len(s) is only 9, j can reach values greater than or equal to len(s), hence the IndexError at s[j].
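To see this concretely, here is a minimal sketch (reusing the s and order from the question) that prints the first value of j for which s[j] would go out of range:

order = "abcdefghijklmnopqrstuvwxyz"
s = 'abcbcdabc'

# j runs from 0 to len(order) - 1 = 25, but s only has len(s) = 9 characters,
# so s[j] raises IndexError as soon as j reaches 9
for j in range(len(order)):
    if j >= len(s):
        print("s[" + str(j) + "] is out of range; len(s) is " + str(len(s)))
        break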

I don't know what you are trying to achieve with the code, but in any case here's what you can change to make it work: swap the indices in the comparison so it reads order[j] == s[i] (both lookups then stay in range), and then append with either match += s[i] or match += order[j].
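For reference, a minimal sketch of the loop with that change applied. Note that this only removes the IndexError; it does not by itself guarantee that the printed string is really the longest alphabetical run:

order = "abcdefghijklmnopqrstuvwxyz"
s = 'abcbcdabc'
match = ""

for i in range(len(s)):
    for j in range(len(order)):
        # i always stays below len(s) and j below len(order),
        # so neither lookup can go out of range
        if ((i + j) - 2) < len(order) and order[j] == s[i]:
            match += s[i]  # match += order[j] appends the same character

print("Longest substring in alphabetical order is: " + match)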
