This code was written in VS Code, in Python. I have a minimum variable in my code and another variable; let's call them X and Xmin. I assign numbers to both Xmin and X. Then, when I compare them with <, my code tells me that the smaller one is larger. Here is my code:
Xmin = 100
print("X")
X = input()
if X < Xmin:
    print("X is too small.")
The problem is that when I enter 500 for X, it tells me that X is greater than Xmin, but when I enter something really big, like 1000000, it tells me that X is too small.
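A likely explanation for this symptom (hedged, since the snippet above mixes an int `Xmin` with a string `X`): when both operands end up as strings, Python compares them lexicographically, character by character, rather than numerically. A minimal sketch:

```python
# Strings compare lexicographically (character by character), not
# numerically: "1000000" sorts before "500" because '1' < '5'.
print("500" < "100")      # False -> 500 looks "greater" than 100
print("1000000" < "500")  # True  -> 1000000 looks "too small"
```

That matches the behavior described: the string "500" compares as greater than "100", while the longer string "1000000" compares as smaller.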
If you are using Python 3, you need to wrap the input() call in int() so that Python knows the user input should be a number, not a string:
try:
    Xmin = 100
    print("X")
    X = int(input())
    if X < Xmin:
        print("X is too small.")
except ValueError:
    print('That is not an integer.')
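The int() call is what makes the comparison numeric; on non-numeric input it raises ValueError, which the except clause catches. A quick non-interactive demonstration of both paths:

```python
# int() converts a numeric string to a real integer, so comparisons
# with other integers behave numerically.
print(int("500") > 100)   # True: a genuine numeric comparison

# Anything that is not a valid integer literal raises ValueError,
# which is exactly what the except clause above handles.
try:
    int("five hundred")
except ValueError:
    print("That is not an integer.")
```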
If you are using Python 2, watch out: input() in Python 2 is the equivalent of eval(input()) in Python 3, and we all know that 'eval is evil'.
X = input()  # takes input as a string
Use the line below instead of the one above:
X = int(input())  # takes input as an integer
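To see why Python 2's input() was dangerous: it behaved like eval(raw_input()), evaluating whatever the user typed as a Python expression. A Python 3 sketch of that equivalence (the typed string here is a stand-in for real user input):

```python
# Python 2's input() was roughly eval(raw_input()): the typed text
# was *evaluated* as a Python expression, not treated as data.
typed = "100 * 2"    # imagine a user typed this at the prompt
print(eval(typed))   # evaluates to 200 -- arbitrary code execution risk
```

A malicious user could just as easily type an expression with side effects, which is why explicit conversion with int() is the safe approach.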