
Python - point class not returning correct points

So I have a Point class where I create a point and then perform operations on it. One of the operations is scale, which takes a factor f and scales the point by it, raising an error if f is not a float. Here's what it looks like:

def scale(self, f):
    if not isinstance(f, float):
        raise Error("Parameter \"f\" illegal.")
    self.x0 = f * self.x
    self.y0 = f * self.y

And if I test it with this code:

print '*** scale'
# f illegal
try:
    p0 = Point(0.0, 0.0)
    p0.scale(1)
except Error as e:
    print 'caught:', e.message

# normal case
p0 = Point(2.0, 3.0)
p0.scale(2.3)
print p0

Then the output I get is:

*** scale
caught: Parameter "f" illegal.
2 3

But the output that I want is:

*** scale
caught: Parameter "f" illegal.
5 7

So the error message looks fine, but the values it prints out are not. Why doesn't it print the correct values? Here are my __init__ and __str__ methods:

def __init__(self, x, y):
    if not isinstance(x, float):
        raise Error("Parameter \"x\" illegal.")
    self.x = x
    if not isinstance(y, float):
        raise Error("Parameter \"y\" illegal.")
    self.y = y

def __str__(self):
    return '%d %d' % (int(round(self.x)), int(round(self.y)))

You are assigning to new attributes:

self.x0 = f * self.x
self.y0 = f * self.y

x0 and y0 are different attributes from x and y, so x and y remain unchanged. Your __str__ method reads self.x and self.y, which scale never touches, which is why the original values 2 3 are printed.
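
A minimal sketch of the fix, assuming the rest of the class (including the Error exception) stays as posted: assign the scaled values back to the existing x and y attributes instead of creating new ones.

def scale(self, f):
    if not isinstance(f, float):
        raise Error("Parameter \"f\" illegal.")
    # Overwrite the existing attributes rather than creating x0/y0
    self.x = f * self.x
    self.y = f * self.y

With that change, the normal case prints 5 7: 2.0 * 2.3 = 4.6 and 3.0 * 2.3 = 6.9, which __str__ rounds to 5 and 7.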
