
How to organize a set of classes properly?

I want to create a set of objects through inheritance, and I need some advice on how to organize my implementation.

All objects must inherit from a BaseObject, like so:

class BaseObject:
    class Actuator:
        pass
    class Sensor:
        pass

I want to ensure that all objects inheriting from BaseObject are forced to have an Actuator class and a Sensor class.

Then I have another abstraction level, with some generic objects, for example:

class Camera(BaseObject):
    class Actuator:
        pass
    class Sensor:
        def get_frame(self):
            pass
        def set_parameter(self):
            pass

And I want every Camera object I create to have at least get_frame and set_parameter.

I want to force implementation of these classes/functions to keep the same syntax for every new object, so that if I want to work with another camera I can keep my previous scripts; I just need to create a new Camera object. You should also know that these classes won't be instantiated directly, but are a pattern to build on.

I looked at metaclasses, which look like a nice way to force implementation of methods in a class, but I'm not sure if:

  • you can create a metaclass (Camera) from a metaclass BaseObject
  • you can force implementation of a class in a metaclass (like Actuator in BaseObject), and how (via @abc.abstractmethod?)
  • you can force implementation of a method inside a class inside a metaclass (like get_frame())
  • Is it really the only way, or should I organize my classes totally differently?
  • Should I just go back to making paper planes? (Please don't answer this one)

I would implement them like so:

from abc import ABCMeta, abstractmethod

class BaseObject(metaclass=ABCMeta):
    def __init__(self):
        self.actuator = self._get_actuator()
        self.sensor = self._get_sensor()

    @abstractmethod
    def _get_actuator(self):
        return 'actuator'

    @abstractmethod
    def _get_sensor(self):
        return 'sensor'

Every subclass will have an actuator and a sensor attribute. Make sure you call super().__init__() if you override __init__ in child classes.
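For instance, a child class that overrides __init__ could look like the sketch below (the Robot class and its return values are placeholders of my own, not part of your design; the base class is repeated so the snippet runs on its own):

```python
from abc import ABCMeta, abstractmethod

class BaseObject(metaclass=ABCMeta):
    def __init__(self):
        self.actuator = self._get_actuator()
        self.sensor = self._get_sensor()

    @abstractmethod
    def _get_actuator(self):
        pass

    @abstractmethod
    def _get_sensor(self):
        pass

# Hypothetical subclass: note the super().__init__() call,
# which is what populates self.actuator and self.sensor.
class Robot(BaseObject):
    def __init__(self, name):
        super().__init__()
        self.name = name

    def _get_actuator(self):
        return 'robot actuator'

    def _get_sensor(self):
        return 'robot sensor'

r = Robot('r2')
print(r.actuator, r.sensor)
```

If a subclass forgets to implement _get_actuator or _get_sensor, instantiating it raises a TypeError immediately, which is exactly the "forced implementation" you asked about.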

class Sensor(metaclass=ABCMeta):
    pass

class CameraSensor(Sensor):
    @abstractmethod
    def get_frame(self):
        pass

    @abstractmethod
    def set_parameter(self):
        pass

class Camera(BaseObject):
    def _get_sensor(self):
        sensor = ...  # get the concrete sensor class here
        assert issubclass(sensor, CameraSensor)
        return sensor
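As a quick check that the ABC machinery really enforces the interface (BrokenSensor and GoodSensor are illustrative names, not part of the answer above):

```python
from abc import ABCMeta, abstractmethod

class Sensor(metaclass=ABCMeta):
    pass

class CameraSensor(Sensor):
    @abstractmethod
    def get_frame(self):
        pass

    @abstractmethod
    def set_parameter(self):
        pass

# A subclass that forgets get_frame cannot be instantiated:
class BrokenSensor(CameraSensor):
    def set_parameter(self):
        pass

try:
    BrokenSensor()
except TypeError:
    # raises TypeError naming the missing abstract method
    pass

# A subclass implementing the full interface works fine:
class GoodSensor(CameraSensor):
    def get_frame(self):
        return 'frame'

    def set_parameter(self):
        pass

s = GoodSensor()
```

Note that the check only happens at instantiation time: defining BrokenSensor is legal, creating an instance of it is not.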

I want to force implementation of these classes/function to keep the same syntax for every new object, so that if I want to work with another camera, I can keep my previous scripts, I just need to create a new Camera object.

OT: do you really mean "object" (=> instance), or subclass? But anyway:

Since you mentioned that "Sensor" and "Actuator" are not necessarily supposed to have the same interface for different BaseObject subclasses, let's ignore the whole BaseObject part for the moment and concentrate on the Camera part.

If I understand correctly, what you want is a generic Camera type. This type must have sensor and actuator attributes, which must both respect a given interface, but with possibly different implementations.

At this point we (well, I at least) don't have enough context to decide if we need an abstract BaseCamera type or if we just need abstract BaseCameraSensor and BaseCameraActuator interfaces. What is sure is that we do need BaseCameraSensor and BaseCameraActuator, so let's start with this. I assume the sensor and actuator need to be aware of the camera they belong to, which FWIW really screams "strategy" pattern, so we start with a base class - let's call it "Strategy" - that doesn't do much except keep a reference to its host object:

class Strategy:
    def __init__(self, parent):
        self._parent = parent

Now let's define our "CameraSensor" and "CameraActuator" interfaces:

from abc import ABCMeta, abstractmethod

class BaseCameraSensor(Strategy, metaclass=ABCMeta):

    @abstractmethod
    def get_frame(self):
        raise NotImplementedError

    @abstractmethod
    def set_parameter(self, value):
        raise NotImplementedError

class BaseCameraActuator(Strategy, metaclass=ABCMeta):

    @abstractmethod
    def random_method(self):
        raise NotImplementedError

Now we can take care of the "Camera" part. If the only variant parts of the implementation are encapsulated in the Sensor and Actuator (which is the point of the strategy pattern), then we don't need an abstract base class - we can just pass the appropriate Sensor and Actuator subclasses as params to the initialiser:

class Camera(object):
    def __init__(self, brand, model, sensor_class, actuator_class):
        self.brand = brand
        self.model = model

        assert issubclass(sensor_class, BaseCameraSensor)
        self.sensor = sensor_class(self)

        assert issubclass(actuator_class, BaseCameraActuator)
        self.actuator = actuator_class(self)

And the problem is solved. FWIW, note that you don't really need the Strategy class nor the ABC part: just documenting the expected interface for sensor_class and actuator_class and letting the program crash if it's not respected is enough. That's how we've been programming in Python for years (15+ years as far as I'm concerned) and it JustWorks(tm). But ABC at least makes the contract clearer, and the program will crash sooner. Don't be fooled, though: it won't make your code foolproof (hint: nothing will, period).
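Putting the pieces above together, here is what plugging in one concrete camera might look like (AcmeSensor/AcmeActuator and the brand/model values are made up for illustration):

```python
from abc import ABCMeta, abstractmethod

class Strategy:
    def __init__(self, parent):
        self._parent = parent

class BaseCameraSensor(Strategy, metaclass=ABCMeta):
    @abstractmethod
    def get_frame(self):
        raise NotImplementedError

    @abstractmethod
    def set_parameter(self, value):
        raise NotImplementedError

class BaseCameraActuator(Strategy, metaclass=ABCMeta):
    @abstractmethod
    def random_method(self):
        raise NotImplementedError

class Camera:
    def __init__(self, brand, model, sensor_class, actuator_class):
        self.brand = brand
        self.model = model
        assert issubclass(sensor_class, BaseCameraSensor)
        self.sensor = sensor_class(self)
        assert issubclass(actuator_class, BaseCameraActuator)
        self.actuator = actuator_class(self)

# Hypothetical concrete strategies for one specific camera:
class AcmeSensor(BaseCameraSensor):
    def get_frame(self):
        # the strategy can reach back to its host camera
        return 'frame from %s %s' % (self._parent.brand, self._parent.model)

    def set_parameter(self, value):
        pass

class AcmeActuator(BaseCameraActuator):
    def random_method(self):
        return 'moving'

cam = Camera('Acme', 'X100', AcmeSensor, AcmeActuator)
print(cam.sensor.get_frame())
```

Swapping in another camera is then just a matter of writing another sensor/actuator pair and passing those classes to Camera; the calling scripts don't change.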

Now if there are some other variant parts whose implementation cannot be known ahead of time, the simplest solution would be to follow the same pattern: delegate to a strategy object that is passed at instantiation time. So the point is: now that we have this implemented, we find out we don't need a BaseCamera ABC after all.
