I have a number of functions that each make API calls to a service, do some processing, and return a result. Several of these services are similar, so I need to create a class for each one with the same set of methods, but only one of those classes will be used at runtime, chosen by the value of an environment variable.
What is the best way to do this in Python?
I can use ABC to create an abstract base class, ensuring all these classes implement the same methods:
from abc import ABC, abstractmethod

class AbstractThing(ABC):
    @abstractmethod
    def action1(self):
        pass

    @abstractmethod
    def action2(self):
        pass
class Thing1(AbstractThing):
    def action1(self):
        # do something
        return something

    def action2(self):
        # do something
        return something
class Thing2(AbstractThing):
    def action1(self):
        # do something
        return something

    def action2(self):
        # do something
        return something
However, what I'm struggling with is how to get hold of the right class in the rest of the code.
Should I create another class with the same methods that looks up the correct child class and delegates all calls to it?
import os
import sys

class Client(AbstractThing):
    def __init__(self):
        thing_name = os.environ["THING"]
        thing_cls = getattr(sys.modules[__name__], thing_name)
        self.thing = thing_cls()

    def action1(self):
        return self.thing.action1()

    def action2(self):
        return self.thing.action2()
Or is there a better way?
I think what you want is a factory function that associates env var values with the appropriate implementations.
def make_client() -> AbstractThing:
    return {
        "THING1": Thing1,
        "THING2": Thing2,
    }[os.environ["THING"]]()
and then elsewhere:
client = make_client()
client.action1()
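One caveat: if THING is unset or misspelled, the bare dict lookup raises a not-very-helpful KeyError. A sketch of the same factory with a clearer error message (the stand-in class bodies here are placeholders for the real Thing1/Thing2 implementations):

```python
import os
from abc import ABC, abstractmethod

# Minimal stand-ins for the classes above so this sketch runs on its own.
class AbstractThing(ABC):
    @abstractmethod
    def action1(self): ...
    @abstractmethod
    def action2(self): ...

class Thing1(AbstractThing):
    def action1(self): return "thing1.action1"
    def action2(self): return "thing1.action2"

class Thing2(AbstractThing):
    def action1(self): return "thing2.action1"
    def action2(self): return "thing2.action2"

_IMPLEMENTATIONS = {"THING1": Thing1, "THING2": Thing2}

def make_client() -> AbstractThing:
    # environ.get returns None when the variable is unset, so the missing
    # and the unrecognized cases both fall into the same KeyError branch.
    thing_name = os.environ.get("THING")
    try:
        return _IMPLEMENTATIONS[thing_name]()
    except KeyError:
        raise ValueError(
            f"Unknown THING={thing_name!r}; expected one of {sorted(_IMPLEMENTATIONS)}"
        ) from None
```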
If Thing is stateful and you want to make sure only one instance ever exists, you might want to combine this with a singleton pattern.
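One lightweight way to get that singleton behaviour is to cache the factory's result with functools.lru_cache, so the environment lookup and instantiation happen once and every later call returns the same object. A sketch, with placeholder classes standing in for the real implementations:

```python
import os
from functools import lru_cache

# Stand-ins for the Thing1/Thing2 classes above, so the sketch runs alone.
class Thing1:
    def action1(self): return "one"
    def action2(self): return "two"

class Thing2:
    def action1(self): return "one"
    def action2(self): return "two"

@lru_cache(maxsize=None)
def make_client():
    # Cached on first call: later calls skip the lookup and return the
    # same instance, which is what the singleton requirement asks for.
    return {"THING1": Thing1, "THING2": Thing2}[os.environ["THING"]]()
```

Note that the cache keys on the (empty) argument list, not on the environment, so changing THING after the first call has no effect until the cache is cleared with make_client.cache_clear().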