The background
It all started when two separate teams were working on very similar problems within the same domain. The structure behind the challenges we were facing was basically the same, but we decided to tackle it in two different ways (we didn’t know we were looking at similar problems yet).
My team took a strongly functional approach, whereby we implemented specialised functions for each of the possible cases and passed them around as operators to our routines.
For example:
def specialised_operator(data: pd.DataFrame = None) -> pd.DataFrame:
    # perform very specific task
    ...

def data_prep_operator(data: pd.DataFrame = None) -> pd.DataFrame:
    ...

def execution_routine(*operators, **kwargs):
    for operator in operators:
        operator(**kwargs)
This is obviously an extremely simplified version to give you an idea…
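To make the simplified version at least runnable, here is a tiny self-contained sketch of the functional style, with plain lists standing in for the DataFrames; the operator names are kept but their bodies are made up, and the routine is shown threading the data through each operator in turn:

```python
def data_prep_operator(data):
    # made-up prep step: drop missing values
    return [x for x in data if x is not None]

def specialised_operator(data):
    # made-up specialised step: keep positive values only
    return [x for x in data if x > 0]

def execution_routine(*operators, data):
    # thread the data through each operator in turn
    for operator in operators:
        data = operator(data)
    return data

result = execution_routine(data_prep_operator, specialised_operator,
                           data=[1, -2, None, 3])
```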
The other team decided to define a class hierarchy using a Factory pattern to allow for specialised classes that catered to different use cases. Something like:
class BaseOperator(abc.ABC):
    # a bunch of abstract methods and properties
    # a bunch of concrete methods
    ...

class OperatorCase1(BaseOperator):
    # overriding what needs to be overridden
    # implementing what needs to be implemented
    ...

class OperatorCase2(BaseOperator):
    # overriding what needs to be overridden
    # implementing what needs to be implemented
    ...
Again, a very simplified version of the actual implementation.
This was the situation when we realised that we were doing essentially the same thing and that it would be a damn good idea to merge our approaches and not duplicate our efforts.
Given the complexity of the code and the expected growth of our application, there was no immediate reason to prefer one method over the other… The main strength of the class-based approach (and I tend not to favour classes when I don’t need them) was that the team had also implemented a full-fledged pipeline orchestration suite based on the interface defined by the BaseOperator class…
Now, that was a compelling reason to use the class approach… We still had one problem though: converting the whole functional pipeline into the class hierarchy…
A terrible idea
I consider myself to be a decent developer, and like many other decent developers I am constantly striving to write as little code as possible… Therefore, my very first thought was: “Can I dynamically encapsulate a function as part of an existing class and return an instance of said class, for which the function gets promoted to a method?”
Or in simpler terms: can I turn the function into a method of a class and dynamically add BaseOperator as a parent?
This would be a reasonably simple task, which we could solve like this:
class OperatorAdapter(BaseOperator):
    def __init__(self, func):
        self._func = func

    def execute(self, *args, **kwargs):
        """
        Overriding the execution of BaseOperator with our function
        """
        return self._func(*args, **kwargs)
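To see the adapter at work, here is a self-contained sketch with a minimal stand-in for BaseOperator (the real one has far more to it) and a toy function to wrap:

```python
import abc

class BaseOperator(abc.ABC):
    # minimal stand-in for the real base class
    @abc.abstractmethod
    def execute(self, *args, **kwargs):
        ...

class OperatorAdapter(BaseOperator):
    def __init__(self, func):
        self._func = func

    def execute(self, *args, **kwargs):
        # delegate to the wrapped function
        return self._func(*args, **kwargs)

def clean(values):
    # toy function to be adapted
    return [v for v in values if v is not None]

op = OperatorAdapter(clean)
```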
The problem was that the pipeline didn’t just inherit from a single BaseOperator… The hierarchy looked a bit more like this:
[diagram: the operator class hierarchy, with several Base*Operator classes deriving from BaseOperator]
And every Base*Operator would have multiple classes inheriting from it. The pipeline depended on having the right operator at the right stage… Doing this manually for every function would have meant rewriting most of our application, which was almost the same effort as adapting everything into classes inheriting from the appropriate base. I wanted to be able to do this dynamically and quickly!
I read many blogs looking for the answer to this, and the most relevant answer I found was, and I quote, “Why the hell would you want to do this??”
It’s a strong argument…
A crazy implementation
At least until I remembered how Python’s type system works… You can do exactly what I was trying to achieve using the `type` function: it turns out you CAN add a parent class to a Python object at runtime using just the language primitives!
# a 'type' call with three arguments is essentially a dynamic 'class' statement
# this
class X(Parent):
    a = 1

# is equivalent to this
X = type('X', (Parent,), dict(a=1))
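For the avoidance of doubt, a quick runnable demonstration of the equivalence, where Parent is just a throwaway class:

```python
class Parent:
    def greet(self):
        return "hello"

# the familiar class statement...
class X(Parent):
    a = 1

# ...and its dynamic equivalent: type(name, bases, namespace)
Y = type("Y", (Parent,), dict(a=1))
```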
We then used this to build a generic operator that let us inject our code where necessary…
The final adapter looked something like this, similar to the above definition, but without explicitly inheriting from anything:
class Operator:
    def __init__(self, func):
        self._func = func

    def execute(self, *args, **kwargs):
        return self._func(*args, **kwargs)
The instantiation of each operator then looked like this:
# instantiate an operator with the embedded function
op = Operator(function)

# create a dictionary of the properties we need in the new class
properties = {
    name: getattr(op, name) for name in dir(op) if name in properties_we_need
}

# dynamically add inheritance:
new_op = type("op", (TheRelevantParentClass,), properties)
# and now new_op inherits from whatever BaseOperator we required...
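Putting the pieces together, here is a self-contained version of that instantiation step; BaseOperator, double, and properties_we_need are toy stand-ins for the real pipeline pieces:

```python
class BaseOperator:
    # hypothetical stand-in for the pipeline's real base class
    def execute(self, *args, **kwargs):
        raise NotImplementedError

class Operator:
    def __init__(self, func):
        self._func = func

    def execute(self, *args, **kwargs):
        return self._func(*args, **kwargs)

def double(x):
    # toy function standing in for a real pipeline step
    return 2 * x

op = Operator(double)

# copy only the attributes the new class needs (a made-up allow-list)
properties_we_need = {"_func", "execute"}
properties = {
    name: getattr(op, name) for name in dir(op) if name in properties_we_need
}

# type() builds a new class that inherits from BaseOperator
NewOp = type("NewOp", (BaseOperator,), properties)
new_op = NewOp()
```

One subtlety worth flagging: getattr on an instance returns bound methods, so the execute copied into the new class stays bound to the original op, and every instance of the new class shares its state. This is one of the ways this trick can bite you later.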
The next step to make everything more obfuscated was to embed this process in a decorator so that the code changes on our part amounted to sprinkling decorators on top of our function definitions and nothing more…
## old code
def do_something():
    ...

## new code
@inherit(BaseDataPrepOperator)
def do_something():
    ...
This allowed us to adapt an entire codebase with just a handful of new lines and a few decorators here and there.
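The inherit decorator itself was elided above; one possible sketch of it (with the Operator wrapper folded into the decorator, and a toy BaseDataPrepOperator standing in for the real base class) might look like this:

```python
def inherit(base_class):
    # hypothetical decorator factory: rebuild a plain function as a
    # subclass of base_class whose execute() runs the function
    def decorator(func):
        namespace = {
            "_func": staticmethod(func),
            "execute": lambda self, *args, **kwargs: func(*args, **kwargs),
        }
        return type(func.__name__, (base_class,), namespace)
    return decorator

class BaseDataPrepOperator:
    # toy stand-in for the real base class
    stage = "data_prep"

@inherit(BaseDataPrepOperator)
def do_something(values):
    return sorted(values)
```

After decoration, do_something is no longer a function but a class in the operator hierarchy, which is exactly what the orchestration suite expected to see.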
All in all a fun experience which taught us a lot of fun things, including that dynamic inheritance can get complex FAST!