To Nha Notes | Feb. 17, 2021, 2:25 p.m.
Suppose we want to keep the same tables alive on multiple database servers, with their data kept up to date and consistent, without changing much code in many places. Below is how I implemented it with Django.
Basically, the idea is that for each ORM write operation (create/update/delete), we customize it to execute against multiple write databases one by one in a single transaction. The transaction ensures the data stays consistent across all write DB servers: either every server takes the write, or none does.
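Before looking at the Django classes, here is a framework-free sketch of the all-or-nothing idea: apply one write to several stores, and if any write fails, roll every store back. The dict-based "databases" and the `fail_at` parameter are stand-ins I made up to simulate real connections and a failing server:

```python
from typing import Any, Dict, List


def replicate(dbs: List[Dict[str, Any]], key: str, value: Any,
              fail_at: int = -1) -> None:
    """Write key=value to every store; fail_at simulates a crash on that store."""
    # "Begin transaction": remember the old state of every store.
    snapshots = [dict(db) for db in dbs]
    try:
        for i, db in enumerate(dbs):
            if i == fail_at:
                raise RuntimeError("simulated write failure")
            db[key] = value
    except Exception:
        # Roll every store back to its snapshot, then re-raise.
        for db, snap in zip(dbs, snapshots):
            db.clear()
            db.update(snap)
        raise
```

If store 1 fails after store 0 already accepted the write, the rollback undoes store 0 as well, so the two never diverge.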
To do that, we customize methods of the model, manager, and queryset classes that our code already uses.
The interface of the customized classes looks like this:
from typing import Any, Tuple

from cacheops.query import QuerySetMixin
from django.db import models, transaction


class MultiDBManager(models.Manager):
    def get_queryset(self) -> "MultiDBQuerySet":
        qs = MultiDBQuerySet(self.model, self._db)
        return qs


class MultiDBQuerySet(models.query.QuerySet, QuerySetMixin):
    def update(self, **kwargs: Any) -> int:
        # to be customized here
        ...

    def delete(self) -> Tuple[bool, int]:
        # to be customized here
        ...


class MultiDBModel(models.Model):
    objects = MultiDBManager()
    ...

    def save(self, *args: Any, **kwargs: Any) -> None:
        # to be customized here
        ...

    def delete(self) -> Tuple[bool, int]:
        # to be customized here
        ...
To apply multiple DB servers to a table model, we simply replace the base model class with the customized one; no other changes are required.
Single write DB:
class App(models.Model):
...
Multiple write DBs:
class App(MultiDBModel):
...
In each place marked to be customized here, we add the logic to save/update/delete into multiple write DBs in a single transaction. Please check this code file on GitHub, or ask me (Nha) for more information.
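For completeness, the write databases themselves are declared in settings.DATABASES as usual; the layout below is a hypothetical two-server example (the "mirror" alias and host names are mine, not from the project):

```python
# settings.py (sketch): two write databases holding the same tables.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "app_db",
        "HOST": "db1.example.com",
    },
    "mirror": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "app_db",
        "HOST": "db2.example.com",
    },
}
```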