I'm implementing a software based on the micro-service architecture using Spring/Java. Each micro-service connects to either a PostgreSQL or a MongoDB database. Is it standard practice to have a shared module that contains database-agnostic components (e.g. entity classes, models, and common repository/service interfaces used by both databases) and then have two other shared modules (one for Postgres and the other for MongoDB) importing the first shared module and implement database-specific implementation of those common components?
- Can you elaborate on why you think there are any standard practices for this? I very much doubt that any such standard exists here. – Ben Cottrell, Jul 1, 2024 at 16:45
- If your entity classes, models, and common repository/service interfaces are shared over multiple services, you probably should not separate them. Sounds like a design error in progress. – mtj, Jul 2, 2024 at 6:40
1 Answer
What you describe can be done from a technical perspective.
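To make the layout concrete, here is a minimal sketch of the three-module split you describe. All names (`User`, `UserRepository`, and the two implementations) are hypothetical, and the implementations use in-memory maps as stand-ins; in a real project the Postgres module would typically use Spring Data JPA and the Mongo module Spring Data MongoDB.

```java
import java.util.HashMap;
import java.util.Map;

// Module 1: "common" — database-agnostic model and repository contract.
// Both database-specific modules would declare a dependency on this module.
class User {
    private final String id;
    private final String name;
    User(String id, String name) { this.id = id; this.name = name; }
    String getId() { return id; }
    String getName() { return name; }
}

interface UserRepository {
    void save(User user);
    User findById(String id);
}

// Module 2: "persistence-postgres" — imports "common"; in practice this
// would be backed by Spring Data JPA, not an in-memory map.
class PostgresUserRepository implements UserRepository {
    private final Map<String, User> table = new HashMap<>(); // placeholder for a SQL table
    public void save(User user) { table.put(user.getId(), user); }
    public User findById(String id) { return table.get(id); }
}

// Module 3: "persistence-mongo" — imports "common"; in practice this
// would be backed by Spring Data MongoDB.
class MongoUserRepository implements UserRepository {
    private final Map<String, User> collection = new HashMap<>(); // placeholder for a Mongo collection
    public void save(User user) { collection.put(user.getId(), user); }
    public User findById(String id) { return collection.get(id); }
}
```

Each microservice would then depend on `common` plus exactly one of the database-specific modules, and its service layer would be written only against `UserRepository`.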
However, I would urge you to consider why you are decoupling your microservices at all when, as you mention, there is so much shared logic between them. What would be wrong with a monolith?
Although microservices seem to be the buzz lately, using them improperly can actually hurt performance. Amazon Prime Video is a well-known example of a team whose overuse of microservices caused significant performance and cost penalties.
Microservices should only be used when you can separate out a standalone service that independently takes input and produces output. Even then, there is an argument, from a cost perspective, for favouring a monolith over 100 different microservices.
As always, benchmark and profile before jumping to conclusions. Sometimes the most obvious answer is the correct answer.