This idea that Postgres should be used for everything really needs to die in a professional context.
I joined a company of 10 devs that did just that. All backend code was PostgreSQL functions, the event queue ran on Postgres, security was done with RLS, the frontend used PostGraphile to expose those functions over GraphQL, and triggers were used to validate data on insert/update.
It was a mess. Postgres is a wonderful database: use it as a database, but don't do anything else with it.
Before people come and say "things were not done the right way, people didn't know what they were doing": the devs were all Postgres fans who contributed to the surrounding projects, and there was a strong review culture, so people were genuinely trying their best.
The queue system was locking all the time between concurrent requests => so a queue system built on Postgres only works for a pet project.
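For context on where the contention comes from: the pattern usually recommended for Postgres-backed queues relies on `FOR UPDATE SKIP LOCKED`, so that concurrent workers skip rows another worker has already claimed instead of blocking on them. A minimal sketch, with a hypothetical `jobs` table (whether this was applicable to our workload is another question):

```sql
-- Hypothetical jobs table, for illustration only.
CREATE TABLE jobs (
  id         bigserial PRIMARY KEY,
  payload    jsonb NOT NULL,
  status     text  NOT NULL DEFAULT 'pending',
  created_at timestamptz NOT NULL DEFAULT now()
);

-- Each worker claims one pending job; SKIP LOCKED means it
-- ignores rows currently locked by concurrent workers rather
-- than waiting on them.
WITH next_job AS (
  SELECT id
  FROM jobs
  WHERE status = 'pending'
  ORDER BY created_at
  FOR UPDATE SKIP LOCKED
  LIMIT 1
)
UPDATE jobs
SET status = 'running'
FROM next_job
WHERE jobs.id = next_job.id
RETURNING jobs.id, jobs.payload;
```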
All requests were 3 or 4 times slower, because RLS has to be checked on each row. We have now migrated almost all of our API, and each time the SQL duration drops by that factor (for the exact same SQL request). The DB was also locking all the time because of this; it feels like RLS breaks Postgres's deadlock detection algorithm.
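To make the per-row cost concrete: an RLS policy is essentially a predicate that Postgres adds to every statement touching the table and evaluates for each candidate row, on top of the query's own work. A hypothetical example (table and setting names invented for illustration):

```sql
-- Hypothetical multi-tenant table with RLS enabled.
ALTER TABLE orders ENABLE ROW LEVEL SECURITY;

-- After this, every SELECT/UPDATE/DELETE on orders also
-- evaluates this check for each row it considers.
CREATE POLICY orders_tenant_isolation ON orders
  USING (tenant_id = current_setting('app.tenant_id')::bigint);
```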
SQL is a super verbose language: you spend your time repeating the same lines of code, so a basic function ends up around 100 lines when it would be 4-5 lines in Node.js.
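As a flavor of the verbosity, here is a hypothetical PL/pgSQL function (names invented) doing something that in Node.js would be roughly one line of optional chaining with a fallback:

```sql
-- Hypothetical: return a user's display name with fallbacks.
CREATE OR REPLACE FUNCTION get_display_name(p_user_id bigint)
RETURNS text
LANGUAGE plpgsql
AS $$
DECLARE
  v_name text;
BEGIN
  SELECT COALESCE(nickname, full_name, 'anonymous')
    INTO v_name
    FROM users
   WHERE id = p_user_id;

  -- Handle the missing-row case explicitly.
  IF NOT FOUND THEN
    RETURN 'anonymous';
  END IF;

  RETURN v_name;
END;
$$;
```

The application-code equivalent would be something like `user?.nickname ?? user?.fullName ?? 'anonymous'`.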
It is impossible to log from inside these functions to check that things work, and when something fails you have no way to know which path the code took.
You can't make external API calls, so you have to go through a queue system to do even basic things.
There are no real libraries, so everything needs to be reimplemented.
Coding inside the DB is absolutely not performant: you can't build a map, so you end up writing O(n²) code all the time.
APIs were still needed for the external world, so in some cases there was actually another service in front of the database, and a lot of logic was reimplemented inside it.
There was downtime at each deployment, because we had to drop all the RLS policies and recreate them (even though all the code used insert-if-not-update clauses). It worked at the beginning, but at some point it stopped working and there was no way to find out why, so: drop all RLS and recreate it.
It is impossible to hire devs who want to work on that stack and be business oriented; you only attract purely technical people who care about doing their own technical stuff.
We are almost out of it now, after 1 year of migration work, and I don't see anything positive about this Postgres-does-everything culture compared to a regular Node.js + Postgres-as-a-database + SQS stack.
So to conclude: for a pet project it can be great to use Postgres like that, but in a professional context you are going to kill the company with this technical choice.