My company employs people who add and edit records in a PostgreSQL database through a web interface. Those records then need to be compiled into a SQLite database that ships inside our mobile apps (Android, iOS), which we release as a new version every few months. We haven't gotten around to 'hot patching' the SQLite database, i.e. having the installed app download updates from the server; for now, the data is fetched during the build process and baked into the app.
I'm wondering what the typical process is here - how to get the data from server to client. My initial thought is to write a script (rough sketch below) to:
- download the data (as JSON) that needs to be compiled into the app
- use the SQLite library to construct the database and import the data
- compress/encrypt the database
- put the database in the correct asset folder for iOS and Android
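
Concretely, I imagine the script looking roughly like this. The export URL, the `records` schema, and the asset paths are all placeholders, and I've left encryption out for brevity:

```python
#!/usr/bin/env python3
"""Build-time script: fetch the JSON export, build a SQLite database,
compress it, and copy it into the platform asset folders."""

import gzip
import json
import shutil
import sqlite3
import urllib.request
from pathlib import Path

EXPORT_URL = "https://example.com/api/export.json"  # placeholder endpoint
DB_PATH = Path("build/data.db")
ASSET_DIRS = [
    Path("android/app/src/main/assets"),  # Android assets folder
    Path("ios/MyApp/Resources"),          # iOS bundle resources
]

def fetch_export(url):
    # Pull the JSON export from the server; expects a list of row objects.
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def build_db(rows, db_path):
    # Rebuild the SQLite file from scratch on every run.
    db_path.parent.mkdir(parents=True, exist_ok=True)
    db_path.unlink(missing_ok=True)
    con = sqlite3.connect(db_path)
    try:
        con.execute(
            "CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT, body TEXT)"
        )
        con.executemany(
            "INSERT INTO records (id, name, body) VALUES (:id, :name, :body)",
            rows,
        )
        con.commit()
    finally:
        con.close()

def compress_and_copy(db_path):
    # Gzip the database and drop a copy into each platform's asset folder;
    # the app would decompress it on first launch.
    gz_path = db_path.parent / (db_path.name + ".gz")
    with open(db_path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
    for asset_dir in ASSET_DIRS:
        asset_dir.mkdir(parents=True, exist_ok=True)
        shutil.copy2(gz_path, asset_dir / gz_path.name)

if __name__ == "__main__":
    rows = fetch_export(EXPORT_URL)
    build_db(rows, DB_PATH)
    compress_and_copy(DB_PATH)
```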
It seems like a reasonable approach, but I'm wondering if there's a better or more standard way of doing this. Are there caveats to this approach?
And I know that I could expose the data to clients via a REST API; that's essentially what I'm already doing for the 'download' step. However, that's not what the boss wants, so this is the way it has to be. What I'm asking is whether my approach (downloading a JSON export, importing that data, and so on) is a decent one, or whether something else (e.g. dumping through psql and doing some magic with that output) would be better for what I need to accomplish: getting PostgreSQL data from the web into a local SQLite database.
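
If the psql route turned out to be better, I picture it as dumping each table to CSV with `\copy` and importing the CSV into SQLite, something like the following (connection string, table name, and columns are again made up):

```python
import csv
import sqlite3
import subprocess

TABLE = "records"
CSV_PATH = "records.csv"

# Dump one table to CSV via psql's \copy; assumes psql is on the PATH
# and credentials come from the environment (PGPASSWORD, ~/.pgpass, ...).
subprocess.run(
    [
        "psql",
        "postgresql://user@dbhost/mydb",  # placeholder connection string
        "-c",
        f"\\copy {TABLE} TO '{CSV_PATH}' CSV HEADER",
    ],
    check=True,
)

# Import the CSV into SQLite. Every value arrives as a string; SQLite's
# type affinity converts numeric-looking values in typed columns.
con = sqlite3.connect("data.db")
con.execute(
    f"CREATE TABLE IF NOT EXISTS {TABLE} "
    "(id INTEGER PRIMARY KEY, name TEXT, body TEXT)"
)
with open(CSV_PATH, newline="") as f:
    con.executemany(
        f"INSERT INTO {TABLE} (id, name, body) VALUES (:id, :name, :body)",
        csv.DictReader(f),  # header row supplies the column names
    )
con.commit()
con.close()
```

(The import half could also be done with the sqlite3 command-line shell via `.mode csv` and `.import records.csv records`, which skips the Python step entirely.)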