Hi -

I have a project I am going to be working on, which will be a Python GUI. After meeting with the user yesterday, I found out that the computer this application will run on will not always have internet access. Which brings me to my question: how do I go about storing data that the application is going to use? The user will also sometimes be modifying this data. SQLite3?

Thanks



You have hard times ahead.
The main problem is not deciding whether to use SQLite3 or some other desktop database.

The main problem is how to synchronise the user's local database with the main database.

Is there only one user? Or are there many, who can change the same data offline, causing possible conflicts?

If offline data is modified extensively and must later be posted to the main database, then you practically have to reimplement transaction handling yourself.

Here is what I would do for a fully offline method (for example, a basic invoicing application); a rough Python sketch follows the list:
* The GUI creates transactions.
* The GUI operates on the local database, except for transaction posting.
* The user functions correspond directly to transactions.
* There are online functions and offline functions.
* Online functions are: modifying base data, like customer, company address, item price etc.
* Offline functions are transactions, like posting a new invoice.
* A function can be offline if it is an insert into a table. No updates, no deletes (except locally).
* The client stores the last known version of the main database.
* If the client goes online, then it locks the main database.
* Then it refreshes the local database.
* The user is online now, and can change the main database.
* Then the client tries to post the transactions to the local database. This may result in a conflict: a record is to be inserted but a record with that alternate id already exists, or a type of invoice is to be inserted that is only valid for a special kind of customer. You should program the handling of the possible conflicts, and support all possible user choices, including changing base data.
* If everything is OK, then the local transactions are posted to the main database, in one big run.
* The user can now modify every piece of data, but changes are stored locally and posted in a separate function.
* If the user goes (falls) offline again, then any unposted transactions are lost, and the database is released.
* You should implement functions for a deadlocked main database, sanity checks, deadlocked and pending transactions, undo, redo etc.
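
Roughly, the offline part could look like the sketch below, using sqlite3 for the local queue. This is only a minimal sketch under assumptions: the pending_transactions table, the invoices table on the main database, and the function names are all made up for illustration, and the "main database" is just another SQLite file here.

```python
import json
import sqlite3

def open_local(path="local.db"):
    """Open (or create) the local database with a queue of pending transactions."""
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS pending_transactions (
                       id      INTEGER PRIMARY KEY,
                       kind    TEXT NOT NULL,            -- e.g. 'new_invoice'
                       payload TEXT NOT NULL,            -- JSON of the record
                       posted  INTEGER NOT NULL DEFAULT 0)""")
    return con

def queue_transaction(local, kind, record):
    """Offline function: an insert only, stored locally until we go online."""
    with local:  # the connection as context manager commits or rolls back
        local.execute(
            "INSERT INTO pending_transactions (kind, payload) VALUES (?, ?)",
            (kind, json.dumps(record)))

def post_pending(local, main):
    """Online function: replay every unposted transaction against the main
    database in one big run; conflict handling shown for one case only."""
    rows = local.execute(
        "SELECT id, kind, payload FROM pending_transactions WHERE posted = 0"
    ).fetchall()
    with main:  # one transaction on the main database
        for row_id, kind, payload in rows:
            record = json.loads(payload)
            if kind == "new_invoice":
                # Conflict check: is this invoice number already taken?
                if main.execute("SELECT 1 FROM invoices WHERE number = ?",
                                (record["number"],)).fetchone():
                    raise ValueError("conflict on invoice %s" % record["number"])
                main.execute(
                    "INSERT INTO invoices (number, customer, total) VALUES (?, ?, ?)",
                    (record["number"], record["customer"], record["total"]))
    with local:  # only mark as posted once the main run has committed
        local.executemany(
            "UPDATE pending_transactions SET posted = 1 WHERE id = ?",
            [(row_id,) for row_id, _, _ in rows])

# Usage while offline:
#   local = open_local()
#   queue_transaction(local, "new_invoice",
#                     {"number": "2024-001", "customer": "Acme", "total": 120.0})
# Later, when online:
#   main = sqlite3.connect("main.db")   # stand-in for the central database
#   post_pending(local, main)
```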

If the data is specific just to the computer it is being used on (i.e. just updating a file on that computer), then an internet connection isn't necessary, right? The database will be no more than 4 tables, maybe 4 columns each, max.

I wouldn't mind trying what you laid out above just for personal use, seems like a nice challenge.

I do not understand the question, I think. If you do not write to any central database that is also written to by others, then the above transaction handling is not necessary at all.

Anyway:
SQLite is a good choice. It comes installed with Python (the sqlite3 module gives you its basic functionality, which is enough for most cases). The database is a single portable file, has transactions, and does not need maintenance.
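
To show what that basic functionality looks like, here is a minimal sketch with the standard-library sqlite3 module (the file name and the items table are made up for the example):

```python
import sqlite3

# The file is created on first use; ":memory:" also works for quick tests.
con = sqlite3.connect("app_data.db")
con.execute("""CREATE TABLE IF NOT EXISTS items (
                   id    INTEGER PRIMARY KEY,
                   name  TEXT NOT NULL,
                   price REAL NOT NULL)""")

# Using the connection as a context manager wraps the block in a transaction.
with con:
    con.execute("INSERT INTO items (name, price) VALUES (?, ?)", ("widget", 9.95))

for row in con.execute("SELECT name, price FROM items ORDER BY name"):
    print(row)

con.close()
```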

4 tables with four columns each is already problematic enough that you should not store them in plain files. Making a join between two files can be tedious. The relatively hard part is designing the data access layer.
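
For example, once the tables are in SQLite, relating two of them is a single query instead of matching records between files by hand (the customers/orders tables below are invented purely for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE orders    (id INTEGER PRIMARY KEY,
                            customer_id INTEGER NOT NULL REFERENCES customers(id),
                            total REAL NOT NULL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders    VALUES (1, 1, 19.50), (2, 1, 7.25), (3, 2, 42.00);
""")

# One query instead of hand-matching records across two flat files.
rows = con.execute("""SELECT c.name, SUM(o.total)
                        FROM customers c JOIN orders o ON o.customer_id = c.id
                       GROUP BY c.name
                       ORDER BY c.name""").fetchall()
print(rows)  # [('Acme', 26.75), ('Globex', 42.0)]
```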
