I'm looking to write a program that should be capable of holding around 1000 records, each containing a varying number of fields and values.

Usually I'd create a list of dictionaries for something like this, but I wanted to know if that's the right way to go about it. This data will be routinely sorted and searched.
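To make that concrete, here's roughly what I mean (the field names are just placeholders):

```python
from operator import itemgetter

# A list of dictionaries; fields can vary from record to record.
records = [
    {"name": "Alice", "dept": "Sales", "age": 34},
    {"name": "Bob", "dept": "IT"},               # no "age" field here
    {"name": "Carol", "dept": "IT", "age": 29},
]

# Routine sort, e.g. by department (missing keys need a default).
records.sort(key=lambda r: r.get("dept", ""))

# Routine search, e.g. everyone in IT.
it_people = [r for r in records if r.get("dept") == "IT"]
print(it_people)
```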

For something of this volume, is it better (from a performance perspective) to use an SQL database, or can I build an equally fast solution in Python? What does industry use? Can I get better performance by implementing a different type of data structure in Python?

Any input would be much appreciated!


All 2 Replies

A nice alternative is ZODB, the "Zope Object Database". In such a database, you can store persistent Python objects and work with them in much the same way as you handle ordinary Python objects. You don't need Zope to install or use ZODB: it is a standalone database module. ZODB is not a relational database, so you don't need SQL either. See the docs here: http://www.zodb.org/
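A minimal sketch of what that looks like, assuming the ZODB package (and its BTrees dependency) is installed; the Record class, file name, and fields are made up for illustration:

```python
import persistent
import transaction
import ZODB, ZODB.FileStorage
from BTrees.OOBTree import OOBTree

# A persistent record; attributes can differ from instance to instance.
class Record(persistent.Persistent):
    def __init__(self, **fields):
        for name, value in fields.items():
            setattr(self, name, value)

# Open (or create) a file-backed object database.
storage = ZODB.FileStorage.FileStorage("records.fs")
db = ZODB.DB(storage)
conn = db.open()
root = conn.root()

# Keep records in a BTree keyed by name; BTrees store keys in sorted order.
if "records" not in root:
    root["records"] = OOBTree()
root["records"]["alice"] = Record(name="Alice", dept="Sales", age=34)
root["records"]["bob"] = Record(name="Bob", dept="IT")
transaction.commit()

print(root["records"]["bob"].dept)
db.close()
```

Because the BTree keeps its keys sorted, lookups and ordered iteration come for free, which covers the "routinely sorted and searched" requirement without re-sorting a list each time.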

From your description "... a varying number of fields and values ...", I would think you want a non-relational data store. Depending on details you haven't given, you might simply keep your described structure and use pickle/unpickle to persist it to disk (see the sketch after the links below), or you might go with something like ZODB or another third-party product. 1000 records is very small for a database. A quick search for 'non relational data store' gives quite a lot of good material in the first few pages:
A blog item
An article in Linux Magazine
A large overview
Berkeley DB may also be something to look at.
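A minimal sketch of the pickle route, assuming the list-of-dictionaries structure from the question (the file name is arbitrary):

```python
import pickle

# The structure from the question: a list of dicts with varying fields.
records = [
    {"name": "Alice", "dept": "Sales", "age": 34},
    {"name": "Bob", "dept": "IT"},
]

# Persist the whole structure to disk.
with open("records.pkl", "wb") as f:
    pickle.dump(records, f)

# Load it back later; sorting and searching work on the in-memory list as usual.
with open("records.pkl", "rb") as f:
    loaded = pickle.load(f)

loaded.sort(key=lambda r: r.get("name", ""))
print(loaded)
```

At 1000 records the whole structure fits comfortably in memory, so loading it once and pickling it back on save is perfectly reasonable.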
