Python does not have a type like struct in C or record in Pascal, but one can easily be implemented with a class. This little code snippet shows you how to do it.
I have been using this code for several years, as it is short, speedy and very useful (written by Massimo Di Pierro, author of web2py).
It also has a side benefit: if you use another very useful piece of code, ObjectListView (http://objectlistview.sourceforge.net/cs/index.html), you can feed its rows directly with Storage objects.
# Developed by Massimo Di Pierro <email@example.com>
#
# Code shamelessly stolen from Massimo because it is very useful ;-)
#
# The following line tells the interpreter to only import the content of
# __all__ when using 'from <module> import *'. Mainly used when we don't
# want other programs to automatically import some of this file's
# classes/methods. If we want to explicitly use, let's say, XYZ from this
# file, it can be done either by an explicit call, 'storage.XYZ', or an
# explicit import, 'from storage import XYZ'.

__all__ = ['Storage']

class Storage(dict):
    """
    A Storage object is like a dictionary except `obj.foo` can be used
    in addition to `obj['foo']`.

    >>> o = Storage(a=1)
    >>> o.a
    1
    >>> o['a']
    1
    >>> o.a = 2
    >>> o['a']
    2
    >>> del o.a
    >>> o.a
    None
    """
    def __getattr__(self, key):
        try:
            return self[key]
        except KeyError, k:
            return None

    def __setattr__(self, key, value):
        self[key] = value

    def __delattr__(self, key):
        try:
            del self[key]
        except KeyError, k:
            raise AttributeError, k

    def __repr__(self):
        return '<Storage ' + dict.__repr__(self) + '>'

    def __getstate__(self):
        return dict(self)

    def __setstate__(self, value):
        for k, v in value.items():
            self[k] = v
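A note for anyone on a newer interpreter: the `except KeyError, k` and `raise AttributeError, k` forms above are Python 2 only. A sketch of the same class in Python 3 syntax (the pickling helpers are left out here for brevity):

```python
class Storage(dict):
    """Same Storage idea, written for Python 3: attribute access
    falls back to dictionary access."""
    def __getattr__(self, key):
        try:
            return self[key]
        except KeyError:
            return None

    def __setattr__(self, key, value):
        self[key] = value

    def __delattr__(self, key):
        try:
            del self[key]
        except KeyError as k:
            raise AttributeError(k)

    def __repr__(self):
        return '<Storage ' + dict.__repr__(self) + '>'

o = Storage(a=1)
o.b = 2          # same as o['b'] = 2
del o.a          # same as del o['a']
print(o.b, o.a)  # 2 None
```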
I'm fairly new to Python, so don't hit me on the head, please :)
I'm not so sure it is such a good bargain to use namedtuple because you're limited to... a tuple.
For example, I use Storage a lot to store database information (schemas, tables, columns, functions, rights, etc.), and it is quite simple:
root = Storage(db="myDb", schemas=[], tables=[], columns=[], funs=[])
and, of course, nest other Storage objects into the defined lists.
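To make the nesting concrete, here is a minimal Python 3 sketch; the table and column names are made up, and Storage is trimmed down to the two methods the example needs:

```python
class Storage(dict):
    """Trimmed-down Storage: attribute access backed by the dict."""
    def __getattr__(self, key):
        try:
            return self[key]
        except KeyError:
            return None

    def __setattr__(self, key, value):
        self[key] = value

# hypothetical database layout: Storage objects nested inside the lists
root = Storage(db="myDb", schemas=[], tables=[], columns=[], funs=[])
root.tables.append(Storage(name="customer", columns=[]))
root.tables[0].columns.append(Storage(name="street", type="text"))

print(root.tables[0].columns[0].name)  # street
```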
It would be less easy with namedtuple, as it is limited to only 2 members; although I'm sure it benefits from an underlying C implementation, which should be very fast, my opinion is that it would tend to run into the Law of Demeter faster by extending the code too much.
Sorry but I can not understand what you are talking about:
>>> from collections import namedtuple
>>> Storage = namedtuple('Storage', 'db, schemas, tables, columns, funs')
>>> root = Storage(db='myDb', schemas=[], tables=[], columns=[], funs=[])
>>> print root
Storage(db='myDb', schemas=[], tables=[], columns=[], funs=[])
>>> print root.db
myDb
>>> root.columns.append('Street')
>>> print root
Storage(db='myDb', schemas=[], tables=[], columns=['Street'], funs=[])
>>>
I agree namedtuple isn't the universal solution, but it's part of a standard module and works pretty well in most cases. Also, there is no limit to the number of members you can have.
>>> Person = namedtuple('Person', 'name age hobbies')
>>> p1 = Person('Tom', 20, ['sleeping', 'programming'])
>>> p2 = Person(name='Harry', age=40, hobbies=['hiking', 'cycling'])
>>> p1
Person(name='Tom', age=20, hobbies=['sleeping', 'programming'])
>>> p2
Person(name='Harry', age=40, hobbies=['hiking', 'cycling'])
>>> p2.hobbies.append('gaming')
>>> p2
Person(name='Harry', age=40, hobbies=['hiking', 'cycling', 'gaming'])
>>> p2.name = 'Dick'
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: can't set attribute
As you can see, the only problem folks might have with namedtuple is that it's immutable, i.e. fields can only be set once. But this isn't a very severe restriction, because you can still modify the objects pointed to by the fields (i.e. mutate the list).
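One thing worth adding: when a modified copy is enough, namedtuple's `_replace` method sidesteps the set-once limitation by building a new tuple with some fields swapped (Python 3 syntax here):

```python
from collections import namedtuple

Person = namedtuple('Person', 'name age hobbies')
p1 = Person('Tom', 20, ['sleeping', 'programming'])

# _replace returns a *new* namedtuple; p1 itself is untouched
p2 = p1._replace(name='Dick')
print(p2.name, p1.name)  # Dick Tom
```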
EDIT: Damn, Tony beat me to it
Also, there is no limit to the number of members you can have.
Ok, this is the part I missed.
However, and especially for the DB case, the immutability is a big handicap. For example, when I use Storage as SU to store the groups' and users' rights on all the elements I'm actually modifying, I can't afford to reload from a DB with more than 500 tables & 1500 functions when I access it remotely; since I also have parallel processes sending me back a lot of information at the same time, the BP consumption and waiting cost would be unacceptable.
But I'll keep it in mind for simpler cases.
Thanks to both of you.
This discussion reminds me of a class that I wrote once to create an object whose __dict__ is an arbitrary dictionary. It allows funny access to global variables like this:
>>> class Odict(object):
...     """Odict(dictionary) -> an object which __dict__ is dictionary"""
...     def __init__(self, dictionary):
...         self.__dict__ = dictionary
...
>>> glo = Odict(globals())
>>>
>>> glo.hello = "hello"
>>> hello
'hello'
>>> glo.x = 3.14
>>> print x
3.14
>>>
>>> glo.glo.glo.glo
<__main__.Odict object at 0x7fac2c407b10>
It can also be used as an alternative to global statements in functions.
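For instance, here is a minimal Python 3 sketch with a made-up counter: instead of declaring `global counter`, a function can rebind the module-level name through an Odict wrapped around globals():

```python
class Odict(object):
    """Odict(dictionary) -> an object whose __dict__ is dictionary"""
    def __init__(self, dictionary):
        self.__dict__ = dictionary

counter = 0

def bump():
    # instead of 'global counter', rebind the module-level name
    # through the module's own dictionary
    glo = Odict(globals())
    glo.counter += 1

bump()
bump()
print(counter)  # 2
```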