Hi guys,

I have been using Python for about a year now. For my work I regularly have to deal with large data, with matrix dimensions of at least 3000 and up to 30000 or even more.
For several months I have been working with matrices, but even at a size of 3000x3000 it takes a long time just to build one. I want to know whether this is normal or whether I am doing something wrong. The time factor really bothers me, because I have to wait a long time to get even a single result. I will be continuing to work with even larger datasets, so I would really appreciate serious advice from anyone who has experience with, or even ideas about, dealing with large matrices.

Thanks in advance.

All 5 Replies

I suppose you are using numpy for the matrix computations, aren't you? What do the matrices contain, real numbers? I think it would help if you posted a short working example of a computation that takes too long and that you would like to speed up.

Are you using numpy/scipy/matlab for your data?

Thanks for your reply. I am not using numpy; I am using dictionaries for the moment. Actually we need to index the rows and columns with our own keys rather than the default integer positions, and I do not know if that is possible in numpy. That is the reason I am using dictionaries.

If you are using dictionaries, there is little doubt that your code could be sped up. For example, you could still use dictionaries to map your own keys to integer indexes, and use numpy matrices to actually store and manipulate the data, as in the sketch below. Again, a typical example of what you are doing would help.
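
Something along these lines, as a rough sketch (the key names here are made up; adapt them to your own data):

import numpy as np

# Hypothetical row/column labels -- replace with your own keys.
row_keys = ["sampleA", "sampleB", "sampleC"]
col_keys = ["geneX", "geneY", "geneZ"]

# Dictionaries map your keys to integer indexes...
row_index = {key: i for i, key in enumerate(row_keys)}
col_index = {key: j for j, key in enumerate(col_keys)}

# ...while numpy stores the actual numbers in one contiguous block.
data = np.zeros((len(row_keys), len(col_keys)))

# Element access goes through the index dictionaries.
data[row_index["sampleA"], col_index["geneY"]] = 0.75
print(data[row_index["sampleA"], col_index["geneY"]])

# Whole-matrix operations then run in compiled numpy code,
# e.g. the row sum for every sample at once.
print(data.sum(axis=1))

That way the per-element Python overhead is only the key lookups, and the heavy numerical work stays inside numpy.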

Thanks Gribouillis. I will try to post some of the real code that I am using. In the meantime I will give numpy a try too. Thanks.
