Hey fellows,

I've created a fairly large math suite for my spectrometer research, which is basically a set of programs that perform several math operations, generate plots, etc.

It's beginning to get pretty bulky, and it seems like it would benefit from a class in which I could store all the data in one place, so that in the future I can update the class rather than one function after another.

Here's my question/problem:

I import a large data set in which each column is a time trial. Let's say I have 100 time trials. I can run my analysis on this file one column at a time, generating 100 plots, 100 output files, 100 result pages, etc., or I can, say, average every 5 runs, resulting in 20 of each of the aforementioned results. In many instances I have to look at the data both ways over and over, so it's very inefficient to reformat the data in preprocessing every time.

The way it looks computationally is that I want to make a class that takes in several numpy arrays. Each array corresponds to the result of one of my math functions. Let's say I have three functions, func1, func2, and func3, and each outputs a different array. Depending on my needs, I may need to make 100 of each of these arrays, or only 50 each, or 20 each. In each run, I will always make the same number of arrays for all the functions (i.e. func1 won't have 20 arrays while func2 has 30).

Is there a way to define a class that can accept a variable number of arrays? For example, in trial 1 each function outputs 100 arrays, so the class would accept and store 100 arrays, but in a second run it would only store 30. Also, is it possible to have a class that takes in data from 3 separate functions, or should I have 3 different classes, one for each function's output?
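One way this could look is a single class that holds the output of all three functions, sized at construction time. Here's a minimal sketch under the assumption that every trial produces arrays of the same length (the class and attribute names are made up for illustration):

```python
import numpy as np

class TrialResults:
    """Holds the output arrays of several analysis functions for one run.

    The number of trials is set at construction time, so the same class
    works whether a run produces 100 arrays per function or only 30.
    """

    def __init__(self, n_trials, array_length):
        self.n_trials = n_trials
        # one 2-D array per function: each row stores one trial's result
        self.func1_results = np.zeros((n_trials, array_length))
        self.func2_results = np.zeros((n_trials, array_length))
        self.func3_results = np.zeros((n_trials, array_length))

    def store(self, trial_index, r1, r2, r3):
        """Store the three function outputs for one trial."""
        self.func1_results[trial_index] = r1
        self.func2_results[trial_index] = r2
        self.func3_results[trial_index] = r3
```

With this design one instance covers all three functions, so you don't need three separate classes; you'd create `TrialResults(100, n)` for one run and `TrialResults(30, n)` for another.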

Thanks

Are you thinking along the line of a record structure?
You can easily set those up with a Python class.
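For reference, a record in this sense is just a class whose instances bundle a few named fields together. A minimal sketch (the field names here are hypothetical):

```python
class SpectrumRecord:
    """A simple record: one object bundling related fields for one trial."""

    def __init__(self, trial_id, wavelengths, intensities):
        self.trial_id = trial_id
        self.wavelengths = wavelengths
        self.intensities = intensities
```

Each trial then becomes one `SpectrumRecord` instance, and a run is just a list of them.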

I don't really know what this is. I ended up with a solution that works, although I'm not sure it's the best one. What I did was write a class that takes in the number of iterations that will be run in a given execution and assigns a properly sized numpy array to hold each result from each function. That way, all the data is stored in large arrays at the end of the runs; these can then be exported to plotting programs to make animations or still-frame plots, written out to files, etc. My main goal was to have all the information stored in one place, so that I could manipulate it as necessary from this one large class instance.
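That approach might look something like the sketch below, which preallocates one result array per function, keyed by name so it works for any number of functions (the names and sizes are placeholders, not your actual code):

```python
import numpy as np

class AnalysisStore:
    """Preallocates one result array per math function, sized by the
    number of iterations in this execution."""

    def __init__(self, func_names, n_iterations, result_length):
        # dict mapping each function's name to its (n_iterations x length) array
        self.results = {
            name: np.empty((n_iterations, result_length))
            for name in func_names
        }

    def record(self, name, iteration, values):
        """Store one function's output for one iteration."""
        self.results[name][iteration] = values
```

After the run loop finishes, `store.results["func1"]` holds every iteration's output as one 2-D array, ready to hand to plotting or file-output code.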

Lists and tuples can hold any number of results as well. So you could inherit their basic behavior and take in, say, a range of values, the number of data points, and a tuple of functions.
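A rough sketch of that idea, assuming the functions all accept a numpy array of x values (the class name is invented for illustration):

```python
import numpy as np

class ResultList(list):
    """A list subclass that applies a tuple of functions over an array of
    x values and stores each function's output array as one element."""

    def __init__(self, funcs, x_values):
        # evaluate every function once and store the resulting arrays
        super().__init__(f(x_values) for f in funcs)
        self.funcs = funcs
        self.x_values = x_values
```

Because it is a list, it grows naturally with the number of functions, and all the usual indexing and iteration still work.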
