I've written a fairly large math suite for my spectrometer research: essentially a set of programs that performs several math operations, generates plots, and so on.
It's getting pretty bulky, and it seems like it would benefit from a class, where I could store all the data in one place and then update the class in the future rather than one function after another.
Here's my question/problem:
I import a large data set in which each column is a time trial. Say I have 100 time trials. I can run my analysis on this file one column at a time, generating 100 plots, 100 output files, 100 result pages, and so on, or I can average every 5 runs, producing 20 of each of those results. In many cases I have to look at the data each way over and over, so it's very inefficient to fix the grouping during preprocessing.
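For reference, the "average every 5 runs" step is easy to do on the fly with numpy, so the grouping doesn't have to be baked in at preprocessing time. A minimal sketch (the array `data` and the group size `k` are just placeholders for my real inputs):

```python
import numpy as np

# Hypothetical data set: 50 samples per trial, 100 time trials (one per column).
data = np.random.rand(50, 100)

# Average every k consecutive columns; 100 trials with k=5 become 20 averaged trials.
k = 5
averaged = data.reshape(data.shape[0], -1, k).mean(axis=2)

print(averaged.shape)  # (50, 20)
```

Because numpy arrays are row-major, reshaping each row of 100 values into 20 groups of 5 keeps consecutive trials together, so `mean(axis=2)` averages each group of 5 neighboring columns.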
The way it looks computationally is that I want to make a class that takes in several numpy arrays. Each array corresponds to the result of one of my math functions. Say I have three functions, func1, func2, and func3, and each outputs a different array. Depending on my needs, I may need to make 100 of each of these arrays, or only 50, or 20. In any given run, all the functions will always produce the same number of arrays (i.e. func1 won't have 20 arrays while func2 has 30).
Is there a way to define a class that can accept a variable number of arrays? For example, in trial 1, each function outputs 100 arrays, so the class would accept and store 100 arrays, but in a second run it would only store 30 arrays. Also, is it possible to have one class that takes in data from all 3 functions, or should I have 3 different classes, one for each function's output?
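To make the question concrete, here is roughly what I'm imagining: one class holding the outputs of all three functions, with lists that grow to whatever number of trials a given run produces. The class and method names (`TrialResults`, `add_trial`) are just placeholders; `func1`/`func2`/`func3` stand in for my real math functions:

```python
import numpy as np

class TrialResults:
    """Store the per-trial outputs of three analysis functions.

    Python lists have no fixed length, so the same class works whether
    a run produces 100 arrays per function or only 30.
    """

    def __init__(self):
        # One list of arrays per function.
        self.func1_results = []
        self.func2_results = []
        self.func3_results = []

    def add_trial(self, out1, out2, out3):
        """Store one trial's output from each of the three functions."""
        self.func1_results.append(np.asarray(out1))
        self.func2_results.append(np.asarray(out2))
        self.func3_results.append(np.asarray(out3))

    def __len__(self):
        # Number of trials stored so far (same for all three functions).
        return len(self.func1_results)
```

Usage would then look something like `results.add_trial(func1(col), func2(col), func3(col))` inside a loop over columns, however many columns that happens to be.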