
Hi

I am a little unsure of how the processing time should be computed for a cluster.

Say, for example, I have a cluster of 2 machines running a visualization application, and I need to record the processing time of certain parallel algorithms.
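
A minimal sketch of how the per-processor times could be recorded, assuming the application uses something MPI-like (run_algorithm_a() is just a placeholder for the real algorithm, not the actual code):

/* Timing sketch, assuming MPI. run_algorithm_a() is a placeholder. */
#include <mpi.h>
#include <stdio.h>

/* Stand-in for the real parallel algorithm being measured. */
static void run_algorithm_a(void)
{
    volatile double x = 0.0;
    for (long i = 0; i < 10000000L; ++i)
        x += (double)i;
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Barrier(MPI_COMM_WORLD);        /* start all processors together */
    double start = MPI_Wtime();

    run_algorithm_a();                  /* the work being timed */

    double local_time = MPI_Wtime() - start;
    printf("Processor %d - %.2f sec\n", rank, local_time);

    MPI_Finalize();
    return 0;
}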

Okay, the question:

a) When I run it on 1 machine in the cluster, the time taken to execute Algorithm A is:

Processor 1a - 0.50 sec

b) When I run it on 2 machines in the cluster, the time taken to execute Algorithm A is:

Processor 1a - 0.23 sec
Processor 1b - 0.22 sec


How do I analyze the results in (b)? Should I take the average, i.e.

- (Processor 1a + Processor 1b) / 2 = add the two times and take the average

OR

- (Processor 1a + Processor 1b) = just add the two times together?

I also need to draw a graph to show the time. I hope someone can point me in the right direction; a sketch of the kind of summary I mean is below.
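
For reference, a small sketch (again assuming MPI) that gathers every processor's time on rank 0 and prints the maximum, the sum, and the average side by side, so the different ways of summarising the run can be compared. The hard-coded local_time is just a placeholder mimicking the 0.22 / 0.23 sec numbers above:

/* Sketch of summarising per-processor times (assumes MPI).
 * local_time is a stand-in for a measured value like 0.22-0.23 sec. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Placeholder time: 0.22 sec on rank 0, 0.23 sec on rank 1, ... */
    double local_time = 0.22 + 0.01 * rank;

    double max_time, sum_time;
    MPI_Reduce(&local_time, &max_time, 1, MPI_DOUBLE, MPI_MAX, 0, MPI_COMM_WORLD);
    MPI_Reduce(&local_time, &sum_time, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0) {
        printf("max (time until the slowest processor finished) = %.2f sec\n", max_time);
        printf("sum of all processor times                      = %.2f sec\n", sum_time);
        printf("average per processor                           = %.2f sec\n", sum_time / size);
    }

    MPI_Finalize();
    return 0;
}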