Develop a C++ program that accepts a list of at most 100 voltages as input, computes both the average and the standard deviation of the input voltages, and then displays the results.
Step 1: Requirements: the average and the standard deviation.
Step 2: Develop a solution: the input is at most 100 voltages (each in the range 0-100), entered by the user.
Determine the output:
Calculate the average by adding the voltages and dividing by the number of voltages that were added.
Determine standard deviation by:
1. subtracting the average from each individual voltage: this results in a set of new numbers, each of which is called a deviation
2. square each deviation found in the previous step
3. add the squared deviations and divide the sum by the number of deviations
4. the square root of the number found in the previous step is the standard deviation
(In other words, the standard deviation is calculated by first determining the sum of the squared deviations. The standard deviation is then obtained by dividing that sum by the number of deviations and taking the square root of the result.)
**Can someone help me write a C++ program for this?**