Java - calculating the average of random integers in an array
I need to generate a specified number of random integers between two values supplied by the user (for example, 12 numbers between 10 and 20), and then calculate the average of those numbers. The problem is that if I ask it to generate 10 numbers, it only generates 9 (as shown in the output). Also, if I enter a max of 100 and a min of 90, the program still generates numbers like 147, well over the max. Did I mess up the random number generator? Can anyone help?
Here is the code I have so far:
import java.text.DecimalFormat;
import hsa.Console;   // assumed: the HSA Console class, matching the readInt()/println() calls

public class ArrayRandom
{
    static Console c;   // output console

    public static void main (String[] args)
    {
        c = new Console ();
        DecimalFormat y = new DecimalFormat ("###.##");

        c.println ("How many integers should be generated?");
        int n = c.readInt ();
        c.println ("What is the maximum value for these numbers?");
        int max = c.readInt ();
        c.println ("What is the minimum value for these numbers?");
        int min = c.readInt ();

        int numbers[] = new int [n];
        int x;
        double sum = 0;
        double average = 0;

        // n = number of random integers generated
        for (x = 1 ; x <= n - 1 ; x++)
        {
            numbers [x] = (int) (max * Math.random () + min);
        }

        for (x = 1 ; x <= n - 1 ; x++)
        {
            sum += numbers [x];
            average = sum / (n - 1);
        }

        c.println ("The sum of the numbers is: " + sum);
        c.println ("The average of the numbers is: " + y.format (average));
        c.println ("Here are the numbers:");
        for (x = 1 ; x <= n - 1 ; x++)
        {
            c.println (numbers [x]);   // print the numbers in the array
        }
    } // main method
} // ArrayRandom class
Java arrays are 0-based. By starting at index 1 you leave the first array element at its default value of 0 and only fill n - 1 of the n slots, which is why you get 9 numbers instead of 10. Replace
for (x = 1 ; x <= n-1 ; x++)
with
for (x = 0 ; x < n ; x++)
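Applied to the question's code, all three loops then run over indices 0 to n - 1. Here is a minimal, self-contained sketch of the corrected loops, using plain System.out instead of the hsa Console and hypothetical sample inputs; the random-value expression still has the range problem addressed in the edit below:

// LoopFixSketch: 0-based loops filling, summing, and printing all n values
public class LoopFixSketch
{
    public static void main (String[] args)
    {
        int n = 10;               // hypothetical: number of values to generate
        int min = 10, max = 20;   // hypothetical: user-supplied range
        int[] numbers = new int[n];
        double sum = 0;

        // fill all n slots, starting at index 0
        for (int x = 0 ; x < n ; x++)
        {
            numbers [x] = (int) (max * Math.random () + min);   // range still wrong, see edit below
        }

        // sum all n values; divide once, by n rather than n - 1
        for (int x = 0 ; x < n ; x++)
        {
            sum += numbers [x];
        }
        double average = sum / n;

        // print every number in the array
        for (int x = 0 ; x < n ; x++)
        {
            System.out.println (numbers [x]);
        }
        System.out.println ("Average: " + average);
    }
}

Note that once all n slots are filled, the average should divide by n rather than n - 1, and the division only needs to happen once, after the loop.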
Edit: to answer the question (from a deleted comment) of why this does not produce values between min and max:
max * Math.random () + min
Math.random() generates double values between 0.0 and 1.0. So with a min of 90 and a max of 100, for example, the expression above will generate numbers between 90 and 190 (!). To limit the values to between min and max you need
min + Math.random() * (max - min)
 ^    |_________________________|
 |                 |
 90        value between 0 - 10
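For illustration, a small self-contained sketch that applies this corrected formula, again with plain System.out and hypothetical sample inputs. The + 1 before the cast is an extra assumption beyond the formula above: it is only needed if max itself should be a possible result, as the question's "10 to 20" example suggests.

// RandomRangeSketch: random integers confined to [min, max]
public class RandomRangeSketch
{
    public static void main (String[] args)
    {
        int min = 90;   // hypothetical sample input
        int max = 100;  // hypothetical sample input

        for (int i = 0 ; i < 5 ; i++)
        {
            // Math.random() * (max - min) is a double in [0, max - min);
            // adding min shifts it into [min, max); the + 1 before the cast
            // makes max itself a possible integer result
            int value = min + (int) (Math.random () * (max - min + 1));
            System.out.println (value);   // always between 90 and 100
        }
    }
}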