Write a program that uses NumPy, creates two random 100 × 100 arrays, and adds them together in two different ways: first using a double for-loop, and second using the NumPy "+" operator. Time how long each method takes to add the arrays.
Time each method 1000 times and record the timing results in a vector of length 1000. For each method: (a) print the average and standard deviation of the running times; (b) plot a histogram of the running times.
Please provide code for this question.
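Here is a minimal sketch in Python. The function names `loop_add` and `time_method`, the choice of `time.perf_counter` for timing, and the output file `timing_hist.png` are my own; any timing approach with equivalent resolution would do. The histogram is saved to a file (with the non-interactive Agg backend) rather than shown, so the script also runs headless.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; save the figure instead of showing it
import matplotlib.pyplot as plt
from time import perf_counter


def loop_add(a, b):
    """Element-wise addition with an explicit double for-loop."""
    out = np.empty_like(a)
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            out[i, j] = a[i, j] + b[i, j]
    return out


def time_method(fn, a, b, n_trials=1000):
    """Return an array of n_trials running times (in seconds) for fn(a, b)."""
    times = np.empty(n_trials)
    for k in range(n_trials):
        t0 = perf_counter()
        fn(a, b)
        times[k] = perf_counter() - t0
    return times


if __name__ == "__main__":
    rng = np.random.default_rng()
    a = rng.random((100, 100))
    b = rng.random((100, 100))

    loop_times = time_method(loop_add, a, b, n_trials=1000)
    vec_times = time_method(lambda x, y: x + y, a, b, n_trials=1000)

    # (a) average and standard deviation of the running times
    for name, t in [("double for-loop", loop_times), ("NumPy '+'", vec_times)]:
        print(f"{name}: mean = {t.mean():.6f} s, std = {t.std():.6f} s")

    # (b) histogram of the running times, one panel per method
    fig, axes = plt.subplots(1, 2, figsize=(10, 4))
    for ax, (name, t) in zip(axes, [("double for-loop", loop_times),
                                    ("NumPy '+'", vec_times)]):
        ax.hist(t, bins=50)
        ax.set_title(name)
        ax.set_xlabel("time (s)")
        ax.set_ylabel("count")
    plt.tight_layout()
    plt.savefig("timing_hist.png")
```

The vectorized `+` should be orders of magnitude faster than the Python-level double loop, since NumPy performs the addition in compiled code over the whole array at once.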