So, let's say I want to test the randomness of something that generates 1-10.
I could have it run 10 times and take the average. For a fair 1-10 generator the true average is 5.5 (the mean of the numbers 1 through 10), but with only 10 runs the result could easily land closer to 4 or 6... just because it's random.
Now, say I run it 100 times; the average should be much closer to 5.5.
Figure out what the average value of your script should be if it is truly random. Then run the script 100, 1000, and 5000 times, tally and average the results, and check that the more times you run it, the closer you come to that expected average.
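Here's a minimal sketch of that check in Python. I'm using `random.randint(1, 10)` as a stand-in for "your script" -- swap in whatever you're actually testing:

```python
import random

def average_of_runs(n, seed=None):
    """Run the 1-10 generator n times and return the average result.

    random.randint(1, 10) is just a placeholder for the thing
    being tested; seed is only there to make runs repeatable.
    """
    rng = random.Random(seed)
    return sum(rng.randint(1, 10) for _ in range(n)) / n

if __name__ == "__main__":
    # True mean of a fair 1-10 generator is (1 + 10) / 2 = 5.5.
    # The averages should drift toward 5.5 as n grows.
    for n in (100, 1000, 5000):
        print(n, "runs: average =", average_of_runs(n))
```

At 100 runs you might still see something like 5.2 or 5.8, but by 5000 runs the average should sit quite close to 5.5 if the generator is fair.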
Alternatively, you could do just 3 batches of 20 runs each and compute a standard deviation across the batch averages, but that takes more math, and I'm too lazy to do more than mention it as a more mathematical alternative.
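For the curious, here's roughly what that batched version could look like, again with `random.randint(1, 10)` standing in for the script under test (the batch/run counts are just the ones mentioned above):

```python
import random
import statistics

def batch_averages(batches=3, runs=20, seed=None):
    """Return the average of each batch of runs from the 1-10 generator."""
    rng = random.Random(seed)
    return [sum(rng.randint(1, 10) for _ in range(runs)) / runs
            for _ in range(batches)]

if __name__ == "__main__":
    avgs = batch_averages(seed=0)
    print("batch averages:", avgs)
    # A small standard deviation means the batch averages agree
    # with each other; a huge one suggests something is off.
    print("std dev:", statistics.stdev(avgs))
```

With only 3 batches of 20 this is a rough check, not a proper statistical test, but it gives you a feel for how much the averages bounce around.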
Best of Luck,
OmniUni