randomtest.ulx (1.8 KB)
Here’s a simple test to check how long it takes to call random() up to 32633 times.
The good news is that it takes at most a few milliseconds even in Quixe.
The bad news is that when running it in Windows Git 1.3.4 I always get a count of about 24000 ± 1000. So simply calling count = random(32633);
isn’t good enough (and it also means most of the random() calls the test makes are pointless).
Actually it’s worse than that. When I run it every 2 seconds or so, the count just increases by 10-20 each time; it is strictly increasing. I’m sure I saw it jump around before, but it’s not doing that now.
I wonder if it’s being seeded with the time, so that the very first call to random() is basically completely unscrambled. Maybe keeping the hundred warm-up calls to random() that I7 makes would help.
Edit: 30 minutes later it’s ticked over and is now giving a count of ~300, and it’s still strictly increasing. Maybe I was wrong and it wasn’t jumping around when I first tried it?
Source code
Global count = 0;
Global mainwin = 0;
Global timeend = 0;
Global timestart = 0;

Array gg_result --> 4;

[ glk_current_time _vararg_count;
    ! glk_current_time(&{int, uint, int})
    @glk 352 _vararg_count 0;
    return 0;
];

[ glk_set_window _vararg_count;
    ! glk_set_window(window)
    @glk 47 _vararg_count 0;
    return 0;
];

[ glk_window_open _vararg_count ret;
    ! glk_window_open(window, uint, uint, uint, uint) => window
    @glk 35 _vararg_count ret;
    return ret;
];

[ Main i;
    @setiosys 2 0; ! select Glk I/O system
    mainwin = glk_window_open(0, 0, 0, 3, 0);
    glk_set_window(mainwin);
    count = random(32633);
    print "Calling random() ", count, " times.^";
    glk_current_time(gg_result);
    ! seconds * 10^6 overflows 32 bits, but the wrapped subtraction below
    ! still gives the correct elapsed time for intervals under ~71 minutes
    timestart = (gg_result-->1) * 1000000 + gg_result-->2;
    for (i = 0: i < count: i++) random(i);
    glk_current_time(gg_result);
    timeend = (gg_result-->1) * 1000000 + gg_result-->2;
    print "Took ", (timeend - timestart), " microseconds^";
];