It does, though. One of the variables in Score() is set to a high value, causing the addition to return a high number, which in turn yields a low decimal when scoreF is divided by 70 in ScorePerc() (= the star calculation).
It might be the ScoreBase variable, since it's divided by a whopping 500 when ScoreType is Latency and used as-is when it's Speed (= divided by 1 in the code). I don't know where this one is set. Or it might simply be the result of a high user count and high load; those are also "divided by 1" with Speed and by 10 with Latency.
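Just to make the arithmetic concrete, here's a rough sketch (in Go) of how I read it. The struct, the field names, the divisors, and especially the exact form of ScorePerc() are guesses pieced together from the above; in particular I'm assuming the 70 ends up in the numerator, since that's the only way a big raw score turns into a low decimal and therefore few stars. So treat this as an illustration, not the actual client code:

```go
package main

import "fmt"

type ScoreType int

const (
	Speed ScoreType = iota
	Latency
)

// Server is a hypothetical stand-in for whatever the client uses internally;
// the field names are assumptions taken from the discussion above.
type Server struct {
	ScoreBase float64 // origin unknown; divided by 500 for Latency, by 1 for Speed
	Users     float64 // current user count
	Load      float64 // current load
}

// Score adds up the weighted terms. The divisors depend on the selected
// ScoreType: everything "divided by 1" for Speed, ScoreBase by 500 and
// users/load by 10 for Latency (as described above).
func (s Server) Score(t ScoreType) float64 {
	baseDiv, usageDiv := 1.0, 1.0
	if t == Latency {
		baseDiv, usageDiv = 500.0, 10.0
	}
	return s.ScoreBase/baseDiv + s.Users/usageDiv + s.Load/usageDiv
}

// ScorePerc converts the raw score into the fraction used for the stars.
// Assumption: 70 / score, so a big raw score gives a tiny fraction.
func ScorePerc(scoreF float64) float64 {
	return 70.0 / scoreF
}

func main() {
	// Made-up numbers for a busy server.
	srv := Server{ScoreBase: 150000, Users: 900, Load: 80}
	names := map[ScoreType]string{Speed: "Speed", Latency: "Latency"}
	for _, t := range []ScoreType{Speed, Latency} {
		raw := srv.Score(t)
		perc := ScorePerc(raw)
		// Assuming stars = 5 * fraction; Speed ends up near 0, Latency higher.
		fmt.Printf("%-8s raw=%9.1f perc=%.4f stars=%.2f\n", names[t], raw, perc, perc*5)
	}
}
```

The numbers are invented, of course; the point is just that whatever feeds the Speed calculation is so large that the star fraction collapses to practically zero.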
That's the thing: they all get a score, and that score is 0 stars if you select Speed. That's the result of the ScorePerc() calculation. We can debate whether it's correct or not, of course, since it does look kinda suspicious.
As for me, my best servers are NL with just three stars or less for Speed. For Latency, everything below 135 ms gets 5 stars, anything above gets less. But I'd sort by latency anyway and choose one of the top 10 with a reasonable load; I wouldn't pay attention to some score. It doesn't reflect my needs, anyway.
It's probably also worth noting that I get shown fractions of stars, too. Maybe the whole thing changed a little over the last four years.