The latest version (v0.2.4), released today, adds tracking for average position (ranking? can someone think of a better name for this?) and for average score (in Hypersonic only).
For example, in this run I was able to see that the changes I made to my IDE code to improve my Hypersonic score (boxes destroyed) did yield a very slight score improvement, and also a slight beneficial impact on my average position in each game:
Just an FYI: CG slightly changed the format of their URLs, and CG Spunk wasn't correctly detecting when you navigated to the IDE. Version 0.2.6 has been published to fix this problem. Thanks to the multiple users who pointed it out to me.
Could you please add support for a custom opponent ranking range (i.e. two inputs, "rank from" and "rank to")?
My use case is that even though my bot is able to win within its own range, it might actually score poorly against lower-ranked opponents (say, 200 ranks below it), which is very important when re-submitting.
Is it possible to also see the full info, not only the Debug (Errors Stream)?
How can I change the script to see all the information: Game information, Action (Output Stream), and Debug (Errors Stream)?
Just the same as the IDE gives us.
Version 0.2.7 has been published and is rolling out to clients. It fixes a few small bugs and also incorporates Issue #48, requested by @inoryy. Enjoy!
Hi, I don't know if it's exactly the point of #49, but adding the score plus the reason for a loss (regular loss / timeout / invalid action) would be amazing.
Version 0.2.9 is available in the Chrome Web Store, and should be rolling out to clients now. It includes score tracking for CodeBusters, Hypersonic, and Fantastic Bits. It also has timeout and invalid-output detection (in addition to the preexisting crash detection).
Also, I have changed the Average Score (in the summary info) from an average of scores into a weighted average of score percentages:
HS: <boxes destroyed> / <total boxes>
CB: <ghosts captured> / <total ghosts>
FB: <score> / <num snaffles>
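For illustration, here is a minimal sketch of how such a weighted average of score percentages could be computed, assuming "weighted" means each game's percentage counts in proportion to its denominator (total boxes / ghosts / snaffles). The function name and data shape are hypothetical, not CG Spunk's actual code:

```javascript
// Hypothetical sketch: weight each game's score percentage by its
// denominator, which reduces to summing scores and totals separately.
// games: [{ score: <boxes destroyed, etc.>, total: <total boxes, etc.> }, ...]
function weightedAverageScore(games) {
  const totalScore = games.reduce((sum, g) => sum + g.score, 0);
  const totalPossible = games.reduce((sum, g) => sum + g.total, 0);
  // Guard against an empty run (no games, nothing to average).
  return totalPossible === 0 ? 0 : totalScore / totalPossible;
}
```

For example, 20/40 boxes in one game and 6/10 in another gives (20 + 6) / (40 + 10) = 0.52, whereas a plain average of the two percentages would give (0.50 + 0.60) / 2 = 0.55, so games with more boxes on the map count for more.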
Take the average with a grain of salt, though. If the game ends prematurely (not a score-based win), those incomplete scores will still be averaged in. Doing anything different is more work than I'm willing to invest at the moment, and I'm not even sure how it should be handled. Any suggestions?
Version 0.2.10 has been published to the Chrome Web Store and should be rolling out to clients now. There's not much of a visible change, other than that you can now pit your IDE code against your Arena code, fixing a very annoying bug (issue #55).
The changes required for this were, unfortunately, pretty extensive. I did quite a bit of testing, but I wouldn't be too surprised if new bugs were introduced as a result. With GitC only 9 days away, I'd like to stomp any bugs ASAP, so please let me know if you see quirky behavior.