We’ve updated a few old machines used for some optimization puzzles and have therefore rerun the solutions for them.
Your submit history is still there but you may see some outdated solutions.
With this change, we’ve added the possibility for us to rerun scores for all optimization puzzles and code golf puzzles with a simple click. We could add validators to these existing puzzles at any time (and possibly break hardcoded solutions).
I’m all for breaking hardcoded solutions
Breaking hardcoded solutions just by adding a new validator is not really a solution, since we’ll just get the new validator and hardcode again.
Of course CodinGame could just add new validators every day, but I don’t think they will do that.
If you want a defense against hardcoded solutions, the only thing I can imagine is randomizing validators every day.
Well done adding this feature! Maybe it’s the occasion to use it to recalculate scores on Temperatures golf since some old scores are not valid?
See Temperature Code Golf puzzle discussion
I used to have a score of 8619 in CodinGame Sponsored Contest. I understand it has been invalidated by the rerun, but what’s strange is that it does not even appear in my history anymore. It’s not super serious in itself, but it’s a bit worrying if scores (and code!) can disappear from the history…
There are also functions in some languages that have changed over time, so previous code doesn’t work on new versions.
It didn’t disappear. The code with which you had 8619 now gets you around 5k. We reran the best code.
Are you sure about this?
I just checked the puzzle and my score was weird. The IDE displayed a score of 6800, but a ranking of 9500. I checked the leaderboard and it said that my code was in Kotlin with 22 points.
I just submitted my C++ code to fix the issue.
Since the (new) validators are not visible, wouldn’t it be difficult to develop hardcoded solutions via repeated submissions, trying out which ones work?
You can find a new validator in a day, more or less. And the reCAPTCHA security is not really an issue because you can create as many accounts as you need. There’s no rule against multi-accounting in an optimization puzzle.
Actually I can find a validator in less than 5 minutes, as long as there is a viewer with an animation of the game. I can guess the replay ID by:
- play in IDE -> get replay ID A
- play in IDE -> get replay ID B
The validator must be somewhere between ID A and B.
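As a sketch of how that narrows things down (assuming replay IDs are assigned sequentially; `fetch_replay` here is a hypothetical helper standing in for whatever API call returns replay data, not a real CodinGame function):

```python
# Replay IDs are assigned sequentially, so any validator replay created
# between our two IDE plays must have an ID strictly between A and B.
# fetch_replay(replay_id) is a stand-in: it should return the replay data
# if the replay exists and is readable, or None otherwise.

def find_validator_ids(id_a, id_b, fetch_replay):
    """Return candidate replay IDs created between the two IDE plays."""
    candidates = []
    for replay_id in range(id_a + 1, id_b):
        if fetch_replay(replay_id) is not None:
            candidates.append(replay_id)
    return candidates
```

With only a handful of IDs between A and B, checking each candidate by hand is quick.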
I’m pretty sure you can no longer view validator replays outside of the submission window (if applicable). I tried with the bridge; it didn’t work (got a 403 Forbidden).
But you can still download the json data via the API.
That’s enough to obtain the validators, just print the level input to stdout and you have it in plain text.
I never tried myself, but @Neumann said that he can’t download the json data via the API. Or maybe we just don’t know what URL you are talking about.
Try it out yourself. It uses the multiplayer last battles API to get single player replays.
There are 2 APIs; I tried the wrong one.
Oh. And which is the right one? It’s for a friend.
import requests

# Fetch the replay data for a given replay ID (id) as JSON
r = requests.post('https://www.codingame.com/services/gameResultRemoteService/findByGameId', json=[id, None], timeout=10)
data = r.json()
@eulerscheZahl shared interesting hacks to obtain validators for animation games. What about puzzles that do not have animations? Do you guys just try it out manually @Magus, @eulerscheZahl?
Another question: if there is a sufficient number of validators (e.g. 30–50), wouldn’t it be more difficult for a hard-coded solution to be shorter than a genuine solution (since hard-coded answers in lookup tables also count toward code size, and compressing many answers into one program would take quite some effort)?
Pretty easy, but you need a big number of submits. Just force a crash (or not) depending on the inputs.
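A minimal sketch of that crash-oracle idea (a made-up example, not actual CodinGame code): make the program crash only when some condition on the hidden input holds, so each submission’s pass/fail leaks one bit, and a binary search recovers an unknown numeric input in roughly log2(range) submits:

```python
# Sketch: each submission acts as a 1-bit oracle. The "solution" crashes
# (so the validator fails) iff the hidden input is below a chosen threshold,
# and a binary search recovers the hidden value one submit per probe.

def submission(hidden_input, threshold):
    """One submit: deliberately crash iff hidden_input < threshold."""
    if hidden_input < threshold:
        raise RuntimeError("deliberate crash")

def probe(hidden_input, threshold):
    try:
        submission(hidden_input, threshold)
        return True   # validator passed -> hidden_input >= threshold
    except RuntimeError:
        return False  # validator crashed -> hidden_input < threshold

def recover(lo, hi, hidden_input):
    """Binary-search the hidden validator input; count the submits used."""
    submits = 0
    while lo < hi:
        mid = (lo + hi + 1) // 2
        submits += 1
        if probe(hidden_input, mid):
            lo = mid
        else:
            hi = mid - 1
    return lo, submits
```

For a value in 0–1000, that is about 10 submits per number, which is why it only works if you are willing to spam submissions.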