Timeouts, how do they work?


I’m trying to set up local simulations for multiplayer games, and at the same time I’m trying to make an AI that measures how much time it spends itself. I couldn’t find any useful information on this forum, so I had to look elsewhere.

Taking Tic Tac Toe as an example, with a 1000ms timeout for the first round and 100ms afterward: I tried to read the Java sources of the game engine, referee, etc. on GitHub, and figured out that it uses System.nanoTime() inside a rather messy attempt at reading the AI output. From what I could gather on the internet, System.nanoTime() uses clock_gettime with the monotonic flag behind the scenes, which is the same syscall used by std::steady_clock in C++ (the language I use). Not sure if you know what you are doing by measuring only real time instead of both real and CPU time, but okay, whatever.
Going back to how and when you use nanoTime: I figured I could sleep for more than a second right at the beginning of my AI, and it would pass. Indeed, it’s TOTALLY FINE DOODZ, because I managed to sleep up to 1600ms in the IDE (plus 150ms of real work) without running into trouble. The current AI running in your matchmaker goes for 1s + init, and it’s fine too. For subsequent rounds, I tried sleeping 120ms per turn in my loop in the IDE (actual work is at most 0.2ms), and it was okay until round 18 or so.

I believe the opponent’s last round time (capped to the timeout value), your JVM initialization and I/O fiddling, and maybe more (simulating the round?) are given for free on top of the 1000 or 100ms when I compute the current round, which is definitely not what the rules say. On the other hand, if your JVM decides to run the GC in between the calls to nanoTime(), or your system decides to run some task, it will reduce the actual allowed time.

So basically, unless I went wrong somewhere, what it means is: please give complete information about this timeout thing, and fix it. Thank you.

The CodinGame engine on GitHub is pretty new, so if you play on old multiplayer games, we can’t be sure that the timeout is handled the same way.

As for the rest of your post, I asked myself the same questions.

I don’t think the GC can run while the referee is waiting, because it doesn’t allocate anything then. I’m not sure what could trigger the GC here, but I think it may still be possible. A solution would be to call the GC before waiting for the AI output.

For Tic-Tac-Toe, you are given 1s for the first round and 100ms for each of the next rounds. That duration is measured between the moment we start sending you inputs and the moment we read your output.

In practice, you’ll have more than 1s between the start of your program and the timeout. But that extra time (before we start sending inputs) isn’t guaranteed at all! You also get some extra milliseconds between your output and the next read, which aren’t guaranteed either.

Btw, the code used for the CodinGame SDK is not the same as the one used in production. And regarding CPU time vs. real time, feel free to open another thread if you want to start a debate.


I’ll keep that in mind. Can we assume real monotonic time is always used for every game? I don’t care that it’s not CPU time, as long as you can guarantee some stability for the timeouts given in the description.

Yes, we’re using System.nanoTime() for this reason.

Hi everybody,

Is there a convenient way to see whether a game crashed or timed out in the last battles? I have to check all the losses manually and look at the last turn to see whether the game ended normally.
More precisely, I would like to know whether a game crashed or timed out, and I cannot tell the difference without printing logs every 10ms and counting how many lines were printed. And when it crashes near the timeout limit, it is sometimes impossible to tell the difference.