Hi, I am using 60ms as my timeout, which should leave me a 40ms buffer to output to the console, but I still get timeout errors. I use System.nanoTime() and start measuring right when I read the first input. I implemented MCTS, and in my while loop I check the elapsed time to make sure I don't time out, but it doesn't seem to work. Below is a sample of my timeout logic.
while ((currTime - startTime) / 1000000 < timeout) {
    MCTS_Node leaf = traverse(root);
    currTime = System.nanoTime();   // refresh before checking, traverse() may have taken a while
    if ((currTime - startTime) / 1000000 >= timeout) { return root.bestMove(); }
    Game_Status result = rollout(leaf);
    currTime = System.nanoTime();   // refresh again, rollout() is the expensive part
    if ((currTime - startTime) / 1000000 >= timeout) { return root.bestMove(); }
    backpropagate(leaf, result);
    currTime = System.nanoTime();   // drives the loop condition for the next iteration
}
I read other posts about this, but there is no definitive answer on how to do it correctly on CG. Is Java the issue here? Should I use another language?
So each node in MCTS needs to be created, so yes, I create lots of objects: 1 object per state. The algorithm does traversal, rollout and backpropagation. I could put more checks inside those methods, but that would increase overhead. I guess it is a tradeoff. Something like the sketch below is what I have in mind.
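Just to show what I mean by checks inside the methods, here is a sketch (not my actual code; simulate(), isTerminal() and evaluate() are placeholders for my real helpers). The idea would be to pass the deadline in and only read the clock every N playout steps so the extra nanoTime() calls stay cheap:

// Sketch only: check the clock every CHECK_INTERVAL playout steps instead of
// every step, so the extra System.nanoTime() calls stay negligible.
// simulate(), isTerminal() and evaluate() are placeholders for my real helpers.
static final int CHECK_INTERVAL = 64;

Game_Status rollout(MCTS_Node leaf, long deadlineNanos) {
    MCTS_Node node = leaf;
    int steps = 0;
    while (!isTerminal(node)) {
        node = simulate(node);                       // play one random move
        if (++steps % CHECK_INTERVAL == 0
                && System.nanoTime() >= deadlineNanos) {
            break;                                   // out of time, stop the playout early
        }
    }
    return evaluate(node);                           // score the reached state
}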
On turn 1 you have 1 second because of all the initialization, but after that you have 100ms to respond. I set my timeout to 50ms and I still randomly lose because of timeouts. I believe GC could be the culprit, but I am not sure how to handle it, and I don't see how other Java users deal with that case. I am thinking maybe rewriting my code in C++…
Making sure that GC will not randomly fire and cause a timeout is indeed a big issue for Java developers who are using memory-intensive algorithms like MCTS.
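One workaround that is often mentioned is to do all the heavy allocation during the 1 second you get on the first turn and ask for a collection there, so the heap is as clean as possible before the 100ms turns start. This is only a rough sketch (readInput(), preallocateNodes() and search() are made-up names for your own methods), and note that System.gc() is only a hint the JVM may ignore:

// Rough sketch: spend the long first turn on allocation and a forced GC,
// so a collection is less likely to fire in the middle of a 100ms turn.
// readInput(), preallocateNodes() and search() stand in for your own code.
boolean firstTurn = true;
while (true) {
    readInput();                                 // parse this turn's input
    long startTime = System.nanoTime();
    if (firstTurn) {
        preallocateNodes();                      // build the whole node pool up front
        System.gc();                             // hint: collect now, while we have ~1s
        firstTurn = false;
    }
    System.out.println(search(startTime));       // MCTS loop with the timeout checks above
}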
You can also check some information here; these threads contain some advice on this issue:
Yeah, I read the threads and there is no consensus on what to do. I tried creating all my objects on the first turn and recycling them (roughly like the sketch below), but it doesn't seem to have improved my timeouts. It is kinda frustrating. I might try to code it in C++ or something. It is frustrating to spend so much time coding an engine and then lose because of random timeouts.
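For reference, this is roughly how I recycle them (simplified; it assumes my MCTS_Node has a no-arg constructor and a reset() that clears its stats and children):

// Simplified version of my recycling: every node is created once on turn 1
// and reused afterwards instead of allocating new ones during the search.
class NodePool {
    private final MCTS_Node[] pool;
    private int next = 0;

    NodePool(int capacity) {
        pool = new MCTS_Node[capacity];
        for (int i = 0; i < capacity; i++) {
            pool[i] = new MCTS_Node();           // all allocation happens here, on turn 1
        }
    }

    MCTS_Node acquire() {
        MCTS_Node node = pool[next++];
        node.reset();                            // clear visit counts, scores, children
        return node;
    }

    void releaseAll() {
        next = 0;                                // new turn: every node is free again
    }
}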