Fall Challenge 2023 - Feedbacks & Strategies

#5, Legend

Endgame

Evaluate all possible “plans” by each player. Example: if the remaining fish are 012 for me, and AB for the opponent, then a plan for me could look like:

2SB,AS10 (1st drone goes to fish2, then surfaces, then sabotages fishB. 2nd drone sabotages fishA, etc)

Then do a minimax search (I assume I choose my plan first, so it’s pessimistic). If I find a non-losing “plan”, use that; otherwise estimate the Nash equilibrium (of this matrix game) and pick the plan chosen with the highest probability.
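For illustration, a minimal sketch of this plan selection over a payoff matrix (+1 win, 0 draw, -1 loss); fictitious play is an assumed stand-in here, since the post doesn’t say how the equilibrium was estimated:

```python
import numpy as np

def pick_plan(payoff: np.ndarray) -> int:
    # payoff[i][j] = outcome of my plan i vs opponent plan j
    # (+1 win, 0 draw, -1 loss). I commit first, so assume the worst reply.
    worst_case = payoff.min(axis=1)
    best = int(worst_case.argmax())
    if worst_case[best] >= 0:          # a non-losing plan exists: take it
        return best
    # Otherwise approximate the mixed Nash equilibrium of the matrix game.
    # (Fictitious play here is an assumption, not the author's method.)
    n, m = payoff.shape
    my_counts, opp_counts = np.ones(n), np.ones(m)
    for _ in range(10_000):
        my_counts[int((payoff @ opp_counts).argmax())] += 1
        opp_counts[int((my_counts @ payoff).argmin())] += 1
    return int(my_counts.argmax())     # highest-% chosen plan
```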

Midgame

Similar to the endgame, but plans consist of just my remaining fish, and the objective is to minimize the total number of turns until I have a “surface-able” position.

Early game

Prioritize rows with “outside fish” (within 2000 of the edge), otherwise move to “midpoints” (pairs of unscanned fish of a certain depth on one side of the board)

Monster Avoidance

Brute-force 2-step search, minimizing the distance to the “long term target” (see the sketch below).
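A rough sketch of what such a search could look like, with placeholder numbers (24 directions, a 600u step, the 500u hit range plus a safety margin) and a naive straight-line monster prediction; it only checks endpoint positions, whereas a real check would test the whole movement segment:

```python
import math

def predict(monster: dict, turns: int) -> tuple[float, float]:
    # Hypothetical helper: dead-reckon the monster along its last velocity.
    return (monster["x"] + turns * monster["vx"],
            monster["y"] + turns * monster["vy"])

def best_move(drone, target, monsters, step=600, angles=24, margin=40):
    def endpoints(pos):
        return [(pos[0] + step * math.cos(2 * math.pi * a / angles),
                 pos[1] + step * math.sin(2 * math.pi * a / angles))
                for a in range(angles)]

    def safe(pos, t):
        # 500u hit range plus an assumed safety margin.
        return all(math.dist(pos, predict(m, t)) > 500 + margin
                   for m in monsters)

    best, best_d = None, float("inf")
    for p1 in endpoints(drone):                 # first step
        if not safe(p1, 1):
            continue
        for p2 in endpoints(p1):                # second step
            d = math.dist(p2, target)
            if safe(p2, 2) and d < best_d:      # minimize distance to target
                best, best_d = p1, d
    return best                                 # where to MOVE this turn
```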

Misc

Lots of other tactics/fine tuning, such as:

  • heavy use of the x-symmetry of the game, for example: when seeing an “undisturbed” fish (1 of the 8 starting speeds) or monster (non-moving or very early on), backtrack to figure out where its “mirror” started

  • if a drone has a 1 or 2 turn sabotage available, do that instead (with various restrictions)

  • track upper and lower bounds of the x and y values for each unseen fish/monster, and expand the ranges a bit each turn to factor in the uncertainty (see the sketch after this list)

  • lots of areas where efficiency can be gained, such as 1-step look ahead for executing plans (i.e. sabotage in a way which minimizes distance to the “next” target)
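A tiny sketch of such bounds tracking, combining radar clipping with per-turn expansion (the box/habitat representation is an assumption, not the author’s code):

```python
def clip_with_radar(box, drone, blip):
    # blip is "TL", "TR", "BL" or "BR": the creature's quadrant
    # relative to the drone (y grows downward).
    x0, y0, x1, y1 = box
    dx, dy = drone
    if "L" in blip: x1 = min(x1, dx)
    else:           x0 = max(x0, dx)
    if "T" in blip: y1 = min(y1, dy)
    else:           y0 = max(y0, dy)
    return (x0, y0, x1, y1)

def expand(box, max_speed, habitat):
    # Each turn an unseen creature may have moved up to max_speed in any
    # direction, so grow the box and clamp it to the creature's habitat.
    x0, y0, x1, y1 = box
    hx0, hy0, hx1, hy1 = habitat
    return (max(x0 - max_speed, hx0), max(y0 - max_speed, hy0),
            min(x1 + max_speed, hx1), min(y1 + max_speed, hy1))
```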

16 Likes

Using lights is bad - monsters chase us
Not using lights is bad - we can’t scan/gain vision
Moving is bad - the motor turns on, fish hear it and start to flee, making them difficult to track
Moving closer to fish at the edges is bad - the fish might flee off the map and make us miss combos
Not moving closer to fish at the edges is bad - we can’t push the fish off the map to make the opponent miss their combos…
Moving left and right is bad - the time to go down and report back increases
Not moving left and right is bad - we get less information from the radar
Moving towards the closest fish is bad - this leads to a lot of zigzagging for our drones
Moving towards the deepest fish first is bad - we end up getting chased by monsters while the opponent happily scans fish

Basically anything we do in this game is both good and bad, making the game chaotic and random. I felt it was very bad game design, and quite imbalanced.

TOOO MUCH DARKNESS. I enjoyed the contest only until I reached Gold. Implementing the sim was very hard: it won’t work without proper tracking, as it needs the positions and velocities of fishes and monsters. With so many hidden details, I had to go with the positions produced by my bad tracking. The fish’s hearing range is too OP: they just flee, and my bad tracking ends up even worse because of it… Monster speeds are too OP… giving them 540 speed and a 500 hit range is insane when they are already hidden and tough to track…

I had a very bad contest and ended up in bottom gold with no clue on how to improve.

3 Likes

Very friendly for heuristic players, very challenging for sim players.
I like it, because newcomers can have fun, and those who want to try hard can make a lot of improvements.
Even with more contest days, the top players would not have a clearly best final version. Even reCurse has room for improvements.

I was #200 with a fully heuristic bot; nothing very different from other PMs.

Congratz @reCurse @MSz @zasmu and @blasterpoard , it was amazing watching the last day submissions and bot improvements :wink:

7 Likes

Viva la neuroevolution! 73rd gold.

As in Keep Off the Grass, I used evolution strategies and coevolution, where each pair of genomes in the population played 2 games against each other.

The NN architecture was a simple MLP, and the inputs were adjusted to the drone’s POV. The drone could only see the nearest monster. Because this was a game with hidden information, a network with memory would have been very beneficial. I could have used an RNN, LSTM, transformers, or whatever kids use these days, but I went for the simplest option: some of the NN outputs were fed back into the inputs. At first I had 3 outputs (vx, vy and light), so continuous actions (in theory ‘infinite’ resolution), but it was very slow to learn. Discrete actions were much faster. At first I had 8 angles around the circle at full speed, then later switched to 16, and finally to 32 angles around the circle, 1 output for light, and some outputs for the next inputs.
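A minimal sketch of an MLP with outputs fed back as inputs, in the spirit described above; layer sizes and initialization are arbitrary placeholders:

```python
import numpy as np

class RecurrentMLP:
    # Plain MLP whose last n_feedback outputs are appended to the next
    # turn's inputs, giving the net a crude memory.
    def __init__(self, n_obs, n_hidden, n_actions, n_feedback, rng):
        self.n_feedback = n_feedback
        self.w1 = rng.normal(0.0, 0.5, (n_obs + n_feedback, n_hidden))
        self.w2 = rng.normal(0.0, 0.5, (n_hidden, n_actions + n_feedback))
        self.memory = np.zeros(n_feedback)

    def act(self, obs: np.ndarray) -> int:
        x = np.concatenate([obs, self.memory])
        hidden = np.tanh(x @ self.w1)
        out = hidden @ self.w2
        self.memory = np.tanh(out[-self.n_feedback:])   # fed back next turn
        # Discrete head, e.g. 32 move angles (+ a light flag elsewhere).
        return int(np.argmax(out[:-self.n_feedback]))
```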

At first the fitness was just getting as many points as possible per game, mostly ignoring the opponent. That was not so bad in early bronze, but it was not enough. I tried different settings. I tried a penalty for getting eaten by a monster, but given the way fitness works, too big a penalty makes it figure out that doing nothing is best. I settled for the simplest scheme: coevolution, where each pair of genomes plays each other 2 games, and a win is 1, a draw 0.5 and a loss 0. Unfortunately this is also the most time-consuming; the cost grows quadratically with population size, and the game had a lot of variance, so a bigger population was needed for better diversity. At first I had 80 genomes, then upped it to 120, 160, 240… In the last week the evolution had 700 genomes per generation; that’s 490k games per generation, and it took 1 minute to finish a generation.
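The round-robin fitness could look like the sketch below, where play_game is a placeholder returning +1/0/-1 from the first player’s point of view. With 700 genomes this is 700 · 699 ≈ 490k games per generation, matching the numbers above:

```python
import itertools

def coevolution_fitness(genomes, play_game):
    # Every pair plays twice, once on each side; win=1, draw=0.5, loss=0.
    fitness = [0.0] * len(genomes)
    for i, j in itertools.combinations(range(len(genomes)), 2):
        for a, b in ((i, j), (j, i)):
            result = play_game(genomes[a], genomes[b])  # placeholder helper
            if result > 0:
                fitness[a] += 1.0
            elif result < 0:
                fitness[b] += 1.0
            else:
                fitness[a] += 0.5
                fitness[b] += 0.5
    return fitness
```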

As in the previous contest, I needed to restrict obviously stupid moves in order to climb the ranks. It learned to avoid monsters on its own most of the time, but could still often get eaten in the most stupid ways. First I restricted (made illegal) moves that would run into a monster: no simulation, just very simple rules involving monster position and speed. The winrate really spiked. Then I got another massive improvement with collision detection, also simulating a subset of next-turn monster-vs-drone interactions, so a drone could avoid a swarm of monsters within 1 turn. If a monster went out of sight, I kept its position for 1 frame and assumed it was coming towards me. This made the drones somewhat intelligent from the early generations on: they were going down and up and avoiding monsters, and that was enough to gather some points. My last manual intervention was to force the target y to 500 when a drone was going up with scanned fish; this sped up fish scanning and reporting a little.

I watched some replays to get an idea of what else could be improved for the NN, but nothing came to mind. It was also fun to see the bot get stuck in a weird local optimum for a few thousand generations. For example, the bot was easily in the upper half of gold, but when it was going to the surface with scanned fish, it would go slightly diagonally or zigzag instead of going straight up, and it was losing by 1-2 rounds because of that. I saw some sharp spikes in the winrate charts; they were when the NN finally started going straight up. There were a few such spikes, meaning the evolution got unstuck from some local optimum and found something strategically important.

This time I didn’t get into legend. Apparently leaving things to the computer isn’t always good. I liked the game overall, but it put too much variance into the leaderboard. One submit was 80th, then the same resubmit was 30th, and overnight it climbed to 15th; then a submit of a ‘better’ bot made it 60th. It was frustrating. At first glance the game looked simple, but it has some depth to it :drum:. I had imagined I would see some very weird strategies, like dumping monsters onto the opponent or something like that. Thank you CodinGame for another contest.

15 Likes

Hello there, 80th gold, 138th global here.

Main strategy

Try to find the best path to catch all the fishes, using a score function.

If I have enough points to win, or if my opponent will be able to win unless I go back to the surface right now, I simply go back to the surface.

Creatures area

Like everyone, I try to estimate each fish’s area; for this I use the previous area, the living zone of the fish’s type, and the radar. This could have been really improved without much effort by using the opponent’s scans, my drones’ light radius, and the creatures’ symmetry.

Uglies avoidance

Find the closest destination to my target where my drone won’t be eaten by an ugly in the given turn. This works very well for simple cases, but since I have no multi-turn simulation, my bot can end up trapped between multiple uglies, or not find the most optimal path.

If an ugly was already found before, I keep track of its potential position given its last position and speed.

Light

Score computation using my light area, and all creatures’ positions/areas and types.

For each fish I compute the probability of catching it, and above a given global score I decide it’s worth turning on the light.
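A sketch of what such a light decision might look like; the 2000u lit scan radius is from the game rules, while the threshold and the Monte Carlo sampling of the tracked box are assumptions:

```python
import math
import random

def p_in_range(box, drone, radius=2000, samples=200):
    # Crude Monte Carlo: fraction of the fish's tracked box in light range.
    x0, y0, x1, y1 = box
    hits = sum(math.dist((random.uniform(x0, x1), random.uniform(y0, y1)),
                         drone) <= radius
               for _ in range(samples))
    return hits / samples

def should_light(unscanned_boxes, drone, threshold=1.0):
    # Flash when the expected number of newly scanned fish is high enough.
    return sum(p_in_range(box, drone) for box in unscanned_boxes) >= threshold
```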

Conclusion

I really hesitated about writing a simulation, since it can be so complicated to have something accurate with the fog. My bot ended up being really basic, and I’m impressed by what other players achieved.

I wasn’t very excited about the subject; I think it could have been deeper and a little more complex, even though there are such amazing bots and work right here.

Congrats to everyone, and especially to the legend players and the winners :wink:

And of course thanks to CodinGame !

2 Likes

Di_Masta, 460th

Even though the contest seemed pretty random in the end, I enjoyed it.
I spent a lot of time on it, working every day and evening for 21 days straight. This time I really wanted to reach the top 100 and fulfil a small dream of mine. But even with the enormous effort I invested, I wasn’t able to do it, and I’m pretty disappointed in myself. At least I managed to improve on my results from previous contests, which puts a little smile on my face. I’m also happy that I managed to implement every idea I had, even though most of them didn’t work in the end.

How did I approach the task:

  • In the first leagues I wanted to research the game using just a simple heuristic bot, and it paid off in the end. I discovered that a predefined path for my drones to follow works pretty nicely. I scattered checkpoints with radius 600 in the middle of all habitats, two points per drone, and made my drones follow the CPs: when a drone is within the radius of its CP, it targets the next one.

  • When the monsters came, making one full lap without being hit was pretty hard, so I surfaced the drone after hitting a CP, and that moved me to Silver.

  • There I noticed that in order to collect enough fish I needed a simulation of the entities. I wanted to implement an accurate simulation and keep the data for detected fish for the next simulations. This was really hard for me: I spent days porting the referee simulation logic to C++, and after that I needed several more days to debug and polish the simulation. I cannot understand why I spent so much time on that; does it happen to anybody else?

  • For the first time I read and used the Java referee code, which was definitely a good idea.

  • Once I had polished the simulation, I noticed that my checkpoints weren’t working very well in various scenarios. So I wanted to build an AI search algorithm that simulates the future. I first tried a GA with an evaluation for each drone, but this didn’t work at all; for days I couldn’t come up with a nice evaluation. Actually, I always struggle with these evaluations; could someone give me advice on how to approach them properly? I even made my own visual debug tool to analyze the GA, but that didn’t help either.

  • After the failure with GA, I tried MC with random moves, simulating the game until a drone is hit or the game is over. That didn’t work at all either…

  • After so many failures, I decided to implement a DFS for each drone, getting it closer to the next checkpoint without being hit by a monster. I managed to run the DFS with 5 turns of depth and finally saw good results, which moved me to Gold (the bottom of Gold).

  • There I noticed that my CPs were not working well in all cases, so I decided I needed more precise targets for my drones. A friend of mine advised me to use the drones’ radars to estimate the positions of the fish. I came up with the idea of averaging the radar directions for all remaining fish, taking the directions for fish to the right of the right drone with a bigger weight and ignoring them for the left drone, and vice versa. This worked well and moved me to 185th in Gold, so close to my dream…

  • Then I decided to send the drones to report only if they were above one of the enemy drones; otherwise I wanted to scare fish the enemy hadn’t scanned yet. It took me days to implement decent scare logic. I tried the new bot against the 100th player and won 8 out of 10 matches, got pretty excited and submitted, and dropped 100 places… I fixed some bugs, again got good tests against the new 100th player, submitted, dropped 100 places again… and again and again. I reverted to my 185th bot, but maybe it was too late: it didn’t reach the same place.

Mistakes:

  • Probably I should have been more active on Discord.
  • In the future, in the Gold league I’ll first test my new bots on a separate account before submitting on my main.

Request:

  • CodinGame @TwoSteps, for the next contest please consider a more simulation-friendly game; with these collisions and vectors it’s very easy to mess up even when you have the game code, ending up in days of debugging. The randomness of the fog of war is also very hard for the average player to deal with. Ideally, I would suggest something that could be solved with MiniMax. I know it’s old school, but a nice classic board game, like Chinese_checkers, would I believe be a good challenge for the top NN players as well.
3 Likes

17th Legend, first Python.

Thanks for another nice challenge. It had a lot of depth, and there was no “clear solution”, which kept things interesting and surprising during the competition.

When the challenge was first revealed, it seemed like an easy task: go down, grab some photos, go up.
In the beginning, this was literally the case: my first submission in bronze just instructed my drones to go up and down blindly, and it went straight to the top 100 and stayed there for almost a week.

As I thought about it more carefully, it seemed like a good old search algorithm wouldn’t work here in Python, since the devastating results of a bad action usually only show up many turns ahead (for example: if I don’t scan a crab and go up, it might give the opponent as many as 20 turns to scare it away, which might be catastrophic since it also loses its trophies). I imagine that even @reCurse’s RL bot had a problem learning those kinds of things and misevaluated losing scenarios.
Also, due to the stochastic, fog-of-war nature of the game, I correctly assumed that games would be inconsistent. Therefore it wasn’t important to always win, but it was important to win more often statistically.

So I went on with a (mostly) heuristic bot and a lean approach: finding the most common reason for losing games, fixing it, and repeating.

The final bot:
Tracking:
As people previously mentioned: exact tracking exploiting the symmetry, and rectangles for the fish I hadn’t seen yet.
Differences from the previous PMs:

  • cutting the tracked “fish rectangle” with a circle, and not a rectangle.
  • losing exact track of creatures that collide with a ‘fish rectangle’, except for monsters.

Descending state
During the first descent, it was highly beneficial to scare fish close to the edge with my side bot. Accordingly, it was important to send my “center” bot towards the side, to scan their symmetric counterparts before the opponent scared them.
The movement was adjusted to put the bot in the bottom-most position after scanning/scaring the relevant fish.
Once no such fish are left to scare or scan close to the wall, switch to the hunting state.

Hunting state
Search for paths that would bring both bots to the surface with enough creatures to ensure a tie or a win. Later I patched this search mechanism to also scare fish away and go for the 2nd dive.

Lighting
Use light only when there is a high enough probability of finding a fish.
During descent:

  • Not lighting more than one bot at a time. The only exception is if there’s a creature to scan before it can be scared.
  • If no lights have been used for 2 turns in a row, force light on the bot with the higher battery. This is important so as not to blindly run into a monster.

Monster avoidance
3 rules here:

  • Don’t get eaten this turn. The direction closest to the desired one (towards the fish) is calculated in the monster’s frame of reference (there the monster is static, and my bot moves with a relative velocity of bot_v - monster_v; with some basic linear algebra and trigonometry the direction limitations can be calculated)
  • Try not to agitate monsters by staying 800u away from them on the next turn. Some trigonometry here as well.
  • Then I check that the final position doesn’t leave my bot alive but trapped the next turn.

This combination solved nearly all losses due to monsters.

Math
Some basic linear algebra and trigonometry (especially the law of cosines) came in very handy in this challenge. It let me calculate exact solutions quickly, giving me the small boost required to pass the other heuristic bots.
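For illustration, a sketch of a “don’t get eaten this turn” check in the monster’s frame of reference, assuming straight-line motion for both during the turn and the 500u hit range:

```python
import math

def eaten_this_turn(drone, move_vec, monster, kill_r=500.0):
    # In the monster's frame the monster is static at the origin and the
    # drone moves by (its own move minus the monster's velocity).
    rx = move_vec[0] - monster["vx"]
    ry = move_vec[1] - monster["vy"]
    px = drone[0] - monster["x"]
    py = drone[1] - monster["y"]
    # Closest approach of the segment p -> p + r to the origin.
    rr = rx * rx + ry * ry
    t = 0.0 if rr == 0 else min(1.0, max(0.0, -(px * rx + py * ry) / rr))
    return math.hypot(px + t * rx, py + t * ry) <= kill_r
```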

5 Likes

#98 (Gold #40)

This is my second participation, and the goal was to get into the Top 100. :sweat_smile:
Until the last week, a very simple and naive rule-based strategy kept me on track.

  1. Monster tracking is crucial; a hit is a loss most of the time.
  2. Each drone either goes down or up: initially, go down until the radar shows BL or BR, then go up until y ≤ 500; repeat.
  3. On every turn, each drone proposes only its direction (angle) to move. In early versions this was just 90 deg or -90 deg; if going down with only BR or only BL blips, go to the center of the BR or BL rectangle. If both, just go straight down (90 deg).
  4. Turn the light on if y > 3000 and we are far enough from the last light position (~1000u).
  5. Once a drone proposes its angle A, we check whether it is far enough from all tracked monsters. If it isn’t, we check the nearest angles (A-delta, A+delta), then (A-2·delta, A+2·delta), etc. “Far enough” is 570u, or 800u when a drone is close to the edge (within 1000u). If nothing helps, just pick the angle that maximizes the distance from the monsters (sketched below, after this list).
  6. If our drones have a “winning number of fishes”, then all drones go up (we report our scanned fishes, the foe reports all the fishes, we report the rest). It beats the bots that scan everything first.
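A sketch of rule 5’s widening angle search; is_safe stands in for the 570u/800u distance checks, and the 5° step is a placeholder:

```python
import math

def next_position(drone, angle_deg, speed=600):
    rad = math.radians(angle_deg)
    return (drone[0] + speed * math.cos(rad), drone[1] + speed * math.sin(rad))

def pick_angle(drone, proposed, monsters, is_safe, delta=5):
    # Try A, then (A-delta, A+delta), (A-2*delta, A+2*delta), ...
    for k in range(0, 180 // delta + 1):
        for angle in {proposed - k * delta, proposed + k * delta}:
            if is_safe(drone, angle):
                return angle
    # Nothing helps: maximize the distance to the nearest monster instead.
    return max(range(0, 360, delta),
               key=lambda a: min(math.dist(next_position(drone, a),
                                           (m["x"], m["y"]))
                                 for m in monsters))
```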

Last weekend, I added more heuristics; the key ones are:

  • The 5th rule was improved to check that after the move we still have a safe 2nd move.
  • The 6th rule was improved: the deepest drone (or both) starts frightening the foe’s fishes (just move to the center of the fish rectangle, or 1px before the fish).
  • Do not turn on the light if nothing new would be scanned.
  • Turn the light on if we have a good chance to catch a crab (at least 10 battery and ~2500u to the center of the crab rectangle).
  • For the first 10 moves, my drones frighten an edge fish, if any, so that the fish will exit the map on its next move.
  • And my favorite: the first 5 moves are hardcoded if we have an edge jelly: a “middle” drone moves at 90-34 deg or 90+34 deg towards the edge. Not losing my first jelly while killing the foe’s first jelly helps to go up first.

In the end, the games became very dynamic and surprised me sometimes.
But the ranking system is very strange at first glance. I submitted the code and became #140, submitted again: #70. I think I am in the Top 100 by luck.

5 Likes

#44 Legend

Simple Approach:

lights:

  • Use light if the drone is in a zone where it hasn’t scanned a fish.
  • Activate the light every two turns for each drone.

Tuning:

  • Use tracking for both fish and uglies to predict their positions and speeds each turn, even when they are not in my range.
  • Use the x-symmetry for fishes and uglies to identify the symmetric location when encountering one for the first time.

Monster Avoidance:

  • Employ a brute force 2-step search.

Report Calculation:

  • Calculate (myScore, oppScore, myCombo, oppCombo) including bonuses.

Midgame:

  • Evaluate whether to ascend to the surface (using the report calculation), descend deeper, or claim the remaining area while ascending. Then take the appropriate action:
    • Claim the Fish:
      • Do so with minimal effort, optimizing the edges if unable to capture both.
    • Kick Fish Out:
      • Set a target for the drone to make the opponent lose some combo points, prioritizing the easiest target.

Thanks CodinGame for such contests.

12 Likes

Again another wonderful challenge from Codingame !

The game combines several features: fog of war like in CB (CodeBusters), hidden information like in OOC (Ocean of Code), a physics engine like in CSB (Coders Strike Back), resource management like in CR (Crystal Rush), …

I finished #25 in Legend (my record out of 36 contests); maybe the 3-week format suits me better.

This game actually contains 3 games in 1:

  • Information gathering
  • Decision-making
  • Execution of the action

My bot focuses more on information gathering and executing actions, as I spent a lot of time developing the simulation and trying to make it pay off. My decision strategy is quite frugal and could be summarized simply as: take a bunch of fish, and if the opponent is ahead of me when resurfacing, then chase away the fishes he is missing.

So my conditions for making a decision are:

  • Have a minimum number of scans, with a minimum total number of crabs, for a drone to decide to resurface
  • Have a minimum total number of scans, with a minimum total number of crabs, to bring up both drones
  • If one of my drones is behind the two opponent drones, go back down and try to drive away the fishes he is missing

Information gathering:

  • Each creature is in a box within the limits of its territory. The box is intersected with the previous box from the last turn, the radar blips from my drones, and the symmetric radar blips when the number of turns elapsed ensures that the creatures have never been disturbed. I don’t use opponent scans.
  • All states, positions and speeds are simulated when I know them from past turns. Positions are compared with the current radar blips, and if a position falls outside the radar quadrant, it is discarded.
  • The use of light is scripted for the first 12 turns, but my drones never use light if no creature can be in the light radius, and always use light when a scannable creature is certainly in the light radius.

Movements up to the Gold league:

  • My first 12 turns are dedicated to a direct descent towards the crabs, targeting the points (2000, 8000) and (6999, 8000). These target points are adjusted horizontally based on radar information.
  • To avoid monsters, all 360 one-degree directions are tested with the getCollision() method. I added a penalized safety margin around the monsters’ arrival positions, and near the borders of the map, to prevent collisions on the next turn.
  • To deliberately drive a fish away, my drones target a point 700 pixels behind the fish.
  • To occasionally drive them away, my one-turn simulation rewards a fish about to exit when that fish is scanned by me and not by the opponent.
  • To avoid eliminating a fish I haven’t scanned yet, my simulation penalizes positions where my drone has less than 5 battery and is within 2000 of the fish or its box.
  • Surfacing is limited to a height of y = 500.

Upgrading to pass the Gold Boss!

  • Simulating 2 turns instead of 1, to remove the safety margin for monsters and borders. The second turn is evaluated for 36 positions, so the number of simulated positions is 360 * 36 = 12960, generally done in 29ms. With no margin, drones avoid a lot of long paths around monsters, even in very simple situations.
  • Adding a death evaluation to occasionally make fish flee out of the map on the second turn.

Upgrading in Legend

  • Adding a new target point between the jellyfish during my scripted descent
  • Adding an evaluation to favor positions closer to the surface when the distance lost towards the target is negligible.

I think that if I had had time to calculate scores and make better decisions in the macro game, I could have further improved my bot, but I am already very pleased with the result :smiley:

Thanks a lot to CG and to the designers of this very good game !

14 Likes

Thx for hosting this challenge.

I was very close to Legend, but failed in the end :frowning:
Nothing new in terms of strategy compared to what others wrote, but I would like to praise psyleague for giving me an amazing last weekend of fast testing; I should have started with it earlier. Special thanks to @Psyho for this fantastic tool!

10 Likes

Rank: #33

Link to the PM: click me, I promise it’s a normal link

Thanks for the contest. I really liked that every decision has good and bad consequences, which generally means that the game is balanced since it supports different strategies. Good game design.

9 Likes

Rank 50 - Heuristics

A big thank you to CG for organising this challenge!
Thank you also to the participants who shared their feedback on this forum, and to those who shared tools for making contests better. (Can’t wait to try Psyleague!)

I appreciated that the game was very different from the ones I had played before, especially as it was the first time I used the referee code. Good practice for the next challenges, when I want to try simulation-based algorithms.

Here is what I tried to achieve:
0 / Track the locations of the uglies and fishes seen, or guess them via symmetry and radar (following the referee code)
1 / Define targets for my drones in the deepest level with fishes
2 / Score different strategies (see the sketch after this list):
→ estimate the arrival order of the four drones, for each of the 16 combinations of “going to the surface” or “not” (e.g. “up,up,up,up”, “up,up,up,down”, …)
→ compute the bonus difference with the opponent’s score, following the drones’ arrival order
→ use a min-max approach to choose the move: I assume my opponent will follow his best move by keeping the min score for each of my 2-direction combinations, then I choose the one with the best final score
→ then, when searching for fishes, evaluate the bonus scored by chasing the fishes around. I added some assignment between drones when there are several fishes, to avoid overloading the closest drone with this task
→ when going up, target fishes with a lower y if there is time to do it (comparing the number of turns remaining to surface with the opponent’s drones)
3 / Avoid uglies: ±1° collision check, but only one turn ahead, so avoiding getting stuck in corners was needed.
4 / Light management: frequency based on the % of fishes in the current zone. Overridden if a tracked or guessed fish is in range, or if there are no fish in the zone.
5 / Update the speed tracking of fishes and uglies.
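A compact sketch of the min-max in step 2 above, splitting the 16 four-drone combinations into my two drones’ choices times the opponent’s; bonus_diff is a placeholder for the bonus-difference computation derived from the estimated arrival order:

```python
from itertools import product

def choose_directions(bonus_diff):
    # Enumerate my two drones' "up"/"down" choices, assume the opponent
    # answers with their own best (for them, worst for me) combination,
    # and keep my combination with the best worst case.
    best, best_value = None, float("-inf")
    for mine in product(("up", "down"), repeat=2):
        worst = min(bonus_diff(mine, theirs)
                    for theirs in product(("up", "down"), repeat=2))
        if worst > best_value:
            best, best_value = mine, worst
    return best
```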

4 Likes

Legend #4

(or rather a shared #3-#5, given that the number of games in the rerun was laughably small for a game with so much hidden information)

Tracking fish: Monte Carlo simulations of each layer (4 fish per simulation), starting from turn 1; the fish spawn positions are generated just like in the referee. I filter out the simulations that do not agree with radars/scans (with a 100-distance tolerance for imprecision), then keep adding new simulations for 25ms. I ignore collisions between fish in different layers.
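In sketch form (the helpers are placeholders, not the actual code):

```python
import time

def refresh_simulations(sims, radar, scans, deadline_s=0.025):
    # Advance all kept simulations one turn, drop the ones refuted by this
    # turn's radar blips / scans (100u tolerance), then keep spawning and
    # replaying fresh referee-style layers until the 25 ms budget is spent.
    sims = [advance_one_turn(s) for s in sims]          # placeholder helper
    sims = [s for s in sims if consistent(s, radar, scans, tol=100)]
    deadline = time.perf_counter() + deadline_s
    while time.perf_counter() < deadline:
        candidate = spawn_layer_and_replay(radar, scans)  # replay from turn 1
        if candidate is not None:
            sims.append(candidate)
    return sims
```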

Movement: I evaluate ~1000 points in a circle with radius 600 around the drone (approximately half of them on the perimeter). The only evaluated things are:

  • distance to the expected position of every fish (the closer, the better; bottom layer fish have triple the weight; also a bit more weight if I’ve already scanned fish of the same color/type)

  • distance to fish that the opponent has not yet scanned (when the fish is closer to the border, it has more weight)

  • pushing a fish that the opponent hasn’t yet scanned towards the border

  • staying above the opponent’s drone that has scanned the same fish/will complete the same color/type bonus

Red fish avoidance: if (goingToGetHit) dont(); I’ve also precalculated, for every d in [501…1640], the size of the angle that a red fish at distance d blocks if it’s targeting me, to make sure I don’t get stuck between 2+ fish after moving.

Light: in the early game I use light on hardcoded turns, or, if the opponent might push a fish out of the map, I move towards that fish and hold the light until it’s likely that I hit it (if the opponent’s drone is moving away from it, I assume it’s not going to push it). After the first couple of turns, I use the light if the probability (according to my tracking simulations) of hitting a fish is higher than a*battery+b.

Going to surface: when I’d win by doing it, when I’d lose otherwise, or when I go above y=2500 because of trying to stay above the opponent.

I originally intended to use that tracker for some sort of smitsimax-like algorithm (both players pick a random-ish plan, pick a random fish simulation, play the game out with heuristics, repeat), but the game didn’t motivate me enough to write anything more than a simple evaluation function, so I just kept doing other things while letting psyleague tell me which of my 20 eval changes might be relevant.

13 Likes

Legend #37 Ruby

I should not read PMs before writing mine.
I felt like I just played the game while everybody else hacked it :sweat_smile:.

The contest dates were, like last year, not the best for me, and I could only dive in during the last week.

I started with a really basic bot on the first day that went to silver:

  • Phase 1: a first dive straight down and back up, sometimes activating the light
  • Phase 2: subsequent dives, one fish at a time using the radar (crab movements!)

The critical point was avoiding monsters. I spent too much time on it, putting off doing the right thing until the last moment, as I did not want to dig into the referee in the first place.
On top of that, I first misread the statement about emergency mode and did not take into account that it can happen “during a turn (so not necessarily at the end of the turn)”. I lost quite a lot of time trying to add some margin to compensate for something I did not understand!

Test 1

I finally grabbed the getCollision method and used it with at most 2821 reachable points against the concerned monsters, minimizing the distance to the destination.

Ok, something was working, but it was not enough: with a lot of monsters, I was too often sandwiched between them.

Test 2

The silver boss seemed impressively strong. For sure: starting from my basic bot, mine began to avoid monsters, but without scanning fishes.
While debugging monster avoidance, I took advantage of replay analysis to add some quick “if”s for useful actions in particular cases.
For example, pushing fishes out of the map during the dive of my first phase.

Test 3

As for my sandwich problem, I fixed it by looking a second turn ahead, assuming all the monsters would target me. That could certainly be improved, for example by ignoring monsters out of my range, but I did not think of it at the time.
Because 2821 * 2821 was too big (at least in Ruby), I reduced my set of points to 976 for the next turn and added another 192 points for the second turn.

Test 4

Again, the first play in the IDE (after so many for debugging :)) resulted in a loss against the silver boss. But my drones stayed alive during the whole match, even with 6 monsters, youhou!
I needed to stop those crab movements. I went for the center of the rectangle, like many of you. I did not think to update it from the previous turn, so there is a fresh one each turn :sweat_smile:.

Test 5

After some losses (but also some wins :)) against the boss and new “if”s, I analyzed a game where one of my drones went into emergency mode: a face-to-face with a monster is lethal :head_bandage:.

But I had seen that monster before, and the game was getting more and more interesting now that the mechanics were well understood… just f***, I went to the referee again and grabbed more methods: updateUglySpeeds, snapToUglyZone, updateUglyTarget, getClosestTo.

Next, I needed to detect:

  • the opponent’s light => easy: battery < battery_prev
  • the opponent’s emergency => not that hard. I thought the trick battery == battery_prev was enough, but it was one turn too late. I ended up checking the opponent’s collisions so as not to miss a turn (see the sketch below).
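A sketch of this battery-based inference, assuming the standard rules (the light drains 5 battery; battery recharges by 1 per turn, capped at 30, when the light is off):

```python
MAX_BATTERY = 30  # assumed cap from the game rules

def opponent_used_light(batt: int, batt_prev: int) -> bool:
    # Light is the only thing that drains battery.
    return batt < batt_prev

def maybe_in_emergency(batt: int, batt_prev: int) -> bool:
    # A non-full battery that didn't recharge is suspicious, but as noted
    # above this fires one turn late; checking the opponent's collisions
    # directly is more reliable.
    return batt == batt_prev and batt_prev < MAX_BATTERY
```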

The end

And that worked 90% of the time :tada: (wrong simulations could happen when missing a monster, plus I did not use symmetry). I added more “if”s to release scans early and to scan on the way up in phase 1.
I submitted my second arena last Saturday morning, made it past the silver boss, and surprisingly reached 50th in gold. So no, that can’t be the end :stuck_out_tongue: (but it should have been, as gold was my target after the strong silver boss).

Test 6

Now that the monster avoidance was operational (most of the time), it was time to dive deeper into my “if”s. I added more actions in phase 2, and one in phase 1 that I think worked pretty well: stop surfacing when there were no bonus points to gain, in order to move fish out of the map instead.

Test 7

My bot was pretty consistent, reaching the top 10 of the gold league every time. Between small bug fixes/improvements last Sunday evening, I searched for a modification that could be done in time, would give me a significant boost, and would not break my astonishing sequence of solid “if”s.

And I saw the improvement: bypassing monsters more quickly. I was minimizing the distance on the next turn, causing my drones to first get in front of the monster and then finally turn back around it, losing too many turns.

As I was already somehow looking at the second turn, I “just” had to check that distance instead.
To make it work without timeouts, the number of points tested had to be drastically reduced: from 976 * 192 to 60 * 60.
I reached top 1 in gold, but with a >1 difference.


The real end?

Almost. Looking at some replays revealed that moving fishes out of the map no longer worked very well… what? Why?
=> Minimizing the second-turn distance should not be used if the target point is closer than the drone’s speed… pfiouuu
Top 1 again (above the boss during the run). I went to sleep, as nothing good would have come from continuing. The good surprise was to wake up luckily pushed into legend :slight_smile:

Conclusion

My bot consists of two phases, with 8.5 and 4 actions each, where the monster avoidance is run to track/update and prevent emergencies. The light is turned on every two turns in some actions of phase 1, and when needed in phase 2.
It was built with 10 arenas, ~200-300 plays in the IDE, and analysis.
And it lacks a lot of features. It:

  • does not use symmetry
  • does not optimize the rectangles
  • does not track fishes
  • does no performance work, and sometimes times out
  • does not take into account the opponent’s score and the scoring system (in particular for surfacing)
  • does not coordinate the two drones well, sometimes resulting in the same action for both
  • was not tested against itself in local play
  • does not remember its previous action, resulting in some really fun replays

Bravo!

Congratulations to the winners. Reaching legend is one thing; competing with those guys is another :clap:.

Many thanks to Codingame

As always, a great contest. I really enjoyed the game. The stats on total scanned fishes and drone emergencies were really cool; why not keep them in the multiplayer section?
I also noticed a change when scrolling the debug information panel (or was it already like that before?): the viewer plays the turn only when reaching the end of that turn’s log. I really appreciated that change for debugging.

See you in winter?

17 Likes

Rank: #259 (201 in Gold)

Bot: Simple eval on fishes (and possible score gain) and swimming there blindly

Strategy: pick a fish from the deepest habitat that wasn’t scanned yet (my heuristic was: the deeper, the better). My other heuristic concerned the position of the fish: I assumed the fish would be somewhere in the middle between the drone and the wall (left or right respectively). Later in the contest I added some simple ifs: when I would lose on the next turn, or could win by stealing bonuses, swim up. Avoiding monsters was mostly copied from the referee, but I handled one simple edge case where my drone could end up inside a monster on the next turn, by simply simulating the moves and checking the obvious collision.

Challenges: finding the motivation to start coding, not only the bot but also some viewer/helper tools. After reading other PMs and talking with my colleagues, what was lacking in my solution was a tracker.

Game: this wasn’t exactly my type of game, but all in all I can say I had some fun!

5 Likes

Thanks a lot @blasterpoard for your PM.
I don’t really get your point here; can you please explain a bit more?

(This is a reply to the post above; I clicked the wrong reply button.)

5 Likes

#2 in Legend.

Thanks to the organizers for the contest, and to the competitors for the high level!

I have a one-turn simulation with samples, iterative improvement with opponent prediction, backed by a pixel-perfect engine. The ingredients are as follows:

  1. Inferer provides information on where fishes and monsters can be, and which ones we know exactly. It operates on geometric figures: rectangles, circles, and rounded rectangles. Rectangles come from radar, positive circles from the opponent’s scans, and negative ones from not seeing. The inference is performed backwards (updating previous turns based on the current one) and forwards (from the past up to now). When an item goes out of sight, it is kept exactly as long as no unknown collision can affect it. All symmetries are taken into account, and the rectangles grow according to the possible speed of the tracked item. To determine whether a position is possible, information from the current and past turns is used. Nevertheless, updating the whole thing takes less than 1 ms per turn.

  2. Samples are a mechanism to somehow capture the probability distribution of positions and vectors. I keep a few hundred states. Initialized at random exactly as in the referee, they are fixed whenever they are refuted by inferred knowledge; in that case, a fish or a monster is replaced with a new random one. Since evaluating multiple samples is costly, I sort the current samples by quality (the number and time of fixes) and use only a few of the best ones for evaluation. In this way, the best samples hopefully contain positions and vectors most similar to the real ones.

  3. Eval contains a few components. There are, e.g., win, draw and loss values, the score if both players go up now, the value of current scans, the estimated numbers of turns to scan and to kill a fish, and a penalty for using the lamp. Going up is simulated to the end by a greedy selection of moves, avoiding monsters and including scanned fishes on the way. For the estimated numbers of turns to scan or kill, a one-turn simulation for monsters is used, then distance. Finally, to obtain the eval value, these components are mixed together without any sense. The eval is crappy and inconsistent, duplicating some features in other ways, and it works differently for the opponent simply because it was worse otherwise.
    I did not find any real improvement from avoiding monsters beyond not picking directions that lead to being eaten. Maybe except for a simple small penalty for having a monster close.

  4. Move search. I check maximal vectors all around, plus halves of the best ones. The moves are assigned and improved in iterations until the time runs out. An iteration consecutively reassigns the action for each of the four drones, assuming the others apply their already-assigned moves, so there is opponent prediction. Each subsequent iteration uses a larger granularity. The eval is taken as the average eval over a few of the best samples.

  5. Pixel-perfect engine. The rules of the game are extremely complicated, depending on the exact order of double arithmetic in the referee. Preparing a correct and moderately efficient implementation took the first half of the contest or more. The perfectness does not improve the results much, but it was required to properly debug the exact inference and the other mechanisms. As a bonus, I can also safely pass monsters by a pixel.
    The overwhelming number of minor cases and inaccuracies was consistently attacked from all sides until perfection. Definitely the hardest engine I have ever needed. The following are some elements that surprised me and that I had to take care of to achieve perfectness. You can also use them as a test of your knowledge of the rules.

  • You can slide along the ocean floor by using WAIT, and thus move without scaring fishes. Not useful, of course.
  • Math.round works differently than std::round for negative numbers, so Java rounding must be emulated (see the sketch after this list).
  • A fish at normal speed (200) can change its vector by subsequent normalization and rounding, i.e., when it stops being scared, its vector is normalized to 200; then, in the next turn, it is normalized again with a result differing by one pixel. Hence, fish vectors must always be normalized when not scared.
  • On the contrary, monster vectors must not be normalized every time, but only when the vector length exceeds their normal speed (270).
  • If we pick a far target, normalize the vector from the drone to the target, and then MOVE by this vector, we can end up at a position differing by one pixel, since the normalization of a normalized vector is not the identity.
  • The normalization must be performed exactly as in the referee: first divide, then multiply by the desired length. The usual way (multiplying by the length divided by the norm), which is better in terms of efficiency and numerical accuracy, unfortunately gives slightly different results.
  • Monsters can get a zero vector and thus stop, in rare cases.
  • The visibility radius uses a sharp inequality, whereas the other circles use less-or-equal.
  • The boundary reflection of a monster is calculated after rounding the speed, whereas that of a fish uses the normalized speed before rounding. This means that a fish can bounce one pixel earlier than it otherwise would.
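For illustration, a sketch of the rounding and normalization pitfalls above; Java’s Math.round(double) is floor(x + 0.5), which differs from C++’s std::round for negative halves:

```python
import math

def java_round(x: float) -> int:
    # Emulates Java's Math.round: floor(x + 0.5).
    # E.g. java_round(-0.5) == 0, while C++ std::round(-0.5) == -1.
    return math.floor(x + 0.5)

def normalize(vx: float, vy: float, length: float) -> tuple[int, int]:
    # Referee order as described above: divide by the norm first, THEN
    # multiply by the desired length; (v / d) * L and v * (L / d) can
    # differ by a pixel in double arithmetic.
    d = math.hypot(vx, vy)
    return java_round(vx / d * length), java_round(vy / d * length)
```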

Finally, I have answers to some questions that arose during the contest:

  • How much does full information about the fishes (i.e., all fishes visible) improve the results?
    MSz-knowing-all-fishes vs MSz: 75.00% ±0.3 (99% CI)
    For weaker, earlier versions, the improvement was not that big.

  • How much does full information about the monsters improve the results?
    MSz-knowing-all-monsters vs MSz: 51.51% ±0.4

  • And together?
    MSz-knowing-both vs MSz: 78.84% ±0.3

In these tests, I did not take into account that the inference and samples become unnecessary, so the bot could have used the time better.

  • What if I have twice the time limit?
    MSz-100ms vs MSz: 52.06% ±0.4

Summary

The engine was very difficult, beating my record of complexity so far.

I tried to optimize everything further to increase the number of directions / use more samples / deepen the simulation. However, these did not improve the results much, whereas a deeper simulation would be far too expensive. The engine could not be significantly optimized; simply, there is no other way than following the referee’s algorithm. The only notable optimizations are: precomputing all 4800 maximal drone vectors, and applying the action to a partially computed state (other drones, fishes, and monsters already moved).

So unfortunately, it seemed that tuning the crappy eval was more important, and I could not find any feature that would give me an advantage. I also noted a problem with meta strategies, but they appeared only for early/mid versions, where, e.g., a bot going up earlier wins more often against weaker bots because they do not kill fishes effectively. Later versions were consistently monotonic.

22 Likes

I read the strategies and I want to share my solution. I go down tracking the fish closest to the sides, and go up near the middle of each half (x = 2500 and 7500). I use a Genetic Algorithm with a scoring function with bonuses and maluses.

In my algorithm, for collisions, I pick points in my population at a distance of at least 1080 from all visible monsters. I also keep the last position of each monster for evaluating collisions.

My scoring function is calculated as: -distToObjective + bonus - malus.

I have 3 modes. Mode 0: go down, as explained at the beginning. Mode 1: search for the missing fish. Mode 2: hunt fish off the board.

For the light, I flash every 5th turn, and every 2nd turn below 6500 depth. If, thanks to the radar, we can know the position of a fish that is blocked between 2 drones, or between a drone and the side, then I flash.

I finished 699th overall, in the gold league.

1 Like