Fall Challenge 2023 - Feedbacks & Strategies

#98 (Gold #40)

This was my second participation, and my goal was to get into the Top 100. :sweat_smile:
Until the last week, a very simple and naive rule-based strategy kept me on track.

  1. Monster tracking is crucial; a hit is a loss most of the time.
  2. Each drone either goes down or up: initially, go down while the radar shows BL or BR, then go up until y < 500; repeat.
  3. On every turn, each drone proposes only a direction (angle) to move. In early versions, it was just 90 deg or -90 deg. When going down with only BR or only BL blips, head to the center of the BR or BL rectangle; if both, just go straight down at 90 deg.
  4. Turn on the light if y > 3000 and we are far enough from the last light position (~1000u).
  5. Once a drone proposes its angle A, we check whether it is far enough from all tracked monsters. If it isn’t, we check the nearest angles (A-delta, A+delta), then (A-2*delta, A+2*delta), etc. (see the sketch after this list). “Far enough” is 570u, or 800u when the drone is close to the edge (within 1000u). If nothing helps, just pick the angle that maximizes the distance from the monsters.
  6. If our drones have a “winning number of fishes”, then all drones go up (we report our scanned fishes, the foe reports all the fishes, we report the rest). It beats the bots that scan everything first.
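
A minimal sketch of the angle search from rule 5; the helper names and the delta step are illustrative assumptions, with only the 570u/800u thresholds and the 600u drone speed taken from the rules above:

```cpp
#include <cmath>
#include <vector>

struct Vec { double x, y; };

// Safe if the drone's landing point stays at least `safeDist` away from
// every predicted monster position (570u, or 800u near the map edge).
bool isSafe(Vec drone, double angleDeg, const std::vector<Vec>& monsters,
            double safeDist, double droneSpeed = 600.0) {
    double rad = angleDeg * M_PI / 180.0;
    Vec next{drone.x + droneSpeed * std::cos(rad),
             drone.y + droneSpeed * std::sin(rad)};
    for (const Vec& m : monsters)
        if (std::hypot(next.x - m.x, next.y - m.y) < safeDist)
            return false;
    return true;
}

// Try the proposed angle A first, then fan out: (A-delta, A+delta),
// (A-2*delta, A+2*delta), ... exactly as rule 5 describes.
double pickAngle(Vec drone, double A, const std::vector<Vec>& monsters,
                 double safeDist, double delta = 10.0) {
    for (int k = 0; k * delta <= 180.0; ++k)
        for (double cand : {A - k * delta, A + k * delta})
            if (isSafe(drone, cand, monsters, safeDist))
                return cand;
    return A; // nothing safe: fall back to maximizing monster distance
}
```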

Last weekend, I added more heuristics, the key ones are:

  • The 5th rule was improved to check that after the move we still have a safe 2nd move.
  • The 6th rule was improved: the deepest drone (or both) starts frightening the foe’s fishes (just move to the center of the fish rectangle, or 1px before the fish).
  • Do not turn on the light if nothing new can be scanned.
  • Turn the light on if we have a good chance to catch a crab (at least 10 battery and ~2500u to the center of the crab rectangle).
  • During the first 10 moves, my drones frighten an edge fish if there is one and it will exit the map on the next move.
  • And my favorite: the first 5 moves are hardcoded if we have an edge Jelly: a “middle” drone moves at 90-34 deg or 90+34 deg toward the edge. Not losing my first Jelly and killing the foe’s first Jelly helps to go up first.

In the end, the games became very dynamic and sometimes surprised me.
But the scoring system is very strange at first glance. I submitted the code and became #140; submitted again, #70. I think I am in the Top 100 by luck.

5 Likes

#44 Legend

Simple Approach:

Lights:

  • Use light if the drone is in a zone where it hasn’t scanned a fish.
  • Activate the light every two turns for each drone.

Tuning:

  • Utilize tracking for both fish and uglies to predict their position and speed each turn, even when they are out of my range.
  • Use x-symmetry for fish and uglies to infer the symmetric location when encountering them for the first time.

Monster Avoidance:

  • Employ a brute force 2-step search (see the sketch below).
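
A self-contained sketch of what such a brute force 2-step search could look like. The pessimistic "every monster charges me" model and the constants (600u drone speed, 540u aggressive monster speed, 500u kill radius) are assumptions based on the game rules, not the author's actual code:

```cpp
#include <cmath>
#include <vector>

struct P { double x, y; };
double dist(P a, P b) { return std::hypot(a.x - b.x, a.y - b.y); }

// Pessimistic one-turn monster step: charge straight at the drone.
P stepMonster(P m, P drone) {
    double d = dist(m, drone);
    if (d < 1e-9) return m;
    return {m.x + 540.0 * (drone.x - m.x) / d,
            m.y + 540.0 * (drone.y - m.y) / d};
}

// NOTE: endpoint check only; the real referee detects collisions
// anywhere along the movement segment (cf. getCollision).
bool dead(P drone, const std::vector<P>& monsters) {
    for (const P& m : monsters)
        if (dist(drone, m) <= 500.0) return true;
    return false;
}

// Pick the first move whose best surviving follow-up ends closest to
// the target; returns the destination to output as MOVE this turn.
P bestTwoStep(P drone, P target, const std::vector<P>& monsters, int n = 36) {
    std::vector<P> m1;       // monsters react to the drone's start position,
    for (const P& m : monsters) m1.push_back(stepMonster(m, drone));

    P best = drone;          // fallback: stay put if nothing survives
    double bestScore = -1e18;
    for (int i = 0; i < n; ++i) {
        double a1 = 2 * M_PI * i / n;
        P d1{drone.x + 600 * std::cos(a1), drone.y + 600 * std::sin(a1)};
        if (dead(d1, m1)) continue;              // step 1 must survive
        std::vector<P> m2;
        for (const P& m : m1) m2.push_back(stepMonster(m, d1));
        for (int j = 0; j < n; ++j) {
            double a2 = 2 * M_PI * j / n;
            P d2{d1.x + 600 * std::cos(a2), d1.y + 600 * std::sin(a2)};
            if (dead(d2, m2)) continue;          // step 2 must survive too
            double score = -dist(d2, target);    // then head for the target
            if (score > bestScore) { bestScore = score; best = d1; }
        }
    }
    return best;
}
```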

Report Calculation:

  • Calculate (myScore, oppScore, myCombo, oppCombo) including bonuses (see the sketch below).
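
A sketch of what such a report calculation could look like, assuming the standard scoring (a fish of type t is worth t+1 points, a complete color +3, a complete type +4, each doubled for whoever banks it first); the `first*` flags would come from the predicted drone arrival order:

```cpp
#include <array>

// scanned[c][t] marks a saved scan of color c (0..3), type t (0..2).
// The first* flags say whether we beat the opponent to that fish/bonus.
int reportScore(const std::array<std::array<bool, 3>, 4>& scanned,
                const std::array<std::array<bool, 3>, 4>& firstFish,
                const std::array<bool, 4>& firstColor,
                const std::array<bool, 3>& firstType) {
    int score = 0;
    for (int c = 0; c < 4; ++c)
        for (int t = 0; t < 3; ++t)
            if (scanned[c][t])
                score += (t + 1) * (firstFish[c][t] ? 2 : 1);
    for (int c = 0; c < 4; ++c) {        // color bonus: all 3 types saved
        bool full = scanned[c][0] && scanned[c][1] && scanned[c][2];
        if (full) score += 3 * (firstColor[c] ? 2 : 1);
    }
    for (int t = 0; t < 3; ++t) {        // type bonus: all 4 colors saved
        bool full = true;
        for (int c = 0; c < 4; ++c) full = full && scanned[c][t];
        if (full) score += 4 * (firstType[t] ? 2 : 1);
    }
    return score;
}
```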

Midgame:

  • Evaluate whether to ascend to the surface (using the report calculation), descend deeper, or claim the remaining area while ascending. Then take the appropriate action:
    • Claim the Fish:
      • Do so with minimal effort, optimizing the edges if unable to capture both.
    • Kick Fish Out:
      • Set a target for the drone to make the opponent lose some combo points, prioritizing the easiest target.

Thanks CodinGame for such contests.

12 Likes

Again another wonderful challenge from CodinGame!

The game combines several features, with fog of war like in CB (CodeBusters), hidden information like in OOC (Ocean Of Code), a physics engine like in CSB (Coders Strike Back), resource management like in CR (Crystal Rush), …

I finished #25 in Legend (my record out of 36 contests), maybe the 3-week format is better suited for me.

This game actually contains 3 games in 1 :

  • Information gathering
  • Decision-making
  • Execution of the action

My game is more focused on information gathering and executing actions, as I spent a lot of time developing the simulation and trying to make it profitable. Also, my decision strategy is quite frugal and could be summarized simply as: take a bunch of fish, and if the opponent is ahead of me when resurfacing, then chase away the fishes he is missing.

So my conditions to make a decision are :

  • Have a minimum number of scans, with a minimum total number of crabs, for a drone to decide to resurface
  • Have a minimum total number of scans, with a minimum total number of crabs, to bring up both drones
  • If one of my drones is behind the two opponent drones, it goes back down and tries to drive the fishes the opponent is missing out of the game

Information gathering :

  • Each creature is in a box within the limits of its territory. The box is intersected with the previous turn’s box, with radar blips from my drones, and with symmetric radar blips when the number of elapsed turns ensures that the creatures have never been disturbed (see the sketch after this list). I don’t use opponent scans.
  • All states, positions and speeds are simulated when I know them from past turns. Positions are compared with the current radar blips, and if a position falls outside the radar quadrant, it is discarded.
  • Light usage is scripted for the first 12 turns, but my drones never use the light if no creature can be within the light radius, and always use it when a scannable creature is certainly within the light radius.
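
A sketch of this box bookkeeping, assuming axis-aligned boxes, radar blips treated as quadrants relative to the drone, and a 10000x10000 map; a real implementation would apply every blip from both drones each turn:

```cpp
#include <algorithm>

struct Rect { int x1, y1, x2, y2; };  // axis-aligned, inclusive bounds

// An empty result (x1 > x2 or y1 > y2) signals a contradiction.
Rect intersect(const Rect& a, const Rect& b) {
    return {std::max(a.x1, b.x1), std::max(a.y1, b.y1),
            std::min(a.x2, b.x2), std::min(a.y2, b.y2)};
}

// Each turn the creature may have moved up to `speed` in any direction,
// so the box grows before being truncated again.
Rect grow(const Rect& r, int speed) {
    return {r.x1 - speed, r.y1 - speed, r.x2 + speed, r.y2 + speed};
}

// Truncate the box with a radar blip: e.g. "BL" means the creature is
// below-left of the drone at (dx, dy).
Rect applyBlip(const Rect& r, int dx, int dy, bool below, bool left) {
    Rect quadrant{left ? 0 : dx, below ? dy : 0,
                  left ? dx : 9999, below ? 9999 : dy};
    return intersect(r, quadrant);
}

// Per-turn update, following the idea in the list above: grow the previous
// box by the creature's max speed, clamp it to its habitat, then cut it
// down with the radar blip received this turn.
Rect update(Rect prev, int speed, const Rect& habitat,
            int dx, int dy, bool below, bool left) {
    Rect r = intersect(grow(prev, speed), habitat);
    return applyBlip(r, dx, dy, below, left);
}
```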

Movements up to Gold league :

  • My first 12 turns are dedicated to a direct descent towards the crabs, targeting the points (2000, 8000) and (6999, 8000). Those target points are adjusted horizontally based on radar information.
  • To avoid monsters, all 360 degree positions are tested with the getCollision() method (a sketch of such a swept collision test follows this list). I added a penalized safety margin around the monsters’ arrival positions, and near the borders of the game, to prevent collisions on the next turn.
  • To deliberately drive a fish away, my drones target a point 700 pixels behind the fish.
  • To occasionally drive them away, my one-turn simulation rewards a fish about to exit when this fish is not scanned by the opponent but is scanned by me.
  • To avoid eliminating a fish I haven’t scanned yet, my simulation penalizes positions where my drone has less than 5 battery and is within a 2000 distance of the fish or its box.
  • Surfacing is limited to a height of y = 500.
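
For reference, a sketch of a swept-circle test in the spirit of the referee's getCollision(); the 500u combined radius is an assumption matching the statement's emergency rule:

```cpp
#include <cmath>

struct Vec { double x, y; };

// Both the drone and the ugly travel in straight lines during the turn;
// an emergency triggers if they ever come within `radius` of each other.
bool collides(Vec dronePos, Vec droneVel, Vec uglyPos, Vec uglyVel,
              double radius = 500.0) {
    // Work in the drone's frame of reference.
    double px = uglyPos.x - dronePos.x, py = uglyPos.y - dronePos.y;
    double vx = uglyVel.x - droneVel.x, vy = uglyVel.y - droneVel.y;

    if (px * px + py * py <= radius * radius) return true;  // already inside

    // Solve |p + v t|^2 = radius^2, i.e. a t^2 + b t + c = 0.
    double a = vx * vx + vy * vy;
    if (a <= 0.0) return false;                  // no relative motion
    double b = 2.0 * (px * vx + py * vy);
    double c = px * px + py * py - radius * radius;
    double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return false;                // trajectories never meet

    double t = (-b - std::sqrt(disc)) / (2.0 * a);  // first contact time
    return t >= 0.0 && t <= 1.0;                    // within this turn
}
```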

Upgrading to pass Gold Boss !

  • Simulating 2 turns instead of 1, to drop the safety margin around monsters and borders. The second turn is evaluated for 36 positions, so the number of simulated positions is 360 * 36 = 12960, generally in 29ms. With no margin, drones avoid a lot of long paths around monsters, even in very simple situations.
  • Adding the fish-exit evaluation on the second turn too, to occasionally make fish flee on the second turn.

Upgrading in Legend

  • Adding a new target point between the jellyfish during my scripted descent.
  • Adding an evaluation to favor positions closer to the surface when the distance lost toward the target is negligible.

I think that if I had time to calculate scores and make better decisions in the macro game, I could have further improved my bot, but I am already very pleased with the result :smiley:

Thanks a lot to CG and to the designers of this very good game !

14 Likes

Thx for hosting this challenge.

I was very close to Legend, but failed in the end :frowning:
Nothing new in terms of strats compared to what others wrote, but I would like to praise psyleague for giving me an amazing last weekend of fast testing; I should have started with it earlier. Special thanks to @Psyho for this fantastic tool!

10 Likes

Rank: #33

Link to the PM: click me, I promise it’s a normal link

Thanks for the contest. I really liked that every decision has good and bad consequences, which generally means that the game is balanced since it supports different strategies. Good game design.

9 Likes

Rank 50 - Heuristics

A big thank you to CG for organising this challenge !
Thank you also to the participants who shared their feedback on this Forum, and to those who shared tools to make better contests. (can’t wait to try Psyleague !)

I appreciated that the game was very different from the ones I had played before, especially as it was the first time I used the referee code. Good practice for the next challenges when I want to try simulation-based algorithms.

Here is what I tried to achieve :
0 / Track the location of uglies and fishes seen, or guess it with symmetry and radar (following the referee code).
1 / Define targets for my drones in the deepest level with fishes.
2 / Score different strategies:
→ estimate the arrival order of the four drones, for each of the 16 combinations of “going to the surface” or “not” (e.g. “up,up,up,up”, “up,up,up,down”, …)
→ compute the bonus difference with the opponent’s score, following the drones’ arrival order
→ use a min-max approach to choose the move (see the sketch after this list): I consider that my opponent will follow his best move, so I keep the min score for each of my 2-drone direction combinations, then I choose the one with the best final score
→ Then, when searching for fishes, evaluate the bonus scored by chasing the fishes around. I added some assignment between drones when there are several fishes, to avoid overloading the closest drone with this task
→ When going up, target fishes with a lower y if there is time to do it (compare the number of turns remaining to surface with the opponent drones)
3 / Avoid uglies: ±1° collision checks, but only one turn ahead. So avoiding landing in corners was needed.
4 / Light management: frequency based on the % of fishes in the current zone. Overridden if a tracked or guessed fish is in range, or if there is no fish in the zone.
5 / Update the speed tracking of fishes and uglies.
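
A sketch of the min-max over the 16 “surface or not” combinations from step 2; `score` is a hypothetical callback wrapping the arrival-order estimate and the bonus-difference computation:

```cpp
#include <algorithm>
#include <array>
#include <functional>
#include <limits>

// score(myUp0, myUp1, oppUp0, oppUp1) estimates the arrival order for one
// combination and returns my bonus difference with the opponent.
std::array<bool, 2> chooseUpDown(
        const std::function<double(bool, bool, bool, bool)>& score) {
    std::array<bool, 2> best{false, false};
    double bestWorst = -std::numeric_limits<double>::infinity();
    for (int mine = 0; mine < 4; ++mine) {          // my 4 combinations
        double worst = std::numeric_limits<double>::infinity();
        for (int theirs = 0; theirs < 4; ++theirs)  // opponent's best answer
            worst = std::min(worst,
                             score(mine & 1, mine & 2, theirs & 1, theirs & 2));
        if (worst > bestWorst) {                    // best worst case wins
            bestWorst = worst;
            best = {static_cast<bool>(mine & 1), static_cast<bool>(mine & 2)};
        }
    }
    return best;
}
```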

4 Likes

Legend #4

(or rather a shared #3-#5, given that the number of games in the rerun was laughably small for a game with so much hidden information)

Tracking fish: Monte Carlo simulations of each layer (4 fish per simulation), starting from turn 1; the fish spawn positions are generated just like in the referee. I filter out the simulations that do not agree with radars/scans (with a 100-distance tolerance for imprecision), then keep adding new simulations for 25ms. I ignore collisions between fish in different layers.
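
A compressed sketch of this kind of tracker (not blasterpoard's code); the drift/bounce movement model, the uniform spawn, and the quadrant blip check are simplified stand-ins for the referee logic that a real version would replay exactly:

```cpp
#include <algorithm>
#include <array>
#include <chrono>
#include <random>
#include <vector>

struct Fish { double x, y, vx, vy; };
using Sample = std::array<Fish, 4>;    // one layer holds 4 fish

// Radar blip for one fish: drone position plus left/above flags.
struct Blip { double dx, dy; bool left, above; };

bool consistent(const Fish& f, const Blip& b, double tol = 100.0) {
    bool okX = b.left ? f.x <= b.dx + tol : f.x >= b.dx - tol;
    bool okY = b.above ? f.y <= b.dy + tol : f.y >= b.dy - tol;
    return okX && okY;
}

struct LayerTracker {
    double lo, hi;                     // depth bounds of this layer
    std::vector<Sample> samples;
    std::mt19937 rng{12345};

    Sample randomSample() {            // the real thing would mimic the
        std::uniform_real_distribution<double> px(0, 9999), py(lo, hi);
        std::uniform_real_distribution<double> pv(-200, 200);
        Sample s;                      // referee's spawn exactly
        for (Fish& f : s) f = {px(rng), py(rng), pv(rng), pv(rng)};
        return s;
    }

    void update(const std::array<Blip, 4>& blips,
                std::chrono::steady_clock::time_point deadline) {
        for (Sample& s : samples)      // advance every simulation one turn
            for (Fish& f : s) {
                f.x += f.vx; f.y += f.vy;
                if (f.x < 0 || f.x > 9999) f.vx = -f.vx;
                if (f.y < lo || f.y > hi) f.vy = -f.vy;
            }
        auto refuted = [&](const Sample& s) {
            for (int i = 0; i < 4; ++i)
                if (!consistent(s[i], blips[i])) return true;
            return false;
        };
        samples.erase(std::remove_if(samples.begin(), samples.end(), refuted),
                      samples.end());
        while (std::chrono::steady_clock::now() < deadline) {
            Sample s = randomSample(); // top up within the time budget
            if (!refuted(s)) samples.push_back(s);
        }
    }
};
```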

Movement: I evaluate ~1000 points in a circle with radius 600 around the drone (approximately half of them on the perimeter). The only evaluated things are:

  • distance to the expected position of every fish (the closer, the better; bottom layer fish have triple the weight; also a bit more weight if I’ve already scanned fish of the same color/type)

  • distance to fish that the opponent has not yet scanned (when the fish is closer to the border, it has more weight)

  • pushing a fish that the opponent hasn’t yet scanned towards the border

  • staying above the opponent’s drone that has scanned the same fish/will complete the same color/type bonus

Red fish avoidance: if (goingToGetHit) dont(); I’ve also precalculated, for every d in [501…1640] (1640 being the 500u kill radius + 540u aggressive monster speed + 600u drone speed, beyond which a charging monster can’t reach me in one turn), the size of the angle that a red fish at distance d blocks if it’s targeting me, to make sure I don’t get stuck between 2+ fish after moving.
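
A sketch of how such a table could be precomputed numerically, assuming the monster charges the drone's current position at 540u/turn with a 500u kill radius (the referee's actual targeting is more involved):

```cpp
#include <cmath>
#include <vector>

// Does a heading (angle 0 = straight at the monster) get the drone hit?
bool hit(double d, double heading) {
    // Monster at (d, 0) charging the origin; drone at the origin.
    double px = d, py = 0;
    double vx = -540 - 600 * std::cos(heading);  // relative velocity
    double vy = -600 * std::sin(heading);
    double a = vx * vx + vy * vy;
    double b = 2 * (px * vx + py * vy);
    double c = px * px + py * py - 500.0 * 500.0;
    if (c <= 0) return true;                     // already within 500u
    if (a <= 0) return false;
    double disc = b * b - 4 * a * c;
    if (disc < 0) return false;
    double t = (-b - std::sqrt(disc)) / (2 * a);
    return t >= 0 && t <= 1;                     // contact this turn
}

// blocked[d - 501] = half-angle (radians) around the monster direction
// that is unsafe; the full blocked cone is twice that, by symmetry.
std::vector<double> precomputeBlockedAngles() {
    std::vector<double> blocked;
    for (int d = 501; d <= 1640; ++d) {
        double lo = 0, hi = M_PI;        // heading 0 hits, heading pi is safe
        for (int it = 0; it < 40; ++it) {
            double mid = 0.5 * (lo + hi);
            (hit(d, mid) ? lo : hi) = mid;
        }
        blocked.push_back(hi);
    }
    return blocked;
}
```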

Light: in the early game I use light on hardcoded turns, or, if the opponent might push a fish out of the map, I move towards that fish and hold the light until it’s likely that I hit it (if the opponent’s drone is moving away from it, I assume it’s not going to push it). After the first couple of turns, I use the light if the probability (according to my tracking simulations) of hitting a fish is higher than a*battery+b.

Going to surface: when I’d win by doing it, when I’d lose otherwise, or when I go above y=2500 because of trying to stay above the opponent.

I originally intended to use that tracker for some sort of smitsimax-like algorithm (both players pick a random-ish plan, pick a random fish simulation, play the game out with heuristics, repeat), but the game didn’t motivate me enough to write anything more than a simple evaluation function, so I just kept doing other things while letting psyleague tell me which of my 20 eval changes might be relevant.

13 Likes

Legend #37 Ruby

I should not read PMs before writing mine.
I felt like I just played the game where everybody hacked the game :sweat_smile:.

The contest dates were, like last year, not the best for me, and I could dive in only during the last week.

I started on the first day with a really basic bot that went to Silver:

  • Phase 1: a first dive straight down and back up, sometimes activating the light
  • Phase 2: next dives, one fish at a time using the radar (crab movements!)

The critical point was avoiding monsters. I spent too much time on it, putting off doing the right thing until the last moment, as I did not want to dig into the referee in the first place.
On top of that, I first misread the statement about emergency mode and did not take into account that it can happen “during a turn (so not necessarily at the end of the turn)”. I lost quite a lot of time trying to add some margin to compensate for something I did not understand!

Test 1

I finally grabbed the getCollision method and used it on at most 2821 reachable points against the relevant monsters, minimizing the distance to the destination.

Ok, something was working, but it was not enough: with a lot of monsters, I was too often sandwiched between them.

Test 2

The Silver boss seemed impressively strong. For sure, with my basic bot as a base, it began to avoid monsters, but without scanning fishes.
While debugging monster avoidance, I took advantage of replay analysis to add some quick “if”s for useful actions in particular cases.
For example, to push fishes out of the map during the dive of my first phase.

Test 3

I fixed my sandwich problem by looking a second turn ahead, assuming all the monsters will target me. That could certainly be improved, e.g. by ignoring monsters out of my range, but I did not think of it at the moment.
Because 2821 * 2821 was too big (at least in Ruby), I reduced my set of points to 976 for the next turn and added another 192 points for the second turn.

Test 4

Again, the first play in the IDE (after so many for debugging :)) resulted in a loss against the Silver boss. But my drones stayed alive during the whole match, even with 6 monsters, youhou!
I needed to stop those crab movements. I went for the center of the rectangle like many of you. I did not think to update it from the previous turn, so there was a fresh one each turn :sweat_smile:.

Test 5

After some losses (but also some wins :)) against the boss and new “if”s, I analyzed a game where one of my drones went into emergency mode: a face-to-face with a monster is lethal :head_bandage:.

But I had seen that monster before, and the game was getting more and more interesting now that the mechanics were well understood… just f***, I went to the referee again and grabbed more methods: updateUglySpeeds, snapToUglyZone, updateUglyTarget, getClosestTo.

Now I had to detect:

  • the light of the opponent => easy: battery < battery_prev
  • emergency of the opponent => not that hard. I thought the trick battery == battery_prev was enough, but it was one turn too late. I ended up checking the collisions of the opponent’s drones to not miss a turn (see the sketch below).
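
A sketch of that battery bookkeeping; the constants (light cost 5, recharge 1, max 30) follow the game statement, and the emergency flag is the one-turn-late heuristic mentioned above:

```cpp
// When the opponent's battery drops, the light was on; when it neither
// drops nor recharges while below the max, the drone is likely frozen
// in emergency mode (hence the extra collision check to catch it early).
struct OppDrone {
    int battery = 30;
    bool lightOn = false, emergency = false;

    void observe(int newBattery) {
        lightOn = newBattery < battery;              // paid the 5-point cost
        emergency = !lightOn && newBattery == battery && battery < 30;
        battery = newBattery;
    }
};
```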

The end

And that worked 90% of the time :tada: (wrong simulations could happen when missing a monster, plus I did not use symmetry). I added more “if”s to release scans early and to scan on the way up in phase 1.
I submitted my second arena last Saturday morning, made it past the Silver boss, and surprisingly reached 50th in Gold. So no, that can’t be the end :stuck_out_tongue: (but it should have been, as Gold was my target after the strong Silver boss).

Test 6

Now that monster avoidance was operational (most of the time), it was time to dive more into my “if”s. I added more actions in phase 2, and one in phase 1 that I think worked pretty well: stop surfacing when there are no bonus points and move fish out of the map instead.

Test 7

My bot was pretty consistent, reaching the top 10 of the Gold league every time. Between small bug fixes/improvements last Sunday evening, I searched for a modification that could be done in time, would give me a significant boost, and would not break my astonishing sequence of solid “if”s.

And I found that improvement: bypassing monsters more quickly. I was minimizing the distance for the next turn, causing my drones to first get in front of the monster and only then turn back around it, losing too many turns.

As I was already somehow looking at the second turn, I “just” had to check that distance instead.
To make it work without timeouts, the number of points tested was drastically reduced: from 976 * 192 to 60 * 60.
I reached top 1 in Gold, but with a >1 point difference.


The real end?

Almost. Looking at some replays revealed that moving fishes out of the map no longer worked very well… what? Why?
=> Minimizing the second-turn distance should not be used if the target point is closer than the drone speed… pfiouuu
Top 1 again (above the boss during the run). I went to sleep, as nothing good would have been done. The good surprise when I woke up: luckily pushed into Legend :slight_smile:

Conclusion

My bot consists of two phases, with 8.5 and 4 actions each, where monster avoidance is run to track/update and prevent emergencies. The light is turned on every two turns for some actions of phase 1, and when needed in phase 2.
It was built with 10 arenas, ~200-300 plays in the IDE, and analysis.
And it lacks a lot of features:

  • does not use symmetry
  • does not optimize rectangles
  • does not track fishes
  • no performance work, so it sometimes times out
  • does not take the opponent score and the scoring system into account (in particular for surfacing)
  • does not coordinate both drones well, sometimes resulting in the same action for both
  • was not tested against itself in local play
  • does not remember the previous action, resulting in some really fun replays

Bravo!

Congratulations to the winners. Reaching Legend is one thing, competing with those guys is another :clap:.

Many thanks to Codingame

As always, a great contest. I really enjoyed the game. The stats of total scanned fishes and emergency drones were really cool; why not keep them in the multiplayer section?
I also noticed a change when scrolling the debug information panel (or was it already like that before?): the viewer plays the turn only when reaching the end of that turn’s log. I really appreciated that change for debugging.

See you in winter?

16 Likes

Rank: #259 (201 in Gold)

Bot: Simple eval on fishes (and possible score gain) and swimming there blindly

Strategy: Pick a fish from the deepest habitat that wasn’t scanned yet (my heuristic: the deeper, the better). My other heuristic was about the position of the fish: I assumed the fish would be somewhere in the middle between the drone and the wall (left or right respectively). Later in the contest I added some simple “if”s: when I would lose in the next turn, or could win by stealing bonuses, swim up. Avoiding monsters was mostly copied from the referee, but I handled one simple edge case where my drone could end up inside a monster on the next turn, by simply simulating moves and then checking for the obvious collision.

Challenges: finding the motivation to start coding, not only the bot, but also some viewer/helper tools. After reading other PMs and talking with my colleagues, what was lacking in my solution was a tracker.

Game: This wasn’t exactly my type of a game, but after all I can say I had some fun!

5 Likes

Thanks a lot @blasterpoard for your PM
I don’t really get your point here, can you please explain a bit more?

[image] (this is a reply to the post above, I clicked the wrong reply button)

5 Likes

#2 in Legend.

Thanks organizers for the contest and competitors for high level!

I have a one-turn simulation with samples, iterative improvement with opponent prediction, backed by a pixel-perfect engine. The ingredients are as follows:

  1. Inferer provides information on where fishes and monsters can be and which ones we know exactly. It operates on geometric figures: rectangles, circles, and rounded rectangles. Rectangles come from radar, positive circles from opponent’s scans, and negative ones from not seeing. The inference is performed backwards (to update previous turns based on the current one) and forward (from the past up to now). When an item goes out of sight, it is kept exactly as long as no unknown collision can affect it. All symmetries are taken into account, and the rectangles grow according to the possible speed of the tracked item. For determining if a position is possible, information from the current and past turns is used. Nevertheless, updating the whole thing uses less than 1 ms per turn.

  2. Samples are a mechanism to somehow capture the probability distribution of positions and vectors. I keep a few hundred states. Initialized at random exactly as in the referee, they are fixed whenever they are refuted by inferred knowledge. In that case, a fish or a monster is replaced with a new random one. Since the eval on multiple samples is costly, I sort the current samples by quality (the number and time of fixes) and use only a few of the best ones for evaluation. In this way, the best samples hopefully contain positions and vectors most similar to the real ones.

  3. Eval contains a few components. There are, e.g., win, draw, loss values, score if both players go up now, current scans value, the estimated number of turns to scan and to kill a fish, and a penalty for using a lamp. Going up is simulated up to the end by a greedy selection of moves, avoiding monsters and including scanned fishes on the way. For the estimated numbers of turns to scan or kill, one turn simulation for monsters is used, then distance. Finally, to obtain the eval value, these components are mixed together without any sense. The eval is crappy and inconsistent, duplicating some features in other ways, and it works differently for the opponent just because it was worse otherwise.
    I did not find any real improvement from avoiding monsters other than not picking directions that lead to being eaten. Maybe except for a simple small penalty for having a close monster.

  4. Move search. I check maximal vectors around, and halves of the best ones. The moves are assigned and improved in iterations until the end of time. An iteration consecutively reassigns the action for each of the four drones assuming the others apply the already assigned moves, so there is opponent prediction. Each next iteration uses a larger granularity. The eval is taken as the average eval over a few best samples.

  5. Pixel-perfect engine. The rules of the game are extremely complicated, depending on the exact order of double arithmetic in the referee. Preparing a correct and moderately efficient implementation took the first half of the contest or more. The perfectness does not improve the results much, but it was required to properly debug the exact inference and other mechanisms. As a bonus, I can also safely pass monsters by a pixel.
    The overwhelming number of minor cases and inaccuracies was consistently attacked from all sides until perfection. Definitely the hardest engine I have ever needed. The following are some elements that surprised me and that I had to take care of to achieve perfectness; you can also use them as a test of knowing the rules (two of them are sketched in code after the list).

  • You can slide on the ocean floor by using WAIT, and thus move without scaring fishes. Not useful, of course.
  • Math.round works differently than std::round for negative numbers. So Java rounding must be emulated.
  • A fish at normal speed (200) can change its vector by subsequent normalization and rounding, i.e., when it stops being scared, its vector is normalized to 200; then, in the next turn, it is normalized again with a result differing by one pixel. Hence, fish vectors must always be normalized when not scared.
  • On the contrary, monster vectors must not be normalized every time, but only when the vector length exceeds their normal speed (270).
  • If we pick a far target and normalize the vector from the drone to the target, and then MOVE by this vector, we can end up at a position differing by one pixel, since the normalization of a normalized vector is not the identity.
  • The normalization must be performed exactly as in the referee, so first divide, then multiply by the desired length. The usual way (multiply by the ratio of the desired length to the current one), which is better in terms of efficiency and numerical accuracy, unfortunately gives slightly different results.
  • Monsters can get the zero vector and thus stop in rare cases.
  • The visibility radius uses sharp inequality, whereas the other circles use less or equal.
  • The boundary reflection of a monster is calculated after rounding the speed, whereas that of a fish uses the normalized speed before rounding. This means that a fish can bounce one pixel earlier than it should.
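
To make two of these pitfalls concrete, a small illustration for a C++ engine emulating the Java referee (zero-length vectors must be handled separately, as noted above for stopped monsters):

```cpp
#include <cmath>

// Java's Math.round(double) is floor(x + 0.5), which differs from
// std::round for negative halves: Math.round(-2.5) == -2, but
// std::round(-2.5) == -3. So the Java behavior must be emulated.
long javaRound(double x) {
    return static_cast<long>(std::floor(x + 0.5));
}

struct Vec { double x, y; };

// The referee normalizes as (v / len) * target: first divide, then
// multiply. Callers must handle len == 0 separately.
Vec normalizeLikeReferee(Vec v, double target) {
    double len = std::sqrt(v.x * v.x + v.y * v.y);
    return {v.x / len * target, v.y / len * target};
}

// The usual way (one multiplication by target / len) is cheaper and
// numerically nicer, but can differ by a pixel after rounding, so it
// breaks pixel-perfection.
Vec normalizeUsual(Vec v, double target) {
    double k = target / std::sqrt(v.x * v.x + v.y * v.y);
    return {v.x * k, v.y * k};
}
```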

Finally, I have answers to some questions that arose during the contest:

  • How much does full information about the fishes (i.e., all fishes visible) improve the results?
    MSz-knowing-all-fishes vs MSz: 75.00% ±0.3 (99% CI)
    For weaker earlier versions, the improvement was not that much.

  • How much does full information about the monsters improve the results?
    MSz-knowing-all-monsters vs MSz: 51.51% ±0.4

  • And together?
    MSz-knowing-both vs MSz: 78.84% ±0.3

In these tests, I have not taken into account that the inference and samples become unnecessary, so the bot could have used its time better.

  • What if I had a twice larger time limit?
    MSz-100ms vs MSz: 52.06% ±0.4

Summary

The engine was very difficult, beating my record of complexity so far.

I tried to optimize everything further to increase the number of directions / use more samples / deepen the simulation. However, these did not improve the results much, whereas a deeper simulation would be far too expensive. The engine could not be significantly optimized; there is simply no other way than following the referee’s algorithm. The only notable optimizations are precomputing all 4800 maximal drone vectors and applying the action to a partially computed state (other drones, fishes, and monsters already moved).

So unfortunately, it seemed that tuning the crappy eval was more important, and I could not find any feature that would give me an advantage. I also noted a problem with meta strategies, but it only appeared for early/mid versions, where, e.g., a bot going up earlier wins more often against weaker bots because they do not kill fishes effectively. Later versions were consistently monotonic.

22 Likes

I have seen the strategies above and want to share my solution. I go down tracking the fish closest to the sides, and go back up near the quarter lines (x = 2500 and x = 7500). I use a genetic algorithm with a scoring function with bonuses and maluses.

In my algorithm, for collisions, I keep the points of my population at a distance of at least 1080 from all visible monsters. I also track the last position of each monster for evaluating collisions.

My scoring function is calculated as: -distToObjective + bonus - malus.

I have 3 modes. Mode 0: go down, as explained at the beginning. Mode 1: search for the missing fish. Mode 2: chase fish out of the board.

For the light, I flash on every turn where turn % 5 == 0, and on every turn where turn % 2 == 0 below a depth of 6500. If, thanks to the radar, we can know the position of a fish that is blocked between 2 drones, or between a drone and the side, then I flash.

I finished 699th overall, in the Gold league.

1 Like

:clap: Congratulations to @MSz, @reCurse and @zasmu for the podium :+1:.
→ you provided us with beautiful replays to study :mag:, and inspiration to learn from / imitate :thinking:!