Mean Max (CC01) - Feedback & Strategies

I’m a bit disappointed by this first community challenge, and my final ranking may be my worst since I started playing on CG (639th / 2512)

I didn’t like :

  • The rules (especially since they were only in French at the beginning of the challenge).
  • The similarity with CoTC (that was a bad challenge too !)
  • The “nearest wreck” strategy allowing to reach Silver with no effort
  • The useless speed parameter
  • The confusing viewer

I’m also disappointed by my own failure:
I didn’t find a way to improve my simple strategy. I tested many things, with very poor impact on my ranking. Observing replays for hours didn’t help me understand what was wrong :frowning:

That was not very fun this time…

What similarity ?

Collect something before someone else on a map.

pb4 said in this thread that he reused most of his code from cotc…

41st - Java - TIMEOUT! - 30ms…

Thanks and congratulations to Magus, Agade, pb4 and reCurse for the really interesting theme, and thanks also for the referee, which helped us manage collisions!

FEELINGS

This contest could have been really fun if I hadn’t wasted more than half of my time optimising memory consumption, using Java Mission Control to profile it, as I often got timeouts with no error. I suspected the garbage collector was doing major or full GCs, so I spent a lot of time caching objects to avoid deallocations, and removing streams and foreach loops, which create objects (like iterators) that get garbage-collected later on. I also put in some counters to secure my loops, to be sure I had no infinite loops. IN VAIN…

Sunday night I resigned myself to setting my timeout to 30 ms, whereas in other contests I use 44 ms. It took me so much time, because I persevered too long, that I couldn’t spend any time optimising the rest of my code, like the collision engine. Too bad…

Strategy

Wood to Silver

Simple heuristic

  • Reaper move to the closest wreck
  • Destroyer move to the closest tanker

To compute the target and power I just did some vector math, like this:

MoveAction createMoveActionStopOnTarget(Unit source, Point target) {
  // Desired speed for next turn: reach the target while cancelling the current velocity.
  Point newSpeed = new Point(target.x - source.x - source.vx, target.y - source.y - source.vy);
  return new MoveAction((int) (newSpeed.x + source.x), (int) (newSpeed.y + source.y),
      (int) Math.min(newSpeed.magnitude() / source.mass, MAX_THRUST));
}

It works pretty well for stopping the reaper inside a wreck, as long as it doesn’t have too much speed.
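As a concrete, self-contained illustration of that vector math (a hedged sketch: `StopOnTarget`, `thrustTowards` and the MAX_THRUST value are hypothetical stand-ins for the author’s classes):

```java
// Hypothetical, simplified stand-ins for the author's Unit/Point/MoveAction classes.
public class StopOnTarget {
    static final double MAX_THRUST = 300;

    // Desired new velocity is (target - position - currentVelocity):
    // after applying it, position + velocity lands exactly on the target.
    static double[] thrustTowards(double x, double y, double vx, double vy,
                                  double tx, double ty, double mass) {
        double nvx = tx - x - vx;
        double nvy = ty - y - vy;
        double magnitude = Math.sqrt(nvx * nvx + nvy * nvy);
        double throttle = Math.min(magnitude / mass, MAX_THRUST);
        // Returned as {destX, destY, throttle}, mirroring MoveAction's fields.
        return new double[] { nvx + x, nvy + y, throttle };
    }

    public static void main(String[] args) {
        // Reaper (mass 0.5) at the origin, moving (100, 0), wreck at (400, 300).
        double[] move = thrustTowards(0, 0, 100, 0, 400, 300, 0.5);
        System.out.printf("MOVE %.0f %.0f %.0f%n", move[0], move[1], move[2]);
    }
}
```

In this example the desired speed is (300, 300), whose magnitude divided by the reaper’s mass exceeds the cap, so the throttle clamps to MAX_THRUST.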

Silver to Gold

Add doof and skills

  • Move the doof to the reaper of a player with a greater score than mine (not necessarily the best player, but the one I can most likely beat)
  • Use the doof skill on wrecks where there will be 1 or more enemy reapers at the end of the turn, excluding wrecks holding my reaper
  • Use the destroyer skill on my reaper if I will be on a wreck in 0 or 1 turns with enemy reapers in the skill’s range, to push them aside
  • The reaper’s wreck target takes distance and water into account. I also manage to target overlapping wrecks to earn more than one point per turn
  • The destroyer’s tanker target takes distance and water into account
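The doof-oil rule above could be sketched roughly like this (a hedged, simplified version: `OilTargeting`, `Mover` and `Wreck` are illustrative stand-ins, and positions are extrapolated one turn ahead with no collisions):

```java
import java.util.List;

// Hedged sketch: pick a wreck that at least one enemy reaper (position
// extrapolated one turn as pos + velocity, ignoring collisions) will sit on,
// but that my own reaper will not.
public class OilTargeting {
    record Mover(double x, double y, double vx, double vy) {
        boolean endsTurnOn(double cx, double cy, double radius) {
            double dx = x + vx - cx, dy = y + vy - cy;
            return dx * dx + dy * dy <= radius * radius;
        }
    }
    record Wreck(double x, double y, double radius) {}

    static int pickOilTarget(List<Wreck> wrecks, List<Mover> enemyReapers, Mover myReaper) {
        for (int i = 0; i < wrecks.size(); i++) {
            Wreck w = wrecks.get(i);
            if (myReaper.endsTurnOn(w.x(), w.y(), w.radius())) continue; // never oil myself
            for (Mover e : enemyReapers) {
                if (e.endsTurnOn(w.x(), w.y(), w.radius())) return i;
            }
        }
        return -1; // no worthwhile target this turn
    }

    public static void main(String[] args) {
        List<Wreck> wrecks = List.of(new Wreck(0, 0, 400), new Wreck(3000, 0, 400));
        Mover enemy = new Mover(2800, 0, 100, 0); // will end inside wreck 1
        Mover me = new Mover(-100, 0, 50, 0);     // will end inside wreck 0
        System.out.println(pickOilTarget(wrecks, List.of(enemy), me)); // picks wreck 1
    }
}
```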

Gold to Legend
Hell on earth! Timeout power!!!
I use the simulation engine from the referee, coupled with my previous heuristics and an evaluation function, to search at depth 3.

First, I compute some possible good moves at depth 1 for my reaper, doof and destroyer.

Then I test all move combinations using the simulation engine at depth 3, using a dummy AI for the enemies and for my last 2 moves, with simple rules like:

  • the reaper goes to the closest wreck with the most water
  • the destroyer goes to the closest tanker with the most water
  • the doof goes to the reaper of the player with a score higher than mine (or lower, if I’m ranked 1st)

After that, during the remaining time, I play random moves mixed with the previous ones and evaluate them. I do not play random skills, as that was not really efficient, so I prefer to compute skills using my heuristics.

Finally, I take the best moves and play them.
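The overall search loop (heuristic candidates first, then random moves mixed with the best found so far) might be skeletonized like this; the `Engine` interface and toy scoring are assumptions, not the author’s code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Hedged skeleton of the search described above. Engine and the move encoding
// are placeholders; here the "simulation" just scores a list of move genes.
public class RandomSearch {
    interface Engine { double playAndEvaluate(List<Integer> myMoves); }

    static List<Integer> search(Engine engine, List<List<Integer>> candidateFirstMoves,
                                long budgetMillis, Random rng) {
        long deadline = System.currentTimeMillis() + budgetMillis;
        List<Integer> best = null;
        double bestScore = Double.NEGATIVE_INFINITY;
        // Phase 1: exhaustively try the heuristic candidates.
        for (List<Integer> moves : candidateFirstMoves) {
            double s = engine.playAndEvaluate(moves);
            if (s > bestScore) { bestScore = s; best = new ArrayList<>(moves); }
        }
        // Phase 2: random mutations of the best move list until time runs out.
        while (System.currentTimeMillis() < deadline) {
            List<Integer> mutated = new ArrayList<>(best);
            mutated.set(rng.nextInt(mutated.size()), rng.nextInt(100));
            double s = engine.playAndEvaluate(mutated);
            if (s > bestScore) { bestScore = s; best = mutated; }
        }
        return best;
    }

    public static void main(String[] args) {
        // Toy engine: score is the sum of the moves, so the search pushes values up.
        Engine toy = moves -> moves.stream().mapToInt(Integer::intValue).sum();
        List<Integer> best = search(toy, List.of(List.of(1, 2, 3)), 20, new Random(42));
        System.out.println(best);
    }
}
```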

The evaluation function is similar to the ones previously described by Agade and other players. To reach Legend I had to avoid destroying tankers when my reaper is not close to them, so I adjusted the evaluation function accordingly, and also my heuristic, to compute a move close to the next tanker’s position without destroying it.

What did not work
Timeout, timeout, timeout…
I think there is a problem with the garbage collector parameters used to launch the Java player, perhaps since the language upgrade occurred. I hope this problem will be corrected soon!

5 Likes

I had timeouts in Java with a 20 ms limit too, so I switched to C++ at the end of the contest, even though I don’t know it :stuck_out_tongue:

3 Likes

Finished #64, C++ - GA

Intro
First off, thanks a lot to the 4 creators and to CodinGame for hosting such fun and interesting contests (as always).

I’m used to doing spaghetti ifs in C# with my best friend Linq, but I wanted to try what the pros do: C++ and simulation.
This contest was perfect for that: I don’t think I would have had fun with C# heuristics, compared to something like GitC, where heuristics were way easier because there were no collisions and many more “units”.

I had zero previous knowledge of sim/GA (I’ve never played CSB, for example), so it was quite long and hard for me to set up a working simulation + GA in C++ within 10 days (I’ve got a job, I’m not living alone, and I had barely done any C++ in years).
So when it started working, I had a blast (not directly commanding through ifs, and seeing units make smart moves, is awesome). For this, I’d like to thank ZarthaxX for his help with GA, and others on the chat.

Strategy
My strategy was quite simple, my eval function using common parameters already mentioned several times before, so I’ll keep it quick:

  • Depth 3
  • Additional weight for each depth (same ratio between depths)
  • A function determining which opponent I target (#2 if I’m #1 or #3, #1 if I’m #2), used in the evaluation score and below.
  • GA run for 2–4 ms on opponents to avoid using dummy moves (the 4 ms being for the opponent I target, and done against the moves found for the less important opponent)
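The opponent-targeting rule (#2 if I’m #1 or #3, #1 if I’m #2) could look something like this minimal sketch (class and method names are hypothetical):

```java
import java.util.Arrays;

// Hedged sketch of the "who do I target" rule: target the #2 player when I'm
// ranked #1 or #3, otherwise (I'm #2) target the #1 player.
public class TargetPicker {
    // scores[me] is my score; returns the player index of the opponent to focus.
    static int pickTarget(int me, int[] scores) {
        Integer[] order = {0, 1, 2};
        Arrays.sort(order, (a, b) -> scores[b] - scores[a]); // best first
        int myRank = Arrays.asList(order).indexOf(me);       // 0 means ranked #1
        return (myRank == 1) ? order[0] : order[1];          // #1 if I'm #2, else #2
    }

    public static void main(String[] args) {
        System.out.println(pickTarget(0, new int[]{50, 40, 30})); // I'm #1 -> target #2
        System.out.println(pickTarget(0, new int[]{40, 50, 30})); // I'm #2 -> target #1
    }
}
```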

I disabled the TAR spell since my reaper was randomly putting it on the ground. Maybe it was useful (I like to tell myself this xD), but I couldn’t tell by watching replays,
and it was eating a lot of rage I preferred to spend on avoiding enemies to collect water. Sometimes it would use grenades as well; I didn’t have the time to really validate that this was useful (let’s say it was).

The main issue I faced was only getting my sim + GA working less than 24h before the end of the contest, so there was no time to set up a proper environment to tweak parameters and adjust them correctly (+ no experience with that).
Plus, during the last 24h, A LOT of players did A LOT of submits, which slowed manual submits/tests even more.

What I think of the design (for Magus who asked)

  • First, I found it quite “messy”, especially when I had a look at games in Bronze with full rules. I found it quite hard to explain to random people (like my girlfriend), compared to something like CotC.
    But after a while I got used to it.

  • I think it was not so easy for a noob (I found it not so easy just to solve the destination issue, using velocity and so on).
    My brother, who is not really good at coding, liked it though… until he quit quite early because he couldn’t improve his AI. His ranking is around #1050.

  • The 1v1v1 format is… special; it makes it even harder to tweak parameters and make sure your AI improves. But hey, it’s the same for everyone, it’s quite fun to watch, and it adds an extra layer of strategy. So I’d say it’s a +.

  • The design looks like CSB: collisions, angles, speed, a small number of vehicles… (at least in my eyes, but once again, I haven’t really done anything on the CSB contest, so I might have missed huge differences).
    IMO it makes people who worked a lot on CSB (and AFAIK, it’s the “main” multi played outside contest periods) quite good right away (but hey, they worked for it, right? :p).
    I’d say I prefer contests where it’s less obvious how to handle things (like C4L or GitC, where you can’t just plug in a global GA).

Conclusion
Anyway, I’m really glad I tried hard on this one, learning way more than in any previous contest I’ve participated in (even those where my final ranking was higher).

Congratz to everyone who learned something during this contest!

I really hope those contests will still occur on a regular basis.

Oh, and one last thing: thank you reCurse for leaving me the #1 Canada spot. :smiley: (DN38416 / Chalzy were close!)

7 Likes

One of the best contests on CodinGame. Thanks for organizing :slight_smile: We want more contests from you guys.

I used C++, with a random search and an evaluation function (I don’t think mutation or crossover would give better results :P). I spent more time fixing my bugs and optimizing runtime and memory than actually improving my evaluation function. So I’ve learnt a lot of things, especially about memory management, pointers, upcasting, downcasting, and the factory and state design patterns.

The first thing I noticed in this contest was that I didn’t have to track anything globally, because I get all the required information every turn. The referee code was shared in the Wood 3 league, which was very useful for me because it helped me build a nice data structure that I would need in the higher leagues.

Enemy score and mana information also gives freedom to play it safe or aggressive or hold the rage for late game and try various such strategies.

The hardest part for me in this contest was to predict opponents’ doof and destroyer movements.

Below is the class structure I used for the contest.

3 Likes

Silver - 450/2512

Best contest ever in my opinion, got to learn so much from it.

Wood to Bronze:

  • Used dumb code that had a major bug, but it helped me get to the top of Bronze pretty quickly.

Bronze to Silver:

  • Same code; found the bug from the initial code: I was using Unit_ID instead of Player ID. FacePalm.
  • After getting to Silver, worked on the distance ratio between the enemy Reapers and my Reaper.
  • The Destroyer would target tankers only after they had filled up (worked 90% of the time); otherwise it stays in Water Town.
  • The Doof targets the closest enemy reaper (well, after reading others’ reactions and strategies, it was not the smartest move; I could have used Enemy_score to evaluate which enemy reaper to target).
  • Finally, my Reaper only targets pools of water that have more than 1 water.

This helped me get to top of Silver but ran out of time to improve my code further. Waiting for the Multiplayer to open soon.

Lastly, A BIG THANK YOU TO THE CREATORS… Awesome job, guys!! I was among those who thought you could have waited until the multiplayer opened to submit your codes, but at the end of the day none of you put in that extra effort to claim those top 3 spots. Respect!

3 Likes

Thanks for the contest, guys; it was even better than most contests created by the CG team.
Some comments:

  • Thanks for providing all game state in the inputs.
  • I’d say the number of units was too big. With all 9 units plus a bunch of tankers and skill effects on the map, it was hard to be a spectator. I was only able to understand what was going on by going frame by frame.
  • The instant change of direction made vehicle movements not smooth; it was not as pleasant to watch as CSB.
  • I was around 60th at the end but was not able to use the Reaper skill in a way I could profit from. I guess a lot of people did not use it.
    Overall: good job guys! Thanks a lot.
    In case you need any suggestions for the next contest: I’d like to see something more smooth and aesthetically pleasing :slight_smile:
1 Like

Hello all,
I didn’t find any topic on the forum about this, so I will post it here, as Razielwar pointed it out.

I also have a lot of timeouts and, as he said, it is probably due to the Java upgrade.
I was looking at previous multi games and saw that I had plenty of matches lost by timeout: FB and CSB.
I have seen that BeardedWhale (using Java too) was also having the issue on FB, which truly makes me think that it is due to the Java upgrade.

Is this issue known? Can we no longer use System.currentTimeMillis()?

2 Likes

This thread is discussing specifically how to measure time in Java:

For MM, I had to use a 15 ms margin, which is enormous when we only have 50 ms. @nmahoude as well, I believe.
And it’s not as if Java weren’t already well behind C/C++ performance…

2 Likes

Yeah, good point: no additional burden (coding/execution time) to try to estimate opponents’ scores / mana.
The only thing you have to track globally is the round number; I guess that’s OK :slight_smile:

1 Like

Thanks for the contest, it was truly interesting!

My current solution is only a heuristic (posted on Tuesday), and that’s what was incredible about this contest: with a good heuristic you could have a truly good solution.

  • Don’t target tankers with 1 water that are far from the center (3500)
  • Find the best nearest wreck (dist - nbWater*500) that I am the nearest to (don’t go if my opponent will take it first)
  • The Destroyer goes to the tanker nearest to the targeted wreck
  • If there is no wreck, the Reaper goes to the same tanker as the Destroyer
  • If there is no tanker, the Reaper and Destroyer go to the center by default
  • The Doof focuses the opponent reaper with the nearest score, and launches oil when possible and when that reaper is on water :wink:
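The first two rules might be sketched as follows (a hedged illustration: `WreckChoice` and its helpers are invented names, while `dist - nbWater*500` is taken directly from the post):

```java
import java.util.List;

// Hedged sketch of the wreck choice above: score = distance - water * 500,
// and a wreck is skipped when an opponent reaper is closer to it than I am.
public class WreckChoice {
    record Wreck(double x, double y, int water) {}

    static double dist(double x1, double y1, double x2, double y2) {
        return Math.hypot(x2 - x1, y2 - y1);
    }

    static int bestWreck(double myX, double myY, List<double[]> oppReapers, List<Wreck> wrecks) {
        int best = -1;
        double bestScore = Double.POSITIVE_INFINITY;
        for (int i = 0; i < wrecks.size(); i++) {
            Wreck w = wrecks.get(i);
            double myDist = dist(myX, myY, w.x(), w.y());
            boolean oppCloser = oppReapers.stream()
                .anyMatch(o -> dist(o[0], o[1], w.x(), w.y()) < myDist);
            if (oppCloser) continue; // the opponent would take it first
            double score = myDist - w.water() * 500;
            if (score < bestScore) { bestScore = score; best = i; }
        }
        return best;
    }

    public static void main(String[] args) {
        List<Wreck> wrecks = List.of(new Wreck(1000, 0, 1), new Wreck(2000, 0, 4));
        // The opponent guards the far, fat wreck, so we take the near, thin one.
        System.out.println(bestWreck(0, 0, List.of(new double[]{3000, 0}), wrecks));
    }
}
```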

I was trying to implement a GA but I didn’t have the time to finish the code engine :cry:.

1v1v1 was interesting, as we had to choose whether it was better to secure second place or try for first.
50 ms is interesting for keeping competition between GAs and heuristics (and matches and submits are faster, too…)
The rules were hard to understand (even more so with the French rules at the beginning). It became clearer after the rules updates.
The referee is truly a must, as it allows answering some questions easily just by looking at it =).
The circular arena made computations easier; it was a good idea.
Maybe the contest was a little too hard for contest beginners.

A big thanks to CG and our 4 incredible CGamers who created this :wink:

See you for the next challenge :wink:

3 Likes

98/2512


I tried minimax last time with WW and ended up spending too much time dealing with timeouts.
So I decided to go heuristics all the way this time.

My reaper follows this strategy:

  1. Score all wrecks

  2. Find the best wreck

  3. Go to the wreck

  4. If you are in the wreck and someone is trying to push you out from the wreck, fight back.

  5. If you are in the wreck and have some free time, check if there is any skill possibility.

  6. If no move yet, find tanker with the highest score.

The other 2 of my vehicles follow this pattern:

  1. If an enemy reaper will hit you next round:
    hit it back with full power.

  2. If blocking the way of my own reaper, go away.

  3. If you can reach an enemy reaper’s target faster than they do, rush to the target.

  4. Check if there are places for the skill.

  5. If no action had been decided yet, go to the nearest enemy reaper.
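Both vehicles follow a “first matching rule wins” pattern; here is a minimal, generic sketch of that structure (the rule contents are placeholders, not the author’s exact conditions):

```java
import java.util.List;
import java.util.Optional;
import java.util.function.Supplier;

// Hedged sketch of the decision pattern above: rules are tried in priority
// order and the first one that produces an action wins; otherwise a fallback.
public class PriorityChain {
    static String decide(List<Supplier<Optional<String>>> rules, String fallback) {
        for (Supplier<Optional<String>> rule : rules) {
            Optional<String> action = rule.get();
            if (action.isPresent()) return action.get();
        }
        return fallback;
    }

    public static void main(String[] args) {
        boolean enemyWillHitMe = false;
        boolean blockingOwnReaper = true;
        String action = decide(List.of(
            () -> enemyWillHitMe ? Optional.of("HIT BACK FULL POWER") : Optional.empty(),
            () -> blockingOwnReaper ? Optional.of("MOVE AWAY") : Optional.empty()
        ), "GO TO NEAREST ENEMY REAPER");
        System.out.println(action); // the second rule fires here
    }
}
```

This keeps each heuristic small and testable, and the priority order is explicit in the list.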


Thanks for CG and the 4 Gamers for making this possible.

Best regards

6 Likes

Thanks for the contest. I liked the game quite a lot but I found it a bit too chaotic with so many units. The circular map with 1v1v1 is a very good idea IMHO.

I finished 16th.

For the search algorithm, I used a GA (reused from CSB). My genes were a type (move/skill), an angle, and a power/distance for each of my looters. I hesitated on the depth; my last submit used 4 turns, but I am not sure that it is better than 2 or 3 (submit times were very long with players improving their bots).

For evolution, I used random mutations (40% of the time), uniform crossover (40%), and 2-point crossover (20%). I observed that a large probability of mutation worked best for me. I did not try reducing the range of mutation over time, but that is something I will try.

My population size was 10. I tried to change it, but there were no obvious improvements. At the start, I used my dummy bots (used to simulate my opponents) to create one solution, reused the solution found at the previous turn (shifted by one), and used random solutions for the rest. At each generation, I kept the best two candidates from the previous generation, introduced 2 random ones, and mutated/crossed the rest.
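Putting those numbers together, one generation of such a GA might look like this hedged sketch (genome layout, breeding pool and helper names are assumptions, not the author’s code):

```java
import java.util.Arrays;
import java.util.Random;

// Hedged sketch of one GA generation as described: keep the 2 best, inject 2
// random candidates, and fill the rest via mutation (40%), uniform crossover
// (40%) or 2-point crossover (20%). Genomes here are plain double arrays.
public class GaGeneration {
    static final int POP = 10, GENES = 6;

    static double[][] nextGeneration(double[][] pop, double[] fitness, Random rng) {
        Integer[] order = new Integer[POP];
        for (int i = 0; i < POP; i++) order[i] = i;
        Arrays.sort(order, (a, b) -> Double.compare(fitness[b], fitness[a])); // best first
        double[][] next = new double[POP][];
        next[0] = pop[order[0]].clone(); // elitism: keep the two best
        next[1] = pop[order[1]].clone();
        next[2] = randomGenome(rng);     // fresh blood
        next[3] = randomGenome(rng);
        for (int i = 4; i < POP; i++) {
            double[] a = pop[order[rng.nextInt(3)]]; // breed from top candidates
            double[] b = pop[order[rng.nextInt(3)]];
            double roll = rng.nextDouble();
            if (roll < 0.4) next[i] = mutate(a, rng);
            else if (roll < 0.8) next[i] = uniformCross(a, b, rng);
            else next[i] = twoPointCross(a, b, rng);
        }
        return next;
    }

    static double[] randomGenome(Random rng) {
        double[] g = new double[GENES];
        for (int i = 0; i < GENES; i++) g[i] = rng.nextDouble();
        return g;
    }
    static double[] mutate(double[] a, Random rng) {
        double[] g = a.clone();
        g[rng.nextInt(GENES)] = rng.nextDouble(); // re-roll one gene
        return g;
    }
    static double[] uniformCross(double[] a, double[] b, Random rng) {
        double[] g = new double[GENES];
        for (int i = 0; i < GENES; i++) g[i] = rng.nextBoolean() ? a[i] : b[i];
        return g;
    }
    static double[] twoPointCross(double[] a, double[] b, Random rng) {
        int p1 = rng.nextInt(GENES), p2 = rng.nextInt(GENES);
        int lo = Math.min(p1, p2), hi = Math.max(p1, p2);
        double[] g = a.clone();
        for (int i = lo; i < hi; i++) g[i] = b[i]; // splice b's middle segment
        return g;
    }

    public static void main(String[] args) {
        Random rng = new Random(7);
        double[][] pop = new double[POP][];
        double[] fit = new double[POP];
        for (int i = 0; i < POP; i++) { pop[i] = randomGenome(rng); fit[i] = i; }
        double[][] next = nextGeneration(pop, fit, rng);
        // Elites survive: the best genome (index 9) is copied to slot 0.
        System.out.println(Arrays.equals(next[0], pop[9]));
    }
}
```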

My first breakthrough in the quality of my AI was introducing dummies (used to model opponents and to initialize my candidates).

For my dummies, I used:

  • Reaper: go to the nearest wreck that is not covered by an oil skill. The speed depends on the distance and the current motion vector. Checking for oil skills improved my AI significantly.
  • Destroyer: go at full speed to the tanker nearest to the reaper’s target. I tried ignoring wrecks covered by tar, but it did not seem to help.
  • Doof: go to the reaper of one of the adversaries. I settled on the highest-scoring player, but I tried various selections.

I think that using my dummies for the destroyer/doof helped a lot against some players but did the opposite against others (either I was losing a lot at the beginning of the submit, or at the end of it). I saw in one PM that it is possible to adapt to the opponent’s behavior at run time; I will definitely try that :slight_smile:

For my eval, I used:

  • my score
  • my rage
  • the distance of my reaper to the best wreck not covered by oil. “Best” is: distance to the reaper weighted by a value that depends on its water and on the
    water of the other wrecks and their distances.
  • the distance of my destroyer to the best tanker not covered by tar. “Best” is the same as for wrecks, but for tankers.
  • the distance of my doof to the targeted reaper (see the doof dummy)
  • the scores of my opponents, with different coefficients depending on the ranks of the opponents.

What I realized after a long time, and what gave me a good boost, is that the coefficient applied to the scores (and probably rage) needs to be very large compared to the rest. My interpretation is that we are only interested in the highest score; the rest is only there to select promising candidates for the evolution.

I used my eval at each turn of my simulations and applied an attenuation factor of 0.9. Using it only to evaluate the terminal position was not working for me, but I heard it worked well for other AIs. Dunno why (maybe better evals/dummies?).
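The per-turn attenuation can be illustrated concretely; this sketch assumes the eval for each simulated turn has already been computed and simply sums the values with a 0.9 decay (class and method names are hypothetical):

```java
// Hedged sketch of summing the eval over each simulated turn with a 0.9
// attenuation factor, rather than only scoring the terminal position.
public class DiscountedEval {
    static double score(double[] perTurnEvals, double attenuation) {
        double total = 0, weight = 1;
        for (double e : perTurnEvals) {
            total += weight * e;
            weight *= attenuation; // later turns count less
        }
        return total;
    }

    public static void main(String[] args) {
        // The same gain is worth more on turn 1 than on turn 3 (10 vs ~8.1).
        System.out.println(score(new double[]{10, 0, 0}, 0.9));
        System.out.println(score(new double[]{0, 0, 10}, 0.9));
    }
}
```

This rewards candidates that grab value early while still giving later turns some say, which matches the intuition that deep predictions are less reliable.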

I tried using CG brutal tester to check my AI against previous versions, but it did not help me this time. A new AI performing worse locally was sometimes a lot better when submitted to CG, and vice versa. I don’t know why.

I also tried using my GA to search for a solution for the opponents (against WAIT for me, or against a solution found in a short time) and then searching for a solution for me against it, but it did not help. I did not try very long on this idea.

If you have any questions, don’t hesitate to ask.

10 Likes

Because your dummy is very good at predicting your own AI. Try removing the dummy when you are testing against yourself.

1 Like

Mean Physics

Great job to the creators of the first (of many I hope) community contests! :smiley: Really well executed contest with interesting mechanics.

Couldn’t spend as much time on this contest as before (on vacation most of the week :stuck_out_tongue:) and only had the weekends to work on my bot, resulting in my abysmal ranking haha :upside_down: It was a great platform for some physics revision though \o/

After the previous contest, I figured out my bot was actually doing something pretty far from its design. So this time round, I decided to devote the first weekend to writing a proper visual tool to debug my algos.

Turtle

Used Python’s default turtle graphics library (built on tkinter) to draw each vehicle’s state, which made life much easier when breaking down the actions my vehicles took each turn.

Perhaps something like having two simple options to draw circles and lines on the debug-mode visual screen would be possible for the current game engine? It could be an additional regex pattern that accepts input like

"DEBUG [-circle | -line] [-pos] [-dist]..." 

in the referee, so our bots could have a visual ‘output’, especially for physics-heavy games like this, where it’s pretty hard to imagine the vectors and it’s a lot easier to have visual feedback on what your bot is doing.

Navigation

Most of my programming time was spent implementing different navigation strategies for my Reaper, which I focused on thinking it would be the most important factor in a winning strategy:

  1. Collision Avoidance

    • The nearest obstacle intersecting the line from vehicle to target exerts a force on the vehicle to push it away from a collision.
    • Got my bot stuck many times, alternating between diverting left and right as the obstacle moved…
  2. Bug Pathing

  3. Gravity Pathing (For Destroyer)

    • Each body within a certain effective range exerts a ‘gravitational force’ on my Destroyer: empty Tankers repel, full ones attract, enemy Reapers attract, my own Reaper repels, etc.
    • Sum all the forces to get the resultant vector (the current turn’s thrust)
    • Worked somewhat, but needed a lot more tuning of how the state of each vehicle changes its force on the Destroyer.
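A minimal sketch of such force summation (with invented names and an assumed inverse-square falloff; the post does not specify the actual falloff or weights):

```java
import java.util.List;

// Hedged sketch of the "gravity" steering above: every nearby body exerts an
// attracting (positive weight) or repelling (negative weight) force that
// falls off with distance; the thrust vector is the sum of all forces.
public class GravitySteering {
    record Body(double x, double y, double weight) {} // weight < 0 repels

    static double[] resultantForce(double x, double y, List<Body> bodies, double range) {
        double fx = 0, fy = 0;
        for (Body b : bodies) {
            double dx = b.x() - x, dy = b.y() - y;
            double d = Math.hypot(dx, dy);
            if (d < 1e-6 || d > range) continue; // out of effective range
            double f = b.weight() / (d * d);     // assumed inverse-square falloff
            fx += f * dx / d;                    // project onto the unit direction
            fy += f * dy / d;
        }
        return new double[] { fx, fy };
    }

    public static void main(String[] args) {
        // A full tanker ahead attracts, an empty one behind repels:
        List<Body> bodies = List.of(new Body(1000, 0, 1e6), new Body(-500, 0, -1e6));
        double[] f = resultantForce(0, 0, bodies, 5000);
        System.out.printf("%.2f %.2f%n", f[0], f[1]); // both effects push along +x
    }
}
```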

In the end, these pathing strategies were not as important to the bot’s overall performance as I expected. It became clear in the Gold league that the complexity introduced by the many collisions and skills quickly made such an approach untenable. Even a strong navigation algo (one that goes to wrecks fast and manages to brake in time to stop) won’t ensure your bot wins: many times the optimal path isn’t followed, due to enemy collisions, grenades and oils messing up your prediction.

Deciding Wrecks

Moving from distance-based selection to time-based selection (simulating the Reaper’s pathing with no collisions and sorting by the number of turns instead of the distance to the wreck) was sufficient to go from Silver to Gold; thereafter I stalled :confused:

Tried improving the selection of wrecks by taking into account vehicles present on and around wrecks (hence blocking them), as well as nearby doofs with the potential to block with oil. Neither of these strategies gave much improvement to the bot’s overall performance; instead they ended up lowering my ranking.

One thing that improved my bot’s performance slightly was the addition of ‘simulated’ wrecks. I scanned through the current wrecks and added a wreck at the intersection of each pair of overlapping wrecks, holding the sum of the overlapping wrecks’ water. This let my Reaper treat the intersection as just another wreck to consider in its selection algo, allowing it to collect from multiple wrecks at once.
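A hedged sketch of that idea (using the midpoint of the two centers as a simplification of the actual intersection point, and invented class names):

```java
import java.util.ArrayList;
import java.util.List;

// Hedged sketch of the 'simulated wreck' idea: for each pair of overlapping
// wrecks, add a virtual wreck between them holding the combined water, so the
// selection algo can target both at once. The midpoint is a simplification;
// the author placed the virtual wreck at the actual intersection.
public class VirtualWrecks {
    record Wreck(double x, double y, double radius, int water) {}

    static List<Wreck> withVirtualWrecks(List<Wreck> wrecks) {
        List<Wreck> all = new ArrayList<>(wrecks);
        for (int i = 0; i < wrecks.size(); i++) {
            for (int j = i + 1; j < wrecks.size(); j++) {
                Wreck a = wrecks.get(i), b = wrecks.get(j);
                double d = Math.hypot(b.x() - a.x(), b.y() - a.y());
                if (d < a.radius() + b.radius()) { // the discs overlap
                    all.add(new Wreck((a.x() + b.x()) / 2, (a.y() + b.y()) / 2,
                                      Math.min(a.radius(), b.radius()),
                                      a.water() + b.water()));
                }
            }
        }
        return all;
    }

    public static void main(String[] args) {
        List<Wreck> wrecks = List.of(new Wreck(0, 0, 400, 2), new Wreck(500, 0, 400, 3));
        List<Wreck> all = withVirtualWrecks(wrecks);
        System.out.println(all.size() + " wrecks, last holds " + all.get(all.size() - 1).water());
    }
}
```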

Deciding Oil Spills

I only added heuristics for using skills on the last day of the contest. Oil target selection was done by running my Reaper’s wreck-selection algo for each enemy and making my Doof go towards this target (oiling it when within range).

I hadn’t gotten to tars/grenades before the contest time ran out :frowning:

Multiplayer!

Navigation turned out to still be slightly buggy (hurhur), and my Destroyer simply spams grenades at wrecks within range that have nearby enemies (very sophisticated heuristics). The Doof also isn’t aggressive enough towards enemy Reapers in denying them from collecting water on wrecks.

There’s still quite some work cut out for me to make my bot competitive enough to be promoted into Legend.

Game Design

One thing I loved in this game is the 1v1v1 mechanic; it is the first 3-player multiplayer contest I’ve participated in (GitC does seem suitable to be extended to 3 players). Too much complexity, however, resulted in the demise of heuristic bots in this contest :confused:.

The state of the game cannot be easily predicted without running costly simulations, which limits the number of ‘metrics’ a heuristic bot can take into account. Overall, the game isn’t suited for heuristics to prevail in the higher leagues.

Still, this is an AI-programming competition, and I think this just pushes me to properly implement search-based techniques (which are more widely applicable) instead of relying on heuristics.

:thumbsup: Thanks again Agade, pb4, Magus, reCurse for the amazing contest! Can’t wait to see what comes of the next one :slight_smile:

P.S. Some blatant advertising :stuck_out_tongue:

The annual Battlecode competition will be running in January next year. It’s an RTS-style bot programming tournament (think programming a StarCraft II AI player). Feel free to join me while we wait for the next contest :smiley: hehe.

9 Likes

Hello,

Hats off to Agade, pb04, Magus and reCurse, I believe Mean Max set the bar high.

What I liked:

  • 3rd player (1v1v1) adds more options, more uncertainty, more challenge
  • Full information - somehow this felt like a good thing to have, it fit the contest.
  • Always being P0 - it allows you to focus on how you play, instead of realizing midway through the match that yellow is the opponent :slight_smile: Who can say it has never happened to them?
  • Creators in the leaderboards
  • It felt good from the beginning for you guys to be around, actually playing your game
  • Sure, you had more time to think about how to solve your game, but I strongly believe that is not what you were thinking about before the contest, not a lot anyway.
  • Fighting and beating you felt awesome even in the lower leagues, it must have been a great feeling in Legend, even if you were not trying your hardest.
  • All of us were trying to create bots that would beat everyone. It’s everyone, not everyone except Magus, pb04 and Agade (I believe I have not seen reCurse’s bot), so accept it and let’s all have fun together.

What I was missing:

  • Seeing the speed and target in the debug mode would have simplified some debugging…

Anyway, thank you guys a lot for creating a great game. From the beginning I felt comfortable; from the description I had most of the information I ever needed to code without being frustrated about “something working weird”.

My general approach:
This contest was a breakthrough one for me. I really wished I would finally get to Legend; in the end I reached my personal best, 209th in Gold, and learned a lot.

When I saw the contest name, I thought I should get back to CSB and spend some time on it (good guess!), but in the end I did not.

Seeing that most high-ranking people used some form of local simulation, I spent some time before the contest getting tools done. For me, the tools are:

  • One that’s able to merge multiple C# files into one, based on what classes are used.
    • Right before the contest I added “version control”, copying the latest merged file to a new location, sequentially numbered (my doof’s message contained the version, so I always knew what I was looking at).
  • One that’s able to run many games of my AIs against one another.
  • One that’s able to fetch replays of my games from CG and test the referee.
  • And finally, one that generates C# source from Java source.

I know there are tools already proven to do most of these things available, but I wanted to do them myself.

And before anyone asks, I will not be providing any of the tools, however I’m more than open to discuss anything that I learned while making them and assist you with creating your own. Write me a PM here on the forum, if you’d like to ask something.

My AI:

  • I decided well before the contest, I would go with heuristics (my sim solutions didn’t do so well in the past)

  • On the first day I wrote some heuristics that would hopefully get me out of the Woods. I went to sleep when my Wood 1 submit was at some 30 percent of evaluation.

  • On Saturday I woke up to see promotion to Bronze, it was time for step two.

  • Removed some obvious issues and went hacking on the referee. There were two problems: the tool didn’t know how to handle lambdas yet and had some issues with streams in Java (took me a few hours), and my referee tests were failing on small floating-point differences in collisions. But I ran three of my AIs against it and the results were as expected.

  • I tested the referee some more (when it says a change is bad and I submit, I end up worse than I was; when it says it’s better, I go up in ranking - check). I wanted to continue with the AI coding, so I left the referee alone, missing a serious bug.

  • Then I proceeded to write what was to become AI that brought me to Gold

    • Reaper: find a wreck closer to me than to at least one of the enemy reapers and go for it, braking when I would overshoot
    • Destroyer: Find a tanker that’s closest to the Reaper and kill it
    • Doof: go for enemy reaper that has higher score
    • On Sunday I added “course corrections” that would pull me towards more water and push me away from enemies if they were within some 3000 radius, with the force decreasing exponentially with distance. It was working well most of the time, but not perfectly. (Today I know the issue was not normalizing the target vector to some fixed length, so the effect of the push and pull differed based on how far I wanted to go.)
  • On Monday and Tuesday I tried to write pathfinding based on A*, including collisions. The results were horrible.

  • On Wednesday I recovered from the changes I had made for the pathfinding, while keeping some improvements.

  • On Thursday I took a different approach to the pathfinding: some line-circle intersections and calculating tangents to avoid obstacles. The AI seemed to behave as expected, but my tests said it was worse than what I had already submitted (on Sunday!)

  • On Friday I learned to generate SVG from my bot; it contained the positions of all units, the speed vector, the vector to the target, future positions based on speed, and some lines I could draw from the AI as I calculated the solution.

  • These improved my debugging a lot

  • I finally found the bug in the referee! The referee didn’t remove wrecks when they were out of water, so the endgame rendering would be a million wrecks, all with around -100 water remaining :frowning:

  • After fixing this, I improved a further 200 places, to around 100th in Gold.

There was just one big issue:

  • When the pathfinding failed to find a way, my units would just WAIT
  • I tried to fix this issue the whole of Sunday, but whatever I changed just increased the collision rate, making the AI worse
  • I tried to bring back the “course corrections” which I had ditched for the pathfinding, but they were not good enough with all the other changes I had made.
  • I fiddled with the skill usage and did several submits that improved me tens of places, but the others were improving faster. The biggest issue was left unsolved.
  • And as I went to sleep on Sunday, my last submit slowly slipped from the 100s to the 200s…

What I learned

  • Simpler is better.
  • Stick to the plan. I have changed my mind at crucial moments before; I did so in this contest too, less than before, but I need to work on this.
  • My tools work well enough, but there’s a lot to be improved.
  • I spent around 5-6h each day, which is still too much (on top of 7h of coding at my job).
  • I was way too tired by the end of the contest; I honestly hated Mean Max on Sunday and wanted it to be over.

Today, I can say I really enjoyed the game, and the time spent was worth what I learned. In this post I focused on what went wrong for me; maybe somebody can take something away from it.

See you all at the next contest.

Once again, big thanks to Agade, pb04, Magus and reCurse, Mean Max was great.
Thank you!

Edit: I’m adding a pic of my renderer output (at the bottom my destroyer is aligning itself with the tanker, and the others are doing something too!)

10 Likes

The one thing I definitely didn’t like about this contest was the lack of the direction lines that are present in CSB.

1 Like

Hi, I finished 4th :slight_smile:
Here is my PM: https://github.com/Saelyos/MeanMax/blob/master/README.md

12 Likes