https://www.codingame.com/training/hard/basic-decision-tree---2
Send your feedback or ask for help here!
Created by @Wei-1, validated by @bbb000bbbyyy.
If you have any issues, feel free to ping them.
I have an issue with test 5:
My tree is right for features 1, 2, and 3, but my method goes further and adds feature 4 at the end.
Doing it by hand, I get the same result:
After filtering on variables 1, 2, and 3, the remaining unclassified data are:
index  output  f4
0      1       1
6      3       2
7      3       1
8      3       1
Adding a node on f4 decreases the entropy: I get 0.81 for the label list [1, 3, 3, 3] and 0.69 (weighted) for the two sets [3] and [1, 3, 3].
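For reference, here is a minimal Python sketch (not the puzzle's reference solution, and the `f4` split is just the one described above) that reproduces both numbers:

```python
import math

def entropy(labels):
    # Shannon entropy (base 2) of a list of class labels
    n = len(labels)
    return -sum(labels.count(v) / n * math.log2(labels.count(v) / n)
                for v in set(labels))

parent = [1, 3, 3, 3]
left, right = [3], [1, 3, 3]       # the split that f4 would produce
weighted = (len(left) * entropy(left)
            + len(right) * entropy(right)) / len(parent)
print(round(entropy(parent), 2))   # -> 0.81
print(round(weighted, 2))          # -> 0.69
```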
Can you tell me where I might be wrong?
Thank you for your time,
Mathieu
I like it when a puzzle also teaches me something new, that’s great!
For me, this “part-2” puzzle took more effort and debugging to solve than expected, even after solving part 1, because I took the “…will follow the same entropy algorithm” in the statement literally.
Actually, after the tree is built, binary entropy should be used only at the ‘final’ splits (where the left and right branches are groups that are not split further), while the entropy of each internal node must be calculated with the weighted formula, all the way up to the root. In retrospect, this is the only way it really makes sense, but in part 1 none of my trees had internal nodes, just the root and the leaves, so my misunderstanding caused no trouble there.
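In case it helps anyone else, here is how I would express that rule as a minimal Python sketch. The `Node` structure is hypothetical (not the puzzle's actual representation); the point is only the leaf-vs-internal entropy rule:

```python
import math
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Node:
    labels: List[int]              # samples that reach this node
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def binary_entropy(labels: List[int]) -> float:
    # plain Shannon entropy (base 2) of one group of labels
    n = len(labels)
    return -sum(labels.count(v) / n * math.log2(labels.count(v) / n)
                for v in set(labels))

def tree_entropy(node: Node) -> float:
    # Leaves (final groups that are not split further) keep the plain
    # binary entropy; every internal node takes the size-weighted
    # average of its children, so the weighting propagates to the root.
    if node.left is None or node.right is None:
        return binary_entropy(node.labels)
    nl, nr = len(node.left.labels), len(node.right.labels)
    return (nl * tree_entropy(node.left)
            + nr * tree_entropy(node.right)) / (nl + nr)
```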
But now I can classify any beetle that I encounter…