I'm trying to write my first neural network and decided to go the Python way, using the available libraries.
I have read some books on the subject and am familiar with the overall drill.
I have my inputs: 12,000 arrays, each containing 500 double values, all in the range [-1, 1], and all labeled.
My question is:
Since my input values can be negative, does that mean I HAVE TO use the sigmoid function as the activation function (at least for the first layer)?
(Because the ReLU function essentially ignores negative inputs?)
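For context, here is a minimal sketch (assuming NumPy) of the ReLU behavior I mean, where every negative value is clamped to zero:

```python
import numpy as np

def relu(x):
    # ReLU: element-wise max(0, x), so negative entries become 0.0
    return np.maximum(0.0, x)

# sample values in my input range [-1, 1]
x = np.array([-0.8, -0.1, 0.0, 0.3, 0.9])
print(relu(x))  # → [0.  0.  0.  0.3 0.9]
```

So if ReLU were applied directly to my raw inputs, roughly half of each array would be zeroed out, which is what worries me.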
Thanks in advance.