Neural networks: sigmoid of a large sum

I ran into the following problem: when I multiply the input layer by the weights, I get a huge pre-activation value (around 400, with 2500 inputs), which makes the activation function useless. I use the sigmoid 1 / (1 + e^(-x)) with e ≈ 2.72, and e^(-400) is practically zero; in the programming language I use it effectively evaluates to 0, so the result is 1 / (1 + 0) == 1. Is this a mistake? Where? Thanks in advance.
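A minimal sketch of the saturation described above (assuming double-precision floats, as in standard Python): e^(-400) is tiny but nonzero, yet adding it to 1.0 rounds to exactly 1.0, so the sigmoid output is exactly 1 and the neuron is fully saturated.

```python
import math

# Sigmoid as described in the question: 1 / (1 + e^(-x))
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

z = 400.0  # pre-activation sum on the order reported (2500 inputs)

# e^(-400) is about 1.9e-174: representable in double precision,
# but far below machine epsilon (~2.2e-16), so 1.0 + e^(-400)
# rounds to exactly 1.0.
tail = math.exp(-z)
print(tail > 0.0)            # True: not literally zero
print(sigmoid(z) == 1.0)     # True: output saturates at exactly 1
```

This is expected floating-point behavior rather than a bug in the sigmoid itself; the issue is that the pre-activation sum is far outside the region where the sigmoid varies (roughly -10 to 10).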

Author: Matvey, 2020-05-15