Commit a64994ce authored by Nicolas Schuler

Some small changes: rename the attribute input to inputData and the function inputData() to input_data() to differentiate them better. Same for outputData().
parent 4156233f
@@ -236,7 +236,7 @@ initialization is prone to lead to the vanishing gradient problem. This is espec
 functions like Sigmoid, which squash the given values in a small range (of [0,1] in that case).
 On the other hand, RectLU can counter this (since we have no squashing down, as seen above).
-So ideally, you would like to initialize the values of the weights in a way, that minimizes the
+So ideally, you would like to initialize the values of the weights in a way that minimizes the
 danger for something like that to occur. While I'm aware of that, I do not know what the
 proper way would be to do so yet.
 """