
  • Deep Learning with Keras
  • Antonio Gulli Sujit Pal

Problems in training the perceptron and a solution

Let's consider a single neuron; what are the best choices for the weight w and the bias b? Ideally, we would like to provide a set of training examples and let the computer adjust the weight and the bias in such a way that the errors produced in the output are minimized. In order to make this a bit more concrete, let's suppose we have a set of images of cats and another separate set of images not containing cats. For the sake of simplicity, assume that each neuron looks at a single input pixel value. While the computer processes these images, we would like our neuron to adjust its weights and bias so that we have fewer and fewer images wrongly recognized as non-cats. This approach seems very intuitive, but it requires that a small change in weights (and/or bias) causes only a small change in outputs.
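The single-neuron setup described above can be sketched as follows. The weight and bias values here are illustrative choices, not values from the book: the neuron simply fires (outputs 1) when the weighted input plus the bias is positive.

```python
import numpy as np

def perceptron(x, w, b):
    """Single perceptron neuron: output 1 if the weighted sum plus bias is positive, else 0."""
    return 1 if np.dot(w, x) + b > 0 else 0

# Hypothetical example: one neuron looking at a single pixel intensity in [0, 1]
w = np.array([2.0])   # weight (illustrative value)
b = -1.0              # bias (illustrative value)

print(perceptron(np.array([0.7]), w, b))  # weighted sum 2*0.7 - 1 = 0.4 > 0, so output is 1
print(perceptron(np.array([0.3]), w, b))  # weighted sum 2*0.3 - 1 = -0.4 <= 0, so output is 0
```

Training would mean nudging `w` and `b` so that, over the whole training set, fewer images are misclassified.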

If the output makes a big jump, we cannot learn progressively; we would instead have to try things in all possible directions (a process known as exhaustive search) without knowing whether we are improving. After all, kids learn little by little. Unfortunately, the perceptron does not show this little-by-little behavior. A perceptron outputs either 0 or 1, and that abrupt jump does not help it learn, as shown in the following graph:
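The problem is easy to demonstrate numerically. In this sketch (the specific input, weight, and bias values are illustrative), an infinitesimal change in the weight near the decision threshold flips the perceptron's output all the way from 0 to 1:

```python
def step(z):
    """Perceptron activation: a hard threshold at zero."""
    return 1 if z > 0 else 0

x, b = 1.0, -0.5  # illustrative input and bias
for w in (0.4999, 0.5001):
    print(w, step(w * x + b))
# w = 0.4999 gives 0, w = 0.5001 gives 1:
# a tiny weight change causes a full 0-to-1 jump in the output
```

There is no way to tell, from such an all-or-nothing response, whether a small weight adjustment moved us closer to or further from the right answer.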

We need something different, something smoother: a function that changes progressively from 0 to 1 with no discontinuity. Mathematically, this means we need a continuous function whose derivative we can compute.
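The sigmoid function is one such choice. Repeating the earlier threshold experiment with a sigmoid instead of a hard step (same illustrative input, weight, and bias values), a tiny weight change now produces only a tiny output change, which is exactly what gradient-based learning needs:

```python
import math

def sigmoid(z):
    """Smooth, continuous function from 0 to 1; its derivative is sigmoid(z) * (1 - sigmoid(z))."""
    return 1.0 / (1.0 + math.exp(-z))

x, b = 1.0, -0.5  # same illustrative values as before
for w in (0.4999, 0.5001):
    print(w, sigmoid(w * x + b))
# Both outputs are approximately 0.5: the tiny weight change now
# shifts the output only slightly instead of flipping it from 0 to 1
```

Because the sigmoid is differentiable everywhere, we can measure how much a small change in each weight changes the output, and adjust the weights accordingly.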
