
Predicting output with a neural network

By combining layers of neurons, we create a stacked function with non-linear transformations and trainable weights, so it can learn to recognize complex relationships. To visualize this, let's transform the neural network from the previous sections into a mathematical formula. First, let's take a look at the formula for a single layer:
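With y denoting the layer's output, it can be written in this general form:

$$y = f(X \cdot w + b)$$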

The X variable is a vector that represents the input for the layer in the neural network. The w parameter represents a vector of weights, one for each of the elements in the input vector, X. In many neural network implementations, an additional term, b, is added. This is called the bias, and it increases or decreases the overall level of input required to activate the neuron. Finally, there's a function, f, which is the activation function for the layer.
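For example, with a hypothetical two-element input X = (0.5, 0.8), weights w = (0.2, -0.4), and bias b = 0.1, the layer computes f(0.5 × 0.2 + 0.8 × (-0.4) + 0.1) = f(-0.12), and the activation function f then maps that value to the layer's output.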

Now that you've seen the formula for a single layer, let's put together additional layers to create the formula for the neural network:
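With y again denoting the final output and subscripts distinguishing the two layers, the stacked computation takes this general nested form:

$$y = f_2\left(f_1\left(X \cdot w_1 + b_1\right) \cdot w_2 + b_2\right)$$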

Notice how the formula has changed: the formula for the first layer is now wrapped in another layer function. This wrapping, or stacking, of functions continues as we add more layers to the neural network. Each layer introduces more parameters that need to be optimized to train the neural network. It also allows the neural network to learn more complex relationships from the data we feed into it.
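To make this stacking concrete, here is a minimal sketch in Python using NumPy. The function names (dense_layer, two_layer_network) and the choice of a sigmoid activation are assumptions made for this example, not something prescribed by the formula itself:

```python
import numpy as np

def sigmoid(z):
    # One common choice for the activation function f.
    return 1.0 / (1.0 + np.exp(-z))

def dense_layer(x, w, b, f):
    # A single layer: weighted sum of the inputs, plus the bias,
    # passed through the activation function.
    return f(np.dot(x, w) + b)

def two_layer_network(x, w1, b1, w2, b2):
    # The output of the first layer is wrapped inside the second layer,
    # mirroring the nested formula above.
    hidden = dense_layer(x, w1, b1, sigmoid)
    return dense_layer(hidden, w2, b2, sigmoid)
```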

To make a prediction with a neural network, we need to fill in values for all of the parameters in the neural network. Let's assume we already know these values because we trained the network beforehand. What's left to provide is the input for the neural network.

The input is a vector of floating-point numbers that represents the data we feed into the neural network. The output is a vector that represents the network's prediction.
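Continuing the sketch above, a prediction is then just a call to the stacked function with the trained parameters and an input vector. The shapes and values below are invented purely for illustration:

```python
# Hypothetical trained parameters for a network with 3 inputs,
# 4 hidden neurons, and 2 outputs (randomly generated stand-ins here).
rng = np.random.default_rng(seed=42)
w1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
w2, b2 = rng.normal(size=(4, 2)), np.zeros(2)

# The input: a vector of floating-point numbers.
x = np.array([0.5, -1.2, 3.0])

# The output: a vector representing the network's prediction.
prediction = two_layer_network(x, w1, b1, w2, b2)
print(prediction)  # two floating-point values between 0 and 1
```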
