
Multilayer perceptrons

The multilayer perceptron is one of the simplest networks. It is defined as having one input layer, one output layer, and one or more hidden layers in between. Each layer contains multiple neurons, and adjacent layers are fully connected. Each neuron can be thought of as a cell in the network: it determines how incoming signals flow and are transformed. Signals from the previous layer are pushed forward to the neurons of the next layer through the connection weights. Each artificial neuron computes a weighted sum of all its incoming inputs by multiplying each signal by its weight and adding a bias term. The weighted sum then passes through a function called an activation function, which decides whether and how strongly the neuron fires, producing the output signal for the next layer.
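The computation of a single artificial neuron described above can be sketched as follows. This is a minimal illustration, not tied to any particular library; the sigmoid is used here as one common choice of activation function, and the input values and weights are arbitrary:

```python
import math

def neuron_output(inputs, weights, bias):
    """Weighted sum of inputs plus a bias, passed through a sigmoid activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)

# Example: a neuron receiving three signals from the previous layer
out = neuron_output([0.5, -1.0, 2.0], [0.1, 0.4, 0.2], bias=0.05)
```

Here `z = 0.5*0.1 + (-1.0)*0.4 + 2.0*0.2 + 0.05 = 0.10`, and the sigmoid maps it to roughly 0.525, the signal forwarded to the next layer.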

For example, a fully connected, feed-forward neural network is pictured in the following diagram. As you may notice, there is an intercept (bias) node in each layer (x0 and a0). The non-linearity of the network comes mainly from the shape of the activation function.

The architecture of such a fully connected, feed-forward neural network looks essentially like the following:

Fully connected, feed-forward neural network with two hidden layers
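A forward pass through a network of this shape can be sketched in plain Python. The layer sizes, weights, and biases below are made-up values chosen only to mirror the two-hidden-layer architecture in the diagram; each row of a weight matrix holds one neuron's incoming weights:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer_forward(inputs, weights, biases):
    """One fully connected layer: weighted sum plus bias per neuron, then activation."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

# Hypothetical sizes: 2 inputs -> 3 hidden -> 3 hidden -> 1 output
x = [1.0, 0.5]
W1 = [[0.2, -0.1], [0.4, 0.3], [-0.5, 0.8]];            b1 = [0.1, 0.0, -0.2]
W2 = [[0.3, -0.2, 0.5], [0.1, 0.4, -0.3], [-0.6, 0.2, 0.1]]; b2 = [0.0, 0.1, 0.2]
W3 = [[0.7, -0.4, 0.2]];                                b3 = [0.05]

h1 = layer_forward(x, W1, b1)    # first hidden layer
h2 = layer_forward(h1, W2, b2)   # second hidden layer
y = layer_forward(h2, W3, b3)    # output layer
```

Signals flow strictly forward, layer by layer, which is exactly what makes the network feed-forward.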