
Multilayer perceptrons

The multilayer perceptron is one of the simplest neural networks. It is defined as having one input layer, one output layer, and one or more hidden layers. Each layer contains multiple neurons, and adjacent layers are fully connected. Each neuron can be thought of as a cell in these networks: it determines the flow and transformation of the incoming signals. Signals from the previous layer are pushed forward to the neurons of the next layer through the connection weights. Each artificial neuron calculates a weighted sum of all incoming inputs by multiplying each signal by its weight and adding a bias. The weighted sum then goes through a function called an activation function, which decides whether the neuron should fire, producing the output signals for the next layer.
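The computation of a single neuron described above can be sketched as follows. This is a minimal illustration; the sigmoid activation and the specific input values are assumptions chosen for the example, not taken from the text:

```python
import numpy as np

def neuron_output(inputs, weights, bias):
    """Weighted sum of inputs plus bias, passed through a sigmoid activation."""
    z = np.dot(weights, inputs) + bias       # weighted sum of incoming signals
    return 1.0 / (1.0 + np.exp(-z))          # sigmoid squashes z into (0, 1)

# Hypothetical example: a neuron with three incoming signals
x = np.array([0.5, -1.0, 2.0])   # signals from the previous layer
w = np.array([0.4, 0.3, -0.2])   # connection weights
b = 0.1                          # bias
out = neuron_output(x, w, b)     # output signal passed to the next layer
```

With a sigmoid activation the output is always strictly between 0 and 1, which is what lets the downstream layer interpret it as a graded firing strength rather than a hard on/off decision.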

For example, a fully connected, feed-forward neural network is pictured in the following diagram. As you may notice, there is an intercept node on each layer (x0 and a0). The non-linearity of the network is contributed mainly by the shape of the activation function.

The architecture of this fully connected, feed-forward neural network looks like the following:

Fully connected, feed-forward neural network with two hidden layers
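A full forward pass through such a network chains the per-neuron computation layer by layer. The sketch below is a minimal NumPy implementation under assumed layer sizes and random weights (the sigmoid activation and the 3-4-4-2 architecture are illustrative choices, not specified by the text); the bias vector plays the role of the intercept nodes in the diagram:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, params):
    """Propagate an input vector through each fully connected layer."""
    a = x
    for W, b in params:          # one (weight matrix, bias vector) per layer
        a = sigmoid(W @ a + b)   # weighted sum plus bias, then activation
    return a

rng = np.random.default_rng(0)
# Assumed layer sizes: 3 inputs -> two hidden layers of 4 -> 2 outputs
sizes = [3, 4, 4, 2]
params = [(rng.standard_normal((m, n)), rng.standard_normal(m))
          for n, m in zip(sizes[:-1], sizes[1:])]

y = forward(np.array([0.5, -1.0, 2.0]), params)   # output signals in (0, 1)
```

Because every layer applies the same pattern, adding or removing hidden layers only changes the length of the `params` list, not the forward-pass code.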