- Python Reinforcement Learning Projects
- Sean Saito, Yang Wenzhuo, Rajalingappaa Shanmugamani
Neural networks
A neural network is a type of computational architecture composed of layers of perceptrons. The perceptron, first conceived in the 1950s by Frank Rosenblatt, models the biological neuron: it computes a linear combination of an input vector and outputs a transformation of that combination through a non-linear activation, such as the sigmoid function. Suppose a perceptron receives an input vector $x = (x_1, x_2, \ldots, x_n)$. The output, $a$, of the perceptron would be as follows:

$$a = \sigma(w_1 x_1 + w_2 x_2 + \cdots + w_n x_n + b)$$
where $w_1, \ldots, w_n$ are the weights of the perceptron, $b$ is a constant called the bias, and $\sigma(z) = 1/(1 + e^{-z})$ is the sigmoid activation function, which outputs a value between 0 and 1.
Perceptrons have been widely used as a computational model for making decisions. Suppose the task is to predict the likelihood of sunny weather the next day. Each input $x_i$ would represent a variable, such as the temperature of the current day, the humidity, or the weather of the previous day. Then $\sigma(w_1 x_1 + \cdots + w_n x_n + b)$ would compute a value that reflects how likely it is that there will be sunny weather tomorrow. If the model has a good set of values for the weights and the bias, it is able to make accurate decisions.
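The perceptron above can be sketched in a few lines of NumPy. The weather feature names, weight values, and bias here are purely illustrative (a trained model would learn the weights and bias from data):

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def perceptron(x, w, b):
    """A single perceptron: a = sigmoid(w . x + b)."""
    return sigmoid(np.dot(w, x) + b)

# Hypothetical features: [temperature (C), humidity, yesterday was sunny (0/1)]
x = np.array([25.0, 0.4, 1.0])
# Illustrative weights and bias, chosen by hand for this sketch.
w = np.array([0.05, -2.0, 1.5])
b = -1.0

likelihood = perceptron(x, w, b)
print(likelihood)  # a value between 0 and 1
```

Because the sigmoid is bounded in (0, 1), the output can be read directly as a likelihood-style score.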
In a typical neural network, there are multiple layers of neurons, where each neuron in a given layer is connected to every neuron in the prior and subsequent layers. Hence these layers are also referred to as fully-connected layers. The weights of a given layer, $l$, can be represented as a matrix, $W^l$:

$$W^l = \begin{pmatrix} w_{11} & \cdots & w_{1m} \\ \vdots & \ddots & \vdots \\ w_{n1} & \cdots & w_{nm} \end{pmatrix}$$

where each $w_{ij}$ denotes the weight between the $i$-th neuron of the previous layer and the $j$-th neuron of this layer, and $b^l$ denotes a vector of biases, one for each neuron in layer $l$. Hence, the activation, $a^l$, of a given layer, $l$, can be defined as follows:

$$a^l = \sigma(W^l a^{l-1} + b^l)$$
where $a^0 = x$ is just the input. Such neural networks with multiple layers of neurons are called multilayer perceptrons (MLP). There are three components in an MLP: the input layer, the hidden layers, and the output layer. The data flows from the input layer, is transformed through a series of linear and non-linear functions in the hidden layers, and is output from the output layer as a decision or a prediction. Hence this architecture is also referred to as a feed-forward network. The following diagram shows what a fully-connected network would look like:
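The feed-forward pass described above can be sketched as repeated application of a = sigmoid(W a_prev + b), one layer at a time. The layer sizes below (3 inputs, 4 hidden units, 1 output) and the random weights are arbitrary choices for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, layers):
    """Feed-forward pass through an MLP.

    layers is a list of (W, b) pairs; each layer computes
    a^l = sigmoid(W^l a^(l-1) + b^l), starting from a^0 = x.
    """
    a = x
    for W, b in layers:
        a = sigmoid(W @ a + b)
    return a

rng = np.random.default_rng(0)
# A small MLP: 3 inputs -> 4 hidden units -> 1 output (sizes are arbitrary).
sizes = [3, 4, 1]
layers = [(rng.standard_normal((m, n)), rng.standard_normal(m))
          for n, m in zip(sizes[:-1], sizes[1:])]

x = np.array([0.5, -1.0, 2.0])
prediction = forward(x, layers)
print(prediction.shape)  # one output neuron -> shape (1,)
```

Each `(W, b)` pair is one fully-connected layer; stacking more pairs in the list adds more hidden layers without changing the forward-pass code.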