- Hands-On Deep Learning Architectures with Python
- Yuxi (Hayden) Liu Saransh Mehta
ANNs
An ANN is built from two components: nodes and weights. Nodes play the role of neurons, while weights are the learnable parameters that connect the neurons to each other and control their activation paths.
So how do we make an artificial neuron, or node? Consider x to be a scalar input to the neuron and w to be the scalar weight for the neuron. If you don't know what a scalar and a vector are, a scalar is simply a single real number, while a vector is a list of such elements. The artificial neuron can be represented by the following equation:

a = wx + b
The circle represents the neuron, which takes the scalar x as input and outputs a after multiplying it by the weight w. Here, b is called the bias. The bias is added to the equation to provide the capability of shifting the output for a given range of inputs. The role of the bias will become clearer once we go through the activation functions.
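The scalar neuron above can be sketched in a few lines of Python (the function name and values here are illustrative, not from the book):

```python
def scalar_neuron(x, w, b):
    """Compute the neuron's output a = w*x + b for a single scalar input."""
    return w * x + b

# The bias b shifts the output: with w = 0.5 and b = 1.0, an input of 2.0
# produces 0.5 * 2.0 + 1.0 = 2.0.
a = scalar_neuron(x=2.0, w=0.5, b=1.0)
print(a)  # 2.0
```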
Now, suppose the neuron doesn't take just a single scalar input, but multiple inputs. The inputs can then be collected into a vector (say P). P can be written as a set of scalar inputs p1, p2, p3, ..., pn, and each input will also have a corresponding weight in a weight vector (say W = w1, w2, w3, ..., wn), which will be used to activate the neuron. The two vectors can be written as follows:

P = [p1, p2, p3, ..., pn]
W = [w1, w2, w3, ..., wn]
So what changes do we need to make to the equation to fit these multiple inputs? Simply sum the weighted inputs! This changes the basic equation a = wx + b to the following:

a = w1p1 + w2p2 + ... + wnpn + b = W·P + b
The artificial neuron taking in multiple inputs would look like the following diagram:

But what would a neuron do alone? Although it can still output values, which can be used to make binary decisions (zero or one), we need a lot of similar neurons arranged in parallel and interconnected, just like in the brain, in order to go beyond binary decisions. So, what would it look like? Here's the diagram:

What changes are required in the equation now? Just the dimensions of the weight matrix and the output vector. We will now have n x m weights, where n is the number of inputs and m is the number of neurons. Also, we will have a separate output from each of the neurons, so the output also turns into a vector:

A = W^T·P + B

Here, W is the n x m weight matrix, P is the input vector of length n, B is the bias vector with one bias per neuron, and A is the output vector of length m.
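The layer of neurons described above can be sketched as a single matrix multiplication (shapes follow the text; the concrete numbers are made up):

```python
import numpy as np

# A layer of m neurons sharing the same n inputs.
n, m = 3, 2                      # n inputs, m neurons
P = np.array([1.0, 2.0, 3.0])   # input vector, shape (n,)
W = np.full((n, m), 0.1)        # n x m weight matrix
B = np.zeros(m)                 # one bias per neuron, shape (m,)

# W^T has shape (m, n), so W^T @ P yields one output per neuron.
A = W.T @ P + B
print(A.shape)  # (2,)
```

Stacking the per-neuron dot products into one matrix product is what makes layers efficient on modern hardware.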
So far, we have learned the basic structure and mathematical equations modeling ANNs; next, we shall look at another important concept: activation functions.