- Python Deep Learning Cookbook
- Indra den Bakker
Implementing a single-layer neural network
Now we can move on to neural networks. We will start by implementing the simplest form of a neural network: a single-layer neural network. The difference from a perceptron is that the computations are done by multiple units (neurons), hence a network. As you may expect, adding more units increases the range of problems the network can solve. The units perform their computations separately and are stacked in a layer; we call this layer the hidden layer, and the units stacked in it the hidden units. For now, we will only consider a single hidden layer. The output layer behaves like a perceptron; this time, its inputs are the hidden units of the hidden layer instead of the input variables.
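The forward pass described above can be sketched as follows; this is a minimal illustration, not the recipe's full implementation, and the dimensions (2 inputs, 3 hidden units, 1 output) and weight names such as `W_hidden` are chosen here for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 2 input variables, 3 hidden units, 1 output unit
n_input, n_hidden, n_output = 2, 3, 1

# Randomly initialized weight matrices (biases omitted for brevity)
W_hidden = rng.normal(size=(n_input, n_hidden))
W_output = rng.normal(size=(n_hidden, n_output))

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def forward(X):
    # Each hidden unit computes its own weighted sum of the inputs
    hidden = sigmoid(X @ W_hidden)
    # The output layer acts like a perceptron whose inputs are the hidden units
    return sigmoid(hidden @ W_output)

X = np.array([[0.5, -1.2]])
print(forward(X).shape)  # one output value per input row
```

Stacking more hidden units only widens `W_hidden`; the forward pass itself stays two matrix multiplications.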

In our implementation of the perceptron, we used a unit step function to determine the class. In the next recipe, we will use the non-linear sigmoid activation function for the hidden units and the output unit. By replacing the step function with a non-linear activation function, the network will be able to uncover non-linear patterns as well. More on this later in the Activation functions section. In the backward pass, we use the derivative of the sigmoid to update the weights.
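The derivative used in the backward pass has a convenient closed form: if σ(x) = 1 / (1 + e⁻ˣ), then σ′(x) = σ(x)(1 − σ(x)). A small sketch:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # sigma'(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1 - s)

print(sigmoid(0.0))             # 0.5
print(sigmoid_derivative(0.0))  # 0.25, the derivative's maximum
```

Because the derivative can be computed from the already-evaluated sigmoid output, the backward pass can reuse the activations from the forward pass instead of recomputing anything.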
In the following recipe, we will classify two non-linearly separable classes with NumPy.
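As a preview of that recipe, here is a minimal sketch of training a single-hidden-layer network on XOR, the classic non-linearly separable problem. The hyperparameters (4 hidden units, learning rate 1.0, 5000 epochs, squared-error loss) are illustrative assumptions, not the book's exact choices:

```python
import numpy as np

rng = np.random.default_rng(42)

# XOR: two classes that no single straight line can separate
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden = 4
W1 = rng.normal(size=(2, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_hidden, 1))
b2 = np.zeros(1)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Loss before training, for comparison
out0 = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
initial_loss = np.mean((out0 - y) ** 2)

lr = 1.0
for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: squared-error gradient times the sigmoid derivative
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

final_loss = np.mean((out - y) ** 2)
print(f"MSE: {initial_loss:.4f} -> {final_loss:.4f}")
```

A perceptron cannot solve XOR at all; the hidden layer is what makes the decision boundary non-linear.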