- Python Deep Learning Cookbook
- Indra den Bakker
Implementing a single-layer neural network
Now we can move on to neural networks. We will start by implementing the simplest form: a single-layer neural network. The difference from a perceptron is that the computations are performed by multiple units (neurons), hence a network. As you might expect, adding more units increases the number of problems that can be solved. The units perform their computations independently and are stacked in a layer; we call this layer the hidden layer, and the units stacked in it the hidden units. For now, we will consider only a single hidden layer. The output layer acts as a perceptron, but this time its inputs are the hidden units of the hidden layer rather than the input variables.

In our implementation of the perceptron, we used a unit step function to determine the class. In the next recipe, we will use the non-linear sigmoid activation function for both the hidden units and the output unit. By replacing the step function with a non-linear activation function, the network becomes able to uncover non-linear patterns as well. More on this later in the Activation functions section. In the backward pass, we use the derivative of the sigmoid to update the weights.
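The sigmoid and its derivative, which the backward pass relies on, can be sketched in NumPy as follows (the function names here are our own, not the book's):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real-valued input into the range (0, 1)
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # The derivative can be expressed via the sigmoid itself: s * (1 - s)
    s = sigmoid(x)
    return s * (1 - s)
```

Note that the derivative peaks at 0.25 when the input is 0 and shrinks toward 0 for large positive or negative inputs, which is why gradients through many stacked sigmoid layers tend to vanish.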
In the following recipe, we will classify two non-linearly separable classes with NumPy.
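As a preview, a minimal NumPy sketch of such a network trained on XOR (the classic example of two non-linearly separable classes) might look like the following; the hidden-layer size, learning rate, and epoch count are illustrative choices, not the book's:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

rng = np.random.default_rng(0)

# XOR: two classes that no single perceptron can separate
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden = 4   # number of hidden units (illustrative choice)
lr = 0.5       # learning rate (illustrative choice)

# Weights and biases: input -> hidden, hidden -> output
W1 = rng.normal(size=(2, n_hidden))
b1 = np.zeros((1, n_hidden))
W2 = rng.normal(size=(n_hidden, 1))
b2 = np.zeros((1, 1))

losses = []
for epoch in range(10000):
    # Forward pass: hidden layer, then the perceptron-like output layer
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: chain rule using the sigmoid derivative s * (1 - s)
    error = y - output
    losses.append(np.mean(error ** 2))
    d_output = error * output * (1 - output)
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)

    # Gradient-descent weight updates
    W2 += lr * hidden.T @ d_output
    b2 += lr * d_output.sum(axis=0, keepdims=True)
    W1 += lr * X.T @ d_hidden
    b1 += lr * d_hidden.sum(axis=0, keepdims=True)

predictions = (output > 0.5).astype(int)
```

After training, `predictions` holds the class assigned to each of the four inputs, and `losses` records the mean squared error per epoch, which should fall steadily as the hidden units carve out a non-linear decision boundary.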