- Practical Convolutional Neural Networks
- Mohit Sewak Md. Rezaul Karim Pradeep Pujari
Building blocks of a neural network
A neural network is made up of many artificial neurons. Is it a representation of the brain, or a mathematical model of some body of knowledge? Here, we will simply try to understand how a neural network is used in practice. A convolutional neural network (CNN) is a special kind of multi-layer neural network, designed to recognize visual patterns directly from images with minimal preprocessing. A graphical representation of this network is shown in the following image. The field of neural networks was originally inspired by the goal of modeling biological neural systems, but it has since branched in different directions and become a matter of engineering, aimed at attaining good results on machine learning tasks.
An artificial neuron is a function that takes an input and produces an output. The number of neurons used depends on the task at hand: it could be as low as two or as high as several thousand. There are numerous ways of connecting artificial neurons together to create a CNN. One commonly used topology is known as a feed-forward network:

Each neuron receives inputs from other neurons, and the effect of each input line on the neuron is controlled by a weight, which can be positive or negative. The entire neural network learns to perform useful computations for recognizing objects by adjusting these weights. Now, we can connect those neurons into a network known as a feed-forward network: the neurons in each layer feed their output forward to the next layer until we get a final output. This can be written as follows:


The preceding forward-propagating neuron can be implemented as follows:
import math

import numpy as np


class Neuron(object):
    def __init__(self):
        self.weights = np.array([1.0, 2.0])
        self.bias = 0.0

    def forward(self, inputs):
        """Assuming that inputs and weights are 1-D numpy arrays and the bias is a number."""
        a_cell_sum = np.sum(inputs * self.weights) + self.bias
        # This is the sigmoid activation function
        result = 1.0 / (1.0 + math.exp(-a_cell_sum))
        return result


neuron = Neuron()
output = neuron.forward(np.array([1, 1]))
print(output)
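Building on the single neuron above, the layered feed-forward idea can be sketched by stacking neurons into fully connected layers, where each layer feeds its output forward to the next. This is a minimal illustration in the same NumPy style; the layer sizes and weight values are illustrative assumptions, not taken from the text:

```python
import numpy as np


def sigmoid(x):
    # Element-wise sigmoid, the same activation used by the single neuron above
    return 1.0 / (1.0 + np.exp(-x))


class Layer(object):
    """A fully connected layer: every neuron receives every input."""

    def __init__(self, weights, biases):
        self.weights = weights  # shape: (n_inputs, n_neurons)
        self.biases = biases    # shape: (n_neurons,)

    def forward(self, inputs):
        # Each column of `weights` plays the role of one neuron's weight vector
        return sigmoid(np.dot(inputs, self.weights) + self.biases)


# Illustrative two-layer feed-forward network: 2 inputs -> 3 hidden -> 1 output
hidden = Layer(np.array([[0.5, -0.3, 0.8],
                         [0.2,  0.7, -0.5]]), np.zeros(3))
output_layer = Layer(np.array([[1.0], [-1.0], [0.5]]), np.zeros(1))

x = np.array([1.0, 1.0])
h = hidden.forward(x)        # hidden-layer activations
y = output_layer.forward(h)  # final output of the network
print(y)
```

Because each layer only consumes the previous layer's output, evaluating the network is simply a chain of `forward` calls from input to output.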