
Element-wise operations

In element-wise operations, position matters. Values that correspond positionally are combined to create a new value. 

To add or subtract matrices or vectors, we combine the entries that share the same position:
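In symbols, for 2 x 2 matrices, the rule (one way to write what the original equation likely showed) is:

```latex
\begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}
\pm
\begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix}
=
\begin{bmatrix} a_{11} \pm b_{11} & a_{12} \pm b_{12} \\ a_{21} \pm b_{21} & a_{22} \pm b_{22} \end{bmatrix}
```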

 

And in Python:

import numpy as np

vector_one = np.array([[1,2],[3,4]])
vector_two = np.array([[5,6],[7,8]])
vector_one + vector_two
## You should see:
array([[ 6,  8],[10, 12]])
vector_one - vector_two
## You should see:
array([[-4, -4],[-4, -4]])
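Because the values are paired by position, element-wise operations require the two arrays to have compatible shapes. A minimal sketch of what happens when they don't:

```python
import numpy as np

a = np.array([[1, 2], [3, 4]])   # shape (2, 2)
b = np.array([5, 6, 7])          # shape (3,) cannot be paired with (2, 2)

try:
    a + b
except ValueError:
    # NumPy raises a ValueError because the shapes cannot be matched up
    print("shapes do not match")
```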

There are two forms of multiplication that we may perform with vectors: the dot product and the Hadamard product.

The dot product is a special case of multiplication rooted in larger theories of geometry that are used across the physical and computational sciences. It is a special case of a more general mathematical principle known as an inner product. When we take the dot product of two vectors, the output is a scalar:
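The standard definition (a reconstruction of the equation the text refers to) is:

```latex
\mathbf{a} \cdot \mathbf{b} = \sum_{i=1}^{n} a_i b_i = \|\mathbf{a}\|\,\|\mathbf{b}\|\cos\theta
```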

Dot products are a workhorse in machine learning. Think about a basic operation: let's say we're doing a simple classification problem where we want to know if an image contains a cat or a dog. If we did this with a neural network, it would look as follows:
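A minimal sketch of such a network equation (the original figure is missing here; this assumes a single linear layer, which matches the variables described below):

```latex
y = f(\mathbf{w} \cdot \mathbf{x} + b)
```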

Here, y is our classification: cat or dog. We determine y by utilizing a network represented by f, where the input is x, while w and b represent a weight and a bias factor (don't worry, we'll explain these in more detail in the coming chapter!). Our x and w are both matrices, and we need to output a scalar, which represents either cat or dog. We can only do this by taking the dot product of w and x.

Relating back to our example, if this function were presented with an unknown image, taking the dot product will tell us how similar in direction the new vector is to the cat vector (a) or the dog vector (b) by the measure of the angle (θ) between them:

If the vector is closer to the direction of the cat vector (a), we'll classify the image as containing a cat. If it's closer to the dog vector (b), we'll classify it as containing a dog. In deep learning, a more complex version of this scenario is performed over and over; it's the core of how ANNs work. 
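The angle comparison above can be sketched with cosine similarity, which follows directly from the dot product. The feature vectors here are invented purely for illustration:

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between u and v: (u . v) / (|u| |v|)."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Hypothetical 2D feature vectors (not real image features)
cat_vector = np.array([1.0, 0.2])
dog_vector = np.array([0.1, 1.0])
new_image  = np.array([0.9, 0.3])

# A value closer to 1.0 means the vectors point in nearly the same direction
print(cosine_similarity(new_image, cat_vector))
print(cosine_similarity(new_image, dog_vector))
```

Since the new vector is more similar in direction to the cat vector, we would classify the image as containing a cat.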

In Python, we can take the dot product of two vectors by using a built-in function from numpy, np.dot():

## Dot Product
import numpy as np

vector_one = np.array([1,2,3])
vector_two = np.array([2,3,4])
np.dot(vector_one, vector_two) ## This should give us 20

The Hadamard product, on the other hand, outputs a vector:
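For two vectors of the same length, it can be written as (a reconstruction of the missing equation):

```latex
\mathbf{a} \circ \mathbf{b} = (a_1 b_1,\; a_2 b_2,\; \ldots,\; a_n b_n)
```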

The Hadamard product is element-wise, meaning that each number in the new matrix is the product of the corresponding numbers from the original matrices. Back in Python, we can easily perform this operation with the simple * operator:

vector_one = np.array([1,2,3])
vector_two = np.array([2,3,4])
vector_one * vector_two
## You should see:
array([ 2, 6, 12])
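Note how the two products relate: summing the entries of the Hadamard product gives the dot product. A quick sketch using the same vectors:

```python
import numpy as np

vector_one = np.array([1, 2, 3])
vector_two = np.array([2, 3, 4])

# Hadamard product: element-wise multiplication
hadamard = vector_one * vector_two   # array([ 2,  6, 12])

# Summing its entries recovers the dot product
print(hadamard.sum())                            # 20
print(np.dot(vector_one, vector_two))            # 20
```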

Now that we've scratched the surface of basic matrix operations, let's take a look at how probability theory can aid us in the artificial intelligence field. 
