- Python Deep Learning
- Ivan Vasilev, Daniel Slater, Gianmario Spacagna, Peter Roelants, Valentino Zocca
The need for neural networks
Neural networks have been around for many years, and they've gone through several periods during which they've fallen in and out of favor. But recently, they have steadily gained ground over many other competing machine learning algorithms. This resurgence is due to faster computers, the use of graphics processing units (GPUs) rather than traditional central processing units (CPUs), better algorithms and neural network designs, and the increasingly large datasets that we'll see in this book.

To get an idea of their success, let's take the ImageNet Large-Scale Visual Recognition Challenge (http://image-net.org/challenges/LSVRC/, or just ImageNet). The participants train their algorithms using the ImageNet database, which contains more than one million high-resolution color images in over a thousand categories (one category may be images of cars, another of people, trees, and so on). One of the tasks in the challenge is to classify unknown images into these categories. In 2011, the winner achieved a top-five accuracy of 74.2%. In 2012, Alex Krizhevsky and his team entered the competition with a convolutional network (a special type of deep network) and won with a top-five accuracy of 84.7%. Since then, the winners have always been convolutional networks, and the current top-five accuracy is 97.7%. Deep learning algorithms have excelled in other areas too; for example, both Google Now and Apple's Siri rely on deep networks for speech recognition, and Google uses deep learning for its translation engine.
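To make the "top-five accuracy" metric concrete, here is a minimal sketch of how it can be computed. The function name and the toy scores are our own illustration, not part of the ImageNet tooling: a prediction counts as correct if the true label appears anywhere among the model's k highest-scoring classes.

```python
def top_k_accuracy(scores, labels, k=5):
    """Fraction of samples whose true label is among the k highest-scoring classes."""
    hits = 0
    for row, label in zip(scores, labels):
        # indices of the k largest scores in this row
        top_k = sorted(range(len(row)), key=lambda i: row[i], reverse=True)[:k]
        if label in top_k:
            hits += 1
    return hits / len(labels)

# Toy example: 2 samples, 4 classes each
scores = [[0.1, 0.6, 0.2, 0.1],   # highest score is class 1
          [0.3, 0.1, 0.4, 0.2]]   # highest score is class 2, second is class 0
labels = [1, 0]
print(top_k_accuracy(scores, labels, k=1))  # 0.5: second sample's top guess is wrong
print(top_k_accuracy(scores, labels, k=2))  # 1.0: both true labels are in the top two
```

This is why top-five numbers are higher than top-one: the model gets credit for a near miss, which matters when a thousand categories contain many visually similar classes.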
We'll talk about these exciting advances in the next chapters. But for now, we'll use simple networks with one or two layers. You can think of these as toy examples that are not deep networks, but understanding how they work is important. Here's why:
- First: knowing the theory of neural networks will help you understand the rest of the book, because a large majority of neural networks in use today share common principles. Understanding simple networks means that you'll understand deep networks too.
- Second: having some fundamental knowledge is always good. It will help you a lot when you face some new material (even material not included in this book).
I hope these arguments will convince you of the importance of this chapter. As a small consolation, we'll talk about deep learning in depth (pun intended) in chapter 3, Deep Learning Fundamentals.