- Python Deep Learning
- Ivan Vasilev, Daniel Slater, Gianmario Spacagna, Peter Roelants, Valentino Zocca
The need for neural networks
Neural networks have been around for many years, and they've gone through several periods during which they've fallen in and out of favor. But recently, they have steadily gained ground over many other competing machine learning algorithms. This resurgence is due to faster computers, the use of graphics processing units (GPUs) rather than traditional central processing units (CPUs), better algorithms and neural network designs, and the increasingly large datasets that we'll see in this book. To get an idea of their success, let's take the ImageNet Large-Scale Visual Recognition Challenge (http://image-net.org/challenges/LSVRC/, or just ImageNet). The participants train their algorithms using the ImageNet database, which contains more than one million high-resolution color images in over a thousand categories (one category may be images of cars, another of people, trees, and so on). One of the tasks in the challenge is to classify unknown images into these categories. In 2011, the winner achieved a top-five accuracy of 74.2%. In 2012, Alex Krizhevsky and his team entered the competition with a convolutional network (a special type of deep network). That year, they won with a top-five accuracy of 84.7%. Since then, the winners have always been convolutional networks, and the current top-five accuracy is 97.7%. Deep learning algorithms have excelled in other areas too; for example, both Google Now and Apple's Siri rely on deep networks for speech recognition, and Google uses deep learning for its translation engines.
We'll talk about these exciting advances in the next chapters. But for now, we'll use simple networks with one or two layers. You can think of these as toy examples that are not deep networks, but understanding how they work is important. Here's why:
- First: knowing the theory of neural networks will help you understand the rest of the book, because a large majority of neural networks in use today share common principles. Understanding simple networks means that you'll understand deep networks too.
- Second: having some fundamental knowledge is always good. It will help you a lot when you face some new material (even material not included in this book).
I hope these arguments will convince you of the importance of this chapter. As a small consolation, we'll talk about deep learning in depth (pun intended) in Chapter 3, Deep Learning Fundamentals.