Optimizing the Storing and Processing of Data for Machine Learning Problems
All of the preceding uses for artificial intelligence rely heavily on optimized data storage and processing. Optimization is necessary for machine learning because the data size can be huge, as seen in the following examples:
- A single X-ray file can be many gigabytes in size.
- Translation corpora (large collections of texts) can reach billions of sentences.
- YouTube's stored data is measured in exabytes.
- Financial data might seem like just a few numbers per trade, but trades occur in such quantities every second that the New York Stock Exchange alone generates about 1 TB of data daily.
While every machine learning system is unique, in many systems the data touches the same components. In a hypothetical machine learning system, the data might flow as follows:

Figure 1.4: Hardware used in a hypothetical machine learning system
Each of these is a highly specialized piece of hardware, and although not all of them store data for long periods in the way traditional hard disks or tape backups do, it is important to know how data storage can be optimized at each stage. Let's dive into a text classification AI project to see how optimizations can be applied at some of these stages.
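As a minimal sketch of one such optimization, consider how a text classification project might read its training corpus. Loading a multi-gigabyte corpus into memory at once is wasteful; streaming it from disk in fixed-size batches keeps memory usage flat. The file name `reviews.csv`, its two-column (label, text) layout, and the `train_step` callable below are all hypothetical placeholders, not part of any specific project described here:

```python
import csv
from typing import Iterator, List, Tuple

def stream_labeled_texts(path: str, batch_size: int = 1024) -> Iterator[List[Tuple[str, str]]]:
    """Yield (label, text) pairs in fixed-size batches, so peak memory
    depends on batch_size rather than on the total corpus size."""
    batch: List[Tuple[str, str]] = []
    with open(path, newline="", encoding="utf-8") as f:
        for label, text in csv.reader(f):
            batch.append((label, text))
            if len(batch) == batch_size:
                yield batch
                batch = []
    if batch:
        yield batch  # final partial batch

# Hypothetical usage: each batch is tokenized, fed to the model, and then
# discarded, so the full corpus never needs to fit in RAM at once.
# for batch in stream_labeled_texts("reviews.csv"):
#     train_step(batch)  # train_step is a placeholder for your training loop
```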