Optimizing the Storing and Processing of Data for Machine Learning Problems
All of the preceding uses for artificial intelligence rely heavily on optimized data storage and processing. Optimization is necessary for machine learning because the datasets involved can be huge, as seen in the following examples:
- A single X-ray file can be many gigabytes in size.
- Translation corpora (large collections of texts) can reach billions of sentences.
- YouTube's stored data is measured in exabytes.
- Financial data might seem like just a few numbers, but they are generated in such large quantities every second that the New York Stock Exchange produces around 1 TB of data daily.
While every machine learning system is unique, in many systems data touches the same kinds of components. In a hypothetical machine learning system, data might be handled as follows:

Figure 1.4: Hardware used in a hypothetical machine learning system
Each of these is a highly specialized piece of hardware, and although not all of them store data for long periods in the way traditional hard disks or tape backups do, it is important to know how data storage can be optimized at each stage. Let's dive into a text classification AI project to see how optimizations can be applied at some stages.
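As a small illustration of this kind of storage optimization, the following is a minimal sketch (not the project described in this chapter) that writes a hypothetical labelled text corpus both as plain CSV and as compressed, columnar Parquet so their on-disk footprints can be compared. The corpus contents and file names are made up for the example, and it assumes pandas and pyarrow are installed.

```python
import os

import pandas as pd

# Hypothetical text classification corpus: short documents with a label.
# A real project might have millions of rows; a repeated sample is used here.
corpus = pd.DataFrame({
    "text": ["the market rallied today",
             "new vaccine trial results",
             "championship game recap"] * 10_000,
    "label": ["finance", "health", "sports"] * 10_000,
})

# Baseline: plain CSV, common but storage-inefficient.
corpus.to_csv("corpus.csv", index=False)

# Optimized: a compressed columnar format (Parquet), which also speeds up
# column-wise reads during feature extraction (requires pyarrow or fastparquet).
corpus.to_parquet("corpus.parquet", compression="snappy")

print(f"CSV size:     {os.path.getsize('corpus.csv') / 1024:.0f} KiB")
print(f"Parquet size: {os.path.getsize('corpus.parquet') / 1024:.0f} KiB")
```

On repetitive text like this, the columnar, compressed file is typically a fraction of the CSV's size; the exact ratio depends on the data and the compression codec chosen.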