- Apache Hadoop 3 Quick Start Guide
- Hrishikesh Vijay Karambelkar
Workload and computational requirements
While the previous two areas cover the sizing of the cluster, the workload requirements drive its computational capabilities. CPU-intensive operations demand a higher CPU count and a stronger compute configuration. The number of Mapper and Reducer tasks that run as part of Hadoop jobs also contributes to the requirements; typically, Mapper tasks outnumber Reducer tasks. The ratio of Mappers to Reducers is determined by the processing requirements at each stage.
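Per-task CPU and memory demands of this kind are configured in `mapred-site.xml`. The property names below are the standard Hadoop ones; the values are purely illustrative assumptions, not recommendations:

```xml
<!-- mapred-site.xml: illustrative per-task resource settings -->
<configuration>
  <property>
    <name>mapreduce.map.memory.mb</name>
    <value>2048</value>   <!-- container memory per Mapper task -->
  </property>
  <property>
    <name>mapreduce.reduce.memory.mb</name>
    <value>4096</value>   <!-- Reducers commonly get more memory -->
  </property>
  <property>
    <name>mapreduce.job.reduces</name>
    <value>8</value>      <!-- default Reducer count per job -->
  </property>
</configuration>
```

Raising the memory values lets YARN schedule fewer, larger containers per node, which shifts the cluster toward a memory-intensive profile.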
There is no definitive figure one can reach for memory and CPU requirements, as they vary with block replication, the computational processing of tasks, and data storage needs. To help with this, we have provided a calculator that considers different configurations of a Hadoop cluster, such as CPU-intensive, memory-intensive, and balanced.
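The kind of arithmetic such a calculator performs can be sketched in a few lines. This is a minimal, hypothetical estimator (the default replication factor of 3 matches HDFS; the 25% intermediate-output overhead and 8 TB usable disk per node are assumptions for illustration):

```python
import math

def estimate_nodes(daily_ingest_gb, retention_days, replication=3,
                   overhead=0.25, disk_per_node_gb=8000):
    """Rough node-count estimate for a balanced Hadoop cluster.

    Raw data is multiplied by the HDFS replication factor, padded
    with headroom for intermediate MapReduce output (overhead), and
    divided by the usable disk capacity of one node.
    """
    raw_gb = daily_ingest_gb * retention_days
    total_gb = raw_gb * replication * (1 + overhead)
    return math.ceil(total_gb / disk_per_node_gb)

# Example: 100 GB/day retained for a year, 3x replication.
print(estimate_nodes(100, 365))  # → 18
```

A real sizing exercise would also weigh CPU cores and memory per node against the Mapper/Reducer container sizes, but the storage-driven node count above is usually the starting point.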