- Apache Spark 2.x for Java Developers
- Sourav Gulati, Sumit Kumar
Jobs
Jobs is the default tab of the Spark UI. It shows the status of all the jobs executed within a SparkContext, and can be accessed at http://localhost:4040/jobs/.
It consists of three sections:
- Active Jobs: This section lists the jobs that are currently running
- Completed Jobs: This section lists the jobs that completed successfully
- Failed Jobs: This section lists the jobs that failed
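As a minimal sketch of how entries appear in these sections, the following hypothetical local-mode application (class name `JobsTabDemo` and the sample data are assumptions, not from the book) runs two actions; while it is alive, each action shows up as one row under Active Jobs and then Completed Jobs at http://localhost:4040/jobs/. It assumes a Spark 2.x dependency on the classpath and must be run against a Spark runtime:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import java.util.Arrays;

public class JobsTabDemo {
    public static void main(String[] args) throws InterruptedException {
        // Local mode is enough: the driver starts the web UI on port 4040
        SparkConf conf = new SparkConf()
                .setMaster("local[*]")
                .setAppName("JobsTabDemo");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<Integer> numbers = sc.parallelize(Arrays.asList(1, 2, 3, 4, 5));

        // Each action triggers one job, visible as a row in the Jobs tab
        numbers.map(x -> x * 2).count();              // job 0
        numbers.filter(x -> x % 2 == 0).collect();    // job 1

        // Keep the application (and therefore the UI) alive long enough to browse it
        Thread.sleep(60_000);
        sc.stop();
    }
}
```

Note that the UI lives only as long as the SparkContext; once `sc.stop()` runs, port 4040 stops serving, which is why the sketch sleeps before shutting down.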
It is shown in the following screenshot:

The Jobs tab section of Spark UI is rendered using the org.apache.spark.ui.jobs.JobsTab class that uses org.apache.spark.ui.jobs.JobProgressListener to get the statistics of the job.
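The statistics that JobProgressListener feeds into the Jobs tab are also exposed as JSON through Spark's monitoring REST API on the same port, which can be handy for scripting. A sketch, assuming a Spark application is currently running on localhost (`<app-id>` is a placeholder for the application ID returned by the first call):

```shell
# List the applications known to this SparkContext's UI
curl http://localhost:4040/api/v1/applications

# Job-level details (status, stage IDs, task counts) for one application
curl http://localhost:4040/api/v1/applications/<app-id>/jobs
```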
After executing all the jobs described in the Spark REPL (also known as CLI) section, the Spark UI will look as follows:

Also, if you expand the Event Timeline section, you can see the time at which the SparkContext started (that is, when the driver was initiated) and the times at which the jobs were executed, along with their status:

By clicking on any of the jobs, you can see the details of that job, that is, its Event Timeline and the DAG of the transformations and stages executed as part of it, as follows:
