Commit 6108895

committed: add stage
1 parent 4110714 commit 6108895

File tree

1 file changed: +12 -0 lines changed

articles/synapse-analytics/spark/apache-spark-history-server.md

Lines changed: 12 additions & 0 deletions
@@ -140,6 +140,18 @@ Send feedback with issues by selecting **Provide us feedback**.
![Screenshot showing Spark application and job graph feedback.](./media/apache-spark-history-server/sparkui-graph-feedback.png)

### Stage number limit

For performance reasons, the graph is available by default only when the Spark application has fewer than 500 stages. If an application has more stages than that, the graph page fails with an error like this:

`The number of stages in this application exceeds limit (500), graph page is disabled in this case.`

As a workaround, apply the following Spark configuration to raise the limit before you start the Spark application:

`spark.ui.enhancement.maxGraphStages 1000`

Keep in mind that raising the limit can hurt the performance of the graph page and its API, because the content can become too large for the browser to fetch and render.
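
For example, one way to apply this setting in a Synapse notebook is the `%%configure` magic, run in the first cell before the Spark session starts. This is a minimal sketch: the `conf` JSON shape follows the common Synapse/Livy convention, and the value `1000` is only illustrative.

```
%%configure -f
{
    "conf": {
        "spark.ui.enhancement.maxGraphStages": "1000"
    }
}
```

The same key can also be set at the Apache Spark pool level, or passed on the command line as `--conf spark.ui.enhancement.maxGraphStages=1000` when you submit with `spark-submit`.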
## Explore the Diagnosis tab in Apache Spark history server

To access the Diagnosis tab, select a job ID. Then select **Diagnosis** on the tool menu to get the job Diagnosis view. The Diagnosis tab includes **Data Skew**, **Time Skew**, and **Executor Usage Analysis**.
