
Commit 4520c17

zhengruifeng authored and dongjoon-hyun committed
[SPARK-54340][PYTHON][FOLLOW-UP] Add link and examples for run-with-viztracer
### What changes were proposed in this pull request?

Add link and examples for `run-with-viztracer`.

### Why are the changes needed?

To make it easier for developers to have a try.

### Does this PR introduce _any_ user-facing change?

Dev-only changes.

### How was this patch tested?

Manually checked:

```sh
(spark_dev_313) ➜  spark git:(doc_viz) python/run-with-viztracer -h
Usage: run-with-viztracer your_original_commands

To view the profiling results, run:
  vizviewer pyspark_*.json

Environment:
  If SPARK_VIZTRACER_OUTPUT_DIR is set, the output will be saved to the directory.
  Otherwise, it will be saved to the current directory.

Requirements:
- viztracer must be installed (pip install viztracer)

Check the following documentation for more information on using viztracer:

  https://viztracer.readthedocs.io/en/latest/

Examples:
- Start pyspark shell
  python/run-with-viztracer bin/pyspark --conf spark.driver.memory=16g

- Start pyspark shell in Connect mode
  python/run-with-viztracer bin/pyspark --remote local
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes apache#53418 from zhengruifeng/doc_viz.

Authored-by: Ruifeng Zheng <ruifengz@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
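The help text above says output goes to `SPARK_VIZTRACER_OUTPUT_DIR` when that variable is set, and to the current directory otherwise. A minimal sketch of that selection logic (the variable name and fallback behavior come from the help text; the script's actual internals are not shown here and may differ):

```shell
#!/usr/bin/env bash
# Sketch (assumption) of the output-directory selection the help text
# describes: use SPARK_VIZTRACER_OUTPUT_DIR when set, otherwise fall
# back to the current working directory.
output_dir="${SPARK_VIZTRACER_OUTPUT_DIR:-$PWD}"
echo "viztracer output will be written under: $output_dir"
```

The `${VAR:-default}` expansion keeps the fallback in one place, so callers only need to export the variable to redirect traces.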
1 parent 0701855 commit 4520c17

File tree

1 file changed

+11
-0
lines changed


python/run-with-viztracer

Lines changed: 11 additions & 0 deletions
@@ -37,6 +37,17 @@ Environment:
 
 Requirements:
 - viztracer must be installed (pip install viztracer)
+
+Check the following documentation for more information on using viztracer:
+
+  https://viztracer.readthedocs.io/en/latest/
+
+Examples:
+- Start pyspark shell
+  $0 bin/pyspark --conf spark.driver.memory=16g
+
+- Start pyspark shell in Connect mode
+  $0 bin/pyspark --remote local
 EOF
 exit 0
 }
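The hunk above adds lines inside a heredoc that prints the script's help text (note the trailing `EOF` / `exit 0` context lines). A minimal sketch of that pattern, a help function built around `cat <<EOF` (the function name `usage` and the `-h`/`--help` dispatch are assumptions; the help lines themselves are taken from the diff):

```shell
#!/usr/bin/env bash
# Sketch of a heredoc-based help function like the one this commit extends.
# The function name and the -h/--help dispatch are assumptions; only the
# help-text lines come from the diff above.
usage() {
  cat <<EOF
Usage: $0 your_original_commands

Requirements:
- viztracer must be installed (pip install viztracer)

Check the following documentation for more information on using viztracer:

  https://viztracer.readthedocs.io/en/latest/

Examples:
- Start pyspark shell
  $0 bin/pyspark --conf spark.driver.memory=16g
EOF
  exit 0
}

# Print the help text only when explicitly requested.
if [[ "${1:-}" == "-h" || "${1:-}" == "--help" ]]; then
  usage
fi
```

Because the heredoc delimiter is unquoted, `$0` expands to the invoking script's path, which is why the committed help text can show concrete example commands without hardcoding the script name.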
