Commit 6d20477
Make decoding base64 in ContentData much faster
When exporting a large trace from gemini.google.com (~5s recording), it
was taking ~3.4s. Now it takes ~0.5s.
The expensive part was this:
> const bytes = Uint8Array.from(binaryString, m => m.codePointAt(0) as number);
I believe the slowness came from:
1. overhead of v8 calling the mapper function for each character of the
base64 string
2. needing to create a string instance of length 1 for each character
Additionally: now using charCodeAt instead of codePointAt. I didn't
notice a measurable improvement from this change, but we don't need the
complexity of code points when processing base64.
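The faster path replaces the per-character mapper with a plain indexed loop over a preallocated Uint8Array. A minimal sketch of this pattern follows; the function names are illustrative, not the actual ContentData code:

```typescript
// Slow: v8 invokes the mapper callback once per character, and the
// string iterator used by Uint8Array.from yields a length-1 string
// for each character.
function decodeBinaryStringSlow(binaryString: string): Uint8Array {
  return Uint8Array.from(binaryString, m => m.codePointAt(0) as number);
}

// Fast: preallocate the buffer and fill it with charCodeAt in a plain
// loop -- no callback, no per-character string allocation.
function decodeBinaryStringFast(binaryString: string): Uint8Array {
  const bytes = new Uint8Array(binaryString.length);
  for (let i = 0; i < binaryString.length; i++) {
    bytes[i] = binaryString.charCodeAt(i);
  }
  return bytes;
}

// A "binary string" holds one character per byte (e.g. the result of
// atob(base64)), so charCodeAt(i) is exactly the byte value.
const bytes = decodeBinaryStringFast('hello');
// bytes[0] === 104 ('h'), bytes.length === 5
```

charCodeAt suffices here because every character in a binary string is below 256, so surrogate pairs (the case codePointAt exists to handle) cannot occur.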
Bug: 444483828
Change-Id: I5c6ab8a345637672277062e84f4fdf05a06bdbbe
Reviewed-on: https://chromium-review.googlesource.com/c/devtools/devtools-frontend/+/6940949
Commit-Queue: Connor Clark <[email protected]>
Auto-Submit: Connor Clark <[email protected]>
Reviewed-by: Paul Irish <[email protected]>
1 file changed: +6, -1 lines