Replies: 1 comment 1 reply
-
I can't think of a way of providing accessibility features without exposing the site to scraping or bots. If you have any suggestions, please let me know. This issue has been raised in multiple forums and in #7; I am really interested in solving it, but I don't know how.
Since the pages are cached in memory, the screenshotting process is done once on initial page load and then every X seconds (i.e. when a request comes in after the cache has expired after X seconds); a sketch of this flow follows below. In the future, the screenshots could also be taken in a "build" step and stored on disk to be served later while the server is running. This is possible because I am currently focusing on just static websites, so contents can be "prerendered". On taking screenshots on the fly and improving response time, I will explore how to get progressive JPEGs / AVIFs working with Puppeteer; from my initial research, it doesn't support them natively.
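A minimal sketch of the caching flow described above, not the project's actual implementation: the `CacheEntry` shape, `CACHE_TTL_MS` value, and `getScreenshot` helper are illustrative names, and only the Puppeteer calls (`launch`, `newPage`, `goto`, `screenshot`) are real library API.

```ts
import puppeteer, { Browser } from "puppeteer";

// Hypothetical cache entry: the rendered image plus its capture time.
interface CacheEntry {
  image: Buffer;
  capturedAt: number;
}

const CACHE_TTL_MS = 30_000; // the "X seconds" from the reply; value is illustrative
const cache = new Map<string, CacheEntry>();

async function screenshot(browser: Browser, url: string): Promise<Buffer> {
  const page = await browser.newPage();
  try {
    await page.goto(url, { waitUntil: "networkidle0" });
    return Buffer.from(await page.screenshot({ type: "png", fullPage: true }));
  } finally {
    await page.close();
  }
}

// On a request: serve the cached screenshot if it is still fresh,
// otherwise re-render once and refresh the cache.
async function getScreenshot(browser: Browser, url: string): Promise<Buffer> {
  const hit = cache.get(url);
  if (hit && Date.now() - hit.capturedAt < CACHE_TTL_MS) {
    return hit.image;
  }
  const image = await screenshot(browser, url);
  cache.set(url, { image, capturedAt: Date.now() });
  return image;
}

// Usage: launch one shared browser and reuse it across requests, e.g.
//   const browser = await puppeteer.launch();
//   const png = await getScreenshot(browser, "https://example.com");
```

A "build"-step variant would run the same `screenshot` call over a list of static pages ahead of time and write the buffers to disk instead of a `Map`.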
I was not aware of these projects, thanks for bringing them to my attention. BrowserBox is similar, yes, but it falls into the "streaming" solution I mentioned in the second FAQ: it is more resource-intensive on servers and won't work with CDNs.
-
This should be called OnlyHealthyHumans-Proxy. What about visually impaired people? It looks like an accessibility nightmare.
What about performance? Waiting for a full page to load headlessly in order to snapshot it and send it as a PNG image to the client is a waste of time. Better to use a progressive JPEG, or even better an optimized AVIF, with progressive snapshotting (e.g. every second, send an updated image while the page loads).
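Since Puppeteer's screenshot API has no progressive-encoding option, the suggested formats would have to come from a post-processing step. A minimal sketch of what that could look like, assuming the sharp library (not mentioned in the thread); the function names are illustrative, the `jpeg`/`avif` options are real sharp API.

```ts
import sharp from "sharp";

// Convert a raw PNG screenshot (e.g. from page.screenshot()) into a
// progressive JPEG so the client can render it incrementally.
async function toProgressiveJpeg(png: Buffer): Promise<Buffer> {
  return sharp(png).jpeg({ quality: 80, progressive: true }).toBuffer();
}

// AVIF variant: typically smaller files, though browsers generally
// do not decode AVIF progressively the way they do progressive JPEG.
async function toAvif(png: Buffer): Promise<Buffer> {
  return sharp(png).avif({ quality: 50 }).toBuffer();
}
```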
How does your project compare with existing projects such as WRP or BrowserBox?
Also, it doesn't prevent scraping by multimodal LLMs like Molmo.