
Hi @Ohthatorh,

Thanks for raising this. Seeing the `next-server` Node.js process pinned at 100% CPU under heavy load, such as multiple SEO spiders crawling at once, is not uncommon, but it does indicate a performance bottleneck or a resource limit being hit.

Here are some suggestions to improve performance and reduce CPU usage:

  • Enable caching

Use Incremental Static Regeneration (ISR) or Static Site Generation (SSG) wherever possible, so pages are served from cache instead of being re-rendered on every request.
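As a minimal sketch of ISR in the App Router (the route and fetch URL here are hypothetical, not from your project):

```typescript
// app/blog/[slug]/page.tsx — illustrative page component.
// With `revalidate`, the page is rendered once, cached, and re-rendered in
// the background at most every 60 seconds, so repeated crawler hits are
// served from the cache instead of invoking React rendering each time.
export const revalidate = 60;

export default async function BlogPost({
  params,
}: {
  params: { slug: string };
}) {
  // Hypothetical data source — replace with your own API or CMS.
  const res = await fetch(`https://example.com/api/posts/${params.slug}`);
  const post = await res.json();
  return (
    <article>
      <h1>{post.title}</h1>
    </article>
  );
}
```

Fully static pages (SSG) go further: with no dynamic data at request time, Next.js can serve plain HTML and the Node.js process barely gets involved.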

  • Rate limit bots/crawlers

Configure your server or CDN to throttle or block aggressive crawlers that hammer your site. This helps prevent overload.
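In production this is usually done at the CDN/WAF or reverse proxy (e.g. nginx `limit_req`), but the core logic can be sketched as a small in-memory fixed-window limiter you could call from Next.js middleware — all names here are illustrative, and a real multi-instance deployment would need a shared store like Redis:

```typescript
// Fixed-window rate limiter: at most `limit` requests per `windowMs`
// per key (typically the client IP or a bot's user-agent).
type Window = { count: number; start: number };

class RateLimiter {
  private windows = new Map<string, Window>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the request identified by `key` is allowed.
  // `now` is injectable to keep the logic deterministic and testable.
  allow(key: string, now: number = Date.now()): boolean {
    const w = this.windows.get(key);
    if (!w || now - w.start >= this.windowMs) {
      // First request in a fresh window: reset the counter.
      this.windows.set(key, { count: 1, start: now });
      return true;
    }
    w.count += 1;
    return w.count <= this.limit;
  }
}
```

From middleware you would respond with HTTP 429 (and optionally a `Retry-After` header) whenever `allow` returns false; well-behaved crawlers back off when they see 429s.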

  • Use a CDN

Offload static assets and cacheable pages via a CDN to reduce load on your …
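For responses you control directly, such as API routes, you can make them CDN-cacheable with `Cache-Control` — a sketch, with a hypothetical route and payload:

```typescript
// pages/api/posts.ts — illustrative API route.
// `s-maxage` lets a CDN or edge cache serve this response for 60 seconds;
// `stale-while-revalidate` lets it keep serving the stale copy while it
// fetches a fresh one, so bursts of crawler traffic rarely reach Node.js.
import type { NextApiRequest, NextApiResponse } from "next";

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  res.setHeader(
    "Cache-Control",
    "public, s-maxage=60, stale-while-revalidate=300"
  );
  res.status(200).json({ posts: [] }); // hypothetical payload
}
```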

Answer selected by Ohthatorh