
Benchmarking the Next.js server vs nginx at serving a static site


Adam Jones

Update: I did a follow-up with a more realistic setup, showing what the performance looks like when this is running on a remote server.

Next.js is a React framework that allows you to build web applications. It offers a bunch of neat features to make web development easier.

While Next.js leans heavily into server-side features like SSR and API routes, it can be used to build static websites. Three ways to deploy these static sites are:[1]
  1. Export the files and serve them with nginx or similar
  2. Use the default Next.js production server (next build && next start)
  3. (not recommended[2]) Use the Next.js development server (next dev)

My suspicion is that option 1 is likely the fastest. However, option 2 might often be simpler: you don’t need to configure a second web server, and your deployment configuration can be the same across static and dynamic Next.js apps.

Knowing how much slower the Next.js server actually is lets us evaluate that penalty properly, and make evidence-based decisions about the trade-off.

So, how fast is it? I took this blog (yes, the very one you’re reading) and tested out how long it takes to load.

Detailed reproduction steps

Specifically, I took commit 34df60, ran npm install with Node v20.12.2, then ran:

nginx

Tested with nginx/1.25.5 installed via Homebrew.

Create nginx.conf:

events {}

http {
    server {
        listen 8080;
        root ./dist;

        index index.html;
        error_page 404 /404.html;

        location / {
            try_files $uri $uri/ $uri.html =404;
        }
    }
}

Then run npm run build && nginx -c $PWD/nginx.conf -p $PWD

next start

Edit next.config.js to remove output: 'export'

Run npm run build && PORT=8080 npx next start

next dev

Run npx next dev --port 8080

Then run benchmark-next-vs-nginx with:

URL_TO_TEST=http://localhost:8080/blog/dont-use-contact-forms/ npm start

Median HTTP load times, high reqs/second (i.e. for the HTML response, using k6):

Median page load times, no additional reqs/second (i.e. for the browser to consider the page fully loaded, using Puppeteer):

Full results
(durations in milliseconds)

|                     | nginx       | next start  | next dev    |
|---------------------|-------------|-------------|-------------|
| load_passes         | 500         | 500         | 500         |
| load_avg            | 41.39144283 | 61.76591118 | 244.250632  |
| load_med            | 38.690521   | 58.83       | 241.6322715 |
| load_p90            | 48.1870875  | 71.9440042  | 266.7428548 |
| load_p95            | 54.7479899  | 77.74398715 | 278.5116128 |
| req_passes          | 285245      | 224018      | 25081       |
| req_duration_avg    | 5.348423601 | 245.8335255 | 1782.082563 |
| req_duration_median | 0.052       | 253.4661    | 688.108     |
| req_duration_p90    | 0.064       | 276.589     | 2267.1042   |
| req_duration_p95    | 0.093       | 280.7       | 2539.82495  |
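The requests-per-second figures quoted in the takeaways can be recovered from the req_passes row, assuming a 60-second k6 run (an assumption on my part, but it matches the quoted 3733 req/s for next start):

```javascript
// Derive requests/second from the req_passes counts in the table above.
// RUN_SECONDS = 60 is an assumption inferred from the quoted figures.
const RUN_SECONDS = 60;
const reqPasses = { nginx: 285245, nextStart: 224018, nextDev: 25081 };

const reqPerSecond = Object.fromEntries(
  Object.entries(reqPasses).map(([name, passes]) => [
    name,
    Math.floor(passes / RUN_SECONDS),
  ])
);

console.log(reqPerSecond); // { nginx: 4754, nextStart: 3733, nextDev: 418 }
```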

Benchmark source code.

Takeaways

Wow, nginx is fast: 4874x faster than Next.js under heavy load on median request duration (0.052ms vs 253.4661ms). I had some sense of this before, but hadn’t appreciated quite how much it dominates on request duration. I think this is largely because Next.js slows down quite a bit under heavy request load: its median time when processing only one request at a time was much lower.

That said, the Next.js server is plenty fast for real-world use cases. It was just 20ms slower (a fifth of a blink) to actually load the page under low load, and even under heavy load (3733 requests/second) it still had a median request duration of 253ms.

The real-world penalty will also likely be lower than this: network latency will likely start to dominate. And if you’ve got dynamic requests to APIs, that will almost certainly be the bottleneck rather than the static file serving.

Overall, with a penalty this small, in most cases I’d pick running the Next.js server, given it’s significantly easier to set up (provided you’re building a Next.js app already) and means your deployment can be standardised across static and dynamic applications. It probably only becomes worth switching to nginx when you’re getting hundreds or thousands of requests per second, or if you’ve already got a really smooth nginx pipeline.


Footnotes

  1. These are ways to do it if you’re hosting on bare metal. There are of course also services that will effectively do one of these for you, like Vercel’s cloud hosting or GitHub Pages.

  2. I think the HMR functionality opens up additional attack surface area, and I expect this to be a lot slower. I’m testing this anyway as I could imagine people doing this by mistake.