Aleksei Zhilyuk for FocusReactive

Originally published at focusreactive.com

Next.js Conf 2024 Highlights and React Framework Future

While the overall ambitious tone of previous conference editions has cooled off, we enthusiastically welcomed the direction the presenting team has taken, focusing on making the Next.js API more streamlined and self-hosting friendly. We saw updates, celebrated the stable release of Turbopack for dev, and enjoyed insightful deep-dives from the people behind the framework.

This was followed by case studies from major Next.js adopters and strategic partners, including our most technological headless CMS of choice, Sanity, which went all in with its Next.js 15 support and innovative “live” CMS content approach. We got so excited about the latter that we wrote a separate blog post breaking down the vision behind the new architecture of choice for the Next.js + Sanity combo.

Let’s break down some of the talks we followed most closely:

Opening Keynote

Guillermo Rauch (CEO of Vercel and creator of Next.js) welcomes everyone to the fifth anniversary edition of Next.js Conf and begins his speech by expressing gratitude to the main contributors to Next.js.

Guillermo Rauch on stage

A lot happened during the year:

  • 4,000 PRs
  • 5 Next.js releases were made
  • 570 new contributors joined the project
  • Next.js reaches 7 million downloads on npm
  • Many large companies have already migrated to the App Router, such as PayPal, Wayfair, and xAI

Guillermo Rauch believes that Next.js and its ecosystem are something we can rely on in the future. Next.js is used all over the world to create everything from the simplest websites to complex, sophisticated applications.

Guillermo Rauch was guided by these principles when creating Next.js:

  1. Make it work
  2. Make it right
  3. Make it fast
  4. Make it blazing fast

make, right, fast

The App Router was created to make data fetching and rendering more predictable, since, according to Guillermo Rauch, developers should not have to worry about caching and can instead focus on building the application.

So that an idea can turn into a product as quickly as possible, and so that each iteration stays as short as possible, Guillermo Rauch is pleased to announce that Turbopack is finally stable for development.

The new compiler is 50% faster than the previous one for initial compilation and 90% faster for Fast Refresh.

nextjs.org compile time

Coming soon for Turbopack: persistent caching, so Turbopack can reuse compiled work across restarts for even faster performance.

Simplifying Next.js Cache API

Next, Delba de Oliveira (DX Engineer at Vercel) shows how to make a demo application faster using caching and introduces the new experimental Next.js “use cache” directive, which allows you to explicitly specify that a component or the result of a function call should be cached.

The directive works both in components, similar to “use client”, and in functions, similar to “use server”. It is accompanied by two new APIs: “cacheTag” for tagging cache entries and “cacheLife” for specifying how long a given cache entry should live.

use cache

partial prerendering

dynamic/streaming/prerender/revalidate

Developers can try these experimental changes in the canary version of Next.js. We will also soon write up our own deep-dive into the new “use cache” API.
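In the meantime, here is a rough sketch of how the pieces from the talk fit together. It assumes a Next.js 15 canary with the relevant experimental flag enabled in next.config; the exact import names (the unstable_ aliases) and the endpoint are illustrative and may change between canary releases.

```ts
import {
  unstable_cacheTag as cacheTag,
  unstable_cacheLife as cacheLife,
} from 'next/cache';

export async function getProducts() {
  'use cache';          // cache the result of this function call
  cacheTag('products'); // tag the entry so revalidateTag('products') can purge it later
  cacheLife('hours');   // built-in profile controlling how long the entry stays fresh

  const res = await fetch('https://api.example.com/products'); // illustrative endpoint
  return res.json();
}
```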

Better self-hosting of Next.js

Next, Lee Robinson (VP of Product at Vercel) appears on stage to walk through the improvements the team has shipped for self-hosted Next.js, reiterating Vercel’s focus on keeping Next.js open-source first.

In v15, the Vercel team has made it easier to configure how caching works by default, also simplifying the minimal DevOps setup required for self-hosting Next.js. Documentation covering various deployment scenarios and templates has also been greatly improved, including the launch of a Next.js community GitHub org with up-to-date deployment recipes for the community’s most popular hosting targets.

Previously, Next.js used a WebAssembly-based image optimization library, but the team was dissatisfied with how much memory it consumed, so Next.js now uses sharp, which is installed automatically.

Default Cache-Control headers have been updated.

next14 vs next15 cache control headers

cache control config
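The exact configuration from the slides isn’t reproduced here, but as a general illustration for self-hosters who want to tune response caching themselves, next.config exposes a headers() hook. The route pattern and header values below are purely illustrative assumptions, not the new Next.js 15 defaults:

```ts
// next.config.ts — overriding Cache-Control for a route pattern when self-hosting
import type { NextConfig } from 'next';

const nextConfig: NextConfig = {
  async headers() {
    return [
      {
        source: '/static/:path*', // hypothetical route pattern
        headers: [
          { key: 'Cache-Control', value: 'public, max-age=31536000, immutable' },
        ],
      },
    ];
  },
};

export default nextConfig;
```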

AMA: Next.js Team


A variety of topics was covered during the Ask Me Anything session with the Next.js team; we suggest you dive into it yourself via the stream recording.

React Server Components: Elevating speed, interactivity, and user experience

Aurora Scharff starts her talk by saying that React Server Components have changed the way we build apps.

Some of the new possibilities that come with RSCs:

  1. Fetch data asynchronously inside the component itself
  2. Access backend resources directly from the component
  3. No JS is shipped to the client
  4. Streaming

RSCs and the latest Next.js features help increase not only website speed but also development velocity. New versions of Next.js and React help you split compute load between the client and the server, letting you shape the application around your needs. Other important aspects that have been improved are interactivity and the ability to easily build responsive apps.

With the new features come new limitations: for example, you can’t use client hooks such as useState and useEffect, or access browser information, in such components.

New development patterns to leverage the latest features and create a stunning UI:

  1. Move data fetching closer to UI

Instead of fetching data at the top level and letting requests block rendering, move the data fetching inside the component and wrap the component in a Suspense boundary (see the sketch below).

Before and after

render-blocking requests

suspense requests
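A minimal sketch of this pattern; the getReviews helper and its endpoint are hypothetical stand-ins for real data access:

```tsx
import { Suspense } from 'react';

// Hypothetical data helper; in a real app this would call your API or database.
async function getReviews(): Promise<{ id: string; text: string }[]> {
  const res = await fetch('https://api.example.com/reviews');
  return res.json();
}

// The slow fetch lives inside the component that actually needs the data…
async function Reviews() {
  const reviews = await getReviews();
  return (
    <ul>
      {reviews.map((review) => (
        <li key={review.id}>{review.text}</li>
      ))}
    </ul>
  );
}

// …so the page renders its static parts immediately and streams the rest in.
export default function ProductPage() {
  return (
    <main>
      <h1>Product</h1>
      <Suspense fallback={<p>Loading reviews…</p>}>
        <Reviews />
      </Suspense>
    </main>
  );
}
```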

  2. Add a client component while keeping requests non-blocking

If a component inside a Suspense boundary needs to be client-side, you can no longer call server functions from it. In this case, create the promise at the top level, pass it to the client component as a prop, and resolve it with React’s new use() function (see the sketch below). This helps you avoid blocking the render and keeps the client component inside the Suspense boundary, so a static shell can be shown on the initial render.

react use hook

passing promise from RSC to RCC
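A minimal sketch, again with a hypothetical endpoint: the Server Component starts the request and hands the pending promise to a Client Component, which resolves it with use():

```tsx
// reviews-section.tsx (Server Component): start the request but do NOT await it.
import { Suspense } from 'react';
import { ReviewsList } from './reviews-list';

export default function ReviewsSection() {
  // The pending promise is passed down as a prop instead of being resolved here.
  const reviewsPromise = fetch('https://api.example.com/reviews')
    .then((res) => res.json() as Promise<string[]>);

  return (
    <Suspense fallback={<p>Loading reviews…</p>}>
      <ReviewsList reviewsPromise={reviewsPromise} />
    </Suspense>
  );
}
```

```tsx
// reviews-list.tsx (Client Component): resolve the promise with React's use().
'use client';
import { use } from 'react';

export function ReviewsList({ reviewsPromise }: { reviewsPromise: Promise<string[]> }) {
  const reviews = use(reviewsPromise); // suspends until the server-created promise resolves
  return (
    <ul>
      {reviews.map((text) => (
        <li key={text}>{text}</li>
      ))}
    </ul>
  );
}
```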

  3. useTransition, useOptimistic

These new React hooks allow you to create smooth transitions for any user action. Moreover, with these hooks you can show instant feedback to the user without depending on the network connection.
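For example, an optimistic “like” button might look roughly like this; likePost is a hypothetical server action assumed for the sketch:

```tsx
'use client';
import { useOptimistic, useTransition } from 'react';
import { likePost } from './actions'; // hypothetical server action

export function LikeButton({ likes }: { likes: number }) {
  const [isPending, startTransition] = useTransition();
  // Show the incremented count immediately, before the server confirms it.
  const [optimisticLikes, addOptimisticLike] = useOptimistic(
    likes,
    (current, delta: number) => current + delta
  );

  return (
    <button
      disabled={isPending}
      onClick={() =>
        startTransition(async () => {
          addOptimisticLike(1); // instant feedback, independent of the network
          await likePost();     // the real request runs in the background
        })
      }
    >
      ♥ {optimisticLikes}
    </button>
  );
}
```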

  4. React cache

Use the new cache function to prevent firing the same request multiple times within a render. This means we can keep the existing pattern of fetching data directly inside components while maintaining composition.
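A small sketch of per-render deduplication with React’s cache, using a hypothetical getUser helper:

```tsx
import { cache } from 'react';

// Deduplicated per render: several components can call getUser(id) during the
// same request and only one fetch is actually executed.
export const getUser = cache(async (id: string) => {
  const res = await fetch(`https://api.example.com/users/${id}`); // illustrative endpoint
  return res.json() as Promise<{ id: string; name: string }>;
});
```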

All these new features and patterns allow us to build fully interactive apps without reaching for useState and useEffect.

Best development practices:

  • Resolve promises deep in the tree
  • Display pending indicators
  • Put state in the URL

New tools:

Leverage React 19 features:

  • cache() - perform per-render caching for expensive func calls
  • useOptimistic() - respond to user interactions instantly even when the request is slow
  • use() - suspend client components as they resolve a promise passed down

Next.js features:

  • staleTimes - set stale times for dynamic page segments to reuse them across subsequent requests
  • PPR - statically renders parts of the page or layout to improve performance
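Both of these are configured in next.config. A minimal sketch, assuming a recent Next.js 15 canary (the flag names and values below reflect the experimental API and may change):

```ts
// next.config.ts
import type { NextConfig } from 'next';

const nextConfig: NextConfig = {
  experimental: {
    // Client router cache: reuse dynamic segments for 30 s and static ones for 5 min.
    staleTimes: { dynamic: 30, static: 300 },
    // Let individual routes opt into Partial Prerendering.
    ppr: 'incremental',
  },
};

export default nextConfig;
```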

The long and winding road: CSR to static export to SSG

Building user interfaces in the age of AI

Oleg starts off by asking what makes a great interface. A great interface is measured by its ability to help users accomplish their task as quickly and as effortlessly as possible.

The most important things here are speed and reliability.

What is Generative UI?

It is an umbrella term for any project that leverages LLMs to enhance the user interface. It is a spectrum, and each edge of the spectrum has its own requirements, tradeoffs, and performance characteristics, but both are united by the crucial role of the LLM.

generative UI spectrum

The slide represents the current state of LLMs’ capabilities for generating UI.

Generating HTML from text is a hard problem with the following challenges:

  1. Slow: it takes up to 5-7 seconds to generate an HTML page on Perplexity
  2. No reliable way to deploy it
  3. Hard to keep in sync with a design system
  4. Token inefficiency

The Perplexity team is focused on text-to-object generation instead.

One of the patterns the Perplexity team uses is to define a schema for each presentational component and derive the component’s props from that schema.

Example:
schema for component props
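The slide itself isn’t reproduced here; as an illustration of the idea, a props schema could be declared with a validation library such as zod (the WeatherCard component and its fields are hypothetical):

```ts
import { z } from 'zod';

// Hypothetical schema describing the props of a presentational <WeatherCard /> component.
export const weatherCardSchema = z.object({
  location: z.string(),
  temperatureC: z.number(),
  condition: z.enum(['sunny', 'cloudy', 'rainy', 'snowy']),
  forecast: z.array(
    z.object({ day: z.string(), highC: z.number(), lowC: z.number() })
  ),
});

export type WeatherCardProps = z.infer<typeof weatherCardSchema>;
```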

Then use the component’s schema as the structured output format for the model. This way you can create simple components with streaming functionality powered by LLMs.

component handling streaming schema
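The talk doesn’t prescribe a specific library, but as one way to sketch the idea, the Vercel AI SDK’s streamObject can take the schema above and stream progressively more complete props objects; the model choice, prompt, and file names are assumptions:

```ts
import { streamObject } from 'ai';            // Vercel AI SDK (v4-style usage)
import { openai } from '@ai-sdk/openai';
import { weatherCardSchema } from './weather-card-schema';

// Ask the model to fill in the component's props and stream partial objects,
// so the UI can render fields as they arrive instead of waiting several seconds.
export async function streamWeatherCard(query: string) {
  const { partialObjectStream } = streamObject({
    model: openai('gpt-4o-mini'),
    schema: weatherCardSchema,
    prompt: `Fill in weather card data for: ${query}`,
  });

  for await (const partialProps of partialObjectStream) {
    // Each chunk is a progressively more complete (still partial) props object.
    console.log(partialProps);
  }
}
```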

Key Takeaways:

  1. Use LLMs as APIs
  2. Faster and cheaper
  3. Fits well into existing flow
  4. Easy to integrate
  5. Structured output
  6. React friendly JSON data
  7. Guaranteed to be valid
  8. Schemas are useful without LLMs
  9. Streaming UX
  10. Address re-rendering issues
  11. Use OSS libs
  12. Implementation depends on your code

Optimizing LCP: Partial Prerendering deep dive

Wyatt Johnson

Wyatt Johnson, a software engineer at Vercel, provides a comprehensive overview of “Partial Prerendering”, an experimental feature aimed at optimizing the Largest Contentful Paint (LCP) by combining the best of static and dynamic rendering techniques. This method has been developed to tackle the limitations of traditional rendering methods, improving site performance and user experience.

what are Core Web Vitals?

Wyatt begins by explaining the significance of Core Web Vitals, particularly focusing on LCP, which measures the render time of the largest image or text block visible within the viewport. He highlights the challenges faced with traditional rendering approaches where developers must choose between the speed of static rendering and the flexibility of dynamic rendering. Static rendering, while fast, cannot incorporate request data, leading to delays in rendering dynamic content. Conversely, dynamic rendering incorporates request data but often at the expense of speed due to server response times.

speed functionality tradeoff

The session delves into the mechanics of Partial Prerendering (PPR), which allows for a static shell of a page to be generated at build time and served from the edge. Simultaneously, it sends a request back to the origin to complete the dynamic rendering. This approach minimizes the time to first byte and ensures that the page loads quickly while still supporting dynamic capabilities.

Wyatt demonstrates practical applications of PPR in an e-commerce setting, showing how PPR can streamline the rendering process, reduce latency, and improve the user experience by delivering a fast initial load with dynamic capabilities intact. He further explains the technical implementation of PPR, discussing how it leverages React’s capabilities to suspend and resume rendering as needed, based on the dynamic content requirements.

enabling partial prerendering
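As a rough sketch of what opting a route into PPR looks like, assuming experimental.ppr is set to 'incremental' in next.config (as in the earlier snippet); the page content and cookie are hypothetical:

```tsx
// app/product/page.tsx — opting a single route into Partial Prerendering
import { Suspense } from 'react';
import { cookies } from 'next/headers';

export const experimental_ppr = true;

// Hypothetical dynamic part: reading cookies ties this subtree to the request.
async function PersonalizedOffers() {
  const cart = (await cookies()).get('cart')?.value;
  return <p>Offers based on cart: {cart ?? 'empty'}</p>;
}

export default function ProductPage() {
  return (
    <main>
      {/* Static shell: prerendered at build time and served immediately. */}
      <h1>Product details</h1>
      {/* Dynamic hole: streamed from the origin once request data is available. */}
      <Suspense fallback={<p>Loading personalized offers…</p>}>
        <PersonalizedOffers />
      </Suspense>
    </main>
  );
}
```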

To conclude, Wyatt emphasizes the future potential of PPR in Next.js and the ongoing efforts to integrate this feature across different hosting environments. He expresses enthusiasm for the capabilities of Next.js in bridging the gap between static speed and dynamic flexibility, ultimately providing developers with the tools to build faster and more responsive web applications.

streaming on a $5 VPS

That's a Wrap

Thank you for following our highlights from Next.js Conf 2024; we enjoy sharing these with the community as we eagerly follow all the developments in the Next.js space.

Follow our blog for more Next.js content and practical use cases that we have solved while building a variety of Next.js apps, as well as the practices we developed through our Performance and SEO audits for our clients.

We are excited about the direction the Next.js team took, prioritizing API simplicity, increased transparency, and support for non-Vercel deployments, which is dearly important to us. We support the renewed open-source spirit and the simplicity it brings to deploying Next.js to your own cloud environment.
