App Modernization

Transitioning from Redux to Next.js

Shawn Rice
May 28, 2024

In a recent project, I participated in refactoring a Redux application to embrace a micro-frontend pattern using Next.js and Turborepo. Throughout this project, I was (and continue to be) impressed with the excellent developer experience provided by Vercel’s framework. However, the most surprising discoveries came not just from the tooling but from the process of reshaping code from one architectural paradigm to another. We transitioned from a model where a globally available Redux store fully mirrored the database to one where individual routes in the application were responsible for fetching and managing only the data they needed. This transition prompted us to keep asking questions such as, “Is this code as performant as possible?”, “Is this code easy to reason about and refactor?”, and “Is this the most resilient solution to the problem?” Here are some of the most interesting discoveries from that process.

TypeScript Helped Identify and Remedy Component Bloat

Early in the process, it became apparent that we were often pulling down complete data sets from Redux to be consumed by components that required only a few properties from each point to display properly. In the context of Redux, this is a perfectly acceptable pattern. However, without a global store, satisfying the contract set by a fully hydrated data model could involve multiple API calls and a significant amount of post-processing.

The path of least resistance would have been to copy over the TypeScript definitions and wire up the component to make all the API calls necessary to match the types. But as the number of individual API calls needed to hydrate a given data point grew, we observed longer and longer page load times (because all of these fetch calls were happening on the Next.js server side, and no HTML was returned until every network request had resolved). Instead, we chose to prioritize both user and developer experience by trimming down our component types to be as light as possible, ensuring that each component received only the data it needed to render a useful UI.

For instance, let’s say we are working on migrating a view that displays a list of books and uses the following type:
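A representative version of that type might look something like this (the exact fields are illustrative; the important part is that it carries the reviews and similarTitles collections alongside the basic display fields):

```tsx
// types.ts
export interface Review {
  id: string;
  reviewer: string;
  rating: number;
  body: string;
}

export interface Book {
  id: string;
  title: string;
  author: string;
  genre: string;
  publicationYear: number;
  reviews: Review[];
  similarTitles: Book[];
}
```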

The view itself renders a table where each row displays some basic data and links to a detail view:
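Something along these lines (the component and prop names here are illustrative):

```tsx
// BookTable.tsx
import type { Book } from './types';

interface BookTableProps {
  books: Book[];
}

export function BookTable({ books }: BookTableProps) {
  return (
    <table>
      <tbody>
        {books.map((book) => (
          <tr key={book.id}>
            {/* Each row links to the detail view for the book */}
            <td>
              <a href={`/books/${book.id}`}>{book.title}</a>
            </td>
            <td>{book.author}</td>
            <td>{book.genre}</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}
```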

Let’s also establish that the Redux app this component is being migrated from fetches all of its data when the app first bootstraps, so using the full Book type to render the table view is not an issue. But in the Next.js app this component is being migrated to, the data needs to be fetched when the BookTable component mounts. In order to fulfill the contract established by the Book type, you would also need to fetch and process the reviews and similarTitles data. That’s three separate API calls (plus some post-processing) before a single pixel of the UI can be displayed. Depending on how many Books are being displayed per page, and how many reviews/similarTitles are being returned, this view could take several seconds to fully load. This would create an unacceptably negative user experience.

Looking at the BookTable component a little closer, we can see that only a handful of fields from the Book type are actually used in the component: the author, title, genre, and id. We can modify the props of BookTable to expect a partial set of values from the Book type using TypeScript’s Pick utility type:
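For example (the BookTableItem name is just for illustration):

```tsx
// BookTable.tsx
import type { Book } from './types';

// Only the fields the table actually renders, narrowed from Book with Pick
export type BookTableItem = Pick<Book, 'id' | 'title' | 'author' | 'genre'>;

interface BookTableProps {
  books: BookTableItem[];
}

export function BookTable({ books }: BookTableProps) {
  // The markup is unchanged; the component simply no longer demands
  // reviews or similarTitles it never displays.
  return (
    <table>
      <tbody>
        {books.map((book) => (
          <tr key={book.id}>
            <td>
              <a href={`/books/${book.id}`}>{book.title}</a>
            </td>
            <td>{book.author}</td>
            <td>{book.genre}</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}
```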

Now we can minimize the number of API calls needed to render our UI, ensuring a rapid page response when this route is loaded.

The useDispatch and useSelector Hooks Introduced Unexpected Challenges

While migrating component files, there were many instances where the only changes to a given component were at the ‘top’ of the file (the imports and the data setup, not the returned JSX). Many of these components were connected to Redux via the useSelector and useDispatch hooks. This often led to noisy, difficult-to-parse diffs during code review and contributed to more than one onerous merge conflict. But this work helped us identify a far more maintainable pattern: extract data plumbing to custom hooks.

Let’s return to the previous example to illustrate this point. In the Redux app we are migrating the BookTable component from, there is a BookList component that pulls the data from Redux:
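A sketch of what that might look like (the selector paths and store types are assumptions for the sake of illustration):

```tsx
// BookList.tsx (Redux app)
import { useSelector } from 'react-redux';

import type { RootState } from './store'; // assumed root-state type
import { BookTable } from './BookTable';

export function BookList() {
  // The store was fully hydrated at bootstrap, so we simply select what we need
  const books = useSelector((state: RootState) => state.books.items);
  const isLoading = useSelector((state: RootState) => state.books.isLoading);

  if (isLoading) return <p>Loading…</p>;

  return <BookTable books={books} />;
}
```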

Since our Next.js app needs to fetch the data for this view, we would need to make the following modification:
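Something like this (the /api/books endpoint is illustrative):

```tsx
'use client';

// BookList.tsx (Next.js app)
import { useEffect, useState } from 'react';

import { BookTable, type BookTableItem } from './BookTable';

export function BookList() {
  const [books, setBooks] = useState<BookTableItem[]>([]);
  const [isLoading, setIsLoading] = useState(true);

  // Fetch the list when the component mounts instead of reading a global store
  useEffect(() => {
    fetch('/api/books') // illustrative endpoint
      .then((res) => res.json())
      .then((data: BookTableItem[]) => setBooks(data))
      .finally(() => setIsLoading(false));
  }, []);

  if (isLoading) return <p>Loading…</p>;

  return <BookTable books={books} />;
}
```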

This simplified example is straightforward. However, as the number of individual useSelector and useDispatch calls grows, so does the complexity of the changes needed to migrate the files that use them. But there were some cases where multiple calls to useSelector were organized in their own custom hook, which made refactoring a genuine pleasure.

On our BookList component, instead of this:
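That is, the selector calls sitting directly at the top of the component, just as in the example above:

```tsx
// BookList.tsx
import { useSelector } from 'react-redux';

import type { RootState } from './store';
import { BookTable } from './BookTable';

export function BookList() {
  // Data plumbing wired directly into the component
  const books = useSelector((state: RootState) => state.books.items);
  const isLoading = useSelector((state: RootState) => state.books.isLoading);

  if (isLoading) return <p>Loading…</p>;

  return <BookTable books={books} />;
}
```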

We would do this:
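Something like this (the useBookData name and file layout are illustrative):

```tsx
// BookList.tsx: the component only knows about the hook
import { BookTable } from './BookTable';
import { useBookData } from './useBookData';

export function BookList() {
  const { books, isLoading } = useBookData();

  if (isLoading) return <p>Loading…</p>;

  return <BookTable books={books} />;
}

// useBookData.ts: all of the Redux plumbing lives here
import { useSelector } from 'react-redux';

import type { RootState } from './store';

export function useBookData() {
  const books = useSelector((state: RootState) => state.books.items);
  const isLoading = useSelector((state: RootState) => state.books.isLoading);

  return { books, isLoading };
}
```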

The BookList component remains unchanged and the changes to the useBookData custom hook are easy to track and reason about.
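Post-migration, the hook might end up looking something like this (again, the endpoint is illustrative):

```tsx
'use client';

// useBookData.ts: the contract with BookList stays the same; only the plumbing changes
import { useEffect, useState } from 'react';

import type { BookTableItem } from './BookTable';

export function useBookData() {
  const [books, setBooks] = useState<BookTableItem[]>([]);
  const [isLoading, setIsLoading] = useState(true);

  useEffect(() => {
    fetch('/api/books') // illustrative endpoint
      .then((res) => res.json())
      .then((data: BookTableItem[]) => setBooks(data))
      .finally(() => setIsLoading(false));
  }, []);

  return { books, isLoading };
}
```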

Next.js API Routes Enable Consistency and Superior Error Handling

This specific project pulls data from several different services, each with different API contracts. Typically, this would lead to inconsistent techniques and patterns for sending and retrieving data. Thankfully, Next.js offers a solution via API Routes. By moving all network calls to these backend routes, we have complete control over how the frontend requests and receives data. This allows us to compose the exact shape of what is returned by the routes and offload expensive post-processing to server-side code. It enables us to adhere to consistent patterns regarding request payloads, responses, and error handling. Additionally, we can move the transmission of potentially sensitive data, such as an auth token, to backend code.

To illustrate this, let’s look at the detail view linked from the BookList in the previous two examples:
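A sketch of that component (the markup is illustrative):

```tsx
// BookDisplay.tsx
import type { Book } from './types';

interface BookDisplayProps {
  book: Book;
}

export function BookDisplay({ book }: BookDisplayProps) {
  return (
    <article>
      <h1>{book.title}</h1>
      <p>{book.author}</p>
      <p>{book.genre}</p>

      <h2>Reviews</h2>
      <ul>
        {book.reviews.map((review) => (
          <li key={review.id}>
            {review.reviewer}: {review.body}
          </li>
        ))}
      </ul>

      <h2>Similar Titles</h2>
      <ul>
        {book.similarTitles.map((similar) => (
          <li key={similar.id}>
            <a href={`/books/${similar.id}`}>{similar.title}</a>
          </li>
        ))}
      </ul>
    </article>
  );
}
```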

Let’s say we need to query our own APIs to fetch the book and similarTitles. And let’s say we are using a third-party service for reviews that returns either an array of Reviews or a single string that reads 'no reviews found'. The service also requires that a secret token be included in the request headers.

This is the perfect opportunity to employ a server-side API route.
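Here is a minimal sketch of what such a route could look like as an App Router route handler (the upstream URLs, environment variable names, and response shapes are all assumptions; a pages/api route would follow the same idea):

```ts
// app/api/books/[id]/route.ts
import { NextResponse } from 'next/server';

export async function GET(
  _request: Request,
  { params }: { params: { id: string } }
) {
  try {
    // Fetch from our own services and the third-party reviews service in
    // parallel. The secret token never leaves the server.
    const [bookRes, similarRes, reviewsRes] = await Promise.all([
      fetch(`${process.env.BOOKS_API_URL}/books/${params.id}`),
      fetch(`${process.env.BOOKS_API_URL}/books/${params.id}/similar`),
      fetch(`${process.env.REVIEWS_API_URL}/reviews?bookId=${params.id}`, {
        headers: { Authorization: `Bearer ${process.env.REVIEWS_API_TOKEN}` },
      }),
    ]);

    // If anything failed, return an error rather than incomplete data
    if (!bookRes.ok || !similarRes.ok || !reviewsRes.ok) {
      return NextResponse.json(
        { error: 'Unable to load book details' },
        { status: 502 }
      );
    }

    const book = await bookRes.json();
    const similarTitles = await similarRes.json();
    const reviewsData = await reviewsRes.json();

    // The reviews service returns either an array of Reviews or the string
    // 'no reviews found', so normalize it to an array here
    const reviews = Array.isArray(reviewsData) ? reviewsData : [];

    // Compose the exact shape the frontend expects
    return NextResponse.json({ ...book, reviews, similarTitles });
  } catch {
    return NextResponse.json(
      { error: 'Unable to load book details' },
      { status: 500 }
    );
  }
}
```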

This allows us to avoid any strange race conditions during the data fetching, gives us the chance to handle the inconsistent data returned from the reviews service, ensures that we do not display incomplete data in the event of a network error, and keeps the secret token a secret.

We would then write up a custom hook that queries this new endpoint:
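Something like this (the hook name is illustrative):

```tsx
'use client';

// useBookDetail.ts
import { useEffect, useState } from 'react';

import type { Book } from './types';

interface BookDetailState {
  book: Book | null;
  isLoading: boolean;
  error: string | null;
}

export function useBookDetail(id: string): BookDetailState {
  const [state, setState] = useState<BookDetailState>({
    book: null,
    isLoading: true,
    error: null,
  });

  useEffect(() => {
    let cancelled = false;

    fetch(`/api/books/${id}`)
      .then((res) => {
        if (!res.ok) throw new Error('Unable to load book details');
        return res.json();
      })
      .then((book: Book) => {
        if (!cancelled) setState({ book, isLoading: false, error: null });
      })
      .catch((err: Error) => {
        if (!cancelled) setState({ book: null, isLoading: false, error: err.message });
      });

    // Avoid setting state if the component unmounts mid-request
    return () => {
      cancelled = true;
    };
  }, [id]);

  return state;
}
```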

We would then modify the BookDisplay component to use the custom hook like this:
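For example (with the component now taking just an id and letting the hook handle the plumbing):

```tsx
'use client';

// BookDisplay.tsx
import { useBookDetail } from './useBookDetail';

export function BookDisplay({ id }: { id: string }) {
  const { book, isLoading, error } = useBookDetail(id);

  if (isLoading) return <p>Loading…</p>;
  if (error || !book) return <p>Something went wrong loading this book.</p>;

  return (
    <article>
      <h1>{book.title}</h1>
      {/* ...the same markup as before, rendering reviews and similarTitles */}
    </article>
  );
}
```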

Wrapping Up: The Artisan Standard

The choice to adopt a micro-frontend project architecture has enabled our teams to work quickly and make significant changes without impeding other teams’ velocity. It gave us the opportunity to carefully evaluate the code powering the app and decide whether we were happy with the developer experience of writing and maintaining it. We found areas where code was doing more than necessary (with help from TypeScript). We found ways to make our component files more refactorable (with help from custom hooks). And we established a consistent contract between the backend and frontend code, helping new developers get up to speed faster (with help from Next.js's API routes).

In closing, our transition from Redux to Next.js was more than just a technical endeavor—it was an opportunity to practice our commitment to one of Artisan Studios’ core values: handcrafting excellence in everything we do. As we reflect on the lessons learned and the challenges overcome, we invite you to share your own experiences and insights in the pursuit of excellence. Together, let's continue to reshape the landscape of possibility and elevate the art of software development to new heights of excellence.
