Powering Your Fastify API with Mercurius: A Deep Dive

You’ve chosen Fastify for its legendary performance and robust plugin architecture. Now, you want to add a GraphQL layer to your API. You face a choice: do you use a general-purpose GraphQL server that can be adapted to Fastify, or do you choose a tool built specifically for the job?

For a growing number of developers in the Fastify ecosystem, the answer is clear: Mercurius.

Mercurius is not just another GraphQL library; it is the official, high-performance GraphQL adapter for Fastify. It was built from the ground up to integrate seamlessly with Fastify’s core principles, resulting in a developer experience that is as elegant as it is fast. This article explores what makes Mercurius the idiomatic choice for bringing GraphQL to your Fastify application.

What is Mercurius and Why Does It Exist?

At its heart, Mercurius exists to answer one question: “What would a GraphQL server look like if it were designed to take full advantage of Fastify’s architecture?”

The result is a library that prioritizes two things above all else:

  1. Uncompromising Performance: It leverages Fastify’s highly optimized request and reply objects directly, avoiding unnecessary abstractions to process GraphQL queries at blistering speeds.
  2. Deep Integration: It doesn’t just “run on” Fastify; it becomes a natural part of it. It respects Fastify’s encapsulation model and hooks into its lifecycle, allowing you to use familiar patterns for both your REST and GraphQL endpoints.
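
To make that integration concrete, here is a minimal sketch of registering Mercurius as an ordinary Fastify plugin. The add query and its resolver are placeholders; the registration call itself follows the basic pattern from the Mercurius documentation.

JavaScript

import Fastify from 'fastify';
import mercurius from 'mercurius';

const app = Fastify({ logger: true });

// A throwaway schema and resolver, just to show the shape of the registration
const schema = `
  type Query {
    add(x: Int!, y: Int!): Int!
  }
`;

const resolvers = {
  Query: {
    add: async (_obj, { x, y }) => x + y,
  },
};

// Mercurius registers like any other Fastify plugin
app.register(mercurius, { schema, resolvers });

await app.listen({ port: 3000 });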

The Core Features That Set Mercurius Apart

While many libraries can parse a GraphQL query, Mercurius’s power lies in its unique, Fastify-centric features.

1. Context is King: Direct Access to Fastify’s Core

This is arguably Mercurius’s most powerful feature. In any GraphQL resolver, you are given a context object containing request-specific information. In Mercurius, that context is wired straight into Fastify: it exposes the Fastify instance as context.app and the Fastify Reply object as context.reply.

This has profound implications:

  • You get immediate access to context.reply.request, which holds all request data, including headers and the request-specific logger (context.reply.request.log).
  • Any decorators you’ve added to the Fastify instance (e.g., fastify.db from a database plugin) are available on context.app.
  • Any information you add to the request in a hook (e.g., request.user from an authentication hook) is immediately available in your resolvers via context.reply.request.user.

JavaScript

const resolvers = {
  Query: {
    me: async (_obj, _args, context) => {
      // `context.reply` is the Fastify Reply; its `request` property is the Fastify Request
      const userId = context.reply.request.user.id; // Set by an auth hook
      context.reply.request.log.info(`Fetching user ${userId}`);

      // `context.app` is the Fastify instance, so decorators like `db` are available
      const user = await context.app.db.users.findById(userId);
      return user;
    }
  }
}
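
The context can also carry your own values. Mercurius accepts a context function at registration time, and whatever it returns is merged into the resolver context alongside the built-in app and reply properties. The snippet below is a small sketch that assumes the app, schema, and resolvers from the surrounding examples, and reuses the request.user value set by an authentication hook.

JavaScript

app.register(mercurius, {
  schema,
  resolvers,
  // Whatever this function returns is merged into the resolver context
  context: async (request, reply) => {
    return {
      // Surface the user set by the auth hook directly as `context.currentUser`
      currentUser: request.user,
    };
  },
});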

2. Leveraging the Fastify Lifecycle with Hooks

Because Mercurius is a well-behaved Fastify plugin, you can use standard Fastify hooks to manage your GraphQL endpoint. The most common use case is authentication. You can protect your entire GraphQL API by simply adding an onRequest hook to the plugin’s registration.

JavaScript

// In app.js or a dedicated GraphQL plugin
fastify.register(async function (app) {
  // This hook runs BEFORE Mercurius processes the request
  app.addHook('onRequest', async (request, reply) => {
    // Your authentication logic here
    await request.jwtVerify(); // e.g. provided by @fastify/jwt
  });

  app.register(mercurius, {
    // ... schema, resolvers, etc.
  });
});

This is a clean, reusable pattern that keeps your authentication logic separate from your business logic (the resolvers).

3. High-Performance Gateway & Federation

Mercurius is not just for monolithic APIs. It includes mercurius-gateway, a powerful tool that allows your Fastify app to act as a high-performance gateway in front of multiple downstream GraphQL services. It supports schema stitching and, crucially, Apollo Federation (v1 & v2), making it a viable, high-speed alternative to Apollo’s own gateway solutions.
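
In rough terms, a federated setup looks like the sketch below: each downstream service exposes a federation-capable schema, and the gateway is registered with a list of service names and URLs. Treat this as an illustration only; the service URLs are placeholders, and the exact package name and option shape should be confirmed against the current gateway documentation.

JavaScript

import Fastify from 'fastify';
import mercuriusGateway from '@mercuriusjs/gateway';

const gateway = Fastify({ logger: true });

// Each entry points at a downstream, federation-capable GraphQL service
gateway.register(mercuriusGateway, {
  gateway: {
    services: [
      { name: 'users', url: 'http://localhost:4001/graphql' },
      { name: 'posts', url: 'http://localhost:4002/graphql' },
    ],
  },
});

await gateway.listen({ port: 3000 });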

4. Just-In-Time (JIT) Compilation

To squeeze out every last drop of performance, Mercurius can compile incoming queries into highly optimized execution functions Just-In-Time (via graphql-jit). Once a query has been seen a configurable number of times, subsequent executions skip the generic interpretation path, giving a significant advantage over servers that must interpret the query and schema on every single request.
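
Enabling it is a single option at registration time. The sketch below assumes the schema and resolvers from the earlier examples and asks Mercurius to JIT-compile a query after it has been executed once.

JavaScript

fastify.register(mercurius, {
  schema,
  resolvers,
  // Compile a query with graphql-jit after it has been executed this many times
  jit: 1,
});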

A Practical Look: Building a Simple API

Let’s see how these pieces fit together in a practical plugin (src/plugins/graphql.js).

  1. Installation:

Bash

npm install mercurius graphql

  2. The Plugin:

JavaScript

import fp from 'fastify-plugin';
import mercurius from 'mercurius';

async function graphqlPlugin(fastify, options) {
  const schema = `
    type User {
      id: ID!
      name: String!
    }

    type Query {
      user(id: ID!): User
    }
  `;

  // Mock database
  const users = {
    '1': { id: '1', name: 'Štěpán' },
    '2': { id: '2', name: 'Eliáš' },
  };

  const resolvers = {
    Query: {
      user: async (_obj, { id }, context) => {
        context.reply.request.log.info(`Fetching user with id: ${id}`);

        if (!users[id]) {
          throw new mercurius.ErrorWithProps('User not found', {
            code: 'USER_NOT_FOUND',
            statusCode: 404,
          });
        }

        return users[id];
      },
    },
  };

  fastify.register(mercurius, {
    schema,
    resolvers,
    // Enables the fantastic in-browser IDE for development
    graphiql: true,
  });
}

export default fp(graphqlPlugin);

This self-contained plugin can now be loaded by Fastify’s autoload mechanism. When you run your server, you can navigate to /graphiql to use the interactive IDE to test your queries.
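
You can also exercise the endpoint programmatically. The sketch below uses Fastify’s built-in inject utility to POST a query to the default /graphql route without opening a network socket; it assumes the plugin above has been registered on app.

JavaScript

const response = await app.inject({
  method: 'POST',
  url: '/graphql',
  payload: {
    query: `
      query {
        user(id: "1") {
          id
          name
        }
      }
    `,
  },
});

console.log(response.json()); // { data: { user: { id: '1', name: 'Štěpán' } } }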

Conclusion: The Natural Choice for Fastify

While alternatives such as Apollo Server with its Fastify integration are excellent and can certainly get the job done, Mercurius offers something more: a truly cohesive experience. It embraces the Fastify philosophy, extending its patterns of performance and predictability into the world of GraphQL.

By choosing Mercurius, you aren’t just adding a library; you’re adopting a tool that was meticulously crafted to feel like a natural, integrated part of the framework you already chose for its speed and structure. If you are building a GraphQL API with Fastify, Mercurius should be at the top of your list—it’s the idiomatic, powerful, and performant choice.
