How to Add robots.txt to a Next.js Application


In this article, you will learn how to add a robots.txt file to your Next.js app.

Next.js offers a lot of interesting features to developers, whether it's the way it creates pages (statically or at request time on the server) or updates them with Incremental Static Regeneration. Among all of its capabilities, one key advantage of Next.js over alternatives such as Create React App is its SEO support.

React is a great library for JavaScript developers, but it's not ideal for SEO out of the box. The reason is that React renders on the client side: when a user requests a page, instead of sending ready-made HTML, the server delivers JavaScript, which the browser then executes to generate the page.

As a result, the first page load in a SPA (single-page application) is often slower than in a server-side rendered application. Furthermore, for a long time Google's bots did not crawl JavaScript correctly.

Next.js addresses this issue by building on React while also providing developers with server-side rendering, which made it easier for developers to migrate their apps.

A robots.txt file on your website is important for SEO. This post explains what a robots.txt file is and how to add one to your Next.js application, since Next.js does not include one by default.

What Is A Robots.txt File?

robots.txt is a text file created by webmasters to tell web robots (usually search engine robots) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP), a set of web standards that govern how robots crawl the web, access and index content, and serve it to people. The REP also includes directives such as meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as "follow" or "nofollow").

You can disallow URLs like this:

User-agent: nameOfBot
Disallow: /admin/

Or allow them like this:

User-agent: *
Allow: /

You also need to include a line at the end of your file with the location of your sitemap, like this:

Sitemap: http://www.domain.com/sitemap.xml

A sitemap is a file that contains information about your site's pages, videos, and other documents, as well as the relationships between them. This file is read by search engines like Google in order to crawl your site more efficiently.

Then, your robots.txt file should look like this:

User-agent: nameOfBot
Disallow: /admin/

User-agent: *
Allow: /

Sitemap: http://www.domain.com/sitemap.xml

Adding A Robots.txt File To Your Next.js Application

Previously, you had to create a server.js file and a new route that pointed to the robots.txt file. But not in the most recent versions! In the latest version of Next.js, you can simply place your robots.txt file in the public directory. The public directory is intended to replace the static directory.

Everything in the public directory will be visible at the root domain level. So, instead of /public/robots.txt, the URL for the robots.txt file would be /robots.txt.
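As a rough sketch (assuming the default project layout; the other file names shown are hypothetical), the file sits alongside your other static assets:

my-next-app/
├── pages/
├── public/
│   ├── favicon.ico
│   └── robots.txt    <- served at http://www.domain.com/robots.txt
└── next.config.js

Once the file is in place and the site is deployed, crawlers can fetch it from the root of your domain without any extra configuration.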

Dynamic Generation Is Another Alternative

There is also a way to generate your robots.txt file dynamically. You can accomplish this by combining two Next.js features: API routes and rewrites.

Next.js lets you define API routes. When the API endpoint is requested, you return the correct content for your robots.txt file.

To do this, create a robots.js file in your pages/api directory. This automatically creates a route. In this file, add a handler that returns the contents of your robots.txt:

export default function handler(req, res) {
  // Serve the robots.txt content as plain text
  res.setHeader('Content-Type', 'text/plain');
  res.send('Robots.txt data goes there');
}
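The benefit of generating the file from code is that its content can depend on runtime conditions. Here is a minimal sketch of that idea; the NODE_ENV check and the URLs are assumptions, so adapt them to your own setup:

export default function handler(req, res) {
  // Assumption: NODE_ENV distinguishes production from other environments.
  const isProduction = process.env.NODE_ENV === 'production';

  const content = isProduction
    ? 'User-agent: *\nAllow: /\n\nSitemap: http://www.domain.com/sitemap.xml\n'
    : 'User-agent: *\nDisallow: /\n'; // keep crawlers away from non-production builds

  res.setHeader('Content-Type', 'text/plain');
  res.send(content);
}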

Unfortunately, this handler is only accessible via the URL /api/robots, and as previously stated, search engine crawlers will look for the /robots.txt URL.

Thankfully, Next.js has a feature called rewrites. It lets you map an incoming path to a different destination path. In this case, you want to rewrite all /robots.txt requests to /api/robots.

To do so, open your next.config.js file and add the following rewrite:

/** @type {import('next').NextConfig} */
const nextConfig = {
  reactStrictMode: true,
  async rewrites() {
    return [
      {
        source: '/robots.txt',
        destination: '/api/robots'
      }
    ];
  }
};

module.exports = nextConfig;

With this setup, whenever you browse to /robots.txt, Next.js will call /api/robots behind the scenes and return the message "Robots.txt data goes there".

How To Validate Your Robots.txt File?

To validate your robots.txt file, you can use the robots.txt tester in Google Search Console or the TechnicalSEO robots.txt testing tool.

Conclusion

SEO is important for any site that needs to be found. Websites must be easily crawlable by search engine crawlers in order to rank well. Next.js makes this easier for React developers through its built-in SEO support, which includes the ability to quickly add a robots.txt file to your project.

In this tutorial, you learned what a robots.txt file is, how to add one to your Next.js application, and how to validate it once your app is deployed.
