Fix: Google Search Console "Couldn't Fetch Sitemap" on Next.js (App Router)
Updated on October 10, 2025
If you’ve been pulling your hair out over the “Couldn’t fetch sitemap” error in Google Search Console for your Next.js project, you’re not alone. This error can prevent Google from indexing your site properly, hurting your visibility. Many developers using the Next.js App Router have run into it: GSC fails to read the sitemap even though it appears perfectly accessible in the browser.
Fortunately, the developer community has tracked down a reliable solution. This guide walks you through the exact steps to fix the “Couldn’t fetch” error, based on a helpful discovery shared in a Next.js GitHub issue.
Let’s get your sitemap indexed and your site back on track. For more insights and additional discussion, you can also check out my reply on the GitHub issue here.
The Problem: Why Can’t Google Fetch the Sitemap?
The core of the issue seems to be a combination of how Next.js handles routing and middleware, and how Google Search Console caches its fetch attempts. When GSC tries to access your sitemap.xml, the request can be intercepted (often by middleware) and fail. Once it fails, GSC appears to cache that failed result for that specific URL, so even after you fix the underlying problem, simply resubmitting the same sitemap.xml URL might not trigger a new attempt.
How to Fix the “Couldn’t Fetch” Error
This solution comes from GitHub user @segabrielcarvalho, who shared a practical approach that addresses caching, routing, and middleware conflicts with four key steps.
Step 1: Force a Fresh Sitemap Fetch with a Trailing Slash
This is the simplest but most crucial step. Google Search Console appears to cache failed fetch attempts aggressively. To force it to try again with a clean slate, you can use a simple cache-busting technique.
Instead of submitting https://your-site.com/sitemap.xml, add a trailing slash:
https://your-site.com/sitemap.xml/
Submitting this slightly different URL will bypass GSC’s cache and trigger a fresh fetch. This simple trick is often enough to get the sitemap read successfully after you’ve implemented the other fixes.
Step 2: Serve Your Sitemap from a Nested Route
Instead of placing your sitemap.ts file in the root of your app directory, move it into a nested folder. This helps to isolate it and ensure it’s served correctly without conflicts.
- Create a new sitemap folder inside your app directory.
- Move your sitemap.ts (or sitemap.js) file into it.
Your new file path will be: app/sitemap/sitemap.ts
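For reference, here is a minimal sketch of what that file might contain. The domain and routes below are placeholders for illustration, so replace them with your own:

import { MetadataRoute } from "next";

// app/sitemap/sitemap.ts
// A minimal sketch: the domain and routes below are placeholders.
export default function sitemap(): MetadataRoute.Sitemap {
  const baseUrl = "https://your-site.com"; // replace with your production domain

  return [
    {
      url: baseUrl,
      lastModified: new Date(),
      changeFrequency: "weekly",
      priority: 1,
    },
    {
      url: `${baseUrl}/about`, // example route, swap in your real pages
      lastModified: new Date(),
      changeFrequency: "monthly",
      priority: 0.8,
    },
  ];
}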
This changes the public URL of your sitemap. The new URL will be:
https://your-site.com/sitemap/sitemap.xml
Remember to use this new URL when you submit it to Google Search Console (with the trailing slash trick!).
Step 3: Exclude SEO Files from Your Middleware
Middleware is powerful, but it can also accidentally block Googlebot from accessing important files like your sitemap and robots.txt. To prevent this, you need to update your middleware.ts to explicitly exclude these files.
The matcher configuration in your middleware file tells Next.js which paths the middleware should run on. By using a negative lookahead, you can instruct it to run on all paths except for static assets and SEO files.
Here is a minimal middleware example: its handler does nothing, but its matcher correctly excludes the necessary files. You can adapt the matcher for your existing middleware.
import type { NextRequest } from "next/server";
import { NextResponse } from "next/server";
export default function middleware(req: NextRequest) {
  // Your middleware logic can go here.
  // If you have no other logic, just return next().
  void req; // marks the parameter as "used" so lint rules don't complain
  return NextResponse.next();
}

export const config = {
  matcher: [
    /*
     * Match all request paths except for the ones starting with:
     * - _next/static (static files)
     * - _next/image (image optimization files)
     * - favicon.ico (favicon file)
     * - robots.txt (robots file)
     * - sitemap.xml (sitemap file)
     * - sitemap/ (nested sitemap files)
     * - site.webmanifest (web manifest file)
     */
    "/((?!_next/static|_next/image|favicon\\.ico|robots\\.txt|sitemap\\.xml|sitemap/.*|site\\.webmanifest).*)",
  ],
};
This configuration ensures your robots.txt and both potential sitemap paths (/sitemap.xml and /sitemap/sitemap.xml) are never processed by the middleware, avoiding any potential blocks or redirects that could confuse Googlebot.
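If you want to verify locally that the middleware really leaves these paths alone, you can request them directly and inspect the responses. Here is a small sketch (assuming a dev server running at http://localhost:3000 and Node 18+ for the built-in fetch; run it with something like npx tsx check-seo-files.ts, where the file name is just an example):

// check-seo-files.ts
// A quick sanity check: each path should return 200 with no redirect.
const origin = "http://localhost:3000"; // assumption: your local dev server

const paths = ["/robots.txt", "/sitemap/sitemap.xml"];

for (const path of paths) {
  // redirect: "manual" surfaces any middleware redirect instead of following it
  const res = await fetch(origin + path, { redirect: "manual" });
  console.log(path, res.status, res.headers.get("content-type"));
}

Both paths should log a 200 status; a 3xx or 401 here usually means the middleware matcher is still catching them.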
Step 4: Create a Correct robots.txt File
Finally, ensure you have a clean and correct robots.txt file in your app directory. This file should explicitly allow all user agents and point to your new sitemap URL. If you had a robots.txt in your public folder, it’s best to remove it to avoid conflicts and rely solely on the one generated from your app directory.
Create a file named robots.ts or robots.txt in your app directory with the following content:
import { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
    },
    sitemap: 'https://your-site.com/sitemap/sitemap.xml',
  }
}
Or, if you prefer a static file:
User-agent: *
Allow: /
Sitemap: https://your-site.com/sitemap/sitemap.xml
Make sure to replace https://your-site.com with your actual domain. This file directs crawlers to your sitemap’s new location.
Key Takeaways
To summarize the fix:
- Use a trailing slash (/) on your sitemap URL when submitting to GSC to bust the cache.
- Move your sitemap file to a nested route like app/sitemap/sitemap.ts.
- Update your middleware to exclude SEO files like robots.txt and sitemap.xml.
- Point your robots.txt to the new sitemap URL.
Also, it’s important to ensure you only have one source for your sitemap. If you have a sitemap.xml in your public folder and are also generating one from your app directory, conflicts can arise. Delete any old sitemap files and stick to a single, dynamically generated one.
By following these steps, you should be able to resolve the “Couldn’t fetch sitemap” error and get your Next.js site properly indexed by Google. Happy shipping!