Your sitemap is Google's roadmap to your website. Submit it correctly in Google Search Console and Google knows exactly which pages to crawl and index. Get it wrong — or ignore it entirely — and you'll spend months wondering why your best content isn't ranking.
New to Google Search Console? Start with The Ultimate Google Search Console Guide before diving into sitemaps.
This guide covers everything: how to find and submit your sitemap, how to diagnose sitemap errors in GSC, and how to fix the most common issues including the dreaded "couldn't fetch" and "could not be read" errors.
What Is a Sitemap and Why Does It Matter?
A sitemap is an XML file that lists every URL on your website, along with optional metadata like last-modified dates and change frequency. It's not required for Google to crawl your site — Googlebot finds pages by following links — but it's essential for:
- Large sites where some pages have few or no internal links
- New sites with little external link authority
- Sites with rich media (images, video, news) that benefit from specialized sitemaps
- Pages updated frequently where you want Google to re-crawl quickly
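For reference, a minimal sitemap file looks like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/about</loc>
  </url>
</urlset>
```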
Without a sitemap submitted in Google Search Console, your most important pages might sit unindexed for weeks.
How to Find Your Sitemap URL
Before submitting, you need to know where your sitemap lives. Most platforms generate one automatically:
| Platform | Default Sitemap URL |
|----------|---------------------|
| WordPress (Yoast) | /sitemap_index.xml |
| WordPress (All in One SEO) | /sitemap.xml |
| Shopify | /sitemap.xml |
| Squarespace | /sitemap.xml |
| Wix | /sitemap.xml |
| Next.js (next-sitemap) | /sitemap.xml |
| Hugo | /sitemap.xml |
| Custom/Static | Check your deployment config |
Quick test: Navigate to yourdomain.com/sitemap.xml in your browser. If you see an XML file listing URLs, that's your sitemap.
If you don't find it there, try:
- yourdomain.com/sitemap_index.xml
- yourdomain.com/post-sitemap.xml
- Check your robots.txt file — well-configured sites list the sitemap URL there
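If you'd rather script the robots.txt check, here's a minimal sketch — the sample robots.txt content below is illustrative:

```python
# Sketch: pull "Sitemap:" declarations out of a robots.txt body.
def find_sitemaps(robots_txt: str) -> list[str]:
    """Return every URL declared on a 'Sitemap:' line (case-insensitive key)."""
    urls = []
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap":
            urls.append(value.strip())
    return urls

# Illustrative robots.txt content:
robots_txt = """User-agent: *
Disallow: /admin/
Sitemap: https://yourdomain.com/sitemap_index.xml
"""
print(find_sitemaps(robots_txt))
```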
How to Submit Your Sitemap to Google Search Console
Step 1: Open Search Console
Go to Google Search Console and select your property. If you haven't verified your site yet, you'll need to do that first.
Step 2: Navigate to Sitemaps
In the left sidebar, under Indexing, click Sitemaps.
Step 3: Submit Your Sitemap URL
- In the "Add a new sitemap" field, enter your sitemap path (just the path, not the full domain — GSC adds the domain automatically)
- For most sites, type: sitemap.xml
- Click Submit
Google will immediately attempt to fetch your sitemap. Within a few minutes, you'll see it listed in the Submitted Sitemaps table with a status of "Success" or an error.
Step 4: Monitor the Status
After submission, the Sitemaps report shows:
- Last read: When Google last fetched your sitemap
- Status: Success, Has errors, or Couldn't fetch
- Discovered URLs: How many URLs Google found in the sitemap
- Indexed: How many of those URLs are actually indexed (may differ from discovered)
Pro tip: If you have a sitemap index file (one sitemap that links to multiple sub-sitemaps), submit the index file URL — GSC will automatically detect and process all sub-sitemaps.
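For reference, a minimal sitemap index file looks like this (the sub-sitemap filenames are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://yourdomain.com/post-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://yourdomain.com/page-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```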
Understanding Sitemap Status in GSC
✅ Success
Green checkmark. Google fetched and parsed your sitemap without errors. This is what you want.
⚠️ Has Errors
GSC fetched the sitemap but found issues inside it — invalid URLs, malformed XML, or URLs blocked by robots.txt. Click through to see which specific URLs have problems.
❌ Couldn't Fetch (Most Common Error)
Google tried to download your sitemap file but couldn't retrieve it. This is the most common sitemap error and has several causes (see fixes below).
Fixing "Couldn't Fetch" Sitemap Errors
The "couldn't fetch" error means Googlebot couldn't download your sitemap file. Here's why it happens and how to fix each cause:
Cause 1: Sitemap URL Doesn't Exist
The most common cause. You submitted sitemap.xml but the actual file is at sitemap_index.xml.
Fix: Navigate to the URL directly in your browser. If it 404s, find the correct path and resubmit.
Cause 2: Robots.txt is Blocking Googlebot
Your robots.txt file may have a Disallow: / rule or be blocking the sitemap specifically.
Fix: Check yourdomain.com/robots.txt and look for any rules that might block Googlebot from accessing your sitemap. Your robots.txt should include:
Sitemap: https://yourdomain.com/sitemap.xml
Cause 3: Server Returns a Non-200 Status Code
Your sitemap URL might redirect (301/302), require authentication (401), or return a server error (500).
Fix: Use a tool like curl -I yourdomain.com/sitemap.xml to check the HTTP status code. The sitemap must return a 200 OK — no redirects, no auth walls.
Cause 4: Server is Down or Slow
If Googlebot tried to fetch your sitemap during a brief outage or when your server was under load, it may log a "couldn't fetch" error.
Fix: Resubmit the sitemap in GSC to trigger a fresh fetch attempt. If it was a one-time issue, it should succeed on the next try.
Cause 5: Firewall or Bot Protection Blocking Googlebot
Some WAF (Web Application Firewall) rules or bot-protection services accidentally block Googlebot.
Fix: Check your Cloudflare, Sucuri, or hosting firewall settings. Googlebot should be allowed explicitly. Googlebot identifies itself with the user-agent Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html) — but since user-agents can be spoofed, confirm a request really came from Googlebot with a reverse DNS lookup: the hostname should end in googlebot.com or google.com.
Fixing "Sitemap Could Not Be Read" Errors
This error means Google fetched the sitemap file but couldn't parse it. Common causes:
Invalid XML Format
Your sitemap file has malformed XML — unclosed tags, invalid characters, or encoding issues.
Fix: Validate your sitemap at xml-sitemaps.com/validate-xml-sitemap.html or run it through an XML validator. Look for:
- Special characters (&, <, >) in URLs not properly encoded
- UTF-8 BOM (byte order mark) at the start of the file
- Unclosed tags or broken XML structure
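A quick self-check along these lines can be sketched with Python's standard XML parser — the diagnosis strings and sample sitemaps are illustrative:

```python
import xml.etree.ElementTree as ET

def check_sitemap_xml(raw: bytes) -> str:
    """Return a rough diagnosis of why a sitemap might fail to parse."""
    # Some parsers tolerate a BOM, but it's a common culprit worth flagging.
    if raw.startswith(b"\xef\xbb\xbf"):
        return "UTF-8 BOM before the XML declaration"
    try:
        ET.fromstring(raw)
    except ET.ParseError as err:
        return f"malformed XML: {err}"
    return "ok"

# A bare '&' inside <loc> is invalid XML and trips the parser:
bad = b"<urlset><url><loc>https://example.com/?a=1&b=2</loc></url></urlset>"
good = b"<urlset><url><loc>https://example.com/?a=1&amp;b=2</loc></url></urlset>"
print(check_sitemap_xml(bad))
print(check_sitemap_xml(good))
```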
URLs Contain Invalid Characters
URLs in your sitemap must be properly encoded. Spaces, accented characters, and special symbols must be URL-encoded.
Fix: Use your sitemap generator's settings to enable URL encoding, or manually fix the problematic URLs. The most common offender is a bare & in query strings — in XML it must be escaped as &amp;amp;.
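A quick way to make a URL sitemap-safe using only the standard library — a sketch, and the set of characters left unencoded here is an assumption you may want to adjust:

```python
from urllib.parse import quote
from xml.sax.saxutils import escape

def sitemap_safe(url: str) -> str:
    """Percent-encode unsafe characters, then XML-escape for use inside <loc>."""
    # safe= keeps URL delimiters and existing %-escapes intact (assumed set).
    percent_encoded = quote(url, safe=":/?&=#%")
    # escape() turns the remaining bare & (and < >) into XML entities.
    return escape(percent_encoded)

print(sitemap_safe("https://example.com/café page?a=1&b=2"))
```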
File Too Large
Google limits sitemap file size to 50MB uncompressed and 50,000 URLs per sitemap. Exceeding either limit causes parse failures.
Fix: Split your sitemap into multiple files and create a sitemap index file that references them all.
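A sketch of that split, assuming a flat list of URLs and hypothetical sitemap-N.xml filenames:

```python
def build_index(urls, base="https://yourdomain.com", size=50_000):
    """Chunk URLs into sitemaps of <= size entries plus an index naming them."""
    chunks = [urls[i:i + size] for i in range(0, len(urls), size)]
    sitemaps = {}
    for n, chunk in enumerate(chunks, start=1):
        body = "".join(f"<url><loc>{u}</loc></url>" for u in chunk)
        sitemaps[f"sitemap-{n}.xml"] = (
            '<?xml version="1.0" encoding="UTF-8"?>'
            f'<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">{body}</urlset>'
        )
    index_body = "".join(
        f"<sitemap><loc>{base}/{name}</loc></sitemap>" for name in sitemaps
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        f'<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">{index_body}</sitemapindex>'
    )
    return index, sitemaps

# 120,000 URLs -> 3 sub-sitemaps, each capped at 50,000 entries
index, files = build_index([f"https://yourdomain.com/p/{i}" for i in range(120_000)])
print(len(files))  # 3
```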
Why Discovered URLs ≠ Indexed URLs
One of the most confusing GSC sitemap metrics: your sitemap says 500 URLs discovered, but only 200 are indexed.
This gap is normal and expected. Here's why Google doesn't index every URL in your sitemap:
- Thin or duplicate content — Google determines the page doesn't add unique value
- Crawl budget limitations — Large sites get crawled gradually; not everything gets indexed immediately
- Noindex tags — Pages with <meta name="robots" content="noindex"> are excluded
- Canonical issues — Google may index a different version of the page than the one in your sitemap
- Blocked by robots.txt — These shouldn't be in your sitemap at all
- New pages — Recently published pages that haven't been crawled yet
What to do: In GSC, use the URL Inspection tool on specific unindexed pages to see exactly why Google isn't indexing them. The inspection result will tell you the precise reason.
Sitemap Best Practices
Follow these rules to maximize sitemap effectiveness:
✅ Include only canonical, indexable URLs
Don't include URLs you've marked noindex, URLs blocked by robots.txt, paginated pages (beyond page 1), or duplicate content. Google sees these as noise.
✅ Use lastmod accurately
Only update <lastmod> when content actually changes. If you update it every day for pages that haven't changed, Google learns to ignore it.
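Google accepts W3C Datetime values in <lastmod>. A small sketch of formatting one correctly (the date shown is a fixed example, standing in for the page's real modification time):

```python
from datetime import datetime, timezone

# Use the page's actual modification time, never "now" on every build —
# otherwise the signal becomes meaningless and Google learns to ignore it.
modified = datetime(2024, 1, 15, 9, 30, tzinfo=timezone.utc)
lastmod_tag = f"<lastmod>{modified.isoformat()}</lastmod>"
print(lastmod_tag)
```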
✅ Submit to Bing too
Go to Bing Webmaster Tools and submit your sitemap there as well. About 6-7% of US searches happen on Bing — worth the 5-minute setup.
✅ Include your sitemap in robots.txt
Add Sitemap: https://yourdomain.com/sitemap.xml to your robots.txt. This allows any search engine crawler to discover it, even without a webmaster tools account.
✅ Ping Google after major updates
After publishing significant new content or making bulk changes, you can request a re-crawl by visiting GSC → URL Inspection → Request Indexing for individual pages, or by resubmitting your sitemap. Once indexed, watch for pages ranking positions 4–20 — those are striking distance keywords worth optimizing.
Frequently Asked Questions
How long does it take Google to process a submitted sitemap?
Google typically begins processing within a few hours to a few days of submission. The "Last read" timestamp in your Sitemaps report will update when Google fetches it. Full indexing of all discovered URLs can take days to weeks depending on your site's authority and crawl budget.
Should I submit multiple sitemaps?
Yes, if your site has different content types. You can submit separate sitemaps for pages, posts, images, and videos — or use a sitemap index file that links them all. This gives GSC cleaner data and helps you monitor indexing by content type.
Do I need to resubmit my sitemap after adding new content?
No — if your sitemap is dynamically generated (which it should be on platforms like WordPress or Next.js), Google will re-fetch it on its regular crawl schedule. You only need to resubmit if you've fixed an error or added a completely new sitemap file.
Why does my sitemap show fewer indexed URLs than I expected?
Google indexes pages based on quality and uniqueness signals, not just because they're in a sitemap. Use URL Inspection in GSC to diagnose specific unindexed pages. Common fixes: improve thin content, fix canonical tags, ensure the page loads correctly for Googlebot.
Once pages are indexed, focus on improving CTR — see How to Fix Low CTR in Google Search Console for a step-by-step process.
Can having a sitemap hurt my SEO?
No — a sitemap is purely additive. In the worst case, Google ignores your sitemap and discovers pages through links instead. A bad sitemap (with blocked URLs, noindex pages, or broken XML) can cause confusion but won't directly penalize you — fix errors when you see them in GSC.
What's the difference between a sitemap index and a regular sitemap?
A regular sitemap lists URLs directly. A sitemap index is a sitemap that lists other sitemaps. Use a sitemap index when you have more than 50,000 URLs or want to organize URLs by type (pages vs. posts vs. products). Most platforms auto-generate the appropriate type.