When a new page isn't showing up in Google, or a recently updated post isn't reflecting your changes, the URL Inspection Tool in Google Search Console is your first stop. It tells you exactly how Google sees any URL on your site — whether it's indexed, when it was last crawled, what canonical URL Google chose, and why a page might be excluded from search results.
This guide covers everything: how to use the tool, how to submit URLs to Google for indexing, how to diagnose and fix common indexing problems, and how to use it as a real-time Google index checker.
What Is the URL Inspection Tool?
The URL Inspection Tool (formerly called "Fetch as Google" or "Fetch and Render") shows you Google's indexed version of any URL in your property. When you inspect a URL, GSC returns:
- Index status — Is the page indexed? Is it the canonical version Google chose?
- Last crawl date — When did Googlebot last visit this page?
- Crawled page screenshot — What did Google actually see when it rendered the page?
- Discovered canonicals — What canonical URL did you declare vs. what did Google select?
- Structured data — Is your schema valid? Any errors?
- Mobile usability — Is the page mobile-friendly per Google's assessment?
- HTTP response — What status code did Google receive?
This is different from simply checking if a URL appears in search results — it shows you Google's authoritative internal record for that URL.
How to Access the URL Inspection Tool
- Open Google Search Console
- Select your property (double-check you're in the property that contains the URL)
- Click URL Inspection in the left sidebar
- Type or paste any URL from your site into the inspection bar at the top
- Press Enter or click the search icon
You can also inspect any URL from within GSC reports by clicking on a URL row — an Inspect URL icon appears in the top-right corner of most detail panels.
Tip: The URL must belong to your verified property. If you inspect a URL from a different domain or subdomain, GSC will prompt you to switch properties.
How to Submit a URL to Google for Indexing
Submitting a URL to Google tells Googlebot to crawl and re-index that specific page. This is useful for:
- New pages — Freshly published content that isn't indexed yet
- Updated pages — Significant edits (new sections, title changes, date updates)
- Fixed pages — After resolving a crawl error, noindex tag, or redirect loop
- High-priority pages — Pages you need indexed quickly (product launches, time-sensitive content)
Step-by-Step: Request Indexing
- Open the URL Inspection Tool and inspect your URL
- Review the status — if it says "URL is not on Google" or shows a stale crawl date, proceed
- Click the "Request Indexing" button (blue button at the top of the results)
- Wait for the confirmation dialog — it will say the request was added to the priority crawl queue
- Close the dialog
What happens next: Google adds your URL to a priority crawl queue. Indexing typically happens within a few hours to a few days for new pages, and faster for established sites. You can re-check the URL after 24–72 hours to confirm the index status updated.
Important: The Request Indexing button is rate-limited. As of 2024, Google limits each property to approximately 10–12 indexing requests per day via the tool. For bulk submissions, use a sitemap instead.
Alternative: Submit a Sitemap
For new sites or bulk indexing needs, submitting your sitemap is more efficient than requesting individual URLs:
- Go to Sitemaps in the left sidebar
- Enter your sitemap URL (typically `yourdomain.com/sitemap.xml`)
- Click Submit
Google will crawl your sitemap and discover all listed URLs. This doesn't guarantee immediate indexing, but it tells Google which pages you want crawled.
How to Use URL Inspection as a Google Index Checker
Want to know if a specific page is indexed? The URL Inspection Tool is the most accurate way to check — more reliable than site: searches, which don't show every indexed URL.
Index Status Messages
| Status | Meaning | Action Needed |
|--------|---------|---------------|
| URL is on Google | Page is indexed and eligible for search | None — page is live |
| URL is not on Google | Not indexed | Check the reason below |
| URL is on Google, but has issues | Indexed but has warnings | Review warnings (e.g., schema errors) |
| Excluded | Intentionally or accidentally excluded | Review specific exclusion reason |
Common "Excluded" Reasons and Fixes
Crawled — currently not indexed
Google crawled the page but chose not to index it. Usually a quality or duplication signal.
- Fix: Improve content depth, ensure the page has unique value, remove thin/duplicate content
Discovered — currently not indexed
Google found the URL (via sitemap or links) but hasn't crawled it yet.
- Fix: Request indexing, improve internal linking to the page, verify crawl budget isn't exhausted
Blocked by robots.txt
Your robots.txt file is preventing Googlebot from accessing this URL.
- Fix: Check `/robots.txt` for a `Disallow` rule covering this path; remove or adjust it
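To spot-check which paths your robots.txt blocks before re-requesting indexing, Python's standard-library `urllib.robotparser` can evaluate the rules locally. A minimal sketch; the rules and URLs below are hypothetical placeholders for your own:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content. In practice, load the live file
# from https://yourdomain.com/robots.txt instead.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
Disallow: /products?
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Ask whether Googlebot may fetch specific URLs under these rules.
for url in ("https://yourdomain.com/blog/new-post",
            "https://yourdomain.com/search?q=shoes"):
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked")
```

Note that `urllib.robotparser` does simple prefix matching; Googlebot additionally supports `*` wildcards, so treat this as a first-pass check, not a full replica of Google's parser.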
noindex tag detected
A `<meta name="robots" content="noindex">` tag or an `X-Robots-Tag: noindex` HTTP header is on the page.
- Fix: Remove the noindex directive and request re-indexing
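Before requesting re-indexing, you can confirm the meta tag is really gone by scanning the page's HTML. A small sketch using only the standard library; the sample HTML is hypothetical (a real check would also inspect the HTTP response for an `X-Robots-Tag` header, which this parser can't see):

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Flags any <meta name="robots"> tag whose content includes noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = {k: (v or "") for k, v in attrs}
        if a.get("name", "").lower() == "robots" and "noindex" in a.get("content", "").lower():
            self.noindex = True

# Hypothetical page source. In practice, fetch the URL's HTML first.
html = '<html><head><meta name="robots" content="noindex, follow"></head><body>Hi</body></html>'
finder = NoindexFinder()
finder.feed(html)
print("noindex present:", finder.noindex)
```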
Canonical points to a different page
Google selected a different URL as the canonical — often the result of duplicate content.
- Fix: Review your canonical tags; ensure the `rel=canonical` link points to the correct URL
Soft 404
The page returns a 200 status code but appears to have no content (an empty results page, "no results found", etc.).
- Fix: Add meaningful content or return a proper 404/410 status
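A rough heuristic for auditing your own pages for soft-404 candidates: a 200 response with a near-empty body or an explicit "no results" message. The thresholds and phrases below are illustrative guesses, not Google's actual criteria:

```python
def looks_like_soft_404(status_code, text, min_words=50):
    """Heuristic only: a 200 page that is near-empty or says 'no results'
    tends to be treated as a soft 404. Thresholds are illustrative."""
    if status_code != 200:
        return False  # a real 404/410 is not a *soft* 404
    phrases = ("no results found", "page not found", "nothing matched")
    lowered = text.lower()
    return len(text.split()) < min_words or any(p in lowered for p in phrases)

print(looks_like_soft_404(200, "No results found for your query."))  # likely soft 404
print(looks_like_soft_404(404, "Not here"))                          # real 404, not soft
```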
Redirect
The URL redirects to another URL.
- Review: Is this redirect correct? If yes, the destination URL should be indexed instead
How to Get Google to Crawl Your Website
If Google isn't crawling your site regularly, these steps help:
1. Check Your Crawl Stats
Go to Settings → Crawl stats in GSC. This shows:
- Total crawl requests per day
- Total download size
- Average response time
- Crawl errors
A healthy site sees regular daily crawling. If crawl frequency dropped, check for server errors or dramatic response time increases.
2. Verify Your Sitemap Is Submitted and Indexed
In Sitemaps, your submitted sitemaps should show a "Success" status. If Google reports 0 URLs discovered from your sitemap, check:
- Is the sitemap URL accessible without authentication?
- Does it contain the correct URLs (not staging/dev URLs)?
- Is it properly formatted XML?
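The last two checks can be automated with the standard library: parsing the sitemap confirms it is well-formed XML, and scanning the `<loc>` entries catches stray staging or dev URLs. A sketch; the sitemap content and hostnames are hypothetical:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Return the <loc> URLs in a sitemap; raises ET.ParseError if malformed."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Hypothetical sitemap. In practice, fetch https://yourdomain.com/sitemap.xml.
xml_text = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/</loc></url>
  <url><loc>https://staging.yourdomain.com/draft</loc></url>
</urlset>"""

urls = parse_sitemap(xml_text)
print(len(urls), "URLs found")

# Flag URLs that don't belong in a production sitemap.
suspicious = [u for u in urls if "staging." in u or "dev." in u]
print("suspicious:", suspicious)
```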
3. Build Internal Links to Key Pages
Pages with no internal links get crawled less frequently. Make sure every important page on your site has at least one internal link from another crawled page.
4. Fix Server Errors
If Googlebot encounters repeated 5xx errors, it will crawl your site less. In the Coverage report, filter by "Server error (5xx)" to see affected URLs.
5. Remove Crawl Budget Waste
Large sites may have crawl budget issues. Common budget wasters:
- Faceted navigation (e.g., `/products?color=red&size=M`) generating thousands of near-duplicate URLs
- Session IDs in URLs
- Duplicate content across `www` vs. non-`www` versions
- Internal search result pages indexed by Google
Block these with robots.txt `Disallow` rules or `noindex` meta tags.
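As a hedged illustration, a robots.txt fragment targeting the patterns above might look like this (the paths are hypothetical; adapt them to your own URL structure, and note that Googlebot supports `*` wildcards in `Disallow` rules):

```
User-agent: *
# Faceted navigation parameters
Disallow: /*?color=
Disallow: /*?size=
# Session IDs in URLs
Disallow: /*?sessionid=
# Internal search result pages
Disallow: /search
```

The `www` vs. non-`www` duplication is better handled with 301 redirects and canonical tags than with robots.txt, since blocking a duplicate prevents Google from seeing its canonical signal.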
Interpreting the URL Inspection Results
The "Coverage" Section
The Coverage section is the most important part of the inspection result. It shows:
User-declared canonical: The `rel=canonical` tag in your page's `<head>`
Google-selected canonical: The URL Google actually uses in its index
If these don't match, Google has overridden your canonical declaration. This usually means Google found a "better" canonical (often due to duplicate content on multiple URLs). To reclaim control:
- Make your preferred URL the most internally linked version
- Ensure your preferred URL loads fastest
- Verify the alternate URLs have proper canonicals pointing back to your preferred URL
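That last check can be scripted: extract each alternate URL's declared canonical and compare it against your preferred URL. A standard-library sketch; the HTML and URLs are hypothetical placeholders:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extracts the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = {k: (v or "") for k, v in attrs}
        if tag == "link" and a.get("rel", "").lower() == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

# Hypothetical HTML of a duplicate URL. In practice, fetch each variant's source.
html = '<html><head><link rel="canonical" href="https://www.yourdomain.com/page"></head></html>'
preferred = "https://www.yourdomain.com/page"

finder = CanonicalFinder()
finder.feed(html)
print("declared:", finder.canonical)
print("points to preferred URL:", finder.canonical == preferred)
```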
The "Enhancements" Section
Shows structured data validity, mobile usability, and Core Web Vitals status for the inspected URL. Common issues:
- Structured data errors: Missing required fields → fix the schema on that specific page
- Mobile usability: Viewport not set, tap targets too small, content wider than screen
- Core Web Vitals: LCP too slow, CLS too high → performance work needed
The Crawled Page Screenshot
The rendered screenshot shows you what Google actually sees when it renders your page. Key things to check:
- Does the main content appear in the screenshot? (If not, it might be loaded via JavaScript that Googlebot can't execute)
- Do images appear? (Blocked images can be a signal issue)
- Is the page layout intact?
Bulk URL Submission: Using the Indexing API
For sites that need to index many URLs quickly (news sites, job boards, ecommerce), Google offers the Indexing API — a programmatic way to notify Google when URLs are added or removed.
The Indexing API is officially documented for job posting and livestream pages, but many SEOs use it for broader indexing requests. You'll need:
- A Google Cloud project with the Indexing API enabled
- A service account with "Owner" permissions on your GSC property
- API calls to `https://indexing.googleapis.com/v3/urlNotifications:publish`
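The setup above can be sketched as a minimal client. The notification body is a small JSON object with a `url` and a `type` (`URL_UPDATED` or `URL_DELETED`); sending it requires the third-party `google-auth` package and a service-account key file, whose path below is a placeholder:

```python
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url, deleted=False):
    """Request body for one URL notification."""
    return {"url": url, "type": "URL_DELETED" if deleted else "URL_UPDATED"}

def publish(url, key_file="service-account.json"):
    """Send one notification. Requires the google-auth package and a
    service-account that is an Owner of the GSC property; the key file
    path is a placeholder."""
    from google.oauth2 import service_account
    from google.auth.transport.requests import AuthorizedSession

    creds = service_account.Credentials.from_service_account_file(
        key_file, scopes=["https://www.googleapis.com/auth/indexing"])
    session = AuthorizedSession(creds)
    resp = session.post(ENDPOINT, json=build_notification(url))
    resp.raise_for_status()
    return resp.json()

# publish("https://yourdomain.com/new-job-posting")  # uncomment to actually send
print(build_notification("https://yourdomain.com/new-job-posting"))
```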
For most content sites, the URL Inspection Tool + sitemap submissions are sufficient. The Indexing API is only worth the setup complexity for high-velocity sites publishing 50+ new URLs per day.
URL Inspection Tool Limitations
| Limitation | Workaround |
|------------|------------|
| Rate limit: ~10 requests/day | Use sitemaps for bulk submissions |
| Not real-time | Crawling happens within hours to days after request |
| Can't inspect external URLs | Only works for verified properties |
| Screenshot may differ from live site | Check live test (green "Test Live URL" button) |
| No bulk inspection | Use Coverage report for site-wide view |
Test Live URL vs. Indexed Version
At the top of inspection results, you can toggle between the indexed version (what Google stored the last time it crawled the page) and Live Test (what Google would see if it crawled right now). Use Live Test after making page changes to confirm your updates are visible before requesting re-indexing.
Quick Reference: URL Inspection Workflow
Use this workflow when a page isn't appearing in search:
1. Open URL Inspection → inspect the specific URL
2. Status = "URL is on Google"?
→ YES: Page is indexed. Issue is ranking, not indexing.
→ NO: Continue below ↓
3. Check "Why URL is not on Google":
- robots.txt blocked? → Fix robots.txt, re-test
- noindex tag? → Remove noindex, request indexing
- Soft 404? → Add content or return proper 404
- Crawled not indexed? → Improve content quality
- Discovered not indexed? → Request indexing + add internal links
4. Check canonical:
- User-declared ≠ Google-selected? → Fix canonical tags
5. Request Indexing → wait 24-72h → re-inspect
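The decision tree above can be condensed into a small lookup, handy if you triage many URLs. The reason labels are illustrative shorthand, not exact GSC wording:

```python
# Map URL Inspection verdicts to next actions, mirroring the workflow above.
ACTIONS = {
    "blocked_by_robots_txt": "Fix robots.txt, then re-test",
    "noindex_detected": "Remove noindex, then request indexing",
    "soft_404": "Add content or return a proper 404/410",
    "crawled_not_indexed": "Improve content quality and uniqueness",
    "discovered_not_indexed": "Request indexing and add internal links",
}

def next_step(indexed, reason=None):
    if indexed:
        return "Page is indexed; investigate ranking, not indexing"
    return ACTIONS.get(reason, "Inspect the specific exclusion reason in GSC")

print(next_step(True))
print(next_step(False, "noindex_detected"))
```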
Frequently Asked Questions
How long does it take for Google to index a URL after I request indexing?
After clicking "Request Indexing," most URLs are crawled within a few hours to 3 days. For new sites with low crawl frequency, it can take up to a week. Sites that publish content regularly tend to get faster re-crawls. Check back after 24 hours using the URL Inspection Tool to see if the crawl date updated.
How many URLs can I submit via the URL Inspection Tool?
Google rate-limits Request Indexing to approximately 10–12 requests per property per day via the URL Inspection Tool interface. For more URLs, submit or resubmit your sitemap (no per-day limit), or use the Indexing API for programmatic bulk submissions.
Is the URL Inspection Tool the same as "Fetch as Google"?
Yes. "Fetch as Google" was the old name in Google Webmaster Tools. It was renamed and upgraded to "URL Inspection Tool" in 2018. The Request Indexing functionality replaces the old "Fetch and Render" + "Submit to Index" workflow.
Why does Google select a different canonical than the one I declared?
Google may override your rel=canonical if it finds stronger signals pointing to a different URL. Common reasons: another version (e.g., http vs. https, www vs. non-www) has more backlinks or internal links; duplicate content exists across multiple URLs; your canonical tag is on a redirected or unavailable URL. Fix by making your preferred URL the strongest signal in all dimensions.
Can I use the URL Inspection Tool to check if my competitor's pages are indexed?
No. The URL Inspection Tool only works for URLs within your verified GSC property. For competitive analysis, use a site:competitor.com/specific-url search as a rough check, or use paid tools like Ahrefs or Semrush that crawl external domains.
My page shows "URL is on Google" but I can't find it in search results. Why?
Being indexed doesn't mean ranking. "URL is on Google" confirms the page is in Google's index and eligible to appear — but Google may rank it on page 10 or beyond. Use the Performance report to check if the page has any impressions and what queries it's appearing for. Low impressions usually mean a ranking issue, not an indexing issue.
For the full picture of your site's health in GSC, see: Core Web Vitals in Google Search Console — how to fix LCP, INP, and CLS and Index Coverage Errors — how to fix every error type.