1. Set Up Google Search Console (GSC)
Choose Your Property Type
● Domain Property (Recommended): Tracks all subdomains and both protocols (http and https), including www and non-www versions
● URL Prefix Property: Tracks specific folders or versions like https://www.example.com/blog
Verify Your Website
● For Domain Property: Add a DNS TXT record at your registrar (e.g., GoDaddy, Namecheap)
● For URL Prefix: Choose one:
○ HTML file upload to the site's root directory
○ HTML meta tag in the <head> section
○ Google Analytics or Google Tag Manager
○ Domain name provider (DNS) access
2. Configure Permissions
User Roles:
● Owner: Full control (can add users, configure settings)
● User (Full / Restricted): View or limited action access
● Associate: External service connection (e.g., GA, AIOSEO)
3. Submit and Manage Sitemaps
Submit a Sitemap
● Go to “Sitemaps”
● Submit URL like /sitemap.xml or /sitemap_index.xml
If Status ≠ "Success":
● “Has Errors”: Fix broken links or sitemap format
● “Couldn’t Fetch”: Use the URL Inspection tool to debug
4. Use the URL Inspection Tool
What You Can Do:
● Check index status, last crawl date, mobile usability, and structured data
● Use “Test Live URL” to see how Googlebot renders it
● Click “Request Indexing” to push a page into the crawl queue
5. Analyze Site Performance (SEO)
Metrics in “Performance Report”:
● Clicks: How many users clicked from search results
● Impressions: Times a page appeared in search
● CTR: Click-through rate
● Average Position: Ranking for keywords
Tips:
● Improve low-CTR pages with better titles/meta descriptions
● Focus on keywords with high impressions but low position (striking distance)
6. Fix Indexing Issues
Page Indexing Report:
● Found under Indexing → Pages
● Shows which pages are Indexed / Not Indexed
Common Problems:
● 404 or Soft 404
● "Crawled – currently not
indexed"
● "Discovered – currently
not indexed"
● "Blocked by
robots.txt"
● Fix and click “Validate Fix” to request reindexing
7. Use Enhancements Report (Rich Snippets)
Structured Data Types Tracked:
● FAQs
● Breadcrumbs
● Product info
● Events, Jobs, Recipes
Fix Any “Invalid Items”:
● Use the Schema Markup Validator or the Rich Results Test
8. Review Core Web Vitals (CWV)
Key Metrics:
● LCP (Largest Contentful Paint) – loading speed
● INP (Interaction to Next Paint) – response to user actions
● CLS (Cumulative Layout Shift) – layout stability
Poor scores = lower rankings. Fix using PageSpeed Insights or Lighthouse.
9. Check Page Experience Report
Tracks:
● HTTPS
● Mobile usability
● Core Web Vitals performance
● % of URLs with good user experience
10. Monitor Security and Manual Actions
Go to:
● Security Issues: Hacked/malware detection
● Manual Actions: Penalties for spam, unnatural links, cloaking
If present, resolve the issue & click “Request Review”
11. Optimize Links Using Link Report
View:
● Top linking external domains
● Top linked pages (internal and external)
● Anchor text used
Strategy:
● Improve internal linking to important pages that are underlinked
● Do outreach to earn backlinks from high-authority external sites
12. AMP and Shopping Reports
AMP (Accelerated Mobile Pages)
● Shows valid / invalid AMP issues
● Fix invalid AMP pages using Google’s AMP validator
Shopping:
● Product snippets
● Merchant listings
● Shopping tab errors
Apply the proper schema for product, price, and availability
13. Troubleshoot Crawl & Indexing Issues
Go to Settings > Crawl Stats
● Check:
○ Fetch errors (robots.txt, DNS, server)
○ Response status codes (404s, 5xxs)
Fixes:
● 404s: Add redirects or restore page
● 500 errors: Resolve server downtime
● Blocked scripts: Unblock JS/CSS files in robots.txt
14. Use GSC for Advanced SEO Tactics
Strategies:
● Striking Distance Keywords (position 5–15): Optimize content to push it into the top 3
● Monitor keyword decay: Improve content for dropping queries
● Export bulk GSC data to BigQuery for advanced analysis
● Integrate with Screaming Frog, Ahrefs, and Semrush for deeper insights
15. 21 Growth Tips from WPBeginner (Quickfire)
- Add & verify your site (HTML tag or plugin like AIOSEO)
- Submit sitemap
- Connect to Google Analytics
- Fix 404, Soft 404, and server errors
- Use “Validate Fix” after resolving issues
- Spot keywords with high impressions but low clicks
- Find underlinked internal pages
- Get backlinks from the “Top Linking Sites” list
- Track content decay and keyword drops
- Use Search Statistics inside WordPress with AIOSEO
Final Bonus Tools
● Use AIOSEO to sync GSC into your WordPress dashboard
● Monitor:
○ “Top losing” and “Top winning” keywords
○ Last updated content
○ Content needing a refresh (content decay tracking)
In Detail
GOOGLE SEARCH CONSOLE: DETAILED GUIDE (Step by Step)
1. What is Google Search Console (GSC)?
Google Search Console is a free tool from Google that helps website owners:
● Monitor how their site performs in Google Search
● Submit and track URLs
● Fix crawl and indexing issues
● Improve SEO and Core Web Vitals
● Discover penalties or hacking attempts
It's not for live traffic data (use Google Analytics for that), but it's essential for SEO monitoring and fixing visibility issues.
2. How to Set Up GSC (with All Methods)
Choose Property Type
| Option | Description |
| Domain Property | Tracks everything: all subdomains, HTTP & HTTPS versions together |
| URL Prefix Property | Tracks only that exact URL (https://example.com/blog/, not http:// or www) |
Verification Methods (Based on Property Type)
| Method | Best For | How |
| DNS Record | Domain Property | Add a TXT record to the domain DNS (e.g. Namecheap, Cloudflare) |
| HTML File | URL Prefix | Upload a file to the root of the site (e.g. example.com/google123.html) |
| Meta Tag | URL Prefix | Add a meta tag inside <head> |
| Google Analytics | If GA is already installed | Auto-verifies if you’re an admin |
| Google Tag Manager | For GTM users | Auto-verifies if the container is present |
Tip: Domain property is best for long-term SEO tracking (no missing data).
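If you go with meta-tag verification, you can confirm the tag is actually served before clicking Verify. Below is a minimal sketch, assuming the `requests` library is installed; the site URL and token value are placeholders, not values from this guide:

```python
import requests

SITE = "https://example.com/"                # placeholder site
EXPECTED_TOKEN = "YOUR_VERIFICATION_TOKEN"   # placeholder token copied from GSC

# Fetch the homepage roughly the way a crawler would
html = requests.get(SITE, timeout=10).text

# The verification tag looks like:
# <meta name="google-site-verification" content="TOKEN">
if 'name="google-site-verification"' in html and EXPECTED_TOKEN in html:
    print("Verification meta tag found.")
else:
    print("Tag missing: add it inside <head> and retry verification.")
```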
3. Submitting a Sitemap
What Is It?
A sitemap is a file (usually sitemap.xml) that tells Google:
● What pages exist
● How often they update
● Which pages are important
Steps:
- Go to “Sitemaps”
- Enter: https://example.com/sitemap.xml
- Click Submit
Important:
● A “Success” message = the sitemap was found and readable.
● “Couldn't fetch” = check the URL or server response (use a sitemap validator).
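For reference, a sitemap is just an XML list of URLs in the sitemaps.org format. A minimal sketch that writes one with the Python standard library (the URLs and output path are placeholders):

```python
from datetime import date
from xml.sax.saxutils import escape

# Placeholder list of indexable URLs on your site
urls = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/contact/",
]

today = date.today().isoformat()
entries = "\n".join(
    f"  <url>\n    <loc>{escape(u)}</loc>\n    <lastmod>{today}</lastmod>\n  </url>"
    for u in urls
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

# Save to the site root so it is reachable at https://example.com/sitemap.xml
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```

Most CMSs (or a plugin like AIOSEO) generate this file for you; the sketch only shows what the submitted file contains.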
4. URL Inspection Tool (Your Indexing Microscope)
Use this to check if a specific page:
● Is indexed or not
● Was crawled and when
● Is mobile-friendly
● Has structured data errors
Steps:
- Paste any URL into the top search bar
- Hit “Enter”
- Read the index status
Actions:
● Click “Test Live URL” to see real-time result
● Click “Request Indexing” if content is updated or newly published
Useful when:
● Your page isn’t showing in Google
● You fixed an error and want a fast re-check
5. Performance Report – Search Analytics
This is gold for keyword data and CTR optimization.
Metrics Explained:
| Metric | Meaning |
| Total Clicks | How many searchers clicked through to your site |
| Total Impressions | How many times you appeared in search |
| Average CTR | Clicks ÷ Impressions (helps spot weak titles/descriptions) |
| Average Position | Your ranking (1 = top of Google) |
How to Use It:
● Filter by:
○ Device (Mobile/Desktop)
○ Country (US, BD, etc.)
○ Page (URL)
○ Query (keyword)
● Sort by “Impressions” → find high-volume keywords you're already ranking for
● Improve the title/meta of pages with high impressions but low CTR (see the sketch below)
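A rough sketch of that workflow: export the Queries table from the Performance report as CSV and filter it in pandas. The column names below ("Top queries", "Clicks", "Impressions", "CTR", "Position") match a typical GSC export but may differ, so treat them and the thresholds as assumptions:

```python
import pandas as pd

# Queries.csv exported from Performance > Search results (column names assumed)
df = pd.read_csv("Queries.csv")

# CTR is usually exported as a string like "3.2%"; convert it to a float
df["CTR"] = df["CTR"].str.rstrip("%").astype(float)

# Queries that show up a lot but rarely get clicked:
# candidates for better titles and meta descriptions
weak_ctr = df[(df["Impressions"] >= 500) & (df["CTR"] < 1.0)]

# "Striking distance" keywords: already on page 1-2,
# worth optimizing to push into the top 3
striking = df[df["Position"].between(5, 15)].sort_values("Impressions", ascending=False)

print(weak_ctr.head(10))
print(striking.head(10))
```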
6. Page Indexing Report
Go to: Indexing > Pages
This tells you:
● Which URLs are indexed
● Which aren't, and why
Common Problems & Fixes:
| Issue | What It Means | Fix |
| Crawled – currently not indexed | Google visited but didn’t index | Improve content quality & internal links |
| Discovered – currently not indexed | Google found the URL but hasn’t crawled it yet | Ensure it’s linked from other pages, add it to the sitemap |
| Soft 404 | Page says it's there but Google sees no useful content | Add proper content or return a real 404 |
| Blocked by robots.txt | URL disallowed in robots.txt | Unblock it if you want indexing |
| Alternate page with canonical | Google picked another version of the page | Double-check your canonical tags |
Always use the “Validate Fix” button after solving issues. For robots.txt blocks, a quick local check is sketched below.
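For the “Blocked by robots.txt” case, you can reproduce Googlebot's view of a URL locally with Python's standard-library robots.txt parser. A minimal sketch (the URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()  # fetches and parses the live robots.txt

url = "https://example.com/blog/some-post/"  # page flagged in the report

# Check the rules for Google's main crawler user agent
if robots.can_fetch("Googlebot", url):
    print("robots.txt allows Googlebot to crawl this URL.")
else:
    print("robots.txt blocks Googlebot: adjust the Disallow rules if you want it indexed.")
```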
7. Enhancements Section (Rich Snippets / Schema)
See reports for:
● FAQ pages
● Breadcrumbs
● Product reviews
● Recipes
● Events
If schema markup is incorrect, it will show errors.
Use tools like:
● Schema Markup Validator
● Rich Results Test
Tip: Always fix warnings; they may prevent you from getting rich results (stars, pricing, FAQs).
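For context, these enhancement reports are driven by structured data (usually JSON-LD) in your pages. A minimal sketch that builds FAQPage markup with Python's json module (the questions and answers are placeholders); the output goes inside a <script type="application/ld+json"> tag, and you can confirm it in the Rich Results Test:

```python
import json

# Placeholder FAQ content for a single page
faqs = [
    ("What is Google Search Console?",
     "A free Google tool for monitoring how your site appears in search."),
    ("Is it the same as Google Analytics?",
     "No. GSC covers search visibility; Analytics covers on-site traffic."),
]

faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

# Paste this output into <script type="application/ld+json"> ... </script>
print(json.dumps(faq_schema, indent=2))
```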
8. Core Web Vitals (Page Experience Report)
Found under: Experience > Core Web Vitals
Metrics:
| Metric | Meaning | Target |
| LCP | Time to load the biggest element | < 2.5 seconds |
| INP (new) | Interaction to Next Paint | < 200ms |
| CLS | Page layout shifting | < 0.1 |
If URLs fail CWV, Google may reduce their visibility, even if the content is great.
Tools to Fix:
● Google PageSpeed Insights
● Chrome Lighthouse
● Web.dev/measure
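If you pull field metrics from PageSpeed Insights or your own real-user monitoring, you can grade them against the targets in the table above. A minimal sketch using those thresholds (the measured values are placeholders; real CWV reporting also distinguishes "needs improvement" from "poor", which this simplifies):

```python
# "Good" thresholds taken from the table above
THRESHOLDS = {
    "LCP": 2.5,   # seconds
    "INP": 200,   # milliseconds
    "CLS": 0.1,   # unitless
}

def grade(metric: str, value: float) -> str:
    """Return 'good' or 'needs work' for a single Core Web Vitals value."""
    return "good" if value <= THRESHOLDS[metric] else "needs work"

# Placeholder measurements for one URL
measured = {"LCP": 3.1, "INP": 180, "CLS": 0.24}

for metric, value in measured.items():
    print(f"{metric}: {value} -> {grade(metric, value)}")
```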
9. Mobile Usability Report
Shows mobile-specific problems:
● Text too small
● Tap targets too close
● Content wider than screen
Fix using:
● Responsive CSS
● Larger font sizes
● Mobile-first design approach
10. Security & Manual Actions
Found under:
● Manual Actions = Google penalties (e.g., spammy backlinks, cloaking, user-generated spam)
● Security Issues = Hacked site, malware, phishing
If you get a penalty:
- Fix the issue
- Click “Request Review” and include a detailed explanation
11. Link Reports (External + Internal)
You can view:
● Most linked pages
● Top linking domains
● Anchor texts
● Internal link structure
Use this to:
● Find and strengthen underlinked pages
● Discover toxic or low-quality backlinks
● Balance internal link distribution
12. AMP Report (If Used)
For sites using AMP:
● Check valid AMP URLs
● Resolve AMP-specific issues (like invalid JS or missing metadata)
AMP is not required for SEO anymore, but still useful in news/carousels.
13. Product & Merchant Center Reports
For ecommerce sites using schema:
● Shows issues with price, availability, image, and reviews
● Helps you qualify for Google Shopping and organic product listings
14. Advanced Use Cases
Track Content Decay:
● Export keyword data
● Compare the last 28 days vs the previous 28 days
● Identify dropping queries → refresh or rewrite the content (see the sketch below)
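A rough sketch of that comparison, assuming you export the Queries table for the two periods; the file names and the "Top queries" / "Clicks" column names are assumptions about a typical export:

```python
import pandas as pd

# Two Performance exports: last 28 days and the 28 days before that
current = pd.read_csv("queries_last_28_days.csv")
previous = pd.read_csv("queries_previous_28_days.csv")

merged = current.merge(
    previous, on="Top queries", suffixes=("_now", "_before"), how="inner"
)

# Queries losing clicks period over period = content decay candidates
merged["click_change"] = merged["Clicks_now"] - merged["Clicks_before"]
decaying = merged[merged["click_change"] < 0].sort_values("click_change")

print(decaying[["Top queries", "Clicks_before", "Clicks_now", "click_change"]].head(20))
```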
Striking Distance Keywords:
● Position 5–15 queries
● Optimize the page to move into the top 3
Combine with Google Sheets/Looker Studio:
● Create custom SEO dashboards
● Automate alerts for ranking drops
Final Master Checklist
| Task | What to Do |
| Add & verify GSC (domain property) | Track all subdomains |
| Submit sitemap | Only indexable URLs |
| Monitor Performance Report | Find high-impression, low-CTR pages |
| Inspect key URLs | Request indexing after major updates |
| Fix coverage issues | Use the Page Indexing report |
| Monitor Core Web Vitals | Prioritize pages failing LCP, INP |
| Watch for manual/security issues | Clean up fast & request reconsideration |
| Improve structured data | Use a schema validator, fix warnings |
| Check internal linking | Find orphan or underlinked pages |
| Track backlinks | Disavow if needed (rare now) |
Google's Crawl Stats Report – A Guide to Monitoring Your Site’s Crawlability
WHAT IS THE GOOGLE CRAWL STATS REPORT?
This is a Google Search Console feature that shows how Googlebot crawls your website over time, including frequency, server response, types of resources crawled, and any crawlability issues.
It is especially useful for large sites (1,000+ pages) or those with complex architecture, content updates, or indexing issues.
STEP-BY-STEP CHECKLIST FOR USING & INTERPRETING THE CRAWL STATS REPORT
1. Access the Report in GSC
Path: Go to GSC → Settings → Click "Crawl Stats" under “Crawling”
You'll see key metrics and graphs such as:
● Total Crawl Requests
● Download Size
● Response Time
● Host Status
● Breakdown by Purpose, File Type, Bot Type
2. Understand the Main Metrics
a) Total Crawl Requests
● Total hits Google made to your domain (HTML, CSS, images, JS, etc.)
● Each individual request is counted
● Important for monitoring crawl budget
High requests = Google is actively crawling
Low or dropping requests = you may have crawl issues
b) Total Download Size
● Shows the amount of data (in bytes) Google downloads during crawling
● Includes HTML, CSS, images, scripts, etc.
Large sizes = bloated pages
Minimize via image compression, removing unused JS/CSS, and reducing file sizes
c) Average Response Time
● Time your server takes to respond to crawl requests
| Time | Meaning |
| < 200ms | Excellent |
| 200–500ms | Acceptable |
| > 500ms | Needs improvement |
Fast servers = faster indexing
Slow servers = crawl throttling
3. Check HOST STATUS
Google checks for crawlability in 3 core areas:
| Test | What it Checks | What to Fix |
| robots.txt availability | Can Google fetch it reliably? | Avoid blocking it accidentally |
| DNS resolution | Does the domain resolve correctly? | Check DNS settings and uptime |
| Server connectivity | Does the server respond to requests? | Fix downtime or overload issues |
Look for Green ticks = Good
Red icons = Investigate further
4. Analyze CRAWL RESPONSES
Google classifies every response as:
Good Codes (Expected)
● 200: OK
● 301/308: Permanent redirect
● 302/307: Temporary redirect
● 304: Not Modified (Google uses cached)
Possibly Good Codes
● 404: Not found — acceptable in some cases but excessive 404s = bad UX
● 410: Gone — intentional removals
Bad Codes
● 5XX: Server error – Google couldn’t access the page
● 401/407: Unauthorized access
● DNS Error: Domain can’t be resolved
● robots.txt not available: Google cannot crawl until it gets a response
● Redirect Loop, Page Timeout, or Fetch Error: Serious issues
Fix server, DNS, redirect chains, or missing content as needed.
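To see the same picture from your side, you can tally the status codes your server returns to Googlebot from the access log. A rough sketch, assuming a standard combined log format and that lines containing "Googlebot" are genuine (for a real audit, verify the bot via reverse DNS); the log path is a placeholder:

```python
import re
from collections import Counter

# Combined log format: ... "GET /path HTTP/1.1" 200 1234 ... "... Googlebot/2.1 ..."
LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3}) ')

status_counts = Counter()

with open("access.log", encoding="utf-8", errors="replace") as log:  # placeholder path
    for line in log:
        if "Googlebot" not in line:
            continue
        match = LINE.search(line)
        if match:
            status_counts[match.group("status")] += 1

# Lots of 5xx or 404 responses here should also show up under Crawl Responses in GSC
for status, count in status_counts.most_common():
    print(f"{status}: {count}")
```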
5. Crawled File Types
This shows the percentage of different resources crawled:
| File Type | SEO Use |
| HTML | Content pages |
| Image / Video | Media-heavy sites |
| CSS / JS | Design + interactivity |
| JSON / XML / Feeds | Data feeds, structured content |
| Other | Fonts, PDF, geographic data, etc. |
Watch out for too many image/CSS/JS files; these may hurt crawl budget or inflate crawl size.
6. Crawl Purpose
| Type | What it Means |
| Discovery | New URLs never seen before |
| Refresh | Revisits to existing URLs to check for updates |
If Discovery is low and you publish content regularly, Google may not be finding your new content.
Use updated XML sitemaps and internal links to help discovery.
7. Googlebot Type Breakdown
Shows which bots are hitting your site:
| Bot | Purpose |
| Smartphone / Desktop | Primary Googlebot agents |
| Image / Video | For Google Images or Video search |
| Page resource load | For rendering pages (CSS, JS, etc.) |
| AdsBot | For Dynamic Search Ads |
| StoreBot | Shopping-related crawls |
| Other | News bot, Translate bot, etc. |
It is normal to see mostly “Smartphone” crawls.
If AdsBot is overwhelming your site, limit ad targets or slow feed ingestion.
TROUBLESHOOTING CRAWL ISSUES
Crawl Rate Too High?
If your server is overloaded:
● Block heavy bots using robots.txt temporarily
● Or serve HTTP 503 or 429 (too busy); see the sketch below
● Limit AdsBot if you are running Dynamic Search Ads
● Once stabilized, remove the blocks
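As a rough illustration of the 503/429 option, here is a minimal Flask sketch that answers crawler requests with 503 plus a Retry-After header while the server is overloaded. The overload check is a placeholder and the crawler detection is deliberately crude; this is just one way to signal "come back later", not a prescribed method:

```python
from flask import Flask, Response, request

app = Flask(__name__)

def server_is_overloaded() -> bool:
    """Placeholder: plug in a real load check (CPU, queue depth, etc.)."""
    return False

@app.before_request
def throttle_crawlers():
    user_agent = request.headers.get("User-Agent", "")
    # Temporarily tell crawlers to retry later instead of letting requests time out
    if server_is_overloaded() and "bot" in user_agent.lower():
        return Response("Service temporarily overloaded", status=503,
                        headers={"Retry-After": "3600"})

@app.route("/")
def home():
    return "OK"
```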
Crawl Rate Too Low?
If Google isn’t crawling enough:
● Ensure your sitemap is updated and submitted
● Improve internal linking
● Avoid noindex/nofollow on key pages
● Fix slow pages (Google slows crawling if performance is poor)
● Remove robots.txt blocks for important assets (CSS/JS)
Crawl Budget Waste?
Look for:
● Crawled-but-not-indexed pages
● Duplicate content
● Unnecessary parameters
● Tag/category pages (in a CMS like WordPress)
Fix using:
● Canonical tags
● Robots.txt
● Parameter handling in GSC
● Pruning low-value pages
Optimization Use Cases
Here’s how SEOs & devs use this report in real-world scenarios:
| Use Case | Action |
| New content not getting indexed | Check the discovery vs refresh ratio |
| Server overloaded | Watch crawl spikes & response time |
| Too many 5XX or 4XX errors | Fix broken pages or server bugs |
| Check crawl on faceted nav | Use the file type breakdown |
| Detect ads-related issues | Watch for excessive AdsBot requests |
| Site speed issues | Use average response time + download size |
FINAL RECOMMENDED CHECKLIST
| Task | Tool | Goal |
| Track crawl spikes | Crawl Requests chart | Spot overload or surges in crawling |
| Monitor file sizes | Total Download Size | Optimize images, CSS, JS |
| Audit server health | Avg Response Time + Host Status | Fix downtime or DNS issues |
| Fix crawl errors | Crawl Responses | Resolve 4xx, 5xx, robots.txt issues |
| Ensure new content is crawled | Discovery rate & Sitemap | Ensure indexing of the latest content |
| Clean crawl waste | File Types + Robots.txt | Block unnecessary resources |
In Detail
What Is Google’s Crawl Stats Report?
The Crawl Stats Report is a section in Google Search Console (GSC) that tells you how Googlebot is crawling your site, including:
● How often Google visits your pages
● How large the downloaded files are
● What response codes (200, 404, 500, etc.) your server is returning
● Whether your server is fast enough for Googlebot
● Which types of files (HTML, images, JS) are crawled
This data helps SEOs and developers detect:
● Indexing issues
● Crawl budget waste
● Server overload
● Unseen technical problems
How to Access the Crawl Stats Report
Steps:
- Go to Google Search Console
- Click on your verified property
- In the left menu, scroll to Settings
- Click “Crawl Stats”
From here, you will see crawl data updated every 2–3 days for the last 90 days.
What You See in the Report
1. Total Crawl Requests
● How many files Googlebot requested from your server (pages, scripts, images, etc.)
● Example: 200,000 requests in 30 days = high activity
Use it to understand how often Google crawls your content.
2. Total Download Size
● The combined size (in MB or GB) of all resources Google downloaded
● Includes HTML, images, videos, CSS, JS, and fonts
If this is large, your pages are heavy. Compress images and minify JS/CSS to reduce it.
3. Average Response Time
● How fast your server responds to Googlebot’s requests
| Response Time | Meaning |
| < 200ms | Excellent |
| 200–500ms | Acceptable |
| > 500ms | Google may slow crawling due to slowness |
Tip: High response time = slow indexing. Upgrade hosting or fix bottlenecks.
4. Crawl Requests Over Time (Graph)
Shows spikes, drops, and patterns. Key things to look for:
● Spikes → Google found lots of new URLs (e.g., after a site migration)
● Drops → Server issues or robots.txt blocks
If there’s a major crawl drop, check for:
● Site outages
● DNS problems
● Server throttling
Host Status: Is Your Server Healthy?
There are 3 tests under “Host Status”:
| Test | Meaning | What to Check |
| robots.txt fetch | Can Google access your robots.txt file? | Make sure it’s not blocked or slow |
| DNS resolution | Can Google resolve your domain name? | Check your DNS configuration |
| Server connectivity | Can Googlebot connect to your server reliably? | Check for 5xx errors or firewalls |
If any of these show "Failed", Google will crawl less, and you might lose rankings.
What Types of Files Is Google Crawling?
Google divides requests by file type:
| Type | Meaning |
| HTML | Web pages (critical for SEO) |
| CSS, JS | Styling and interactivity |
| Images | Product photos, banners, etc. |
| Video | Embedded video content |
| Other | Fonts, XML, etc. |
Tip:
● Too many requests to non-HTML files can waste crawl budget
● Ensure JS/CSS are fast-loading and not blocking content rendering
Crawl Purpose: Discovery vs Refresh
| Type | What It Means |
| Discovery | Google found and crawled a new URL |
| Refresh | Google re-crawled an existing page |
Use this to answer:
● “Is Google finding my new pages?”
● “Are my old pages being updated regularly?”
If Discovery is low:
● Add links to new pages
● Include them in your sitemap
● Internally link from high-authority pages
Bot Type Breakdown
Googlebot has multiple “agents”; the Crawl Stats report shows which ones visit your site:
| Bot | Purpose |
| Smartphone | Mobile-first indexing crawler (most important) |
| Desktop | Occasionally crawled for a desktop preview |
| Image / Video | For Google Image or Video search |
| AdsBot | For Google Ads & Shopping feeds |
| Page Resource Loaders | Crawls for rendering (CSS, JS, fonts, etc.) |
If Smartphone Googlebot is not your top crawler, your site may have mobile issues; a quick way to cross-check your own logs is sketched below.
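A rough sketch of that cross-check: bucket access-log hits by Googlebot user-agent substrings. The log path is a placeholder, the substrings cover only common agents, and smartphone vs desktop Googlebot are lumped together (telling them apart needs the "Mobile" token in the user agent):

```python
from collections import Counter

# Substring -> bucket, based on common Googlebot user-agent strings
BOT_BUCKETS = [
    ("AdsBot-Google", "AdsBot"),
    ("Googlebot-Image", "Image"),
    ("Googlebot-Video", "Video"),
    ("Storebot-Google", "StoreBot"),
    ("Googlebot", "Googlebot (smartphone/desktop)"),
]

counts = Counter()

with open("access.log", encoding="utf-8", errors="replace") as log:  # placeholder path
    for line in log:
        for needle, bucket in BOT_BUCKETS:
            if needle in line:
                counts[bucket] += 1
                break  # first, most specific match wins

for bucket, hits in counts.most_common():
    print(f"{bucket}: {hits}")
```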
Errors & Warnings You Might See
Common HTTP Response Codes
| Code | Meaning | Action |
| 200 | OK | Good |
| 301/308 | Redirects | Acceptable if not chained |
| 404 | Not Found | Remove broken links |
| 500/503 | Server errors | Fix server or hosting issues |
| 403 | Forbidden | May block indexing |
| DNS errors | Domain not resolved | Check hosting or DNS records |
Excessive errors = Google slows down or skips crawling your site.
Real SEO Use Cases
Use Case #1: Site Not Getting Indexed
● Check Crawl Stats → Is “Discovery” happening?
● Use the URL Inspection Tool on problem URLs
● Add those URLs to your sitemap + internal links
Use Case #2: Server Too Slow
● High “Response Time” in Crawl Stats
● Fix:
○ Optimize server
○ Use CDN
○ Enable browser caching
Use Case #3: Crawl Budget Waste
● High requests for:
○ Tag pages
○ Filtered URLs (?color=red); a quick detection sketch follows this use case
○ Unimportant media or JS
● Solution:
○ Block with robots.txt
○ Use noindex
○ Canonicalize duplicate pages
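A rough sketch of that detection step: take the URLs Googlebot has been requesting (from your logs or the example URLs in Crawl Stats) and count which query parameters show up most often, assuming you have them in a plain text file (the path and the top-10 cutoff are placeholders):

```python
from collections import Counter
from urllib.parse import urlsplit, parse_qsl

param_counts = Counter()

# One crawled URL per line, e.g. exported from logs or Crawl Stats examples
with open("crawled_urls.txt", encoding="utf-8") as f:
    for line in f:
        url = line.strip()
        if not url:
            continue
        query = urlsplit(url).query
        for key, _ in parse_qsl(query):
            param_counts[key] += 1

# Parameters that appear very often are crawl-budget-waste candidates
# (consider robots.txt rules, noindex, or canonical tags for them)
for param, count in param_counts.most_common(10):
    print(f"?{param}= appears on {count} crawled URLs")
```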
Use Case #4: Migration or Redesign
● Spike in crawl requests = expected
● Monitor:
○ Errors (404s, 500s)
○ Slowdowns
○ Drops in discovery
If Google isn’t crawling your new URLs → Check internal links and XML sitemap
Final SEO Monitoring Checklist Using Crawl Stats
| Task | What to Look At | Why |
| Monitor response time | < 200ms | Fast = better crawling |
| Track file types | Too many JS/images? | Crawl budget waste |
| Audit HTTP codes | 404s/5xx = bad | Fix broken URLs |
| Compare discovery vs refresh | New pages discovered? | Helps indexing of new content |
| Detect crawl drops | Sudden dips | Server or robots.txt issues |
| Clean up bot noise | AdsBot/Image bot overloading? | Optimize or restrict |
| Use with GSC > Indexing report | Combine both views | Full crawl + index clarity |
