SEO data is crucial for making business decisions about your website. The most important types of SEO data include:
- Keyword intent: Whether the keyword is transactional, informational, navigational, or commercial.
- Click-through rate (CTR): The percentage of people who saw your result in search and clicked through to your site.
- Impressions: The number of times your content appeared in search results.
- Clicks: The number of times people clicked through to your website.
- Average Position: Where your website appeared in Google SERPs for a keyword.
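These metrics are related: CTR is derived from clicks and impressions. Here is a minimal sketch in Python (the sample numbers are hypothetical):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: the percentage of impressions that became clicks."""
    if impressions == 0:
        return 0.0
    return round(clicks / impressions * 100, 2)

# Hypothetical example: 40 clicks out of 1,000 impressions
print(ctr(40, 1000))  # 4.0
```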
Let's take a look at how you can gather SEO data to make decisions about your online marketing. All of our SEO services are ideal for helping you make the most out of your gathered SEO data.
What To Do If You Don't Have an Expensive Tool to Gather SEO Data
SEM tools are expensive. You will likely pay $300 per month or more to use SEMrush, Ahrefs, Majestic, or another SEO tool. A lower-cost option is Diib, an SEO data platform available in both free and paid versions. But if you just want to gather SEO data, you can get everything you need from Google Search Console.
What Is Google Search Console?
Google Search Console is a free tool that shows you keyword data for your website. You can also examine backlinks, spot technical errors on your website, and use that data to improve your search rankings. Start by submitting your site to Google Search Console so that you can see how your site is performing online.
Key Features of Google Search Console for Gathering SEO Data
GSC offers a range of features that allow users to analyze their website's search performance effectively. Some of the most important metrics include:
- Search Impressions – This metric shows how often a website appears in Google search results, even if the user does not click on it.
- Clicks – GSC tracks how many users click on a website’s link in search results and visit the site.
- Search Queries – It provides insights into the keywords and search terms that drive users to the website.
- Page Performance – Users can see which pages receive the most clicks and impressions, helping them identify high-performing content.
Google Search Console’s Limitations For Gathering SEO Data
While GSC provides critical data about search visibility, it does not track user behavior once they land on the website. For example, it does not provide insights into how long visitors stay on a page, whether they interact with content, or whether they complete a desired action such as filling out a form. For these types of analytics, website owners need to use Google Analytics.
Comparing Google Search Console and Google Analytics
One of the key differences between GSC and GA is their focus. GSC primarily deals with search data before a user visits a website, while GA focuses on user interactions after they land on a site. Because of this distinction, GSC is the best tool for analyzing search rankings, keyword performance, and how often a site appears in Google Search.
Using Google Search Console Data in Google Analytics
To get a more comprehensive view of website performance, users can connect GSC with GA. This integration allows for deeper analysis of search data alongside user behavior metrics. By linking the two tools, users can:
- View Search Console reports directly within Google Analytics.
- Analyze search queries and landing pages driving organic traffic.
- Track conversions and goal completions linked to Google Search visitors.
How To Gather SEO Data With Google Sheets
1. Navigate to the Performance Report in Google Search Console
- Log into Google Search Console and select the website (property) you want to analyze.
- In the left-hand menu, click on “Performance” (or “Search results” under “Performance”).
- By default, this report shows the last 3 months of search data, but you can adjust the time range as needed.
2. Apply Filters to Refine Your Data (Optional)
Before exporting, you can customize your data using the available filters:
- Search Type: Choose between Web, Image, Video, or News search.
- Date Range: Select a predefined period (last 7 days, 3 months, etc.) or set a custom date range.
- Queries: Filter for specific search queries (keywords).
- Pages: View data for specific landing pages.
- Countries: Break down search performance by country.
- Devices: Analyze traffic from desktop, mobile, or tablet users.
- Search Appearance: Filter by rich results, videos, AMP, and more.
Applying filters before exporting ensures that your dataset is relevant and manageable.
3. Export the Data to Google Sheets
Once you’ve selected your filters and are ready to analyze your SEO performance:
- Click the “Export” button in the top right corner of the Performance report.
- Select “Google Sheets” from the export options.
- A new Google Sheet will open automatically with the exported data.
The exported sheet includes key SEO metrics:
- Queries (keywords)
- Pages (landing pages that receive traffic)
- Clicks (number of times users clicked your site in search results)
- Impressions (number of times your site appeared in search results)
- CTR (Click-Through Rate)
- Average Position (average ranking of your page for a given keyword)
4. Analyze and Customize the Data in Google Sheets
Once your data is in Google Sheets, you can perform further analysis:
- Sort and filter to find top-performing keywords and pages.
- Create pivot tables to compare CTR vs. Position or Clicks by Device.
- Use conditional formatting to highlight high- and low-performing pages.
- Apply formulas to calculate growth trends over time.
For advanced automation, you can also connect the Google Search Console API to Google Sheets for real-time updates.
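The sorting step above can be sketched in plain Python. The rows below are hypothetical stand-ins for an exported Performance report:

```python
# Hypothetical rows exported from the GSC Performance report
rows = [
    {"query": "seo data", "clicks": 120, "impressions": 4000, "position": 3.2},
    {"query": "gather seo data", "clicks": 15, "impressions": 2500, "position": 8.7},
    {"query": "search console export", "clicks": 60, "impressions": 900, "position": 2.1},
]

# Sort by clicks to surface top-performing queries, as in step 4
top = sorted(rows, key=lambda r: r["clicks"], reverse=True)
for r in top:
    ctr = r["clicks"] / r["impressions"] * 100
    print(f'{r["query"]}: {r["clicks"]} clicks, CTR {ctr:.1f}%')
```

The same logic works whether the data lives in a Google Sheet, a CSV download, or rows pulled from the API.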
How to Use Your Gathered SEO Data
Keyword Optimization
To optimize a specific page using impressions, clicks, CTR, and average position from Google Search Console (GSC), start by identifying pages with high impressions but low clicks and CTR. A high number of impressions means the page is frequently appearing in Google search results, but if the clicks are low, it suggests that users are not finding the listing compelling enough to engage with. This could be due to an unoptimized title tag, meta description, or a lack of relevance to search intent. Improving these elements by making the title more engaging, adding power words, or ensuring the meta description clearly conveys value can help increase CTR and bring in more traffic without necessarily improving rankings.
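The "high impressions, low CTR" screen described above is easy to automate. This is a sketch with hypothetical page data and made-up thresholds (1,000 impressions, 1% CTR); tune both to your site:

```python
def needs_snippet_work(pages, min_impressions=1000, max_ctr=1.0):
    """Flag pages that appear often in search but rarely get clicked:
    candidates for title-tag and meta-description rewrites."""
    flagged = []
    for p in pages:
        ctr = p["clicks"] / p["impressions"] * 100 if p["impressions"] else 0.0
        if p["impressions"] >= min_impressions and ctr < max_ctr:
            flagged.append(p["url"])
    return flagged

# Hypothetical GSC page data
pages = [
    {"url": "/blog/seo-basics", "clicks": 5, "impressions": 3000},
    {"url": "/services", "clicks": 90, "impressions": 2000},
]
print(needs_snippet_work(pages))  # ['/blog/seo-basics']
```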
Next, analyze the average position of the page to understand how close it is to the top search results. If the page is ranking in positions 5-10, it's getting visibility but likely losing clicks to higher-ranking competitors. Optimizing on-page content—by updating headings, improving keyword relevance, adding internal links, and enhancing content depth—can help push it higher. Additionally, comparing the targeted keywords to the actual search queries users are using can reveal gaps where content might need slight rewording or additional sections to better match user intent.
Finally, improving engagement metrics like dwell time and reducing bounce rate can also contribute to better rankings over time. If users are clicking but not staying long, the page may need better structuring, more engaging visuals, or improved readability. Another powerful approach is to add schema markup (such as FAQ schema) to enhance visibility in search results. In some cases, building backlinks to the page or internally linking from other high-authority pages can also boost rankings and increase clicks. By continuously monitoring GSC data and adjusting accordingly, you can refine your optimization strategy and significantly improve a page’s organic performance.
Backlink Optimization
Google Search Console’s Link Report provides valuable insights into the external websites linking to your site. It helps SEO professionals monitor their backlink profile and detect potential harmful or spammy links. One of the most effective ways to identify bad backlinks is by looking at domains that have an unusually high number of links pointing to your site. If a single site is sending hundreds or thousands of links, especially from irrelevant or low-quality domains, this could be a sign of spam, link manipulation, or a negative SEO attack.
To analyze potential bad backlinks, navigate to Search Console > Links > External Links and review the "Top Linking Sites" section. Here, check for suspicious patterns such as:
- Sites with low authority or unrelated content linking excessively.
- Domains that look like link farms or spam networks (e.g., random directory sites, private blog networks, or auto-generated sites).
- A high number of links coming from a single page rather than diverse, reputable sources.
- Sites that seem to have irrelevant anchor text or contain links in unnatural placements.
Once bad backlinks are identified, you have two primary actions: reach out to the site owner and request link removal, or use Google’s Disavow Tool to tell Google to ignore those links in ranking calculations. While Google’s algorithms are good at detecting and discounting spam links, excessive bad backlinks can still pose risks, potentially affecting your SEO performance or triggering manual penalties.
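If you go the disavow route, Google's Disavow Tool accepts a plain-text file with one `domain:` entry or full URL per line and `#` for comments. A small sketch for generating that file (the domains are hypothetical):

```python
def build_disavow_file(bad_domains, bad_urls=()):
    """Build the text of a disavow file in the format Google's
    Disavow Tool accepts: 'domain:' entries or full URLs, one per line."""
    lines = ["# Disavow file generated from a Search Console link-report review"]
    lines += [f"domain:{d}" for d in bad_domains]
    lines += list(bad_urls)
    return "\n".join(lines)

# Hypothetical spam domains found in the Top Linking Sites report
print(build_disavow_file(["spammy-directory.example", "link-farm.example"]))
```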
Internal Linking
Internal linking is a crucial SEO strategy that helps search engines understand the structure of your website and distribute ranking power across different pages. Google Search Console’s "Internal Links" report allows you to identify pages with a low number of internal links, which could be underperforming due to a lack of internal authority. Pages with fewer internal links are harder for search engines to crawl and may not rank as well as they should. By strategically increasing internal links to these pages, you can improve their visibility and rankings.
To optimize internal linking, start by navigating to Google Search Console > Links > Internal Links and look at the pages that receive the fewest internal links. These are often deep pages, newly published content, or neglected pages that could benefit from more internal linking. You can increase internal links by:
- Adding links within existing content on high-authority pages that already rank well.
- Including links in navigation menus to make the page more accessible.
- Using the website footer to add important links that should be indexed frequently.
- Placing internal links in contextual anchor text that naturally fits within the content.
A well-structured internal linking strategy ensures that important pages receive more link equity, improving their chances of ranking higher in search results. Additionally, internal links help enhance user navigation by guiding visitors to relevant content, increasing engagement and dwell time. Regularly auditing your internal linking structure in Google Search Console and adjusting accordingly can significantly improve your website’s crawlability, authority distribution, and search performance.
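The audit described above boils down to counting inbound internal links per page, which is what GSC's Internal Links report does for you. A sketch over a hypothetical link graph:

```python
from collections import Counter

# Hypothetical internal link graph: (source page, target page) pairs
links = [
    ("/", "/services"),
    ("/", "/blog/seo-basics"),
    ("/services", "/contact"),
    ("/blog/seo-basics", "/services"),
]

# Count inbound internal links per page, mirroring GSC's Internal Links report
inbound = Counter(target for _, target in links)

# Pages with the fewest inbound links are the candidates for new internal links
least_linked = sorted(inbound.items(), key=lambda kv: kv[1])
print(least_linked)
```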
Page Indexing
Google Search Console’s Page Indexing Report is an essential tool for monitoring how well your site’s pages are being indexed by Google. It provides insights into which pages are successfully indexed, which are excluded, and why certain pages have errors preventing them from appearing in search results. By analyzing this data, you can identify technical SEO issues, fix indexing problems, and improve your website’s overall search visibility.
Common Page Indexing Errors & How to Fix Them
- "Excluded by ‘Noindex’ Tag" – This means that a meta noindex directive has been applied to the page, telling Google not to index it. If the page should be indexed, remove the noindex tag from the meta robots directive or HTTP header.
- "Crawled – Currently Not Indexed" – Google has crawled the page but has not yet added it to the index. This could be due to thin content, duplicate content, or a lack of SEO value. Improving the page’s content quality, internal linking, and keyword relevance can help.
- "Discovered – Currently Not Indexed" – Google has found the URL but has not yet crawled it. This usually happens when a site has too many new pages or a low crawl budget. Improving internal linking, generating backlinks, and updating the sitemap can encourage indexing.
- "Blocked by robots.txt" – If a page is blocked in the robots.txt file, Google will not crawl it. Review your robots.txt settings to ensure critical pages are not unintentionally disallowed.
- "Soft 404" – This occurs when a page appears to be a 404 (not found) error but still returns a 200 (OK) HTTP status code. Fix this by ensuring that non-existent pages return a proper 404 status or by redirecting users to relevant content.
- "Page with Redirect" – If a page has a 301 or 302 redirect, Google may not index it. Ensure the redirect chain is correct, and avoid unnecessary redirects.
- "Alternate Page with Proper Canonical Tag" – This means the page is considered a duplicate of another page and is being consolidated under the canonical URL. If this is incorrect, review your canonical tags to ensure the right page is indexed.
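For the robots.txt case, you can verify whether a URL is blocked without waiting for a recrawl by checking it against your rules locally. Python's standard library includes a robots.txt parser; the rules below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents; parse() accepts the file's lines
rules = RobotFileParser()
rules.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Googlebot (matched by the '*' group here) may not crawl /private/ URLs
print(rules.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rules.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```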
How to Use the Page Indexing Report to Improve Your Site
- Prioritize Critical Pages – Focus on fixing indexation issues for high-value content, landing pages, and product pages that should be appearing in search results.
- Submit Pages for Reindexing – After fixing an issue, use the "Request Indexing" feature in Google Search Console to speed up reindexing.
- Improve Internal Linking – Strengthen the internal linking structure of non-indexed pages to make them easier for Google to discover and crawl.
- Update and Improve Content – If pages are being ignored due to low quality or thin content, expand and optimize them to provide more value.
- Monitor Indexing Trends – Regularly check the Indexing Report to catch and resolve issues before they negatively impact your search performance.
By actively using Google Search Console’s Page Indexing Report, website owners and SEO professionals can ensure that important pages are properly indexed, technical errors are minimized, and overall search visibility improves.
Core Web Vitals Data
Core Web Vitals (CWV) are a set of performance metrics that Google uses to measure a website's loading speed, interactivity, and visual stability. These factors significantly impact user experience and search rankings, making them essential for SEO. Google Search Console (GSC) provides a Core Web Vitals Report that helps site owners analyze, diagnose, and optimize their website’s performance based on real user data from the Chrome User Experience Report. Understanding and improving these metrics can lead to higher rankings, better user engagement, and increased conversions.
To gather Core Web Vitals data, log into Google Search Console, select your website, and navigate to Experience > Core Web Vitals in the left-hand menu. The report is divided into mobile and desktop performance, classifying URLs into three categories: Good, Needs Improvement, and Poor. Clicking on "Open Report" provides a detailed breakdown of which URLs require attention and the specific Core Web Vitals metrics affecting their performance.
There are three primary Core Web Vitals metrics to focus on. Largest Contentful Paint (LCP) measures loading speed, indicating how long it takes for the largest visible content (such as an image or text block) to fully load. A good LCP score is 2.5 seconds or less, while anything above 4 seconds is considered poor. To improve LCP, website owners should optimize server response time, compress images, implement lazy loading, and use efficient caching techniques.
First Input Delay (FID) measures interactivity, tracking the delay between a user’s first interaction (like clicking a button) and the browser’s response. A good FID score is 100 milliseconds or less, while anything over 300 milliseconds requires improvement. To optimize FID, reducing JavaScript execution time, code splitting, and minimizing third-party scripts is crucial.
Cumulative Layout Shift (CLS) measures visual stability, assessing whether unexpected layout shifts occur while a page is loading. A CLS score below 0.1 is ideal, while anything above 0.25 can create a poor user experience. Fixing CLS issues involves setting size attributes for images and videos, avoiding dynamically injected content, and using CSS animations responsibly.
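The thresholds above can be turned into a simple bucketing function, mirroring GSC's Good / Needs Improvement / Poor categories. The sample readings are hypothetical:

```python
def classify_cwv(lcp_s, fid_ms, cls):
    """Bucket Core Web Vitals readings using the thresholds above:
    lcp_s in seconds, fid_ms in milliseconds, cls is unitless."""
    def bucket(value, good, poor):
        if value <= good:
            return "Good"
        if value <= poor:
            return "Needs Improvement"
        return "Poor"
    return {
        "LCP": bucket(lcp_s, 2.5, 4.0),
        "FID": bucket(fid_ms, 100, 300),
        "CLS": bucket(cls, 0.1, 0.25),
    }

# Hypothetical field readings for one URL
print(classify_cwv(lcp_s=3.1, fid_ms=80, cls=0.3))
```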
Improving Core Web Vitals can have a significant impact on SEO. Since Google prioritizes mobile-first indexing, addressing mobile performance issues should be a top priority. By analyzing affected URLs in GSC and using tools like Google Lighthouse and PageSpeed Insights, website owners can obtain actionable recommendations to optimize their CWV scores. Additionally, monitoring performance over time ensures that implemented fixes lead to real improvements. A website with faster loading times, better interactivity, and stable layouts not only improves rankings but also reduces bounce rates, increases engagement, and enhances user experience, ultimately driving more traffic and conversions.
Get Help Gathering SEO Data and Interpreting It For Your Business
If you need help interpreting your SEO Data, let us know. SEO Gone Wild can conduct a thorough analysis of your website in order to help you start ranking more quickly and get the results that you want.