Crawl errors are of two types: 1) Site Errors 2) URL Errors
Site errors: This section of the report shows the main issues for the past 90 days that prevented Googlebot from accessing your entire site (click any box to display its chart).
URL errors: This section lists specific errors Google encountered when trying to crawl specific desktop or phone pages. Each main section in the URL Errors report corresponds to a different crawling mechanism Google uses to access your pages, and the errors listed are specific to those kinds of pages.
Crawl Stats: This report shows pages crawled per day, kilobytes downloaded per day, and time spent downloading a page (in milliseconds).
Fetch as Google: The Fetch as Google tool enables you to test how Google crawls or renders a URL on your site. You can use Fetch as Google to see whether Googlebot can access a page on your site, how it renders the page, and whether any page resources (such as images or scripts) are blocked to Googlebot.
Fetch: Fetches a specified URL on your site and displays the HTTP response. It does not request or run any associated resources (such as images or scripts) on the page. This is a relatively quick operation that you can use to check or debug suspected network connectivity or security issues with your site, and to see whether the request succeeds or fails.
Fetch and render: Fetches a specified URL on your site, displays the HTTP response, and also renders the page for a specified platform (desktop or smartphone). This operation requests and runs all resources on the page (such as images and scripts). Use it to detect visual differences between how Googlebot sees your page and how a user sees your page.
robots.txt Tester: Used to check your robots.txt file for errors and to edit it. A robots.txt file is a file at the root of your site that indicates those parts of your site you don’t want accessed by search engine crawlers.
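As a sketch of what the tester checks, Python’s standard library can parse a robots.txt file and report whether a given crawler may fetch a URL. The rules and URLs below are made-up examples, not real site data:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: keep all crawlers out of /private/,
# allow everything else.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under "User-agent: *" here, so /private/ is blocked.
print(parser.can_fetch("Googlebot", "http://www.example.com/private/report.html"))
print(parser.can_fetch("Googlebot", "http://www.example.com/about.html"))
```

The robots.txt Tester in Search Console performs the same kind of rule matching, but against the live file at your site’s root and Google’s actual crawlers.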
Sitemaps: Creating a sitemap helps search engines better crawl and categorize your site. You can easily create and verify a sitemap for any of your publicly viewable Google Sites through Google Search Console.
Steps to submit the sitemap in the Google search console
- Sign in to Google Search Console.
- On your Search Console home page, select your website.
- In the left sidebar, click Crawl and then Sitemaps.
- Click the Add/Test Sitemap button in the top right.
- Enter sitemap_index.xml into the text box that appears.
- Click Submit.
- If your sitemap has already been submitted, you can use the Resubmit button to submit it again for Google to crawl.
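For reference, a sitemap index file such as the sitemap_index.xml entered above typically lists the individual sitemap files for the site. The URLs below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2017-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-posts.xml</loc>
  </sitemap>
</sitemapindex>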
The Search Analytics report shows how often your site appears in Google search results. Filter and group data by categories such as pages, countries, devices, search type, and dates. Use the results to improve your site’s search performance, for example:
- See how your search traffic changes over time, where it’s coming from, and what search queries are most likely to show your site.
- See which pages have the highest (and lowest) click-through rate from Google search results.
Search analytics normally reports on:
Clicks: The number of clicks visitors make from Google SERPs (search engine results pages) to the property.
Impressions: The number of times links to your site appear in Google SERPs viewed by users.
CTR: Click-through rate, defined as CTR = Clicks / Impressions × 100.
Position: The position of the website in Google SERP listings.
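The CTR formula above can be sketched in a few lines of Python. The query names and counts are made-up sample data, not real report values:

```python
# Compute CTR for Search Analytics-style rows (sample data).
rows = [
    {"query": "blue widgets", "clicks": 120, "impressions": 3000},
    {"query": "widget shop",  "clicks": 45,  "impressions": 900},
]

for row in rows:
    # CTR = Clicks / Impressions * 100
    ctr = row["clicks"] / row["impressions"] * 100
    print(f'{row["query"]}: CTR = {ctr:.1f}%')
```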
Internal links: Internal links are links that go from one page on a domain to a different page on the same domain. They are commonly used in main navigation.
Links to Your Site:
The Links to Your Site report lists links that Googlebot discovered during its crawling and indexing process, as well as the most common link sources and the pages on your site with the most links. In addition, you can also see the most common anchor text found by Google. Click each list item to see more detailed information. If users reach a page on your site as a result of clicking a link with a redirect, that intermediate link will also be listed.
Internal Links Report:
View internal links to other pages on your site
The number of internal links pointing to a page is a signal to search engines about the relative importance of that page. If an important page does not appear in this list, or if a less important page has a relatively large number of internal links, you should consider reviewing your internal link structure.
The Internal Links page lists a sample of pages on your site that have incoming links from other internal pages.
If you’re deleting or renaming pages on your site, check this data first to help identify and prevent potential broken links.
Manual Actions report:
The Manual Actions report lists instances where a human reviewer has determined that pages on your site are not compliant with Google’s webmaster quality guidelines. Google’s algorithms can detect the vast majority of spam and demote it automatically; for the rest, we use human reviewers to manually review pages and flag them if they violate the guidelines. Flagged sites can be demoted or even removed entirely from Google search results.
If you manage one or more websites designed for users in a specific country speaking a specific language, you want to make sure that search results display the relevant language and country version of your pages.
Many websites serve users from around the world with content translated or targeted to users in a certain region. Google uses the rel="alternate" hreflang="x" attributes to serve the correct language or regional URL in Search results.
Imagine you have an English language page hosted at http://www.example.com/, with a Spanish alternative at http://es.example.com/. You can indicate to Google that the Spanish URL is the Spanish-language equivalent of the English page in one of three ways:
HTML link element in header. In the HTML <head> section of http://www.example.com/, add a link element pointing to the Spanish version of that webpage at http://es.example.com/, like this:
<link rel="alternate" hreflang="es" href="http://es.example.com/" />
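Note that hreflang annotations should be reciprocal: the Spanish page needs to link back to the English one, or the annotations may be ignored. A sketch of both pages’ <head> sections, using the example URLs above:

```html
<!-- In the <head> of http://www.example.com/ (English page) -->
<link rel="alternate" hreflang="es" href="http://es.example.com/" />

<!-- In the <head> of http://es.example.com/ (Spanish page) -->
<link rel="alternate" hreflang="en" href="http://www.example.com/" />
```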
Global web traffic from mobile devices is on the rise, and recent studies show that mobile visitors are more likely to revisit mobile-friendly sites. The mobile usability report identifies pages on your site with usability problems for visitors on mobile devices.
HTML Improvements report:
The HTML Improvements page shows you potential issues Google found when crawling and indexing your site. We recommend that you review this report regularly to identify changes that potentially increase your rankings in Google search results pages while providing a better experience for your readers.
These issues don’t prevent your site from being crawled or indexed, but paying attention to them can improve the user experience and even help drive traffic to your site. For example, title and meta description text can appear in search results pages, and useful, descriptive text is more likely to be clicked on by users.
To view data for your site, you need to make sure you’ve added your site to your account and verified ownership. If we haven’t crawled or indexed your site yet, we won’t be able to display this data.
Data that may be included on this page include:
- Title problems: Potential problems with the title tag on your pages, such as missing or repeated page titles.
- Meta description problems: Potential problems with duplicate or otherwise problematic Meta descriptions.
- Non-indexable content: Pages containing non-indexable content, such as some rich media files, video, or images.
Because we’re always working to improve the type and quality of the data we provide in site reports, this data may change from time to time.
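As a sketch of what the report is checking for, a page with a unique, descriptive title and meta description might look like this (the text values are illustrative, not prescriptive):

```html
<head>
  <title>Handmade Blue Widgets – Example Store</title>
  <meta name="description"
        content="Browse our range of handmade blue widgets, with free shipping on orders over $25.">
</head>
```

Pages with missing, duplicate, too-long, or too-short titles and descriptions are what show up under the Title problems and Meta description problems headings.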
The links shown below some of Google’s search results, called sitelinks, are meant to help users navigate your site. Our systems analyze the link structure of your site to find shortcuts that will save users time and allow them to quickly find the information they’re looking for.
Accelerated Mobile Pages (AMP) report:
The AMP report shows a count of AMP pages either successfully indexed or with AMP-related errors encountered when Google crawled your site. Connectivity errors, broken links, and other errors are not shown in this report. Pages with AMP errors or lacking required structured data elements will not be shown in Google search results with AMP-related features.
Webpages developed to AMP standards load quickly on mobile devices.
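A heavily abbreviated sketch of an AMP page’s required markup is shown below, assuming a canonical desktop version at http://www.example.com/article.html; the mandatory AMP boilerplate CSS rules are omitted for brevity, and a real page must include them to validate:

```html
<!doctype html>
<html amp>
<head>
  <meta charset="utf-8">
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <link rel="canonical" href="http://www.example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <!-- Required amp-boilerplate <style> rules omitted for brevity -->
</head>
<body>
  <h1>Hello, AMP</h1>
</body>
</html>
```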
Structured data report:
If Google understands the markup on your pages, it can use this information to add rich snippets and other features to your search result. For example, the search snippet for a restaurant might show its average review and price range. You can add structured data to your page using the schema.org vocabulary and formats such as Microdata and RDFa, alongside other approaches such as Microformats. You can also add structured data by tagging the data on your page using Data Highlighter.
The Structured Data page in Search Console shows the structured information that Google was able to detect on your site. It also provides information about errors in page markup that may prevent rich snippets (or other search features) from being displayed.
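As a sketch, the restaurant example above could be marked up with schema.org microdata like this (the business name and values are illustrative):

```html
<div itemscope itemtype="https://schema.org/Restaurant">
  <span itemprop="name">Example Bistro</span>
  <span itemprop="priceRange">$$</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="https://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span> stars from
    <span itemprop="reviewCount">120</span> reviews
  </div>
</div>
```

Markup like this is what the Structured Data page reports on, including items where required properties are missing or malformed.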
About Data Highlighter:
Data Highlighter is a webmaster tool for teaching Google about the pattern of structured data on your website. You simply use Data Highlighter to tag the data fields on your site with a mouse. Then Google can present your data more attractively — and in new ways — in search results and in other products such as the Google Knowledge Graph.
The Index Status report provides data about the URLs that Google tried to index in the current property for the past year.
The Index Status report shows how many of your pages are indexed in Google search results and any increase or decrease in the indexed page count.
Blocked by robots
Here you can track the URLs that are being blocked.
Here you can track the webpages that have been removed from the website over the last year.
These hosts serve site resources that are blocked to Googlebot. If Googlebot can’t access important resources on your page, the page might be indexed incorrectly.
Temporarily remove URLs that you own from search results. To remove content permanently, you must remove or update the source page.
Disavow backlinks
PageRank is Google’s opinion of the importance of a page based on the incoming links from other sites. (PageRank is an important signal, but it’s one of more than 200 that we use to determine relevancy.) In general, a link from a site is regarded as a vote for the quality of your site.
Google works very hard to make sure that actions on third-party sites do not negatively affect a website. In some circumstances, incoming links can affect Google’s opinion of a page or site. For example, you or a search engine optimizer (SEO) you’ve hired may have built bad links to your site via paid links or other link schemes that violate our quality guidelines. First and foremost, we recommend that you remove as many spammy or low-quality links from the web as possible.
This is a two-step process. First, you’ll need to download a list of links to your site. Next, you’ll create a file containing only the links you want to disavow, and upload this to Google.
- Download links to your site
- Choose the site you want on the Search Console home page.
- On the Dashboard, click Search Traffic, and then click Links to Your Site.
- Under Who links the most, click More.
- Click Download more sample links. If you click Download latest links, you’ll see dates as well.
Upload a list of links to disavow
- Go to the disavow links tool page.
- Select your website.
- Click Disavow links.
- Click Choose file.
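The file you upload is a plain text file with one URL or domain per line; lines starting with # are treated as comments. The domains below are placeholders:

```text
# Pages we asked the site owner to remove, without success
http://spam.example.com/stuff/comments.html
# Disavow every link from this domain
domain:shadyseo.example
```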