Google webmaster tutorial
Let’s start with the Google tools. If you have not already verified your website with Google Search Console (Webmaster Tools), be aware that it can take a few days or even weeks for Googlebot to crawl and understand the site and for that data to appear in the console. So it’s a good idea to get this process started as soon as possible.
Even though connecting Google Analytics and Search Console to a website may or may not directly affect its SEO score, they still provide a wealth of information and handy tools that can be used to improve search engine optimisation.

Here is an example. Let’s say a web design company’s website contains a lot of content about graphic design. Search engines find the words related to graphic design used many times and might show the website in the search results when someone is looking for graphics work. If the web design company offers web graphics but not graphic design services, for instance logo design, they may find they receive lots of enquiries that are not quite relevant. Using Google webmaster tools it’s easy to find a site’s most common keywords and the search queries it is positioned for. In this case the company would add more content related to website design and web graphics, and reduce the content related to graphic design.

Or let’s say the web design company moved from one city to another but had not yet updated their site with the new address. Google Analytics might show that most of the traffic is still coming from the old city, which is probably not ideal. The address would need to be updated so that the search engines can see that the company is offering website design services in its current city.

These are just two basic examples, but you can see how using these tools can improve your search engine visibility to the right target market and audience.
Google Analytics is important for monitoring the traffic that lands on a website. It can be very useful for learning the source of traffic, whether it comes from social media, search engines or other sources; the country a user is located in; the length of time a user spends on the website; and the keyword queries used to find it. It can also show you the browser type and version used to view the website, and much more. Analysing this data really helps to make sure the website’s traffic is coming from the target market that will benefit it the most.
Google’s Search Console (Webmaster Tools)
Google Search Console is a service webmasters use to help index their websites and monitor their health. It provides a lot of valuable information and tools, such as: a list of the most common keywords that make up a website’s content; average positions in Google search, impressions and click-through rates for search queries; broken links still present in Google search that have not been 301 redirected or removed from the index; pages that Google considers duplicate content; the number of pages a website has indexed; country targeting; and many other features that can be used to build a website’s performance on Google and the other top search engines.
Verifying a website with Google Search Console (GSC)
The first step to adding a website to Google Search Console is verifying that you are a webmaster of the site. This can be done by adding a verification code to the header section of the website. From the GSC home page select ‘Add a property’, then enter the URL of the desired website and follow the instructions to verify ownership. After the site is verified, the next step is to submit a sitemap. It’s important to note that once a site is verified and a sitemap has been submitted, it can take from a few days to a few weeks for GSC and the Google algorithms to crawl, understand and display the relevant data for the site.
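One common way to add the verification code is with an HTML meta tag placed in the page header. As a sketch, it looks like the snippet below; the content value is a placeholder, and Search Console shows you the real token during the verification steps.

```html
<!-- Paste inside the <head> of the site's homepage.
     The content value is a placeholder; copy the real token
     from the verification screen in Search Console. -->
<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
```

Once the tag is live on the site, return to Search Console and press the verify button so Googlebot can check for it.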
Submit Sitemap Tool
To submit a sitemap, first make sure the website has one by entering its URL, e.g. www.example.com/sitemap.xml, into a web browser. You should see a page that looks like the example below.
XML sitemap example:
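A minimal sitemap for a small site might look something like this; the URLs and dates are placeholders, and the changefreq and priority tags are optional hints.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed. -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2017-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
    <lastmod>2017-01-01</lastmod>
  </url>
</urlset>
```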
If the website has an XML sitemap in place, look in Google Search Console for the function called ‘Crawl’, where you can find an option called ‘Sitemaps’. Look for the ‘Add/Test Sitemap’ button and enter sitemap.xml after the website address. If the sitemap has a different name, simply enter the correct name and press submit. You can also test a sitemap before submitting. Once you have submitted a sitemap it will be processed by Google shortly thereafter (usually within a few days), and then you can see how many webpages are indexed.
By navigating to Search Traffic > Search Analytics, Search Console provides webmasters with a comprehensive list of the search queries a site ranks for. Within this feature there are many options that can be adjusted to reveal a lot of important ranking data. There are four main options: Clicks, Impressions, CTR and Position. Using the check boxes it is possible to find how many clicks a site received from Google’s search index on an individual day or within a thirty-day period. Selecting the Impressions check box shows how many times a site appeared within the search results. So if a site is positioned on page two of Google for a query and a searcher clicks through to page two and sees the website in the results, that counts as an impression. CTR stands for Click Through Rate: the percentage of impressions that resulted in a click on the listing. CTR is important because a listing that appears often but is rarely clicked suggests the title and description could be made more compelling to searchers. Position shows the average position in search that the site holds for that particular query.
Below the check box options there is another set of radio button options: Queries, Pages, Countries, Devices, Search Type and Dates. These options break down the four check box metrics, clicks, impressions, CTR and position, by different types of data. Selecting Pages shows the metrics for a website’s individual pages. Set it to Countries to find how the four metrics apply to different locations around the world; this is really important for finding which countries a site’s traffic is coming from. Devices shows which device types are used to land on the site, so you can see the clicks, impressions, CTR and positions for desktop, mobile and tablet devices. The last option, Dates, is useful for understanding when a site receives activity. If a site receives more activity on a Sunday, for example, that may be useful information a webmaster can build into a marketing strategy.
GSC also has a handy feature that can help you understand the keyword density of an entire website. Navigate to Google Index > Content Keywords to find a list of the most commonly used keywords. This is an important feature that helps webmasters and SEOs learn how the search engine will perceive a site based on how many times a particular word is present. If a keyword that is not really relevant to the site sits high on this list, it may be easy to reduce the number of times that word appears and to ensure the most important keywords are more prevalent.
Fetch and Render Tool
If you have created new pages on a website you can speed up indexing by using the ‘Fetch as Google’ function, found under Crawl > Fetch as Google. Enter the URL of a webpage and press ‘Fetch and Render’. After Googlebot has finished rendering the page it can be submitted to the index. This is a great way to let Google know you have created a new webpage, and you should find it in Google’s search index a lot faster than if you waited for the spider’s next crawl.
Google Search Console (GSC) makes it easy to specify which country a website is targeting via the ‘International Targeting’ function within the ‘Search Traffic’ menu. Simply choose the country you want to target from the list and set it as the preferred location for displaying the website within the search results. If you’re targeting multiple countries or languages you can use hreflang tags to show search engines that a website or page is relevant to a particular language or country. The hreflang tag allows a website with multiple language and location versions to swap the URL shown in Google’s search results for the appropriate version or page.
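As a sketch, hreflang annotations are link tags placed in the head section of every version of a page. The example below assumes a hypothetical site with a UK English version and a German version; the URLs are placeholders.

```html
<!-- Placed in the <head> of each language/region version of the page.
     Every version should list all versions, including itself. -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/" />
<link rel="alternate" hreflang="de" href="http://www.example.com/de/" />
<!-- x-default tells search engines which version to show
     when no listed language matches the searcher. -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```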
Navigating to Crawl > Crawl Errors reveals another important feature of Search Console that can be used to monitor the health of a site and rectify problems. In this section you can find URL errors for desktop and mobile. So if you have renamed or deleted a URL, this feature will help you identify it and then fix it using some of the options explained below.
GSC will list the URLs within a website that return an error page or ‘URL not found’. The easiest way to avoid these kinds of errors is not to rename or delete any website pages, but that is not always possible. Identifying the broken URLs is the first step; one option is then to set up 301 redirects to a relevant page. You can also request that Google remove or hide a URL from its index: simply navigate to Google Index > Remove URLs and submit the URLs that are creating the crawl errors. Creating a custom 404 page is another important solution, so that visitors see the 404 page instead of a browser error message when a webpage or URL is not found.
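On an Apache server, for example, both a 301 redirect and a custom 404 page can be configured in the site’s .htaccess file. The paths below are placeholders for illustration, and this assumes the host allows .htaccess overrides.

```apache
# Example .htaccess rules (Apache, assuming overrides are enabled).

# Permanently redirect a renamed page to its new URL,
# preserving the old page's value in search.
Redirect 301 /old-services.html /services.html

# Serve a custom 404 page instead of the browser's default message.
ErrorDocument 404 /404.html
```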
There are many myths in the SEO community regarding duplicate content penalties. Even though Google has announced there is no such thing as a “duplicate content penalty”, the effect of having duplicate content can still reduce the amount of traffic a website receives. Google Webmaster Tools now has a function that identifies pages within a website that it considers duplicates. This has made it easier to correct any problems and to learn how to prevent duplicate content issues in the first place.
Google Search Console also provides information about how many of a website’s pages are listed in Google’s search index. Considering that a website with four pages will generally not provide the same level of content and traffic performance as a website with forty pages, it’s handy to see exactly which pages are indexed.