SEO Overview - Analytics, Metadata, Robots

SEO (Search Engine Optimisation) is the practice of improving the visibility of a web page in search engines via natural, or unpaid, search results.

The higher your website ranks on the search results page, the more visitors you are likely to receive.

SEO provides long-term results, which are more favourable and more cost-effective than the short-term gains of a pay-per-click campaign. It can increase your volume of sales, your reputation within your industry or niche community, and your overall profitability.

On each information and product page there is a section for SEO. It contains the following fields:

  • SEO Page Title: A title tag is the main text that describes an online document. It is the second most important on-page SEO element (the most important being the overall content), and appears in three key places: browsers, search engine results pages, and external websites. A maximum of 70 characters will display in the search results; search engines show an ellipsis (…) to indicate that a title tag has been cut off.

  • SEO Meta Keywords: Meta keywords are ignored by all major search engines. We do not recommend populating this field.

  • SEO Meta Description: Meta description tags, while not important to search engine rankings, are extremely important in gaining user click-through from SERPs (Search Engine Result Pages). Meta descriptions can be any length, but search engines generally truncate snippets longer than 160 characters. It is best to keep meta descriptions between 150 and 160 characters.

  • SEO Page Heading: This field overrides the page's <h1> tag, which defaults to the product or page name.

  • SEO Canonical URL: If you have multiple pages with the same content, use this field to tell search engines which URL is the preferred version to index. A sketch of how these fields render into page markup follows this list.

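When populated, these fields render as standard HTML tags in the page's markup. The following is a minimal sketch of how a product page might look once the fields are filled in; all names, values and URLs are placeholders, not actual Neto output:

    <head>
      <!-- SEO Page Title: shown in browser tabs and search results -->
      <title>Blue Widget - Example Store</title>
      <!-- SEO Meta Description: the snippet shown beneath the title in SERPs -->
      <meta name="description" content="Buy the Blue Widget online. Free shipping on orders over $50.">
      <!-- SEO Canonical URL: the preferred URL to index for duplicate content -->
      <link rel="canonical" href="https://www.example.com/blue-widget/">
    </head>
    <body>
      <!-- SEO Page Heading: overrides the default <h1> (the product/page name) -->
      <h1>Blue Widget</h1>
    </body>
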
Another way to optimise SEO is to customise the URL created for a page. Simply un-tick the Automatic URL box and enter your own custom URL. Keep in mind that the more human-readable the URL, the better; most, if not all, of the automatically generated URLs within Neto already follow this convention.
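
For example, a custom URL such as (hypothetical addresses):

    https://www.example.com/mens-leather-wallets/

is far easier for a human, and for a search engine, to interpret than an opaque one like:

    https://www.example.com/index.php?page=view&id=83671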

Robots Exclusion Protocol (robots.txt)

The Robots Exclusion Protocol (REP), implemented through a robots.txt file, allows the webmaster to specify pages they don't want crawlers to access, giving them a level of control over how their site is indexed. It is useful for blocking pages that contain a lot of duplicate content, which can negatively impact ranking results.

You can edit your robots.txt file via SFTP; it is located at /httpdocs/robots.txt relative to the root directory.
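
A minimal robots.txt sketch is shown below; the disallowed paths are hypothetical examples, so substitute the duplicate-content or low-value pages on your own site:

    # These rules apply to all crawlers
    User-agent: *
    # Hypothetical examples of pages not worth indexing
    Disallow: /cart/
    Disallow: /checkout/
    # Optional: point crawlers at your XML sitemap (placeholder URL)
    Sitemap: https://www.example.com/sitemap.xml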

Google Search Console

Google Search Console (previously Google Webmaster Tools) is a web service by Google for webmasters. It allows webmasters to check the indexing status of their websites and optimise their visibility.

In order to be verified by Google to use this service, you'll need to place a verification file in the root directory of your domain. To do this, simply apply for SFTP access and upload the verification file to the /httpdocs/ directory.
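
For example, if Google issues you an HTML verification file (the filename below is a made-up placeholder), uploading it to /httpdocs/ means it sits at:

    /httpdocs/google1234567890abcdef.html

and is therefore reachable at https://www.yourdomain.com/google1234567890abcdef.html, where Google can confirm ownership.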

Google Analytics

Google Analytics can track visitors from all referrers, including search engines, social networks, direct visits, and referring sites. It also tracks display advertising, pay-per-click networks, email marketing and digital collateral such as links within PDF documents.

Simply install the Google Analytics addon from the Neto addon store and register your Google Analytics account.
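
For reference, integrations like this are built on Google's standard tracking snippet; at the time of writing, the analytics.js snippet looked like the one below. The property ID UA-XXXXX-Y is a placeholder for your own, and the addon is expected to take care of this for you, so you should not need to add it manually:

    <!-- Google Analytics (analytics.js) tracking snippet -->
    <script>
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

    ga('create', 'UA-XXXXX-Y', 'auto');  // create a tracker for your web property
    ga('send', 'pageview');              // record a pageview for the current page
    </script>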

  • Last Modified: 11/10/2016 | Neto Version: 6.5