Some SEO Pointers to Starting a Website

When it comes to starting a website, it is commonly thought that all you have to do is create a site with a program such as Blogger or WordPress, start writing articles, and those articles will start to appear on search engines such as Google, Yahoo and Bing. Unfortunately, creating a website is not as simple as that. There are a few things you will need to take into consideration and act on to make sure your website is healthy in terms of SEO, so that search engines can digest its content as easily as possible.

#1 Choose a Smart Domain

The first step comes even before you have created the site: SEO starts with your domain. A good domain name can do wonders for your site (just like PPC.org), whereas a bad domain will put your website at a severe, permanent disadvantage. For example, if you were to create a website about flowers, a domain name such as ‘flowers.com’ would most likely do much better than one such as ‘mothernature.com’. Both are decent domain names, but ‘flowers.com’ is shorter and includes a crucial keyword that can be associated more closely with the content.

#2 Use Google Search Console

Google has its own tool, Google Search Console, which enables website owners to better manage the SEO of their site with Google. It helps with things such as:

  • Submitting the sitemap.xml of the website to Google.
  • Choosing a preference between http://____ and http://www.____ (the preferred domain).
  • Geo-targeting.

The list goes on and on. It is a necessity to set up a Google Search Console account for every website you own: not only will it help Google index your website for listing in organic search results, it will also make you aware of any problems Google may face with your website, such as crawl errors, and knowing about a problem lets you fix it!
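
Taking the first point in the list above, a sitemap.xml is simply a structured list of the URLs you want Google to index. As a rough, minimal sketch (the URL and date below are placeholders rather than real entries), it looks something like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/my-first-post/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>

Most blogging platforms and SEO plugins can generate this file for you automatically; once it exists, you submit its address through Search Console so Google knows where to find it.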

#3 Understand Your robots.txt

The robots.txt of any website can be found by putting /robots.txt at the end of the homepage URL. For example, doing this for PPC.org will show you our robots.txt:

User-agent: *

Crawl-delay: 10

Yes, it is pretty boring and empty. But, at the same time, it is an optimized robots.txt. At its most basic, the robots.txt is a file that tells search engines which pages they may process and crawl. The Crawl-delay tells search engines how long to wait between each crawl so that the crawling does not overload the website's server (10 is a healthy number for this). What you may find in your own robots.txt is that there are ‘Allow’ and ‘Disallow’ directives underneath the crawl delay. As the names suggest, ‘Allow’ lets the search engine process and crawl that page whereas ‘Disallow’ does the exact opposite.
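
As a hypothetical illustration (the paths here are made up rather than taken from any real site), a robots.txt that uses both directives might look like this:

User-agent: *
Crawl-delay: 10
Disallow: /private/
Allow: /private/public-page.html

Here everything under /private/ is blocked from crawling except the one page explicitly allowed, while anything not mentioned at all remains crawlable by default.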

For this reason, make sure there are no ‘Disallow’ rules in your robots.txt that you do not want there. A great example happened to me a few years ago when I downloaded an ‘iffy’ SEO plugin for a WordPress website. Everything was going great until the plugin decided to disallow post content from being crawled by search engines. Because of this, I had to change the robots.txt to allow search engines to crawl the content again. Try to make sure your websites don’t make the same mistake!

Will created Ask Will Online back in 2010 to help students revise and bloggers make money, developing himself into an expert in PPC, blogging, SEO, and online marketing. He now runs other websites such as Poem Analysis, Book Analysis, and Ocean Info. You can follow him @willGreeny.
