Learn SEO in 99 Links. The Beginner's Guide to…The Beginning

Prolegomenon to the Exordium


“It is customary to preface a work with an explanation of the author’s aim, why he wrote the book, and the relationship in which he believes it to stand to other earlier or contemporary treatises on the same subject.”
(G.W.F. Hegel, Phenomenology of Spirit)



With all the recent discussion of what an SEO is, or what an SEO does or should do, as an inbound marketing/SEO tyro I thought it would be helpful to provide a resource for others who are new to the field of SEO and inbound marketing.

Also, in the wake of recent algorithm changes, there seems to be a renewed emphasis on the fundamentals of search engine optimization. In this context, my guide may also be of use to seasoned SEOs looking to refresh their understanding of the basics, and perhaps to discover a few new ways in which the basics of SEO are currently being used to promote web properties.

This guide is in no way a canonical or comprehensive set of resources for everything you will ever need to know about our field of work. It should, however, provide ample guidance to those who are just now embarking on the wild waters of SEO in 2013.

  Continue reading


How to choose the best keywords for your website or blog

Including the right keywords on your website will boost your search engine ranking and drive potential new customers to your site. Find out how to choose the best keywords for your website.


Do:

  • continually refine and update your keywords
  • experiment with keywords
  • give your keywords time to show results.


Don't:

  • choose single words, such as “insurance”
  • use highly competitive keywords, such as “cheap loans”
  • write content intended only for search-engine ranking.

Continue reading

Google’s Crawler Now Understands JavaScript: What Does This Mean For You?

Last week Google announced a significant change to their crawler's behaviour. The “Googlebot” finally has the ability to interpret JavaScript, the last remaining of the three core constructs used to create and manipulate content on web pages (HTML and CSS being the other two).

In an official post on their Webmaster Central Blog, Google shared that when it first began crawling sites, the Internet was a very different place. Sophisticated JavaScript wasn’t really commonplace; instead, it was used to “make text blink”.

The Internet and the way we use it evolved over time, and JavaScript's unique ability to essentially reach into web pages and programmatically manipulate content became increasingly recognized. For the past decade or so, web pages have become dependent on JavaScript, and although it's best practice to have them degrade gracefully so that users who disable JavaScript can still use the page, to say that that doesn't always happen is an understatement.

Continue reading
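The graceful-degradation practice mentioned above can be sketched in markup. This is a minimal hypothetical example, not code from Google's post: the content is served in the HTML itself, JavaScript only layers an enhancement on top, and the `/reviews/widget` endpoint is an assumed name for illustration.

```html
<!-- Content lives in the HTML, so users (and crawlers) without
     JavaScript still see something meaningful. -->
<div id="reviews">
  <p>Average rating: 4.5 / 5 (from 120 reviews)</p>
</div>

<script>
  // Enhancement layer: if JavaScript runs, swap the static summary
  // for an interactive widget fetched from the server.
  // "/reviews/widget" is a hypothetical endpoint for illustration.
  fetch('/reviews/widget')
    .then(function (res) { return res.text(); })
    .then(function (html) {
      document.getElementById('reviews').innerHTML = html;
    });
</script>

<noscript>
  <p>Enable JavaScript to browse individual reviews.</p>
</noscript>
```

A page built this way was already indexable before last week's change; the announcement matters most for pages that inject all of their content with JavaScript and ship an empty HTML shell.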