SEO Testing: the Great Debate

Search Geeks Speak
Director's Cut: Testing is a time-honored tradition among SEOs, going back to the days when Infoseek revealed the exact point at which repetition of phrases, keyword stuffing, image spam, and other spamilicious activity would get you penalized, and the penalty was simply that it did not accept the submission! That's a test. Some "testing" is based on correlation and conjecture as to causation, or on an SEO theory that is "verified" by reaching a benchmark or goal taken to confirm the theory.

SEO tools and organizations like Moz can run the big "tests" to determine correlations between rankings and SEO techniques, but in the end, testing "the pack" means the correlation found is determined by the skill of the SEOs and web developers in the pack. If a design trend (like big-ass sliders and images at the top of home pages and blog posts) is a page-quality and/or conversion menace, everyone following that trend is affected the same way, so the data is to some extent skewed by everyone making the same mistake.

Likely the best example is the SEO consensus on title length. The consensus is 60 characters, or X pixels depending on what width Google gives the SERP. Both of those numbers are based on what is "displayed" in the SERP, which IMO is not likely a variable Google engineers would use to determine the relevance of a page. The main reason is that it would require recalibration with each modification to the home page width. Truth be told, it has been 82-85 characters for literally as long as the test has been run: a friend of mine started it and verified his findings using the Google search operator allintitle:.

His test was based on the theory that Google was indexing more than the consensus number. I know it was around the same time SEO Pros was being launched (around 2003). My friend was testing word count; I later determined it was characters (as a programmer, that also seemed the logical way to consume the fewest cycles when deciding what to index). The 82-85 figure was confirmed on Dave Naylor's company blog (look it up on the Google, you might learn something!), and that still seems to be a good number.
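The character-counting side of that test is easy to sketch. Below is a minimal illustration, using only Python's standard library, of extracting a page's title tag and comparing its length against the two thresholds discussed above: the roughly 60-character "displayed in the SERP" consensus and the 82-85 character indexed length. The thresholds and the sample page are illustrative; actually verifying the indexed limit still requires allintitle: queries against Google's live index, not local counting.

```python
from html.parser import HTMLParser

DISPLAY_LIMIT = 60   # common consensus for what the SERP displays
INDEX_LIMIT = 85     # upper end of the 82-85 character range discussed above


class TitleExtractor(HTMLParser):
    """Collect the text of the first <title> element in an HTML page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_data(self, data):
        if self.in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False


def title_report(html: str) -> dict:
    """Return the title, its character count, and threshold checks."""
    parser = TitleExtractor()
    parser.feed(html)
    title = parser.title.strip()
    return {
        "title": title,
        "chars": len(title),
        "likely_truncated_in_serp": len(title) > DISPLAY_LIMIT,
        "beyond_index_limit": len(title) > INDEX_LIMIT,
    }


page = "<html><head><title>SEO Testing: the Great Debate</title></head></html>"
print(title_report(page))
```

In a real audit you would feed this the fetched HTML of each page and flag titles whose counts cross either threshold, then spot-check the long ones with allintitle: to see what Google actually indexed.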

Remember, you are not trying to rank the internet! Each site is unique, so the SEO strategy you develop is a plan, and each step in the plan needs a "test" to determine success. I would suggest that having a "measurable goal" is a "real test" that, once verified, can be repeated on the test site and on others. I would also say that having a few "test domains" for riskier techniques is a must. Every experiment should start with a risk assessment of the SEO technique, and implementation should, IMO, never be initiated until the client is fully aware of the risk in the experiment.

The Hangout Video:

Highlights

Introductions: Bill Slawski blogs on SEO by the Sea and is Director of SEO Research at Go Fish Digital; Eric Enge is General Manager of Perficient Digital; Dr. Pete (Pete Myers) is Marketing Scientist at Moz; and Steve Plunkett is an SEO for hire in Dallas.

Deconstructing Google Algos 3:33: Dave kicked the debate off with a quick description of testing that deconstructs a Google algorithm, and of why and how these tests can be skewed by query space and classification. The panel covered correlation studies and other large-scale tests and studies.
SEO Private Testing 23:50: Dave started this segment with a discussion of testing on a single site, split testing on a page, and annotating all activity on a site as other methods of testing.
SEO Testing 39:45: Dave asked the panel what they have learned about how to do SEO testing and/or what they have learned about SEO from doing SEO testing.