The paradox of The LongTail for keyword and niche research is that it is a statement that apparently contradicts itself, yet might be true and wrong at the same time, depending on how it is applied.

In my opinion, the underlying principle of The LongTail is generally misused in Internet Marketing.

Simply creating or finding keyword phrases with more words in them is not what was originally meant by The LongTail.

The basic idea was about how the virtual world, by its very nature, created almost endless possibilities for undiscovered profit compared to brick and mortar businesses. Chris Anderson, who is credited with The LongTail, brought to light the concept that online businesses like Amazon could make big money selling small quantities of hard to find items.

At the time this was in direct opposition to the mainstream practices of how items were sold. Due to the physical limitations of distribution and storage, businesses were limited to selling hits, the best of the best. The sheer economics of physical scale gave hits a firm grip on what made it into inventory and what was rejected.

In Anderson's words: "The theory of the Long Tail is that our culture and economy is increasingly shifting away from a focus on a relatively small number of 'hits' (mainstream products and markets) at the head of the demand curve and toward a huge number of niches in the tail."

 

[Image: The Long Tail demand curve, with hits at the head and niches in the tail]

As a result, decisions about hits were left in the hands of a few decision makers, and a much larger, lucrative market was largely ignored due to the physical constraints under which our culture has conducted business since the beginning of time.

What’s this got to do with Internet Marketing’s abuse of The LongTail?

Internet Marketing advocates claim that single or short keyword phrases are highly competitive in their world for generating traffic from the search engines. So if you find and use many keyword phrases of two or more words, the competition for each of them should be reduced. This would make them easier to rank for, and in aggregate they would eventually outweigh the traffic a single keyword would generate.

The concept is very sound and makes common sense in its usage.

Here is where they get The LongTail concept right, and where they go wrong.

This is where IMers get it right. Single keywords are far too broad to attract the desired traffic, and it is insane to even try ranking for them. Considering the resources and authority required in today's world, after all the algorithmic changes the search engines have made, it is foolish to even attempt it.

By targeting The LongTail, the visitors who arrive at your content have already decided what interests them; all you can hope for is that you and the visitor match up. What attracted that visitor is that it takes more than one word to define a topic and match traffic to content, as opposed to relying on a single keyword for an entire niche.

Here is where many Internet Marketers make their fatal mistake with The LongTail in their quest for traffic to their content. The mistake is assuming that because a phrase has multiple words it is intrinsically less competitive. This is far from the truth. In reality, a phrase's competitiveness comes from the interest in its usage; that is the nature of its statistics in the real world.

The length of a keyword phrase does not equate to low competition in the search engine rankings. I participate in multiple forums trying to help individuals seeking the truth, and the first thing I always run across is some variation of "when I searched in Google Keyword Planner I found keywords with low search counts…". The story is always the same.

A low search count does not mean low competition. It simply means that as the keyword phrase gets longer, fewer and fewer people are interested in something that specific and narrowly defined.

The search count itself does not define competition; assuming it does is like trying to fit a square peg into a round hole. A phrase searched fifty times a month can still have a page one packed wall to wall with authority sites, while a phrase searched five hundred times a month may not.

The culprit is Overture, many years ago when the web was still being defined. Overture provided search volume numbers derived from its advertising data; combine that with wishful thinkers trying to garner the benefits of the stated visitor volume for any particular keyword. The practice continues to this day and misleads the misinformed.

Today Google Keyword Planner has replaced Overture as the source of search counts. Google has done nothing to correct the flawed thinking of the masses. Why should they? Their goal is to make money with Pay Per Click and to gather data on where people focus their content efforts for the search engine listings.

Google gathers data everywhere on its real estate, especially from Google Keyword Planner. Keyword Planner usage is monitored closely, and what Google does with that data is beyond the scope of this writing. But be assured you are being watched very closely, exposing your hand on how you are about to conquer their rankings.

So how do we go about taking advantage of The LongTail?

In short, it is all about sifting through massive amounts of data, connecting the dots, and investigating particular patterns of human behavior in a niche, with the primary goal of drawing targeted traffic from the search engines.

A mouthful in itself, and the task at hand is an even bigger one.
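To make "sifting and connecting the dots" concrete, here is a minimal sketch of the kind of analysis I mean. It assumes you have already collected page-one results for a batch of long-tail phrases into a CSV file; the file name and the keyword/url columns are my own illustration, not the output of any particular tool. It simply tallies which domains keep reappearing across a niche, which is exactly the authority pattern discussed below.

```python
# Minimal sketch: tally which domains dominate page one across a niche.
# Assumes a CSV you collected yourself with columns: keyword, url.
# The file name and columns are illustrative, not from a specific tool.
import csv
from collections import Counter
from urllib.parse import urlparse

domain_counts = Counter()
keywords = set()

with open("serp_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        keywords.add(row["keyword"])
        # Reduce each result URL to its bare domain.
        domain = urlparse(row["url"]).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        domain_counts[domain] += 1

print(f"{len(keywords)} keywords analyzed")
for domain, hits in domain_counts.most_common(10):
    # A domain that shows up on page one for many phrases in the
    # same niche is the authority pattern to connect the dots on.
    print(f"{domain}: appears in page-one results {hits} times")
```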

I am a trained Engineer and was an early evaluator of browsers, starting with Mosaic (which later became Netscape), in a think tank I worked in during the early days. Taking that accumulated knowledge and weeding out every possible combination that has ever been suggested for ranking in the search engines, I have pretty much solved how to increase your chances of working The LongTail to your advantage without blindly using Google Keyword Planner.

Notice I said increase your chances. It is not the end-all, but it is the best we have until new metrics become available.

In the last three-plus years the primary search engine, Google, has made some significant changes to its algorithms. Since Google is the 800 pound gorilla, holding over 70% of the market for search engine traffic, I will demonstrate with case studies using it as the focus.

The significant change in Google's Search Engine Result Pages (SERPs) is that they now primarily reflect the authority of a domain. Originally the algorithms focused on PageRank; they still do, but not to the extent they did in the past.

Examine any listing in the SERPs and you will see why.

To demonstrate this, let's look at a few small samples to illustrate what I mean.

Medical advice, medical problems, etc. At one time anyone could rank in this niche; now it is dominated by groups of doctors such as WebMD, hospitals, and qualified professionals. For example, there was a popular info product about embarrassing problems that was all the rage of the internet, and those problems were easy to rank for in the SERPs back in the day. Today most of those problems that require medical attention, squarely part of The LongTail, have as much competition as a single keyword. With Google only ranking websites with authority, good luck trying to be visible on page one.

Finance. Search for financial advice and the ranking sites are dominated by big financial institutions. More authority sites.

Cooking. Even this niche is dominated by large authority concerns. Large TV networks, magazines, and prominent individuals such as Martha Stewart are on page one. You don't stand a chance unless you carry the authority.

Brand names. Affiliate marketers had their day in the past; now it is Amazon, eBay, manufacturers, and the list goes on. The common denominator is authority. Without it you don't stand a chance.

This has been the trend lately, with no sign of relief unless you know where to look. You can bet it is not Google Keyword Planner; Google has no interest in showing you where. No wonder so many fail, following old, outdated how-to information. To make things worse, it is still being spread as truth.

The masses still don't get it. Frustrated, most are looking for fertile ground elsewhere. The hot thing now is social media, where the barrier to entry for traffic is much lower. Even there the easy profits are slowly being strangled, especially on Facebook.

But this is not where I am going. I am still focused on ranking in Google with The LongTail.

The smoking gun is the SERPs! They hold all the proof, not Google Keyword Planner. It still puzzles me how so much SEO, keyword, and niche software out there still uses Google Keyword Planner as its core metric.

I guess the software would not be so sexy if it didn't have search counts. "Oh my, if I don't know how many people to expect, is anyone even looking for that keyword, will I be wasting my time…" Every possible paranoid excuse to cling to something that is also the Achilles' heel of their problem of not succeeding in the SERPs.

Being a software developer, I have specialized software that crawls the internet and returns copious amounts of data to be logged and analyzed in every detail, with surprising results.
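My production software is proprietary, but the skeleton of any crawl-and-log pass looks roughly like the sketch below. The seed URLs and the SQLite schema are illustrative assumptions, and requests with BeautifulSoup stands in for whatever fetching stack you prefer; this is not my actual crawler.

```python
# Rough skeleton of a crawl-and-log pass, not production software.
# Assumes the third-party packages: requests, beautifulsoup4.
import sqlite3
import requests
from bs4 import BeautifulSoup

# Illustrative seed pages; in practice this would be a long work queue.
SEEDS = ["https://example.com/page-one", "https://example.com/page-two"]

db = sqlite3.connect("crawl_log.db")
db.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT, title TEXT, links INTEGER)")

for url in SEEDS:
    try:
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
    except requests.RequestException:
        continue  # skip failures so a long crawl stays alive
    soup = BeautifulSoup(resp.text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    links = len(soup.find_all("a", href=True))
    # Log every page for later analysis instead of deciding on the fly.
    db.execute("INSERT INTO pages VALUES (?, ?, ?)", (url, title, links))

db.commit()
db.close()
```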

It was in those days, when Google was transitioning its algorithms from PageRank to authority, that I took a hard look for a common denominator. In the past, when Google made changes, things stayed easy and consistent, since PageRank was the primary ranking factor. But almost overnight things stopped working.

Just about every SEO professional I was associated with was experiencing the same problems. Almost overnight many got out of the SEO business because you really could not SEO your way to the top at will anymore.

The problem with all of this is: how do you define authority? Google isn't going to help you there. It gave you the PageRank toolbar at one time, and both sides benefited. Individuals used it as a crowbar to creatively pry themselves into the top 10 positions, and Google benefited from the intelligence it gathered on its usage. A win-win situation, rare, especially with Google.

Now that's gone. It was the only numerical metric we could easily take and apply to SEO.

When you look at authority sites in the SERPs, most are easy to spot, some not so much. What is missing is something, anything, that assigns a numerical value so you can easily filter authority in the SERPs.

Many who read this are familiar with Moz and its Domain Authority metric. It is all we have that even comes close to what is needed, absent Google producing a metric like the PageRank toolbar. So, if we assume for the sake of discussion that Moz's numbers are approximately correct, we can conclude that we can filter the SERPs at will.
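As a sketch of what "filtering the SERPs at will" could look like in practice: take the page-one domains for a candidate phrase and compare their average Domain Authority against a cutoff. The DA_SCORES lookup below is a hypothetical stand-in for a real Domain Authority source (Moz offers one through its paid API), and the threshold of 40 is my own illustrative choice, not an official number.

```python
# Sketch: judge a candidate phrase by the authority on its page one,
# not by its search count. DA_SCORES is a hypothetical stand-in for a
# real Domain Authority lookup; the 40.0 threshold is illustrative.
DA_SCORES = {
    "webmd.com": 94, "mayoclinic.org": 92, "healthline.com": 90,
    "smallblog.example": 18, "nicheforum.example": 24,
}

def average_da(page_one_domains):
    scores = [DA_SCORES.get(d, 0) for d in page_one_domains]
    return sum(scores) / len(scores) if scores else 0.0

def is_opportunity(page_one_domains, threshold=40.0):
    # Low average authority on page one means the door is still open.
    return average_da(page_one_domains) < threshold

crowded = ["webmd.com", "mayoclinic.org", "healthline.com"]
open_door = ["smallblog.example", "nicheforum.example"]

print(is_opportunity(crowded))    # False: authority has it locked up
print(is_opportunity(open_door))  # True: a workable LongTail target
```

The point is the shape of the test: the decision is driven by who is already on page one, not by how many people typed the phrase into a search box.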

This post took much longer than anticipated and was a long time in coming.

If you continue working with obsolete methods while hoping to be seen on page one of Google, good luck.

I recently had a conversation with a misguided individual who was frustrated and ready to give up. When he read my latest offer, he realized he hadn't had an a-ha moment but a stupid moment: he had been using obviously obsolete methods and praying for results. Our conversation was a reality check that he had been clinging to futile methods.

Don’t let this be you!

In posts to come I will be exposing a little-discussed topic and an industry coming on strong. Based on the evidence I am seeing, I predict that in about five years the little guy is going to have a real difficult time maintaining a presence in Google. So far the evidence has borne out the predictions of many.

Don’t miss out, keep reading and share.

Ed Chiasson

SlickTimeSavers.com