Fuzzy Data, SEO Metrics and The Trouble With Tools


In most aspects of life we undertake important decisions with care, especially when there is money involved, and we often depend on others in the process. But when someone offers simple solutions to particularly difficult and complex problems we are usually a bit skeptical (or should be). Hard problems are not easy to solve. Yet some SEO tool vendors make big business of selling fuzzy data and promising push-button solutions.

Opaque and Undocumented

In many industries (think pharmaceutical or investment management companies) we have come to expect a basic level of transparency from those that offer products and services. Why, then, are users of SEO tools so willing to accept as reliable fact the mystery meat metrics and data served to them by the major tool providers?

Most of the search marketing community would agree that there are many misguided assumptions associated with SEO ‘best practices’. Likewise, the popular SEO tools that are widely used by enterprises and agencies on down to solopreneurs are often misunderstood. This can lead people to misuse these tools, with potentially costly results. Significant time and resources are invested into content creation or link-building campaigns and whole businesses are sometimes launched based on the data and recommendations provided by these vendors’ tools.

To be clear, my focus in this article is the vendors that promote proprietary (and opaque) metrics aiming to put a single, numeric score on a page, domain or keyword. Another problematic area is the accuracy and origin of mission-critical data such as search volume. I am not here referring to the on-page focused correlation analysis tools such as Cora, POP, etc, which emphasize pure data collection and reporting, and provide clear documentation. I am also not including the many rank tracking tools in this article, although much could be said of the accuracy issues in that space as well.

Market Data Matters

My background is in finance, in the institutional trading space. For years I have worked with and around information and data used for critical decision making. In trading, understanding the source, reliability and proper use of a number can mean the difference between making and losing a lot of money.

Investors in the capital markets, institutional and retail alike, depend heavily on financial data and market indicators. And for the listed markets (the stock, options and futures exchanges), the accuracy of market data has historically been consistently high. Consider, too, that fundamental data such as company earnings reports and shares outstanding must, by law, be accurate. Furthermore, the vendors that scrub and redistribute such financial market data actually compete with one another to be the most accurate and the fastest. And it is understandable why: billions of dollars in investment decisions are made each day based on this data, so it has to be correct. With SEO, the costs and revenue that depend on the accuracy of a tool and the data it delivers can be just as impactful.


Marketing Money Matters

The online marketing industry should demand a higher standard of accuracy and transparency. Global digital ad spend in 2019 is projected to be over $129 billion, and the SEO industry is (conservatively) approaching $80 billion in size, in the US alone. Yet industry participants routinely make costly decisions based on the opaque and potentially inaccurate or incomplete data delivered by the various SEO tool vendors (and Google, of course). Moreover, they place full trust in the various mystery metrics these vendors offer without knowing how the numbers are actually calculated. With $570M per day in ad budgets and SEO campaigns on the line, I think it’s worth having a little talk about — or with — the marketing technology (martech) companies.

Got Sketchy Milk?

Unfortunately, nothing compels these martech vendors to disclose the quality and freshness of their data and link indexes, or how their metrics are calculated. Did they crawl and scrape this data themselves or use third party data (or both)? How frequently is it updated? How do their metrics actually perform as predictive measures? We don’t know because these vendors are unwilling to tell us.

Would you buy milk if the carton had no expiration date printed on it? Would you invest in a stock every time a cool new tool’s “profit difficulty” number was under 30? I’m guessing you said “no” to both questions.

So, why do people pursue a certain keyword based on a ‘difficulty score’ a vendor assigned without knowing how that score is calculated? Why do they disqualify a keyword or niche based on inaccurate search volume data or because the competitive SERP analysis (based on a potentially incomplete and/or stale link profile) suggested it would be too hard to take on?
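One way to stop guessing is to back-test a score against your own outcomes. The sketch below is a minimal illustration in plain Python, with entirely made-up numbers and no vendor API: it computes a Spearman rank correlation between a hypothetical vendor's difficulty scores and the best rank each keyword actually achieved. If the score were genuinely predictive, the correlation should come out strongly positive.

```python
# Hypothetical back-test: does a vendor's "keyword difficulty" score
# actually predict how hard it was to rank? All data below is invented.

def ranks(values):
    """Assign 1-based ranks to a list of numbers (ties broken by order)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    result = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        result[i] = rank
    return result

def spearman(x, y):
    """Spearman rank correlation (no tie correction) between two lists."""
    n = len(x)
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Difficulty score the tool reported vs. best rank achieved after 6 months
difficulty = [12, 25, 38, 44, 61, 70, 85]
best_rank  = [3,   5,  9,  4, 22, 31, 48]

rho = spearman(difficulty, best_rank)
print(f"Spearman rho = {rho:.2f}")  # prints "Spearman rho = 0.89"
```

Run across enough of your own keywords, a check like this tells you more about a score's real predictive value than any marketing page will.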

Schrödinger’s SEO Tool Box

Given that none of the vendors openly disclose this information, it should be assumed that the data reflected in a tool’s analysis is incomplete, out of date, or both. You could simply trust that it is accurate, but without greater visibility or validation we can only guess. To be fair, I do believe these tools can be extremely valuable, simplifying complex data collection and aggregation tasks, among other things.

But people tend to take as gospel whatever these vendors offer, and those offerings are only as good as their underlying data. Equally important, the predictive power of their mystery metrics is only as reliable as your own last documented use suggests (because you have nothing else to go by).

Of course, the SEO tool vendors gain little from showing us how the sausage is made. Data collection is not cheap and, in many cases, their data stores may be stale, inaccurate or just underwhelming in coverage. And they declare their special metrics to be “proprietary”, making performance testing and comparisons across vendors more difficult. Yet we, their users, are supposed to use their tools to make important decisions (involving significant investments of time and money).
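Because the metrics are proprietary, the only comparison left to users is empirical: record what a metric predicted and what actually happened. As a minimal illustration (every keyword, score and threshold below is invented for the example), here is a hit-rate check on keywords a hypothetical tool labeled "easy":

```python
# Hypothetical hit-rate check: of the keywords a tool scored as "easy"
# (difficulty under some threshold), how many actually reached page one?
# All names, scores and thresholds are illustrative assumptions.

keywords = [
    # (keyword, vendor difficulty score, best rank achieved)
    ("blue widgets",        18,  6),
    ("widget reviews",      24, 14),
    ("buy widgets online",  29,  8),
    ("cheap widget parts",  41,  3),
    ("widget repair guide", 55, 27),
]

EASY_THRESHOLD = 30   # what the vendor implies is attainable
PAGE_ONE = 10         # rank 10 or better = first page

easy = [(kw, rank) for kw, score, rank in keywords if score < EASY_THRESHOLD]
hits = [kw for kw, rank in easy if rank <= PAGE_ONE]

hit_rate = len(hits) / len(easy)
print(f"{len(hits)}/{len(easy)} 'easy' keywords reached page one "
      f"(hit rate {hit_rate:.0%})")
# prints "2/3 'easy' keywords reached page one (hit rate 67%)"
```

Running the same ledger against two vendors' scores is the closest thing to an apples-to-apples comparison their opacity allows.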

We Need a Hero

Perhaps a forward-thinking vendor out there will take the initiative and change how they do business. The bold martech company that openly shares this important information would surely be rewarded with new clients. In time, that might set the standard, making it commonplace to expect transparency, and even unthinkable for clients and agencies to put their trust and budgets with an SEO tool vendor that isn’t open and fully transparent. Which progressive tool provider will lead the charge?

The company that steps up first should set the bar high and fully disclose what goes into their products. They should openly and regularly declare the size of their link index and the frequency with which it is updated. They should share the primary and third-party sources and freshness of their search volume data, and any adjustments or estimates applied. And they should reveal precisely how their various metrics (page, domain and keyword scores, etc.) are calculated.

Until things change, users of these tools must take more ownership of their own data collection and analysis decisions. Ultimately, a holistic approach will yield the best results. These tools are major efficiency boosters, but scrutiny, common sense and basic research methods should all be used alongside them. Applying more rigor and taking a DIY approach is certainly not as convenient, but it can make a huge difference in revenue outcomes. Your business, or that of your clients, deserves more than blind faith in an off-the-shelf tool.
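As one concrete DIY check: if you already rank in the top few positions for some queries, your own Search Console impressions give a rough reality check on search volume that you can hold vendor estimates against. The sketch below uses purely illustrative numbers and hand-typed dictionaries standing in for a real export:

```python
# DIY sanity check: compare a tool's monthly search-volume estimates
# against impressions from your own Search Console export, for queries
# where you already rank near the top (so impressions roughly track
# real query volume). All queries and numbers here are hypothetical.

vendor_volume = {            # the tool's claimed monthly searches
    "widget sizes": 1900,
    "widget colors": 880,
    "fix a widget": 320,
}
observed_impressions = {     # your own monthly impressions
    "widget sizes": 700,
    "widget colors": 940,
    "fix a widget": 95,
}

for query, est in vendor_volume.items():
    seen = observed_impressions[query]
    ratio = est / seen
    # Flag anything off by more than 2x in either direction
    flag = "  <-- large gap, treat estimate with caution" \
        if ratio > 2 or ratio < 0.5 else ""
    print(f"{query!r}: vendor {est} vs observed {seen} ({ratio:.1f}x){flag}")
```

It is crude (impressions are not searches, and rankings fluctuate), but even a rough check like this surfaces which estimates deserve skepticism before budget is committed.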

–J Paul