One of the greatest challenges many SEO specialists must face is the Wall of Pseudo-knowledge that dominates SEO thinking. It doesn’t matter who you think is an expert in SEO. We’re all prone to believing something that isn’t true. Pseudo-knowledge sounds true, and it rarely causes enough immediate harm in search optimization to expose itself as false.
An example of common pseudo-knowledge you’ll find in online discussions about search engine optimization is the claim that “my competitor’s site gets more traffic than mine.” I see people make this assertion every week. And they’re not merely assuming the competitor in question earns more traffic than they do – some SEO tool is telling them it’s so.
SEO tools provide traffic and ranking estimates, but they’re not accurate. They’ll probably never be accurate. The most accurate traffic and ranking data you’ll ever have access to is the data provided by the search engines – and that data isn’t 100% accurate, either. But it’s more accurate and reliable than anything you’ll get from a 3rd-party SEO tool.
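If you want to see what the search engines themselves report, that data is available for your own verified properties. Below is a minimal sketch of pulling click and impression counts from Google's Search Console API; the property URL, date range, and key-file path ("gsc-key.json") are assumptions you would replace with your own. Notably, this only works for sites you control, which is exactly why nobody outside your organization truly knows your traffic.

```python
# Minimal sketch: pull your own query data from Google's Search Console API.
# Assumes a service account with access to the verified property and a key
# file at the hypothetical path "gsc-key.json".
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
SITE_URL = "https://example.com/"  # your verified property, not a competitor's

creds = service_account.Credentials.from_service_account_file(
    "gsc-key.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

report = service.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query"],
        "rowLimit": 25,
    },
).execute()

for row in report.get("rows", []):
    query = row["keys"][0]
    print(f"{query}: {row['clicks']} clicks, {row['impressions']} impressions")
```

Even these numbers come with sampling and privacy filtering, but they are still closer to reality than any third-party estimate.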
So when people say their competitors are outranking them and getting more search referral traffic, I almost always tell them “unless you have access to your competitor’s analytics data, you don’t know how much traffic they receive”.
Some people act like I’ve slapped them in the face. It never occurred to them that these SEO tools might be estimating traffic and reporting only limited rankings. It takes a while even for experienced SEO specialists to unlearn pseudo-knowledge like this. And none of us is immune to believing things that sound true, aren’t true, but don’t really cause us direct harm. We’ve all believed pseudo-knowledge at some point(s) in our SEO careers.
Here are a few lessons most Web marketers must eventually unlearn in order to advance in their SEO knowledge and skills.
1 – SEO Tools Make Good Decisions
Every SEO tool was created by someone who wanted to help people improve their optimization. But every successful tool eventually begins adding bells and whistles: pointless, work-creating features that either contribute nothing to real optimization or actively hurt it.
A good SEO tool forces you to make a decision about whether to use any given feature. There are almost always niche cases where certain features make sense. But none of these features is universally useful. Even the generation of XML sitemap files isn’t universally useful.
One of the most common mistakes I see WordPress-using SEO specialists make is turning on “noindex” directives for all Category archives. Why do they do this? “Because so-and-so said I should.”
The fact that an SEO tool allows you to do this – or that one of its employees says it’s a good idea in some specific context – is insufficient reason to tell search engines not to index your category archives. You need to understand the role those archives play in helping search engines understand how the blog is set up.
You might have some duplicate category archives. In that case, if you can’t reorganize the content, then noindexing the duplicates might be a worthy SEO decision. Usually I tell people I’d rather redirect them, but even that’s not good universal advice: someone maintaining the site years later, when those archives are needed again, might never notice the redirects are there.
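Before noindexing or redirecting any category archives, it helps to take stock of what those URLs currently return, so nobody is surprised by a forgotten directive later. Here is a rough audit sketch, assuming the `requests` library and a purely hypothetical list of archive URLs; it reports status codes, redirect targets, and obvious noindex signals.

```python
# A rough audit sketch (not a recommendation to noindex anything): check what
# each category archive URL currently returns before deciding to noindex it,
# redirect it, or leave it alone. The URL list is purely hypothetical.
import requests

archive_urls = [
    "https://example.com/category/news/",
    "https://example.com/category/reviews/",
]

for url in archive_urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    robots_header = resp.headers.get("X-Robots-Tag", "")
    # Crude check: "noindex" appearing anywhere in the returned HTML.
    meta_noindex = "noindex" in resp.text.lower()
    if resp.is_redirect:
        detail = f"redirects to {resp.headers.get('Location')}"
    elif "noindex" in robots_header.lower() or meta_noindex:
        detail = "noindex signal found"
    else:
        detail = "indexable, no redirect"
    print(f"{resp.status_code}  {url}  ({detail})")
```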
SEO tools are not there to make decisions for you. Yes, there are some people who claim their tools can do all the work. I’ve never found an SEO tool I would trust to that extent. And I’ve tested many SEO tools.
SEO tools make default decisions, when they make decisions at all. It’s always a bad decision to let the SEO tool make any decision for you.
2 – Page Titles Should Be A Certain Length
I occasionally go along with people who cite a specific recommended length for page titles. Not always.
Whether someone says your page titles should be 140 characters or 160 characters long doesn’t matter. What matters is whether you understand what the purpose of the title is. If you’re teaching someone about SEO then suggesting they limit the length of their titles is probably a good idea. People need to learn some boundaries before they begin experimenting with crossing those boundaries.
When someone recommends keeping titles to 160 characters or less while educating someone who is still learning SEO basics, I almost always remain silent.
However, if someone says they’ve been optimizing sites for more than a year, I might encourage them to experiment with longer (or shorter) titles – especially if they’re frustrated with whatever they see in the search results.
Any 1 page can appear in multiple search results with different titles. The search engines may truncate what you put there, or they may rewrite it completely. Nothing is guaranteed in SEO. If you already know that, and you’re upset that you don’t see the same title every time you run some query, then you’re ready to experiment with your titles.
Change something. Don’t be afraid to experiment. Yes, you might lose some search referral traffic. But if the SEO formula you learned to follow isn’t satisfying you, it’s time to try something else.
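If you do decide to experiment, start by measuring what you already have. This is a small standard-library sketch that pulls each page's `<title>` tag and reports its length; the URL list is hypothetical, and the regex is deliberately crude (a real crawl would use a proper HTML parser).

```python
# Take stock of existing titles before experimenting with length.
# Standard library only; the URL list is hypothetical.
import re
import urllib.request

pages = [
    "https://example.com/",
    "https://example.com/blog/some-post/",
]

for url in pages:
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else "(no title tag found)"
    print(f"{len(title):>4} chars  {url}  {title!r}")
```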
3 – SEO Tools Show You Accurate Backlink Profiles
This pseudo-knowledge usually runs alongside the pseudo-knowledge that SEO tools can report your rankings and traffic to you.
A good SEO backlink crawler finds a lot of links. What these tools don’t do is tell you which links the search engines keep in their indexes and allow to help (or hurt) your site. No SEO tool will ever be able to do that (unless the search engines give them access to their backlink data – and they won’t).
SEO specialists learn to use 1 or more backlink research tools for various reasons – most of them bad reasons, in my experience. But if you’re trying to figure out what it’s going to take to do a little better (or maybe a lot better) in the search results, you’ll find yourself doing some backlink research.
The first thing every good SEO specialist should do when they start working on a new site (either as a consulting specialist or as an employee) is to audit the site. And that audit should include backlink research. I’ve had clients (and prospects) say to me, “Every agency I approach wants to do an audit! I’ve paid for 3 audits in the past year!”
I understand their frustration – but it’s unrealistic to expect someone who has never worked on your site to start making recommendations without first doing an audit. And, frankly, customers often hold back information (usually because they think it’s not important) that a good, deep site and backlink audit reveals.
And all that is to say that, yes, we need to use backlink research tools – we need to use them a lot. But that doesn’t mean these tools are finding and reporting all the links. And they’ll never be able to tell us what the search engines think of those links. Backlink analysis begins with the SEO tool – but it ends with your own additional research, intuition, and analysis.
Always assume the SEO tool you’re using is only showing you some of a site’s backlinks – and that its assessments of those links are nothing more than opinions. They may be darned good opinions, but take all tool opinions with a huge grain of salt until they are confirmed by other means.
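One practical way to act on that assumption is to compare exports from more than one backlink source and see how little they overlap. The sketch below diffs two hypothetical CSV exports; the file names and the `source_url` column are assumptions to adjust for whatever your tools actually produce.

```python
# Compare linking URLs reported by two different backlink tools.
# File names and the "source_url" column name are hypothetical.
import csv

def linking_urls(path, column="source_url"):
    with open(path, newline="", encoding="utf-8") as handle:
        return {row[column].strip().lower() for row in csv.DictReader(handle)}

tool_a = linking_urls("tool_a_backlinks.csv")
tool_b = linking_urls("tool_b_backlinks.csv")

print(f"Seen by both tools: {len(tool_a & tool_b)}")
print(f"Only in tool A:     {len(tool_a - tool_b)}")
print(f"Only in tool B:     {len(tool_b - tool_a)}")
# Even the union is only a partial view; what the search engines actually
# index (and think of those links) is not available to any tool.
```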
4 – Google’s Quality Rater Guidelines Teach You How to Optimize for Search
One of the most damaging assumptions in the world of SEO pseudo-knowledge is the idea that you can read the Quality Rater Guidelines and learn how to rank a site better in Google’s search results. Anyone who tries to “improve E-A-T” as a ranking strategy doesn’t know what they are doing. Everyone who says they improved rankings that way benefited from sheer, dumb luck and nothing else.
Google’s Quality Rater Guidelines do not explain:
- How the algorithms work
- How things are scored by the algorithms
- Why any given page outranks any other page
The QRG’s only purpose is to explain to the human contractors who review example search results (provided to them by Google) how they should rate those examples. If you want to be a Google quality rater, read the QRG and memorize it.
If you want to understand why Google ranked page Alpha above page Beta, get a job with one of Google’s search engineering teams so you can get access to their magic tool that breaks down all the rankings in the search results.
But former Googler Matt Cutts – who ran the search engine’s Web Spam Team for over a decade, spoke at many conferences, and offered many tips to Website owners and SEO specialists on improving site optimization – is credited with being the first person to leak the Quality Rater Guidelines and, later, to openly recommend that people read them.
So that must mean there is some value in learning what those guidelines say, right?
But if it’s not how to rank better in Google, then what can you learn from the QRG?
Simple: the QRG teaches you how to create the kinds of Websites (and content) that Google hopes to one day reward in its search results. In many cases, creating what the QRG defines as “good” or “high quality” content will get you closer to where you want to be. But you won’t know why, other than that your interests and Google’s interests have aligned.
Since there is no Google E-A-T algorithm or E-A-T signal, you cannot “improve E-A-T” no matter how badly you want to. And every article I’ve read up until now that purports to explain what E-A-T is and how you can improve it fails to mention most of what Google explains about E-A-T (expertise, authoritativeness, and trustworthiness). I’m not joking. These E-A-T experts deliberately do NOT quote the majority of Google’s definition of E-A-T.
5 – You Can or Should Disavow “Toxic” Links
This bit of pseudo-knowledge grew out of the world of SEO tools, audits, and penalty recoveries. I was writing about “toxic links” before most people even had an idea of what the phrase might mean. I only ever used “toxic links” to refer to the links that search engines penalize sites for. I still prefer that definition today.
Unfortunately, some clever tool vendor somewhere decided that warning people against “toxic links” would give them a competitive advantage. And it didn’t take long for many SEO tools to copy these apocalyptic prophecies of doom – or to begin recommending that people disavow whichever links they designate as toxic.
Today’s toxic link warnings are nothing more than a witch hunt. These SEO tools don’t know which links are bad or good. At best they can only provide you with an opinion on which links may lead to a penalty. And in my experience, they’re all doing very badly in forming those opinions.
Technically, the search engines only care about the links you buy, the links you place for your own SEO benefit. You know, like all those guest posting links you create. The links you rely on to improve your SEO. Those are the only toxic links in a search engine’s algorithmic opinion.
They don’t care about the automated junk links that fill up your backlink profiles. Disavowing those links is more likely to hurt your SEO than help it. Yes, I know “some people” claim it helps – but search rankings and referrals change for many reasons. You could stand on a corner snapping your fingers for a week and your SEO would still randomly improve or get worse.
Link disavowals are largely unnecessary. They were helpful from 2012 (when Google gave us the ability to disavow links) to 2016 (when Google made it far less necessary to disavow them).
If you’re really worried about “toxic links”, then stop using link schemes to improve your SEO. Problem solved.
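If, after all of that, you still have scheme links you placed yourself and genuinely want to disavow, the mechanics are trivial: Google accepts a plain text file of full URLs and `domain:` lines, with `#` comment lines. This sketch writes one such file; every entry in it is hypothetical.

```python
# Write a disavow file in the plain-text format Google accepts:
# "#" comments, full URLs, or "domain:" lines. All entries are hypothetical.
bad_domains = ["spammy-guest-posts.example", "paid-links.example"]
bad_urls = ["https://link-scheme.example/page-with-my-link"]

lines = ["# Links I placed for SEO benefit and now want Google to ignore"]
lines += [f"domain:{domain}" for domain in bad_domains]
lines += bad_urls

with open("disavow.txt", "w", encoding="utf-8") as handle:
    handle.write("\n".join(lines) + "\n")
```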
Conclusion
Some SEO pseudo-knowledge is harmful. But it might be more accurate to say that these false ideas are double-edged swords: they can cut both ways, and often do (when they cut at all). More often, though, I find these assumptions are rather blunt swords, and acting on pseudo-knowledge is usually just a waste of time (and money).
People in the SEO industry refuse to adopt real standards. Until that attitude changes, you should expect the majority of what you read on the Web and hear at conferences to be of limited value at best. Standards would protect everyone from pseudo-knowledge, dangerous beliefs, and mischief. The lack of standards means you’re more likely to believe things you shouldn’t, pay for things you shouldn’t, and do things you shouldn’t.