Someone once left a comment on an SEO Theory article — one of my longer ones — complaining that it wasn’t scannable. He criticized my writing style, suggesting that people would miss out on most of my points because the articles were not easy to scan.
I have seldom tried to make an article easy to scan for time-sensitive readers, especially a theoretical article. While it is true that I can be very wordy, I find that when I take the time to edit my copy and condense it, I pack even more information into it, and my writing sometimes becomes a little choppy. This is one of many reasons why you don’t want to be your own editor.
Andy Beal recently invited me to contribute articles to Marketing Pilgrim, which is one of fewer than 10 Websites that I have read every day for many, many years. It’s that good, in my opinion. I was honored to receive the invitation. But I don’t have much time to take on new writing commitments because, frankly, I stay pretty busy. And when I am not busy I need time to unwind from all the writing and optimizing.
Still, I managed to turn out this article on how to block the SEMalt spam network from leaving “referrer spam” in your analytics. The first draft was a bit rough and quite long, and Andy asked me to make a few changes. Although he did not ask me to shorten it, he mentioned it was longer than is normally published on Marketing Pilgrim.
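For readers curious what that kind of cleanup involves: as a hedged sketch only (this is not the method from the Marketing Pilgrim article, and the domain blocklist shown is an assumption for illustration), filtering referrer spam out of a set of analytics hits can look something like this:

```python
# Hypothetical sketch: drop analytics hits whose referrer belongs to a
# known spam domain (including its subdomains). The blocklist contents
# are assumed examples, not a definitive list.
from urllib.parse import urlparse

SPAM_DOMAINS = {"semalt.com", "buttons-for-website.com"}  # assumed blocklist

def is_referrer_spam(referrer: str) -> bool:
    """Return True if the referrer's host is a spam domain or a subdomain of one."""
    host = urlparse(referrer).hostname or ""
    return any(host == d or host.endswith("." + d) for d in SPAM_DOMAINS)

def clean_hits(hits):
    """Keep only hits whose 'referrer' field is not referrer spam."""
    return [h for h in hits if not is_referrer_spam(h.get("referrer", ""))]
```

In practice this sort of filtering is usually done at the server or analytics-profile level rather than in a script, but the matching logic is the same idea: compare the referrer host against a blocklist, subdomains included.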
So I cut about 25% of the words from the article and in the process added even more information. That’s a good thing. But when I read the article after it had been published I noticed a couple of paragraphs that, had I been able to devote more time to the project, I think I could have revised to read better. As it is, if you just scan the article you’ll miss out on a lot of information.
In fact, because I try to pack so much information into whatever I write, if you simply scan the sub-headings and images (which I don’t always include), you’ll miss many important points.
Sometimes I’ll break up article copy with data tables or bullet point lists just to help readers break out of monotonous streams of paragraphs, but I try NOT to make my articles scannable. I just don’t believe that an article where you can pull out all the important information from scanning is a very good article.
And now Google may have given guest bloggers reason to reconsider just how scannable they make their copy. Search Engine Roundtable is reporting that Google just updated its Webmaster Guidelines to include “low quality guest blogging” as a form of content spam. I wrote about content spam earlier this year, where I named BIG NAMES in the search marketing field as content spam publishers.
And while my article may look like it is scannable because I used sub-headings, you won’t get much information from those sub-headings. You have to read the paragraphs that follow them to understand what I am talking about. Content spam tends to be very scannable. You can almost always extract the information conveyed by the content spam from its scannable elements: images, sub-headings, and captions.
Whether Google’s idea of “low quality guest blogging” matches what I call “content spam” remains to be seen. But I think it’s clear that other people are growing weary of the too-easily-read, hastily written content that rarely says anything new or significant. Now, just because your article is scannable doesn’t mean it is spam. But making it easy to extract the important information from sub-headings and bullet points makes whatever else you write less relevant and useful.
And so I think that people who write Web copy for a living need to think about striking a balance between making the article easy to digest and making it informative in a deep, entrenched style. You can try to emulate my own experiments in copy or you can develop your own. But if you just write “scannable copy” then you probably don’t need to write as much copy as you have been. And it may be that your copy will be viewed as “low quality” simply because it’s mostly fluff.
Yes, One Man’s Fluff is Another Man’s Detail

It’s true that “fluff” is an almost purely subjective and debatable term, but I think that an algorithm which has to detect low quality content is going to draw some harsh lines. The slack you cut yourself may prove to be too much slack.
Low Quality Content Is Not Always about the Words

Too many people have complained about “low quality content” as the source of Google’s Panda downgrades. The Google Panda algorithm doesn’t really grade content quality. Rather, the Google Panda algorithm is grading the quality of your Website. I have argued that this equates to “presentation”. Regardless of whether you agree with me about what Panda is measuring, as an SEO you cannot ignore the fact that presentation is important to search engine optimization. We have to fix many presentation issues (canonicity, indexability, semantic markup, sensible use of language, image markup, etc.) in the course of performing basic on-site optimization.
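To make those presentation issues concrete: as a generic illustration (these are standard HTML/SEO conventions, not tags or rules taken from this article), basic on-site fixes often show up in a page’s markup along these lines:

```html
<!-- Canonicity: declare the preferred URL for duplicate or variant pages -->
<link rel="canonical" href="https://example.com/article/">

<!-- Indexability: make sure a stray noindex directive isn't blocking the page -->
<meta name="robots" content="index,follow">

<!-- Semantic markup: headings that reflect the document's actual structure -->
<h1>Article Title</h1>
<h2>First Major Point</h2>

<!-- Image markup: descriptive alt text instead of an empty attribute -->
<img src="chart.png" alt="Traffic trend before and after the change">
```

None of this changes the words on the page; it changes how the page presents itself to crawlers and readers, which is the point of the distinction above.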
But what Google seems to be taking aim at with its “low quality guest blogging” guideline may indeed be about words: not so much “how many words you use” as “how important you make the words you use”. In other words, if you’re just embedding a few relevant points in fluffy filler content then your article won’t be perceived as very high quality by anyone.
When writing scannable content, you should make the sub-headings demand that the following paragraphs be read. And if the reader fulfills that demand then the content should deliver on the promise. You have to make SOME points only in the unscannable copy. Otherwise, why do you have that extra text on the page?
In-depth Content is Not Natively, Inherently Scannable

Most if not all of the articles you write will NEVER be treated as “in-depth content” by the Google algorithm (listed in a special section at the bottom of a search result page). Nonetheless, if you’re trying to write an “in-depth” article on a complicated subject, your article’s organization must reflect the complexity of the topic or the format of your presentation, NOT the details you are providing to the reader.
In other words, if your copy is scannable because you put all the important facts into the sub-headings and bullet point lists then you’re not writing in-depth content. If the article is scannable then the reader is able to extract meaningful information from the scan points; everything else is superfluous. An in-depth article provides real information, real detail that cannot be extracted from scanning the article.
And if your in-depth content is scannable then you’re either wasting words on a really short topic or else you’re leading readers to miss out on important information. The structure of the article can be such that the page is broken up into digestible components for easier reading, but making it easier to read the page doesn’t mean you are making it easier to read the information.
When you really want your reader to get as much information as possible from your article you need to draw that reader deeper into the text. If you don’t believe this is possible then you just need to write a short article, not something so long you feel it must be scannable.
Some people stop reading long articles before they reach the end. That is true. But it’s also true that most people stop reading short articles before the end. Why? Because short articles rarely have anything worthwhile to say.
That doesn’t mean conciseness cannot be informative. Rather, it means that conciseness must not sacrifice information. Concise content must be useful and informative. It can be long or short and still be concise. But if it’s short it cannot be scannable.
Too many people have been writing scannable content for the Web. All the marketing advice columnists have sworn on their holy mantras that readers don’t stay long on a Web page. Well, that’s simply not true. Readers won’t stay long on a page that doesn’t hold their attention. They’ll keep coming back time and time again to a page that they believe is useful, helpful, and informative.
Whether the page is scannable really has nothing to do with how informative and helpful it is.