The changes that took place in 2017 were of a different, more fundamental nature, lurking just below the surface, and impacting the future of online marketing in ways we haven’t yet fully digested.
This has been a deceptively silent year from Google. The search engine optimisation (SEO) industry didn’t experience the kind of dramatic, sweeping algorithm updates that it did in previous years, but make no mistake, a lot more has changed than you probably realised.
Let’s take a look at eight of those changes now.
While the majority of homes still don’t currently have a smart speaker, Gartner predicts that about 75 percent of homes will have one by 2020. But 2017 was already a big year for smart speakers, with 15 million Amazon Echos making their way into US homes.
Google is vying for Amazon’s share of the US smart speaker market, and their Google Home project claimed roughly 25 percent of that market in the fourth quarter of 2017.
The interplay between smart speakers and digital assistants like Google Now, Siri, and Cortana is changing the way people are searching for information. A much larger percentage of searches are now focused on getting bite-sized pieces of information such as facts, locations, figures, and statuses, as opposed to searching for full-sized web pages.
These changes don’t indicate that people are less interested in finding long-form content than they were in the past. There are 6.5 billion searches performed each day, and that figure is still growing every year.
What is changing is that a higher percentage of those searches are for bite-sized pieces of information, in large part because of the fact that smart speakers and digital assistants are now capable of interpreting the information on the web and presenting it in the form of a direct answer to a vocal question.
Perhaps the most lasting impact of this shift is on the way in which search is personalised. Search personalisation is not at all new, but digital assistants rely on users’ search history far more than Google has in the past, in large part because AI needs as much contextual information as possible in order to understand the nature of a search query and give a direct answer.
The SEO industry has been saying for a long time that it’s less about “rankings”, but this shift makes earning a rapport with your audience far more valuable than it has ever been.
One of the biggest changes to Google’s search results in 2017 was the growing prevalence of ‘people also ask’ questions. These have been around for a very long time, but their prevalence in search results has grown a staggering 1,723 percent since July of 2015, with most of that growth occurring in 2017.
Since people on smartphones are less likely to use their keyboards, and because there’s only so much that Google’s digital assistant is capable of interpreting, the value of these suggested searches has gone through the roof and, as a result, so has their prevalence in the search results.
In addition to showing ‘people also ask’ questions, these boxes lift answers from web pages, which in turn draws extra attention to those pages. If there is an information gap, that is, something else the searcher wants to know and suspects the page will tell them, this also leads to increased clicks from the search engines.
Some SEOs are spreading doom and gloom in response to these changes, arguing that Google is trying to replace web pages with their own answers, but the reality is that the people who are looking for long-form content will continue to look for it.
This does mean, however, that for a web page to be valuable in 2017 and the years to come, it must be providing answers and solving problems that can’t be solved in a few short sentences.
Related to this change, dictionary results started taking up a much larger piece of the pie than they have in the past, and Wikipedia entries, while still a huge influence on search results, lost some ground relative to those gains.
RankBrain is a machine-learning algorithm that Google uses to interpret the meaning of a user query. According to Google, it is the third most powerful ranking factor, behind content and links.
RankBrain was first released in 2015, and there is no way to be certain to what extent it was modified in 2017 or how much its influence on search results was adjusted, but it is clear that query interpretation is playing a big part in the search results. The prevalence of voice search and digital assistants discussed above is playing a pivotal role here, since queries phrased as questions for the search engine, rather than as strings meant to match pages containing specific keywords, are starting to dominate.
In retrospect, 2017 may well be seen as a turning point, when query interpretation really started to dominate. When RankBrain was first introduced in 2015, nobody even noticed until Google announced the change; now we are seeing far fewer people in the SEO industry focus specifically on keywords, and we are seeing changes in the way that search traffic finds its way to pages.
The influence of such AI on search traffic is invisible in individual cases because it affects the long tail so much more than the short tail. What ranks for the primary, most popular search terms hasn’t changed very dramatically, but what ranks for unique or entirely novel queries has, and in many cases the results for those long-tail queries now resemble what you would find searching for the related short-tail queries.
The fundamental takeaway from this is that brand presence and trust are more important than ever. Using unique keywords is not as valuable as it used to be; providing unique information and unique solutions is far more valuable than it ever has been.
Closely related to the growing influence of RankBrain is the increased reliance that SEOs are placing on an information architecture based around topic clusters.
Before RankBrain, search engines had a limited ability to interpret the intent behind a searcher’s query. For that reason, some sites would target a wide variety of keywords that were all actually about the same topic, just using different keyword variations.
This is an approach that we have advised against for quite some time, primarily because of the fact that it creates a poor user experience and can be harmful for your brand.
But as it stands, this is a bad move from a strictly SEO perspective as well. Now that the search engines are capable of understanding that keyword variations all refer to the same thing, targeting a wide variety of keywords in such an explicit way actually signals that your site may not be trustworthy, and presents a cluttered information architecture that makes it difficult for search engines to point searchers to the appropriate page.
The topic cluster model instead focusses on building a site hierarchy based on the specificity of the information requested, working down from broad general topics to highly specific questions with specific solutions.
This method presents a cleaner architecture to Google that makes it more likely for traffic to be sent to the appropriate page, and in some cases, more likely for Google to show a tree of links to your site in the search results.
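To make the topic cluster model concrete, here is a minimal sketch of how such a hierarchy might be laid out. All of the URLs, and the idea of deriving internal links from the hierarchy, are hypothetical illustrations rather than a prescription from Google or any SEO tool.

```python
# A minimal sketch of the topic-cluster model: one broad "pillar" page
# per topic links out to narrower cluster pages, and each cluster page
# links back to its pillar. The URLs below are hypothetical examples.

site = {
    "/seo/": [                          # pillar page: broad topic
        "/seo/keyword-research/",       # cluster page: specific subtopic
        "/seo/topic-clusters/",
        "/seo/mobile-first-indexing/",
    ],
    "/content-marketing/": [
        "/content-marketing/user-generated-content/",
    ],
}

def internal_links(site):
    """Yield (from_page, to_page) pairs implied by the cluster model."""
    for pillar, clusters in site.items():
        for cluster in clusters:
            yield (pillar, cluster)   # pillar links down to each cluster
            yield (cluster, pillar)   # cluster links back up to its pillar

links = list(internal_links(site))
```

The point of the structure is that every specific page is reachable from exactly one broad page, so both users and crawlers can infer which page is the canonical home for a given level of specificity.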
A full 92 percent of consumers currently trust user-generated content more than traditional advertising. Google may not be explicitly factoring this into its ranking algorithms, but most SEOs believe that the types of pages that are clicked on in search results influence RankBrain and other aspects of Google’s algorithm.
It’s not clear whether click-through rates are a direct ranking factor, but it’s clear that Google’s machine learning algorithms are designed to predict which sites are most likely to get clicked on and lead to user satisfaction. We would also be remiss not to mention that there are case studies out there suggesting that clicks directly influence search results.
User generated content seems to be showing up in more search results than it did in the past. The overall impression of changes in search results in 2017 is one that puts more emphasis on trusted brands and user generated content, and less emphasis on affiliate and “made for AdSense” sites. This is part of a trend that predates 2017 and even Panda, but it certainly continued this year, and the capability of the search engines to tell the difference between UGC sites and lower-trust “blog spam” seems to have reached a tipping point.
Encouraging user-generated content is becoming a more important factor than it was in the past, and the need for good moderation to avoid spam within the UGC has also reached a sort of critical mass.
Google kicked off the year in January by rolling out a penalty for sites that displayed intrusive interstitials on mobile devices. The update flushed web pages out of search results if their ads negatively affected the user experience on mobile devices, particularly if they made the interface difficult to use.
The update specifically targeted pop-ups that were impossible or difficult to close on mobile phones, interstitial standalone pages that users had to view before viewing the intended page, and full-size, above-the-fold ads that appeared to be interstitials, even if the user only had to scroll down to view the intended content.
But this was not the only Google update that hit sites with intrusive ads. Glenn Gabe tracked a large number of updates that occurred in 2017, broadly termed “quality updates.” In addition to identifying and flushing pages with thin content, these updates hit pages that used deceptive links, intrusive ads, interface-breaking manipulation, and other forms of “optimisation” focused on getting users to click on things by accident or out of frustration.
If your business model was focused on extracting unintentional clicks from users, odds are you took a hit from one of these updates in 2017. This trend will almost certainly continue.
Everybody knew this was coming, but 2017 was the year that mobile search traffic finally outpaced desktop traffic. While the value of something resembling a desktop is unlikely to go away anytime in the near future, mobile devices are now the default devices of the typical searcher.
Shortly before 2017, in November 2016, Google announced that it would be taking a mobile-first indexing approach. Since then, Google has continued to promote sites that perform better on mobile devices, are easier to use on mobile devices, and sites that use accelerated mobile pages (AMP).
SEOs and developers who focused on mobile-first design, interfaces, and information architecture were likely rewarded by the search engines in 2017. Sites that still haven’t updated to meet the needs of the mobile user have often experienced dramatic downward shifts in traffic.
Of primary concern for the modern developer is designing for both mobile and desktop audiences. SaaS and products designed for intensive use are, of course, some of the most difficult to design for mobile audiences. Some features are next to impossible to deliver on mobile devices, and this introduces the need for device-dependent features in addition to responsive design, which is at this point the default and considered more useful in most cases than separate mobile “versions” of sites.
2017 was the first year in a long time to lack the dramatic algorithm updates of the past. Aside from the “Fred” update and the quality updates tracked by Glenn Gabe, this has been a quiet year for the type of updates that make noise in the SEO industry.
This isn’t to say that Google hasn’t been making a lot of changes. If you’ve read this far into the post, you already know that. What’s less obvious, however, is that Google seems to have switched from abrupt updates to gradual ones.
In November this year, for example, Google clarified that they are still making changes to the Panda and Penguin algorithms. Those changes are not, however, producing the same kind of noise they did in the past.
The reason for that is not that the changes are less dramatic. It is because the changes are being rolled out more gradually, or being taken into account as pages are indexed, rather than after.
This, in turn, means that it is more difficult for the SEO industry to identify which factors are negatively or positively influencing search results, since we can no longer spot huge numbers of similar sites all being affected at the same time by the same update.
In conclusion, 2017 might not have been a year in which sweeping algorithm updates created a lot of noise in the SEO community, but it is a year in which fundamental shifts have taken and are continuing to take place in terms of what the search engine business model looks like and how the typical searcher operates. Mobile search, AI, digital assistants, and smart speakers are having a dramatic impact on the future of search, while keyword-focused, misdirection-based revenue models continue to fall by the wayside. With all of this taking place now, 2018 is bound to be an interesting year.
(Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the views of YourStory.)