Google has come a long way from the user experience of “ten blue links.” Today, Google pulls in a vast amount of the information it searches, has a keener understanding of what you are looking for — and serves it up to you directly.
Google’s organization of the world’s data, called the “Knowledge Graph,” affects about 25% of all search queries. Google serves more and more rich data directly in results, minimizing the need for users to click through. Search for the term “weather” or the title of a movie, and Google will serve up relevant, local data above any linked results.
When I recently searched for the correct spelling of the name of a director at Harvard, Google surprised me with a Wikipedia entry above the link to the school’s own site.
What does this mean for web content publishers?
This scraping and delivery of content is convenient for users eager to save a click. It also has practical ramifications for the originating content publishers. Today, an effective search engine optimization (SEO) strategy must go far beyond meta tags and content keywords. Publishers need to closely watch and respond to web traffic analytics (for example, understanding dark social and developing a robust Wikipedia strategy) as well as technical features offered by search engines (for example, rich snippets and structured data).
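To make the structured-data side of this concrete, here is a minimal sketch, in Python with placeholder values, of the kind of schema.org JSON-LD markup a publisher might embed in a page so that a search for a movie title can produce a rich result. The property names follow the schema.org vocabulary; the specific title, person, and date are illustrative, not drawn from any real page.

```python
import json

# A minimal schema.org "Movie" description expressed as JSON-LD,
# the structured-data format search engines read to build rich snippets.
# All values here are placeholders for illustration.
movie_markup = {
    "@context": "https://schema.org",
    "@type": "Movie",
    "name": "Example Movie",
    "director": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2013-01-01",
}

# Publishers typically embed this JSON inside a
# <script type="application/ld+json"> tag in the page's HTML,
# so crawlers can parse it without scraping the visible content.
print(json.dumps(movie_markup, indent=2))
```

The trade-off for publishers is the one the article describes: marking up content this way makes it eligible for richer placement in results, but it also makes it easier for the search engine to answer the query without sending the user to the originating site.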