It’s Time to Update Your SEO Strategy for 2018


A quarter-century ago, when SEO was in its infancy, rising in the rankings on search engine results pages was as easy as slathering on a heavy layer of keywords and tossing in abundant backlinks. By 2018, that rudimentary approach will be a distant memory: Google, Bing, and other search engines keep their sophisticated algorithms in a state of constant refinement, seeking to outwit unscrupulous players determined to manipulate results at the cost of user experience. Effective SEO may take on even more importance in the year to come, given the renewed pressure on search engine providers to help users weed out fake news and other low-quality content.

Search behaviors also continue to evolve. Google remains, of course, the outsize search champion, so an in-depth understanding of its algorithms is critical for SEO marketers. Alexander Kesler, founder and president of digital marketing agency inSegment, says, “This year, there have been a lot of changes happening to the Google algorithm and Google RankBrain [Google’s machine learning artificial intelligence technology], and the pace of those changes is only accelerating.”

But with consumers using nontraditional search engines, including Facebook, Amazon, and Google Maps, as their initial stop—or issuing verbal commands to kitchen-countertop virtual assistants to conduct the search for them—there are still some nuanced, critical best practices for SEO marketers to consider implementing in 2018 to help their organizations and brands stand out in search results.

Whose Good Opinion Do You Seek?

First, a quick review of the search engines that matter right now. “As far as search engines go, Google is still by far the biggest player,” says William Richards, founder and CEO of EasyRedir, a URL redirection service that helps organizations reinforce their brands and maintain SEO. With upward of 78% of desktop searches and an astounding 95% of mobile searches conducted via Google in January 2017, according to NetMarketShare, there’s simply no way to master SEO without understanding what makes Google’s search engine algorithms tick.

Beyond Google, the same January 2017 NetMarketShare figures show Bing and Baidu ranking second and third, with 7.8% and 7.7% respectively, while Yahoo (5.1%), Ask (0.15%), AOL (0.05%), and Excite (0.01%) bring up the rear.

Of course, the way consumers discover the answers they seek has grown far beyond traditional search. Think of turning to Yelp first for a restaurant recommendation on a business trip or to Amazon to type in “coffee maker” to see what people are saying about various models. Forrester analyst Collin Colburn says, “There are two kinds of discovery driving use of nontraditional engines. One is intent-based discovery, where people go anywhere there’s a search bar, like Amazon or Trivago.” In those cases, he says, marketers should use all the best practices they’ve learned managing for Google SEO to optimize their product listings in each service.

Colburn calls the second type of discovery “serendipitous discovery.” He says, “This is when you’re on Facebook, Twitter, or Instagram and happen to come across something you didn’t know about. Like when you see a picture on Instagram that a friend took of a restaurant in New York City and decide to go there based on the post, instead of Googling for an idea.” That type of discovery, Colburn points out, has the potential to slowly erode search traffic from Google and other search engines. “That’s hard for marketers to control, but it points to the need to plot the channels that your customers are buying through, using a data management platform, and fine-tune your approach.”

Don’t Overlook the Basics

The good news is that in 2018, the most basic and necessary condition for SEO success remains the same: high-quality content. The bad news is that there are few shortcuts to high-quality content. It would be comforting to believe there is some magic formula for what search engines term “high-quality content”—a certain length or layout, perhaps—but Colburn says that definition will differ for every organization. “I don’t think search engines really care how long your content is or whether you’re using graphics,” he says. What search engines notice is whether your customers consume it.

Kesler agrees, saying organizations should strive for “content that gets consumed fully—think of a video that people watch to completion or an article where people stay on the page long enough to read the whole thing.” Richards summarizes it simply: “Write great content for the audience you’re trying to reach.” Creating that targeted, consumable content puts the onus on organizations to truly understand their clients’ needs and expectations. Colburn says, “My perspective is that you first have to take a step back and ask, ‘What are the types of content our customers engage with?’ and second, ‘Do we as a brand need to see ourselves as publishers?’ There are some products that really don’t need a lot of content to support them.”

On the technical side, it seems obvious that having a website load quickly and securely is a fast track to better search results placement. Even so, Kesler says, “The vast majority of websites we see are missing the basics of technical best practices: how their code is laid out, a quick load time.” Colburn agrees that technical basics are an often overlooked way to garner better SEO results: “With so much emphasis on the content side, a lot of the technical aspects of SEO have been left behind, and that’s a worry.”

One easy-to-overlook basic is making sure search engine crawlers can find their way around your site easily by making good use of schema markup techniques. “You want a reasonable code-to-content ratio,” says Kesler, meaning that a website’s underlying source code has plenty of readable content that a search crawler can understand versus code that it can’t. Kesler says, “A page may look fine to the visitor, but because everyone’s creating their own CMS, the underlying code is stuffed with bad programming.” Cleaning up the code-to-content jumble is an effective behind-the-scenes tactic to improve SEO quickly.
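The “code-to-content ratio” Kesler describes can be approximated by comparing the length of a page’s visible text with the length of its raw markup. The sketch below is a minimal illustration of that idea using only Python’s standard library; the sample page and the `content_ratio` helper are assumptions for demonstration, not part of any tool the article mentions.

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)


def content_ratio(html: str) -> float:
    """Rough share of the raw markup that is readable text (0.0 to 1.0)."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / len(html) if html else 0.0


# Hypothetical page: a little readable text buried in markup and script.
page = ("<html><head><script>var x=1;</script></head>"
        "<body><p>Fresh coffee reviews.</p></body></html>")
ratio = content_ratio(page)
```

A low ratio by itself proves nothing, but as Kesler suggests, a page whose source is mostly non-content code gives a crawler little readable material to work with.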

Colburn says it’s also important to know how far crawlers are even getting in a site, by using tools such as DeepCrawl to analyze web architecture and adjusting accordingly. “Navigation and site structure get left behind,” he says. “If content, like product pages, is buried too deep in a website, crawlers may just never get there.”
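Crawl-depth tools like the one Colburn mentions boil down to a breadth-first walk of a site’s link graph, recording how many clicks each page sits from the homepage. The sketch below is a simplified, in-memory illustration of that measurement; the `crawl_depths` function and the example site structure are hypothetical, not taken from DeepCrawl or any other product.

```python
from collections import deque


def crawl_depths(link_graph, start="/"):
    """Breadth-first walk over an in-memory link graph, returning the
    minimum number of clicks from the start page to each reachable URL."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for url in link_graph.get(page, []):
            if url not in depths:  # first visit is the shortest path
                depths[url] = depths[page] + 1
                queue.append(url)
    return depths


# Hypothetical site structure: a product page and its reviews hang
# several clicks below the homepage.
site = {
    "/": ["/about", "/products"],
    "/products": ["/products/coffee-maker"],
    "/products/coffee-maker": ["/products/coffee-maker/reviews"],
}
depths = crawl_depths(site)
deep_pages = [url for url, d in depths.items() if d >= 3]
```

Pages that surface in `deep_pages` are the kind of deeply buried content Colburn warns crawlers “may just never get” to; flattening the navigation so they sit fewer clicks from the homepage is the usual fix.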

 


