Wednesday, 6 December 2017

Highlights from TechSEO Boost: The key trends in technical SEO

Although most search conferences contain some sessions on technical SEO, until now there has been a general reluctance to dedicate a full schedule to this specialism.

That is an entirely understandable stance to take, given that organic search has evolved to encompass elements of so many other marketing disciplines.

Increasing visibility via organic search today means incorporating content marketing, UX, CRO, and high-level business strategy. So to concentrate exclusively on the complexities of technical SEO would be to lose some sections of a multi-disciplinary audience.

However, the cornerstone of a successful organic search campaign has always been technical SEO. For all of the industry’s evolutions, it is technical SEO that remains at the vanguard of innovation and at the core of any advanced strategy. With an average of 51% of all online traffic coming from organic search, this is not a specialism that marketers can ignore.

Enter TechSEO Boost: the industry’s first technical SEO conference, organized by Catalyst. Aimed at an audience of technical SEOs, advanced search marketers and programmers, TechSEO Boost set out to be a “technical SEO conference that challenges even developers and code jockeys”.

Though the topics were varied, there were still some narrative threads through the day, all of which tie in to broader marketing themes that affect all businesses. Here are the highlights.

Towards a definition of ‘Technical SEO’

Technical SEO is an often misunderstood discipline that many find difficult to pin down in exact terms. The skills required to excel in technical SEO differ from the traditional marketing skillset, and its aim is traditionally viewed as effective communication with bots rather than with people. And yet, technical SEO can make a significant difference to cross-channel performance, given the footprint its activities have across all aspects of a website.

The reasons for this discipline’s resistance to concrete definition were clear at TechSEO Boost, where the talks covered everything from site speed to automation and log file analysis, with stops along the way to discuss machine learning models and backlinks.

Though it touches on elements of both science and art, technical SEO sits most comfortably on the scientific side of the fence. As such, a precise definition would be fitting.

Russ Jones, search scientist at Moz, stepped forward to provide exactly that, defining technical SEO as “any sufficiently technical action undertaken with the intent to improve search performance.”

This is a helpful step towards a shared comprehension of technical SEO, especially as it makes clear that the discipline’s core purpose is to improve search performance. That sets it apart slightly from the world of developers and engineers, while linking it to more creative practices like link earning and content marketing.

Using technology to communicate directly with bots impacts every area of site performance, as Jones’ chart demonstrates:

[Chart: Russ Jones’ map of the areas of site performance impacted by technical SEO]

Some of these areas are the sole preserve of technical SEO, while others require it only in a supporting role. What this visualization leaves in little doubt, however, is the pivotal position of this discipline in creating a solid foundation for other marketing efforts.

Jones concluded that technical SEO is the R&D function of the organic search industry. That serves as an apt categorization of the application of technical SEO skills, which encompass everything from web development to data analysis and competitor research.

Technical SEO thrives on innovation

Many marketers will have seen a technical SEO checklist in their time. Any time a site migration is approaching or a technical audit is scheduled, a checklist tends to appear. This is essential housekeeping and can help keep everyone on track with the basics, but it is also a narrow lens through which to view technical SEO.

Russ Jones presented persuasive evidence that technical SEO rewards the most innovative strategies, while those who simply follow the latest Google announcement tend to stagnate.

Equally, the sites that perform best tend to experiment the most with the latest technologies.

There are not necessarily any direct causal links that we can draw between websites’ use of Accelerated Mobile Pages (AMP), for example, and their presence in the top 1000 traffic-driving sites. However, what we can say is that these high-performing sites are the ones leading the way when new technologies reach the market.

That said, there is still room for more companies to innovate. Google typically has to introduce a rankings boost, or even the threat of a penalty, to encourage mass adoption of technologies like HTTPS or AMP. These changes can be expensive and, as the presentation from Airbnb showed, fraught with difficulties.

That may go some way to explaining the gap between the availability of new technology and its widespread adoption.

Jones showed that the level of interest in technical SEO has increased significantly over the years, but it has typically followed the technology. We can see from the graph below that interest in “Technical SEO” has been foreshadowed by interest in “JSON-LD.”

[Graph: Google Trends interest in “Technical SEO” alongside “JSON-LD” over time]

If SEOs want to remain vital to large businesses in an era of increasing automation, they should prove their value by innovating to steal a march on the competition. The performance improvements that accompany this approach will demonstrate the importance of technical SEO.

Everyone has access to Google’s public statements, but only a few have the ability and willingness to experiment with technologies that sit outside of this remit.

Without innovation, companies are left to rely on the same old public statements from Google while their competitors experiment with new solutions.

For more insights into the state of technical SEO and the role it plays in the industry, don’t miss Russ Jones’ full presentation:

Automation creates endless opportunities

The discussion around the role of automation looks set to continue for some time across all industries. Within search marketing, there can be little doubt that rules-based automation and API usage can take over a lot of the menial, manual tasks and extend the capabilities of search strategists.

Paul Shapiro’s session, ‘Working Smarter: SEO automation to increase efficiency and effectiveness’, highlighted just a few of the areas that should be automated, including:

  • Reporting
  • Data collection
  • 301 redirect mapping
  • Technical audits
  • Competitor data pulls
  • Anomaly detection

The above represent the fundamentals that companies should be working through in an efficient, automated way. However, the potential for SEOs to work smarter through automation reaches beyond these basics and starts to pose more challenging questions.
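To ground one of those fundamentals before turning to the harder questions, here is a minimal sketch of the anomaly-detection item from the list above: flag days whose organic sessions fall well outside a rolling baseline. The column names and threshold are illustrative assumptions rather than anything Shapiro prescribed.

```python
import pandas as pd

def flag_traffic_anomalies(csv_path, window=28, z_threshold=3.0):
    """Flag days whose organic sessions sit far outside a rolling baseline.

    Assumes a CSV export with 'date' and 'organic_sessions' columns;
    both names are illustrative and should match your analytics export.
    """
    df = pd.read_csv(csv_path, parse_dates=["date"]).sort_values("date")
    baseline = df["organic_sessions"].rolling(window, min_periods=window).mean()
    spread = df["organic_sessions"].rolling(window, min_periods=window).std()
    df["z_score"] = (df["organic_sessions"] - baseline) / spread
    # Keep only the days that deviate sharply from the recent norm.
    return df[df["z_score"].abs() > z_threshold]

# Example: anomalies = flag_traffic_anomalies("organic_sessions.csv")
```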

As was stated earlier in the day, “If knowledge scales, it will be automated.”

This brings to light the central tension that arises once automation becomes more advanced. Once we move beyond simple, rules-based systems and into the realm of reliable and complex automation, which roles are left for people to fill?

At TechSEO Boost, the atmosphere was one of opportunity, but SEO professionals need to understand these challenges if they are to position themselves to take advantage. Automation can create a level playing field among different companies if all have access to the same technology, at which point people will become the differentiating factor.

By tackling complex problems with novel solutions, SEOs can retain an essential position in any enterprise. If that knowledge later receives the automation treatment, there will always be new problems to solve.

There is endless room for experimentation in this arena too, once the basics are covered. Shapiro shared some of the analyses he and his team have developed using KNIME, an open source data analysis platform. KNIME contains a variety of built-in “nodes”, which can be strung together across a range of data sources to build more meaningful reports.

For example, a time-consuming task like keyword research can be automated both to increase the quantity of data assessed and to improve the quality of the output. A platform like KNIME, coupled with a visualization tool like Tableau or Data Studio, can create research that is useful for SEO and for other marketing teams too.
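As a rough illustration of the kind of step such a workflow might automate, the sketch below expands a seed term through Google’s public autocomplete endpoint. This is not part of Shapiro’s KNIME workflow; the endpoint is an unofficial one commonly used for keyword research, so treat the details as assumptions that may change.

```python
import requests

def expand_seed_keyword(seed, lang="en"):
    """Return autocomplete suggestions for a seed keyword.

    Uses the unofficial suggest endpoint widely used for keyword research;
    it is not a documented API and may change or be rate-limited.
    """
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "hl": lang, "q": seed},
        timeout=10,
    )
    resp.raise_for_status()
    # The response is a JSON array; the second element holds the suggestions.
    return resp.json()[1]

# Example: print(expand_seed_keyword("technical seo"))
```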

Automation’s potential extends into the more creative aspects of SEO, such as content ideation. Shapiro discussed the example of Reddit as an excellent source for content ideas, given the virality that it depends on to keep users engaged. By setting up a recurring crawl of particular subreddits, content marketers can access an ongoing repository of ideas for their campaigns. The Python code Shapiro wrote for this task can be accessed here (password: fighto).
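Shapiro’s own script is linked above. As a separate, minimal sketch of the same idea, the code below pulls a subreddit’s top posts for the week from Reddit’s public JSON listing; the subreddit, fields, and schedule are all illustrative.

```python
import requests

def top_posts(subreddit, period="week", limit=25):
    """Fetch a subreddit's top posts as (score, title, url) tuples.

    Uses Reddit's public JSON listing; a descriptive User-Agent is expected,
    and heavy unauthenticated use may be rate-limited.
    """
    resp = requests.get(
        f"https://www.reddit.com/r/{subreddit}/top.json",
        params={"t": period, "limit": limit},
        headers={"User-Agent": "content-ideation-sketch/0.1"},
        timeout=10,
    )
    resp.raise_for_status()
    children = resp.json()["data"]["children"]
    return [(c["data"]["score"], c["data"]["title"], c["data"]["url"]) for c in children]

# Example: run this weekly against a niche subreddit and log the results over time.
```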

You can view Paul Shapiro’s full presentation below:

Machine learning leads to more sophisticated results

Machine learning can be at the heart of complex decision-making processes, including the decisions Google makes 40,000 times per second when people type queries into its search engine.

It is particularly effective for information retrieval, a field of activity that depends on a nuanced understanding of both content and context. JR Oakes, Technical SEO Director at Adapt, discussed a test run using Wikipedia results that concluded: “Users with machine learning-ranked results were statistically significantly more likely to click on the first search result.”

This matters for search marketers, as advances like Google’s RankBrain have brought machine learning into common use. We are accustomed to tracking ranking positions as a proxy for SEO success, but machine learning helps deliver personalization at scale within search results. It therefore becomes a futile task to try to calculate the true ranking position for any individual keyword.

Moreover, if Google can satisfy the user’s intent within the results page (for example, through answer boxes), then a click would also no longer represent a valid metric of success.

A Google study even found that 42% of people who click through do so only to confirm the information they had already seen on the results page. This renders click-through data even less useful as a barometer for content quality, as a click or an absence of a click could mean either high or low user satisfaction.

Google is developing more nuanced ways of comprehending and ranking content, many of which defy simplistic interpretation.

All is not lost, however. Getting traffic remains vitally important, as does the quality of content, so there are still ways to improve and measure SEO performance. For example, we can optimize for relevant traffic by analyzing our click-through rate, using methods such as the ones devised by Paul Shapiro in this column.
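The column referenced above goes into detail; as a loose sketch of the general idea (not Shapiro’s exact method), the code below compares each query’s click-through rate with the average CTR seen at its ranking position, using a Search Console export whose column names are assumptions.

```python
import pandas as pd

def ctr_underperformers(csv_path, shortfall=0.5):
    """Flag queries whose CTR is well below the average for their position.

    Assumes a Search Console export with 'query', 'position', 'clicks' and
    'impressions' columns; the names and the 0.5 shortfall are illustrative.
    """
    df = pd.read_csv(csv_path)
    df["ctr"] = df["clicks"] / df["impressions"]
    df["position_bucket"] = df["position"].round().astype(int)
    # Average CTR observed at each rounded ranking position in this dataset.
    expected = df.groupby("position_bucket")["ctr"].transform("mean")
    laggards = df[df["ctr"] < expected * shortfall]
    return laggards.sort_values("impressions", ascending=False)

# Example: review = ctr_underperformers("search_console_queries.csv")
```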

Furthermore, it is safe to surmise that part of Google’s machine learning algorithm uses skip-gram models to measure co-occurrence of phrases within documents. In basic terms, this means we have moved past the era of keyword matching and into an age of semantic relevance.
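To make co-occurrence concrete, the short sketch below counts how often terms appear near one another within a sliding window, which is the raw signal that skip-gram-style models learn from. It illustrates the concept only and makes no claim about Google’s implementation.

```python
from collections import Counter

def cooccurrence_counts(documents, window=4):
    """Count how often term pairs appear within `window` tokens of each other."""
    pairs = Counter()
    for doc in documents:
        tokens = doc.lower().split()
        for i, token in enumerate(tokens):
            for other in tokens[i + 1 : i + 1 + window]:
                if other != token:
                    pairs[tuple(sorted((token, other)))] += 1
    return pairs

docs = [
    "technical seo improves crawl efficiency and search performance",
    "site speed and crawl budget are core technical seo concerns",
]
# Pairs that co-occur frequently are treated as semantically related.
print(cooccurrence_counts(docs).most_common(5))
```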

The machines need some help to figure out the meanings of phrases too, and Oakes shared the example of AT&T to demonstrate query disambiguation in action.

[Image: query disambiguation example for “AT&T”]

Machine learning should be welcomed as part of Google’s search algorithms by both users and marketers, as it will continue to force the industry into much more sophisticated strategies that rely less on keyword matching. That said, there are still practical tips that marketers can apply to help the machine learning systems understand the context and purpose of our content.

JR Oakes’ full presentation:

Technical SEO facilitates user experience

A recurring theme throughout TechSEO Boost was the relationship between SEO and other marketing channels.

Technical SEO has now sprouted its own departments within agencies, but that can see the discipline sidelined from other areas of marketing.

This plays out in a variety of scenarios. For example, the received wisdom is that Google can’t read the content on JavaScript websites, so it is the role of SEO to reduce the quantity of JavaScript code on a site to enhance organic search performance.

In fact, Merkle’s Max Prin posited that this should never be the case. The role of an advanced SEO is to facilitate and enhance whichever site experience will be most beneficial for the end user. Often, that means working with JavaScript to ensure that search engines understand the content of the page.
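One practical way to act on this is to compare what a plain HTTP fetch returns with what a rendered browser sees, so you know which content only appears after JavaScript runs. The sketch below uses requests and headless Chrome via Selenium; the tool choice is my assumption, not something Prin prescribed.

```python
import requests
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

def raw_vs_rendered_length(url):
    """Compare the size of the raw HTML response with the rendered DOM.

    A large gap points to content that only exists after JavaScript executes,
    which is worth checking against what search engines actually index.
    """
    raw_html = requests.get(url, timeout=15).text

    options = Options()
    options.add_argument("--headless")
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)
        rendered_html = driver.page_source
    finally:
        driver.quit()

    return len(raw_html), len(rendered_html)

# Example: raw_len, rendered_len = raw_vs_rendered_length("https://example.com/")
```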

That begins with an understanding of how search engines work, and at which stages technical SEO can make a difference:

[Diagram: how search engines work, and the stages at which technical SEO can make a difference]

Prin also discussed some useful technologies to help pinpoint accessibility issues, including Merkle’s fetch and render tool and the Google Chrome Lighthouse tool.
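For those who prefer to script such checks, Lighthouse can also be run from its command-line interface (installed via npm) and its JSON report parsed programmatically. The sketch below assumes a recent Lighthouse version; the report’s field layout has changed between versions, so verify the path before relying on it.

```python
import json
import subprocess

def lighthouse_performance_score(url, report_path="report.json"):
    """Run the Lighthouse CLI (npm install -g lighthouse) and read the performance score.

    The field path below assumes a recent Lighthouse JSON format; older
    versions structured the report differently, so verify before relying on it.
    """
    subprocess.run(
        ["lighthouse", url, "--quiet", "--output=json",
         f"--output-path={report_path}", "--chrome-flags=--headless"],
        check=True,
    )
    with open(report_path) as fh:
        report = json.load(fh)
    return report["categories"]["performance"]["score"]  # 0.0 to 1.0

# Example: print(lighthouse_performance_score("https://example.com/"))
```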

Another significant area in which technical SEO facilitates the user experience is site speed.

Google’s Pat Meenan showcased data pulled from the Google Chrome User Experience Report, which is open source and stores information within BigQuery.
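For anyone who wants to reproduce this kind of analysis, the sketch below queries the public CrUX dataset with the BigQuery Python client. The table name and field paths follow the public CrUX schema as I understand it, so verify them against the current dataset documentation before relying on the numbers.

```python
from google.cloud import bigquery

# Assumed table and field names based on the public CrUX schema; verify before use.
QUERY = """
SELECT
  form_factor.name AS device,
  SUM(fcp.density) AS share_of_loads_fcp_under_1s
FROM
  `chrome-ux-report.all.201710`,
  UNNEST(first_contentful_paint.histogram.bin) AS fcp
WHERE
  origin = 'https://www.cnn.com'
  AND fcp.start < 1000
GROUP BY
  device
"""

client = bigquery.Client()  # requires Google Cloud credentials with BigQuery access
for row in client.query(QUERY).result():
    # Share of all recorded page loads that came from this device class
    # and achieved First Contentful Paint in under one second.
    print(row.device, round(row.share_of_loads_fcp_under_1s, 3))
```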

His research went beyond the reductive site speed tests we usually see, which deliver one number to reflect the average load time for a page. Meenan revealed the extent to which load speeds differ across devices, and the importance of understanding the component stages of loading any web page.

The load times for the CNN homepage showed some surprising variation, even between high-end smartphones such as the iPhone 8 and Samsung Galaxy S7 (times are in milliseconds):

[Table: CNN homepage load times by device, in milliseconds]

In fact, Meenan recommends running site speed tests on a low- to mid-range smartphone over a 3G connection, as these will provide a truer reflection of how the majority of people access your site.

WebPageTest offers an easy way to achieve this and also highlights the meaningful points of measurement in a site speed test, including First Paint (FP), First Contentful Paint (FCP), and Time to Interactive (TTI).
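For recurring checks, WebPageTest also exposes an HTTP API for submitting tests. The sketch below submits a URL and returns the link to the results; the location string and response field are based on my reading of the public API and should be checked against the current documentation.

```python
import requests

def submit_webpagetest(url, api_key, location="Dulles_MotoG4:Chrome.3G"):
    """Submit a URL to WebPageTest and return the link to the results page.

    The location string (device:browser.connectivity) and the 'userUrl'
    response field are assumptions; confirm them in the WebPageTest docs.
    """
    resp = requests.get(
        "https://www.webpagetest.org/runtest.php",
        params={"url": url, "k": api_key, "f": "json", "location": location},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]["userUrl"]

# Example: print(submit_webpagetest("https://example.com/", "YOUR_API_KEY"))
```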

This helps to create a standardized process for measuring speed, but the question still remains of how exactly site owners can accelerate load speed. Meenan shared some useful tips on this front, with HTTP/2 being the main recent development, but he also reiterated that many of the existing best practices hold true.

Using a CDN, reducing the number of HTTP requests, and reducing the number of redirects are all still very valid pieces of advice for anyone hoping to reduce load times.
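The redirect advice in particular is easy to audit: the short sketch below follows a URL and reports each hop in the chain, using the response history that requests keeps.

```python
import requests

def redirect_chain(url):
    """Follow a URL and return each (status_code, url) hop in the redirect chain."""
    resp = requests.get(url, allow_redirects=True, timeout=15)
    hops = [(r.status_code, r.url) for r in resp.history]
    hops.append((resp.status_code, resp.url))
    return hops

# Every extra hop adds latency, so long chains are worth collapsing.
print(redirect_chain("http://example.com/"))
```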

You can see Pat Meenan’s full presentation below:

Key takeaways from TechSEO Boost

  • Technical SEO can be defined as “any sufficiently technical action undertaken with the intent to improve search performance.”
  • Automation should be a central concern for any serious SEO. The more of the basics we can automate, the more we can experiment with new solutions.
  • A more nuanced understanding of Google’s information retrieval technology is required if we are to achieve the full SEO potential of any website.
  • HTTP/2 is the main development for site speed across the web, but most of the best practices from a decade ago still hold true.
  • Improving site speed requires a detailed understanding of how content loads across all devices.

You can view all of the presentations from TechSEO Boost on Slideshare.

This article was originally published on our sister site, ClickZ, and has been republished here for the enjoyment of our audience on Search Engine Watch.



source https://searchenginewatch.com/2017/12/06/highlights-from-techseo-boost-the-key-trends-in-technical-seo/
