Thursday 31 May 2018

What data do you need to find, pitch and win new SEO clients?

For SEO agencies and independent consultants looking for new business, a two-step strategy might be all you need to demonstrate your efficacy and separate from competitors. First, identify prospective clients that are well-suited to your offering. Second, send them a pitch that unequivocally communicates the potential results they can achieve from your services. The key to both is data and, conveniently, the same data that you use to recognize great potential clients in the first step can also be used to make a powerful case as to what your services can deliver for them.

Identifying your ideal SEO clients

The businesses you probably want to approach are those that are tuned in to how SEO works and have already invested in it, but that still have a genuine need for your services in order to achieve their full SEO potential. Naturally, a brand that has already climbed to the top of the most relevant search engine results pages (SERPs) isn’t a great candidate because they simply don’t need the help. Nor is a business with no SEO experience and no real SERP presence; they might require a particularly hefty effort to be brought up to speed, and perhaps won’t be as likely to invest in – and commit to – an ongoing SEO engagement.

While investigating potential business opportunities within this desired Goldilocks Zone of current SEO success, you might also be looking to target companies in the industries that your agency has previously done well in – both to leverage those past successes and demonstrate relevance to prospective clients with an adjacent audience. This makes it easy for prospects to see themselves in the shoes of those clients you’ve already helped, and for you to apply and repeat your tried-and-tested techniques.

Putting this advice together, you can begin the client search process by looking at keywords important to industries you’re familiar with. You will probably want to explore companies within the mid-range SERPs (ranks 10–30) that could contend for the top spots if they had better professional assistance. You can then perform an analysis of these sites to determine their potential for SEO improvement. For example, a potential client that derives a great deal of its traffic from a keyword in which it still has room to grow and move up in the SERPs is ideal.
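As a rough illustration of this shortlisting step – with entirely made-up ranking data standing in for a rank-tracking tool’s export – in Python:

```python
# Sketch: shortlist prospects ranking in the mid-range "Goldilocks Zone"
# (positions 10-30) for a target keyword. The data is illustrative; in
# practice it would come from a rank-tracking or competitive-analysis tool.

serp_data = [
    {"domain": "example-a.com", "keyword": "cloud crm", "rank": 14, "est_traffic": 1200},
    {"domain": "example-b.com", "keyword": "cloud crm", "rank": 3,  "est_traffic": 9500},
    {"domain": "example-c.com", "keyword": "cloud crm", "rank": 27, "est_traffic": 310},
    {"domain": "example-d.com", "keyword": "cloud crm", "rank": 55, "est_traffic": 40},
]

def shortlist_prospects(rows, lo=10, hi=30):
    """Keep only domains ranking between positions lo and hi inclusive."""
    return [r for r in rows if lo <= r["rank"] <= hi]

prospects = shortlist_prospects(serp_data)
for p in prospects:
    print(p["domain"], p["rank"])
```

The top-ranked and far-off-the-pace domains fall out automatically, leaving only the contenders worth analyzing further.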

Use data to make your pitches irrefutable

While a slick email pitch can go a long way toward winning over new clients, it’s hard to beat the power of data-driven evidence (of course, your pitch can succeed on both style and substance). Be sure to use specific insights about the client’s SEO performance in your pitch – adding visuals will help make your case more clear and digestible. Also, consider using an SEO case study focused on your success in an overlapping space to offer an example of the results they could expect. Your pitch should culminate with a call-to-action for the potential client to get in touch to go deeper into the data and discuss how to proceed.

There are a number of approaches to framing your pitch. Here are four examples of different appeals you can use:

  • Show how you can increase the client’s share of voice for a high value keyword
  • Tell the business about keywords they’re missing out on
  • Show where the client could (and should) be building backlinks
  • Explain technical SEO issues the client has and how to fix them.

Show how you can increase the client’s share of voice for a high value keyword

In SERPs, a top ranking is often disproportionately valuable. For example, it’s not unusual to see the top result capture 30% of all traffic (or ‘share of voice’) for a given keyword, while the tenth result receives a mere 1%.

Craft a pitch that pairs a result like this with data on how much traffic the keyword actually delivers to the company’s site, and the vast potential to multiply that traffic by improving their share of voice for that keyword becomes clear.
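As a back-of-the-envelope illustration, pairing the 30%/1% figures above with a hypothetical monthly search volume:

```python
# Sketch: estimate the traffic upside of moving from rank 10 to rank 1,
# using the illustrative share-of-voice figures from the text.

monthly_searches = 20000        # hypothetical search volume for the keyword
ctr_rank_10 = 0.01              # ~1% share of voice at position 10
ctr_rank_1 = 0.30               # ~30% share of voice at position 1

current_traffic = monthly_searches * ctr_rank_10    # 200 visits/month
potential_traffic = monthly_searches * ctr_rank_1   # 6000 visits/month
uplift = potential_traffic / current_traffic        # 30x

print(f"Current: {current_traffic:.0f}, potential: {potential_traffic:.0f}, "
      f"uplift: {uplift:.0f}x")
```

A 30x multiplier on a keyword that already delivers meaningful traffic is exactly the kind of figure that makes a pitch land.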

Tell the business about keywords they’re missing out on

Where a client has gaps in their keyword strategy, demonstrating your ability to fill them and deliver traffic the prospect didn’t realize they could be earning goes a long way towards proving your agency’s expertise and value. These keywords can be found by examining sites that share an audience overlap with the potential client, and then studying the keywords they rank highly on. This investigation may yield unexpected results, reinforcing that your agency can drive results in areas the client never would have thought of without your expertise.

Show where the client could (and should) be building backlinks

Discover gaps where a potential client’s competitors are outmaneuvering and outperforming them when it comes to establishing links from other sites – and then report these within the framework of a strategy that will close these gaps at every level. If a client clearly isn’t pursuing this strategy (i.e. if the average competitor has multiple times their backlinks), be sure to communicate the value of competing on this front.
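A quick sketch of how that gap might be quantified, with hypothetical backlink counts standing in for figures from a link intelligence tool:

```python
# Sketch: size the backlink gap between a prospect and its competitors.
# All counts here are made up for illustration.

prospect_backlinks = 1800
competitor_backlinks = [9400, 12100, 7600]

avg_competitor = sum(competitor_backlinks) / len(competitor_backlinks)
gap_ratio = avg_competitor / prospect_backlinks

print(f"Average competitor: {avg_competitor:.0f} links "
      f"({gap_ratio:.1f}x the prospect)")
```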

Explain technical SEO issues the client has and how to fix them

To use this approach, perform an audit of the potential client’s site to identify opportunities to improve its SEO practices. This is especially important if your firm specializes in these services.

Providing specific tips and guidance serves as a substantial upfront gesture, and clearly demonstrates the value of an ongoing relationship.

Conclusions

Conveying the specific remedies and additions you would pursue to optimize a client’s site and search strategy is a strong introduction, and a major first step toward being hired to execute a winning SEO game plan. Be sure to personalize your pitches as much as possible to stand out from competitors, while ensuring that compelling data forms both the crux of your appeal and the proof that you’re the firm to help that business achieve its full SEO potential.

Kim Kosaka is the Director of Marketing at Alexa.com, which provides insights that agencies can use to help clients win their audience and accelerate growth.



source https://searchenginewatch.com/2018/05/31/what-data-do-you-need-to-find-pitch-and-win-new-seo-clients/

Wednesday 30 May 2018

Where we’re going, we won’t need websites 

As voice becomes the dominant force in search and people spend more time consuming content via social media, the future for the humble home page looks very bleak.

If comScore is correct and half of all searches by 2020 are made via voice, a crucial question arises: will we still need websites?

Even if the research is over-egged and the tipping point is reached a year or two later, the question still remains.

As consumers increasingly get used to asking Alexa, Siri or Google for the news headlines, a dinner recipe or flight options for a weekend away, answers will not be provided by ten blue SEO links. Rather, the options will be weighed up by an algorithm before what is considered to be the best answer is read out.

Remember Lycos and AltaVista?

New technology can always delight early adopters, but as it becomes more mainstream, seasoned observers know some huge names may become casualties as the public adopts new behaviors. Remember AltaVista, AskJeeves and Lycos, or when Yahoo! was a force in search? Read those names out loud and you may be less inclined to wonder whether voice will have an impact, and more inclined to start picking the winners and losers.

Make no mistake, this is happening: a tide of disruption is heading for search. Canalys estimates 56.3 million smart speakers will ship this year alone. The Amazon Echo has first-mover advantage and with it a 69% share. Google is in second spot with 25%.

However, given the core function of these speakers (beyond playing audio) is to perform voice searches, it would take a brave digital marketing executive to bet against Google closing the gap and even coming out on top – eventually.

Brands rush to the call of Alexa

To get an idea of how this impacts search, as well as consumers’ interaction with their favorite brands, one need only look at the early rush to set up Alexa skills.

In travel, Expedia and Kayak can find flights and trips via voice search; an Uber or Lyft ride can be hailed too. Capital One lets users check their balance, and both Pizza Hut and Domino’s are set up to receive an order via Alexa. If that sounds too indulgent for a Friday night, Vitality allows users to find healthy recipes and discover a workout to shift the calories.

Then, of course, there are the weather, travel and news updates that can be handled via voice rather than a visit to a website.

VR keyboard, anyone?

It isn’t just voice. Canalys is predicting that this is the year when VR headset sales will increase five-fold as the sector moves towards shipping almost 10 million units per year by 2021.

It’s hard to imagine VR users typing a search enquiry into a virtual keyboard in the air. Even harder to imagine that they will scan through a list of blue links to no doubt pick out a text-heavy page.

Results will be aggregated through a dominant source of information in each vertical: taking a tour of your next house will likely be made possible by Zoopla, or a similar aggregator; picking out a hotel via a VR version of Expedia; test-driving your next car perhaps via something like AutoTrader. Content will come from multiple sources, but will likely be accessed through a single aggregator: no need to type in a query and certainly no blue links to choose which home page to visit.

Is the home page already dying?

This is already starting to happen in news and media. Alarm bells no doubt started to ring when a chart for the New York Times showed how bad things had got with direct traffic.

Source: New York Times.

The dates are old, but that underlines how this trend for news sites to lose direct traffic has been developing for at least five years.

Look at the latest figures for two British newspapers, The Times and The Telegraph, and the trend seems very clear. Even though the sites are subscription-based (presumably giving users an impetus to get the most from their monthly fee and bookmark the home page), direct traffic accounts for one third and one fifth of all visitors respectively. This is dwarfed by search, with social bringing up the rear.

Source: SimilarWeb

If you then compare these paid-for sites with two free resources, The Mirror and Independent.co.uk, the trend becomes even more notable. With no monthly fee to justify getting their money’s worth from, both sources of free news sink to just one in five visitors arriving direct. Here social is far closer to direct traffic in importance, with search still way out ahead as the number one source of visitors.

Source: SimilarWeb

Putting the data to one side and asking consumers where they get their news results in a huge spike in favor of social media. GlobalWebIndex results from 2017 revealed nearly half (44%) say they get news from social media, while 37% said they go direct to a news website. The same proportion (37%) say they get their news via referrals from somewhere else or via a news aggregator service. The percentages sum to more than 100% because respondents could name more than one source.

People say they access news mostly through social, but the traffic-monitoring data says mostly through search. Either way, going direct to the home page is a habit the majority of people no longer have.

The mobile factor

It’s also clear that mobile websites’ importance is beginning to fade. App usage has now overtaken the mobile web, suggesting that although people still use mobile sites, they have favorite apps for brands or key tasks.

It’s perfectly reasonable to assume this behavior will tap into the trend for brands to make their content voice-friendly. If a consumer has a preference to book hotels on Expedia and order dinner from Domino’s, they will likely ask Alexa or Google to look for a Paris weekend deal or a two-for-one pizza offer through these favored brands. No need for a home page, though the app might be required to give an order reference or calendar reminder for peace of mind.

No more home pages?

If you look at the direction of travel, the future of the home page appears bleak.

Within two years we’ll hit a tipping point in voice search, and this year should see a spike in sales of VR headsets – the former having a far more immediate effect on search than the latter.

Also, in a mobile-first world, consumers are steering towards apps where they already know which brand they want to interact with, or trust an aggregator to come up with the right offer.

I’d suggest this means the home page will still limp on for a few years, providing information to voice search algorithms, as well as being a resource for information and ecommerce.

Ultimately, the job of a search marketer is going to shift towards getting their clients’ products and services in front of consumers via voice, and perhaps VR. There is no need for a home page here and we’re already seeing, particularly in news, how home pages are increasingly not the first port of call.

Consumers are increasingly looking for the simplicity of using voice, and brands must adapt: finding the best ways to make their ‘skill’ the one used for those searches, or crafting their data so that it becomes the top answer.

This will mean websites will eventually fall into disuse and become redundant. Not so much a fall off a cliff, but a long march into obscurity.



source https://searchenginewatch.com/2018/05/30/where-were-going-we-wont-need-websites/

Four ways Google is making SEO easier

One of the easiest ways to understand SEO’s importance to the marketing mix is to pay attention to what Google says and does. Google is very keen on good SEO because it makes the internet a better place for users. If the internet is a better place for users, then Google can sell more ads.

Here are four things Google has said and done to help marketers improve SEO that you may not be aware of.

Google added an ‘SEO’ audit to its Lighthouse extension

Google is actively giving developers advice on how to improve the sites they work on: its Lighthouse auditing tool now has an SEO component that can analyse any page for basic SEO competency and tell you how to make it better.

This is a nice change for search marketers, who have for a long time made up for Google’s radio silence with research and educated guesswork. Some of the tips offered by the audit extension are fairly obvious and well known (title tag exists, canonicals not broken, etc.), but others give an interesting insight into how Google assesses a page – such as the importance of making sure your text is big enough. Beyond being useful to marketers, it’s interesting to see how many different factors contribute to a positive user experience and correlate with a higher search engine ranking.

Google made significant improvements to Search Console

Search Console – formerly known as Webmaster Tools – helps you understand what’s going on beneath the hood of your website. It’s a comprehensive piece of software that, in its latest beta version, allows you to immediately index blogs and view up to 16 months of data in the search analytics (Performance) report.

For search marketers, this is particularly important; just think back to the days when ‘(not provided)’ became your most common keyword in Google Analytics. Now you have a rich bounty of keyword data again, just waiting to be incorporated into your search strategy.

It’s worth mentioning that Google is taking Search Console seriously: it’s actively asking for suggestions and potential improvements, and even implementing some of them.

Google has revamped its SEO guide

By relaunching its SEO starter guide, Google is offering newbies an easy way to improve the quality of their websites. If you’re reading this, you’re probably a bit beyond starter guides, but it never hurts to brush up on the basics, especially when they’re directly from the horse’s mouth – after all, who knew text size was such a big deal?

It’s a useful primer for anyone looking to brush up on their on-site optimization, and a strong indicator that Google is taking organic search as seriously as ever. With content, for example, it dedicates a whole section to advice on organising topics, understanding readers’ desires, optimising copy, images, and headlines for users (not engines), writing link text, and generally creating blogs and web pages that your target audience actually wants to read.

Google has hired a new public search liaison

Finally, Google’s hiring of a public search liaison suggests not only that organic search is here to stay, but that the company is willing to be more open and transparent about it.

When Matt Cutts – who led Google’s WebSpam team and served as a kind of unofficial liaison between the company and the SEO community – resigned in 2016, search marketing professionals started communicating with Google in a number of different ways. They popped up in Google hangouts with engineers, asked questions in official Google Threads, and turned up to conferences where Google’s employees were present.

Google, in turn, started communicating more with them via the Google Security Blog, the Google Chrome blog, the general Google blog, the Google Webmaster Central Blog, the Google Analytics blog, and the Google Search blog. It then appointed its first public liaison for search in October 2017: Danny Sullivan, a former SEO journalist and analyst.

No doubt he’ll prove a useful resource for the SEO and marketing communities. More important, perhaps, is what Sullivan’s appointment says about Google’s shifting philosophy towards search marketing. If it was once obscure and opaque about organic search, it’s now open and consultative.


Luke Budka is director at integrated marketing agency TopLine Comms.



source https://searchenginewatch.com/2018/05/30/four-ways-google-is-making-seo-easier/

Tuesday 29 May 2018

Five ways to use predictive analytics

The era of graphs and spreadsheets as a way of thinking about analytics is beginning to approach its end. Predictive analytics, along with associated artificial intelligence (AI) and machine learning technologies, are changing the way in which we deal with data. These tools are becoming more accessible, and ‘big data’ thinking is no longer limited to firms with billion dollar budgets.

Predictive analytics provides a glimpse into the future, as well as access to strategic insights that can open up new opportunities. Here are five ways you can put predictive analytics to use, and how you can change the way you think about data.

Qualifying leads

According to Forrester research, predictive analytics has found three main use cases for dealing with leads. Specifically:

  1. Predictive scoring: This method analyzes how leads are responding to your marketing attempts and how likely they are to take action based on that information. In this way, you can more quickly identify which leads to focus more resources on and which to divert resources from.
  2. Identification models: This use case is an approach that focuses on comparing leads to customers who have taken actions in the past. In doing so, you can divert resources to those leads who are most promising based on previous actions they have taken, as well as identify new markets that you weren’t previously aware of.
  3. Personalization: In concert with predicting which leads are most likely to take which actions, the same data can be used to determine which leads respond best to which types of messaging. This advanced form of segmentation can take things deeper than simply splitting leads into groups – instead sending them much more personalized messages.
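As a rough sketch of the predictive scoring idea, the following trains a simple model on past leads and scores new ones. The engagement features and figures are entirely made up for illustration, and it assumes scikit-learn is available:

```python
# Sketch of predictive lead scoring: learn from past leads (features plus
# whether they converted), then score new leads by conversion probability.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: emails_opened, pages_viewed, demo_requested (0/1) - hypothetical
X_train = np.array([
    [0, 1, 0], [1, 2, 0], [5, 8, 1], [7, 12, 1],
    [2, 3, 0], [6, 9, 1], [0, 2, 0], [8, 15, 1],
])
y_train = np.array([0, 0, 1, 1, 0, 1, 0, 1])  # 1 = converted

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

new_leads = np.array([[1, 2, 0], [6, 10, 1]])
scores = model.predict_proba(new_leads)[:, 1]   # probability of converting
print(scores)  # low score for the cold lead, high for the engaged one
```

Leads with high scores get more of your attention and budget; low scorers get fewer resources, exactly as the text describes.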

One prominent example of this was covered in the Harvard Business Review, detailing how a Harley Davidson dealership increased sales leads by 2930% using an AI named Albert.

The AI crunched CRM data to identify characteristics and behaviors of previous buyers, then split them into micro-segments based on those characteristics. For each segment, it tested different combinations of headlines, visuals, and other elements to determine which worked best.

The value of your lead qualification is highly dependent on the value and quantity of your data. No matter how good your statistical models are, their abilities are still very limited without access to the information that they need to learn about your customers.

In the digital space – particularly if you are not using a CRM – the best place to start with predictive analytics will almost certainly be an integration of Google Analytics and Google BigQuery.
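As a loose sketch of what that integration looks like in practice, the following builds a query against the standard GA 360 ‘ga_sessions_*’ export schema; the project and dataset names are hypothetical, and actually running the query requires configured Google Cloud credentials:

```python
# Sketch: pull sessions and transactions per traffic source from the
# Google Analytics 360 BigQuery export. Project/dataset names are made up.
def build_source_query(project, dataset, start, end):
    """Return SQL summarizing sessions and transactions by traffic source."""
    return f"""
    SELECT trafficSource.source AS source,
           COUNT(*) AS sessions,
           SUM(totals.transactions) AS transactions
    FROM `{project}.{dataset}.ga_sessions_*`
    WHERE _TABLE_SUFFIX BETWEEN '{start}' AND '{end}'
    GROUP BY source
    ORDER BY sessions DESC
    """

sql = build_source_query("my-project", "my_ga_dataset", "20180101", "20180131")

# With credentials configured, the google-cloud-bigquery client would run it:
# from google.cloud import bigquery
# df = bigquery.Client().query(sql).to_dataframe()
print(sql)
```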

Modeling customer behavior

While lead qualification and conversion is the most obvious use-case for predictive analytics, and likely the one worth looking into first, it’s far from the only marketing application of this emerging technology. But virtually any use is going to have customer modeling at its core.

You can divide customer modeling into three basic types: cluster models, propensity models, and collaborative filtering.

Cluster models

Clustering is a way of segmenting customers into groups based on many variables. A cluster model looks for correlations between various attributes and identifies a number of equilibria in which certain types of attributes tend to accumulate. What makes clustering special, compared with traditional segmentation, is the sheer number of variables involved. Clusters often use 30 variables or more, far more than would be possible if you were manually segmenting customers, or even if they were manually segmenting themselves.

Clusters come in three forms:

  1. Product clusters: These are clusters of customers who tend to only buy specific types of products, ignoring other things in your catalog
  2. Brand clusters: These customers tend to buy from a specific collection of brands
  3. Behavioral clusters: These are segments of customers with a specific collection of behaviors, such as frequent buyers who place small orders, or customers who prefer the call center over the checkout cart.

What’s important to recognize about these clusters is that they enable predictions about which clusters people belong to – even with limited information. If they buy one product with a specific brand, your brand cluster can predict what other brands they may be interested in, rather than just the more obvious recommendation of simply offering everything else by the same brand.
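A minimal sketch of a behavioral cluster model, using k-means over a few made-up customer attributes (a real model would use far more variables, as noted above; assumes scikit-learn):

```python
# Sketch: behavioral clustering with k-means. Three behavior groups are
# baked into the toy data so the clusters are easy to see.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Columns: orders per year, average order value, support calls per year
customers = np.array([
    [24, 15, 0], [30, 12, 1], [27, 18, 0],    # frequent small-basket buyers
    [2, 480, 0], [3, 520, 1], [1, 610, 0],    # rare big-ticket buyers
    [5, 90, 14], [4, 110, 18], [6, 70, 12],   # call-center-heavy customers
])

X = StandardScaler().fit_transform(customers)  # put variables on one scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)  # each behavior group lands in its own cluster
```

Once fitted, the same model can assign a new customer to a cluster from whatever attributes are known, which is what powers the predictions described above.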

Propensity models

A propensity model is one that makes future predictions about customer behavior based on correlations with other behaviors and attributes. This may be accomplished using regression analysis or machine learning. A good propensity model controls for as many variables as possible so that correlations aren’t confused for causes.

Here are a few examples of propensity models:

  • Propensity to unsubscribe: A model like this allows you to determine the appropriate email frequency, weighing the possibility that a recipient will unsubscribe against any possible positive outcome
  • Propensity to churn: These are customers who are likely to move on if you don’t take action, but who may be high value otherwise
  • Lifetime value: Modeling the lifetime value of a customer can help you make strategic marketing decisions if it leads you to customers with more lifetime value, or to behaviors that extend lifetime value.

Other propensity models include predicting how far through somebody’s lifetime value you are, and how likely they are to convert or buy.
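As an illustrative sketch of the lifetime value case, a simple regression over hypothetical early-behavior features (assumes scikit-learn):

```python
# Sketch: predict customer lifetime value from early behavior via a plain
# linear regression. All figures are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: first-order value, orders in first 90 days
X = np.array([[20, 1], [35, 2], [50, 1], [80, 4], [15, 1], [60, 3]])
y = np.array([60, 180, 150, 520, 45, 380])  # observed lifetime value

model = LinearRegression().fit(X, y)

# Score two new customers: a small one-off buyer vs. an early repeat buyer
predicted_ltv = model.predict(np.array([[18, 1], [70, 4]]))
print(predicted_ltv)  # the early repeat buyer projects to far higher LTV
```

The same pattern – fit on historical outcomes, predict for new customers – applies to the unsubscribe and churn models, just with a classifier instead of a regression.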

Collaborative filtering

If you’ve seen Amazon’s “customers who liked this product also liked…” recommendations, you know what type of model this is. At first glance collaborative filtering might sound similar to product-based cluster models, but it is a bit different. Rather than grouping customers by the types of products they are likely to buy, collaborative filters make recommendations based on aggregate behavior.

In other words, this is less about the user’s product preferences and more about the behaviors that products tend to cause for users.

There are three types of collaborative filters:

  1. Up-sell recommendations. These are recommendations for a higher tier version of a product before the sale is made
  2. Cross-sell recommendations. Also offered before the sale is made, this is a recommendation for a product that is often bought at the same time as the initial one
  3. Follow-up recommendations. These are recommendations for products that people tend to buy a certain time period after buying a prior product, such as replacing a product that runs out, or buying dishes after buying a table.
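A toy sketch of how aggregate co-purchase behavior drives these recommendations, using item-to-item cosine similarity over a made-up purchase matrix (assumes NumPy):

```python
# Sketch: item-based collaborative filtering. Recommend the product whose
# purchase pattern is most similar to the one just bought.
import numpy as np

# Rows = customers, columns = products; 1 = bought
purchases = np.array([
    [1, 1, 0, 0],   # table + dishes
    [1, 1, 1, 0],   # table + dishes + chairs
    [0, 0, 1, 1],   # chairs + lamp
    [1, 1, 0, 1],   # table + dishes + lamp
])
products = ["table", "dishes", "chairs", "lamp"]

def cosine(a, b):
    """Cosine similarity between two purchase-pattern vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# "Customers who bought a table also bought..." = most similar other column
table = purchases[:, 0]
sims = [cosine(table, purchases[:, j]) for j in range(1, 4)]
best = products[1 + int(np.argmax(sims))]
print(best)  # 'dishes' co-occurs with every table purchase
```

Note the recommendation comes purely from what other buyers did, not from any model of the individual customer – the distinction drawn above.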

Connecting the right product to the right market

Working backwards from customer modeling, it’s possible to identify markets for your products that you may not have been aware of. Here are just a few examples of how this use case can play out:

  • Incorporate referral sources into your cluster models. This will allow you to identify which traffic sources correlate with which types of products, brands, or behaviors. From this, you can immediately identify a new market for these products or brands
  • Incorporate referral sources into your lifetime value propensity models. This will allow you to determine which locations to invest more of your marketing resources into, since you roughly know what the ROI will be
  • Look for correlations between traffic sources and success with up-sells, cross-sells, and follow-up recommendations
  • Look for correlations between keywords and your customer models
  • Analyze the attributes that are strong predictors of buying specific types of products and brainstorm other markets that might share those attributes that you have not yet targeted
  • Investigate high performing outliers where limited data is available and investigate whether expanding in those markets is a good option.

Connecting the right users to the right content

There are a number of ways that you can leverage your customer models to connect prospects with content in ways that move you toward your goals, some of them more obvious than others. Here are a few examples:

  • Matching content related to products or brands based on the appropriate clusters
  • Matching users to conversion copy when propensity models predict they are most likely to buy
  • Recommending content to users that improves their propensity scores
  • Recommending content to users that enhances their likelihood of responding well to an up-sell or cross-sell
  • Matching traffic sources to the content that tends to produce high propensity scores for each particular traffic source.

As you can see, the number of approaches you can take here grows pretty quickly. Think strategically about how best to put your models to use.

Discovering strategic marketing insights

While some predictive analytics tools can automatically streamline your marketing process and generate results (like Albert did for Harley Davidson), it’s important to remember that human decisions still play a very important part in the process.

Where predictive analytics and related AI tools often fail is in a propensity to ‘over-fit’ the data. They can get stuck at local maxima and minima, incapable of making the leap to new terrain.

Escaping from traps like these, and making the most of these tools in general, requires you to find strategic insights from within your predictive analytics models.

For example, suppose you discover that a specific piece of content has a tendency to raise your prospects’ propensity scores; any automation you have in place can be applied to customize how your users are marketed to, and push them toward that piece of content. But what predictive analytics can’t tell you is whether there might be other traffic sources you haven’t tried yet that would be a good fit for that content. Using your experience and brainstorming capabilities, you can identify other potential markets for that content, feed them into your model, and see how the exposure changes things.

Your goal in working with these kinds of models must always be to find insights like these and test them to see if the results are as expected. If your model runs on autopilot it will not discover any new opportunities alone.



source https://searchenginewatch.com/2018/05/29/five-ways-to-use-predictive-analytics/

Tools to assist your SEO check-up

Whether you are new to SEO and looking for a litmus test of your website’s health, or have an ongoing campaign that may need a little refreshing, it’s always a good idea to perform regular check-ups or audits.

If you have complex campaigns that are already in full swing this may seem like a lot of effort. It’s a bit like going to the dentist; you may not like it but it’s necessary. Better to identify areas for improvement or success that can be capitalized on than to continue blindly following the original strategy with more limited results.

This article explores some of the tools that are available to you in order to perform an SEO check-up.

Keyword research

With the growing complexity of inbound marketing, voice search, RankBrain and content marketing, the term ‘keyword’ is starting to feel like something of a profanity. In fact, HubSpot is even in the process of removing its keyword tracking function from the platform.

However, there is still significant emphasis placed on high value target keywords by clients and management alike. Furthermore, sound keyword research (or searcher intent-led research) can be incredibly valuable as a foundation for a more comprehensive, conversion-driven campaign.

The queries that searchers use to find your business can change over time so it’s always a good idea to audit existing target keywords to ensure that they are still viable. Google’s Keyword Planner should be your first port of call; after all it provides direct access to search data.

If you understandably don’t want to pin your campaign to certain keywords, focus instead on the solutions and value that you are trying to provide for your clients. This will help you analyse the overarching objectives of your campaign and influence how you then track the successes and areas for improvement. Use your analytics data (more on this later) to see whether the campaign is performing according to your original strategy.

Of course, there are other research tools that you can use including Answer the Public, Keywordtool.io or BuzzSumo.

Indicative metrics

For many, having indicative metrics can provide peace of mind with regard to the incremental improvement of campaigns. While such metrics can be somewhat flawed, they do provide easily digestible statistics to help with a check-up.

The two most popular metrics used by the industry are provided by Moz and Majestic. These figures should be viewed as quick indicative figures and should not be taken as chapter and verse for the health of your SEO campaign.

Moz’s Open Site Explorer

Domain authority (DA) is the most popular of the metrics provided by Moz’s Open Site Explorer, providing a score between 0 and 100. The theory is that according to the factors taken into account by Moz’s analytics, a website with a higher DA is more likely to rank in search. Moz also provides page authority, which is useful for landing or category pages on your website.

Majestic

In the same vein as Moz, Majestic provides two main metrics: trust flow and citation flow. These metrics are heavily based on the quantity and quality of linking domains and are potentially more useful for a check-up owing to the specific nature of the metrics. Majestic also provides a deeper breakdown of link factors, allowing users to deep dive into the health of their website’s backlink portfolio.

Pingdom, GT Metrix and PageSpeed Insights

With the roll out of Google’s mobile-first index, load speed has never been more important. A slow-loading website has a dual impact: lower rankings and lower conversion rates, both underpinned by a poor user experience.

Any SEO check-up or audit should evaluate a website’s load speed. There are a number of tools available, all with their pros and cons. Google’s PageSpeed Insights is much like the Keyword Planner: it’s run by Google, so pretty hard to ignore. However, the advice provided is reasonably generic. Use it in conjunction with other tools such as Pingdom’s Website Speed Test and GT Metrix to really home in on the load speed issues faced by your site and get it loading quickly on both desktop and mobile.

W3C Validator

Websites are like cars: the more you use them, the more maintenance they need. Over time a website is likely to develop errors in the code, which will have an impact on how your website is viewed by Google, especially if they then affect the aforementioned load speed. Use W3C’s Markup Validation Service to highlight errors in the code for your development team to fix.

Consumer research

The tools mentioned thus far give an overarching view of your website’s SEO health, allowing you to start to form, or reassess, the foundations of a campaign. However, it’s always possible to dig a little deeper and draw user-based insights into the current performance of a website. These insights help you provide higher value to the user (increasing your chances of being returned in the SERPs) and have a positive impact on your conversion rate.

User and usage data is essential for any successful, agile campaign. It provides the data necessary to see if your original strategy is paying dividends, or whether you need to start shaking things up!

Google Analytics and Search Console

If you haven’t already set up detailed conversion tracking on your site, please do so now. Conversions often dictate a campaign’s success, so make sure that you can correctly attribute them.

Using Google Analytics for data that can positively influence an SEO campaign could fill a whole series of articles on its own. However, here are a few quick wins for you.

Content analysis

Chances are that you are investing heavily in your own content creation, which is great. The pitfall is that you create one piece of content and then move on to create a completely fresh piece. Use your check-up/audit as an opportunity to refresh existing content by:

  • Identifying successful pieces. Can they be improved or updated? Do they have impressive user metrics but are not delivering conversions? Is there a pattern appearing with successful articles that can influence ongoing content creation?
  • Improving underperforming posts. Can you spot posts that are failing to rank in the SERPs? They may need to be reviewed for a more focussed value proposition, or maybe your onsite optimization is lacking. These present real opportunities to make the most of time that you have already invested into content, effectively retrofitting to ensure performance.

User flow

Keyword rankings are but the tip of the iceberg when it comes to measuring SEO success and they are by no means the most pertinent data. Conversions are the real gold at the end of the rainbow and without a clear user flow you can severely inhibit your conversion rate.

The user flow function in Google Analytics shows you entry and exit pages, as well as the main flow from page to page. Geek out even further with tools like Hotjar, but be warned – you can waste a lot of time watching endless videos of your users’ sessions.

Review searches

Google Search Console is invaluable for making sure that all of your onsite optimization, content creation and link building is actually delivering the right type of traffic to your site. Use the platform (you can also sync and view data via Google Analytics) to make sure that the types of searches people are using to find your website are relevant. Another tip is to find those search terms for which you are gaining lots of impressions, but have a low CTR – this can help you refine how your pages actually appear in the SERPs.
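If you export that Search Console query data to CSV, a few lines of Python can surface the high-impression, low-CTR terms automatically. This is only a sketch: the column names, thresholds and sample queries below are assumptions for illustration, not the field names of a real export.

```python
import csv
from io import StringIO

def low_ctr_opportunities(rows, min_impressions=1000, max_ctr=0.02):
    """Return queries with plenty of impressions but a weak click-through rate.

    `rows` is an iterable of dicts with 'query', 'impressions' and 'ctr'
    keys -- the column names in a real Search Console export may differ.
    """
    return [
        r["query"]
        for r in rows
        if int(r["impressions"]) >= min_impressions and float(r["ctr"]) <= max_ctr
    ]

# Toy data standing in for an exported report
sample = csv.DictReader(StringIO(
    "query,impressions,ctr\n"
    "red dresses,5000,0.01\n"
    "size 12 red dress,800,0.08\n"
    "blue dresses,3000,0.05\n"
))
print(low_ctr_opportunities(sample))  # ['red dresses']
```

Queries that surface here are the ones whose titles and meta descriptions deserve a rewrite first, since the rankings are already there.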

Conclusions

You need only do a brief search to see that there are multiple tools for just about every aspect of SEO, although hopefully the ones listed above will get you off to a healthy start in your review. You may already be a premium subscriber to a platform such as Ahrefs or SEMrush, in which case we would advise exploring the functionality offered by these providers. For example, the site audit feature on SEMrush is particularly useful.

Use these tools to provide indicators of success or areas for improvement. Don’t be afraid to adjust your strategy according to the findings of your check-up – you may well spot opportunities in the market that your competitors have not.


Read more about performing a technical SEO audit in Marcela De Vivo’s most recent column.





source https://searchenginewatch.com/2018/05/29/tools-to-assist-your-seo-check-up/

Monday 28 May 2018

How to expand marketing reach in the slow season, part 3: Yahoo Gemini

When most marketers think of Yahoo, they think of low volume and little impact on overall scale. However, Yahoo Gemini is not just search; it also taps into native advertising.

Yahoo is one of the top sites for premium content and generates over 1 billion monthly visitors. Gemini primarily lies in-feed alongside Yahoo’s owned content; it also gives your ads access to Yahoo’s top syndication partners such as Hearst and Vox. Overall, it provides great access to extensive audiences.

Yahoo Gemini ad types

Yahoo Mail Ads

These ads reside in Yahoo Mail and come in a variety of different formats.

Pencil ads: Native ads that appear within a user’s mailbox.

Sponsored Mail ads: Native ads that appear in your mailbox with a ‘learn more’ button that opens up to a more detailed ad.

Video Mail ads: Video ads located on the side of the mailbox.

Dynamic Product ads: Dynamic retargeting ads that are personalized to each user’s shopping experience and show up in the feed (note: if you are an ecommerce company, you will definitely want to leverage these ads).

Static ads: This is the simplest and most straightforward ad format: static ads that appear right within the native feed.

Carousel ads: These vary between desktop and mobile in format. Desktop allows advertisers to show a more premium format for their ads, while mobile allows advertisers to use up to five images to tell a visual story.

Tips for maximizing Yahoo Gemini

When launching on native, start with carousel ads as your preferred ad type. Carousel ads tend to perform better with regard to direct response because they are only served on Yahoo properties with premium placements.

Break out desktop versus mobile campaigns. You can do this by creating separate desktop campaigns in which you reduce mobile bids by the maximum amount. Having separate desktop and mobile campaigns gives you greater control over each campaign’s performance and budget.

Similarly, break out your Yahoo Mail campaigns. We have found that whenever Mail campaigns are combined with general native campaigns, the majority of spend will go to Mail because of its vast impression volume. Again, segmenting campaign types allows better control over budget allocation and overall performance optimization.

If you have solid competitors in the space, you should run a Mail domain-targeting campaign in which you target users who receive ads from your competition.

Summary

Yahoo Gemini can be an excellent source of reach, new users, and even direct response with its retargeting options. It’s worth investing your time in as you hit your slow season and are looking for incremental volume.


Read more in this series:

Part 1: How to expand marketing reach in the slow season with Quora

Part 2: How to expand marketing reach in the slow season with Amazon Marketing Services



source https://searchenginewatch.com/2018/05/28/how-to-expand-marketing-reach-in-the-slow-season-part-3-yahoo-gemini/

Friday 25 May 2018

The 12 most important elements of a technical SEO audit

Contrary to popular belief, technical SEO isn’t too challenging once you get the basics down; you may even be using a few of these tactics and not know it.

However, it is important to know that your site probably has some type of technical issue. “There are no perfect websites without any room for improvement,” Elena Terenteva of SEMrush explained. “Hundreds and even thousands of issues might appear on your website.”

For example, over 80% of websites examined had 4xx broken link errors, according to a 2017 SEMrush study, and more than 65% of sites had duplicate content.

Ultimately, you want your website to rank better, get better traffic, and net more conversions. Technical SEO is all about fixing errors to make that happen. Here are 12 technical SEO elements to check for maximum site optimization.

1. Identify crawl errors with a crawl report

One of the first things to do is run a crawl report for your site. A crawl report, or site audit, will provide insight into some of your site’s errors.


You will see your most pressing technical SEO issues, such as duplicate content, low page speed, or missing H1/H2 tags.

You can automate site audits using a variety of tools and work through the list of errors or warnings created by the crawl. This is a task you should work through on a monthly basis to keep your site clean of errors and as optimized as possible.

2. Check HTTPS status codes

Switching to HTTPS is a must, and the migration must be handled properly: if stray HTTP URLs remain after the switch, search engines and users can be served 4xx and 5xx status codes instead of your content.

A Ranking Factors Study conducted by SEMrush found that HTTPS is now a very strong ranking factor and can impact your site’s rankings.


Make sure you switch over, and when you do, use this checklist to ensure a seamless migration.

Next, you need to look for other status code errors. Your site crawl report gives you a list of URL errors, including 404 errors. You can also get a list from the Google Search Console, which includes a detailed breakdown of potential errors. Make sure your Google Search Console error list is always empty, and that you fix errors as soon as they arise.

Finally, make sure the SSL certificate is correct. You can use SEMrush’s site audit tool to get a report.


3. Check XML sitemap status

The XML sitemap serves as a map for Google and other search engine crawlers. It essentially helps the crawlers find your website pages, thus ranking them accordingly.

You should ensure your site’s XML sitemap meets a few key guidelines:

  • Make sure your sitemap is formatted properly in an XML document
  • Ensure it follows XML sitemap protocol
  • Have all updated pages of your site in the sitemap
  • Submit the sitemap to Google Search Console

How do you submit your XML Sitemap to Google?

You can submit your XML sitemap to Google via the Google Search Console Sitemaps tool. You can also insert the sitemap (i.e. http://example.com/sitemap_location.xml) anywhere in your robots.txt file.

Make sure your XML Sitemap is pristine, with all the URLs returning 200 status codes and proper canonicals. You do not want to waste valuable crawl budget on duplicate or broken pages.
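As a rough illustration, a short Python script can pull every URL out of a sitemap and check the status code each one returns. This is a hedged sketch using only the standard library: the sample sitemap and URLs are invented, and a dedicated audit tool will do all of this more robustly.

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> entry from an XML sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def check_status(url):
    """Fetch a URL and return its HTTP status code (200 is what we want)."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

# A minimal, correctly formatted sitemap for illustration
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/about</loc></url>
</urlset>"""

print(sitemap_urls(sample))  # ['http://example.com/', 'http://example.com/about']
```

Running `check_status` over the extracted list and flagging anything that is not a 200 gives you a quick list of sitemap entries wasting crawl budget.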

4. Check site load time

Your site’s load time is another important technical SEO metric to check. According to the technical SEO error report via SEMrush, over 23% of sites have slow page load times.

Site speed is all about user experience and can affect other key metrics that search engines use for ranking, such as bounce rate and time on page.

To find your site’s load time you can use Google’s PageSpeed Insights tool. Simply enter your site URL and let Google do the rest.

You’ll even get site load time metrics for mobile.

This has become increasingly important after Google’s roll out of mobile-first indexing. Ideally, your page load time should be less than 3 seconds. If it is more for either mobile or desktop, it is time to start tweaking elements of your site to decrease site load time for better rankings.

5. Ensure your site is mobile-friendly

Your site must be mobile-friendly to improve technical SEO and search engine rankings. This is a pretty easy SEO element to check using Google’s Mobile-Friendly Test: just enter your site and get valuable insights on the mobile state of your website.

You can even submit your results to Google to let them know how your site performs.

A few mobile-friendly solutions include responsive web design, dynamic serving, and separate mobile URLs (such as an m. subdomain).

6. Audit for keyword cannibalization

Keyword cannibalization can cause confusion among search engines. For example, if you have two pages in keyword competition, Google will need to decide which page is best.

“Consequently, each page has a lower CTR, diminished authority, and lower conversion rates than one consolidated page will have,” Aleh Barysevich of Search Engine Journal explained.

One of the most common keyword cannibalization pitfalls is optimizing the home page and a subpage for the same keywords, which is common in local SEO. Use Google Search Console’s Performance report to look for pages that are competing for the same keywords. Use the filter to see which pages have the same keywords in the URL, or search by keyword to see how many pages are ranking for those same keywords.
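Once you have keyword and page data exported from the Performance report, flagging cannibalization candidates is a simple grouping exercise. A minimal sketch, where the keywords and page paths are invented for illustration:

```python
from collections import defaultdict

def cannibalized_keywords(rows):
    """Map each keyword to the set of ranking pages and keep only the
    keywords that more than one page is competing for.

    `rows` is an iterable of (keyword, page) pairs, e.g. pulled from a
    Search Console performance export.
    """
    pages_by_keyword = defaultdict(set)
    for keyword, page in rows:
        pages_by_keyword[keyword].add(page)
    return {kw: pages for kw, pages in pages_by_keyword.items() if len(pages) > 1}

rows = [
    ("emergency electrician", "/"),
    ("emergency electrician", "/services/emergency"),
    ("rewiring", "/services/rewiring"),
]
print(cannibalized_keywords(rows))  # only 'emergency electrician' is flagged
```

Any keyword in the output is a consolidation candidate: decide which page should own it and fold the others in.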


In this example, notice that there are many pages on the same site with the same exact keyword. It might be ideal to consolidate a few of these pages, where possible, to avoid keyword cannibalization.

7. Check your site’s robots.txt file

If you notice that all of your pages aren’t indexed, the first place to look is your robots.txt file.


There are occasions when site owners accidentally block pages from search engine crawling. This makes auditing your robots.txt file a must.

When examining your robots.txt file, you should look for “Disallow: /”

This tells search engines not to crawl a page on your site, or maybe even your entire website.  Make sure none of your relevant pages are being accidentally disallowed in your robots.txt file.
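Rather than eyeballing the file, you can test specific URLs against your robots.txt rules programmatically. Python’s standard library ships a robots.txt parser, so a quick sketch might look like this (the example rules and URLs are hypothetical):

```python
from urllib.robotparser import RobotFileParser

def blocked_urls(robots_lines, urls, agent="*"):
    """Return the URLs that the given robots.txt rules disallow for `agent`."""
    parser = RobotFileParser()
    parser.parse(robots_lines)
    return [u for u in urls if not parser.can_fetch(agent, u)]

robots = [
    "User-agent: *",
    "Disallow: /checkout/",
]
pages = [
    "http://example.com/products/red-dress",
    "http://example.com/checkout/basket",
]
print(blocked_urls(robots, pages))  # ['http://example.com/checkout/basket']
```

Feed it the URLs of your key landing pages; anything that comes back in the blocked list deserves an immediate look at the Disallow rules.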

8. Perform a Google site search

On the topic of search engine indexing, there is an easy way to check how well Google is indexing your website. In Google search type in “site:yourwebsite.com”:

It will show you all pages indexed by Google, which you can use as a reference. A word of caution, however: if your site is not on the top of the list, you may have a Google penalty on your hands, or you’re blocking your site from being indexed.

9. Check for duplicate metadata

This technical SEO faux pas is very common for ecommerce sites and large sites with hundreds to thousands of pages. In fact, nearly 54% of websites have duplicate metadata (most often duplicate meta descriptions), and approximately 63% have missing meta descriptions altogether.

Duplicate meta descriptions occur when similar products or pages simply have content copied and pasted into the meta descriptions field.

A detailed SEO audit or a crawl report will alert you to meta description issues. It may take some time to get unique descriptions in place, but it is worth it.

10. Meta description length

While you are checking all your meta descriptions for duplicate content errors, you can also optimize them by ensuring they are the correct length. This is not a major ranking factor, but it is a technical SEO tactic that can improve your CTR in SERPs.

Recent changes to meta description length increased the 160 character count to 320 characters. This gives you plenty of space to add keywords, product specs, location (for local SEO), and other key elements.
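Checking description length lends itself to automation. The sketch below uses Python’s built-in HTML parser to pull out meta descriptions and flag any over the limit; the sample page is invented, and in practice a crawler would fetch the pages for you:

```python
from html.parser import HTMLParser

class MetaDescriptionFinder(HTMLParser):
    """Collect the content of <meta name="description"> tags."""
    def __init__(self):
        super().__init__()
        self.descriptions = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.descriptions.append(attrs.get("content", ""))

def description_length_ok(html, limit=320):
    """Return (description, within_limit) pairs for a page's HTML."""
    finder = MetaDescriptionFinder()
    finder.feed(html)
    return [(d, len(d) <= limit) for d in finder.descriptions]

page = '<html><head><meta name="description" content="Hand-made size 12 red dresses."></head></html>'
print(description_length_ok(page))  # [('Hand-made size 12 red dresses.', True)]
```

Swap the 320-character limit for whatever length Google is truncating at when you run your audit, since that number has changed before and may change again.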

11. Check for site-wide duplicate content

Duplicate content in meta-descriptions is not the only duplicate content you need to be on the lookout for when it comes to technical SEO. Almost 66% of websites have duplicate content issues.

Copyscape is a great tool to find duplicate content on the internet. You can also use Screaming Frog, Site Bulb or SEMrush to identify duplication.

Once you have your list, it is simply a matter of running through the pages and changing the content to avoid duplication.
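For exact duplicates, a simple fingerprinting pass can do a first sweep before you reach for those tools. The sketch below hashes normalized page text and groups identical pages; the URLs and copy are invented, and it only catches word-for-word copies, so near-duplicates still need a dedicated tool:

```python
import hashlib
from collections import defaultdict

def duplicate_groups(pages):
    """Group page URLs whose normalized body text is identical.

    `pages` maps URL -> extracted main text; hashing keeps memory use
    small on large sites. Near-duplicates need fuzzier techniques
    (e.g. shingling), which dedicated tools provide.
    """
    by_hash = defaultdict(list)
    for url, text in pages.items():
        normalized = " ".join(text.lower().split())
        fingerprint = hashlib.sha256(normalized.encode("utf-8")).hexdigest()
        by_hash[fingerprint].append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]

pages = {
    "/red-dress": "Size 12 red dress, free delivery.",
    "/dresses/red": "Size 12  red dress, free delivery.",
    "/blue-dress": "Size 10 blue dress, free delivery.",
}
print(duplicate_groups(pages))  # [['/red-dress', '/dresses/red']]
```

Each group in the output is a set of pages to rewrite, consolidate or canonicalize.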

12. Check for broken links

Any type of broken link is bad for your SEO; it can waste crawl budget, create a bad user experience, and lead to lower rankings. This makes identifying and fixing broken links on your website important.

One way in which to find broken links is to check your crawl report. This will give you a detailed view of each URL that has broken links.

You can also use DrLinkCheck.com to look for broken links. You simply enter your site’s URL and wait for the report to be generated.
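A DIY first pass is also possible: collect the links from a page, then request each one and record its status code. A minimal standard-library sketch (the sample HTML is invented; run the status checks against real URLs sparingly to respect crawl budget):

```python
from html.parser import HTMLParser
import urllib.error
import urllib.request

class LinkCollector(HTMLParser):
    """Gather every href from the anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def extract_links(html):
    """Return all anchor hrefs found in an HTML string."""
    collector = LinkCollector()
    collector.feed(html)
    return collector.links

def link_status(url):
    """Return the HTTP status for a URL; 404, 410 etc. mark a broken link."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

print(extract_links('<a href="/about">About</a> <a href="/missing">Old page</a>'))
```

Pair `extract_links` on your key pages with `link_status` on each href and anything returning a 4xx goes straight on the fix list.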

Summary

There are a number of technical SEO elements you can check during your next SEO audit. From XML Sitemaps to duplicate content, being proactive about optimization on-page and off is a must.



source https://searchenginewatch.com/2018/05/25/the-12-most-important-elements-of-a-technical-seo-audit/

Thursday 24 May 2018

From being too broad to being too lazy: three common PPC fails 

PPC and search marketing are both vital to a company’s success. So, it’s amazing to see the mistakes that so many brands still make today. AdWords has added tools like upgraded URLs to make it a little easier to manage campaigns.

But glaring errors still happen – and frequently. While these mistakes can seem small – especially if a brand has a big SEM budget – each one can have a significant impact on an advertiser’s reputation and ROI. Here are a few of the most common PPC mistakes search marketers make, and some methods to address them.

Your search terms are too high-level

A common mistake for many first-time (and even experienced) search advertisers is that they start out too broadly. For example, if you’re an electrician in Boston starting out on AdWords for the first time, you don’t want to go in big on the main terms such as [electrician] and rely on city geo-targeting. Instead, be selective about your target keywords and build campaigns around specific terms such as [Electrician Arlington] or [24 callout Brookline Electrician].

The same rule applies to different verticals, including for example retail. It can be costly to start driving traffic on the term [dresses] if you’re a retailer. However, terms like [size 12 red dress] would have a higher propensity to convert. Start with these terms, then start adding more terms that could be higher up in the funnel for more awareness.

This process will instill more discipline in how you measure the individual ROI of your range of keywords and bring scale when running on AdWords.

If you’re about to make the leap into broad, expensive generics, then why not target these keywords with RLSA only, limiting your ads to users already on your retargeting lists. It’s a more conservative step than going full throttle to make an impact in that auction.

Lazy ad management

Lazy ad copy is a big no-no in paid search. And using the same copy for all sponsored listings should be banned. Tailored ad copy offers the best way to get clicks and conversions, boosting ROI and generally making a much bigger impact than if a brand used the same ad copy for every keyword they were targeting. 

Brands should always add context to their ad copy, and changing the wording for specific ads allows them to do that. For example, if a cruise line offers ads for an all-inclusive trip, it also needs to add something tailored to the copy, like the customer perks, to make users want to click. For trips that aren’t all-inclusive and are designed for families with children, the cruise line should change the copy to appeal to those looking for the best deals for kids or family entertainment.

Speaking of lazy ad management, in these two ads, the brands somehow forgot to change the auto prompt in their ad setup. This means they’re not only targeting the string “add your keywords here,” but they’ve also set the ad to autofill the headline based on the keywords. This results in a silly ad that’s unlikely to get any clicks.

Now, this could be a simple oversight, since the prompt text sometimes fails to disappear when you start typing your keywords into the box. However, marketers need to check their targeted keywords on an ongoing basis to achieve truly successful PPC management.

In addition, as Google continues to push for more hands-off automation in the AdWords workflow — through features like Dynamic Search Ads — it’s important to keep a close eye on what’s going on. These suggestions may help you automate, but they might not be the best fits to meet campaign goals and could actually hurt your standing if they don’t take the actions of your competitors into account. So advertisers, stay vigilant.

Not mastering your seasonal spikes in demand

If you don’t know your seasonal trends inside out, then there’s a very good chance you’ll be left behind in auctions and miss the spike in demand.

AdWords and analytics allow you to get into the weeds as far as the time of day or day of the week that’s driving impressions. You must be ready to react to these trends, but still within your target margin for a good ROI.

For some big brands, plans around mastering Black Friday peak periods start around three months before the event. A great deal of planning goes into the price, product and type of promotion for this XX day period. As auctions become increasingly competitive, it’s vital that you have a strategy to win, too.

To get the most out of your seasonal spikes, you need to master all match types, segment your RLSA lists, increase bid modifiers by device and day/part bidding where possible and opt into DSA to fill in any gaps you may have missed.

The common theme here is a lack of attention. It’s important that PPC advertisers always monitor campaigns to ensure that they don’t make the same mistakes seen here. This means a more thoughtful strategy, with the right safeguards in place on the back-end – protections within AdWords, third-party technology that monitors for errors and mistakes, and beyond.




source https://searchenginewatch.com/2018/05/24/from-being-too-broad-to-being-too-lazy-three-common-ppc-fails/

An introduction to innovation in consumer search optimization

It may be an obvious statement, but over the last 15 years the internet has completely transformed the retail industry. The once flourishing high street is declining, as more and more consumers are swapping the shopping experience for the convenience and added choice that online retail offers. Mobile technology means more people are purchasing products via apps while on the go, rather than popping out on a Saturday morning to browse the aisles and rails in-store.

The digital retail space has seen a huge number of disruptive innovations over the years, from artificial intelligence (AI) offering tailored recommendations to smart chatbots transforming and streamlining customer service and the new additions of drone deliveries and augmented reality – it’s an exciting time. Despite these leaps, most search technologies still used by modern retailers and brands are lagging behind. It’s these search engines that are the next element of the retail sphere set to have an innovation makeover.

Search engines today

When it comes to current search engines there are some big issues with accuracy. A Spoon Guru scan of the food search landscape revealed that the majority of leading supermarkets around the globe – including Walmart in America, Sainsbury’s in the UK and Carrefour in France – fail to return accurate search results for common dietary search terms, such as gluten free, low sugar or vegan. These needs seem simple, so imagine the difficulty of finding the right food if you have more complex requirements or multiple preferences.

Similarly, Google – the biggest search engine on the planet – came up short when looking for specific requirements. An analysis of the first page of results on a Google shopping search for vegan sausages found that a staggering 19 of the 40 products were not vegan. In fact, two even contained pork.


So, what do you cook when you have a vegan, coeliac and nut allergy sufferer coming over for dinner? It isn’t the start of a bad joke as I’m sure you suspect, but a real-life scenario.

The grocery market (the industry in which Spoon Guru’s technology currently sits) is of course an area where there should be no margin for error, as for those with serious allergies and intolerances the consequences of a mistake can be fatal. However, no matter what the search enquiry – from homeware to sports equipment and clothing – these days consumers deserve, and ultimately should be able to get correct results that perfectly match their personal requirements.

The next generation of search technology

Despite a wealth of available products, people are still finding it challenging to find what they need owing to unstructured metadata and unconnected databases. Being innovation-led is crucial for retailers and brands who want to survive the digital boom, and responsive changes are required to match the shift in consumer expectations. Consumers want a personalized, seamless and consistent experience. So how do you optimize search capabilities to match this?

A combination of AI, machine learning and human expertise powers the next generation of search technology: Consumer Search Optimization (CSO). The secret to the CSO system is tagging and natural language processing. Natural language is the biggest problem facing machine learning, as it is presented with imperfect data owing to the ambiguities and variables within it.

Spoon Guru’s TAGS technology breaks down this language and translates it into labels that can then be assigned and organized. Over 24 hours, the system analyses 14 billion data tags, 2.5 million statements and over 16 million words, classifying over 180 preferences within the grocery retail space. CSO literally crunches billions of data points every day, opening up the market to match more products than ever before to specific consumer requirements.
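To make the idea of tagging concrete, here is a deliberately toy, rule-based sketch of dietary tagging. It bears no resemblance to Spoon Guru’s actual system – the tag names and exclusion lists below are invented, and the real technology layers machine learning and expert review on top – but it shows the basic shape of mapping product data to dietary labels:

```python
# Toy exclusion rules: a tag applies only if none of its banned
# ingredients appear in the product (invented lists, for illustration).
EXCLUSIONS = {
    "vegan": {"pork", "beef", "milk", "egg", "honey"},
    "gluten free": {"wheat", "barley", "rye"},
}

def dietary_tags(ingredients):
    """Assign every dietary tag whose excluded ingredients are all absent."""
    present = {i.lower() for i in ingredients}
    return sorted(tag for tag, banned in EXCLUSIONS.items() if not (present & banned))

print(dietary_tags(["soy protein", "water", "salt"]))  # ['gluten free', 'vegan']
print(dietary_tags(["pork", "wheat", "salt"]))         # []
```

The hard part, which this sketch skips entirely, is the natural language processing needed to extract clean ingredient lists from messy, inconsistent product data in the first place.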

Another important part of CSO is the human-in-the-loop system. Incorporating expertise – in Spoon Guru’s case, nutritional expertise – with the algorithms means that any inconsistencies, conflicts or erroneous classifications can be resolved. It also means that the latest scientific knowledge continues to be integrated into the technology’s DNA.

Tesco have already adopted the TAGS technology on their digital shopping platforms, via desktop and the app, helping shoppers find more products to match dietary needs, from the simple to the very complex.

CSO not only provides a better service for users and consumers, but by appearing in specific online searches, it will help boost brands’ profiles by providing further visibility, as well as becoming a core revenue driver.

Future-gazing

Currently, TAGS technology and CSO are transforming the grocery industry, as it is an area where specific requirements are becoming more of a necessity to the consumer – 64% of the world’s population are now on some kind of exclusion diet. With Tesco, one of the UK’s largest retailers, on board, we can expect the technology to become a set standard across the industry.

Eventually this technology can be expanded and modified to work across different sectors, from entertainment and fashion to sports, hospitality, events and pets. The possibilities of this transformative technology are pretty wide.

By leveraging smart technology (and smart people) we can cater for the modern multi-preference consumer, providing much more accuracy, relevance and choice.



source https://searchenginewatch.com/2018/05/24/an-introduction-to-innovation-in-consumer-search-optimization/

Wednesday 23 May 2018

When to just say “no” to bidding on brand

Brand bidding is a hotly debated topic in paid media. That said, a cursory glance will reveal that the typical advice given by PPC experts is to bid on it, without exception. But the question is whether this is good advice or just self-serving. Let’s look at each of the main arguments for brand bidding in more detail and see if the answer is a bit more nuanced than black and white.

 

  • Protect your brand

If you have competitors appearing for brand terms, either in paid or organic results, then yes, you need to consider how you protect this traffic stream, and PPC brand bidding is an obvious option, along with improving your SEO rankings. On the flip side, if you have limited or no competition, it really is worth considering pausing branded PPC.

Logic would dictate that if a user is searching specifically for your brand, then they want to visit your site or engage with content. If your SERPs don’t contain competitors and only links associated with properties you own, then the benefits of bidding on brand terms become dubious.

Even if you are convinced that you will lose some brand traffic in this scenario, you need to ask yourself if the incremental value that PPC provides is worth it. Let’s say you lose 5% of brand traffic by turning off PPC brand. What that’s telling you is that 95% of your PPC brand traffic would have reached you anyway.

In other words, because you would have to appear on 100% of those impressions to protect your brand losses, a £10,000 brand spend which would usually show a £100,000 return would actually have an incremental return of just £5,000. When viewed this way, your ROI goes from 9.00 to -0.5.
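The arithmetic behind that claim is worth making explicit. A quick sketch using the article’s own numbers (£10,000 spend, £100,000 attributed return, 95% of traffic retained without ads):

```python
def roi(returned, spend):
    """Classic ROI: profit relative to spend."""
    return (returned - spend) / spend

spend = 10_000       # brand PPC budget (pounds)
returned = 100_000   # revenue attributed to brand PPC (pounds)

# If 95% of that revenue would arrive anyway via organic brand listings,
# only 5% of it is truly incremental to the paid spend.
incremental_return = returned * 5 // 100   # 5,000

naive_roi = roi(returned, spend)
incremental_roi = roi(incremental_return, spend)
print(naive_roi, incremental_roi)  # 9.0 -0.5
```

In other words, the £10,000 buys only £5,000 of revenue you wouldn’t otherwise have had, so the spend destroys value on an incremental basis.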

 

  • Dominate the SERPs

Dominating the SERPs is a desirable goal for a lot of advertisers. However, this argument falters if you are already dominating the SERPs without a paid ad hogging the top spot. If you don’t have paid competitors bidding on your brand terms and you have good organic rankings for your owned properties, you already dominate the SERPs and are just paying to push all your other listings down the page.

If you do have some organic competitors creeping onto your first page results, you should look to manage your other properties outside of your main site. This is a good way to ensure that you capture as many organic listings as possible and push any SEO competitors off the first page longer term. Ensure that your social media sites, Wikipedia page, and local listing are optimized and appearing for brand terms.

 

  • Control your messaging

It’s often argued that PPC is better at controlling messaging and landing page choice. While it is indeed more agile at both, you still have control from an SEO point of view. SEOs have been optimizing listing copy in meta descriptions since SEO began, and any half-decent SEO team will have categorized your pages and actively optimized your brand terms to land on the most appropriate pages.

Potentially if you have a sale running or have launched a new range, you may want to quickly reflect this in your copy. But again, if you have no competitors, it’s logical that you will get the click anyway and you can use your landing page to convey any important messaging.

 

  • Your brand terms are cheap

Brand terms often are cheaper than non-brand terms, but unless bidding on them adds benefit, it is just an additional and unnecessary cost. You could be using that spend on other activity that drives new customers and grows your business.

 

  • Capture high-quality traffic near the point of conversion

PPC managers and teams love to bid on brands because it converts well, makes reports look great and can mask poor performing activity.

This obviously isn’t a good enough reason on its own and only applies if you must defend from losing conversions against competitors. Otherwise, you are just taking credit for conversions you already would have received.

 

  • Brand terms improve overall account quality score

The premise here is that brand bidding will improve overall quality score and therefore decrease cost per click on other terms (at least in the early stages when you launch new keywords and they are yet to establish a QS). However, Google has never admitted that account-level quality score exists. They have chopped and changed QS measurement over the years: all new keywords used to launch with a QS of 6; now they start at zero and build as they accrue data. According to Google, keywords are assessed on their own merit and gain a quality score once they build data. PPC experts have asserted that account-level QS exists, but we have little hard evidence to support that claim, so the supposed benefit is wishy-washy at best.

 

My view is that brand bidding should be avoided if possible. A PPC manager adds value by growing prospecting activity, not by piggybacking off the success of a brand name.

If you can turn off brand, break your brand terms into different categories and assess the need to bid on each independently. A core brand term ([brand x]) may have no competition, but a brand + product term ([brand x shoes]) is more likely to attract competition, as other advertisers broad match on the product term (rather than directly bidding on your brand). Also, add negatives as well as pausing keywords, and continue to monitor the situation in case competitors appear.
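To make that categorization concrete, here is a minimal sketch in Python; `BRAND` and the sample keywords are hypothetical placeholders, not real account data:

```python
# Hypothetical sketch: split a raw keyword list into "core brand" and
# "brand + product" buckets so each can be assessed (and paused) independently.
BRAND = "brand x"

def categorize_brand_terms(keywords):
    """Return (core, brand_product) lists from a raw keyword list."""
    core, brand_product = [], []
    for kw in keywords:
        normalized = kw.lower().strip()
        if normalized == BRAND:
            core.append(kw)
        elif BRAND in normalized:
            # Brand plus a modifier, e.g. "brand x shoes": more likely to
            # attract competitor broad-match bids, so review separately.
            brand_product.append(kw)
    return core, brand_product

core, brand_product = categorize_brand_terms(
    ["brand x", "Brand X shoes", "brand x sale", "running shoes"]
)
# core == ["brand x"]; brand_product == ["Brand X shoes", "brand x sale"]
```

Generic non-brand terms (here, "running shoes") fall into neither bucket, which is the point: only the brand buckets need the independent pause-or-bid assessment.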

If you must bid on brand, then task your PPC manager with making it work for you as efficiently as possible, getting that traffic cheaply. If it’s within your power, take steps to remove the need to bid on brand: boost the rankings of other owned properties, like your Facebook and Wikipedia pages, and reach out to resellers and even competitors about brand bidding; you may be able to reach an agreement to stop bidding on each other’s terms.

Do remember to assess your situation on a case-by-case basis and decide whether brand bidding is right for you.

 



source https://searchenginewatch.com/2018/05/23/when-to-just-say-no-to-bidding-on-brand/

Tuesday 22 May 2018

A review of the payday loans algorithm in 2018

For several years, the search term ‘payday loans’ has regularly attracted more than 200,000 searches per month on Google.co.uk. Whether providing loans or generating leads, the payday loans industry has notoriously been big business and at its peak, was estimated to be worth around £2 billion per year.

Because of this, the top positions on Google’s SERPs for ‘payday loans’ have been hugely lucrative and sought-after, and were subsequently dominated by SEO professionals using massive manipulation to hack their way to the top of the search results.

Until 2013, page one for payday loans barely listed a real payday loan company. Instead, the listings were made up of ‘hacked sites’ including bicycle sales sites, women’s magazines and, frankly, random domain names that, once clicked, redirected to a dubious data capture form.

 

 

Introducing the payday loans algorithm

With customer data at risk and a mountain of complaints from UK consumers (and similar results in the US), Google reacted and introduced an official “payday loans algorithm” in June 2013. For the search giant to single out a particular search term – giving it its own algorithm and focusing on a micro-industry across the pond – was certainly out of the ordinary, and we are yet to see any other industry treated the same way.

The first payday loan update was rolled out over roughly a two-month period beginning in June 2013. It was followed by Payday 2.0 on 16 May 2014, and by Payday 3.0, which rolled out shortly thereafter in June 2014.

Whilst the first algorithm change was a general clean-up, payday loans algorithm 2.0 focused on targeting spammy queries, abused Google+ accounts, and doorway and hacked websites. Payday loans 3.0 was geared towards tackling spammy links, including low-quality links, reciprocal links, forums, blog networks and websites that require paid submissions in exchange for a link.

Soon after the rollout of Payday 3.0, the search results were essentially cleaned up, and have since given a much clearer picture of what rankings for payday loans should look like, showing legitimate companies.

Websites targeted by the algorithm changes were subsequently penalized in Google search, with some dropping 10 pages and others falling off the face of Google altogether. A handful of sites that had previously dominated the SERPs, including Tide U Over and Red Wallet, ceased to maintain any online real estate.

 

Bringing payday to today

The payday loans business took another drastic turn following the introduction of FCA regulation in January 2015. Whilst the industry remains lucrative, the number of active companies has diminished significantly in the last three years – from 200 lenders to around 40, and from what was originally hundreds of comparison sites to around a dozen. Margins have been hit by the introduction of a price cap, keeping daily interest at a maximum of 0.8%, and by tougher regulation on the selling of data – leading to much higher operating costs and barriers to entry.
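To put the 0.8% daily cap in concrete terms, here is a minimal sketch of the arithmetic; the £100, 30-day loan is an illustrative assumption, not a figure from the article:

```python
# Illustrative arithmetic for the FCA price cap: interest may not exceed
# 0.8% of the principal per day. The £100 loan over 30 days is a made-up
# example used only to show the calculation.
def max_interest(principal, days, daily_cap=0.008):
    """Maximum chargeable interest under a flat daily-rate cap."""
    return principal * daily_cap * days

# A £100 loan held for 30 days can accrue at most about £24 in interest.
print(round(max_interest(100, 30), 2))
```

Against the triple-digit effective rates common before 2015, a hard ceiling like this explains why margins, and the number of lenders, contracted so sharply.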

While there have not been any further releases of the payday loans algorithm, Google is still keeping an eye on the space and even implemented a ban on PPC ads for payday loans in 2016. The ban was far stricter in the US than in the UK, where lenders and comparison sites can still show paid ads but are required to show proof of their regulatory license to Google before going live.

 

How to successfully rank for payday loans in 2018

Fast forward to 2018 and there are 10 legitimate companies ranking in the top 10 for ‘payday loans’ in the organic search on Google.co.uk.

Our SEO company has successfully ranked five of the websites currently positioned in the top 10. Based on the success we have seen, we have identified some of the main trends below, which seem to be specific to the payday loans algorithm and differ from the techniques used to rank for other keywords in loans and insurance.

 

Direct lenders win over comparison websites: All websites positioned 1 to 10 are essentially providers of payday loans, known as ‘direct lenders’, and not comparison websites. While the main comparison sites in the UK dominate the search results for things like life insurance, car insurance and personal loans, none of these companies come near the top three pages for ‘payday loans’, despite all having a landing page targeting this keyword.

In positions 1 to 20, there is only one comparison website that features all the lenders, and we are responsible for their SEO. Notably, their homepage resembles a direct lender’s, with a calculator and an ‘apply now’ button rather than a comparison table format.

Brands win over exact match or partial match domains: No website listed in the top 10 has the word ‘payday’ in its domain, suggesting that Google prefers brands over exact match or partial match domains. Compare this to other industries, where logbookloans.co.uk ranks first for ‘logbook loans’ and two companies ranking on page one for ‘bridging loans’ include the main keyword in their domain name.

Keeping in line with the brand theme, sites that rank well tend to have quality traffic from several sources, including direct, paid, social and email, with high engagement rates, high average time on site and low bounce rates. This can be hugely beneficial for search rankings, but it is not a deciding factor on its own: companies such as Sunny and Lending Stream advertise heavily on TV and generate good direct traffic as a result, yet their comparatively lower search rankings show that strong direct traffic does not translate directly into better positions.

Domain age less relevant: Whilst domain age appears to be an important ranking factor in several industries such as car insurance, it seems less relevant for payday loans. Notably, three of the top five ranking sites (Cashfloat, Drafty and StepStone Credit) are less than two years old. This could be attributed to their having accumulated less spam and less of a low-quality link history than much older domains.

Links still win: Domains with more links tend to outrank those with fewer links. Interestingly, around seven of the top 10 appear to share similar linking domains, suggesting there are certain links that Google clearly values in this industry. However, finding the balance here is key, as some of these shared links have very low DA and a spammy link history. Understanding which links will work well is the difference between better search positions and a penalty.
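One way to spot those shared linking domains is to compare referring-domain exports from a backlink tool across the ranking sites. The sketch below uses entirely made-up site names and domains to illustrate the idea:

```python
from collections import Counter

# Rough sketch (hypothetical data): given referring-domain exports for each
# top-ranking site, surface the domains that link to several of them.
def shared_referring_domains(backlink_profiles, min_sites):
    """backlink_profiles maps site -> set of referring domains.
    Returns the domains that link to at least `min_sites` sites."""
    counts = Counter()
    for domains in backlink_profiles.values():
        counts.update(set(domains))
    return {domain for domain, n in counts.items() if n >= min_sites}

profiles = {  # illustrative referring domains, not real backlink data
    "lender-a.example": {"directory.example", "finance-blog.example"},
    "lender-b.example": {"directory.example", "news.example"},
    "lender-c.example": {"directory.example", "news.example"},
}
shared = shared_referring_domains(profiles, min_sites=2)
# directory.example and news.example both qualify; finance-blog.example does not
```

Each candidate domain from a list like this would still need a manual DA and spam check before pursuing, for exactly the reason given above.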

Strong user experience: A strong UX that makes it clear where to apply for a payday loan is proving more effective than thousands of words explaining what payday loans are. In line with user intent, successful websites are making use of calculators, images and videos to drive the application, rather than relying on thin content.

Room for alternatives: Two sites currently in the top five for payday loans offer alternatives (StepStone Credit and Drafty). This could reflect Google’s moral obligation to surface a variety of products, not just high-cost short-term loans, and raises the question of whether it is in fact manually curating these SERPs.

 

To conclude, the usual SEO techniques of brand building, link acquisition and good user experience still apply when it comes to ranking well against the modern payday loans algorithm. However, there is no doubt that payday loans in 2018 still requires a very specific approach, which can be achieved by looking at the sites that rank successfully and getting a feel for what content they write and what links they acquire.

In an ideal scenario, we would see MoneyAdviceService ranking top of the tree, since it has the most authority and numerous links from every single payday loans company in the UK. But as it has sat on page three for some time, this is proof that the beast of ranking for payday loans surely has a mind of its own.



source https://searchenginewatch.com/2018/05/22/a-review-of-the-payday-loans-algorithm-in-2018/