Wednesday, 31 January 2018

The must-have tools for paid search success

Paid search marketers look to technology to provide them with a competitive advantage.

AdWords is host to a range of increasingly sophisticated features, but there are also numerous third-party tools that add extra insight. Below, we review some of the essential tools to achieve PPC success.

The paid search industry is set to develop significantly through 2018, both in its array of options for advertisers and in its level of sophistication as a marketing channel. The pace of innovation is only accelerating, and technology is freeing search specialists to spend more time on strategy, rather than repetitive tasks.

Google continues to add new machine learning algorithms to AdWords that improve the efficacy of paid search efforts, which is undoubtedly a welcome development. This technology ultimately becomes something of an equalizer, however, given that everyone has access to these same tools.

It is at the intersection of people and technology that brands can thrive in PPC marketing. Better training and more enlightened strategy can help get the most out of Google’s AdWords and AdWords Editor, but there are further tools that can add a competitive edge.

Below are technologies that can save time, uncover insights, add scale to data analysis, or a combination of all three.

Keyword research tools

Identifying the right keywords to add to your paid search account is, of course, a fundamental component of a successful campaign.

Google will suggest a number of relevant queries within the Keyword Planner tool, but it does have some inherent limitations. The list of keywords provided within this tool is far from comprehensive and, given the potential rewards on offer, sophisticated marketers would be well advised to look for a third-party solution.

A recent post by Wil Reynolds at Seer Interactive brought to light just how important it is to build an extensive list of target keywords, as consumers are searching in multifaceted ways, across devices and territories. According to Ahrefs, 85% of all searches contain three or more words and although the shorter keywords tend to have higher search volumes, the long tail contains a huge amount of value too.

Add in growing trends like the adoption of voice search and the picture becomes more complex still. In essence, it is necessary to research beyond Google Keyword Planner to uncover these opportunities.

Keywordtool.io takes an initial keyword suggestion as its stimulus and uses this to come up with up to 750 suggested queries to target. This is achieved in part through the use of Google Autocomplete to pull in a range of related terms that customers typically search for. A Pro licence for this tool starts at $48 per month.

Ubersuggest is another long-standing keyword tool that search marketers use to find new, sometimes unexpected, opportunities to communicate with customers via search. It groups together suggested keywords based on their lexical similarity and they can be exported to Excel.

This tool also allows marketers to add in negative keywords to increase the relevance of their results.


We have written about the benefits of Google Trends for SEO, but the same logic applies to PPC. Google Trends can be a fantastic resource for paid search, as it allows marketers to identify peaks in demand. This insight can be used to target terms as their popularity rises, allowing brands to attract clicks for a lower cost.

Google Trends has been updated recently and includes a host of new features, so it is worth revisiting for marketers that may not have found it robust enough in its past iterations.

Answer the Public is another great tool for understanding longer, informational queries that relate to a brand’s products or services. It creates a visual representation of the most common questions related to a head term, such as ‘flights to paris’ in the example below:


As the role of paid search evolves into more of a full-funnel channel that covers informational queries as well as transactional terms, tools like this one will prove invaluable. The insights it reveals can be used to tailor ad copy, and the list of questions can be exported and uploaded to AdWords to see if there is a sizeable opportunity to target these questions directly.

For marketers that want to investigate linguistic trends within their keyword set, it’s a great idea to use an n-gram viewer; there are plenty of free and effective options available.
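
If you would rather roll your own, counting n-grams from a keyword export takes only a few lines of code. Here is a minimal JavaScript sketch for word-level bigrams; the keyword list is a stand-in for your own export:

    // Count word-level bigrams across a list of keywords.
    function countBigrams(keywords) {
      var counts = {};
      keywords.forEach(function (kw) {
        var words = kw.toLowerCase().split(/\s+/);
        for (var i = 0; i < words.length - 1; i++) {
          var bigram = words[i] + ' ' + words[i + 1];
          counts[bigram] = (counts[bigram] || 0) + 1;
        }
      });
      return counts;
    }

    // Example with stand-in keywords:
    // countBigrams(['cheap flights to paris', 'last minute flights to paris'])
    // => { 'cheap flights': 1, 'flights to': 2, 'to paris': 2, 'last minute': 1, ... }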

Competitor analysis tools

AdWords Auction Insights is an essential tool for competitor analysis, as it reveals the impression share for different sites across keyword sets, along with average positions and the rate of overlap between rival sites.

This should be viewed as the starting point for competitor analysis, however. There are other technologies that provide a wider range of metrics for this task, including Spyfu and SEMrush.

Spyfu’s AdWords History provides a very helpful view of competitor strategies over time. This reveals what their ad strategies have been, but also how frequently they are changed. As such, it is a helpful blend of qualitative and quantitative research that shows not just how brands are positioning their offering, but also how much they have been willing to pay to get it in front of their audience.

A basic licence for Spyfu starts at $33 per month.


SEMrush is a great tool for competitor analysis, both for paid search and its organic counterpart. This software shows the keywords that a domain ranks against for paid search and calculates the estimated traffic the site has received as a result.

The Product Listing Ads features are particularly useful, as they provide insight into a competitor’s best-performing ads and their core areas of focus for Google Shopping.

It is also easy to compare desktop data to mobile data through SEMrush, a feature that has become increasingly powerful as the shift towards mobile traffic continues.

A licence for SEMrush starts at $99.95 per month.


Used in tandem with AdWords Auction Insights, these tools create a fuller picture of competitor activities.

Landing page optimization tools

It is essential to optimize the full search experience, from ad copy and keyword targeting, right through to conversion. It is therefore the responsibility of PPC managers to ensure that the on-site experience matches up to the consumer’s expectations.

A variety of tools can help achieve this aim, requiring minimal changes to a page’s source code to run split tests on landing page content and layout. In fact, most of these require no coding skills and allow PPC marketers to make changes that affect only their channel’s customers. The main site experience remains untouched, but paid search visitors will see a tailored landing page based on their intent.

Unbounce has over 100 responsive templates, and its dynamic keyword insertion feature is incredibly useful. The latter adapts the content on a page based on the ad a user clicked, tying the landing page to the user’s expectations.


Brand monitoring tools

Branded keywords should be a consistent revenue driver for any company. Although there is no room to be complacent, even when people are already searching for your brand’s name, these queries tend to provide a sustainable and cost-effective source of PPC traffic.

Unless, of course, the competition tries to steal some of that traffic. Google does have some policies in place to protect brands, but these have proved insufficient to stop companies bidding on their rivals’ brand terms. When this does occur, it also drives up the cost-per-click for branded keywords.

BrandVerity provides some further protection for advertisers through automated alerts that are triggered when a competitor encroaches on their branded terms.

This coverage includes Shopping ads, mobile apps, and global search engines.


Custom AdWords scripts

Although not a specific tool, it is worth mentioning the additional benefits that custom scripts can bring to AdWords performance. These scripts provide extra functionality for everything from more flexible bidding schedules, to stock price-based bid adjustments and third-party data integrations.
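
To give a flavor of the format, below is a minimal sketch of an AdWords script that pauses keywords with a very low CTR despite plenty of impressions. The thresholds are illustrative assumptions, not recommendations:

    // Sketch: pause keywords whose CTR is under 0.5% after 1,000+ impressions
    // over the last 30 days. Thresholds are illustrative assumptions.
    function main() {
      var keywords = AdWordsApp.keywords()
          .withCondition('Ctr < 0.005')
          .withCondition('Impressions > 1000')
          .forDateRange('LAST_30_DAYS')
          .get();
      while (keywords.hasNext()) {
        var keyword = keywords.next();
        Logger.log('Pausing: ' + keyword.getText());
        keyword.pause();
      }
    }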

This fantastic list from Koozai is a comprehensive resource, as is this one from Free AdWords Scripts. PPC agency Brainlabs also provides a useful list of scripts on their website that is typically updated with a new addition every few months.


Using the tools listed above can add an extra dimension to PPC campaigns and lead to the essential competitive edge that drives growth. As the industry continues to evolve at a rapid rate, these tools should prove more valuable than ever.



source https://searchenginewatch.com/2018/01/31/the-must-have-tools-for-paid-search-success/

The future of visual search and what it means for SEO companies

The human brain has evolved to instantly recognize images.

Visual identification is a natural ability made possible through a wonder of nerves, neurons, and synapses. We can look at a picture, and in 13 milliseconds or less, know exactly what we’re seeing.

But creating technology that can understand images as quickly and effectively as the human mind is a huge undertaking.

Visual search therefore requires machine learning tools that can quickly process images, but these tools must also be able to identify specific objects within the image, then generate visually similar results.

Yet thanks to the vast resources at the disposal of companies like Google, visual search is finally becoming viable. How, then, will SEO evolve as visual search develops?

Here’s a more interesting question: how soon until SEO companies have to master visual search optimization?

Visual search isn’t likely to replace text-based search engines altogether. For now, visual search is most useful in the world of sales and retail. However, the future of visual search could still disrupt the SEO industry as we know it.

What is visual search?

If you have more than partial vision, you’re able to look across a room and identify objects as you see them. For instance, at your desk you can identify your monitor, your keyboard, your pens, and the sandwich you forgot to put in the fridge.

Your mind is able to identify these objects based on visual cues alone. Visual search does the same thing, but with a given image on a computer. However, it’s important to note that visual search is not the same as image search.

Image search is when a user inputs a word into a search engine and the search engine spits out related images. Even then, the search engine isn’t recognizing images, just the structured data associated with the image files.

Visual search uses an image as a query instead of text (reverse image search is a form of visual search). It identifies objects within the image and then searches for images related to those objects. For instance, based on an image of a desk, you’d be able to use visual search to shop for a desk identical or similar to the one in the image.

While this sounds incredible, the technology surrounding visual search is still limited at best. This is because machine learning must recreate the mind’s image processing before it can effectively produce a viable visual search application. It isn’t enough for the machine to identify an image. It must also be able to recognize a variety of colors, shapes, sizes, and patterns the way the human mind does.


However, it’s difficult to recreate image processing in a machine when we barely understand our own image processing system. It’s for this reason that visual search programming is progressing so slowly.

Visual search as it stands: Where we are

Today’s engineers have been using machine learning technology to jumpstart the neural networks of visual search engines for improved image processing. One of the most recent examples of these developments is Google Lens.

Google Lens is an app that allows your smartphone to work as a visual search engine. Announced at Google’s 2017 I/O conference, the app works by analyzing the pictures that you take and giving you information about that image.

For instance, by taking a photo of an Abbey Road album your phone can tell you more about the Beatles and when the album came out. By taking a photo of an ice cream shop your phone can tell you its name, deliver reviews, and tell you if your friends have been there.

Google Lens logo, which resembles a simplified camera with a red and yellow outline, blue lens and green flash.

All of this information stems from Google’s vast stores of data, algorithms, and knowledge graphs, which are then incorporated into the neural networks of the Lens product. However, the complexity of visual search involves more than just an understanding of the neural networks.

The mind’s image processing touches on more than just identification. It also draws conclusions that are incredibly complex. And it’s this complexity, known as the “black box problem”, that engineers struggle to recreate in visual search engines.

Rather than waiting for scientists to fully understand the human mind, DeepMind, a Google-owned company, has been taking steps toward building visual search on principles from cognitive psychology, rather than relying solely on neural networks.

However, Google isn’t the only company with developing visual search technology. Pinterest launched its own Lens product in March 2017 to provide features such as Shop the Look and Pincodes. Those using Pinterest can take a photo of a person or place through the app and then have the photo analyzed for clothing or homeware options for shopping. 


What makes Pinterest Lens and Google Lens different is that Pinterest offers more versatile options for users. Google is a search engine for users to gather information. Pinterest is a website and app for shopping, recipes, design ideas, and recreational searching.

Unlike Google, which has to operate on multiple fronts, Pinterest is able to focus solely on the development of its visual search engine. As a result, Pinterest could very well become the leading contender in visual search technology.

Nevertheless, other retailers are beginning to catch on and pick up the pace with their own technology. The fashion retailer ASOS also released a visual search tool on its website in August 2017.

The use of visual search in retail helps reduce what’s been called the Discovery Problem. The Discovery Problem is when shoppers have so many options to choose from on a retailer’s website that they simply stop shopping. Visual search reduces the number of choices and helps shoppers find what they want more effectively.

The future of visual search: Where we’ll go from here

It’s safe to assume that the future of visual search engines will be retail-dominated. For now, it’s easier to search for information with words.

Users don’t need to take a photo of an Abbey Road album to learn more about the Beatles when they can use just as many keystrokes to type ‘Abbey Road’ into a search engine. However, users do need to take a photo of a specific pair of sneakers to convey to a search engine exactly what they’re looking to buy.

Searching for a pair of red shoes using Pinterest Lens

As a result, visual search engines are convenient, but they’re not ultimately necessary for every industry to succeed. Services, for instance, may be more likely to rely on textual search engines, whereas sales may be more likely to rely on visual search engines.

That being said, with 69% of young consumers showing an interest in making purchases based on visual-oriented searches alone, the future of visual search engines is most likely to be a shopper’s paradise in the right retailer’s hands.

What visual search means for SEO

Search engines are already capable of indexing images and videos and ranking them accordingly. Video SEO and image SEO have been around for years, ever since video and image content became popular with websites like YouTube and Facebook.

Yet despite this surge in video and image content, SEO still meets the needs of those looking to rank higher on search engines. Factors such as creating SEO-friendly alt text, image sitemaps, SEO-friendly image titles, and original image content can put your website’s images a step above the competition.
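
As a quick illustration, a descriptive file name and alt attribute give search engines meaningful text to associate with an image. A minimal sketch, with a hypothetical file name and copy:

    <img src="/images/red-leather-office-chair.jpg"
         alt="Red leather office chair with adjustable armrests"
         title="Red leather office chair">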

However, the see-snap-buy behavior of visual search can make image SEO more of a challenge. This is because the user no longer has to type, but can instead take a photo of a product and then search for the product on a retailer’s website.

Currently, SEO functions alongside visual search via alt tags, image optimization, schema markup, and metadata. Schema markup and metadata are especially important for SEO in visual search, because with such minimal text in a visual query, this data may be one of the only sources of textual information for search engines to crawl.

Meticulously cataloging images with microdata may be tedious, but the enhanced description that microdata provides when paired with an optimized image should help that image rank higher in visual search.
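
As a rough sketch of what such markup can look like, here is a hypothetical schema.org snippet in JSON-LD pairing an image with basic product data (all names and URLs are made up for illustration):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Red leather office chair",
      "image": "https://www.example.com/images/red-leather-office-chair.jpg",
      "description": "Red leather office chair with adjustable armrests."
    }
    </script>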

Metadata is just as important. In both text-based searches and visual-based searches, metadata strengthens the marketer’s ability to drive online traffic to their website and products. Metadata hides in the HTML of both web pages and images, but it’s what search engines use to find relevant information.

Marking up your images with relevant metadata is essential for image SEO

For this reason, to optimize for image search, it’s essential to use metadata for your website’s images and not just the website itself.

Both microdata and metadata will continue to play an important role in the SEO industry even as visual search engines develop and revolutionize the online experience. However, additional existing SEO techniques will need to advance and improve to adapt to the future of visual search.

The future of SEO and visual search

To assume visual search engines are unlikely to change the future of the SEO industry is to be short-sighted. Yet it’s just as unlikely that text-based search will be made obsolete and replaced by a world of visual-based technology.

However, just because text-based search engines won’t be going anywhere doesn’t mean they won’t be made to share the spotlight. As visual search engines develop and improve, they’ll likely become just as popular and used as text-based engines. It’s for this reason that existing SEO techniques will need to be fine-tuned for the industry to remain up-to-date and relevant.

But how can SEO stay relevant as see-snap-buy behavior becomes not just something used on retail websites, but in most places online? As mentioned before, SEO companies can still utilize image-based SEO techniques to keep up with visual search engines.

Like text-based search engines, visual search relies on algorithms to match content for online users. The SEO industry can use this to its advantage and focus on structured data and optimization to make images easier to process for visual applications.

Additional techniques can help improve image indexing by visual search engines. Some of these techniques include:

  • Setting up image badges to run through structured data tests
  • Creating alternative attributes for images with target keywords
  • Submitting images to image sitemaps (see the example after this list)
  • Optimizing images for mobile use
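
To make the sitemap point concrete: Google supports a dedicated image namespace in XML sitemaps, so each page entry can list the images that appear on it. A minimal sketch with hypothetical URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://www.example.com/office-chairs</loc>
        <image:image>
          <image:loc>https://www.example.com/images/red-leather-office-chair.jpg</image:loc>
        </image:image>
      </url>
    </urlset>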

Visual search engines are bound to revolutionize the retail industry and the way we use technology. However, text-based search engines will continue to have an established place in industries that are better suited to them.

The future of SEO is undoubtedly set for rapid change. The only question is which existing strategies will be reinforced in the visual search revolution and which will be outdated.



source https://searchenginewatch.com/2018/01/31/the-future-of-visual-search-and-what-it-means-for-seo-companies/

Tuesday, 30 January 2018

How to find the perfect domain strategy for international SEO

As you look to expand the reach of your business to customers in different countries, your website setup and the content you have in place will need to change and evolve.

Before you even begin thinking about content localization and local keywords for each market, the technical setup of your website needs to be considered. The first step of this process is domain strategy.

What domain you use when targeting local markets can impact how your site performs. There are a number of options for your domain structure:

  • Country code top-level domains (ccTLDs)
  • Subfolders or subdirectories
  • Subdomains
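
In concrete terms (using www.yourdomain.com and Germany as hypothetical examples, consistent with the rest of this article), the three structures look like this:

    https://www.yourdomain.de/        (ccTLD)
    https://www.yourdomain.com/de/    (subfolder on a .com)
    https://de.yourdomain.com/        (subdomain on a .com)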

There are pros and cons for each of these. In this article, I’ll examine each of the different options, their benefits and drawbacks, and consider how you can find the best domain strategy for your individual situation.

Country code top-level domains (ccTLDs)

ccTLDs are domains that are specific to a single country: for example, .de for Germany or .fr for France.

Pros of ccTLDs

  • Automatically associated with the country they cover (.de to Germany)
  • Clear to visitors that this site is meant for them
  • Obvious in the search results the site is targeted to a specific country
  • In many countries, customers prefer a locally based website
  • In some markets, local ccTLDs perform better in the rankings.

Cons of ccTLDs

  • Increased costs of domain registration (if you are in 32 countries you need 32 ccTLDs)
  • Starting from scratch with no domain history or links when you launch into a new market
  • You can’t as easily set up language specific websites – so a German-language website on a .de domain will look like a German-focused website, not one which can also serve customers in German-speaking Switzerland, or Austria
  • Your website will have lots of external links on it if you have a language selection dropdown on all pages. This can lead to your backlink profile being dominated by links from your own sites – that means any amazing backlinks you’ve managed to create won’t be as powerful as if your own links weren’t present (a drop in the ocean, you might say)
  • SEO work on one site won’t benefit all sites, as they are all separate websites.

Subfolders or subdirectories

Subfolders (also known as subdirectories) for specific languages or countries can be added to any domain (www.yourdomain.com/de), but for this to work effectively, the site needs to be on a top-level domain such as a .com, and not a local ccTLD.

Pros of subfolders

  • SEO performed on one part of the domain will benefit all the country folders as it’s one site
  • There is also the added inheritance of the authority of your original website so you aren’t starting from scratch when you go into a new market
  • Links between countries are seen as internal links, not external ones, which helps your backlink profile as it will be made up predominantly of links from other people’s sites and not mainly from your own site
  • No extra domain hosting costs.

Cons of subfolders

  • In the search results, it’s not as obvious that the country subfolder is specifically for users in that country (/de/ could be a page about your German products rather than a page specifically aimed at German users)
  • No automatic association in search to the target country
  • Risk of internal cannibalization – different international landing pages wind up competing with each other in search results, and it can be difficult to get the right landing page to rank in the relevant country’s search
  • Be wary of automatic optimization settings in your CMS – the last thing you want is your beautifully translated website for the Italian market to have a default title tag and meta description on every page which is in English.

Subdomains

Subdomains add the country content to the beginning of the domain (de.yourdomain.com). Some CMS tools or proxies default to this behavior, so it’s been a popular technique for many international websites.

Again, this solution only works when the parent website is a .com domain.

Pros of subdomains

  • Default for some CMS tools
  • Has some connection to the current SEO authority of the main website, which can aid performance when launching in a new country

Cons

  • Links to subdomains from the language drop-down are seen as external links; however, the impact of this is smaller than when you have unique ccTLDs for each country
  • No automatic association in the search engines with the country you’re targeting
  • Users are less likely to associate your domain with their country, as the language specification is at the beginning of the domain
  • Again, risk of internal cannibalization: Google will typically only feature one subdomain from the same site in the SERPs, meaning that your subdomains wind up competing with one another for the same search terms.

So which domain strategy works best?

As the above shows, there are pros and cons to each of the available domain strategies, and no clear winner.

IP serving is not the solution

From an SEO point of view, we need to avoid IP serving (serving different content to the user depending on their IP address) wherever possible. Search engines need to be able to find and index all of your content, but their crawlers operate from IP ranges based in specific countries.

Googlebot, for example, crawls primarily from the US, meaning that it will be automatically redirected to your US content. This can cause issues with the indexation and visibility of your local websites in the search results.

Making informed decisions

The best way for your business to decide which domain strategy is right for your websites is to review a number of different elements. Here are some key ones to start off with:

Technology review

This is a good kick-off point; there’s no point in looking at all the options, doing your research and deciding on a domain strategy, only to find that your CMS doesn’t support the approach you’ve chosen.

There are a number of considerations here:

  • Are there limitations to the options supported by your CMS?
  • Are there extra costs associated with any of the domain strategies?
  • Does the CMS support cross-domain content publication and hreflang tags, no matter which domain strategy you choose? (See the sample hreflang markup below.)
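
For reference, hreflang annotations are link elements in the head of each page, pointing at every language/country variant, including the page itself. A minimal sketch for a .com domain using subfolders (URLs hypothetical):

    <link rel="alternate" hreflang="en" href="https://www.yourdomain.com/" />
    <link rel="alternate" hreflang="de" href="https://www.yourdomain.com/de/" />
    <link rel="alternate" hreflang="de-CH" href="https://www.yourdomain.com/de-ch/" />
    <link rel="alternate" hreflang="x-default" href="https://www.yourdomain.com/" />

The x-default entry tells search engines which version to show users who match none of the listed locales.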

Top level marketing strategy

Another one which is well worth checking before doing anything else. If your business has a logo which contains the domain, or a set of brand guidelines which involve talking about the company as YourBrand.com, then you may find that any recommendation to move to a ccTLD for specific markets might not be accepted.

Check in with the decision-makers before you begin rolling out research into domain strategy (and save yourself some time!)

Competitor research and ranking review

Look at the marketplace for the country you are interested in, and at the domain strategies used by the companies who are performing well in the search results. This should include search competitors and publishers on similar topics, not just your known, named competitors.

Budgetary considerations

Are you a small business with limited marketing budgets, but looking to expand into 19 markets? If so, a ccTLD approach could eat into your budgets.

You might find that there is no one-size-fits-all solution, and in some markets, it might be better to have a ccTLD whilst in all of the other countries you are focused on a .com domain. At this point, your own marketing needs to kick in.

If you are comfortable having multiple domain marketing strategies, then do so; if you aren’t, then consider putting all sites on the same strategy. Just remember, it’s unlikely that your international customers will care that one site is on a ccTLD and another is on a .com!

Final considerations: Language

One final thing to consider when choosing domains for an international audience is the words used in the domain.

Although your domain is often your company name or something incorporating it, one thing to consider for international audiences is whether this name, your domain, or the way words are combined within it, could look odd to audiences who speak a different language.

The worst-case scenario is that your domain looks like a swear word or insult in a different language. So, before you commit to a particular domain, check with local people living in that market that you won’t be accidentally calling their mother a hamster.



source https://searchenginewatch.com/2018/01/30/how-to-find-the-perfect-domain-strategy-for-international-seo/

Inside Google’s new Search Console: What’s new, what’s the same, and what’s still to come?

Earlier this month, Google rolled out the beta version of its new and improved Search Console to all verified users.

Google has been testing the new Search Console for some months now, with a select number of users given early access to the beta. We’ve had sneak peeks at the slick, clean interface, and heard about some of the notable additions, such as the much-vaunted 16 months of historical search data now available to SEOs.

The new Search Console is still in beta, and Google says that it will continue to port features from the old Search Console to the new over the course of the coming year. Webmasters and SEOs will be able to use both versions of Search Console side-by-side until the transition is complete.

So now that the new Search Console is finally here, what shiny new features does it boast, what is more or less the same, and what functionality are we still awaiting with bated breath? Let’s take a look.

What’s new

Search performance report

The most powerful new functionality in the revamped Search Console centers around the Search Analytics section, now known as Search Performance.

As with the old Search Analytics report, you can overlay total clicks, total impressions, average CTR and average position data on top of each other with a simple click. But where webmasters were previously forced to choose between filtering by search type, query, page, country, and device, with only one option selectable at a time, you can now filter by multiple variables at once.

So you can, for example, compare total impression data with average CTR for web searches for “search engine” from the United States over the past three months, if that’s something that takes your fancy.

You unfortunately can’t layer multiple comparisons on top of each other – so if you want to compare desktop and mobile data side-by-side, you can’t also compare data from the U.S. and the U.K. at the same time – but the new options still allow SEOs and webmasters to get highly specific with performance data for their website.

And, of course, website owners now have access to much wider date ranges for their historical search data, making it easier to analyze longer-term trends and perform year-over-year comparisons. Google notes that, “over the years, users have been consistent in asking us for more data in Search Analytics”; previously, website owners were limited to just three months.

Well, with the new Search Console, Google has exceeded all expectations, more than quadrupling the maximum date range that webmasters have access to. Now, you can choose between three-month, six-month and 12-month date ranges, or opt for the “Full duration”, which is a whopping 16 months.

Index coverage report

The Index Coverage section of Google’s new Search Console is a combination of the old Index Status and Crawl Errors reports. It allows site owners to see how well Google is indexing their website, as well as identify and fix errors where there are any.

You can view data by pages with errors, valid pages with warnings, valid pages that have been indexed, and excluded pages, and also overlay impression data on top. The table underneath then gives more detail as to the types of issues detected, allowing webmasters to click through and inspect the affected URLs.

Another fantastically useful feature that’s new with the revamped Search Console is the ability to request Google update its index after you’ve resolved an issue.

If you’ve gone in and fixed an HTTP 500 error, for example, rather than waiting for Google to recrawl your site and discover the fix, you can proactively request that Google update its index. According to Google’s Webmaster Central blog, it will “then crawl and reprocess the affected URLs with a higher priority, helping your site to get back on track faster than ever.”

Search enhancements: Accelerated Mobile Pages and Job Postings

Google’s updated AMP status report also allows website owners to validate newly-fixed AMP URLs. In the old version of Search Console, Google would provide a list of AMP URLs with errors and recommend a fix, but there wasn’t any way to request that Google reprocess the amended URLs.

Now, you can request that Search Console validate a fix across multiple pages, and Google will again process those with a higher priority.

Google’s blog post introducing the new Search Console grouped AMP under the heading of “Search Enhancements” together with another new report: job postings. Webmasters with job listings on their site can mark them up with Job Posting structured data to be eligible for Google Jobs – Google’s relatively new foray into the world of job listings that was announced at last year’s Google I/O.
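
For illustration, a minimal JobPosting snippet might look like the sketch below. All values are hypothetical, and Google’s documentation lists further required and recommended properties:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "JobPosting",
      "title": "Paid Search Manager",
      "description": "Manage and optimize our paid search campaigns.",
      "datePosted": "2018-01-30",
      "validThrough": "2018-03-30",
      "employmentType": "FULL_TIME",
      "hiringOrganization": { "@type": "Organization", "name": "Example Corp" },
      "jobLocation": {
        "@type": "Place",
        "address": {
          "@type": "PostalAddress",
          "addressLocality": "London",
          "addressCountry": "GB"
        }
      }
    }
    </script>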

As with AMP, the Job Posting report in Search Console will display stats around your job listing results and pinpoint any indexing issues, allowing you to fix and validate them.


What’s the same

Nothing in the revamped Search Console is exactly the same as the old version, but as I’ve mentioned, there are some rough equivalents.

The new Search Performance report features much of the same data as the old Search Analytics report, and the Index Coverage report includes data that appears in the Index Status and Crawl Errors sections of the old Search Console.

The sitemap submission process is also much the same in the new Search Console, though the handy “Test” button which allowed webmasters to check their sitemap before submission is missing in the new version.

The old Search Console allows webmasters to test their sitemap before submission

Sitemaps also work in conjunction with the Index Coverage report: when site owners submit a sitemap file, they can use the sitemap filter over the Index Coverage data to focus on an exact list of URLs.

What’s still to come

A lot of data from the old Search Console has yet to make its way over to the new version, so we can expect plenty of updates over the coming year. Some notable reports and features still missing from the new Search Console include:

Structured Data, Rich Cards, and Data Highlighter

Judging by Google’s continued emphasis on rich results and structured data markup, these reports are certain to come to Search Console, though maybe not in exactly the same form as before.

Given that Google has just begun introducing native support of some content types to Google Assistant, it’s possible that the new Search Console will feature additional functionality for integrating with Assistant, perhaps in the form of assessing whether your content is correctly optimized for inclusion in the new Actions Directory.

Google might also find a way to incorporate its new Rich Results Testing Tool directly within Search Console, helping webmasters and SEOs find and fix errors that prevent rich results from displaying.

Internal links and links to your site

One important piece of SEO functionality currently missing from the new Search Console is data on links: both internal links, and links leading back to your site.

In the old Search Console, these are useful reports allowing webmasters to see exactly who is linking to their domain and which pages are the most linked-to – important for monitoring the progress of link-building campaigns as well as backlinks in general.

Similarly, the Internal Links section allows you to assess and improve the level of internal linking within your own site. You can search for individual pages to see where they are linked to across your site, and reverse sort to find out which pages need more internal linking.

Hopefully this will soon be introduced to the new Search Console so that webmasters can benefit from new and improved link reports and data.

International targeting

This report allows webmasters to target an audience based on language and country – a crucial section for international SEO. Webmasters who operate in multiple geographies will be particularly keen to find out what this looks like when it appears in the new Search Console.

Mobile usability

Given Google’s increasing emphasis on a mobile-first approach to website-building, I’m confident that we can expect some souped-up features in the mobile usability report when it appears in the new Search Console.

The Search Console mobile usability report currently assesses how well your site is optimized for mobile usage, and highlights issues such as Flash usage, small font size, touch elements (e.g. buttons) placed too close together, and the use of interstitial pop-ups. With page speed confirmed to be an official ranking factor on mobile from July, I think we can near enough guarantee that speed will be one of the assessments included in the new mobile usability report (or whatever Google decides to call it) when it rolls out.

I think it’s reasonable to predict some sort of tie-in to the mobile-first index, as well. While it’s already possible to compare mobile and desktop search data in Search Performance, Google may well build some additional functionality into the mobile usability report which allows webmasters to detect and correct issues that prevent them from ranking well on mobile.

The current report already detects mobile usability issues on individual pages, so it wouldn’t even be much of a leap to apply that to the mobile-first index, giving website owners more tools to improve their site’s usability on mobile.

What are your thoughts on the revamped Search Console? Which reports are you most excited to see in the new version? Share your views in the comments!



source https://searchenginewatch.com/2018/01/30/inside-googles-new-search-console-whats-new-whats-the-same-and-whats-still-to-come/

Monday, 29 January 2018

A beginner’s guide to using negative keywords in PPC

Let’s set the scene. You’ve signed up to Google AdWords, entered your payment details, maybe even created a few ads and got to grips with the different types of matches for keywords.

You may even have gone ahead and sent your ads live. Easy enough. But you are fully aware that it doesn’t end there.

PPC can be an expensive hobby and you’re determined that your PPC campaign will become a valuable marketing channel rather than a resented, money-burning pastime.

In order to make the most of your PPC investment, you are going to have to use both common sense and data to constantly tailor your ads. You want to home in on specific buyer personas, which, as a byproduct (or whichever way round you want to view it), rids your campaign of wasted clicks.

You can do this by assessing quality score, A/B testing ad formats, revisiting your keywords and adding nice features such as callout extensions.

But as the title suggests, we’re here to talk about negative keywords. In this article we will walk through the basics of negative keywords to get you up and running. There are loads of more detailed PPC tips on Search Engine Watch, so if you’re after pro tips we suggest using the handy search bar!

What are negative keywords?

One of the steps in creating your adverts is to assign the types of search terms that you want your adverts to appear for. Hopefully you have been specific about your keywords, focusing on user intent and relevance.

As you would imagine, negative keywords are almost the complete opposite of your target keywords. They help you give guidelines to Google, dictating the types of search terms for which you do not want to appear.

When would you use negative keywords?

Google defines negative keywords as “a type of keyword that prevents your ad from being triggered by a certain word or phrase. Your ads aren’t shown to anyone who is searching for that phrase. This is also known as a negative match.”

A common example of a negative keyword is ‘cheap’ (Google uses ‘free’ as an example). Let’s say you make bespoke furniture or high-end watches; it makes sense that you would not want to pay for clicks from searchers looking for cheaper alternatives.

You also need to banish ambiguity. In her ultimate guide to AdWords keyword match types and negatives, Lisa Raehsler used the good example of ‘blueberry muffins’: the searcher could be looking for either recipes or bakeries – two very different intents.

In such a situation you would then add ‘recipes’ or ‘bakeries’, whichever suits you, to your negative keywords.

Where do I enter negative keywords?

You may already have noticed the negative keywords tab when you were busy adding keywords for either a campaign or ad group – the tab is right next to the ‘keywords’ tab!

You can either enter campaign-level negative keywords, which apply across your whole campaign, or define them for specific ad groups, depending on the complexity of your campaign. Simply select the ad group that you want to add your negative keywords to.

Note that, like keywords, you are able to define whether each negative keyword is exact, broad or phrase match. Amanda DiSilvestro explains more about these different types of keyword matches in her common PPC mistakes piece.
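
As a quick illustration (with hypothetical terms), the negative match type syntax mirrors that of regular keywords:

    cheap              negative broad match – blocks queries containing all of these words, in any order
    "cheap watches"    negative phrase match – blocks queries containing this exact phrase
    [cheap watches]    negative exact match – blocks only this exact query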

Finding negative keywords

If your campaign has already been running for a while, we would still not advise diving straight into your search terms tab. If you’ve ever read about the concept of ‘anchoring’ you would understand why – ever been asked to describe something without using a particular word, but all you can think about is that word? Same idea.

The data on search terms for which your website is appearing is not going anywhere, so why not take the time to use your own industry knowledge? Brainstorm the types of businesses, products or services that yours could be mistaken for and the search terms which would be used to describe them.

You are likely to uncover some negative keywords that haven’t been used by searchers yet – remember that if it shows up in your search terms, then you’ll have paid for it! After completing your brainstorming you can then use the search terms tab to identify further negative keywords.

SEO can play its part too. The worlds of SEO and Google AdWords can often come to blows, as teams compete for sought-after budgets and inevitably look to position their channel as the most effective.

We’re all on the same team, though, right? There is considerable overlap between the two, and PPC and SEO teams can actually work together, sharing data to benefit both campaigns.

If you are already collecting and analyzing data for your SEO campaign, it is advisable to dip into this data. It may well unearth potential negative keywords – terms your website appears for in organic search that have not yet found their way into your AdWords data.

Kill two birds with one stone by adding these negative keywords to your AdWords campaign or ad group, and reassess your SEO strategy to home in on that perfect buyer persona!

Keep checking in

If you don’t need to make adjustments to your campaign after setting it up, then I would suggest quitting your job and becoming a PPC guru!

Your campaign set-up may be top-notch, but things change: new data appears, different search terms develop and competitors change tactics. The knock-on effect is that you should keep checking in on your AdWords campaign (and negative keywords) regularly. If you don’t, you are either braver or sillier than I am (probably both).

Don’t waste your hard-earned cash by missing opportunities to maximize your investment; or, in the case of negative keywords, allow Google to charge you for clicks via search terms that are irrelevant to you and your business.



source https://searchenginewatch.com/2018/01/29/a-beginners-guide-to-using-negative-keywords-in-ppc/

Friday, 26 January 2018

How to get started with Facebook advertising: A step-by-step guide

Organic success on Facebook may have become harder for brands in the past few years – and will become all the more so with the upcoming change to its News Feed – but the same doesn’t necessarily apply to its paid advertising options.

Facebook has seriously invested in presenting a large number of options for marketers and business owners who want to promote their content to reach the right audience. It’s a highly effective tool for reaching out to audiences, as long as you’re willing to part with some funds.

So if you’ve been considering making the leap over to paid Facebook promotion, here’s how to get set up with your first Facebook Ad.

First of all, you need to visit the Ads Manager. It’s where you create new ads, get an overview of your current campaigns and measure their performance.

1. Choose your objective

Facebook wants to make your advertising experience as specific as possible. That’s why it asks you from the very first step to decide on your marketing objectives.

This way, you’re able to focus on tailored results for each objective and pay for what matters most to you.

The three main types of objectives are:

  • Awareness: Generate interest in your content or your product
  • Consideration: Make the audience interested in searching for more details about your business
  • Conversion: Get the audience to purchase your product or service

These relate to the stage your target audience has reached, and the results you expect from the ad.

Their subcategories include:

  • Awareness: Brand awareness and Reach
  • Consideration: Traffic, Engagement, App Installs, Video views, Lead generation, and Messages
  • Conversion: Conversions, Catalogue sales, and Store visits

Which options should you choose for your campaign? If you have a new business and you want to promote it on Facebook, for example, then you would be focusing on awareness and reach.

If you want to promote your business to potential local customers, seeking an increase in physical sales, then you are focusing on conversion and store visits.

What’s also useful is the fact that you can narrow your focus down to specific goals, like app installs or video views. This way you know the exact goals you’re aiming for and can start examining how to achieve them.

2. Select your audience

Once you decide on your objective, it’s time to select your audience. This is the step in which you narrow down Facebook’s two billion users and pick the ones that are more relevant to your content.

This is one of the most useful features of Facebook’s advertising, as you’re able to focus on:

  • Core audiences: manually select an audience based on your set criteria
  • Custom audiences: upload your contact lists to reach an existing audience
  • Lookalike audiences: find people similar to an existing target audience

The core audiences allow you to find a new audience based on demographics, locations, interests, and even behaviors. These could be people who are based in Florida and just had a baby, or students from Tokyo who tend to shop online.

By hyper-targeting your audiences in this way, you can give your ads the best chance of converting, all while thinking carefully about the personas you want to reach out to.

In addition to finding a new audience, you can use Facebook to engage an existing one.

You can use a feature called custom audiences to upload contact lists of existing customers, or even lapsed ones that you want to re-engage. This is an easy way to blend your offline activity with your online presence and develop an improved relationship with your audience.

Moreover, there is the option of finding lookalike audiences. These are people you haven’t engaged with in the past, but who meet the criteria of your ideal audience.

3. Decide where you want to run the ad

What’s useful with Facebook is that your ads aren’t restricted to Facebook itself, but can also display on other Facebook-owned properties like Instagram and Messenger, and within other mobile apps via the Facebook Audience Network.

You don’t have to pick all the placements for every ad, of course. It all depends on who you want to target.

For example, if you know that your target audience are frequent mobile users, then Instagram and Messenger might be two very useful placements.

4. Set your budget

This is the step in which you define the cost of your advertising campaign. The cost can be capped either by the overall amount you spend or by the cost of each result you get from the ads.

The success and the cost of your ad depend on the ad auctions and how well your ad matches your target audience and their interests. An auction takes place whenever a person is eligible to view your ad. If you are unsure how auction bids work, you can set them to automatic when creating your campaign.

If you are wondering how to make sure you’re not exceeding your budget, then you can set some limitations for your campaign. You can enter either a daily or a lifetime budget to define when your campaign should stop.

5. Pick a format

What makes Facebook Ads particularly effective is that you can pick the right format for every campaign. The variety of ad formats on offer can allow you to tailor your campaign to different objectives and target audiences.

The options include:

  • Photo: Use the power of images to tell your story
  • Video: Drive engagement with the right use of image, sound, and motion
  • Carousel: Add more than one image or video to a single ad
  • Slideshow: Create a series of lightweight, video-like ads without the cost or production time of real video
  • Collection: Showcase your products by telling a story in an easy and immersive way
  • Canvas: Offer a full-screen, fast-loading experience designed for mobile
  • Lead ads: Use this format to make lead generation easier
  • Dynamic ads: Find the ideal target audience for your products in a sophisticated, automated way
  • Link ads: Bring more people to your website

6. Place your order

Simply put, this is the step in which you can review your ad and confirm that it’s ready to be submitted.

7. Measure your ad’s performance

You can analyze the performance of your ads from the list of your available ads in Ads Manager. Once you find the specific one that you want to measure, click on ‘view charts’ to get further details.

This is where you can learn more about the ad’s performance: whether it met your objectives, as well as the demographics it reached and where it was placed.

What’s useful is that the metrics are relevant to your ad’s objective. For example, an ad aimed at generating awareness is not measured by the same metrics as an ad focusing on increasing app installs.

This way, you can ensure that your budget is well spent, track the most relevant metrics for your campaign, and tell exactly which ads worked well and which need to be improved.

Overview

With more than two billion monthly active users available to be targeted in highly specific ways, it’s no surprise that more brands are diving into Facebook’s advertising options and discovering the different ways they can benefit from them.

The upcoming News Feed algorithm change has only increased the importance of Facebook advertising, as it can help you recover your lost reach and engagement.

However, it’s important to keep in mind that Facebook’s popularity doesn’t guarantee your ad’s success. Its numerous advertising options can make it harder to pick the right one if you don’t know exactly what you want to achieve. Stay focused on your key objectives, and try not to get distracted by shiny bells and whistles.

Focus on what’s important to your company and your campaign goals, and use the right format for your ideal target audience. If you’re still new to Facebook advertising, you can start with a small budget to test the available options until you feel more comfortable rolling your ads out on a larger scale.



source https://searchenginewatch.com/2018/01/26/how-to-get-started-with-facebook-advertising-a-step-by-step-guide/

Thursday, 25 January 2018

How creating relevant experiences can boost your clicks on local search ads

We all know by now that mobile has had a tremendous impact on our lives as consumers and as marketers.

What we are still getting our collective heads around is what this change means for us as marketers.

Consumers have different expectations of the information they want when they search for “running shoes” at 9am from their desktop at work, versus “running shoes” at 6pm on their iPhone two miles away from a store. We as marketers need to consider these expectations and deliver uniquely for them.

I wanted to take a look at some of the data across various AdWords accounts and understand how search campaigns perform on desktop and mobile, and at different distances between the searcher and the physical store location.

The insights align with what you might expect, but probably don’t align with how you are managing your campaigns – yet.

How distance impacts CTR, CPC and click percentage in local search advertising

Let’s first start with click-through rate (CTR) by distance. This metric might be the biggest variance and potentially most obvious when you stop and think about it. It stands to reason that CTR would be higher the closer a consumer is to the physical location.

However, what I didn’t expect was how much higher and how much larger the variance is for mobile compared with desktop. Our data shows that within one mile of a store, mobile CTRs are 2.5 times higher than desktop CTRs. The implications of this are logical, but really indicate a desire to go in-store. Once you get outside the first mile, the CTRs drop to be just one percentage point higher than desktop.

Next, let’s take a look at cost per click (CPC) by device.

Here we see a very interesting trend that aligns with the concept behind quality score. We see that CPCs are their lowest for mobile within one mile of a store. After understanding that the CTRs were 2.5 times higher on mobile versus desktop, one can assume that the relevancy rate is helping to earn these lower CPCs.

The trend here runs in opposite directions by device: mobile CPCs rise with each step further from the location, while desktop CPCs decrease steadily with distance. I think the desktop reduction speaks to geo-targeting, which reduces competition, since fewer brands enter the auction.

Lastly, I thought that the trends surrounding percentage of clicks by device and distance were very interesting.

Even cumulatively, the amount of traffic Google gathers from within one mile of a physical location is still much smaller than the traffic from more than 15 miles away. So it makes sense that mobile still accounts for a larger percentage of clicks than desktop at close range, given the relevancy factor for those consumers as well as for the advertisers themselves.

Relevancy: The name of the game

Ultimately, that is what I think this game is all about – relevancy. Here are three tips that you can take away from these findings, and use to create more relevant marketing for your consumers.

Relevant experiences

We know as consumers ourselves that we expect relevant experiences. We expect the opening hours of the store to be correct, we expect landing pages on mobile to be mobile responsive, and so on.

As advertisers, given the tools that we have available including customer match (now available with phone number and address as well), and various extensions, we have a lot more opportunities to increase relevancy for consumers.

This data just validates those relevancy expectations. Now it is on us as marketers to ensure we take advantage of these tools to give customers what they want, when they want it, and how they want it.

Understand your customers’ interactions with your business

What does this data look like for your business? What are the specific insights for you? Should you be bidding higher for consumers closer to your location?

Should your landing page focus on calls to action bringing consumers in-store, if that search is during store hours and they are less than one mile from your location? What is your specific data saying?
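If your own data does show higher intent close to the store, one way to act on it is through location bid adjustments. The sketch below is purely illustrative – the distance bands and percentages are hypothetical placeholders, and the right values should come out of your own CTR and conversion data.

    # Hypothetical distance-banded bid adjustments. The bands and the
    # percentages are placeholders, not recommendations.
    BID_ADJUSTMENTS = [
        (1.0, 0.25),   # within 1 mile: bid up 25%
        (5.0, 0.10),   # 1-5 miles: bid up 10%
        (15.0, 0.00),  # 5-15 miles: no adjustment
    ]

    def bid_modifier(distance_miles):
        """Return a bid multiplier for a searcher at the given distance."""
        for radius, adjustment in BID_ADJUSTMENTS:
            if distance_miles <= radius:
                return 1.0 + adjustment
        return 0.90  # beyond 15 miles: bid down 10% (placeholder)

    print(bid_modifier(0.5))   # 1.25
    print(bid_modifier(20.0))  # 0.9

In practice, multipliers like these would be applied as bid adjustments on radius location targets, rather than computed at query time.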

What CRM data can be used to augment this data?

The more you know about your customer base, the more you can use that information to create a better experience and a more loyal customer. How are you using your CRM data to understand where specific consumers interact, target them or cross-sell?

There are many ways to slice this data to give your search program an advantage. The focus for many should be on better understanding how that data relates to your customers’ expectations, not your own.

For example, many paid search managers want a conversion to occur online, so the measurement and ROI story can be as strong as possible. However, the downside to that is it serves your own interests and potentially not the customer’s.

I think this data is a great indicator of how to tie consumer behavior to experience, and I firmly believe that the more we can do this as an industry, the better off we’ll be.



source https://searchenginewatch.com/2018/01/25/how-creating-relevant-experiences-can-boost-your-clicks-on-local-search-ads/

Who were the “winners” and “losers” of organic search in 2017?

Earlier this week, Searchmetrics published its fourth annual Winners and Losers Report, which reveals how certain sites fared in organic search visibility on Google.com during 2017.

Searchmetrics bases its analysis on a unique indicator known as ‘SEO visibility’, which it uses to measure a webpage’s performance in organic search.

This is not the same as organic search ranking, but aims to give an overview of how often a website shows up in search results, based on “search volume and the position of ranking keywords” (as explained in the Searchmetrics FAQ).

Using this metric, Searchmetrics calculates the change in websites’ SEO visibility over the course of the year, and sorts the top 100 winners and losers by absolute change in visibility.
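Searchmetrics doesn’t publish its exact formula, but a toy version of a visibility score – weighting each ranking keyword’s search volume by an assumed click-through rate for its position – illustrates the concept. The CTR figures below are illustrative assumptions, not Searchmetrics’ numbers.

    # Toy SEO visibility score - an illustration of "search volume
    # weighted by ranking position", not Searchmetrics' actual formula.
    # Assumed average organic CTR by position (illustrative figures only).
    CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

    def visibility_score(rankings):
        """rankings: iterable of (monthly_search_volume, position) pairs."""
        return sum(
            volume * CTR_BY_POSITION.get(position, 0.02)
            for volume, position in rankings
        )

    # A site ranking #1 for a 10,000-volume keyword and #4 for a
    # 5,000-volume keyword:
    print(visibility_score([(10000, 1), (5000, 4)]))  # 3350.0

Under a model like this, a site gains visibility either by ranking for more keywords or by ranking higher for the keywords it already has – which is why visibility can fall even when rankings hold steady, if a site simply appears for fewer queries.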

Last year, we examined the winners and losers in organic search during 2016, and concluded that social media and shopping were the overall “winners”, while online encyclopedias, reference websites and lyrics websites all lost out.

How do the results from this year stack up against last year, and what can we learn from the trends highlighted?

Encyclopedias and dictionaries are back on top

In a surprising reversal of 2016’s fortunes, online encyclopedias and dictionaries were among the biggest “winners” in 2017.

Encyclopedias made up 9% of the overall winners by industry, with websites like britannica.com, thesaurus.com and collinsdictionary.com enjoying triple-digit percentage gains in SEO visibility. Of the top five domains ranked by gain in absolute SEO visibility, four were dictionary or encyclopedia websites: Merriam Webster, Wikia, Dictionary.com and Wiktionary.

This is a huge change from last year, when social networking websites dominated the top five; out of last year’s top five “winners”, only YouTube is still on top, rising up the ranks from fourth to first place.

Searchmetrics attributes this miraculous change in fortune to an algorithm update in June 2017 dubbed the “dictionary update”. Dictionary websites had been slowly gaining in visibility since the beginning of the year, but over the three-week period between 25th June and 16th July, they saw an even more notable uptick:

Dictionary websites saw a boost from Google’s “Dictionary update” in June and July 2017

Searchmetrics noted that dictionary URLs particularly improved their ranking for short-tail keywords with ambiguous user intent – suggesting that Google might be examining whether the users searching these terms could be looking for definitions.

I would speculate that Google could also be promoting fact-based reference websites as part of its ongoing efforts to battle fake news and dubious search results – but this is purely speculation on my part.

The trend is not borne out by Wikipedia, however, which continues to see its SEO visibility drop as more Knowledge Graph integrations appear for its top keywords, allowing users to see key information from Wikipedia without needing to click through to the site – and possibly preventing those Wikipedia pages from ranking.

The losers lost out more on mobile

One very interesting trend highlighted in Searchmetrics’ findings is the fact that domains which lost out in 2017 saw even bigger drops on mobile than on desktop.

Domains which started out the year with roughly equal desktop and mobile visibility closed out the year with their mobile visibility far below that of desktop. For example, TV.com’s mobile visibility was 41% below its desktop visibility by the end of 2017, while perezhilton.com’s mobile visibility was 42% lower than desktop, and allmusic.com was 43% lower.

Without going behind the scenes at Google’s search index, it’s hard to know exactly what the cause could be. TV.com decidedly fails Google’s Mobile-Friendly Test, but perezhilton.com and allmusic.com both pass. Because Searchmetrics is measuring organic search visibility, these drops may not be due to a lower SERP ranking, but could be due to the websites not appearing for as many search queries on mobile.
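As an aside, you can run this check programmatically via Google’s Mobile-Friendly Test API rather than the web tool. Below is a minimal sketch, assuming you have an API key; the endpoint and response field are as Google documents them at the time of writing.

    import requests

    # Google's Mobile-Friendly Test API (Search Console URL Testing Tools).
    # "YOUR_API_KEY" is a placeholder for your own key.
    API_URL = ("https://searchconsole.googleapis.com/v1/"
               "urlTestingTools/mobileFriendlyTest:run")

    def check_mobile_friendly(url, api_key):
        response = requests.post(API_URL, params={"key": api_key},
                                 json={"url": url})
        response.raise_for_status()
        return response.json().get("mobileFriendliness", "UNKNOWN")

    print(check_mobile_friendly("http://www.tv.com", "YOUR_API_KEY"))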

What isn’t surprising is that in 2017, we began to see much bigger differences between the way search behaves on mobile and the way it behaves on desktop. Back in August, we looked at the results of a BrightEdge study which found that 79% of all keywords ranked differently in mobile search compared to desktop.

At the time, we speculated that this was due to tests on Google’s part to prepare for the upcoming mobile-first index. Just two months later, Google’s Gary Illyes announced at SMX East that the mobile-first index had in fact already begun rolling out, albeit very slowly.

2017 was the year that mobile search on Google truly started to diverge from desktop, and in 2018 we’ve already had confirmation of a major change coming to Google’s mobile algorithm in July, after which page speed will officially become a ranking factor on mobile. So it seems a very safe prediction that mobile and desktop search results will continue to diverge through 2018.

So long, social media?

Possibly the most curious change in fortune between 2016 and 2017 was seen with social media websites, which were among the biggest winners in 2016 and some of the biggest losers in 2017.

Visual social network Pinterest went from being the second-biggest ‘winner’ in terms of absolute search visibility in 2016 to suffering a 23% visibility loss in 2017. Similarly, discussion forum Reddit saw a 54% drop in visibility in 2017 after having been the 8th biggest ‘winner’ in 2016.

Tumblr and Myspace also experienced significant losses, and while Facebook and Twitter (#3 and #6 in 2016, respectively) weren’t among the “losers” highlighted by Searchmetrics in 2017, they also appeared nowhere in the list of “winners”.

It’s hard to say exactly why this would be. In last year’s study, Searchmetrics attributed Pinterest’s huge gains in visibility to its “application of deep-learning techniques” to understand user intent, “thereby generating more loyalty and stickiness online”. Whether Pinterest has slowed its progress on this front, or whether other shifts in Google’s index have caused its visibility to suffer, is unknown.

Reddit, meanwhile, appears to have suffered at the hands of Google’s “Phantom V” update, with visibility dropping off sharply at the beginning of 2017. Its mobile visibility was particularly low going into 2017, which Searchmetrics tentatively attributes to technical issues with the mobile version of its website.

Reddit’s visibility drops off as Phantom V hits in February 2017

It could be that the losses in visibility suffered by social media websites in 2017 are due to differing circumstances and not part of a wider trend, but it’s an interesting coincidence nonetheless.

What can we learn from the “winners” and “losers” of 2017?

Many of the changes of fortune experienced by websites in 2017 were the result of a specific Google update. Phantom V was spotted in the SERPs in mid-February, sending a number of brands’ domains yo-yoing up and down. Google Fred hit not long afterwards, affecting ad-heavy websites with low-quality content and poor link profiles.

Another key change of note is the User Localization Update of October 2017, in which Google started showing search results based on users’ physical location regardless of the Top-Level Domain (.com, .co.uk, .fr) they might be using to search – a big development for local SEO.

Individual updates aside, however, there are a few key points that we can take away from 2017’s Winners and Losers Report:

  • High-quality content continues to be king, along with content that perfectly serves the user intent.
  • Brands continue to do well targeting a specific content niche – as exemplified by About.com, the old content network from the late 90s. It recently relaunched as “Dotdash”, an umbrella brand spanning six different niche verticals – several of which are already making great headway in search.

About.com is reborn as five (now six) different niche websites, which quickly begin to climb in search

  • If you’re targeting short-tail keywords with ambiguous user intent (like “beauty”), be aware that your consumers might now be seeing reference websites appear much higher up in the search results than before – so you may have better chances of ranking for longer-tail, more specific keywords.


source https://searchenginewatch.com/2018/01/25/who-were-the-winners-and-losers-of-organic-search-in-2017/