Search Engine Optimisation – Net Age

SEO For B2B Market Websites


Marketing to businesses on the internet has been something of an uphill battle since its inception. The main issue is the level of complexity involved in these sales: a consumer is far more likely to take a chance buying a DVD online than a business is to risk purchasing a R1 million supply of building materials.

The problems are compounded by factors like the long B2B buying process, which can stretch into years depending on the type and volume of product being purchased. Businesses are usually far more likely to use the internet to gather information and then engage more personally with the companies they’re interested in doing business with.

This doesn’t really relieve the pressure, though. Brand strength is still very important, your content needs to be credible, and prospective buyers must be able to find the information they need to move further into the buying process.

But you need to be found first. So, if you’re a B2B market company making use of expert SEO services, here are some factors you might want to consider.

B2C vs. B2B Optimisation

Ensure that your SEO strategy is tailored toward business markets and isn’t just a broad SEO band-aid designed to channel useless traffic your way.

Keywords are one of the most important factors. A good keyword map should give you an idea of what kind of customer your SEO strategy is targeting. To a great extent, good keyword choice is governed by established search behaviour and trends, but branded keywords and specific associated terms will always help people who are already looking for you to find you.

Some search terms are more important than others in B2B markets; a rough way of bucketing them is sketched after the list.

* General Industry Keywords – Straightforward popular keywords for your industry.
* Buying Terms – These indicate that searchers are ready to purchase and are just looking for the right conditions, e.g. including buy, pricing, agents or local dealers in their search term.
* Evaluation Terms – These indicate an interest in the product or in a specific brand, e.g. evaluation, comparison, review.
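As a first pass, this bucketing can be automated over a raw list of queries from your analytics or keyword tools. A minimal sketch in Python; the trigger words and example queries are illustrative only and should be tuned for your industry:

```python
# Rough bucketing of search terms by purchase intent.
# Trigger words are illustrative, not exhaustive.
BUYING_TERMS = {"buy", "pricing", "price", "agents", "dealers", "suppliers"}
EVALUATION_TERMS = {"evaluation", "comparison", "compare", "review", "vs"}

def classify_query(query: str) -> str:
    words = set(query.lower().split())
    if words & BUYING_TERMS:
        return "buying"
    if words & EVALUATION_TERMS:
        return "evaluation"
    return "general industry"

queries = [
    "building materials bulk pricing",
    "cement brand comparison",
    "roofing sheets johannesburg",
]
for q in queries:
    print(f"{q}: {classify_query(q)}")
```

Terms that land in the buying bucket deserve your most commercially focused landing pages, while evaluation terms suit comparison and review content.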

SEO Methodology

Good SEO is sustainable and ethical. A company can promise you a meteoric rise in the rankings, and it might even come true, until the next algorithm run discovers that a bunch of spamming methods have been used and penalises the website, or even removes it from the index.

Link building is the practice you should keep the closest eye on. All of the links on your website should add value, look natural and point to reputable websites. In SEO terms, careless linking hurts far more than it could ever help, so links should be aimed at encouraging web traffic rather than building up link juice.

Measuring The Effect

The metrics used to judge a successful SEO strategy in B2B markets are more complex than just site traffic, rankings and incremental sales. As the buying process is so long, it can be hard to measure the results directly in the short term, so some other metrics for success are listed below (with a sketch for pulling a few of them from an analytics export after the list):

* What was the level of customer engagement?
* What types of keywords were most used to find your site?
* How often was media downloaded? For example, brochures, pricing sheets, trial software or embedded videos.
* How much time was spent on your site?
* What percentage of inbound traffic was from paid links vs. organic?
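If your analytics tool can export sessions to CSV, a few of these roll up in a couple of lines. A sketch using pandas; the file name and column names are hypothetical, so adjust them to whatever your export actually contains:

```python
import pandas as pd

# Hypothetical export: one row per session, with the traffic channel,
# session duration in seconds, and number of file downloads.
sessions = pd.read_csv("sessions.csv")  # columns: channel, duration_sec, downloads

print("Average time on site (sec):", sessions["duration_sec"].mean())
print("Downloads per session:", sessions["downloads"].mean())
# Share of inbound traffic by channel, e.g. paid vs. organic.
print(sessions["channel"].value_counts(normalize=True))
```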



SEO Best Practices


Since the release of Google’s Panda and Penguin algorithms, a lot of SEO practitioners have been left wondering how effective the traditional optimisation practices that formed the foundation of SEO still are.

But many misunderstood the purpose of these algorithms, which was to make web crawlers more attuned to human interests and to cut down on spamming practices, not to make SEO redundant.

Keyword Rich Metadata and Content

Keywords are one of the best ways to get web crawlers to notice you. Including keywords in meta elements such as the title, description and alt text tags gives indexers, and people viewing the results, a good idea of what your website is about.

Keep it above board, though. Don’t include too many, as this will be considered keyword stuffing, and make sure that whatever you include in the metadata is strongly connected to the content of the site. If your website content is just keyword-rich spam with misleading meta elements, not only will Google eventually penalise it, but visitors to your site will feel tricked.
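Auditing these tags is easy to script. A minimal sketch using requests and BeautifulSoup; the URL is a placeholder, and the length limits are rough conventions rather than official Google numbers (results are actually truncated by pixel width):

```python
import requests
from bs4 import BeautifulSoup

TITLE_MAX, DESCRIPTION_MAX = 60, 160  # rough character guidelines

html = requests.get("https://www.example.com/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
meta = soup.find("meta", attrs={"name": "description"})
description = meta["content"].strip() if meta and meta.has_attr("content") else ""

print(f"Title ({len(title)} chars): {title}")
print(f"Description ({len(description)} chars): {description}")
if len(title) > TITLE_MAX:
    print("Warning: title may be truncated in results")
if len(description) > DESCRIPTION_MAX:
    print("Warning: description may be truncated in results")

# Images without alt text are invisible to indexers.
for img in soup.find_all("img"):
    if not img.get("alt"):
        print("Image missing alt text:", img.get("src"))
```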

Value-adding Crawlable Content

Make sure that the content on your website can be indexed by web spiders. Content such as images, video and Flash animation can’t be indexed properly, so write effective descriptions for it in the alt tag.

At the end of the day, though, you’re trying to entertain, assist or inform human visitors, not just appease web crawlers. Some tips for making your content appeal to users:

• Vary the medium of the content: images, videos, infographics etc.
• Never put keywords in a place that seems unnatural.
• Don’t use too much text, particularly not in large uninterrupted blocks.

Website Architecture

If spider bots find the structure of your website confusing or struggle to reach certain content, your rankings may suffer, or the content simply won’t be indexed at all. Some issues to address for crawlability on your site (a quick automated check is sketched after the list):

• Canonical URLs
• Robots Exclusion Protocols
• Your site map
• The load speeds of pages
• A custom 404 page
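A few of these can be spot-checked in seconds with a short script. A sketch using the requests library; the domain is a placeholder, and this is a quick check rather than a full crawl audit:

```python
import requests

base = "https://www.example.com"  # placeholder domain

# robots.txt and the sitemap should both be reachable.
print("robots.txt:", requests.get(f"{base}/robots.txt", timeout=10).status_code)
print("sitemap.xml:", requests.get(f"{base}/sitemap.xml", timeout=10).status_code)

# A very rough proxy for page load speed.
home = requests.get(base, timeout=10)
print("homepage response time:", round(home.elapsed.total_seconds(), 2), "seconds")

# A page that shouldn't exist must return a real 404 status,
# not a 200 with an error message (a "soft 404").
missing = requests.get(f"{base}/this-page-should-not-exist", timeout=10)
print("missing page status:", missing.status_code)
```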

Logical Link Building

We’ve discussed the best practices of link building at length. Most people know that PageRank, Google’s link weighting algorithm, is less important these days, and that spammy and paid links are being heavily penalised by search engines.

The point isn’t that you should stop link building, just that your links should be valuable. Only link to reputable sites that add value to your content. Make links the most human-experience focused part of your site, because that’s where the most value lies.

Social Media

Use social media campaigns to engage with existing and potential customers on a personal and informative level rather than just for sales promotion. Once you start one though, you can’t just stop. You must stay engaged or risk losing face. Social media campaigns can help:

• Build your brand image
• Increase social signals
• Gain a larger web presence
• Respond to customers in real time

Black-Hat SEO

The overall message of the above points: always keep the human experience in mind when designing and maintaining your website. Every algorithm update brings search engines closer to ranking websites the way a human would.

So if you’re trying to cheat web crawlers with your SEO, you’ll always be under threat. Rather focus your efforts on making a quality website that also happens to appeal to web crawlers. That way you’ll be rewarded on both fronts.


Using AdWords Data For SEO


Anyone practising SEO these days will know that valuable data is in increasingly short supply. As Google applies more privacy measures to how it shares the data it collects from users, and limits the depth of the data in its other tools, the oasis of demographic and search behaviour data has dried up considerably.

Google AdWords has long been a handy free tool for SEO practitioners. Since it provides data on search volumes and browsing behaviour, as applied to PPC campaigns on Google’s network, it can be a valuable source of insight.

But to what extent can you use data from paid search and apply it to organic optimisation? Obviously they don’t translate in every way. Below we explore how some of your campaign data on AdWords can be reliably applied to your SEO efforts.

Segments

This option allows you to divide the data by how closely searches matched the keyword: broad, phrase or exact. If terms are often phrase matched, the keywords you’re optimising around are actually part of longer-tail queries that earn click-throughs for your site, and that can change your content strategy.

You can also segment based on the device used. If your traffic is mainly mobile, it’s worth making sure your landing pages serve that market and platform well.
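If you export the keyword report to CSV, both segmentations are a one-liner each. A sketch with pandas; the file name and column names are assumptions, so map them to your actual export:

```python
import pandas as pd

# Hypothetical AdWords keyword report export.
# Expected columns: keyword, match_type, device, impressions, clicks, conversions
report = pd.read_csv("keyword_report.csv")

# Segment by match type: lots of phrase/broad volume suggests your
# keywords live inside longer-tail queries.
by_match = report.groupby("match_type")[["impressions", "clicks"]].sum()
by_match["ctr"] = by_match["clicks"] / by_match["impressions"]
print(by_match)

# Segment by device to see where your landing pages actually perform.
print(report.groupby("device")["conversions"].sum())
```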

Filters

Filters are effective for sorting through a lot of seemingly popular keywords in your campaign and getting a quick overview of their actual success. Some filtering criteria that can indicate keyword importance are listed below, followed by a filtering sketch:

• Clickthrough rates – Comparing data based on the ad copy and the keywords can tell you whether your keywords are making your site show up in the right context.
• Conversions – There are numerous metrics relating to conversions. The value and cost of each conversion on a keyword is very important; you want to target the ones which generate the most revenue. The number of conversions occurring per click is also useful, as it tells you how effective your landing pages are.
• Average CPC – Keywords with a high CPC but low clicks are probably driving high-value business with each click. Optimising around them organically could drive similar profit.
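Expressed against the same hypothetical CSV export as above, those filters might look like this; all thresholds are illustrative, so pick your own cut-offs:

```python
import pandas as pd

report = pd.read_csv("keyword_report.csv")  # hypothetical export, as above
report["ctr"] = report["clicks"] / report["impressions"]

# Keywords with enough volume, a healthy CTR and real conversions
# are strong candidates for organic optimisation.
promising = report[
    (report["impressions"] > 100)
    & (report["ctr"] > 0.02)
    & (report["conversions"] > 0)
]
print(promising.sort_values("conversions", ascending=False).head(10))

# High average CPC but few clicks can signal high-value niches
# worth targeting organically instead.
expensive = report[(report["avg_cpc"] > 20) & (report["clicks"] < 10)]
print(expensive[["keyword", "avg_cpc", "clicks"]])
```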

Impressions

If your keywords are generating a high number of impressions (times people see your advert) but a low number of clicks, your advert is probably coming up for the wrong kind of search intent. Ad copy may also be the problem, but that’s unlikely if the ratio is heavily skewed.

Placements

This is where your display adverts appear. If you select automatic placement Google will place these on their Display Network based on the keywords you supplied them with. The Placement Tool can also be used to return sites which thematically match your input. This can help inform the direction of your link-building strategy.

Interests & Remarketing

This section can give you valuable data on a number of incredibly specific variables such as non-converted trial users or users who abandon their shopping carts. It also offers data on user interests, based on their browsing behaviour.

This data can’t be used directly, but can be combined with marketing functions and business intelligence to create content that’s more targeted towards your market’s interests.

Keyword Tool

Probably the internet’s most popular keyword research tool, and one most SEOs know well. It can tell you how often a keyword is used in queries, both internationally and locally, and you can get keyword data for the whole web, or for your site specifically.

Opportunities Tab

The information on this tab is somewhat open to interpretation but, used effectively, can really bolster your SEO efforts. ‘Analyse competition’ is useful as it takes your selected keywords and breaks down their relative chance of success compared to competitors in your range.

It can give you a good indication of whether your keyword choices follow best practice or need to be abandoned.

In Conclusion

All of the above should be used cautiously. Remember that you’re using activity on Google’s commercial advertising network to infer knowledge about organic search behaviour, so the relevance of the data does get a bit distorted. Logic and sound business judgement are the only things that can really guide you.


Google Panda 4.0 Rolls Out


If you’ve been following along with the on-going state of the SEO industry, you’ll know that recent times have been fraught with changes. On that score at least, things have stayed the same, in that they continue to change, and Google’s latest evolution has been the roll-out of the latest update to the Panda algorithm, 4.0.

Obviously the first thing that springs to mind is that this must be a fairly large update. Ever since Google officially stopped notifying people about changes to the algorithm, saying that they would be following a program of rolling refreshes (often on a daily basis) rather than doing big publicised updates, the algorithm news front has been quiet.

This break from the new practice of not saying anything suggests a major change instead of just a refresh.

What Is Panda?

The Panda algorithm, originally rolled out in 2011 and now in roughly its 26th iteration, was designed to prevent sites with low quality content from being ranked in search engine results. At the time, Panda and its subsequent updates wreaked havoc in the search engine results pages, with previously huge, high traffic sites experiencing massive drops as a result of poor quality, scraped, and keyword stuffed articles.

Google Panda 4.0, the latest version of the content algorithm, is supposed to be a “softer” version, but according to the professionals, it might also just be the start of future changes focused on the same goals…promoting high quality, unique and relevant content above bad quality, copied or derivative content.

How Bad Is It?

According to Google, this update will affect around 7.5% of all English queries to a significant degree.  In practice though, that doesn’t really mean much.  We can assume that this change involves actual changes to how Google identifies sites and their content.

As before, the real impact is difficult, if not impossible, to determine. It’s only in retrospect, and usually as a result of noticeable traffic drops, that most sites realise that Google might have had a problem with their site.

What Do You Do?

After years of chasing around after updates, and joining in the Google dance, I think I’ve largely given up worrying about Google updates.  At the end of the day, if you maintain high on-site standards for SEO, with good solid page titles, internal links, sensible URLs, and above all, great quality, unique and relevant content, you should (in theory) be fine in the long run.

Focus on providing your users with a valuable and informative experience, and by definition, you should (again in theory) be pleasing Google as well.

In practice, it’s not always quite that simple. Sometimes arbitrary applications of “rules” or mistaken interpretations can impact sites that really don’t have anything wrong with them. Sometimes, things that used to be acceptable practice change overnight, and something you used to do suddenly harms you.

With SEO, there are no guarantees.  But if you write, design and act with your users in mind, the odds are on your side.  Sometimes you’ll find people doing things that should cost them and getting away with it, but don’t let it tempt you.  Google aren’t stupid.  Their wheels might grind slow, but they grind exceedingly fine too. And sooner or later those sites will find that out.

 


Google Structured Snippets


It was announced yesterday in a post on the Google Research Blog that Google has rolled out a new feature that can show tabular data from your page under the description in the search engine results.

That means that if you have a table of product specs, for example, on your page, relevant information from that table may show up in the results. And unusually for Google, this appears to have rolled out everywhere at once.

We used the example from the blog post, but searched on google.co.za instead of google.com, and got the same result, as shown below, in the first organic result for the search query “nikon d710.”

Google Structured Snippets

Tables Are Back – Kind Of

Structured snippets are a result of Google’s work on understanding data presented in tables. Search engines’ former inability to make sense of tables was part of the reason for the move away from table-based design; that same table data has now become a way of showing even more information from your page, directly in the results.

This collaboration between the WebTables team and the Web Search team will apparently identify data tables and display up to four facts from them as part of the results snippet in both mobile and desktop search.

Below you can see the table on the page that this data has been pulled from.  What’s really interesting is that the table contains data for two different types of camera, but Google has correctly identified the facts applicable to the camera in the query, and displayed only those in the structured snippet.

Tabular Data For Structured Snippets

That makes it pretty clear that they really have “understood” the table.
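Google hasn’t published how its table understanding works, but a quick sanity check is to confirm that a generic parser can recover your spec table as clean rows and columns. A sketch using pandas; the table content is made up for illustration:

```python
from io import StringIO

import pandas as pd

# If a generic parser can recover the table, a crawler has a fair
# chance of doing the same. Requires lxml or html5lib installed.
html = """
<table>
  <tr><th>Spec</th><th>Camera A</th><th>Camera B</th></tr>
  <tr><td>Sensor</td><td>24 MP</td><td>16 MP</td></tr>
  <tr><td>Weight</td><td>765 g</td><td>480 g</td></tr>
</table>
"""
tables = pd.read_html(StringIO(html))
print(tables[0])
```

If the printed frame has your headers as columns and one fact per cell, the table is at least structurally unambiguous; deeply nested layout tables usually come out garbled.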

The Impact Of Structured Snippets

On the one hand, the more information Google can serve quickly, the better the user experience is likely to be. There may be the same sort of concerns here that the Knowledge Graph or Answer Box caused, namely reduced traffic; but on the other hand, if you can deliver that information via the search engine results pages, the chances that users will assume you have other relevant information are quite high.

Hopefully we’ll soon see some studies about click through rates on structured snippets.


Google Panda Refresh In Progress


After months of relative quiet, Google announced last week that they had begun rolling out the newest update to the infamous Panda algorithm, designed to reduce search engine rankings for low quality sites.

This update brings Panda (first released in 2011) to version 4.2, and is the 29th official update to the algorithm.  According to Google, when the update is finished rolling out, 2%-3% of English language queries will be affected.

This update is also one of the slowest to be pushed out, and Google has said that it will be several months before it is fully in place.

Panda Recovery

If your site was penalised by Panda in the previous update, this refresh offers the opportunity to recover, provided of course that you made the necessary changes to improve your content and site quality. Panda specifically targets sites with “thin” content that Google feels does not provide sufficient information to meet searchers’ needs, but given how slowly this is rolling out, any recovery you’re due might not show up for some time yet.

In the past, the effects of Panda updates have been apparent relatively quickly, but this time rankings will improve (or decrease) gradually.


Optimise PDF Files For SEO


If you upload PDFs to your site for your visitors to read or download, it’s a good idea to keep in mind that in many cases, Google can read a PDF and even display its contents in search engine results. As far as Google is concerned, a PDF is just another type of web page, and that means that you can optimise it for search.

It’s a frequently overlooked part of many websites, and yet it’s very easy to make sure you have professional looking, optimised PDFs for both Google, and your users.

Optimising Your PDF Files

Below you’ll find a few tips for optimising PDF files.

PDF Titles

Your PDF title provides the blue “link” text that shows up in the search engine results.  Make sure you title your PDF appropriately (preferably with a relevant keyword).  Check your PDF Title in the Document Properties when you create it, and make sure it’s one that tells Google and your visitors what the document is about.

PDF Size

Originally, Google ignored PDFs that were bigger than 2 MB. These days, they’ll index and display PDFs of up to about 10 MB as results. If your PDF is bigger than that, it might not be indexed or shown in the search engine results pages.

Document Layout and Structure

Using tags and structuring your document’s reading order will not only make things easier for your readers, but will improve Google’s ability to index, and understand, the contents of your PDF.

Document Description

Assuming you’re using something like Acrobat to create your PDFs, there are several options under the document properties that can improve the optimisation. These include the Title attribute, the Subject attribute, and even a field for Keywords related to the document. Ensuring these fields contain relevant information will improve your PDF’s optimisation; a scripted way of setting them is sketched below.
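If you’d rather script this than click through Acrobat, the same metadata can be set programmatically. A minimal sketch using the pypdf library; the file names and metadata values are placeholders:

```python
import os
from pypdf import PdfReader, PdfWriter

SOURCE, TARGET = "brochure.pdf", "brochure-optimised.pdf"  # placeholder names

reader = PdfReader(SOURCE)
writer = PdfWriter()
for page in reader.pages:
    writer.add_page(page)

# Illustrative values; use a title and keywords that actually
# describe the document.
writer.add_metadata({
    "/Title": "Widget Pricing and Specifications",
    "/Subject": "Product brochure",
    "/Keywords": "widgets, pricing, specifications",
})

with open(TARGET, "wb") as f:
    writer.write(f)

# Per the size note above, PDFs much over ~10 MB may not be indexed.
size_mb = os.path.getsize(TARGET) / (1024 * 1024)
print(f"{TARGET}: {size_mb:.1f} MB")
```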

In Conclusion

As far as Google is concerned, a PDF is just another web page, and that means it’s an opportunity for you to rank for some of the information you provide to website visitors via PDF. Even if you’re not using professional PDF software, at the very least make sure that the title of your PDF is relevant and contains keywords related to its content.

Just using a meaningful title and file name for your PDF already makes a big difference, and will put you ahead of half the PDFs available through Google.

 

 


7 Pack Becomes 3 Pack For Local Results


As of the beginning of this month, Google began implementing a change to the way that the local results are displayed in search.  Where we’ve been used to search engine results displaying 7 local listings (with the map on the side), Google has now replaced this with only 3 local listings displayed in-line with the map, in a layout that is effectively the same as the mobile layout for local results.

Local 3-Pack Results

An example of the new local results layout.

Another change here has been the removal of links to the verified listing’s Google+ profile, effectively severing the connection between Google+ and the local results from the user’s side. (Companies still need to set up their business pages in order to show up here, though.)

The new local results have been seen immediately below the AdWords ads for that keyword (before all organic results), as well as below the first organic result. The most frequently seen variation, however, appears to be the local results sitting immediately above all organic results.

Implications Of The New Local Results Display

The most obvious implication, of course, is that it’s going to be relatively more difficult to rank for the local results now, with the number of local results displayed reduced by more than half.

Depending on which variation is displayed, paid results from AdWords may gain an advantage, if they’re the only results visible above the local pack.  Those top 3 local results may also enjoy more clicks, but according to an initial study by Moz, the organic listing has been seeing more clicks as well, with some users skipping the local pack entirely. (Possibly due to unfamiliarity with the new layout.)

However, in cases where the first organic result displays above the local results, that organic result received a significant majority of the clicks.

Obviously the full implications of the change-over are yet to be determined, as increased user familiarity with the new layout may produce different results, but there’s no doubt that the changed landscape of the results page will affect user behaviour.

One very interesting result reported so far is that despite the changes, the presence of reviews and ratings on a result appeared to noticeably increase the chances that that result would be clicked on, regardless of whether there was an organic listing above the local results or not.



Google Reveals RankBrain AI


OK, maybe not strictly AI in the truest sense, but a few days ago, in the wake of earnings reports from Google’s new parent company Alphabet, Google revealed that for several months already, an artificial intelligence system nicknamed “RankBrain” has been responsible for handling “a large fraction” of their daily user queries.

We’ve long known that about 15% of the queries Google receives every day are unique. In this sense, Google means that they are queries which have never been received before. (Most of them are probably very long-tailed queries, or “conversational” queries, which will be phrased differently every time.)

Google’s investment in AI has also been common knowledge for some time, and there has been speculation in the past that the ultimate end goal of Google search is in fact the development of true AI. That theory got a little boost a few days ago, when they confirmed that they are now using a machine learning system to handle those unique queries.

A Significant Signal

Greg Corrado, a senior research scientist at Google, explained to Bloomberg Business that while there are hundreds of different signals informing the algorithm that determines search results, in the few months that it has been active, RankBrain has become the 3rd most important of these signals. According to his explanation, although the other signals are based on the insights of people working in information retrieval, there has been no “learning” aspect to those signals. RankBrain is different.

The result of a year-long effort by a team of engineers, the system has so far proven more accurate than human search engineers at predicting which pages the search engine would display for queries, with an 80% accuracy rate compared to the 70% humans managed. Subsequent tests have found that excluding the system’s input produces outcomes which they describe as being as bad as “forgetting to serve half the pages on Wikipedia.”

Not A Replacement

It’s important to keep in mind that this is not a new Google algorithm, or a replacement. This is just one part of the algorithm, which deals mostly with queries that Google has never seen before, and most importantly, learns from those queries. With some 3 billion searches processed every day, that 15% works out to roughly 450 million queries a day, quite a serious chunk.

RankBrain appears to allow Google to determine patterns between ostensibly unconnected searches and learn how they are similar to each other, which it can then use to deal more effectively with other queries that share those subtle similarities.

What Does RankBrain Mean For Search?

In short, we don’t know yet. In true Google fashion, despite revealing this information, they still haven’t really said very much about it. There have already been a few speculative analyses of the patent that appears to be involved, but for now, the long (and even medium) term impact is unknown.

We can probably infer with a fair degree of safety, though, that its participation in the algorithm will increase over time. For now, it just looks like Google is going to be really good at resolving those long and complex search queries, and getting better all the time.

 

 


Pop-Up Mobile Ad Penalty


Earlier this month, Google’s latest algorithm update went into effect, penalising mobile sites with intrusive pop-up ads and interstitials (the full screen ads or notices that load on top of the page you were going to).

Intended to improve the user experience for people trying to access content on mobile devices, this change has followed on from the “mobile friendly” label they began showing in mobile search results as long as two years ago, in an effort to simplify mobile search results.

Intrusive Advertising To Affect Rankings

Under this change, sites which cover content with intrusive, hard-to-dismiss pop-ups or interstitials may see their rankings drop, potentially significantly in some cases.

According to Google, they’re targeting what they call “problematic transitions.” These include pages that show a pop-up immediately on arrival (or on scrolling) which hides the page content, pages that show pop-up ads which must be closed before the user can access content, and pages which show interstitial ads above the fold, and keep content below it.

This applies only to the first click from Google though, so secondary pages won’t be affected.

Acceptable Use

There are also a few instances where interstitials will not be affected by the new change, as long as they’re “used responsibly.” These include pop-ups which the site uses because of a legal obligation, such as age verification, log-in pop-ups on sites with private content, and “banners that use a reasonable amount of screen space and are easily dismissible.”

(As usual, Google is a little vague about what “reasonable” is, but offers the example of app install banners in Chrome and Safari.)

User Experience

Effectively, it all boils down to what has become perhaps the most important factor as far as Google is concerned…the user experience. If your site makes it easy for visitors to find what they’re looking for, and achieve their objectives in visiting, Google will approve.

Anything that interferes with, or needlessly complicates that user experience is likely to earn their disapproval.


The Personalisation Of Search


Search engine optimisation. One of the holy grails of online marketing is the first organic result for a search for your products or services. Everybody wants to be on the front page, everybody wants to be first. And equally obviously, only one result can be first, and only 10 (but probably fewer) results can be on the first page.

We all know that SEO (as usually practised by marketing companies on behalf of their clients) is built around this ideal of high rankings for relevant search queries, and all SEOs live in fear of the next big algorithm update, because you never know when your hard work will be wiped out by a change that invalidates previous efforts, or worse, penalises them.

But there’s a big factor that a lot of these efforts seem to overlook…

There’s No Such Thing As A First Page Anymore

Once upon a time, if you searched for something, and I searched for that same something, we’d almost always both see the same results. Hence the coveted “first place” in Google. But that’s just not true anymore. And it’s becoming less true on almost a daily basis.

For the last few years, Google search results have become increasingly personalised, delivering results based on user behaviour, location, personal search history, similar successful searches, and probably a range of other factors they don’t disclose.

That means that even entering an identical search to somebody else is likely to provide you with a different set of results, and the increasing use of Google’s “RankBrain” AI in delivering search results is only increasing the impact of search personalisation.

Volume Makes Learning Faster

Google processes 3.5 billion searches every day, and its algorithms learn from every search. If somebody searches for a specific term, finds their answer, and stays on the site, that search was a success. If they immediately bounce, then the search failed, and the next search for that term will take the success or failure of similar searches into account.

The search engine learns all the time, from every one of those billions of searches. It learns which pages offer value for which types of search, and crucially, for the various intents that users may have for a search.

Then it applies that “knowledge” to every other search, and so on, billions of times a day. The search algorithm is in the process of making itself obsolete. And it’s changing the very nature of SEO as it does so.

The Future Of SEO

A little earlier, I talked about the possibility of SEO work being wiped out by algorithm changes, and it’s certainly happened many times in the past. But there are some things about SEO that have never been threatened by those changes, and that means that the future of SEO looks, in some respects at least, remarkably similar to its past.

Good on-page optimisation is never going to go out of style. Unique, in-depth content has not only always been relevant, but seems to be becoming even more so as we move ahead. And the last thing that’s always going to be important is a good user experience. (Another factor that’s becoming even more important now.)

Even links (I sincerely hope) will eventually stop counting as a crucial ranking factor. But a well designed site, with functionality and content aimed at user needs, is not only always going to be good, but should continue to improve over time, as long as you stick to those always important principles.

 

 


Searcher Task Accomplishment – 2017’s Ugly SEO Content


If you head over to the biggest SEO forums at the moment, you’ll find them abuzz with a new and mysterious ranking factor called “Searcher Task Accomplishment”. It’s being dubbed the most important ranking factor to date. Forget about backlinks, traditional onsite SEO, Rankbrain and every other ranking factor you’ve ever worked on. Searcher Task Accomplishment is what ranking is all about, or is it?

What Is Searcher Task Accomplishment?

Before anybody starts to panic, I must stress that Searcher Task Accomplishment (STA) is not an official Google ranking factor. Google has not made any announcements regarding STA, and we have not encountered any major algorithm updates that suggest that STA actually exists. That said, there have been numerous little updates and tweaks that suggest that STA is indeed a real thing, and that it is Google’s ultimate endgame for search results.

STA is the idea that search results should be determined by the objectives of the user performing the search, and the satisfaction the user experiences when they receive those results.

According to Rand Fishkin, the man who recently put the idea forward, “no matter what someone is searching for, it’s not just that they want a set of results. They’re actually trying to solve a problem. For Google, the results that solve that problem fastest and best and with the most quality are the ones that they want to rank.”

Why Is This Google’s Goal?

To understand STA and where the idea comes from, we’ve got to take a look at industry insights, Google’s official statements, the updates Google have been releasing over a number of years, and their overall goals for search. In 2004, the Nielsen Norman Group noticed that users were no longer engaging with websites as they used to. Instead, they were asking search engines direct questions to get answers to their immediate needs.

In 2012, Google unveiled its knowledge graph, an algorithm designed to enhance its search results with semantic-search information gathered from a wide variety of sources. In 2013, Google rolled out the Hummingbird update to give search results more context. They’ve been on a roll ever since. Since the rise of the knowledge graph, Google has been rolling out hundreds of minor tweaks and updates to provide direct answers to questions in search results. Most recently, Google filed a patent for using facial expressions to gauge users’ reactions to search results and use those reactions as a ranking factor. From our point of view, Google is using STA, if not as a clearly defined algorithm, then certainly as an objective for search results.

What Does STA Mean For Content Marketing?

Searcher task accomplishment is something that SEOs have been aware of for some time, even if we didn’t have a name for it. We’ve been tailoring our content for Google’s Knowledge Graph to deliver quick, punchy answers to common questions and related searches. We’ve noticed the gains in traffic when we achieve a Knowledge Graph ranking, and we’ve risen to the challenge. Our article structures have changed, as have the criteria we use to grade quality content.

Unfortunately, this hasn’t always had the desired effect of improving the overall quality of writing on the internet. In the 90s and early 2000s, SEOs ruined the world wide web by stuffing articles with keywords and using bots to build millions of terrible backlinks.

STA is having a similar detrimental effect on content created for the internet today.

The content being written by SEOs at the moment is not awful. It’s relevant, and it answers a specific question. But that’s where it ends.

This is making it more difficult for users to find content with any depth.

It’s easier to achieve a knowledge graph ranking with a simple question-and-answer format than it is to put hours of research or hard-won technical expertise into an article.

Ranking For Content

To give you an example, if we conduct a simple search for “how to grow your business on youtube”, we’re greeted with rather lacklustre results that are thin on actual content or any useful marketing advice. Instead, we’re presented with formulaic thin content, written for rankings, not users. The formula looks something like this: [Number + Ways/Tips + Promising Search Question] = Rankings.

If we look at our top three organic search results, the content and structure match the question, but the answers provide very little information that actually answers it.

In position 1, Business.com suggests that we set up an account, stuff it with keywords, and copy someone else, in their first 3 steps. The next four steps aren’t particularly useful either. Nowhere along the way do we actually get an answer to how to market a business on YouTube, or any technical expertise on the subject matter. Instead, we’re presented with lacklustre fluff that’s easy to write.

In position 2, the second most useful result on the internet, we’ve got Business2Community.com. Their information gets a little closer to answering our question. They’ve covered some types of content you should produce, managed to gloss over advertising, and touched on branding. They still haven’t actually covered how to market my business on YouTube. Bullet-point ideas at best.

In position 3, we’ve got a course from Udemy.com, who are normally pretty good with their courses, but it isn’t free, and doesn’t happen to answer my actual question.

The rest of the results on page one follow the same formula and offer very similar content with a very similar structure. They all attempt to answer a question. They all employ a bullet point answer format. They’re all attempting to achieve a knowledge graph ranking of some sort or another. None of them actually answer the initial question.

If we head over to page 2, we’ll find an article by Forbes. It’s not the definitive guide to YouTube marketing, but it does actually cover some aspects of channel creation, content dissemination, keyword research, audience engagement and more. It makes an effort to provide us with useful information on our topic and, while it has followed the bullet point format, it has not done so in a manner designed to cater to the knowledge graph. The bullet points actually make sense in relation to our question.

Where To From Here?

Google’s content guidelines have not changed much since they were released. The emphasis is still placed on creating high-quality, useful content that is engaging, relevant, and, most importantly, does not employ tricks to improve SERP rankings. That said, part of an SEO’s job description is identifying ranking factors and finding ways to exploit them.

Our prediction has more to do with precedent than clairvoyance. We will continue to see a slew of content created specifically to cater to the knowledge graph. It will continue to be designed to answer specific one-line questions with generic, fluff-based bullet-point answers. The quality of the information provided by Google’s knowledge graph will continue to decline. At some point, Google will notice that searcher task accomplishment and the knowledge graph are being abused for the purpose of marketing, directly undermining their business model. At that point, Google will roll out an update. SEOs will probably call it something-ageddon.

Organic traffic for those who employ the strategy will plummet. We’ll blame Google. It’ll be super sad.

What Do We Suggest?

Well, when it gets right down to it, the same thing we’ve been saying for years. Stay focussed on quality. Write great, meaningful content that is designed to present your users with the information they need.

Should you steer clear of knowledge graph strategies? Of course not. They’re working. They do increase traffic. They do increase visibility. Just don’t employ the strategy without focussing on the same criterion that rules them all: Quality.

