The human-to-human connection is powerful, yet in our digital world genuine human contact is fast disappearing. Brands are moving further and further away from the consumers they need to connect with in order to form relationships that lead to heightened awareness and increased sales. The gap between brand and consumer must be bridged for success to flourish.
On Friday 10th May 2013, Matt Cutts, Google’s head of search spam, announced in a Twitter post that the next version of Google’s infamous Penguin algorithm update would be launched within the next few weeks. This was followed by a new video answering the question “What should we expect in the next few months in terms of SEO for Google?” Some big changes are coming and SEO strategies will need to adapt, so how do we prepare?
So let’s start with the video posted on 10th May 2013.
You will note in the video that Matt refers to 10 key points. He explains that all of these changes will help to improve search results by rewarding good sites and penalising the spammers and black hats. Here is a summary:
1) Penguin Updates
The next Penguin algorithm update is said to go deeper and will have an even greater impact than its predecessor. The first version of Penguin had a dramatic impact on the SEO community. Are SEOs ready for the potential shock wave, and have they learned from previous mistakes? This really is a scary one.
2) Advertorial Content
Earlier in the year Google penalised Interflora and a number of UK newspapers for selling advertorial content. This was followed by a very stern warning and reminder from Google that selling links which pass PageRank can and will lead to a penalty. Matt Cutts promised that Google will take a much stronger stance against those using advertorials in a way that violates the Google Webmaster Guidelines. So if you are using this type of promotional marketing, stop now.
3) Spam Queries
Spammy search queries, such as “payday loans” and pornographic searches, have by their very nature not been a priority for Google until now. It seems Google has received requests from outside the company to clean up these sectors, and as a result Matt Cutts has said that Google is likely to look into this area in the near future. Many legitimate sites and businesses get caught up in the tarnish of spammy search queries; if Google can separate the genuine businesses from the spammers, perhaps this could add value for the user and provide a new level of search quality.
4) Link Spammers & Link Networks
We know from past algorithm updates that Google is targeting link networks, and it seems Google will be making a greater effort to go further “upstream” to deter link spammers and reduce the value of these types of links. Old link strategies are already dead, but this really is the final nail in the coffin for companies still using mass-submission methods of link building. Link building takes time: invest in building relationships, reaching out to site owners and bloggers, writing high-quality content, and engaging with your target audience.
5) More Sophisticated Link Analysis
Matt Cutts has said that Google will be improving its ability to analyse links with more sophisticated link analysis tools. These appear to be in the early stages of development, but once launched they should give Google a greater level of understanding and the ability to remove a larger volume of low-quality link sites from the index. A bit of a heads-up for SEOs still working with low-quality link partners.
6) Help & Improvements for Hacked Sites
Matt has acknowledged that Google is working on an update to better detect and manage hacked websites in the coming months. Combined with the plan to improve webmaster communication, perhaps site owners will be able to minimise the damage caused by hacks and, with Google’s help, recover quickly.
7) Authority Boost for Niche Sites
Niche sites with good, high-quality content will be given an authority boost within their chosen field of expertise. This will be a well-received change, allowing smaller sites to rank highly for authoritative content and providing users with more insightful and relevant results.
8) Panda Update
The Google Panda update has caused a huge number of issues for site owners. Matt Cutts acknowledges that many of the impacted sites are borderline cases, and as such Google is looking at additional metrics that could soften potential penalties and reduce the impact.
9) Domain Clusters in Search Results
Domain clustering has long hindered some searches; in some cases Google displays a whole page of results from the same domain. Matt Cutts has said that Google will work towards making the first page of search results more diverse, although the second results page may show more clustering as a result. This is an area Google has been working on for some time.
10) Improving Webmaster Communication
This is an area Google has continued to work on. Matt Cutts has said to expect a greater level of detail in webmaster notifications within Google Webmaster Tools. The current notifications are still very cryptic and generic, so any improvement here will be well received.
Preparing for Penguin
So let’s talk a little more about the Google Penguin algorithm update. What is it? Quite simply, it’s a link-quality filter that works separately from the core Google algorithm. It’s a manually triggered update, run sporadically and used to clean the Google index. It’s designed to remove link networks and poor-quality link sites from the index, as well as sites that try to manipulate their search engine presence through unethical practices that fall outside the recommendations in Google’s Webmaster Guidelines. So far there have been three Google Penguin updates:
Penguin 1 on April 24th 2012
Penguin 2 on May 24th 2012
Penguin 3 on October 5th 2012
We know that the next Google Penguin update will be a game-changer, and we know Google has been recording and storing information about low-quality link sites through its link disavow tool and through reconsideration requests. With this in mind, it’s fair to assume that Google now has a much greater understanding of the bad link neighbourhoods that have been used to manipulate rankings, as reported by users themselves. This new data resource has to be a core part of the new Penguin update, so expect to see some serious fireworks when it launches.
So if you feel your link profile is poor and you’re worried about the impact Penguin will have on your site, what should you be doing to reduce the potential damage? Let’s run through some of the key areas:
1) Bad Link Neighbourhoods
Google has spent a lot of time researching and assessing bad links: where to find them and what their DNA looks like. There are a number of good tools on the market that attempt to differentiate between trustworthy and spammy linking sites. It’s possible to produce a link graph that plots the interactions between links, their ownership and how they are associated, showing link clusters and nodes grouped by IP address. By understanding the interactions between known bad link sites and unclassified linking websites, it is possible to identify sites hosted in bad link neighbourhoods. Will Google take a strong standpoint and penalise all sites hosted in bad link networks?
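To make the idea concrete, here is a minimal sketch, assuming entirely hypothetical domains, IP addresses and spam lists, of how linking sites might be grouped by hosting subnet and flagged when they share a neighbourhood with known spam domains. Real analysis would fold in far richer signals such as ownership, registration data and outbound link patterns.

```python
from collections import defaultdict

# Hypothetical known spam domains and backlink-to-IP data for illustration only.
KNOWN_SPAM = {"cheap-links.example", "seo-farm.example"}

backlinks = {
    "cheap-links.example":   "203.0.113.10",
    "seo-farm.example":      "203.0.113.22",
    "innocent-blog.example": "203.0.113.40",   # same subnet as the spam sites
    "news-site.example":     "198.51.100.7",
}

def subnet(ip: str) -> str:
    """Collapse an IPv4 address to its /24 network, e.g. 203.0.113.0/24."""
    return ".".join(ip.split(".")[:3]) + ".0/24"

# Group linking domains by the subnet they are hosted on.
clusters = defaultdict(set)
for domain, ip in backlinks.items():
    clusters[subnet(ip)].add(domain)

# Flag any cluster that also contains a known spam domain.
for net, domains in clusters.items():
    if domains & KNOWN_SPAM:
        suspects = sorted(domains - KNOWN_SPAM)
        print(f"{net}: bad neighbourhood; also hosts {suspects}")
```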
Google assigns a level of authority or weight to every page; this is called PageRank. The higher a page’s PageRank, the greater the importance Google assigns to it. Although Google tells us not to focus on PageRank, it’s an active signal that plays an important part in a site’s potential online presence. So you can understand why link spammers focus on selling links in order to pass PageRank from site to site: it’s a valuable commodity, and the reason why so many bad neighbourhoods exist.
A proportion of a page’s PageRank is passed along each of its outbound links, boosting the authority and weight of the pages it links to. PageRank is earned and built up over time. We know that if a group of link-spam sites originates from one IP cluster, there is a good chance that more spam sites exist within that same IP group, and the longer these sites exist the easier they are to discover.
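For anyone curious about the mechanics, below is a short, simplified sketch of the published PageRank idea (power iteration with a damping factor) on a tiny invented link graph; it is illustrative only and says nothing about Google’s production implementation.

```python
# Simplified PageRank by power iteration. The link graph is hypothetical;
# each page splits its score evenly across its outbound links.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],          # D links out but receives no links itself
}

damping = 0.85
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):                                   # iterate until scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)            # PageRank passed per outbound link
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

The real calculation handles dangling pages and enormous scale, but the core behaviour is the same: a link passes a share of its page’s score to the page it points at.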
By understanding where good and bad neighbourhoods exist, Google could easily apply a level of trust to each link. A similar metric, Trust Flow, is currently used by Majestic SEO alongside Citation Flow to identify untrustworthy links. It seems logical to apply a level of trust to a site as part of the algorithm’s assessment of a domain or a web page. Now that Google has assembled a more detailed database of spam sites through the link disavow tool, will trust be factored into the next Penguin update?
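The general idea of propagating trust outwards from a hand-reviewed seed set is well documented in the TrustRank literature. The sketch below follows that approach with a hypothetical graph and seed list; it is not a description of Majestic’s Trust Flow calculation or of anything Google actually does.

```python
# TrustRank-style sketch: trust starts at hand-reviewed seed sites and decays
# as it flows along links, so pages far from any trusted seed end up with
# very little trust. The graph and seed list are hypothetical.
links = {
    "trusted-news.example": ["blog-a.example"],
    "blog-a.example":       ["blog-b.example"],
    "blog-b.example":       ["link-farm.example"],
    "link-farm.example":    ["blog-b.example"],
}

seeds = {"trusted-news.example"}
decay = 0.85
pages = list(links)
trust = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}

for _ in range(20):
    new_trust = {p: ((1 - decay) / len(seeds) if p in seeds else 0.0) for p in pages}
    for page, outlinks in links.items():
        share = trust[page] / len(outlinks)
        for target in outlinks:
            new_trust[target] += decay * share
    trust = new_trust

for page, score in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.4f}")
```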
The next Penguin update is set to be huge, so if you have engaged in grey-hat or black-hat practices, start thinking about removing the low-quality links pointing to your site.
Taking a sceptical view, you have to wonder whether Google is monitoring link removal after such a public announcement that Penguin is coming in a few weeks: watching for site owners removing links flagged in bad neighbourhoods, in order to understand who has been attempting to manipulate their link profile or undertaking practices outside Google’s Webmaster Guidelines. We know Google follows a long, thought-out plan of strategic decision-making, taking multiple steps to shepherd users towards an end goal, but we can only speculate. (Take some time to research game theory, an approach Google is known to draw on.)
2) Link Velocity
It’s unknown as to weather Google measures and gives weight to link velocity but it’s a metric that is of an immense amount of interest, as a result there has been much speculation about how Google could easily utilise Link Velocity as a trigger or a signal in its Algorithms.
Link velocity is the volume and rate of newly discovered links plotted over a period of time. A strong press release, a news story or a well-written piece of content can cause a spike in link velocity isolated to a specific time frame: the popularity of the content produces a clearly defined rise in new link discovery from the date of publishing, with the original piece of content seen as the source of that rise.
It’s also worth mentioning that the rate at which new links are discovered by Google during its natural crawling cycle will vary depending on the importance of the linking site and the frequency at which Google crawls content on each linking site. High profile social links will be first to be discovered, personal blogs and lower quality websites can take much longer. Is Google able to understand the complexity of link discovery?
A website with an active user community, writing content on a regular basis and engaging with social communities, would see a steady level of growth and as such would expect a regular, steady increase in link velocity. If Google can pinpoint the source of the growth, i.e. the press release or the regularly updated content, then it should be able to understand the types and context of the links that have “naturally” occurred as a result of the original content. The theory here is that Google is, or will be, able to understand the origin of a spike in link velocity, and secondly the relevance and type of links created as a result. It should also be able to understand trends in link discovery; for example, in a natural profile the high-value links would be discovered first.
It is also believed that Google maps the average level of growth, then analyses and devalues unnatural links that occur in spikes above the natural link velocity. If those links have no single source of creation, i.e. they relate to no specific origin and are not tied to a relevant piece of source content, then potentially they would be reviewed further and could be devalued or removed.
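As a thought experiment only, here is a small sketch of how such a check might look: it tracks weekly counts of newly discovered links and flags any week whose velocity jumps well above the recent baseline. The numbers, window size and threshold are all invented.

```python
from statistics import mean

# Hypothetical weekly counts of newly discovered backlinks for one site.
weekly_new_links = [12, 15, 11, 14, 13, 16, 240, 18, 14]

WINDOW = 4        # weeks of history used as the "natural" baseline
THRESHOLD = 3.0   # flag weeks more than 3x the recent average

for week, count in enumerate(weekly_new_links):
    history = weekly_new_links[max(0, week - WINDOW):week]
    if not history:
        continue                      # no baseline yet for the first week
    baseline = mean(history)
    if count > THRESHOLD * baseline:
        print(f"Week {week}: {count} new links vs baseline {baseline:.1f} -- "
              f"spike; check whether a press release or story explains it")
```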
3) Social Signals & Engagement
Social authority is an interesting and exciting metric, but it has its limitations, primarily within the business-to-business marketplace. For consumer-facing brands and websites, measuring social authority seems an excellent method of gauging popularity and importance. We know that Google has taken a strong interest in social signals, but at the moment the weight of these signals within the algorithm is believed to be low. Is this about to change?
Google will need to analyse the social landscape in great detail in order to accurately understand the importance of social signals, ideally understanding the sentiment of a post and its comments to assess positive and negative levels of engagement, the source of the comments and their potential impact. Social likes and follows alone do not provide an accurate measure of popularity and can be easily manipulated, but when measured in combination with social engagement, link velocity and general applause, the metric starts to become exciting.
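Purely to illustrate the “combination of signals” idea, here is a toy scoring sketch. Every weight, field name and input value is invented, and the formula has no connection to any real ranking system.

```python
# Toy composite "social authority" score. The point is only that raw likes and
# follows are diluted by engagement-quality signals that are harder to fake.
WEIGHTS = {
    "followers":     0.10,   # easy to buy, so weighted low
    "shares":        0.30,
    "comment_rate":  0.25,   # comments per post, a proxy for engagement
    "link_velocity": 0.25,   # new links per week attributed to social reach
    "sentiment":     0.10,   # -1.0 (negative) .. +1.0 (positive)
}

def social_authority(signals: dict) -> float:
    """Weighted sum of signals, each expected in the 0..1 range
    (sentiment is rescaled from -1..1 to 0..1)."""
    signals = dict(signals)
    signals["sentiment"] = (signals["sentiment"] + 1) / 2
    return sum(WEIGHTS[name] * value for name, value in signals.items())

brand = {
    "followers": 0.8,      # large but partly bought following
    "shares": 0.2,
    "comment_rate": 0.1,
    "link_velocity": 0.15,
    "sentiment": -0.4,     # mostly negative chatter
}
print(f"social authority: {social_authority(brand):.2f}")
```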
It’s not perfect. One potential problem with social authority signals is whether negative sentiment should be treated as an increase or a decrease in popularity. If a company attracts a huge amount of negative press, and as a result a huge increase in negative social signals, should its social authority increase? In real terms the popularity of the site has increased and so its ranking should improve, but should Google act as a judge or remain impartial?
The biggest issue is that not all sites engage in social media, and that choice should not lead to those sites being penalised, so more traditional measurements and signals will still need to be used. With this in mind it’s difficult to see social media becoming a high-value signal within Google’s algorithms, at least not for a good few years, but it makes sense to expect the weight and impact of social authority to be upgraded, so we will have to wait and see what the future holds!
4) Anchor Text
Anchor text was a key metric and indicator for identifying spam websites in the last two Penguin algorithm updates. We now know that a site’s anchor text needs to be diverse: pre-Penguin, the tolerated proportion of commercial anchor text was around 80%, with the remaining 20% made up of branded and non-branded “white noise” links. That ratio has changed and is currently believed to be around 50/50, but it’s expected to shift even further towards a more natural anchor text profile. So what is natural, and what ratio is safe? It is being predicted that after the next Google Penguin update the ratio will be around 45/55 to 40/60 (commercial to branded); at this point that is pure speculation, but it seems a reasonable assumption.
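If you want to sanity-check your own profile against these speculative ratios, a rough sketch like the one below is enough. The anchor list, brand terms and the 50% threshold are illustrative assumptions, not published limits.

```python
# Quick-and-dirty anchor text audit on a hypothetical backlink profile.
anchors = [
    "cheap blue widgets", "buy widgets online", "Example Ltd",
    "example.com", "click here", "widgets", "Example Ltd blog",
    "https://www.example.com", "best widget prices", "read more",
]

BRAND_TERMS = ("example",)                       # brand name fragments
GENERIC = {"click here", "read more", "website", "here"}

def is_commercial(anchor: str) -> bool:
    a = anchor.lower()
    if a in GENERIC or any(term in a for term in BRAND_TERMS):
        return False                  # branded / white-noise anchors
    return True                       # everything else treated as commercial

commercial = sum(is_commercial(a) for a in anchors)
ratio = commercial / len(anchors)
print(f"commercial anchors: {commercial}/{len(anchors)} ({ratio:.0%})")
if ratio > 0.5:
    print("above the speculative 50% line -- profile may look over-optimised")
```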
The changes that are coming are almost guaranteed to be controversial. No one is in a position to accurately predict what will happen, we can only speculate, but we do know that Google is hell-bent on making it harder to manipulate PageRank and on devaluing low-quality link-building practices. The key to survival will be to adapt: spend time building high-quality content, engage with social networks and take a much more traditional marketing view of SEO, working on the positive promotion of your brand and your website. This is nothing new; we all know that good content wins out in the end, but while there are quicker methods of achieving growth, companies will always be drawn to them for quick wins.
I’ve received a lot of questions about weather Google uses Twitter and Facebook as a ranking signal, below is a great video from Google where Matt Cutts explains.
This is a great video created by Common Craft and Search Engine Land; I love the analogy of search engines as librarians. SEO, or search engine optimisation, is the process of increasing a website’s exposure and visitor traffic for relevant, targeted keyword searches through natural search engine results (in search engines like Google, Yahoo and Bing). A keyword’s ranking, or listing position, is determined by an algorithm which assesses the quality and relevance of the website and landing page against hundreds of ranking factors. The higher a site ranks for a keyword search, the larger the volume of exposure and visitor traffic it will receive.
So here is the video; I hope you enjoy it as much as I did!
Article by Matt Blay
It’s something we have all known for a long time in the SEO community, Google do not use the Meta Keyword Tag to determine listing relevance or the ranking position of a web page within a Google search results page. Matt Cutts, head of web spam at Google, finally acknowledges this in the following video where he discusses Google’s use of the tag and its importance. The keyword tag is still used by other search engines and does still have an important use in SEO, all be it a lot less than in the past. It’s still worth using and including but not essential.
Posted by Creative SEO UK
Matt Cutts discusses how geo tags influence country-specific search results.
As part of the Google Webmaster Central channel, Matt Cutts is continuing his series of informative SEO question-and-answer videos. Here Matt discusses a server’s geographic IP location and how it affects SEO.
A very interesting video: Matt discusses some basic SEO principles and reviews a few websites.