
Blog Feed Post

Facebook And Twitter Shares Closely Linked With High Google Search Rankings, Says New Research

Study reveals comprehensive list of factors that correlate with ranking highly in Google searches

7 June, 2012, London, UK - The volume of Facebook and Twitter shares a web page generates is closely correlated with how high it ranks in Google UK searches, while too many ads on a page are likely to have a negative effect on search visibility. Top brand websites, however, appear to have a natural advantage when it comes to ranking highly.

The findings come from a study by search and social analytics company, Searchmetrics, which was aimed at identifying the key factors that help web pages rank well in Google UK searches. The company analysed search results for 10,000 popular keywords and 300,000 websites in order to pick out the factors that correlate[1] with a high Google ranking.

Five key findings of the study are:

1. Social media has arrived in search
Social signals from Facebook, Twitter and Google+ now correlate very strongly with good rankings in Google's UK index. The number of Facebook 'shares' a web page has received appears to have the strongest association (a correlation of 0.35), even higher than the combined total of shares, comments and 'likes' (0.34). Twitter is behind Facebook but is still the 6th strongest factor on Searchmetrics' list of Google ranking factors (see chart underneath), with a correlation of 0.24.

Searchmetrics found that the number of Google +1 recommendations pages achieve on the Google+ social network had a correlation of 0.37 with search rankings - the strongest correlation of any of the metrics analysed in the study. But the company urges that this finding be treated with caution, as Marcus Tober, CTO and founder of Searchmetrics, explained:

"Google+ does not currently have enough users for us to be totally confident about this finding. But it's indisputable that Google is trying to make Google+ an important player in most things it now does. So search and digital marketers should be sure to keep an eye on further developments."

2. Top brands appear to have a ranking advantage
Despite the perception of search as a level playing field, the study found that top brand websites seem to be enjoying a ranking advantage. Factors that are commonly believed to help web pages rank well, such as the quantity of text on a web page and having keywords in headlines and titles, don't seem to be required in the case of large, well known brands.

"Surprisingly, the data shows a negative correlation between these factors and rankings - contradicting traditional SEO theory. So not having keywords in headlines or having less text on a page seems to be associated with sites that rank higher," explained Tober.

"When we looked deeper at the top 30 results we found that this pattern really starts to emerge with highly ranked pages. And when we looked at sites that are in the top position on page one of Google - the natural position occupied by brands - this is where the negative correlation is strongest. This indicates that strong brands rank highly even without perfectly conforming to common SEO practice."

3. Too much advertising is a handicap
Too many and/or excessively intrusive advertisements were presumed to be a factor in the Google Panda Update and its successors, which have tried to lower the search visibility of poor-quality results. The data in this study supports this assumption, as all the analysed advertisement factors returned a negative correlation (-0.05).

A deeper analysis revealed that this pattern was strongest when there was a high percentage of Google AdSense ads; rankings for pages with more AdSense ad blocks seem to drop sharply. This supports Google's statements early in 2012, in which the company said that particularly prominent, distracting or above-the-fold ads could lead to ranking problems.

4. Quantity of links is still important but quality is vital
The number of backlinks (links to a web site from other sites) is still one of the most powerful factors in predicting Google rankings (with a correlation of +0.34). To get the most benefit, however, it appears a site needs to have a spread of links that looks natural - not like it was artificially created by SEO experts.

This means that a site should not simply have a large number of perfectly optimised links that include all the keywords it wants to rank for in the anchor text. It needs to have a proportion of 'no follow' links (links which do not convey ranking benefits) and links that contain 'stopwords' (such as 'here', 'go', 'this').
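As a rough illustration of what checking such a link profile might look like (the categories, stopword list and function names here are hypothetical sketches, not Searchmetrics' actual methodology), anchor texts could be classified and their mix summarised:

```python
# Hypothetical sketch: summarise the mix of anchor texts in a backlink
# profile. A "natural"-looking profile contains stopword anchors ('here',
# 'this', etc.) and brand/other anchors, not only keyword-optimised ones.
STOPWORD_ANCHORS = {"here", "click here", "go", "this", "website", "read more"}

def anchor_profile(anchors, target_keywords):
    """Return the fraction of keyword, stopword and other anchors."""
    counts = {"keyword": 0, "stopword": 0, "other": 0}
    for anchor in anchors:
        text = anchor.lower().strip()
        if text in STOPWORD_ANCHORS:
            counts["stopword"] += 1
        elif any(kw in text for kw in target_keywords):
            counts["keyword"] += 1
        else:
            counts["other"] += 1
    total = len(anchors)
    return {label: n / total for label, n in counts.items()}

# Example: a profile that is 100% keyword anchors would look artificial;
# this mixed one returns a third in each category.
profile = anchor_profile(["click here", "best shoes", "Acme"], ["shoes"])
```

A real audit would of course weigh many more signals (link source quality, 'no follow' attributes, link velocity); this only illustrates the anchor-text diversity idea described above.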

5. Keyword domains still frequently attract top results
Despite all the rumours to the contrary, web sites with keywords in the domain name still often top the rankings (correlation of +0.11). Although Google has repeatedly said that keyword domain sites will slowly weaken in power in searches, this does not yet seem to be the case.

"We collated the data for our research in February and March 2012, meaning it takes into account the impact of Google's various Panda algorithm updates that have greatly changed the look of search results since early 2011. We conducted similar studies in the USA, Germany, France, Spain and Italy and found very similar results across the board, which seem to show that these findings apply internationally," explained Tober.

He added however, that while Searchmetrics' study highlighted those factors that correlate with a high Google ranking, this does not mean those factors definitively cause or influence the ranking:

"Of course, only Google knows for sure as they control how the search algorithm actually behaves," he said.

A comprehensive report which outlines the UK results from Searchmetrics' ranking factors study can be downloaded here.


[1] Correlations were calculated using Spearman's rank correlation coefficient. A coefficient score of +1 implies a perfect positive correlation and a score of -1 implies a perfect negative correlation.
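For readers unfamiliar with the measure, here is a minimal sketch of Spearman's rank correlation using the shortcut formula for untied data, rho = 1 - 6·Σd²/(n(n²-1)); the function name and sample figures are illustrative, not data from the study:

```python
def spearman_rho(x, y):
    """Spearman's rank correlation for lists with no tied values."""
    n = len(x)
    # Map each value to its rank (1 = smallest).
    rank = lambda values: {v: i + 1 for i, v in enumerate(sorted(values))}
    rx, ry = rank(x), rank(y)
    # Sum of squared rank differences for each paired observation.
    d_squared = sum((rx[a] - ry[b]) ** 2 for a, b in zip(x, y))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

# Example: five pages' Facebook share counts against a "ranking score"
# (higher score = better position, so position 1 maps to score 5). Note
# that correlating shares against raw position number would flip the sign.
shares = [900, 700, 500, 300, 100]
score = [5, 4, 3, 2, 1]
print(spearman_rho(shares, score))  # -> 1.0, a perfect positive correlation
```

A +1 result, as in this contrived example, means share counts and ranking order agree exactly; the study's strongest observed coefficient was a far more modest 0.37.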

About the Searchmetrics Study
The study analysed search results for 10,000 keywords and 300,000 web sites, as well as billions of backlinks, Tweets, Google +1s and Facebook likes, shares and comments. The data was collected in February and March 2012. The correlations between different factors and the Google search results were calculated using Spearman's rank correlation coefficient.

About Searchmetrics
Searchmetrics is the global expert in search and social analytics software, empowering marketers to increase visibility and market share on the world's leading search engines. We create value by providing the best quality data on a global scale. Clients and partners worldwide rely on Searchmetrics to maximize return from search investments with actionable insights that help better manage, improve and scale search marketing campaigns.

Searchmetrics' robust search marketing tool, Searchmetrics Suite, is supported by a unique server infrastructure that offers monitoring of over 100 search engines in over 30 countries worldwide. Searchmetrics Suite is also home to the Searchmetrics Essentials data modules, SEO+SEM and Social, encompassing the largest, fastest databases for search and social media available.

Headquartered in Berlin, with subsidiaries and offices in New York, London and Paris, the company delivers real web intelligence to a growing international customer base. In 2011 Searchmetrics was named Best Technology Partner, EMEA by Adobe and is listed in the 2012 Always-On Media 100 companies. You can follow Searchmetrics on Twitter @Searchmetrics or on Facebook.

For more information
Chris Measures
07976 535147
[email protected]


More Stories By RealWire News Distribution

RealWire is a global news release distribution service specialising in online media. The RealWire approach focuses on delivering relevant content to the receivers of our clients' news releases, because we know that influence can only be achieved through delivering relevance.
