Blog

Archive for Search Engine Optimisation

Facebook thinks it has found a way to hurt Google’s search business

Facebook is testing its own search engine, which will allow users to find and post links to articles without venturing anywhere near Google.

The new feature is part of Facebook’s plan to keep internet users within its own ecosystem, stopping them from ending their mobile browsing session because of the awkward experience of finding, copying and pasting a link from Google.

Some users of Apple’s iOS mobile system in the US can now click on a new “add a link” button, which allows them to search for the link they want to share from within Facebook’s app. The keyword search sorts results by the likelihood of their being shared, prioritising newer or highly shared articles. Once they have picked the article they want from the results list, the user can publish the comment or status update as normal. It is not clear whether Facebook’s search engine looks for links inside Facebook or externally on the web.
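As a rough illustration of the kind of ranking described above, here is a toy Python sketch that scores links by share count and recency. The field names, decay function and numbers are invented for illustration; Facebook has not published its actual formula.

```python
from datetime import datetime, timezone

def share_score(article, now):
    # Toy scoring in the spirit described above: newer and more widely
    # shared links rank higher. The recency decay here is an assumption,
    # not Facebook's real algorithm.
    age_hours = (now - article["published"]).total_seconds() / 3600
    recency = 1.0 / (1.0 + age_hours)  # shrinks as the article ages
    return article["shares"] * recency

now = datetime(2015, 5, 2, tzinfo=timezone.utc)
articles = [
    {"title": "Old viral hit", "shares": 9000,
     "published": datetime(2015, 1, 1, tzinfo=timezone.utc)},
    {"title": "Fresh story", "shares": 300,
     "published": datetime(2015, 5, 1, tzinfo=timezone.utc)},
]
ranked = sorted(articles, key=lambda a: share_score(a, now), reverse=True)
```

With these made-up numbers, the day-old story with 300 shares outranks the months-old story with 9,000, which is the behaviour the article describes.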

Facebook told TechCrunch that it had indexed over 1 trillion posts to find out which posts were being shared, and who had shared them — data that Google doesn’t have access to.

The entire scheme is part of a larger ploy to keep users on Facebook. The social network has already announced plans to host articles natively on the News Feed and to split ad revenue favourably with publishers. If Facebook sells an ad, it will keep just 30% of the revenue, The Wall Street Journal reports. To woo publishers, the site is considering giving them 100% of revenue from ads they sell on Facebook-hosted news sites.

As native advertising grows, Google’s advertising business faces mounting challenges, especially on mobile. The company lost mobile ad market share in 2014, according to eMarketer, falling to 38.2% from 46% in 2013, while Facebook’s share rose to 17.4% from 16.4%. Google had a boost in the first three months of 2015, as the lower rates charged for mobile advertising, which had previously worried investors, were outweighed by the number of ads sold.

By making it easy to find and recommend articles and other sites, Facebook is creating an ecosystem that — it hopes — will give users less and less reason to leave.

Source: BusinessInsider.com.au

Posted in: Facebook, Latest News, Search Engine Optimisation


Apple Confirms Their Web Crawler: Applebot

After much speculation about an Apple web crawler, Apple has finally posted a help document confirming the existence of Applebot, its web crawler.

Applebot is the web crawler for Apple, “used by products including Siri and Spotlight Suggestions,” the company said.

The user-agent string will typically look like the following, but will always contain “Applebot”:

Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 Safari/600.2.5 (Applebot/0.1)
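If you want to spot Applebot in your own server logs, a minimal check for that token might look like this (a sketch; the helper name is ours, not Apple’s):

```python
def is_applebot(user_agent: str) -> bool:
    # Per the note above, the token "Applebot" always appears in the
    # user-agent string, so a case-insensitive substring check suffices.
    # For stricter verification you would also confirm the requesting
    # IP falls within Apple's 17.0.0.0 net block.
    return "applebot" in user_agent.lower()

ua = ("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) "
      "AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 "
      "Safari/600.2.5 (Applebot/0.1)")
```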

Apple says Applebot will respect the customary robots.txt rules and robots meta tags. Applebot currently originates from the 17.0.0.0 net block. If you do not mention Applebot in your robots.txt, Apple will follow the directives you specify for Googlebot. So if you want to block both Applebot and Googlebot, blocking Googlebot alone would work, but I’d recommend blocking each individually.
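For example, to give Applebot explicit rules of its own rather than letting it inherit Googlebot’s, a robots.txt might look like this (the path is a placeholder):

```
User-agent: Googlebot
Disallow: /private/

User-agent: Applebot
Disallow: /private/
```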

If you notice unusual AppleBot activity, you can reach Apple about it at Apple-NOC “at” apple.com.

It is unclear whether Apple plans to build a search engine and compete with Google, but this is one step closer to that.

Postscript: Apple Insider reported later today that Apple has a “rapidly-expanding internal search group” building its own version of a web search engine via Spotlight. So it does appear Apple is venturing into Google’s search space – at least based on these early reports.

Source: SearchEngineLand.com

Posted in: Latest News, Search Engine Optimisation


How Google Search Works: 60 Trillion Pages and Counting

According to Google, there are now more than 60 trillion pages in its database. Watch and scroll through the story to see how it all works.

An old 2010 video by Matt Cutts helps explain how Google searches the web.

Google uses more than 200 ranking factors. See the graphic below:

200 Google Ranking Factors

Posted in: Blog, Google Updates, Latest News, Matt Cutts, Search Engine Optimisation, Website Design & Optimisation


Should You Update Your Website?

How Old Is Your Website Design, and Should You Update It for Ranking Purposes?

Matt Cutts from Google explained in a recent video Q&A session that a website could lose its position if it is not being maintained or updated. He stated that an older website that had not had a “facelift” could start to lose rankings because its user experience is not as good as that of a more modern, up-to-date website.

Google has for many years stated that ranking is wholly focused on the end-user experience. Its entire business model relies on giving its clients, the people searching, a positive user experience: if a person is searching for information, Google ranks the results in the order it believes will serve that searcher best. Is the website relevant to the searched phrase? Is the site authoritative on the subject matter? Once the person visited the site, did they stay, and if so, for how long? If, on the other hand, the person quickly returned to the search results page, that tells Google the website was either not relevant or offered a poor user experience, and this will influence Google’s results the next time the same search is requested.
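That “quick return” behaviour is often called pogo-sticking. Purely as an illustration of the signal being described (the threshold and function are our assumptions, not figures Google publishes), it could be measured like this:

```python
def bounce_rate(dwell_seconds, quick_return=10.0):
    # Fraction of visits where the searcher went back to the results
    # page within `quick_return` seconds. A high value would suggest
    # the page was irrelevant or offered a poor experience.
    quick = sum(1 for t in dwell_seconds if t < quick_return)
    return quick / len(dwell_seconds)
```

For example, `bounce_rate([5, 120, 3, 60])` gives 0.5: half the visitors bounced straight back.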


Posted in: Latest News, Matt Cutts, Search Engine Optimisation, Website Design & Optimisation


Get Your Content Indexed Faster by Google

Utilizing the various functions that Google Webmaster Tools has to offer is a surefire way to help keep your website running like a well-oiled machine. Two tools our SEO team uses on a regular basis and finds to be extremely beneficial are the Crawl Errors report and Sitemap submission tool.

Amongst the toolkit is the Fetch as Google option, which also gives users an opportunity to submit their URL to the index. Surprisingly, this tool is often under-utilized by bloggers, webmasters, and SEO strategists. This is a convenient way to speed things up considerably if you have new content that you’d like to be discovered and found in the SERPs.

Website owners and marketers often publish new web pages or blog posts on their website, sit back, and wait for them to show up in the Google search results. But that can take weeks or even months to happen! The more savvy marketers will ensure that any new content is included in their XML sitemap and then resubmit the sitemap to Google and Bing.
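As a sketch of that last step, a minimal XML sitemap can be generated with Python’s standard library before resubmitting it through Webmaster Tools (the URL and date are placeholders):

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org 0.9 protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    # Build a minimal <urlset> with one <url> entry per page,
    # recording its location and last-modified date.
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for page in pages:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = page["loc"]
        ET.SubElement(url, f"{{{NS}}}lastmod").text = page["lastmod"]
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    {"loc": "https://www.example.com/new-post", "lastmod": "2015-05-01"},
])
```

The resulting string can be saved as sitemap.xml at the site root and resubmitted via the Sitemaps section of Webmaster Tools.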

Source: SearchEngineWatch.com

Posted in: Latest News, Search Engine Optimisation


Hummingbird: Google’s Biggest Change in 12 Years Launched Today

It’s probably going to be more important than ever to give Google as much information about your site as possible, so that it “understands” it. I would imagine that Google will continue to give webmasters new tools to help with this over time. For now, according to Google (per Sullivan’s report), you don’t need to worry about anything, and Google’s normal SEO guidance remains the same.

It’s clear that keywords are becoming less and less important to search engine ranking success as Google gets smarter at figuring out what things mean, both on the query side of things and on the webpage side of things. Luckily, Hummingbird presumably still consists of over 200 different signals that webmasters can potentially take advantage of to gain a competitive edge.

In particular, Google said that Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.

Does this mean SEO is dead?

No, SEO is not yet again dead. In fact, Google’s saying there’s nothing new or different SEOs or publishers need to worry about. Guidance remains the same, it says: have original, high-quality content. Signals that have been important in the past remain important; Hummingbird just allows Google to process them in new and hopefully better ways.

Here is the latest information on Hummingbird by Danny Sullivan

Posted in: Google Updates, Latest News, Search Engine Optimisation


Google Launches Hashtag Searches, Shows Google+ Posts On Search Results Page

According to Matt McGee:

“The integration of Google+ across Google properties continues in a big way today with the launch of hashtag search.

Simply put, if you search Google for a hashtag, you might see Google+ posts using that hashtag to the right of the regular search results.” However, this is currently only available on Google.com and Google.ca, although it is expected to be rolled out globally following testing.

http://searchengineland.com/google-launches-hashtag-search-shows-google-posts-on-search-results-page-172725

Posted in: Blog, Google Updates, Latest News, Search Engine Optimisation


Penguin 2.0 has just been rolled out.

According to Matt Cutts, users will notice a significant change in search results.

I am not sure whether you have prepared, but it has been reported that websites with strong social signals and social activity have not been badly affected.

In fact, social signals and authority have become the latest focus and are gaining importance in search ranking.

Posted in: Blog, Brand Development, Search Engine Optimisation, Social Media Marketing, Website Design & Optimisation


Google Targets Spammers in 2013

Spammers beware! This summer, Google is going after black-hat and link spammers. In a YouTube video, Google’s Matt Cutts announced nine things that Google will do in the coming months. How will these changes affect the position of your web pages in Google’s search results, and what can you do to protect your rankings?

Posted in: Blog, Search Engine Optimisation, Website Design & Optimisation


2012 Google Push Into Social Media

Social Media and Google+

You probably already know that we focus on search engine optimisation, AdWords, online and offline marketing, and software to support them.

Backlinks used to be the most important factor, and they are still very important. However, especially after the latest Google updates, there is a new factor that is rapidly rising in its relative importance for successful search engine optimisation.

I’m talking, of course, about social signals, and you will certainly hear more about them in 2012. That’s because this is the year Google is making an all-out push into social media and its integration into its search-ranking algorithms.

So what are social signals? Very simply, it’s a fancy name for something you’re probably already doing: mentioning your blog posts or new sites on Google+, Facebook, Twitter, LinkedIn and other such sites. If you do this, you’re already sending Google a signal that the blog post or new site is important, and Google will take notice of the fact that your content was mentioned on the social sites. That’s the long and the short of it.

Social signals are a very good indicator for Google of whether a specific piece of content has value, because these signals are supposed to come from humans. Of course, the temptation is there to “trick” this new system, just as people game their backlinks today using outlawed link farms.

But I advise against it: with its massive computational resources, Google should find it quite easy to determine whether a social signal is noise or genuine, because it is very hard for the tricksters to avoid leaving footprints of their activities. Even innocuous factors such as the timing of social signals can be analysed, and a fake signal will stand out like a big red pimple in the middle of legitimate data. That is exactly why Google is raising the importance of legitimate social signals when determining what is important and valuable content to show searchers.

So what can you do? The solution is simple: write valuable content that will generate social signals naturally. It’s the old Google mantra all over again: write for the human reader, not the search engine. If people like your content, they will talk about it on Facebook, Twitter and Google+.

Posted in: Blog, Search Engine Optimisation, Social Media Marketing, Website Design & Optimisation
