Flight Center's SEO Practices Shaken at SMX Sydney

Monday, 14 April 2008

SMX Sydney took place last week, on 10 and 11 April 2008, at Luna Park in Sydney. For the first time in Australia, the event, organised by Barry Smyth, brought together many international and local speakers to discuss the latest trends in the search marketing industry. One part of the conference, called the SEO Clinic, put selected real-life websites in front of a panel of experts for review.

In 2008 the panel included Rand Fishkin from SEOmoz, Danny Sullivan from Search Engine Land and Adam Lasnik from the Google spam team. These are probably some of the most recognised professionals in the industry, and having their input can save you a lot of work and help you improve the optimisation of your website. That is great, you might say. The drawback is that high-level experts can also find all sorts of things that leave you quite embarrassed in front of a live audience, and in front of the industry.

How Did Flight Center Get Shaken?

That is exactly what happened to one of Australia's largest travel websites: FlightCenter.com. When Rand and Danny ran a duplicate-content analysis for DiscoverTasmania.com, they found that its content was also being served on the Flight Center website. However, the same content was not visible to a normal user browsing the Flight Center website. That's where it starts to get tricky.
After a little more digging, they noticed that the content on the Flight Center website is served differently to users and to search engine robots (responsible for crawling and indexing pages). In practice it means that, if you are an ordinary visitor to the website, you would see this:

But if you are a Google Bot, you would see this:

This practice is called cloaking, which is identified as a black hat SEO practice by Wikipedia:

Cloaking is a black hat search engine optimization (SEO) technique in which the content presented to the search engine spider is different from that presented to the users' browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page. The purpose of cloaking is to deceive search engines so they display the page when it would not otherwise be displayed.
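To make that definition concrete, here is a minimal sketch of user-agent-based cloaking. This is purely illustrative (the bot list, page bodies and file names are all my own invention, not Flight Center's actual implementation) and is shown only to explain the mechanism, not to recommend it:

```python
# Illustrative sketch of user-agent-based cloaking (a black hat technique,
# shown only to explain the mechanism described in the Wikipedia quote).

KNOWN_BOTS = ("googlebot", "slurp", "msnbot")  # hypothetical crawler list

def is_search_engine(user_agent):
    """Crude check: does the User-Agent header look like a crawler?"""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOTS)

def serve_page(user_agent):
    """Return a different page body depending on who is asking."""
    if is_search_engine(user_agent):
        # Plain, crawlable HTML full of indexable text for the spider
        return "<html><body><h1>Cheap flights to Tasmania</h1>...</body></html>"
    # Interactive JavaScript version shown to human visitors
    return "<html><body><script src='catalogue-viewer.js'></script></body></html>"
```

Because the two branches return different markup, a crawler and a browser see different pages for the same URL, which is exactly the pattern the panel spotted. Real implementations also match crawler IP ranges, since the User-Agent header is trivial to fake.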

Now it is getting really embarrassing, especially when this happens in front of an audience of search marketers.

Why did Flight Center use Cloaking?

As described by Flight Center's GM Colin Bowman on the Neerav blog, Flight Center wanted to create more interactivity and a better visual experience when a user browses their catalogue. To serve this purpose they decided to use JavaScript to display the content. However, search engine spiders cannot read content generated by JavaScript, which means the pages cannot be indexed and therefore cannot rank in the SERPs. Obviously this has a bad impact on traffic, and Flight Center and their technology provider wanted to avoid it.

The alternative they found (probably not the best one) was to serve a text-based version to search engine spiders, so that the content could be indexed, and a JavaScript version to end users, to provide optimal interactivity.

Are Flight Center's SEO Practices Black Hat?

While cloaking is against the Google guidelines and considered a black hat practice, the Flight Center case is quite ambiguous. As noted by Rand on SEOmoz, the content served to search engine spiders is identical to the content served in the user's browser, so the intent to 'deceive' search engine spiders is not fully there.
Flight Center's GM Marketing Colin Bowman has also stated that their intention was not to deceive search engines, but to make sure that the content of their web pages would be properly indexed:
It is clear that our intent is not to show content to spiders that differs from the content in the pages and therefore should not be regarded as blackhat cloaking. The content that was visible to Google’s spiders is an identical replica to what is shown in the customer friendly brochure viewer so no unfair advantage was gained nor sought.

It sounds like Flight Center had no real intention of deceiving search engines, only of getting their content indexed; however, they made a poor (if not worse) choice in using cloaking for this purpose.

Update - 21st April:
Based on further research, discussions and feedback (thanks Webco), it appears that Flight Center did not directly implement cloaking on their website; rather, the cloaking was part of the catalogue solution offered by Catalogue Central (one of their technology providers). This makes Google's decision even more understandable.

What is the outcome for Flight Center?

Having Adam Lasnik on the panel was probably not the best thing ever for Flight Center, and it seems that the content in the catalogue section of the Flight Center website has already been removed from the Google listings. However, the Flight Center website is still present in the Google SERPs, which seems to show that the website has not been fully banned.
Probably worse (or not?) than being banned from the Google listings, Flight Center will have to review their SEO practices and probably work on some serious reputation management, as the news has already generated quite a lot of buzz in the industry.

What is the outcome for the Australian SEO Industry?

The Flight Center case also brings us back to the very unpopular debate on skill levels and practices in the Australian SEO industry. While the Flight Center case does not mean that all Australian SEOs are 'Cowboys' or 'Spammers', as you can see in some comments, it surely demonstrates that the standards of some players in the industry have yet to improve, and that some SEOs in Australia have to lift their game to compete on a global scale. In that sense, initiatives such as SMX Sydney have a key role to play in this process and should be encouraged.

What is the outcome for Google?

On top of a real need for clearer guidelines, the Flight Center case also reveals the limitations of search engines in indexing content that is not text-based, and therefore in keeping up with the latest web development technologies.
As demonstrated at SMX Sydney in the SEO 3.0 session, where Chris Dimmock discussed the Myhome.com.au website, web 2.0 technologies such as AJAX and JavaScript are becoming more popular and contribute greatly to improving the end-user experience. However, search engines are not able to follow the trend, or not quickly enough.

While search engines are evolving toward better indexing capabilities, there is a real and growing gap between current web development technology and the technology behind search engine indexing. This gap is stretching the challenges of search engine optimisation and obviously creating some 'collateral damage'. Should search engines stop focusing so heavily on relevancy and include user experience in the equation?

PS. After receiving feedback from some readers and proofreading the post, I realised that what I wrote did not really match what I actually meant. English is a tricky language and a few missing words can make a big difference... Please find the corrections in blue.


Tristan Boyd said...

It was fairly recently that something similar happened to BMW.de and also to one of the lastminute.com brands (UK).

It's a stretch to blame the education of Australian SEO marketers for this. This is just blatant black hat. They tried it, they got busted. It's as simple as that.

I'm disappointed that Google didn't ban them outright. I'm sure a smaller company would be out of the index for a very long time.

sesakebon said...

Thanks for your comment Tristan. Just one thing to add to your comment.
The post does not aim at blaming the Australian SEO industry! I thought I did the opposite, but I guess my English is not as good as I thought :-)

While the Flight Center case does not mean that Australian SEOs are 'Cowboys' or 'Spammers' as you can see in some comments

It is just surprising to see that cloaking has been used on a major website. This has a bad impact on the Australian industry overseas.

Regarding Google, it is quite surprising too that only a partial action has been taken, well you never know.

Justen said...

Hi Sesakebon,

I fail to see how placing a link on this part of your post is relevant?? "skills level in the Australian SEO industry" What Found Agency did with their previous site was nothing to do with a lack of skills, quite the contrary, I think. It seems to me you have something against them...

BTW, you might want to try Google for "seo agency" or "ppc agency" - I've checked their backlinks and looks legit to me, can you find some form of blackhat technique they've used?

sesakebon said...

Justen, thanks for your feedback and your help to improve this post (that's the beauty of a blog post over a newspaper article). You are right, the choice of anchor text for the link is not the most appropriate. The word 'debate' might be better (I have made the change).

The destination page and its content are quite relevant though. What happened last week actually brought back a debate (please have a look at the comments in the SEOmoz post) sparked last year by an article in The Age, as a result of a similar (but not identical) case that happened to the search agency you mentioned. In that sense, there is a parallel and I will keep the link.

I surely understand that the content in the linked post might not be welcomed by some. However, there is nothing personal, Justen. The post above by Tristan mentions BMW and Lastminute.com, but I am sure he has nothing against them (Do you?)... It is just that it happened to them, and therefore they are mentioned. It could have happened to X, Y or Z and they would have been mentioned instead.

To be honest, I am quite sorry that they were fully banned, when Flight Center only seems to have been partially banned.
Now, you mentioned they have corrected their backlink structure and gained some rankings. That is obviously a good thing, and good on them.

PS. This post will be user-generated content soon :-)

SEO said...

It is really intriguing that professional companies resort to cloaking to improve their search engine rankings when it is possible to get high rankings using the right methods.
For a website like Flight Center, the key would have been to improve user experience and interactivity instead of resorting to cloaking.

Obviously, Google might have felt that they contribute to the overall user experience, given the vast volume of bookings made through the website, and therefore chose not to ban it.

webco said...

The catalogue display technique being used by Flight Centre seems to be provided by Catalogue Central (www.cataloguecentral.com.au)... who use the exact same strategy (cloaking and all) to display catalogues for a wide variety of other Aussie companies - although Flight Centre do seem a bit more aggressive in adding extra text.

It looks to me like Flight Centre have "blindly" used this system, without understanding, or quite possibly even being aware of, the potential cloaking issues.

Interesting to see if/when Catalogue Central are penalised.... But presumably they are scrambling to put together a technical solution which is not so spammy.

sesakebon said...

Thanks for your comment, Webco, and you are quite right. Catalogue Central could be the one to 'blame', as they built and sold the solution to Flight Center and many other companies.
This could also explain why the Flight Center website has not been banned... only the Catalogue Central section of the website has been banned.
As you mentioned, it will be interesting to see Google's response regarding Catalogue Central, as it might affect more than one company.

icky said...

Serving different content to crawlers is frowned upon, yes, but I don't think serving the same content in a different presentation should be penalised (can't tell from the screenshot if that's the case).

I also think part of the responsibility lies with Google in not being able to index dynamic content correctly.

JT said...

I am building an International SEO Best Practices Guide (I am located in the US), and was wondering if you wouldn't mind sharing some of your top tips?

Is Google.Au still #1, and who is #2?
Are the basics - title, on-page copy, alt tags - still very influential in the Google.Au algorithm?
Any help would be greatly appreciated! Of course I will mention your site as well - if you don't mind, of course.