SEO Cannot Get the Best Ranking for Your Flash Website

Web designers often use technologies like Flash, JavaScript, Ajax, and Silverlight to make their sites attractive, fast, and easy to use.
While there are great reasons to use these technologies, they create problems when it comes to search engine optimization (SEO) – problems that show up clearly in a website visibility report. Web page content that is wrapped in a fancy package can be difficult – or impossible – for a search engine to “see.” That means the search engine crawlers may have a hard time understanding what your page is about.

The crawlers may not index all your important pages, leaving your website buried pages behind ‘page 1’ in search results. The search engines may also find it difficult to follow any links – internal or external – you have placed in web page content rendered in Flash, Silverlight, or other such technologies. That matters because search engines use your internal links to discover other pages on your site, to understand how the pages on your site relate to each other, and to determine which pages on your website are more important than others.

Flash website designers sometimes incorporate the search function for a Flash website into their designs. That can be helpful for people, but it may pose problems for search engines trying to crawl your Flash website. If pages on your Flash website are accessible only from a search box, the search engines will not be able to see those pages, because search engines do not type keywords into search boxes to find relevant web pages. Below, we consider some of the popular technologies for creating attractive, people-friendly web pages, describe the potential issues, and offer options for avoiding costly problems.

JavaScript Menus

Flash website designers often use JavaScript to make navigation menus with special mouse-over effects, animated drop-downs and other interactive features. While these design innovations can be truly useful for human beings, they can also be a real problem for search engine crawlers like Googlebot.

Today, Google’s crawler – fondly known as Googlebot – can actually follow many links created in JavaScript. But it cannot follow all of them. And while Google is the dominant search engine, with about 70 percent of people using it, that leaves 30 percent of your potential customers using a search engine other than Google. Those people are even less likely to see your JavaScript links. If your business depends on people coming to your site from search engines, saying that the bots can probably follow your JavaScript links is a bit like your boss saying your paycheck probably will not bounce.
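
To see why, here is a minimal sketch (the URL and markup are illustrative) contrasting a JavaScript-only link with a plain HTML link:

    <!-- A crawler may never find this page: the destination URL exists
         only inside a script, not in an href attribute a bot can read. -->
    <span onclick="window.location='/products/';">Products</span>

    <!-- A plain HTML link, visible to and followable by every crawler. -->
    <a href="/products/">Products</a>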

A CSS menu can do pretty much everything a JavaScript menu can do, and without any of the issues that cause problems for search engine crawlers. Do not forget that the mobile phones, tablets and other small computers that are increasingly popular for surfing the Web can also have problems displaying JavaScript, but do fine with CSS.
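
Here is a minimal sketch of a pure-CSS drop-down menu (the class names and URLs are illustrative); every destination stays a plain link that crawlers can follow:

    <ul class="nav">
      <li><a href="/">Home</a></li>
      <li>
        <a href="/services/">Services</a>
        <ul class="submenu">
          <li><a href="/services/seo/">SEO</a></li>
          <li><a href="/services/design/">Design</a></li>
        </ul>
      </li>
    </ul>

    <style>
      .nav li                { position: relative; display: inline-block; list-style: none; }
      .nav .submenu          { display: none; position: absolute; left: 0; }
      /* Reveal the submenu on hover -- no JavaScript required. */
      .nav li:hover .submenu { display: block; }
    </style>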

JavaScript Click-Tracking Links

People who are serious about tracking the business performance of their website use some form of analytics, such as Google Analytics. Seeing how visitors get to your website, and where they go after they land on it, helps you understand how to turn more visitors into customers. Sometimes web designers route every click through a single page with pre-set parameters in order to track clicks. That page captures the information about which link was clicked, and then redirects the web browser to the final page that will be shown to the person who is surfing.

This is very similar to the JavaScript click-tracking function, and sadly, it has similar effects when it comes to search engines. Even if the web designers do the redirect with a 301 redirect, some of the goodness of each link is lost. I do not recommend this approach. The solution? Use a free click-tracking service like Google Analytics instead of click-tracking JavaScript links. Yes, Google Analytics uses JavaScript, but NOT in the links themselves.
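
As a sketch – assuming the standard analytics.js version of Google Analytics is already loaded on the page, and with an illustrative element ID and event names – the tracking can hang off the link without ever touching the href:

    <!-- The link itself stays plain HTML, so crawlers can follow it. -->
    <a id="brochure-link" href="/downloads/brochure.pdf">Download our brochure</a>

    <script>
      // The tracking lives in a separate listener, not in the link markup.
      document.getElementById('brochure-link').addEventListener('click', function () {
        // Standard analytics.js event call: category, action, label.
        ga('send', 'event', 'Downloads', 'click', 'brochure.pdf');
      });
    </script>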

Flash

Flash is an incredible technology that enables a richer user experience. It is often used for video, slideshows and interactive features on a website. However, search engines generally cannot “see” content that is rendered in Flash.

Many websites have everything in Flash. That can look great to human visitors, but to search engines, it looks like the website consists of a single web page – and one with very little content, at that. If the search engines think your entire site consists of a single page, they’ll conclude your site doesn’t have much useful content, and they won’t rank your site high in search results, leaving your website buried deep in the listings.

Google has improved its crawler’s ability to “see” what is in a Flash object, especially if the web designer has followed some fairly straightforward rules. Still, it is not certain that all text rendered in Flash will be accessible to Googlebot. Keep in mind that a good percentage of searchers do not use Google.

Do you really want to fence out a third of your potential customers? At the end of the day, use Flash for decorative elements. Render your links and navigation menus in HTML, so search engine bots can see them.
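
As a sketch (the file name, text and URLs are illustrative), a decorative Flash element can carry HTML alternative content, while the navigation stays in plain HTML:

    <!-- Decorative Flash banner; the HTML inside the <object> tag is
         alternative content that crawlers (and plugin-less visitors) see. -->
    <object type="application/x-shockwave-flash" data="banner.swf"
            width="600" height="120">
      <h1>Acme Widgets: Hand-Made Widgets Since 1999</h1>
    </object>

    <!-- Navigation stays in plain HTML, outside the Flash object. -->
    <ul>
      <li><a href="/products/">Products</a></li>
      <li><a href="/about/">About Us</a></li>
      <li><a href="/contact/">Contact</a></li>
    </ul>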

Silverlight

This technology, created by Microsoft Corp., enables rich media experiences similar to what you can do with Flash. Googlebot has problems seeing the text and links in Silverlight. Just as with Flash, you’re best advised to use Silverlight for decorative purposes, and use HTML to render links and navigation menus.

Low-Quality Solutions

Some web designers apply a Band-Aid solution to the problems caused by rendering navigation menus in JavaScript, Flash, Silverlight or Ajax. They’ll create an HTML sitemap with links to all the pages, and sometimes submit an XML sitemap to the search engines. These sitemaps will, in fact, allow search engines to see all the pages on your site. However, the search engines still won’t be able to see how many pages on your site link to any given page. That is important information – as the number of internal links to a page tells search engines how important that page is.

If your main navigation menu is in HTML or CSS, and all your major pages have the same navigation menu, then all your important pages will be linked from many pages on your site. Minor pages on your site will have just one or two links from specific pages. The variation in the number of links to each page tells search engines very clearly which are the most important pages on your site. If, on the other hand, your navigation menu is entirely in Flash or JavaScript, and you’ve got a sitemap as a Band-Aid solution, the only internal link to each major page that search engines can see will be from the sitemap.

That gives each page on your site just one internal link, making it appear to a search engine bot that every page is as important as every other. That is not accurate, and it means your most important pages will not show up as high in search results as they should – a problem a website visibility report will reveal. Google Webmaster Tools can tell you how many pages on your site link to any other page. Log in to Google Webmaster Tools, click on “Your Site On The Web”, then click “Internal Links.”

Pages Are Accessible Only by Online Forms

Some sites have pages that can be reached only by filling out a form. For instance, one of the largest automobile insurance companies in the world used to have a simple form on its home page that asked for your postal code. You’d fill that out, click on Submit, and be directed to the portion of the insurer’s site that dealt with your region.

It sounds logical, but search engine crawlers don’t type in postal codes, and they don’t click on Submit. To the search engines, this insurer’s site looked like just a single page – and a pretty boring one, at that.

Search forms pose a similar problem. While a search form is a tremendously useful way for a human to find information on your site, it is not a navigation method the crawlers can use. Crawlers don’t type words into a search box, and they don’t click on a Search button. The solution? Keep the search form – it is great for your human visitors. Then add an HTML sitemap, and submit to the search engines an XML sitemap that lists every page you want indexed.
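
A minimal XML sitemap, following the standard sitemaps.org format, looks like this (the URLs are placeholders for your own pages):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
      </url>
      <url>
        <loc>http://www.example.com/products/</loc>
      </url>
      <url>
        <loc>http://www.example.com/contact/</loc>
      </url>
    </urlset>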

How to Check If You Have a Crawl Problem

With a States Technology Labs FREE website visibility report, you can quickly check how many pages of your site the major search engines have indexed. If the number is lower than you think it should be, contact us for a free initial report. We are happy to discuss the best steps to bring better ranking and qualified traffic to your website.
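
As a quick manual check, you can also use Google’s standard site: operator, which lists the pages Google has indexed for a given domain (substitute your own domain):

    site:www.example.com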

Derek States