OpenLightGroup Blog

Blogs from OpenLightGroup.net


Silverlight, SEO & ASP.Net MVC – Part 2 (Solidifying a Strategy)

In the previous article we outlined a possible solution for getting links to your Silverlight Navigation Application indexed by the search engines. However, the implementation of the process left much to be desired. This article will address how to build a more robust solution to the problem and deal with some rules imposed by the search engines, so that you are not penalized by the search providers.

The overall process that we outlined looked something like this:

[Figure: RequestProcess – diagram of the request-handling flow outlined below]

And the steps went something like this (a minimal sketch of the browser-side logic follows the list):

  1. A request is made to a url on your site, for example: /default.aspx?page=about
  2. JavaScript is run to determine whether the requesting “user-agent” or browser supports Silverlight
  3. If yes, then find the corresponding Silverlight Deep Link to the requested page, for example: /default.htm#/About
    • Redirect the requestor to the corresponding link.
  4. If no, then continue to process the request as normal and display an HTML version of the corresponding Silverlight link
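
To make steps #2 and #3 concrete, here is a minimal JavaScript sketch of the detection-and-redirect logic. The function names are illustrative placeholders, not necessarily what the project uses; the detection technique (the AgControl ActiveX object in Internet Explorer, the navigator plugin list elsewhere) is the standard way to test for the Silverlight Plug-In.

    function isSilverlightInstalled() {
        try {
            // Internet Explorer exposes Silverlight as an ActiveX control.
            if (window.ActiveXObject) {
                return new ActiveXObject("AgControl.AgControl") != null;
            }
            // Other browsers list it in navigator.plugins.
            return navigator.plugins != null &&
                   navigator.plugins["Silverlight Plug-In"] != null;
        } catch (e) {
            // The ActiveX constructor throws when the plug-in is missing.
            return false;
        }
    }

    function redirectToSilverlight(deepLink) {
        // deepLink would be something like "/default.htm#/About".
        if (isSilverlightInstalled()) {
            window.location.replace(deepLink);
        }
        // Otherwise fall through and let the HTML version render.
    }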

While the solution given in the previous article worked as expected, there were some significant drawbacks. The most obvious was that the information used to map one page to another was hard-coded into the JavaScript file. This would become a huge problem once the number of pages grew past just a few. The less obvious issue, at least to me, was the fact that the process outlined might actually get your site removed from Google’s search index! While doing some research on how the search bots handle “rich media” files, I found an article that referred to this process as “cloaking,” something which Google views as a deceptive practice and punishes site owners for by removing their sites from Google’s index completely. I can understand this to some extent; however, given the particular scenario we are dealing with, I think it is an unjustified punishment. Many people, including myself, have voiced concerns about this policy to Google, but I do not see it changing anytime soon.

I am currently discussing this with folks on the Google Webmaster forums to see if there is a way to handle this scenario without being punished. So far, it looks like the process may have to change to include prompting the user before redirecting to the Silverlight link. If so, steps #3 and #4 listed above would change to:

3. If yes, then find the corresponding Silverlight Deep Link to the requested page.
4. Ask the user if they would like to view the Silverlight version of the page. If yes, redirect to the corresponding link. If not, continue processing the HTML version of the page.
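
A sketch of what that prompt might look like, reusing the hypothetical detection function from the earlier sketch:

    function offerSilverlightVersion(deepLink) {
        if (isSilverlightInstalled() &&
            confirm("A Silverlight version of this page is available. View it now?")) {
            window.location.replace(deepLink);
        }
        // If the user declines (or has no plug-in), the HTML version continues to render.
    }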

I personally feel that this is an unnecessary step, and the additional click would quickly become annoying to users visiting your site. So, we will not be modifying the process until I have a response from someone at Google saying that the user must be prompted to avoid being classified as a cloaking site. I plan on writing the next article in this series to address the cloaking issue.

While writing this article I have decided to create a CodePlex Project for this series. The project will include the source code I developed while writing these articles and provide some code that you can drop into your sites to simplify the implementation of the process once it is completed. The final article will summarize the information in the series and demonstrate how to utilize the code in your projects. 

The remainder of this article will outline the objectives of this project and address the improvements made to the implementation of the code shown in Part 1.

Project Objectives:

  1. Provide a way to associate Silverlight Deep Links to a corresponding HTML version of a page and vice versa.
  2. Allow link mappings to be data driven.
  3. Provide a master page that will handle the redirection to the associated Silverlight link.
    • The master page will also display the “Get Microsoft Silverlight” image link by default. This lets users know there is a Silverlight version of the site and shows them where to download the Silverlight Plug-In.
  4. Provide a Silverlight hosting page that will handle displaying a link to the HTML version of the page to users.
    • Search engines will only be able to see a link to the default html page.
  5. Prevent search engines from indexing the page(s) that host the Silverlight content.
  6. Provide a swappable search engine sitemap handler that can be used to submit the site to search engines.
  7. Demonstrate a basic example of how to share data driven content between Silverlight and HTML.
  8. Provide Visual Studio templates for ASP.Net and ASP.Net MVC web sites.

Refactorings:

Pull the links out of JavaScript

It didn’t take long before I just had to pull the hard-coded data out of the JavaScript files! For now, I have moved the “Page Mappings” into an XML document located in the App_Data folder of the site. I then created a web service that allows JavaScript to retrieve the corresponding url by passing in the current url. In the future, I plan to allow the service to be easily swapped out. This will allow the mappings to be stored in many different data stores, independent of how the JavaScript needs to access them. Since the mappings are now accessed via a web service, I have included the web service proxy code in the JavaScript file. As I work toward a swappable page mappings service, the goal is to keep the interface the same so that the JavaScript proxy does not need to change.
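
As an illustration, the mapping document might look something like the following. The element and attribute names here are my own placeholders; the actual schema in the project may differ.

    <!-- App_Data/PageMappings.xml (illustrative structure) -->
    <pageMappings>
      <mapping htmlUrl="/default.aspx?page=home"  silverlightUrl="/default.htm#/Home" />
      <mapping htmlUrl="/default.aspx?page=about" silverlightUrl="/default.htm#/About" />
    </pageMappings>

The web service then only has to look up the entry whose htmlUrl (or silverlightUrl) matches the url passed in and return its counterpart, so the JavaScript proxy never needs more than a single “get the corresponding url” call.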

Cleaned up master page

I simply added HTML comments around the call to the RedirectToSilverlight JavaScript function, so that clients that do not understand script are never exposed to the raw script text. I also added the “Get Microsoft Silverlight” image and link to the bottom of the master page.
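
For illustration, the relevant fragment of the master page looks roughly like this; the script file name and image path are assumptions rather than the project’s exact markup:

    <script type="text/javascript" src="/Scripts/SilverlightRedirect.js"></script>
    <script type="text/javascript">
    <!--
        // The HTML comment hides the script text from clients that do not understand script.
        RedirectToSilverlight();
    //-->
    </script>
    ...
    <a href="http://www.microsoft.com/silverlight/">
        <img src="/Images/GetSilverlight.png" alt="Get Microsoft Silverlight" />
    </a>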

Cleaned up Silverlight hosting page

The hosting page now displays a link to the HTML version of the page at the very bottom. I am not really thrilled with this, but it is an attempt to “play nice” with the crawler and avoid being considered a cloaking site. In theory, the crawler should never hit this page, but better safe than sorry. An additional and, in my opinion, more important reason for this link is accessibility. Users who rely on accessibility software are now provided with a link to a version of the page that will work with screen readers.
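
For completeness, a sketch of that fallback link, using the about page from the earlier examples (in practice the href would be resolved through the same page-mapping service described above):

    <!-- At the very bottom of the Silverlight hosting page -->
    <div>
        <a href="/default.aspx?page=about">View the HTML version of this page</a>
    </div>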

Added Robots.txt

I added a robots.txt file to the site to prevent the crawlers from indexing the Silverlight hosting page.
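
Assuming the hosting page from the earlier examples is /default.htm, the robots.txt entry is as simple as:

    # Keep crawlers away from the Silverlight hosting page
    User-agent: *
    Disallow: /default.htm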

Summary

Since the project has been added to CodePlex, you can now view the entire source code as it was at the time of writing here. To view or download the most current source code, you can always use the Source Code page of the CodePlex site. I will continue to work on cleaning up the code and adding some additional features over the coming days and weeks. Eventually, I hope to provide an installer for two Visual Studio project templates that will help jump-start your search engine optimized Silverlight Navigation application. I apologize for the lack of complete, working code examples in the article this week. Again, please feel free to check out the CodePlex site if you are interested in viewing the source for this project. Please keep in mind that this project is not yet complete and may not build and/or work on your system yet. The code that is posted as of today (8/4/2009) is merely available for reference.

I hope you have enjoyed this article, and as always your feedback is much appreciated!

References:

  • Best uses of Flash – Interesting article even though it’s about Flash it shows Google’s overall attitude toward RIAs.
  • Images, Flash and Scripts – Bing’s attitude isn’t much different as of right now…
  • SEO for Silverlight Applications – Apparently, at the time this article was written Microsoft was recommending the same as Google. This was before the Silverlight Navigation application and Deep Linking. Hopefully, they will update this article soon. It would be interesting to see what they consider to be best practices now.
  • Rich Media Best Practices – Google’s official best practices for rich media.




Comments are closed.
Showing 8 Comments
Hire Dedicated MySQL Programmer 7 years ago

Thanks for the information; the information you provided is so useful and very informative. Thanks for posting this.

Dedicated ASP .Net Developer 7 years ago

I was stuck on the same point, but you have explained it in such an easy way that I got it. Thanks a lot, it helped me a lot. Keep it up.

Mayweather 7 years ago

I hope you provide an installer for the two Visual Studio project templates that will help jump-start our search engine optimized Silverlight Navigation application.

sikat ang pinoy 8 years ago

Creating Silverlight on ASP.Net MVC is now easy. Thanks for the tutorials; I like how you have presented the information in full detail. Keep up the great work.

Ian T. Lackey 8 years ago

Thanks, I appreciate the feedback.

Also, I wanted to comment and let everyone know this project is still in the mix. Michael and I (along with a couple of others) are working on writing a new DotNetNuke book right now, and it has been taking the majority of my time lately. However, I do hope to continue this series very soon. So stay tuned, there will be more...

Mistry 8 years ago

Nice post!

Ian T. Lackey 8 years ago

Good question Vitor,

The main difference is that with this method, the url mappings between ASP.Net and the Silverlight Deep Links allow you to keep your current ASP.Net site online while moving the site to the Silverlight navigation platform. Additionally, the tools used for this method are all currently released and fully supported. If you are creating a new site and do not mind using the RIA previews, it will probably be easier to use the RIA SEO strategy as outlined by Brad Abrams.

Vitor 8 years ago

Hey Ian,

Congratulations on the great articles! I'd like to hear what you think about the difference between your solution and the RIA Services solution for SEO. Why did you decide to start a new solution instead of using it? I'm evaluating RIA Services right now and would like to compare them.