Design Concerns


You’re likely to encounter numerous problems with SEO when designing your web site. Some are easy to overcome; others can be quite difficult. And still others aren’t problems you have to overcome per se, but issues you need to be aware of; otherwise you risk being ignored by search engine crawlers.

Among the tactics that might seem okay to some, but actually aren’t, are the so-called black-hat SEO techniques. These are practices that are implemented with a single goal in mind: increasing search engine rankings, no matter how inappropriate they might be. Some companies deliberately use such techniques when creating web sites, even if the results that are returned have absolutely nothing to do with the search terms users entered.

Domain cloaking

On the surface, domain cloaking sounds like a great idea. The concept is to show users a pretty web site that meets their needs, while at the same time showing search engines a highly optimized page that probably would be almost useless to users. In other words, it’s a slimy trick to gain search engine ranking while providing users with a nice site to look at.
It starts with content cloaking, which is accomplished by creating web site code that can detect a search crawler and tell it apart from an ordinary visitor. When the crawler enters the site, it is redirected to another page or site that has been optimized for high search engine rankings. The problem with trying to gain higher search results this way is that many search engines can now spot it. As soon as they find that a web page uses such a cloaking method, the page is delisted from the search index and not included in the results.
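
To make the mechanics concrete, here is a minimal, hypothetical sketch of the kind of server-side cloaking just described, written as a Flask route. The bot names and URL are invented for illustration; the point is to recognize the pattern (and why a crawler that compares what it sees against what a normal browser sees can expose it), not to use it.

```python
# Hypothetical sketch of content cloaking, a black-hat technique shown here
# only so the pattern is recognizable; do not use it. A search engine can
# expose it simply by re-fetching the page with a normal browser User-Agent
# and comparing the two responses.
from flask import Flask, request, redirect

app = Flask(__name__)

# Illustrative list of crawler User-Agent fragments (not exhaustive).
CRAWLER_SIGNATURES = ("googlebot", "bingbot", "slurp")

@app.route("/products")
def products():
    user_agent = request.headers.get("User-Agent", "").lower()
    if any(bot in user_agent for bot in CRAWLER_SIGNATURES):
        # Crawlers get bounced to a keyword-stuffed page built only for ranking.
        return redirect("/seo-optimized-products")
    # Human visitors see the normal, designed page.
    return "<html><body>Regular product page for visitors</body></html>"
```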

Many unscrupulous SEO administrators will use this tactic on throw-away sites. They know the site won’t be around for long anyway (usually because of some illegal activity), so they use domain cloaking to garner as much web site traffic as possible before the site is taken down or delisted.

Duplicating content

When you’re putting together a web site, the content for that site often presents one of the greatest challenges, especially if the site includes hundreds of pages. Many people opt to purchase bits of content, or even scrape content from other web sites, to help populate their own. These shortcuts can cause real issues with search engines.

Suppose your web site is about some form of marketing. It’s very easy to surf around the Web and find hundreds (or even thousands) of web sites from which you can pull free, permission-granted content to include on your own. The problem is that every other person or company creating a web site could be doing the same thing. The result? A single article on a topic appears on hundreds of web sites, and users searching for that topic find nothing new, because every site carries the same article.

To help combat this type of content generation, some search engines now include in their algorithms methods for gauging how fresh and original a site’s content is (a simplified sketch of how such a check could work follows the list below). If a crawler examines your site and finds that much of your content also appears on hundreds of other web sites, you run the risk of either ranking low or being delisted from the search engine’s index. Some search engines now look for four types of duplicate content:

Highly distributed articles: These are the free articles that seem to appear on every single web site about a given topic. This content has usually been provided by a marketing-savvy entrepreneur as a way to gain attention for his or her project or passion. But no matter how valuable the information, if it appears on hundreds of sites, it will be deemed duplicate and reduce your chances of being listed high in the search result rankings.

Product descriptions for e-commerce stores: The product descriptions found on nearly all e-commerce pages are generally not included in search engine results. Product descriptions can be very short and, depending on how many products you’re offering, there could be thousands of them, so crawlers are designed to skip over most of them; otherwise, a crawler might never be able to work completely through your site.

Duplicate web pages: It’s pointless for a user to click through a search result only to find that your web pages contain exactly the same content as pages they’ve already seen elsewhere. These duplicate pages gum up the works and reduce how high your pages end up in the search results.

Content that has been scraped from numerous other sites: Content scraping is the practice of pulling content from other web sites and repackaging it so that it looks like your own content. Although scraped content may look different from the original, it is still duplicate content, and many search engines will leave you completely out of the search index, and hence the search results. 
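
Search engines don’t publish their duplicate-content filters, but the basic idea behind flagging scraped or widely syndicated text can be illustrated with a simple comparison of word shingles. The sketch below is a deliberately simplified assumption about how such a check might work; the example strings are invented, and real engines use far more sophisticated, undisclosed techniques.

```python
# Simplified illustration of near-duplicate detection using word shingles and
# Jaccard similarity. Real search engines use far more sophisticated and
# undisclosed techniques; this only conveys the basic idea.
def shingles(text: str, size: int = 5) -> set:
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard_similarity(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Invented example: a lightly reworded ("scraped") copy still overlaps heavily
# with the original, which is exactly what a duplicate filter looks for.
original = "Search engine optimization helps visitors find the content they need quickly"
scraped = "Search engine optimization helps visitors find the content they need very quickly"
print(jaccard_similarity(original, scraped))  # high overlap -> likely flagged as duplicate
```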

Hidden pages

One last SEO issue concerns the damage that hidden pages can inflict on your SEO strategy. These are pages in your web site that are visible only to a search crawler. Closely related are hidden keywords and hidden links: because keywords and links help to boost your search rankings, many people try to capitalize on this by hiding them within the body of a web page, sometimes in a font color that perfectly matches the site background. There’s no way to keep this kind of hiding from eventually being detected. If your web site contains hidden pages or hidden text, it’s just a matter of time before the crawler figures out that the content is part of a hidden SEO strategy; and once that’s determined by the crawler, your site ranking will drop drastically.
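
Why is hidden text so easy to catch? As a rough, purely illustrative sketch (not how any particular search engine actually works), the snippet below shows one naive check a crawler could run: flag elements whose inline text color matches the page’s background color. The example page and keywords are invented.

```python
# Naive illustration of hidden-text detection: flag elements whose inline text
# color matches the page background. Real crawlers render pages and apply many
# more checks (off-screen positioning, zero-height text, tiny fonts, and so on).
import re

def find_hidden_text(html: str, background_color: str) -> list:
    pattern = re.compile(
        r'<(\w+)[^>]*style="[^"]*color:\s*([^;"]+)[^"]*"[^>]*>(.*?)</\1>',
        re.IGNORECASE | re.DOTALL,
    )
    hidden = []
    for _tag, color, text in pattern.findall(html):
        if color.strip().lower() == background_color.strip().lower():
            hidden.append(text.strip())
    return hidden

# Invented example: white keywords hidden on a white background.
page = '<body style="background:#ffffff"><span style="color: #ffffff">cheap flights best deals</span></body>'
print(find_hidden_text(page, "#ffffff"))  # ['cheap flights best deals']
```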

404 error pages

One problem that visitors may encounter is broken links that don’t lead to their intended target. Instead, these links take the user to a worthless page that usually contains a message such as “error 404.” Not very descriptive, is it? When users encounter an error they don’t understand, it simply adds to the frustration of being blocked from going where they want to go.

Error pages happen: links break, pages become outdated, and people type incorrect URLs into their browsers all the time, especially when they arrive from a search engine results page. It’s what you do about these issues that will determine whether your user heads off to another site or surfs back to your site in an effort to locate what they were looking for.

No one wants their site visitors to encounter an error page, but sometimes you can’t help it. For those times when it does happen, you want the page to be as useful as possible to your visitors. Adding elements that tell the visitor more about what happened and what their options are is the best way to accomplish that. That “error 404 page unavailable” message doesn’t give them anything to work from.

Instead, you can use your error page to provide a list of links that are similar to what the user was trying to reach in the first place. Or you can offer a search box that enables users to search for information similar to what was on the page that they couldn’t reach. The worst thing you can do is nothing.
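
If your site runs on a framework that lets you register error handlers, a friendlier 404 page takes only a few lines. The sketch below assumes a Flask application; the suggested links and the /search endpoint are placeholders you would swap for your own site’s structure.

```python
# Minimal sketch of a friendlier 404 page in Flask. The links and the /search
# endpoint are placeholders; adapt them to your own site's structure.
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    body = """
    <h1>Sorry, that page isn't here.</h1>
    <p>It may have moved, or the address may have been mistyped. Try one of these instead:</p>
    <ul>
      <li><a href="/">Home page</a></li>
      <li><a href="/articles">All articles</a></li>
      <li><a href="/contact">Contact us</a></li>
    </ul>
    <form action="/search" method="get">
      <input type="text" name="q" placeholder="Search this site">
      <input type="submit" value="Search">
    </form>
    """
    # Returning the real 404 status code keeps search engines from indexing
    # the error page itself, while visitors still get something useful.
    return body, 404
```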

Give your visitors options that are designed to keep them on your web site. And one more thing: don’t overwhelm them. An error page should look like an error page, even if it’s more useful than the simple “this page doesn’t exist” message, and it should stay simple. In particular, don’t make your error pages look just like the rest of the site. You might assume that providing users with your site’s usual look and feel and navigational structure is the best way to ensure that they don’t leave, but that’s not the case. When you don’t distinguish an error page from the rest of your site, two things can happen.

First, users get confused. The error page looks like the rest of your site, so it’s not immediately recognized as an error. Second, the navigational or link structure that you include on the page might not work properly, which translates to even more frustration for the visitor.

When designing error pages, your best option is to keep them simple and make them useful. Do that and you will likely ensure that your visitors stay on your site. At the same time, you’re providing useful information that can be loaded with keywords and other SEO elements. The result is the best of both worlds: satisfying search engines while impressing your visitors with the usefulness of your site.
