Nov 15, 2014

Keywords and Your Web Site

Keywords. That’s a term you hear associated with search engine optimization all the time. In fact, it’s rare to hear anything about SEO in which keywords aren’t involved in some way. So what’s so special about keywords?

Simply put, keywords are those words used to catalog, index, and find your web site; but of course, it’s not nearly as simple as it sounds. There is a fine science to finding and using the right keywords on your web site to improve your site’s ranking. In fact, an entire industry has been built around keywords and their usage. Consultants spend countless hours finding and applying the right keywords for their customers, and those who design web sites with SEO in mind also agonize over choosing just the right ones.

Using popular — and effective — keywords on your web site will help to ensure that it is visible in the search engine results, rather than be buried under thousands of other web site results. There are keyword research tools that can help you find the exact keywords to use for your site and therefore for your search engine optimization. Understanding the use of keywords — where to find them, which ones to use, and the best ways to use them — enables you to have a highly visible and successful web site.


The Importance of Keywords

On the most basic level, keywords capture the essence of your web site. Keywords are the words or phrases a potential visitor enters into a search engine to find web sites related to a specific subject, and the keywords you choose will be used throughout your optimization process. As a small-business owner, you want your web site to be readily visible when those search engine results come back. Using the correct keywords in your web site content can mean the difference between being listed among the first 20 results (which is optimum) and being buried several pages into the results, with hundreds of results returned before your site. Studies show that searchers rarely venture past the second page of search results when looking for something online.

Consider for a moment the telephone book’s Yellow Pages. Suppose you’re looking for a restaurant. The first thing you’ll do is find the heading ‘‘restaurants,’’ which would be your keyword. Unfortunately, that’s pretty broad, and even in a smaller city there might be a page or more of restaurants to look through. In a large city, there might be hundreds of pages.

If you are so inclined, you can narrow your search to Chinese restaurants, which will greatly reduce the number of entries you have to search through. Basically, that’s how keywords work in search engines and search engine optimization. It’s also a good example of how people search: They begin with the broadest concept and then gradually narrow their search criteria, based on what they learn in each step of the process. Recall from the discussion of Long Tail search in a previous post that what leads visitors to your site can be either the broad search term or the narrower (and very specific) term. Based on that concept, choosing the appropriate keywords for your web site will improve your search engine rankings and lead more search engine users to your site.

How do you know which keywords to use? Where do you find them? How do you use them? Knowing the answers to these questions will save you a great deal of time when creating a web site. Where you rank in search engine results is determined by which keywords you use and how they are positioned on your web site. It’s critical to choose appropriate keywords, include variations of those keywords, avoid common (or stop) words, and know where and how many times to place them throughout your web site.

Used correctly, keywords should enable you to be placed in the first page or two of the most popular search engines, and in some cases even as the number one result. This tremendously increases the traffic to your web site. Keep in mind that the majority of Internet users find new web sites by using a search engine. High search engine rankings can be as effective as, if not more effective than, paid ads in marketing your business.

The business you receive from search engine rankings will also be more targeted to your services than it would be with a blanket ad. By using the right combination of keywords, your customer base will consist of people who set out to find exactly what your site has to offer, and those customers will be more likely to visit you repeatedly. To decide which keywords to use on your web site, start by asking yourself the simplest, but most relevant, question: ‘‘Who needs the services that I offer?’’ It’s an elementary question, but one that is most important in finding the correct keywords and achieving the best search engine optimization.

For example, if you were marketing specialty soaps, you would want to use words such as soap (which really is too broad a term), specialty soap, bath products, luxury bath products, or other such words that come to mind when you think of your product. It’s also important to remember to use words that real people use when talking about your products.

For example, using the term ‘‘cleaning supplies’’ as a keyword wouldn’t result in a good ranking, because people thinking of personal cleanliness don’t search for ‘‘cleaning supplies.’’ They search for ‘‘soap’’ or something even more specific, like ‘‘chamomile soap.’’ Your customers are usually your best source of information about the search terms they use to find your products or web site.

One way to figure out what terms people use is to ask them. Most customers don’t mind answering a brief questionnaire, especially when some incentive is involved. Many companies have luck eliciting feedback by offering entry into a drawing for a prize, or even a discount coupon, to people who participate in surveys. Some people will offer opinions just because you ask. If you have a blog or forum on your web site, that’s a good place to pose a question about the terms people use when they think of your site or products.

In addition to the terms that you think of and that your customers tell you they use, people also look for web sites using variations of words and phrases — including misspellings. An example might be ‘‘chamomile.’’ Some people may incorrectly spell it ‘‘chammomile’’ or ‘‘camomile,’’ so including those spellings in your keywords can increase your chance of reaching those searchers. In addition, remember to use capitalized and plural keywords. The more specific the words and the greater the number of variations, the better the chances that your web site is targeted.
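As a rough sketch of this variation-building process, you could expand a base keyword list programmatically. The misspelling map below is a small hypothetical example, not an exhaustive list:

```javascript
// Sketch: expand a base keyword list with the variations discussed above.
// The misspelling map is a hypothetical example, not an exhaustive list.
const misspellings = {
  chamomile: ["chammomile", "camomile"],
};

function expandKeywords(keywords) {
  const variants = new Set();
  for (const word of keywords) {
    variants.add(word);
    variants.add(word + "s"); // naive plural; real plurals need more care
    variants.add(word.charAt(0).toUpperCase() + word.slice(1)); // capitalized
    for (const alt of misspellings[word] || []) {
      variants.add(alt);
    }
  }
  return [...variants];
}

// ["chamomile", "chamomiles", "Chamomile", "chammomile", "camomile"]
console.log(expandKeywords(["chamomile"]));
```

Each variant can then be checked against a keyword research tool to see whether it draws enough searches to be worth targeting.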

Be careful to avoid words such as ‘‘a,’’ ‘‘an,’’ ‘‘the,’’ ‘‘and,’’ ‘‘or,’’ and ‘‘but.’’ These are called stop words, and they’re so common they are of no use as keywords.
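A quick sketch of filtering stop words out of a candidate phrase; the stop-word list here is just the handful named above, not a complete list:

```javascript
// Sketch: drop common stop words from a phrase before treating it as a
// candidate keyword. The list here is only the handful mentioned above.
const STOP_WORDS = new Set(["a", "an", "the", "and", "or", "but"]);

function stripStopWords(phrase) {
  return phrase
    .toLowerCase()
    .split(/\s+/)
    .filter((word) => !STOP_WORDS.has(word))
    .join(" ");
}

// "best chamomile soap lotion"
console.log(stripStopWords("The best chamomile soap and lotion"));
```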


Thanks for reading.

Oct 9, 2014

Design Concerns from an SEO Perspective

You’re likely to encounter numerous problems with SEO when designing your web site. Some are easy to overcome, others can be quite difficult. And still others aren’t problems you have to overcome per se, but issues you need to be aware of or you risk being ignored by search engine crawlers.

Among the tactics that might seem okay to some, but actually aren’t, are the so-called black-hat SEO techniques. These are practices that are implemented with a single goal in mind: increasing search engine rankings, no matter how inappropriate they might be. Some companies deliberately use such techniques when creating web sites, even if the results that are returned have absolutely nothing to do with the search terms users entered.

Domain cloaking

On the surface, domain cloaking sounds like a great idea. The concept is to show users a pretty web site that meets their needs, while at the same time showing search engines a highly optimized page that probably would be almost useless to users. In other words, it’s a slimy trick to gain search engine ranking while providing users with a nice site to look at.
It starts with content cloaking, which is accomplished by creating web site code that can detect and differentiate a crawler from a site user. When the crawler enters the site, it is redirected to another web site that has been optimized for high search engine results. The problem with trying to gain higher search results this way is that many search engines can now spot it. As soon as they find that a web page uses such a cloaking method, the page is delisted from the search index and not included in the results.

Many unscrupulous SEO administrators will use this tactic on throw-away sites. They know
the site won’t be around for long anyway (usually because of some illegal activity), so they use domain cloaking to garner as much web site traffic as possible before the site is taken down or delisted.

Duplicating content

When you’re putting together a web site, the content for that site often presents one of the greatest challenges, especially if it is a site that includes hundreds of pages. Many people opt to purchase bits of content, or even scrape content from other web sites, to help populate their own. These shortcuts can cause real issues with search engines.

Suppose your web site is about some form of marketing. It’s very easy to surf around the Web and find hundreds (or even thousands) of web sites from which you can pull free, permission-granted content to include on your own. The problem is that every other person or company creating a web site could be doing the same thing. The result? A single article appears on hundreds of web sites, and users searching for that topic find nothing new, because every site has the same article.

To help combat this type of content generation, some search engines now include as part of their search algorithm a method to measure the freshness of site content. If a crawler examines your site and finds that much of your content is also on hundreds of other web sites, you run the risk of either ranking low or being delisted from the search engine’s indexing database. Some search engines now look for four types of duplicate content:

Highly distributed articles: These are the free articles that seem to appear on every
single web site about a given topic. This content has usually been provided by a marketing-savvy entrepreneur as a way to gain attention for his or her project or passion. But no matter how valuable the information, if it appears on hundreds of sites, it will be deemed duplicate and reduce your chances of being listed high in the search result rankings.

Product descriptions for e-commerce stores: The product descriptions on most e-commerce pages are rarely included in search engine results. Product descriptions can be very small, and depending on how many products you’re offering, there could be thousands of them, so crawlers are designed to skip over most of them. Otherwise, a crawler might never work completely through your site.

Duplicate web pages: It’s pointless for a user to click through a search result only to find that your web pages have been shared with everyone else. These duplicate pages gum up the works and reduce the level at which your pages end up in the search results.

Content that has been scraped from numerous other sites: Content scraping is the practice of pulling content from other web sites and repackaging it so that it looks like your own content. Although scraped content may look different from the original, it is still duplicate content, and many search engines will leave you completely out of the search index, and hence the search results. 
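Search engines don’t publish how they detect duplicates, but one classic textbook technique is word ‘‘shingling’’ with a Jaccard overlap score. A minimal sketch of the idea, not of any engine’s actual algorithm:

```javascript
// Sketch of one classic duplicate-detection idea: break each document into
// overlapping word "shingles" and compare the sets with Jaccard similarity.
// Real search engines use far more sophisticated (and unpublished) methods.
function shingles(text, size = 3) {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const set = new Set();
  for (let i = 0; i + size <= words.length; i++) {
    set.add(words.slice(i, i + size).join(" "));
  }
  return set;
}

function jaccard(a, b) {
  let shared = 0;
  for (const s of a) if (b.has(s)) shared++;
  const union = a.size + b.size - shared;
  return union === 0 ? 0 : shared / union;
}

// Scores near 1.0 suggest one page duplicates the other.
const score = jaccard(
  shingles("luxury bath products made from natural chamomile soap"),
  shingles("luxury bath products made from natural olive oil soap")
);
```

Even this toy version shows why lightly reworded scraped content still scores as a near-duplicate: most shingles survive the rewording.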

Hidden pages

One last issue concerns the damage that hidden pages can inflict on your SEO strategy. These are pages in your web site that are visible only to a search crawler. Hidden pages often go hand in hand with hidden keywords and hidden links. Because keywords and links help to boost your search rankings, many people try to capitalize on this by hiding them within the body of a web page, sometimes in a font color that perfectly matches the site background. There’s no way to permanently escape detection of hidden pages. If your web site contains hidden pages, it’s just a matter of time before the crawler figures out that the content is part of a hidden SEO strategy; and once the crawler determines that, your site ranking will drop drastically.

404 error pages

One problem that visitors may encounter is broken links that don’t lead to their intended target. Instead, these links take the user to a worthless page that usually contains a message such as ‘‘error 404.’’ Not very descriptive, is it? When users encounter an error they don’t understand, it simply adds to the frustration of being blocked from going where they want to go.

Error pages happen: links break, pages become outdated, and — especially if you’re linking to a search engine results page — people type incorrect URLs into their browsers all the time. It’s what you do about these issues that will determine whether your user heads off to another site or surfs back to your site in an effort to locate what they were looking for.

No one wants their site visitors to encounter an error page, but sometimes you can’t help it. For those times when it does happen, you want the page to be as useful as possible to your visitors. Adding elements that tell the visitor more about what happened and what their options are is the best way to accomplish that. That ‘‘error 404 page unavailable’’ message doesn’t give them anything to work from.

Instead, you can use your error page to provide a list of links that are similar to what the user was trying to reach in the first place. Or you can offer a search box that enables users to search for information similar to what was on the page that they couldn’t reach. The worst thing you can do is nothing.
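As an illustration, a friendlier 404 page along those lines might look like the following sketch (all links and paths here are hypothetical):

```html
<html>
<head><title>Page Not Found</title></head>
<body>
  <h1>Sorry, we couldn't find that page.</h1>
  <p>The page may have moved, or the address may have been mistyped.</p>
  <!-- Offer useful next steps instead of a bare error code -->
  <ul>
    <li><a href="/">Return to the home page</a></li>
    <li><a href="/products.html">Browse our products</a></li>
  </ul>
  <form action="/search" method="get">
    <input type="text" name="q">
    <input type="submit" value="Search our site">
  </form>
</body>
</html>
```

On an Apache server, a directive such as `ErrorDocument 404 /404.html` serves a page like this in place of the default error message; other web servers have equivalent settings.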

Give your visitors options designed to keep them on your web site. And one more thing: don’t overwhelm them. An error page should look like an error page, even if it’s more useful than the simple ‘‘this page doesn’t exist’’ message, and it should stay simple. In particular, don’t make your error pages look like the rest of the site. You might assume that providing users with your site’s usual look and feel and navigational structure is the best way to ensure that they don’t leave, but that’s not the case. When you don’t distinguish an error page from the rest of your site, two things can happen.

First, users get confused. The error page looks like the rest of your site, so it’s not immediately recognized as an error. Second, the navigational or link structure that you include on the page might not work properly, which translates to even more frustration for the visitor.

When designing error pages, your best option is to keep it simple and make it useful. Do that
and you will likely ensure that your visitors stay on your site. At the same time, you’re providing useful information that can be loaded with keywords and other SEO elements. The result is the best of both worlds: managing search engines while impressing your visitors with the usefulness of your site.


Thanks For Reading

Oct 7, 2014

Programming Languages and SEO



One aspect of web site design you might not think of when planning your SEO strategy is the programming language used to develop the site. Programming languages all behave a little differently. For example, HTML and PHP take completely different approaches to producing the page you see when you open it in a browser. (When most people think of web site programming, they think in terms of HTML.) In reality, many other languages are also used for coding web pages — and those languages may require differing SEO strategies.

JavaScript

JavaScript is a programming language that enables web designers to create dynamic content. However, it’s not necessarily SEO-friendly. In fact, JavaScript often completely halts a crawler from indexing a web site, and when that happens the result is lower search engine rankings or complete exclusion from ranking.

To overcome this, many web designers externalize any JavaScript that’s included on the web site. Externalizing the JavaScript means that it is actually run from an external location, such as a file on your web server. To externalize your JavaScript:

1. Copy the code, beginning at the starting tags, and paste it into a Notepad file.
2. Save the Notepad file as filename.js.
3. Upload the file to your web server.
4. Create a reference on your web page to the external JavaScript code.

The reference should be placed where the JavaScript will appear, and might look like this:

<script language="JavaScript" type="text/javascript" src="filename.js"></script>

This is just one of the solutions you can use to prevent JavaScript from becoming a problem
for your SEO efforts. There are many others, and depending on your needs you should explore some of them.
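For completeness, the external filename.js contains only the script body, with no script tags of its own. A hypothetical example of a function such a file might define for the page to call:

```javascript
// filename.js: the externalized script body only. No <script> tags belong
// in this file; the page pulls it in through the src attribute shown above.
// A hypothetical example of a function the page might call:
function greeting(name) {
  return "Welcome back, " + name + "!";
}
```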

Flash

Flash is another of those technologies that some users absolutely hate. That’s because Flash, though very cool, is resource intensive. It causes pages to load slower, and users are often stuck on an opening Flash page and can’t move forward until the Flash has finished executing. When you’re in a hurry, which is almost always, it’s a frustrating situation to deal with. Flash is also a nightmare when it comes to SEO. A Flash page can stop a web crawler in its tracks, and once it is stopped, the crawler won’t resume indexing the site. Instead, it will simply move on to the next web site on its list.
The easiest way to avoid Flash problems is simply not to use it. If, despite Flash’s difficulties with search rankings, your organization needs it, you can embed the Flash in HTML and add a test for the visitor’s ability to display Flash before it executes. Note, however, that there’s some debate about whether this is an acceptable SEO practice, so before you implement this type of strategy in an effort to improve your SEO effectiveness, take the time to research the method.

Dynamic ASP

Most of the sites you’ll encounter on the Web are static web pages. These sites don’t change beyond any regular updates by a webmaster. Dynamic web pages, by contrast, are created on the fly according to preferences that users specify in a form or menu. These sites can be built using a variety of programming technologies, including dynamic ASP. The problem with these sites is that they don’t technically exist until the user creates them. Because a web crawler can’t make the selections that build these pages, most dynamic web pages aren’t indexed in search engines.
There are ways around this, however. Dynamic URLs can be converted to static URLs with the right coding. It’s also possible to use paid inclusion services to index dynamic pages down to a predefined number of levels (or number of selections, if you’re considering the site from the user’s perspective).
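For example, on a server running Apache’s mod_rewrite module, a static-looking URL can be mapped onto the dynamic page behind it. A sketch, with hypothetical paths and parameter names:

```apache
# Present the crawler-friendly URL /products/blue-widget while serving
# the dynamic page behind it; visitors and crawlers see only the clean URL.
RewriteEngine On
RewriteRule ^products/([a-zA-Z0-9-]+)$ /product.asp?item=$1 [L]
```

Other servers offer comparable URL-rewriting facilities; the principle is the same.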

Dynamic ASP, like many of the other languages used to create web sites, carries with it a unique set of characteristics. But that doesn’t mean SEO is impossible for those pages. It does mean that the approach used for the SEO of static pages needs to be modified. It’s an easy enough task, and a quick search of the Internet will almost always provide the programming code you need to achieve SEO.

PHP

Search engine crawlers being what they are — preprogrammed applications — there’s a limit to what they can index. PHP is another programming language that falls outside the boundaries of normal web site coding. Search engine crawlers see PHP as another obstacle if it is not properly executed.

Properly executed means that PHP needs to be used with search engines in mind. For example, PHP naturally stops or slows search engine crawlers, but with some attention and a solid understanding of PHP and SEO, it’s possible to code pages that work. One thing that works well with PHP is designing the code to look like HTML. It requires an experienced code jockey, but it can be done. And once the code has been disguised, the PHP site can be crawled and indexed so that it is returned in search results.


Thanks for Reading.

To be continued...

Oct 6, 2014

Problem Pages and Work-Arounds


No matter how much time and consideration you put into your SEO strategy, some elements of your web site will require special consideration. Certain sites — such as portals — need a different approach than a standard web site might require. How you deal with these issues will have an impact on the effectiveness of your SEO efforts.

Painful Portals

The use of portals — web sites that are designed to funnel users to other web sites and content — as a search engine placement tool is a hotly debated topic. Many experts will start throwing around the word ‘‘spam’’ when the subject of SEO and portals comes up; and there have been serious problems with portals that are nothing more than search engine spam. In the past, portals have certainly been used as an easy link-building tool offering nothing more than regurgitated information. Sometimes the information is vaguely reworded, but it’s still the same information.
Search engine operators have long been aware of this tactic and have made every effort to hinder its usefulness by looking for duplicate content, interlinking strategies, and other similar indicators. Using these techniques, search engines have managed to reduce the usefulness of portal web sites as SEO spam mechanisms.

However, because search engine operators need to be cautious about portals that are nothing more than SEO spam, if your site is a portal, then optimizing it will be a little harder. As with all web site design, the best objective for your site, even for a portal, is to help your visitors achieve a desired result, whether that’s purchasing a product, signing up for a newsletter, or finding desired information. If you make using your site easy and relevant, your site visitors will stay on your site longer, view more pages, and return to your site in the future. Portals help you reach these goals by acting as excellent tools for consolidating information into smaller, more manageable sources of information that users find easier to use and digest.

Too often, people optimizing web sites focus on the spiders and forget about the visitors. The sites you are developing have to appeal to the visitors and provide them with the information they’re looking for, or all you will have at the end of the day are hosting bills and low conversion rates. Portal web sites enable you to create a series of information resources that provide full information on any given topic, while structuring a network of information covering a much larger scope.

Though the visitor is of significant importance when building a web site, the site itself is of primary significance, too. There’s no point in creating a beautiful web site if no one’s going to see it, and portals are a fantastic tool for increasing your online visibility and search engine exposure, for a wide variety of reasons. Perhaps the most significant of these is the increase in keywords that you can use in portal promotion. Rather than have one web site with which to target a broad range of keywords, portals enable you to have many web sites, each of which can have its own set of keywords. For example, instead of trying to put ‘‘deer hunting’’ and ‘‘saltwater fishing’’ on the same page, you can create an outdoors portal with separate sites for deer hunting, saltwater fishing, and any other type of outdoor activity you’d like to include.

On one page it’s much easier to target the two key phrases ‘‘deer season’’ and ‘‘Mississippi hunting license’’ than it is to target ‘‘deer season’’ and ‘‘marlin fishing.’’ Targeting incompatible keywords or phrases (that is, keywords or phrases that aren’t related to a larger topic) makes it harder to have readable, relevant content that still reaches the keywords you need to use.

There are other advantages to creating web portals as well. Having a portal gives you multiple home pages, which lets you create sites that consistently appear in top rankings. You also have more sites to include in your other SEO strategies and more places to include keywords. However, there is a fine line between a useful portal and one that causes search engines to turn away without listing your portal on SERPs.

As with most issues in web design, keep it user-friendly and attractive. If you have any concerns that the actions you’re taking with your site or the design methods you’re using could lead to negative results for the SEO of your site, don’t use them. If you have a feeling a strategy won’t work, it probably won’t, and you’re wasting your time on a design you’re not comfortable with.

Fussy frames

Some web site designs require the use of frames. Frames are sections of a web site, with each section constituting an entity separate from the other portions of the page. Because the frames on a site represent separate URLs, they often create display issues for users whose browsers don’t support frames, and for search crawlers that encounter the frames and can’t index sites where the frame is the navigational structure.

You have a couple of options when frames are essential to the design of your web site. The first is to include an alternative to the framed site using the noframes tag. This tag directs browsers that can’t display frames to show the site without the framed navigational system. Users may see a stripped-down version of your site, but at least they can still see it. When a search crawler encounters a site made with frames, the noframes tag enables it to index the alternative site. It’s important to realize, however, that when you use the noframes tag, you need to load the code for an entire web page between the opening and closing tags.

Another issue with frames is that search engines often display an internal page of your site in response to a search query. If this internal page does not contain a link to your home page or some form of navigation menu, users are stuck on that page and cannot navigate through your site. That means the search crawler is also stuck in the same spot, and as a result it might not index your site.
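A minimal sketch of the noframes fallback described above, with hypothetical file names:

```html
<!-- Sketch: a framed page with a noframes alternative. Browsers and
     crawlers that can't handle frames fall back to the content inside
     the noframes element. File names here are hypothetical. -->
<frameset cols="200,*">
  <frame src="nav.html">
  <frame src="content.html">
  <noframes>
    <body>
      <h1>Site Name</h1>
      <p>Crawlable fallback content, with links to
         <a href="content.html">the main content</a>.</p>
    </body>
  </noframes>
</frameset>
```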

The solution, of course, is to place on the page a link that leads to your home page. In this link, include the attribute TARGET = "_top". This prevents your site from becoming nested within your own frames, which locks users on the page they landed on from the search results. It also makes it possible for crawlers to efficiently crawl your site without getting stuck.  That link back to your home page will probably look something like this: 

<a href="index.html" TARGET = "_top">Return to Home Page</a>

Frames are difficult, but not impossible, to get around when you’re putting SEO strategies into place. It’s a good idea to avoid them, but they won’t keep you completely out of search engine rankings. You just have to use a different approach to reach those high rankings you desire.

Cranky cookies

Cookies are one of those irritating facts of life on the Internet. Users want web sites tailored to them, and cookies are one way companies have found to do that. When users enter a site and customize some feature of it, a small piece of code — the cookie — is placed on the user’s hard drive. Then, when the user returns to the site in the future, that cookie can be accessed and the user’s preferences executed.

When cookies work properly, they’re an excellent tool for web designers. When they don’t work as they should, problems arise. The main issue is that some browsers let users control how cookies are delivered, and some site code prompts the user before a cookie is accepted. When that happens, the search engine crawler is effectively stopped in its tracks, and it doesn’t pick back up where it left off once the cookie is delivered. In addition, any navigation that requires cookies prevents a crawler from indexing those pages. To overcome this, code your cookies so that the source code does not query the user before the cookie is delivered.

If you like the post, please comment and share it.
Thanks for Reading

Oct 4, 2014

Components of an SEO-Friendly Page


Building an SEO-friendly web site doesn’t happen by accident. It requires an understanding of what elements search engines examine and how those elements affect your ranking. It also requires including as many of those elements as possible on your site. It does little good to have all the right meta tags in place if you have no content and no links on your page. It’s easy to get caught up in the details of SEO and forget the simplest web-design principles — principles that play a large part in your search engine rankings. Having all the right keywords in the right places in your tags and titles won’t do you much good if the content on your page is nonexistent or completely unreachable by a search engine crawler.


Understanding entry and exit pages

The entry and exit pages are the first and last pages of your web site that a user sees. It’s important to understand that an entry page isn’t necessarily the home page on your web site. It can be any other page where a user lands, either by clicking through search engine results, by clicking a link from another web site or a piece of marketing material, or by bookmarking or typing directly into the address bar of a browser.
Entry pages are important in SEO because they are the first page users see — the electronic equivalent of a first impression. The typical web site is actually several small, connected sites.

Your company web site might contain hubs, or central points, for several different topics. For example, if your site represents a pet store, you’ll have hubs within it for dogs, cats, birds, fish, and maybe exotic animals. Each hub will have a main page — which will likely be your entry page for that section — and several additional pages leading from that central page to other pages containing relevant content, products, or information about specific topics. Understanding which of your pages are likely entry pages helps you optimize those pages for search engine crawlers. Using the pet-store example, if your home page and all the hub pages are properly optimized, you could potentially rank at or near the top of five different sets of search results. When you add additional entry pages deeper in your web site structure (say, a dog-training section under the dogs hub), you increase the number of times you can potentially end up at the top of search engine rankings.
Because entry pages are important in the structure of your web site, you want to monitor those pages using a web site analytics program to ensure they are working the way you expect them to work. A good analytics program, such as Google Analytics, will show you your top entry and exit pages.
Exit pages are those from which users leave your site, whether by clicking an outbound link, selecting a bookmark, or typing a different web address into their browser’s address bar. In SEO terms, your goal is to drive users from their entry page to a desired exit page. The route they take along the way is called the path that users travel through your site. A typical path might look something like this:

SERP -> Home -> Women’s Clothing -> Product Pages -> Shopping Cart -> Checkout -> Receipt

In this example, Home is the entry page and Receipt is the exit page. By looking at this navigational path, you can tell how users travel through your site and where they drop off along the way; but there’s an added benefit to understanding the navigational path of your users. When you know how users travel through your site, you can leave what’s called a bread-crumb trail for them. That’s a navigational indicator on the web site that enables them to quickly see where they are on your site.

The bread-crumb trail not only helps users return to a previous page in the navigational path; it also makes it easier for a web crawler to fully examine your site. Because crawlers follow every link on your page, this is an internal link structure that leads crawlers to individual pages that you want included in search engine results.
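In practice, a bread-crumb trail is just a short line of plain text links near the top of the page. Here is a minimal HTML sketch, using the clothing-store path above; the page addresses are hypothetical:

```html
<!-- Bread-crumb trail: plain text links that both users and crawlers can follow -->
<div class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/womens-clothing/">Women's Clothing</a> &gt;
  <a href="/womens-clothing/jackets/">Jackets</a>
</div>
```

Because each step in the trail is an ordinary anchor tag, it doubles as exactly the kind of internal link structure a crawler can follow to individual pages.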

Choosing an Analytics Program

An important element in any SEO plan is analytics — the method by which you monitor the effectiveness of your web site. Analytics are the metrics that show you how pages, links, keywords, and other elements of your web site are performing. If your web host hasn’t provided you with an analytics program, find one. Not having an analytics program is like walking around in the dark, hoping you won’t bump into a wall.
Many web site owners shy away from analytics packages because they believe them to be complicated and expensive. However, they don’t always have to be. You can find a good analytics program that’s not only easy to use, but also inexpensive or even free; but use caution about making ease and low cost the deciding factors when selecting an analytics program.
The program will give you the power to see and control how your web site performs against your goals and expectations. You want it to show you everything you need to know, so here are some considerations when you’re evaluating analytics programs:

■ What reports are included in the tools you’re examining, and how will you use those reports?
■ How do you gather the information used to create the metrics you need?
■ How often are your reports updated?
■ How much training is necessary to understand your application and the reports provided?
■ Is the software installed locally, or is the product provided strictly as a web-based service?
■ What is the total cost of ownership?
■ What types of support are available?
■ What is the typical contract length?

Many analytics programs are available. Google Analytics, AWStats, JayFlowers, ClickTracks, and dozens of others all offer something different at a different price. If free is all you can afford, don’t assume you’ll get a terrible package. Google Analytics is one of the free packages available, and it’s an excellent program, based on what used to be the Urchin Analytics package (which was quite costly). Other programs cost anywhere from $30 to $300 a month, depending on the capabilities you’re purchasing.

Cost is not the most important factor, however. Ultimately, your main consideration should be how the analytics package can help you improve your business.

Using powerful titles

Page titles are one of the most important elements of site optimization. When a crawler examines your site, the first elements it looks at are the page titles; and when your site is ranked in search results, page titles are again one of the top elements considered. Therefore, when you create your web site, you need great page titles.

Consider several key factors when coming up with your page titles: 

■ Unless you’re Microsoft, don’t use your company name in the page title. A better choice is to use a descriptive keyword or phrase that tells users exactly what’s on the page. This helps to ensure that your search engine rankings are accurate.

■ Try to limit page titles to less than 50 characters, including spaces. Some search engines index only up to 50 characters; others might index as many as 150. Regardless, maintaining shorter page titles forces you to be precise in the titles that you choose and ensures that your page title will never be cut off in the search results.

■ Don’t repeat keywords in your title tags. Repetition can occasionally come across as spam when a crawler is examining your site, so avoid that in your title if possible, and never duplicate words just to gain a crawler’s attention. It could well get your site excluded from search engine listings.

■ Consider adding special characters at the beginning and end of your title to improve noticeability. Parentheses (( )), arrows (<< >>), asterisks (**), and special symbols such as £ can help draw a user’s attention to your page title. These special characters and symbols don’t usually add to or detract from your SEO efforts, but they do serve to call attention to your site title.

■ Include a call to action in your title. There’s an adage that goes something like, ‘‘You’ll never sell a thing if you don’t ask for the sale.’’ That’s true on the Web too. If you want your users to do something, you have to ask them. All of your page titles should be wrapped in the title tag when you code your web site. The title tag isn’t difficult to use. Here’s an example of such a tag:

<title>A Descriptive Web Site Title</title>

If your page titles aren’t tagged properly, you may as well not be using them, so take the time to ensure that your page titles are short, descriptive, and tagged into your web site code. By using title tags, you increase the chances that your web site will be ranked high within search engine results.

Creating great content

Web site content is another element of an SEO-friendly site that you should spend plenty of time contemplating and completing. Fortunately, there are ways to create web site content that will make search crawlers love you. Great content starts with the right keywords and phrases. Select no more than three keywords or phrases to include in the content on any one of your web pages. Why only three? Wouldn’t more keywords and phrases ensure that search engines take notice of your site? Actually, when you use too many keywords in your content, you face two problems. First, the effectiveness of your keywords will be diluted across the number of different ones you’re using. Choose two or three for each page of your site and stick with those.

Second, you may be delisted or ignored because a search engine sees your SEO efforts as keyword stuffing. It’s a serious problem, and search engine crawlers will exclude your site or pages from indexes if they contain too many keywords. After you have the two or three keywords or phrases that you plan to focus on, you need to actually use those keywords in the page content. Many people assume that the more frequently you use the words, the higher your search engine ranking will be. Again, that’s not necessarily true. Just as using too many different keywords can cause a crawler to exclude you from a search engine index, overusing the same word will also cause crawlers to consider that as keyword stuffing. Again, you run the risk of having your site excluded from search indexes. The term used to describe the number of times a keyword is used on a page is keyword density.

For most search engines, the allowed keyword density is relatively low. Google favors sites with a keyword density of about 5 to 7 percent; much lower or much higher and your ranking is seriously affected or completely lost. Yahoo!, MSN, and other search engines allow keyword densities of about 5 percent. Going over that mark could cause your site to be excluded from search results.

Keyword density is an important factor in your web site design, but there are other content concerns, too. Did you know that the freshness and focus of your content also affect how high your web site ranks? One reason many companies began adding blogs to their web sites is that blogs are updated frequently and are highly focused on a specific topic. This gives search engines new, relevant content to crawl.

Consider implementing a content strategy that includes regularly adding more focused content or expanding your content offerings. It doesn’t have to be a blog, but news links on the front page of the site, regularly changing articles, or some other type of changing content will help gain the attention of a search engine crawler. Don’t just set these elements up and leave them, however. You also have to ensure regular updates and keep the links included in the content active. Broken links are another crawler pet peeve. Unfortunately, with dynamic content, links will occasionally break. Make sure you’re checking this element of your content on a regular basis and set up some kind of user-feedback loop so broken links can be reported to your webmaster.

Finally, when you’re creating your web site content, consider interactive forums. If you’re adding articles to your site, give users a forum in which they can respond to the article, or a comments section. This leads to more frequent updates of your content, which search crawlers love. In short, forums provide users with an ongoing, interactive relationship with your web site, and give an extra boost to your search engine ranking.

Maximizing graphics

Images or graphics on your web site are essential. They’re also largely ignored by search engines, so what’s the point of putting them on your site? There’s a good reason that has nothing to do with SEO. Without images, your page is just boring text. You’re not going to be happy with using plain text instead of that cool, new logo you had designed for your company, and neither are your users. They want to see pictures. If images are a must on a web site, then there should be a way to use those images to increase your web site traffic or at least to improve your site ranking. And there is. One technique that will help your SEO make use of graphics on your site is to tag those graphics with alt attributes inside the img tags. The alt attribute is the piece of HTML used to supply alternative text when a graphic is present. The alt text should be a short, descriptive phrase about the image, which includes the keywords used on that page when possible.

The img tags are the tags used to code the images that appear on your web site. Here’s an example of what an img tag, with an included alt tag, should look like:

<img src="pic1.jpg" alt="alternative text"/>

Here’s how that tag breaks down: <img src="pic1.jpg" is the image tag itself, and alt="alternative text"/> is the alternative text attribute. The alternative text is where your keywords should be included if at all possible.

You want to tag your images as part of your SEO strategy for two reasons. First, crawlers cannot index images for a search engine. The crawler ‘‘sees’’ the image and moves on to the text on the page. Therefore, something needs to take the place of that image, so the crawler can index it. That’s what the alternative text does. If this text includes your keywords, and the image is near text that also includes the keywords, then you add credibility to your site in the logic of the crawler.

The second reason you want to tag your images as part of your SEO strategy is to take advantage of image-based search engines, such as Google Images. These image-based search engines are relatively new, but they shouldn’t be undervalued. Just as a search engine can find and index your site for users searching the Web, image-based search engines find and index your images. Then, when users perform a search for a specific keyword or phrase, your image is ranked along with the text on the pages. Image searches are gaining popularity, so crawlers like the one Google uses for its Google Images search engine will gain momentum, and image searches will add to the amount of web site traffic that your SEO strategies help to build. That said, while images are valuable, don’t overuse them on your web pages either. As with any element of a web page, too much of a good thing is not good.

Thanks for reading

Oct 3, 2014

Understanding Web Site Optimization




Web site optimization is all about creating a site that is discoverable by search engines and search directories. It sounds simple enough, but there are many aspects of site optimization to consider, and not all of them are about the keywords, links, or HTML tagging of your site. Does hosting matter?

This question comes up frequently when a company or individual is designing a web site. Does it matter who hosts your site? Strictly speaking, no, but that’s not to say that domain hosting is unimportant. Certain elements of your hosting arrangement can have a major impact on how your site ranks in search results.

One of the biggest issues that you’ll face with domain hosting is the location of your hosting company. If you’re in the United States and you purchase a domain that is hosted on a server in England, your search engine rankings will suffer. Geographically, search engine crawlers will read your site as being contradictory to your location. Because many search engines serve up results with some element of geographical location included, this contradiction could be enough to affect your ranking.

The length of time for which you register your domain name could also affect your search engine ranking. Many hackers use throw-away domains, domain names that are registered for no more than a year, because they usually don’t even get to use the domain for a full year before they are shut down. In fact, the typical malicious web site is online for less than four months, and usually for no more than a couple of weeks to a month. For this reason, some search engines have implemented ranking criteria that give priority to domains registered for longer periods. A longer registration also shows a commitment to maintaining the web site.

Domain-naming tips


The question of what to name a web site is always a big one. When selecting a name, most people think in terms of their business name, personal name, or a word or phrase that has meaning for them. What they often don’t consider is how that name will work for the site’s SEO. Does the name have anything at all to do with the site, or is it completely unrelated? Have you ever wondered why a company might be willing to pay millions of dollars for a domain name? The domain name business.com was purchased for $7.5 million in 1999 and was recently thought to be valued at more than $300 million. Casino.com went for $5.5 million and worldwideweb.com sold for $3.5 million. What’s so important about a name?



Choosing the right site name

Where SEO is concerned, the name of your web site is as important as many of the other SEO elements that you need to consider. Try this test. Use your favorite search engine to search for a topic, perhaps ‘‘Money-Making business.’’ When your search results are returned, look at the top five results. Most of the time, a web site containing those words will be returned in those top five results, and it will often be in the number one slot. 
In other words, if your company name is ABC Company but your business is selling leather bags, consider purchasing the domain name LeatherBags.com instead of ABCCompany.com.
ABC Company may not get you to the top of search rankings, but the very specific nature of your product probably will; and both the content of your site and your domain name will attract crawlers in the way you want. Using a domain name containing a keyword from your content usually improves your site ranking.
A few more things that you should keep in mind when you’re determining your domain name include the following: 

■ Keep the name as short as possible. Too many characters in a name mean increased potential for misspellings. It also means that your site address will be much harder for users to remember unless it’s something really startling.

■ Avoid dashes, underscores, and other meaningless characters. If the domain name that you want is taken, don’t just add a random number or piece of punctuation to the name in order to ‘‘get close.’’ Close doesn’t count here. Instead, try to find another word that’s relevant and possibly included in the list of keywords you’ll be using. 
For example, instead of purchasing www.yourwebsite2.com, try something like www.yoursitesubject.com.

■ Opt for a .com name whenever possible. There are a lot of domain extensions to choose from, such as .info, .biz, .us, .tv, .name, and .jobs, but if the .com version of your chosen domain name is available, that’s always the best choice. Users tend to think in terms of .com, and any other extension will be harder for them to remember. Sites with .com names also tend to receive higher rankings in search engines than web sites using other extensions, so if your competition has www.yoursite.com and you choose to use www.yoursite.biz, chances are good that the competition will rank higher in search results than you. 

Try this: Choose a random term and then use your favorite search engines to search for that term. Looking only at the top one or two pages of search results, how many of those sites have an extension other than .com? If you do see extensions other than .com, they’re likely to be .org, .net, .gov, or .edu—and you probably won’t see many of those. That’s how prevalent .com is, and it illustrates why you should try to use it whenever possible.



Considering URL structures


One more thing to think about as you’re choosing your domain name is how URLs will be structured as you begin to put your site together. Some URLs are very long and seem completely random. For example, take a look at any given product page URL for Amazon.com. If you copy and paste that URL into a document, it could be two or three lines long, and it won’t mean a thing to you after the Amazon.com part.

Ever notice how Amazon.com product pages rarely (if ever) seem to turn up in search rankings? That’s because the pages are dynamic, and a URL that exists on Amazon today may not exist there tomorrow. Dynamic URLs change. Often. And for a variety of reasons. Sometimes dynamic URLs are used on product pages, but they can also be used when content is drawn from a database on a visitor-by-visitor basis or when visitor tracking information is included in the URL.

Typically, search crawlers can’t effectively crawl sites that have dynamic URLs because the crawler can’t trigger the dynamic URL the way a user does. One way to deal with dynamic URLs is to use a program that rewrites them.
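In practice, that rewriting is usually handled by the web server itself rather than by hand-editing pages. As a sketch only, on an Apache server with mod_rewrite enabled, a rule like the following (the file name, URL pattern, and parameter are hypothetical) could map a clean, static-looking URL onto the underlying dynamic one:

```apache
# Hypothetical example: serve /products/leather-bags
# from the dynamic script product.php?category=leather-bags
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)/?$ product.php?category=$1 [L,QSA]
```

With a rule like this in place, users and crawlers both see the friendly URL, while the server quietly runs the same dynamic script it always did.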

URL rewriting is a common practice in SEO, especially since Google stated that it can’t effectively crawl dynamic URLs. Unfortunately, even URL rewriting comes with a set of drawbacks. For example, because even rewritten dynamic URLs tend to be very long, they often wrap — break into two lines — in error messages or when used in blog posts or forums. The result is sometimes an incomplete URL that can’t be followed.

URL rewriting also introduces the possibility of errors, especially if the rewriting is done manually in the coding for a web page. A better option is to use static URLs. Static URLs remain the same all the time, and you can see them all over the Web. Even a blog post may start out with a temporary dynamic URL, but once the post goes into the archives, its URL becomes static and doesn’t change again; that permanence helps search engines rank such pages more effectively. Another advantage of static URLs is that they can contain keywords that are meaningful not only to search crawlers, but also to the people who visit your web site. Static URLs are easier to read. They usually contain mostly words, with few numbers, and they never include randomly generated identifiers.

As you’re putting your site together, consider how it’s going to grow and how you’ll be naming the pages that you add to it. Part of that consideration is entirely site design and will be determined by the programming language that you use to create your site; but much of it involves forethought about how such matters will be handled. Discuss with your web site designer how you would like to have the URL structure handled. The designer will know how to ensure that your URLs are as usable as the rest of your site.

Again, it’s important to realize that domain naming is only one facet of SEO strategy. It won’t make or break your SEO, but it can have some effect. Therefore, take the time to think about the name you plan to register for your site and then how you plan to structure your URLs as your site grows.

If you can use a name that not only reaches your audience, but also lands you a little higher in search results and makes it easier to create useful URL structures, then by all means purchase it; but if no name really seems to work in the SEO strategy for your site, don’t get discouraged. You can make up for any domain-naming issues by implementing solid keyword strategies, tagging strategies, and other elements of SEO. Do try to keep your URL structure simple, though, even when your domain name might not be your first choice.



Understanding usability

Usability: It means different things to different web site designers. It’s also been at the top of every user’s requirements list since the Web became part of daily life. When users click through to your web site from a search results page, they want the site to work for them. That means they want to be able to find what they’re looking for, to navigate from place to place, and to have pages load quickly, without any difficulties.

Web site users are impatient. They don’t like to wait for pages to load, they don’t want to deal with Flash graphics or JavaScript, and they don’t want to be lost. These are all elements of usability — how the user navigates through and uses your web site. And yes, usability has an impact on SEO, especially from the perspective of your site links and loading times. When a search engine crawler comes to your site, it crawls through the site looking at keywords, links, contextual clues, meta and HTML tags, and a whole host of other elements. The crawler moves from page to page, indexing what it finds for inclusion in search results; but if that crawler reaches the first page and can’t get past the fancy Flash you’ve created, or if it gets into the site and finds links that don’t work or that lead to unexpected locations, it will recognize this and make note of it in the indexed site data. That can damage your search engine rankings.



Navigation knowledge

When you consider web site navigation, there are two types: internal navigation and external navigation. 
Internal navigation involves the links that move users from one page to another on your site. External navigation refers to links that take users away from your page. 

In order for your navigation to be SEO-friendly, you have to use both types of navigation carefully. Look at a number of different high-ranking web sites. How is the navigation of those sites designed? In most cases, you’ll find that the top sites have a left-hand navigation bar that’s often text based, and some have a button-based navigation bar across the top of the page. Few have only buttons down the left side, and all of them have text links somewhere in the landing page.

The reason why the navigation structure for many sites looks the same is because this plan works. Having a text-based navigation bar on the left works for SEO because it enables you to use anchor tags with the keywords you’re using for the site. It also enables crawlers to move from one page to another with ease.
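Such a text-based navigation bar is nothing more exotic than a list of anchor tags whose link text carries your keywords. Here is a minimal HTML sketch; the section names and addresses are hypothetical, borrowing the pet-store example used earlier:

```html
<!-- Left-hand navigation: plain text links with keyword-rich anchor text,
     easy for a crawler to follow from page to page -->
<ul class="nav">
  <li><a href="/dogs/">Dog Supplies</a></li>
  <li><a href="/cats/">Cat Supplies</a></li>
  <li><a href="/birds/">Bird Supplies</a></li>
  <li><a href="/fish/">Fish and Aquarium Supplies</a></li>
</ul>
```

Because the links are ordinary text anchors rather than buttons or scripts, a crawler can read both the destination and the keyword in the link text.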

Buttons are harder for crawlers to navigate, and depending on the code in which those buttons are designed, they might be completely invisible to the crawler. That’s why many companies that put button-based links at the top of the page also include a text-based navigation bar on the left. The crawler can still move from page to page, but the user is happy with the design of the site.

The other elements that appear on nearly every page are text-based links within the content of the page. Again, those links are usually created with anchor tags that include the keywords the site is using to build site ranking. This is an effective way to gain site ranking. The crawler comes into the site, examines the linking system, examines the content of the page, compares these items, and finds that the links are relevant to the content, which is relevant to the keywords. That’s how your ranking is determined. Every element works together. 
Take the time to design a navigational structure that’s not only comfortable for your users, but also crawler-friendly. If it can’t always be perfect for the crawlers, make sure it’s perfect for users. Again, SEO is influenced by many different factors, but return visits from users are the ultimate goal. This may mean that you have to test your site structure and navigation with a user group and change it a few times before you find a method that works both for returning users and for the crawlers that help to bring you new users. Do those tests. That’s the only way you’ll learn what works.


Usability considerations

It’s not always possible to please both your site users and the crawlers that determine your page ranking. It is possible, however, to work around problems. Of course, the needs of users come first because once you get them to your site you want them to come back. On the Internet, it’s extremely easy for users to surf away from your site and never look back — and returning visits can make or break your site. The catch is that in order to build returning visitors, you have to build new visitors, which is the purpose of SEO. That means you need search engines to take notice of your site.

When it seems that users’ preferences are contrary to crawlers’ preferences, there is a solution: a site map. There are two types of which you should be aware.

A basic site map is an overview of the navigational structure of your web site. It’s usually text based, and it’s nothing more than an overview that includes links to all the pages on your web site. Crawlers love site maps, and you should, too: a site map enables you to outline the navigational structure of your web site, down to the second or third level of depth, using text-based links that should include anchors and keywords.

An example of a site map for the Work.com web site is shown in Figure 4-5. When a site map exists on your web page, a search engine crawler can locate the map and then crawl all the pages that are linked from it. All those pages are then included in the search engine index and will appear on search engine results pages. Where they appear on those SERPs is determined by how well the SEO is done for each individual page.

A second type of site map, the XML site map, is different from what you think of as a site map in both form and function. An XML site map is a file that lists all the URLs for a web site. This file is usually not seen by site visitors, only by the crawlers that index your site. Unlike a basic site map, an XML site map carries no anchor text or keywords; its job is simply to tell crawlers about every page you want indexed, including pages two or three levels deep that might otherwise be missed.
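XML site maps follow the Sitemaps protocol published at sitemaps.org. A minimal sketch, with hypothetical URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.yoursitesubject.com/</loc>
    <lastmod>2014-11-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.yoursitesubject.com/dogs/dog-training/</loc>
    <lastmod>2014-10-15</lastmod>
  </url>
</urlset>
```

Saved at the root of the site (conventionally as sitemap.xml), this file can then be submitted to the search engines so their crawlers can find every page listed in it.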

Thanks for reading