Wednesday, February 23, 2011

Ranking High in Google


How to get your website to rank high in Google. Hopefully this helps!
A good site is not only about good layout, navigability and loading speed. For a site to be excellent it certainly needs those qualities, but it also needs to be easily found and accessed by the general public. And what better way to put your site at the disposal of billions of people on the Internet than by ranking at the top of the search engines, especially their crowned king, Google.
Some of the following rules are common to all search engines, but I will concentrate on the best and most popular one, namely Google. In order to understand Google better, let’s look at a brief history of the famous company.
Google’s first appearance on the rough “war theater” of search engines was in November 1997, when it entered into direct competition with the “big boys” of that time: Lycos, AltaVista and HotBot. Over the next few years the popularity meter favored Google more and more, especially for its quick search methods and impressive technology. After taking over from Inktomi the job of providing secondary results to Yahoo in 2000, Google had a fast-paced ascent. Nowadays, Google searches through over 8 billion pages and serves 70 million searches every day. Think it’s tough getting a web page to the top of Google in these circumstances? Well, it is, but if you follow the guidelines in this article you’re bound to have far better success making your site popular.
Search Engine Ranking – General Tips
Following are some general SEO tips that usually work for all search engines. Most of them involve HTML editing, so it would be best if you knew the basics of HTML before trying to optimize your site.
Text versus images – because of the headaches text and fonts cause for websites (different browsers have different fonts, there are Mac and PC differences, monitor sizes vary, etc.), a lot of web designers fall into the trap of using images that contain the text instead of pure formatted text inside the HTML code. Although this may provide a quick fix for layout problems, it causes two other major disadvantages, namely increased size and the inability of search engines to look through your site (search engines look inside the HTML code for text; they cannot scan pictures). Therefore, it’s always better to work a little harder on finding the right text size and font rather than using images.
Titles – whenever going through a page, search engines look at its title first, so make sure the page’s title contains the keywords people will most likely use when searching. For example, if your site is for a travel company, make sure you have words like “travel”, “vacation”, “holiday”, etc. Designers have a tendency to title their pages in an extremely simple and inefficient manner, like Homepage, Contact, Shop, etc. Titles should instead contain whole phrases with a high keyword density, giving the search engine more “tips” for finding your site. To insert a title into a web page, edit the HTML with any editor (Notepad will do) and place the title between the title tags in the head of the web page.
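For instance, the travel site mentioned above could use a head section like this (the exact wording is only an illustration, not a guaranteed winner):

<head>
<title>Cheap travel deals, vacation packages and holiday offers worldwide</title>
</head>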
Keywords – keywords are clues that search engines process, ranking your site according to keyword density. Keywords should be used both inside the site and in the head section of the HTML code. To place keywords in the code, insert the following tag inside the site’s <head> section: <meta name="keywords" content="insert your keywords here, separated by commas; you can place as many keywords as you want">. For example, if you have built a site for a company that sells cars, you should place common words like “car”, “ABS”, “TDI”, “4WD”, “car prices”, “low car prices”, “Mercedes”, “Audi”, etc. Also, seeing how you can place as many keywords as you like, you should add misspelled words to the Meta tag, in case the person searching misspells them. For the above example, you could add “Mercedece”, “Audy”, etc. It might sound funny, but you have no idea how many people misspell their searches, and if you can think ahead about how people misspell words, you are bound to be the only (or one of the few) site owners with a link on Google for those words, dramatically increasing the chance of your site being accessed.
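Put together, the head section of the hypothetical car site could contain something like this (the keyword list is just an illustration, not a tested set):

<head>
<meta name="keywords" content="car, car prices, low car prices, ABS, TDI, 4WD, Mercedes, Audi, Mercedece, Audy">
</head>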
The words in the Meta tags do not show up on the site itself. Also try to use the right keywords at a high density throughout your site: the more instances of a keyword a search engine finds while scanning your site, the better the rank. If your page is optimized for the right keywords, the number of total visitors can double or triple. This is a complementary measure, alongside a good internal and external linking structure, to improve the chances of your site being found on Google. Many websites fail exactly in this field, as their keywords don’t intersect with the searches of their potential visitors. The text within the body of your page – the visible text contained between the <body> and </body> tags – should be roughly between 300 and 500 words, in which your key phrase should be used a couple of times and keywords should make up about 5% of the text.
Invisible misspelled keywords (not recommended, but it’s a widely used method – take notice that it can be dangerous to use cloaked-text techniques, as some search engines may flag or even ban your site for this) – another tricky way to use misspelled keywords on your site is to put all the wrong variants of your keywords in a couple of rows somewhere where they are not likely to be seen at all (above the disclaimer, or somewhere at the top, just above the header) and change their color to the exact color of the background (if you have a picture as the background, you can’t use this method). The problem is that if a visitor selects that text by mistake, he will see a few rows of misspelled words on your site, which undoubtedly hurts your professional image. For example, I’m going to write a piece of white text here – try to select it: … this doesn’t look very nice, does it? Now, depending on the browser you use, those words were either highlighted so you could see their content, or they stayed permanently hidden and it seemed like you just selected blanks. For those curious readers who can’t see it, I wrote “… this doesn’t look very nice, does it?”. Again, I do not recommend using this technique, as it is risky, but sometimes it’s as important to know what you SHOULDN’T do as it is to know what you SHOULD do.
Headers – search engines (and Google especially) take headers into consideration when ranking your site. The header tags go from <h1> to <h6> and are used to split your website’s text into sections more easily. <h1> produces the biggest text, with each following tag getting smaller, down to <h6>, which is the smallest. You have a better chance of ranking higher if you use your site’s keywords or key phrases inside headers, especially the first two header tags, <h1> and <h2>.
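For instance, the car site from the keywords example could open its main page with keyword-bearing headers like these (the phrasing is just an illustration):

<h1>Low car prices on new and used cars</h1>
<h2>Mercedes and Audi deals</h2>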
Meta robots tag – a rarely used meta tag (or at least you can rarely find information about it in articles like this one), this tag “tells” a visiting robot to index your page, somewhat increasing the probability and speed at which you will get indexed. Since adding it involves little extra effort, you might as well do it while you add the other tags. The tag tells a search engine whether to index your page and follow the links on it. It looks like this: <meta name="robots" content="index, follow" />. If you want to do the opposite and not let search bots crawl your site, replace “index” with “noindex”, and if you don’t want them to follow your links, replace “follow” with “nofollow”.
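Placed in the head section, the two variants described above look like this:

<meta name="robots" content="index, follow">
<meta name="robots" content="noindex, nofollow">

The first line invites the robots to index the page and follow its links; the second asks them to do neither.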
Other HTML SEO tricks – there are a few more HTML tricks that can send your site up the ranks. Using frames and pop-ups increases your rating on many search engines. The code for these HTML objects is too long and complicated to explain here (that is not the purpose of this article), but you can easily create framesets and pop-ups using a web design program like Macromedia Dreamweaver or Microsoft FrontPage. Another tip is the use of “alt” attributes on images. The “alt” text is the text you sometimes see when holding your mouse over a picture (most of the time a description of the picture). To use it, find the image source of a picture (e.g. <img src="picture.gif") and insert the following attribute: alt="description of picture". This makes a small box containing the “picture description” line appear over the image when you hover the mouse above it. Search engines also look at these alt lines when calculating your rank.
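A complete image tag with the alt attribute filled in could look like this (the file name, dimensions and description are only placeholders):

<img src="picture.gif" width="200" height="150" alt="red Mercedes convertible, front view">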
Google’s PageRank
Ranking high on Google is not all about HTML tricks – you need to create an attractive and useful page, something people will want to come back to. If other website administrators like your site, they might post links to it on their own pages. All the external links your web page gets are analyzed and computed by Google through a process called PageRank, an innovative search algorithm that processes the information in Google’s web page database a lot faster than other search engines. The higher your PageRank, the closer you are to being in the top search results. A good rank can place your page somewhere in the first 50-100 search results shown by the search engine for a particular field. PageRank ranges from 0 to 10, with pages between 7 and 10 considered high ranking in Google’s search engine.
Basically, PageRank works like this: using a mathematical formula, it calculates the number of links pointing to your site from other web pages and their importance. It’s like a vote-casting process: each link to your site that appears on another site counts as a vote, and the importance of the vote is determined by the importance of the site that casts it. In Google’s own words (taken from Google’s FAQ):
“Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.””
Below you will find the actual PageRank formula used by Google to calculate your web page rank. The share of your PageRank that you pass on when you cast a vote (create a link to a site other than yours) is governed by the damping factor. The damping factor is known to be 0.85, meaning the vote is worth a little less than the linking page’s own PR.
PR(A) = (1-d) + d(PR(t1)/C(t1) + … + PR(tn)/C(tn))
PR(A) is the PageRank boost your page A will get after being linked from someone else’s page (t1). PR(t1) is the PageRank of the page that links to you and C(t1) is the total number of links that t1 contains. Keep in mind that a page’s voting power is just 0.85 of that page’s actual PR, and this gets spread out evenly between the pages it links to. The democratic process of ranking pages that Google uses doesn’t reduce the value of a page’s rank when it casts a vote on another page: PageRank isn’t transferred away, so linking to another page won’t diminish your own rank. Fortunately for us, we don’t have to calculate our PageRank with the above formula – Google offers an easier method. Your PageRank appears in the Google toolbar, where it is regularly refreshed.
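As a purely hypothetical worked example: say page t1 has a PageRank of 6 and contains 3 outbound links, one of which points to your page A, and A has no other inbound links. With d = 0.85 the formula gives PR(A) = (1 - 0.85) + 0.85 × (6 / 3) = 0.15 + 1.7 = 1.85. If t1 had 30 outbound links instead of 3, the same vote would only be worth 0.15 + 0.85 × (6 / 30) = 0.32, which is why a link from a high-ranking page with few outbound links counts for so much more.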
You should keep in mind that Google values a link by the ranking of the page it appears on, not the ranking of the site’s home page. A link from a popular home page will increase your own ranking more than a link from the same site’s secondary pages. If you want to see how many links there are to your existing web page, go to Google and type “link:” in the search box, followed by your page’s URL. This will give you a better idea of where you stand in the Google ranking system. If the inbound links are few in number and come from unpopular sites, you probably have a zero or very low PageRank rating. Studies have shown that the quality of the links is just as important as the number of incoming links to your site. If you get a link from a site like MSN or Yahoo, your website will rise in Google’s rankings, as you are now linked from a very respected site.
Another technique for finding high-quality links is searching for sites that offer site submissions. You can use the following query in Google:
“add URL” “your keyword phrase”
Also try replacing “add URL” with any of these search phrases:
“add a site”, “add a link”, “submit URL”, “submit link”, “submit an URL”, “submit a link”.
The keywords used to link to your site also play a crucial role. When asking a site owner to link to your site, make sure you ask them to link with the keyword you are optimizing that page for. Nearly one out of four of the #1 web pages in Google’s results used the queried keyword in their <a> link text – a very high number, so it looks like Google ranks web pages higher if the keywords appear in the anchor text of the links pointing to them.
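So, if the page you are optimizing targets the phrase “low car prices”, the link you ask for should ideally look something like this (the URL is a placeholder), rather than a bare “click here” or the naked address:

<a href="http://www.yoursite.com/prices.html">low car prices</a>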
Linking, in addition to keyword density, should be the main concern of every webmaster that wants his site in a high Google rank. We will discuss linking options later on in the article.
Googlebots
Googlebots are small programs running on Google’s servers that continuously browse the Internet, indexing websites into Google’s main database. The Googlebots scan your web pages and send the data back to the database. The easiest way to “attract” Googlebots to index your site is to have a site map. The site map is probably one of the most important pages a Googlebot can turn to, as the links contained there point back to your own web pages, creating a trail Google will use to trace the rest of the links on your site. You can find out whether your site has been indexed by the Googlebots with the help of the Google toolbar: if the ranking value is zero, you should probably revise your website’s design so that the Googlebots can find it (eliminate some of the Flash and see if that helps).
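A minimal site map is nothing more than a plain page of text links pointing to your own pages, along these lines (the file names are placeholders):

<h1>Site map</h1>
<ul>
<li><a href="index.html">Home</a></li>
<li><a href="prices.html">Car prices</a></li>
<li><a href="contact.html">Contact</a></li>
</ul>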
Another way to attract search engine bots is to have a small website. Keeping your pages under 101 KB is sure to attract Googlebots; if a page is much larger, there is a chance the Googlebots will avoid it, as Google’s cache doesn’t go higher than 101 KB. This is, of course, a general guideline, but by keeping your site small you gain several benefits besides attracting Googlebots more easily: it will load faster, making it better for human navigation, it will be easier to manage and upload, and it will be a lot easier to promote.
Besides the original Googlebot launched by the search engine a few years ago, it seems Google has begun using another spider for scanning and indexing websites. News of a second Googlebot came from a number of site owners who, while studying their site logs, noticed two Google spiders, with different IP address ranges, visiting and scanning their respective sites.
SEO – Linking
As we noticed earlier, the most important factor for getting a higher rank in the PageRank system is to have a complex and dynamic linking process (internal and external). There are several ways of making your link popular and having it appear on other sites:
Paying a specialized company – Search Engine Optimization (SEO) companies are a good option if you are willing to hand out a fair amount of money to rank higher in Google or other search engines, or if you find it too hard to apply all the advice in this article to your site. This option can sometimes be a tricky deal, as SEO is always unstable terrain and you might not get the desired result.
Link exchange – the simplest method after paying for optimization is plain link exchange. Basically, two webmasters who own different sites exchange links, each placing the counterpart’s link on their own site, thus increasing both ranks reciprocally. You probably won’t have a lot of success if your site has a daily average of 10 visitors and you ask the Yahoo webmaster to exchange links, but it’s always a good thing to start low, exchanging links with similar websites (similar in content and popularity). In time you will have your link spread over different places on the Internet and you can move up to exchanging links with more serious and popular partners.
Open directory listings – ODLs are special websites that index and sort other sites by category and content. The most important ODL is without a doubt the DMOZ Open Directory Project, where you can register your site for free and it will be checked for content. If the site is approved, it will be indexed in a category (for example “Commerce” or “Web Design”, etc.) where there is a chance someone will find your link and click it, or add it to their site.
All the people working at DMOZ are volunteers and their mission is to sort the good pages from the bad. They present the filtered pages back to the general public, focusing on high quality content for each site they review.
The information collected by DMOZ may be used by thousands of sites, and some of them may link to you. All these links score very low in Google’s PageRank but they are there and their numbers might add up to something that counts.
Post your link everywhere! – Whenever you have the chance, give out your link. If you spend time on a forum, place your link in your signature; if you write an article for a site, place your site’s link when signing it, and so on. This is not only a good way to get your site clicked and make it popular, it also raises the chance of your site being linked from another one. Think about it: you’re posting on a forum with 2,000 registered members, some of whom are bound to have websites of their own or manage someone else’s website.
Internal linking system – use a well-balanced internal linking system. You can cross-link the most important pages on your site so that the Google search engine sees them as relevant. If your website has hundreds of pages, a cross-linking system makes it a lot easier for the Googlebots to index your pages quickly. You also benefit from the increased number of keywords contained in the text links.
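One simple way to do this is to repeat a short block of keyword-rich text links at the bottom of every important page, for example (the page names are placeholders):

<p>
<a href="index.html">Low car prices</a> |
<a href="mercedes.html">Mercedes deals</a> |
<a href="audi.html">Audi deals</a>
</p>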
Google Dance
The “Google Dance” is the name given to Google’s updating process when, for a short amount of time (usually hours), the ranks and the position of your site in the search engine “dance” up and down. The movement in the PageRank system is caused by the impossibility of updating the information on over 10,000 servers simultaneously. During the process there is a chance that a lower-ranked site is updated before a higher-ranked one, and until the latter is updated, the two will swap places.
Up until 2003, updates were performed once a month, and for a few hours all ranks were severely distorted. To avoid this chaos inside their search engine, Google found a way to update it continuously, which not only softened the severe swings in rank (they still occur today, but to a lesser degree) but also made it possible to update more often, since continuous updating prevents abrupt variations in rank. If you see your page shift up and down the ranks over short intervals (minutes or hours), you are probably just experiencing a Google Dance.
Google’s Own SEO Advice
Here are some quotes from the Google Guidelines found on the search engine’s website, regarding what you should and shouldn’t do if you want to rank higher in Google’s PageRank.
“Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link” – this helps Googlebots index your page more easily. Not only that, it also gives readers a cleaner way of navigating, so by respecting this rule you hit two targets with one shot.
“Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages” – again, advice that is useful both for human visitors and for spider bots. A site map is a necessity for every major site, since some are so complex that visitors literally lose themselves in them. It’s also one of the first places a spider bot looks, making it a valuable asset in the ranking process.
“Create a useful, information-rich site and write pages that clearly and accurately describe your content” – clear text matters greatly for any site that handles a good number of visitors. If you make your visitors lose interest by using spamming methods, disorganized text, etc., you risk dropping in the ranks, as visitor numbers also factor into the ranking system. The general rule for any site is that if visitors like it and find it useful, the search engine will too.
“Think about the words users would type to find your pages, and make sure that your site actually includes those words within it” – The relevancy of the keywords you use is crucial, so take some time to put together a list of what you think relates to your site.
“Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images” – we showed the importance of using text over images in the “General Tips” chapter. In some cases it seems almost critical to use an image rather than text (for banners or site headers, for example). There is, however, a small trick that lets you keep a cool image as a banner while having the company name or other text outside the image. If you have a pre-made banner with your company name inside, edit the picture and remove the text, then make a small table in the website (where the header used to be) of the exact same width and height as the new, empty banner. Place the empty banner as the background of the table and you can write whatever text you need over it. Just make sure the text sits in the same position it had before you removed it from the image. This works perfectly for keeping the site design you originally wanted while feeding text to the spider bots at the same time.
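Here is a bare-bones sketch of that trick, assuming the stripped banner image is 600 by 100 pixels and saved as banner-empty.gif (both the size and the file name are placeholders):

<table width="600" height="100" background="banner-empty.gif">
<tr>
<td><h1>Low car prices – Mercedes and Audi deals</h1></td>
</tr>
</table>

The background image keeps the banner’s original look, while the heading text inside the cell remains visible to the spider bots.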
“Make sure that your TITLE and ALT tags are descriptive and accurate” – don’t mislead the Googlebots; provide them with accurate descriptions of your page. Google doesn’t like to be tricked, so it’s good advice to play by the rules in order to avoid being flagged or, worse, banned. The “misleading” rule applies not only to Title and Alt tags, but also to inaccurate keywords, keyword spamming, invisible JavaScript text with misleading information, etc.
“Check for broken links and correct HTML” – not only is a broken link a sign of negligence in the visitor’s eyes, but Googlebots and other crawler-type bots can also get misled when indexing a page with broken links.
“If you decide to use dynamic pages (i.e., the URL contains a ‘?’ character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them small” – the quote speaks for itself: try to reduce dynamic URLs and their parameters as much as you can.
“Keep the links on a given page to a reasonable number (fewer than 100)” – again, this advice refers to the Googlebots’ ability to index a page quickly. If a page is too complex or has a disorganized structure, not only will it take the bots a lot more time to crawl through it, they might sometimes run into problems and get stuck between links.
“Avoid hidden text or hidden links and don’t employ cloaking or sneaky redirects” – as I already stated above, Google hates to be tricked or misled. The use of JavaScript cloaking or invisible text columns (with irrelevant information, used just to obtain a higher rank or get more visitors to the site) is strictly forbidden by Google, and using these techniques on a large scale might even get your site banned from Google’s search engine (meaning your site will never show up on Google, no matter what your PageRank score would be).
“Don’t load pages with irrelevant words” – same as above: using irrelevant text, keywords or key phrases only to trick visitors into coming to your site is combated by Google. The strictest measures in this regard apply to pornography sites, which use irrelevant words to attract viewers to their pages, possibly attracting children.
“Don’t create multiple pages with substantially duplicate content” – Google sees this as another form of spamming. Don’t worry if you copy a few paragraphs onto another page where it makes sense; just don’t duplicate whole pages for the sake of having more internal links and more keywords.
“Avoid doorway pages created just for search engines, or other cookie cutter approaches such as affiliate programs with little or no original content” – another “DON’T” on Google’s list. Doorway pages break pretty much all of the above rules, being pages flooded with keywords, irrelevant text, cloaked text, duplicate content, etc.
“Make sure your web server supports the If-Modified-Since HTTP header” – this feature allows your web server to tell Google whether your content has changed since it last crawled your site. Supporting it saves you bandwidth and overhead.
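In practice the exchange looks roughly like this: the Googlebot sends a request such as

GET /index.html HTTP/1.1
If-Modified-Since: Sat, 01 Jan 2011 10:00:00 GMT

and if the page has not changed since that date, a properly configured server answers with a short “304 Not Modified” status instead of resending the whole page. Most common web servers, Apache for example, support this out of the box for static pages.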
“Google may respond negatively to misleading practices, like tricking users by registering misspellings of well-known websites” – well, this rule could upset not only Google but local or international authorities too. Misspelling well-known sites or brands is common on the Internet, so you can see websites marketing “PLIMA” sneakers, “Raebok” tennis rackets, and the examples could go on and on. Misleading logos or banners are also common. The best example I found regarding this matter was a search engine named “Gogle”, which claimed to be the world’s most popular search engine (I didn’t include the link, for your own protection, as the site was crawling with Trojan viruses).
