Hello team, welcome to my blog. Please post your views, updates, and articles related to SEO.

Saturday, March 25, 2006

Search Indexing Robots and Robots.txt

Search engine robots will check a special file in the root of each server called robots.txt, which is, as you may guess, a plain text file (not HTML). Robots.txt implements the Robots Exclusion Protocol, which allows the web site administrator to define what parts of the site are off-limits to specific robot user agent names. Web administrators can disallow access to cgi, private and temporary directories, for example, because they do not want pages in those areas indexed.
The syntax of this file is obscure to most of us: it tells robots not to look at pages whose URLs contain certain paths. Each section names a user agent (robot) and the paths it may not follow. The original protocol has no way to match a kind of file, and the Allow directive is a later extension that not every robot honors. You should remember that robots may access any directory path in a URL which is not explicitly disallowed in this file: everything not forbidden is OK.
The three most common items you will find in a robots.txt file are:
1. allow
2. disallow
3. wildcard or asterisk: "*"
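Putting those items together, here is a minimal sketch of a robots.txt (the directory names are just illustrative) and a quick way to test its rules with Python's standard-library parser:

```python
# Sketch: checking URL paths against robots.txt rules with the
# standard-library parser. Directory names are illustrative only.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Everything not explicitly disallowed is permitted.
print(rp.can_fetch("Googlebot", "/index.html"))        # True
print(rp.can_fetch("Googlebot", "/cgi-bin/test.cgi"))  # False
```

Note how the everything-not-forbidden principle shows up directly: only paths matching a Disallow line are blocked, and every other path is fair game.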

If you want to know more details about robots.txt, please visit the following URL:

Google Hacking

Google hacking is the use of a search engine, such as Google, to locate a security vulnerability on the Internet. There are generally two types of vulnerabilities to be found on the Web: software vulnerabilities and misconfigurations.
Although there are some sophisticated intruders who target a specific system and try to discover vulnerabilities that will allow them access, the vast majority of intruders start out with a specific software vulnerability or common user misconfiguration that they already know how to exploit, and simply try to find or scan for systems that have this vulnerability. Google is of limited use to the first attacker, but invaluable to the second.
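For illustration, these are the kinds of query operators such scans combine. The operators (intitle:, filetype:, inurl:, site:) are real Google syntax; the specific targets shown are hypothetical examples:

```
intitle:"index of" "parent directory"   (open directory listings)
filetype:log inurl:password             (log files that may expose credentials)
site:example.com inurl:admin            (administration pages on a single site)
```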

How can you prevent Google hacking?
Make sure you are comfortable with sharing everything in your public Web folder with the whole world, because Google will share it, whether you like it or not. Also, in order to prevent attackers from easily figuring out what server software you are running, change the default error messages and other identifiers. Often, when a "404 Not Found" error is detected, servers will return a page that says something like:

Not Found
The requested URL /cgi-bin/xxxxxx was not found on this server.
Apache/1.3.27 Server at your web site Port 80
The only information that the legitimate user really needs is a message that says "Page Not Found." Restricting the other information will prevent your page from turning up in an attacker's search for a specific flavor of server.
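On Apache, a sketch of the relevant httpd.conf directives (the custom error page path is an assumption):

```
# Show only "Server: Apache" in response headers, no version number
ServerTokens Prod

# Drop the version footer from server-generated error pages
ServerSignature Off

# Serve a custom page that says only "Page Not Found"
ErrorDocument 404 /notfound.html
```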
Google periodically purges its cache, but until then your sensitive files are still being offered to the public. If you realize that the search engine has cached files that you do not want viewed, you can go to ( http://www.google.com/remove.html ) and follow the instructions on how to remove your page, or parts of your page, from their database.

Friday, March 24, 2006

Search Engine Optimization Errors

1. No incoming links - Google ranks webpages in part using their formula called PageRank, which is based on the number and quality of incoming links pointing to a page. This principle of rating websites by the number of other sites that link to it is often called Link Popularity, and all of the major search engines rely on this measure when they rank websites in search results. If you have no links from other websites pointing to pages on your website, you will never achieve high rankings in Google, Yahoo! or the other search engines. The most common way of obtaining links is by inviting the webmasters of other websites to exchange links. Naturally, you will have to have created a page on your site to fulfill your part of the link exchange agreement. Other methods include registering your website with one of the many directory websites. Links from blogs, guestbooks, forums, websites' automated links pages, and other low-ranked webpages are of little or no help in this regard. This is the reason I created my Preferred Links system - to create a way to exchange high-quality links. It's simple and free. All you have to do is to comply with 7 simple rules in exchanging links with me that ensure you'll be giving a quality link to all of your link partners. You probably already comply with these rules, so sign up today and you'll get a link from me with some real search engine punch.

2. Poor choice of title - Search engine ranking systems give the title tag a great deal of weight in ranking. Don't waste the title by using just your website name, URL, or some silly phrase that amuses you. Your two or three most important keywords should be in the title of your main page. The title tags on your interior pages should similarly focus on the keywords for each page. By the same token, don't overstuff the title with every conceivable keyword for the page. Moderation in all things, as the saying goes.
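For example, a hypothetical florist's home page might lead with its keywords rather than the bare company name:

```html
<!-- Weak: wastes the title on a name alone -->
<title>Welcome to Smith's Home Page</title>

<!-- Better: two or three important keywords first (business is hypothetical) -->
<title>Flower Delivery Chicago - Wedding Bouquets | Smith Florists</title>
```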

3. Little or no presence of keywords on the site's main page - Search engine ranking software is completely dependent on the text it finds on your webpages. How is a search engine supposed to know the topic of your site if you don't set it out in plain text? A search engine will only find the keywords you feed to it on your webpages. Add emphasis to your keywords by enclosing them in headline and strong tags. I've exaggerated the technique on this page to demonstrate what I mean. A page composed solely of graphics or Flash animation is crippling its search engine potential. Feed those spiders some juicy text! Even a single keyword-rich sentence at the bottom of the page is better than no text at all!
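Continuing the hypothetical florist example, headline and strong tags feed the spider plain, keyword-rich text:

```html
<h1>Flower Delivery in Chicago</h1>
<p>We offer <strong>same-day flower delivery</strong> across Chicago.
Even one sentence of real text like this beats an all-graphics page.</p>
```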

4. Meta tag abuse - A few years ago, search engine ranking systems gave a good deal of weight to the meta keywords tag, and so every computer book author told his readers this was the secret of search engine optimization. Well, that ship has sailed, and all but one of the major search engine ranking systems now ignore the keywords tag... unless you overstuff it, in which case you could get penalized for SPAMming. Google still ignores meta tags for ranking purposes. Yahoo! has retained many of the tendencies of its Inktomi search technology and does give some weight to the keywords and description tags, but again, it's not a major factor. Overall, keep your meta tags short and sweet.
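"Short and sweet" in practice might look like this (the content values are hypothetical):

```html
<meta name="keywords" content="flower delivery, chicago florist, wedding bouquets">
<meta name="description" content="Same-day flower delivery in Chicago from Smith Florists.">
```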

5. Invalid HTML - Writing code that is in compliance with the World Wide Web Consortium's standards means your site is easily read by search engine spiders, easier to maintain, and cross-browser compatible. This is especially important for websites that receive government funding, because the federal government requires compliance with the Americans With Disabilities Act, which in turn means your site has to be compatible with devices that allow the visually impaired to browse the Internet. Using code that is proprietary to Microsoft Internet Explorer is not going to cut it for such sites, so you need to know what is and isn't in the current W3C standards. You can check your webpages' compliance with the W3C's HTML standards using the W3C's newly-improved HTML Validator. Not only will your pages get ranked properly, but you won't have to worry that some users won't see your pages the way you intended just because they don't use the same browser that you do. Don't forget that not everyone surfs the Web with a computer. Cell phones, PDAs, and other devices are Web-enabled now and they don't rely on Microsoft Internet Explorer to display webpages.

6. Dead links/Bad links - Be sure all of the links on your pages are valid. Do not link to link farms, web rings, or other schemes designed to fool the search engines. Also watch for server problems. Inaccessible pages are liable to be removed from the search engine index. Avoid redirects that rely on JavaScript or meta refresh tags. These can be considered "doorway" pages, which violate nearly all search engine guidelines. There are several automated link-checking tools available online. The W3C offers a good FREE link checker. Links that include User ID numbers, session ID numbers, and long query strings (i.e. the text following the "?") in the URL are ignored by Google and may cause problems in all search engines.
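A rough sketch of one piece of such an automated check, in Python: extract every link from a page and flag URLs that carry a query string, the kind this post warns about. The sample page and URLs are made up for illustration.

```python
# Sketch of a link audit: collect hrefs from a page and flag
# URLs with a query string (text after "?"), which search
# engines may ignore or mishandle. Sample HTML is hypothetical.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # <a href="..."> is the only tag we care about here
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def audit(html):
    collector = LinkCollector()
    collector.feed(html)
    # Pair each URL with a flag: does it carry a query string?
    return [(url, "?" in url) for url in collector.links]

page = '<a href="/about.html">About</a> <a href="/cart?sessionid=123">Cart</a>'
print(audit(page))  # [('/about.html', False), ('/cart?sessionid=123', True)]
```

A real checker would also fetch each URL and verify the HTTP status; this sketch covers only the parsing and flagging step.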

7. Poor spelling and grammar - Misspell your keywords and you're negating your search engine optimization efforts. But if you misspell common words, people may lose respect for you and your company. For example, the contraction for the two words "you" and "are" is "you're," not "your." And there is no such word as "alot." You can allot blame for this between parents and schools, but it takes a lot of hubris not to thoroughly proofread your writing. I went to Catholic grade school and the nuns had us diagramming sentences for 5 of those 8 long years. I hated it then, but I'm very grateful now because reasonably good grammar comes to me naturally as I write. The sole exception is to include some common variations of your keywords, if not outright misspellings, in the text of your interior pages in order to snare the search engine traffic from user errors. If you employ this strategy, it's a good idea to make it clear that your usage of the alternate spelling is intentional.

Tuesday, March 14, 2006

Data providers

Thursday, March 09, 2006

Dmoz Directory


The ODP (Open Directory Project) is the most popular human-edited web directory. Unlike a search engine, which answers search queries, a directory's main purpose is to list and categorize websites. Human editors review all submissions and place each website in the appropriate category. The ODP is commonly called the Dmoz directory. Dmoz is hosted and administered by Netscape Communications Corporation. The ODP supplies data to many of the most popular search services on the web, such as AOL Search, Netscape Search, HotBot, Lycos, Direct Hit, and others.
The ODP is 100% free for both submitting sites and using its data. Anyone can easily download and use ODP data without any cost, and there is no fee associated with listing or submitting sites.


  • A search engine sends out a spider, a program that follows links and captures text from web pages.
  • In the case of a directory, human editors visit the sites and list them in the appropriate category. The main purpose of the ODP is to list and categorize websites.
  • A directory does not rank, promote, or optimize sites for search engines; it is simply a data provider.

How to submit a website to Dmoz?

  • Go to http://dmoz.org/
  • Check whether your site is already listed.
  • Select the appropriate category.
  • Click "suggest URL".
  • Submit the details: Title, Description, URL, and E-mail.

Saturday, March 04, 2006

Welcome to Google Office..

Google logo beside the lift.

Googleplex..... lobby.

New Googleplex in London

Big G in Google Office
Multicolor Google gadget pen.

Google Grown as Mountain
I am also using Google Search Engine

Thursday, March 02, 2006

Always Busy

Busy on searching

Google Search: more, more, more....!!!

Alerts: Receive news and search results via email
Answers: Ask a question, set a price, get an answer
Blog Search: Find blogs on your favorite topics
Book Search: Search the full text of books
Catalogs: Search and browse mail-order catalogs
Directory: Browse the web by topic
Froogle: Shop smarter with Google
Groups: Create mailing lists and discussion groups
Labs: Try out new Google products
News: Search thousands of news stories
Video: Search TV programs and videos
Blogger: Express yourself online
Code: Download APIs and open source code
Desktop: Info when you want it, right on your desktop
Earth: Explore the world from your PC
Pack: A free collection of essential software
Picasa: Find, edit and share your photos
Talk: IM and call your friends through your computer
Translate: View web pages in other languages

More at http://www.google.com/intl/en/options/

Dotcom-SEO: Internet Marketing and SEO

Google Adsense

Time limit for AdSense referrals earnings is now 180 days
You may have noticed today that the time limit for AdSense referrals is now 180 days. Based on the feedback we heard, we agreed that 90 days may not provide enough time for your referred publishers to complete earning $100. Therefore, we decided to double the window. This change is retroactive, so it will also apply to AdSense signups that occurred more than 90 days but less than 180 days ago.