Search engines prefer pages with at least 250 characters.
Use the keywords a user will use to find your website or web pages, not the keywords you want your content to be found on. Use words on your page that resonate with your target audience and are descriptive of your site. For example, consider using the search term "car sales" rather than "vehicle auctions". Get inside the mind of the user.
Map keywords and phrases to their implicit intent. How are the keywords related to how the visitor is trying to solve their problem? How do the keywords and phrases relate to what stage visitors are in their seeking process? What would the visitor consider a success based upon the keyword? Use these keywords to plan internal hyperlinks that provide the most relevant and persuasive content.
On your home page, select phrases that describe the general theme of the site, but don't try to cover everything on your home page.
For your site's internal pages, identify the most important subject of that page and pick words that are specific to that subject.
Hint: Looking for a key phrase you want to target? Search for it in Google, MSN, and Yahoo! and see what results come up. If the results aren't relevant to your product or service, it's not a key phrase you want to target, no matter how many hits it gets!
How to Determine the Best Keywords?
Look at Your Log Files. Log files can be a good way to determine what your audience is searching for and what keywords you should use to draw users to your content.
Capture the "exact phrase" that searchers entered.
Give insight into the number of words searchers are using.
Provide a rich source of keyword data. Remember, log files only show the keywords that have brought users to your site. They don't show you the unsuccessful keywords that didn't bring users to your site.
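As a rough sketch of this kind of log mining, the snippet below pulls the "exact phrase" out of search-engine referrer URLs and tallies phrase frequency and word counts. It's a minimal illustration, assuming referrer strings of the kind found in a web server log and the historical query parameters ("q" for Google and MSN, "p" for Yahoo!); the sample URLs are hypothetical.

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Hypothetical referrer URLs as they might appear in an access log;
# in practice these would be parsed out of the log file itself.
SAMPLE_REFERRERS = [
    "http://www.google.com/search?q=passport+renewal+forms",
    "http://search.yahoo.com/search?p=passport+renewal",
    "http://www.google.com/search?q=passport+renewal+forms",
]

# Query-string parameter each engine historically used for the search phrase.
QUERY_PARAMS = {"google": "q", "yahoo": "p", "msn": "q"}

def extract_phrase(referrer):
    """Return the exact search phrase from a search-engine referrer URL, or None."""
    parsed = urlparse(referrer)
    for engine, param in QUERY_PARAMS.items():
        if engine in parsed.netloc:
            values = parse_qs(parsed.query).get(param)
            if values:
                return values[0].lower()
    return None  # not a recognized search-engine referrer

def keyword_report(referrers):
    """Count exact phrases, and how many words searchers are using."""
    phrases = Counter(filter(None, (extract_phrase(r) for r in referrers)))
    word_counts = Counter(len(p.split()) for p in phrases.elements())
    return phrases, word_counts

phrases, word_counts = keyword_report(SAMPLE_REFERRERS)
print(phrases.most_common(5))   # most frequent exact phrases
print(word_counts)              # distribution of words per search
```

A report like this answers the three questions above (exact phrase, word count, keyword data), but as the text notes, it says nothing about the phrases that failed to bring anyone to your site.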
Target Variations of Your Keywords. Be sure you target your content to address the following:
Slang, acronyms, and abbreviations
Evaluate Your Keywords: Keywords may have multiple meanings. "Accessibility" might mean "handicapped access" or "website monitoring." "Chips" may refer to the snack food or computer part. Pay attention to keywords used alone or combined with another word. For example, "passport" by itself is likely a search for information; "passport" searched with a location is likely a search for services. Keep this in mind when choosing keywords for a page.
Where to Put Your Keywords
Major search engines emphasize text found in:
Text in H1 and H2 tags
In Paragraphs and General Text on The Site
In STRONG tags: <strong>keyword</strong>
In the File Names of The Web Document: www.domain.com/keyword.html
ALT Description Attributes on Image Tags
Text Placed in and Around Hypertext Links
TITLE Attributes on Anchor Tags
SUMMARY Attributes on Tables
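Putting the list above together, a hypothetical page about passport renewal might place its key phrase in each of these locations. The file name, title, and link targets here are invented for illustration:

```html
<html>
<head>
  <!-- Keywords in the TITLE tag -->
  <title>Passport Renewal Forms and Instructions</title>
</head>
<body>
  <!-- Keywords in H1/H2 headings -->
  <h1>Passport Renewal</h1>
  <!-- Keywords in paragraphs, STRONG tags, and in and around link text -->
  <p>Download the <strong>passport renewal</strong> form, or read our
     <a href="http://www.domain.com/passport-renewal.html"
        title="Passport renewal instructions">passport renewal instructions</a>.</p>
  <!-- Keywords in ALT attributes on images and SUMMARY attributes on tables -->
  <img src="passport-form.jpg" alt="Sample passport renewal form">
  <table summary="Passport renewal fees by application type">
    <!-- table content -->
  </table>
</body>
</html>
```

Note that the file name itself (passport-renewal.html) carries the key phrase, matching the "File Names of The Web Document" item above.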
Most commercial search engines don't use meta-tag content to determine relevancy, but meta-tag content can be very useful in your own site's search (enterprise search).
Optional and automatically excluded words
Google usually returns pages that use all of the words you search for. However, if some of the words in your search don't appear on the best pages we find, we'll also consider pages that don't include them. For example, when you search for recipe for a cheese souffle, we might return recipes that don't happen to include the words "for" or "a".
Very common words (often called "stopwords"), such as "the", "and", or "of", are usually dropped from searches because they typically don't convey much information compared to the other words in a search. We might also treat words as optional if they're redundant given the other words in your search. For example, in UV sun protective swimwear, requiring "UV" to appear might exclude high quality pages, so we may exclude "UV" in compiling your results.
Even when words are treated as optional, they're still taken into account in assessing how relevant a page is to your query. For example, Google shows different results for University of Pennsylvania than we do for University in Pennsylvania.
Generally, excluding common words allows us to return better search results. If one of these words is important to your search, you can precede it with a plus sign "+" to ensure that Google requires it to appear in every search result. So, for example, a search for +The Red Violin will return only results that include the word "the".
Alternate words
Google usually returns pages that use all of the words you included in your search. Sometimes, however, we'll consider other words as substitutes if we think that doing so will improve the results we show you. For example, if you search for dance marathons, Google's results might include pages that talk about a dance marathon. On the result pages, we'll highlight occurrences of both the original and alternate search terms that appear in titles and snippets.
There are several ways Google identifies alternate words: Stemming finds alternate forms of a word, such as singular or plural variations. Synonyms can help someone searching for UC Berkeley law school find pages that mention Boalt law school. Abbreviations expand search terms so that rc model airplanes might also find pages about radio control model airplanes. Words might be combined or split so that we return pages about organic dog food when you enter organic dogfood. Because it's often easier to type words without accents, a search for coup d'etat might return pages that talk about a coup d'état.
Usually, the alternate words we add to your results will help your search, but we understand that in some cases you want to restrict your search to precisely the terms you enter. In that case, you can precede a word with a plus sign "+" to tell Google you're looking for that exact term. So, for example, if you search for dance +marathons, we'll only return pages that contain the exact word "marathons".
Phrase searches
Sometimes you'll only want results that include an exact phrase. In this case, simply put quotation marks around your search terms: "The long and winding road". Phrase searches are particularly effective if you're searching for proper names ("George Washington"), lyrics ("the long and winding road"), or other famous phrases ("This was their finest hour").
Negative terms
If your search term has more than one meaning (bass, for example, could refer to fishing or music), you can focus your search by putting a minus sign ("-") in front of words related to the meaning you want to avoid. For example, here's how you'd find pages about bass fishing, but not bass music: bass -music
Note: when you include a negative term in your search, be sure to include a space before the minus sign.
Link Development Tips: Link development is essential to a successful SEO campaign. Its main disadvantage is that it takes time, whereas paid search advertising (in Google, Overture, FindWhat, etc.) produces nearly instantaneous results. Its main advantage is that it is dynamic, cumulative, and difficult to imitate. Many sites maintain search engine visibility and the resulting qualified traffic because of successful link development, not just the number of keywords on a page.
Some Link Development Tactics we Commonly Use: Choose quality over quantity. Link quality carries more weight than quantity. Spend time getting the highest quality links pointing to your site. One of search engine spammers' biggest sales pitches is, "Get millions of links to your site." Don't fall for that arcane, useless pitch.
Begin with Web Directories: Yahoo Directory and Business.com are two reliable places for high-quality links. Both require annual submission fees. If their links don't positively affect your site's link development, don't renew.
Harness Online Publicity: How-to tips, helpful articles, even useful press releases often garner links from other Web sites. Publicity is usually part of a company's overall marketing plan, so harnessing these resources for link development can be a simple task.
Use Blogs and Forums Wisely: Blogs and forums can call attention to useful information on your own site.
Use Search Engines to Research Link Development: Look at competitors' sites to determine their link development strategy. It can help you with your own. What newspapers and media outlets do they use for online publicity? What Web directories link to their sites? No link popularity checker ("link:domain.com" in Google, "linkdomain:domain.com" in Yahoo) can substitute for doing the research yourself.
The building blocks of organic free search engine optimization include:
1. Make good use of keywords. For users to find your web pages on your own site's search engine and in commercial search engines like Google, pages must contain keyword phrases that match the phrases your users type into search boxes.
2. Have effective site architecture. You can develop a good site architecture that will help users easily understand the structure of your site, navigate the content, and succeed using your search engine. A few, simple navigation and coding tips can help you do that.
3. Have a process for indexing your site (using robots.txt files).
4. Ensure quality links and link popularity. The last basic of successful search engine optimization is link popularity. That's the number and quality of links that point to your website. Link quality (links from popular, highly trafficked, or respected sites) carries far more weight than link quantity.
To enter a query, type in a few descriptive words and press Enter (or click the Search button) for a list of relevant web pages. Since Google only returns web pages that contain all the words in your query, refining or narrowing your search is as simple as adding more words to the search terms you have already entered. Your new query will return a smaller subset of the pages Google found for your original "too-broad" query.
Choosing Keywords
For best results, it's important to choose your keywords wisely. Keep these tips in mind:
Try the obvious first. If you're looking for information on Picasso, enter "Picasso" rather than "painters".
Use words likely to appear on a site with the information you want. "Luxury hotel dubuque" gets better results than "really nice places to spend the night in Dubuque".
Make keywords as specific as possible. "Antique lead soldiers" gets more relevant results than "old metal toys".
Automatic "and" Queries
By default, Google only returns pages that include all of your search terms. There is no need to include "and" between terms. Keep in mind that the order in which the terms are typed will affect the search results. To restrict a search further, just include more terms. For example, to plan a vacation to Hawaii, simply type: vacation hawaii
Automatic Exclusion of Common Words
Google ignores common words and characters such as "where" and "how", as well as certain single digits and single letters, because they tend to slow down your search without improving the results.
Google will indicate if a common word has been excluded by displaying details on the results page below the search box. If a common word is essential to getting the results you want, you can include it by putting a "+" sign in front of it. (Be sure to include a space before the "+" sign.) Another method for doing this is conducting a phrase search, which means putting quotation marks around two or more words. Common words in a phrase search (e.g., "where are you") are included in the search.
For example, to search for Star Wars, Episode I, use: Star Wars Episode +I or "Star Wars Episode I".
Capitalization
Google searches are NOT case sensitive. All letters, regardless of how you type them, will be understood as lower case. For example, searches for "george washington", "George Washington", and "gEoRgE wAsHiNgToN" will all return the same results.
Word Variations (Stemming)
To provide the most accurate results, Google does not use "stemming" or support "wildcard" searches. In other words, Google searches for exactly the words that you enter in the search box. Searching for "book" or "book*" will not yield "books" or "bookstore". If in doubt, try both forms: "airline" and "airlines," for instance.
The U.S. government is one of the largest producers of content on the Web. But the vast majority of people start their Web experience each day not on a government website but on a commercial search engine. How do we ensure that people are finding the government content they need via commercial search engines and when they search our sites?
Since the mid-1990s, Internet search has evolved from spiders that "crawl" pages, to algorithms relying on Meta tags, to the point where search engine optimization is now a leading factor in determining a website's success. We all know it's important, but many of us struggle to do it well.
Search engines are becoming more efficient each day, so government web professionals need to stay on top of this fast-moving technology. Although traditional SEO strategies still apply, to stay at the forefront of search engine technologies, we must be creative and innovative and build good SEO into our day-to-day web operations.
At this New Media Talk, you'll hear from a panel of some of the country's top search experts as they discuss the future of Search and Search Engine Optimization. These thought leaders will cover the latest search strategies, and give us their thoughts on how Government can become a leader in SEO. Join us, and bring your ideas and questions!
We kick off January with the launch of Picasa for Mac at Macworld.
The Vatican launches a YouTube channel, providing updates from the Pope and the Catholic Church.
Together with the New America Foundation's Open Technology Institute, the PlanetLab Consortium, and academic researchers, we announce Measurement Lab (M-Lab), an open platform that provides tools to test broadband connections.
The latest version of Google Earth makes a splash with Ocean, a new feature that provides a 3D look at the ocean floor and information about one of the world's greatest natural resources.
We introduce Google Latitude, a Google Maps for mobile feature and an iGoogle gadget that lets you share your location with friends and see the approximate location of people who have decided to share their location with you.
After adding Turkish, Thai, Hungarian, Estonian, Albanian, Maltese, and Galician, Google Translate is capable of automatic translation between 41 languages, covering 98% of the languages read by Internet users.
Our first message on Twitter gets back to binary: I'm 01100110 01100101 01100101 01101100 01101001 01101110 01100111 00100000 01101100 01110101 01100011 01101011 01111001 00001010. (Hint: it's a button on our homepage.)
We launch a beta test of interest-based advertising on partner sites and on YouTube. This kind of tailored advertising lets us show ads more closely related to what people are searching for, and it gives advertisers an efficient way to reach those who are most interested in their products or services.
We release Google Voice to existing GrandCentral users. The new application improves the way you use your phone, with features like voicemail transcription and the ability to archive and search all of your SMS text messages.
We celebrate our San Francisco office's Gold rating from the U.S. Green Building Council's LEED (Leadership in Energy and Environmental Design) Green Building Rating System. We see it as a sign that we're on track with our approach to building environmentally friendly offices.
The White House holds an online town hall to answer citizens' questions submitted on Google Moderator.
We launch new iGoogle backdrops inspired by video games, including classics like "Mario," "Zelda," and "Donkey Kong".
We announce Google Ventures: a venture capital fund aimed at using our resources to support innovation and encourage promising new technology companies.
Using our transliteration technology, we build and release a feature in Gmail that makes it easy to type messages in Indian languages like Hindi or Malayalam.
Google Suggest goes local with keyword suggestions for 51 languages in 155 domains.
Our April Fool's Day prank this year is CADIE, our "Cognitive Autoheuristic Distributed-Intelligence Entity" who spends the day taking over various Google products before self-destructing.
We announce an update to search which enables people to get localized results even if they don't include a location in their search query.
For India's 15th general election, we launch the Google India Elections Centre, where people can check to see if they're registered to vote, find their polling place, and read news and other information.
Over 90 musicians from around the world including a Spanish guitarist, a Dutch harpist and a Lithuanian birbyne player perform in the first-ever YouTube Symphony Orchestra at Carnegie Hall.
We rebuild and redesign Google Labs as well as release two new Labs: Similar Image search and Google News Timeline. Later in the month, we introduce Toolbar Labs.
We begin to show Google profile results at the bottom of U.S. search pages when people search for names, giving people more control over what others find about them when they search on Google.
We release 11 short films about Google Chrome made by Christoph Niemann, Motion Theory, Steve Mottershead, Go Robot, Open, Default Office, Hunter Gatherer, Lifelong Friendship Society, SuperFad, Jeff&Paul, and Pantograph.
To clear brush and reduce fire hazard in the fields near our Mountain View headquarters, we rent some goats from a local company. They help us trim the grass the low-carbon way!
At our second Searchology event, we introduce a few new search features, including the Search Options panel and rich snippets in search results.
We launch Sky Map for Android, which uses your Android phone to help you identify stars, constellations and planets.
Christin Engelberth, a sixth grader at Bernard Harris Middle School in San Antonio, Texas, wins the second U.S. Doodle 4 Google competition with her doodle "A new beginning".
At our second annual Google I/O developer conference in San Francisco, we preview Google Wave, a new communication and collaboration tool.
Redirection: Are you moving pages from one part of your site to another... or even to a different domain? Discover the cautious-yet-effective method of making the move, and the ways to ensure your users and search engines don't get lost in the shuffle.
Sitemaps: Learn about opportunities and limitations of sitemaps, an easy tool you can create to help search engines crawl your site more effectively. Find out what's truth vs. myth in how search engines find and index Web pages.
Search Engine Accessibility: Discover how to better help search engine bots and your users find, see, and understand what's on your site, and fine-tune how you facilitate your users' interactions and communications online.
Multimedia: Learn how to use Flash, video, and images on your site without totally baffling search engine bots and many of your users.
Descriptions and Annotations: Learn how title tags, meta descriptions, and image annotations can make a substantive difference in the quantity and quality of your site's traffic.
Site Security and Protection: There are some pages you do NOT want search engines indexing. Find out about different levels of protection, and also learn how to remove information that search engines have already indexed. Also, find out how to guard your site against unscrupulous Web e-mail spammers.
Site Clinics: We'll examine two or three government sites, highlighting best practices and also pitfalls to help you improve your own sites.
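On the Redirection topic above: the cautious-yet-effective method for moved pages is generally a server-side permanent (301) redirect, which carries both users and search engines to the new address. As a sketch, assuming an Apache server and hypothetical old and new paths:

```
# .htaccess - permanently (301) redirect a moved page to its new location,
# so visitors and search engine crawlers both land on the new URL
Redirect permanent /old-page.html http://www.newdomain.gov/new-page.html
```

A permanent redirect signals search engines to update their index to the new URL, rather than treating the move as a temporary detour.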
What is It? Search engine robots check a special plain-text file, called robots.txt, in the root of each server before indexing a site. Robots.txt implements the Robots Exclusion Protocol, which allows you, as a web manager, to define which parts of your site are off-limits to search engine crawlers. For example, web managers can disallow access to Common Gateway Interface (CGI) scripts, or to private and temporary directories, because they do not want pages in those areas indexed.
Here is some general information about robots.txt files.
Robots.txt File
The robots exclusion standard, or robots.txt protocol, is a convention to prevent cooperating web spiders and other web robots from accessing all or part of a website. The parts that should not be accessed are listed in a file called robots.txt in the top-level directory of the website. Each record in the robots.txt file is made up of two fields: User-agent and Disallow. The User-agent field specifies which robots the record applies to, and the Disallow field specifies which directories those robots may not crawl. Keep in mind that robots.txt is a gentleman's agreement: well-behaved crawlers such as Googlebot honor it, but poorly behaved or malicious robots may ignore it entirely.
Example of a recommended robots.txt file blocking crawling of the cgi-bin, scripts, and images directories:
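The example file itself is missing from the text; a minimal robots.txt matching that description (standard syntax, directory names taken from the sentence above) would be:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /scripts/
Disallow: /images/
```

The "*" User-agent applies the record to all cooperating robots; each Disallow line names one directory they should not crawl.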
Why It's Important The USASearch.gov team has received several emails from federal webmasters informing us that their websites are not being crawled by USA.gov (formerly FirstGov.gov). We have found that most of these sites have disallowed searchbots from crawling their websites. In order for your content to be included in USASearch.gov, or any search engine, you must allow search engines to crawl your site. USASearch.gov uses the MSN Search index to provide its core results. At the very least, federal webmasters should allow MSNBot to crawl their sites so they can be included in search results for USA.gov, the official web portal of the U.S. government.
In addition, OMB's Memorandum, M-06-02, Improving Public Access to and Dissemination of Government Information and Using the Federal Enterprise Architecture Data Reference Model says: "when disseminating information to the public-at-large, publish your information directly to the Internet. This procedure exposes information to freely available and other search functions and adequately organizes and categorizes your information".
This memorandum assumes that your robots.txt file is allowing search engines to crawl your site. If you are disallowing search engine crawlers, you are not exposing information to search engines, and therefore not complying with this guidance.
Include the robots.txt file in your server's root directory. This is standard web management practice. Under the Robots Exclusion Protocol, crawlers request only the robots.txt file at the root of your server; robots.txt files placed in subdirectories are not part of the standard and simply cause confusion.
Search your server for stray robots.txt files and delete any robots.txt file below the root directory.
Meta-Tag Robots Exclusion:
Review your pages to make sure you are not using robots exclusion in your Meta tags if you intend for those pages to be publicly disseminated.
For those who are not familiar with meta tag robots exclusion, an HTML meta tag can be placed in a page's head to tell robots not to index that page or follow its links. Again, this is purely advisory, and also relies on the cooperation of the robot programs.
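For illustration, a robots exclusion meta tag placed in a page's head looks like this; "noindex" asks robots not to index the page, and "nofollow" asks them not to follow its links:

```html
<head>
  <!-- Advisory: tells cooperating robots not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

Unlike robots.txt, which applies to whole directories, the robots meta tag works page by page, which makes it useful for excluding individual documents from an otherwise crawlable site.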