Wired News: Single Post Wins Google Contest: "Dash became the overall winner by virtue of a single post on his blog in which he asked his readers to link from their own sites. Countless numbers did and -- after his very late entry assumed the top Google spot shortly after the competition's first round ended -- Dash was never seriously challenged.
'The impetus (was) for me, on a fluke, to say, "Hey, I don't want these guys who do link spamming to win the contest,"' said Dash"
The contest, sponsored by Australian affiliate marketing company DarkBlue and SearchGuild.com, was designed to spotlight the various techniques used by search engine optimizers to boost sites' Google results. Those results are often a key factor in businesses' fortunes.
But Google is known for frequently changing its ranking methods, largely in an attempt to foil people who seek to manipulate its system with new tricks to improve their placement. Some charge that, partly as a result of these repeated adjustments to its ranking methodology, Google doesn't always return the most relevant results.
Nevertheless, Chris Ridings, the competition's organizer, told Wired News at the time of the first-round judging that it's easier for people to claim they can outfox Google than to actually demonstrate their prowess. The competition, he said, provides a stage for search optimization experts to show how strong particular ranking techniques really are.
To Dash, then, winning the flat-screen television awarded to the second-round victor was testament to the power of good content and a longstanding online presence.
"A lot of people are trying to increase their page rank unethically," said Dash. "I think if we show them (that) the best thing you can do is to write really good material, then hopefully, they'll spend their time doing that (instead of) spending time coming up with ways to graffiti other people's pages."
Furthermore, Dash maintained, his victory proves one thing: that the Web is a meritocracy.
"A page that's read by people instead of robots is going to do better," he said.
Friday, July 09, 2004
Google trademarks targeted by kids' site Googles - Best Practices Search Engine Forums
Google trademarks targeted by kids' site Googles - Best Practices Search Engine Forums: "Apparently there's a children's site called Googles.com that owns the trademark for the word Google, and there's a suit brewing over it.
SAN FRANCISCO (Reuters) - Stelor Productions, the operator of a children's Web site called Googles.com, said on Wednesday it has launched trademark proceedings against Web search leader Google Inc., which is preparing a much-awaited initial public offering.
Stelor, based in Darnestown, Maryland, said it filed two separate actions against Google with the U.S. Patent and Trademark Office.
Stelor, which charges that Google's mark is confusingly similar to its own, is opposing Google's trademark application to cover a 'long list of 'Google' goods and services, including children's books, stickers and children's clothing.'
http://www.reuters.co.uk/newsChanne...pe=internetNews"
Thursday, July 08, 2004
Google Filters and Avoiding their Screens
Google Filters and Avoiding their Screens:
"Possible Link Related Filters"
Identical link text
Reciprocal links dampened
Links pages / guest books filtered
Interlinking on same IP filtered: Many observers have noticed a cross-linking filter triggered by linking together too many sites hosted on the same server, and especially within the same C-block. The C-block is the third section of an IP address. Example: in 123.123.xxx.123, the xxx is the C-block. There is some belief that the threshold trigger for that filter is about 20 linked sites (a quick C-block check is sketched just after this list).
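As a side note, the C-block comparison described above is easy to express in code. A minimal TypeScript sketch (my own illustration; the function name is made up, and nothing here comes from Google):

```typescript
// Do two dotted-quad IP addresses share a C-block (same first three octets)?
// Illustration of the C-block concept described above.
function sameCBlock(ipA: string, ipB: string): boolean {
  const prefixA = ipA.split(".").slice(0, 3).join(".");
  const prefixB = ipB.split(".").slice(0, 3).join(".");
  return prefixA === prefixB;
}

console.log(sameCBlock("123.123.45.7", "123.123.45.200")); // true: same C-block
console.log(sameCBlock("123.123.45.7", "123.124.45.7"));   // false: third octet differs
```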
Duplicate content may trigger a filter, and sites that contain a large number of pages with similar content might also be targeted.
The sandbox filter allegedly works this way: a new site receives a fresh-site bonus from Google and ranks highly. Following that initial brush with fame and fortune, the site drops in the search rankings, and drops, and then drops some more. That is where the alleged sandbox begins.
Once in the sandbox, the site will take anywhere from two to four months to rise back to a respectable position in the SERPs. During that sandbox period, regardless of the number and quality of its inbound links and its PageRank, the site will not rank well at all. It is thought that gaining too many links too quickly might be part of the reason for the sandbox. On the other hand, building up a domain with incoming links prior to site launch may help avoid the sandbox entirely.
Wednesday, July 07, 2004
Google Bans Traffic Power and its Clients
Google Bans Traffic Power and its Clients: "GoogleGuy recently broke his silence to confirm that Google has taken action against an SEO firm and its clients for spammy techniques. The SEO company convinced some of its clients to use JavaScript redirects and to place hidden links to doorway pages created by the firm. GoogleGuy explains:
I believe that one SEO had convinced clients either to put spammy Javascript mouseover redirects, doorway pages that link to other sites, or both on their clients' sites. That can lead to clients' sites being flagged as spam in addition to the doorway domains that the SEO set up.
GoogleGuy later reassured webmasters that those who use a JavaScript mouseover to place text in the status bar do not need to worry about being banned."
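To make the two behaviours concrete, here is a small TypeScript sketch of both patterns (assembled from the descriptions above; the URL and link text are hypothetical):

```typescript
// The spammy pattern: a keyword-stuffed doorway page that redirects to the
// real site the moment the visitor moves the mouse.
document.body.onmouseover = () => {
  window.location.href = "http://real-homepage.example/"; // hypothetical URL
};

// The benign pattern GoogleGuy said is safe: a mouseover that merely writes
// descriptive text to the browser's status bar.
document.links[0].onmouseover = () => {
  window.status = "A friendly description of the link"; // hypothetical text
  return true; // older browsers need a true return for window.status to stick
};
```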
SearchEngineWatch Forums - Alt Attributes Appearing as Anchor Text in Text-only Cache
SearchEngineWatch Forums - Alt Attributes Appearing as Anchor Text in Text-only Cache: "Regarding the internal links (i.e. company logo that links back to the same page with same alt text), I have a solution that works well on dynamic sites.
On some of my dynamic sites, when I do an allinurl:www.domain.com site:www.domain.com, most of the results have that 'In order to show you the most relevant results, we have omitted some entries very similar to the # already displayed. If you like, you can repeat the search with the omitted results included.' message.
The reason some of my sites had this was that the content closest to the top of the page was the alt text of the logo that linked back to the homepage. All the pages had the same header with the same text at the top.
What I did was change the alternative text to read the same as the page title for each unique page. Now the alt text at the top is different on every page, and the 'omitted results' messages are gone.
Also, I do not have the same alternative text with a link back to the same page on every page. Not that I feel having the same text linking to the same page over and over again is a bad thing; I only did this because I hated getting the 'omitted results' message."
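The fix the poster describes is easy to picture as a small template helper. A TypeScript sketch (my own illustration; the function name, image path, and title are hypothetical):

```typescript
// Give the logo's alt text the same value as the page title, so the text
// nearest the top of each page is unique rather than identical site-wide.
function logoHtml(pageTitle: string): string {
  return `<a href="/"><img src="/images/logo.gif" alt="${pageTitle}"></a>`;
}

console.log(logoHtml("Blue Widgets - Acme Store"));
// <a href="/"><img src="/images/logo.gif" alt="Blue Widgets - Acme Store"></a>
```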
SearchEngineWatch Forums - Alt Attributes Appearing as Anchor Text in Text-only Cache
SearchEngineWatch Forums - Alt Attributes Appearing as Anchor Text in Text-only Cache: "Some believe that all identical anchor text is fine, while others believe that if the identical phrase goes over a certain percentage of total links it can trip a penalty or filter. I'm one of those who believes it can happen, even just within the site itself without regard to inbound links, though inbounds could possibly make it even worse if there's a problem. I've had it happen to me; that was one of the factors I identified as causing problems with the site.
Then there's on-page keyword density (KWD) and the number of phrase occurrences. While for Yahoo a certain ideal density and number of occurrences is fine, some believe that for Google it's necessary to whittle down the number of occurrences of an identical phrase on a page to avoid problems."
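Keyword density itself is just arithmetic: the share of a page's words accounted for by a repeated phrase. A quick TypeScript sketch of the measurement being discussed (illustration only; no particular threshold is implied by the thread):

```typescript
// Fraction of the page's words that belong to occurrences of the phrase.
function keywordDensity(text: string, phrase: string): number {
  const words = text.toLowerCase().split(/\s+/).filter(w => w.length > 0);
  const target = phrase.toLowerCase().split(/\s+/);
  let hits = 0;
  for (let i = 0; i + target.length <= words.length; i++) {
    if (target.every((w, k) => words[i + k] === w)) hits++;
  }
  return (hits * target.length) / words.length;
}

// "cheap widgets" appears twice in seven words: 4/7 ≈ 0.57
console.log(keywordDensity("cheap widgets and more cheap widgets here", "cheap widgets"));
```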
Google Pulls Plug on "onmouseover" Pages
Google Pulls Plug on "onmouseover" Pages: "June 23, 2004 (utc 0)
Along with today's PR update, I've also noticed that several sites that used the increasingly popular tactic of creating keyword-stuffed entry pages that forward to the true home page via the onmouseover JavaScript command are no longer found at all in the index.
This tactic has been discussed here for a number of months now. Nice to see that these sites are no longer appearing in the SERPs.
The sites that were doing this seem to have been completely booted from the index, too. Toolbar shows a gray PR. "
Monday, July 05, 2004
Google Cracks Down
Google Cracks Down: "According to threads taking place in several popular search engine marketing forums, it seems that Google has adjusted their algorithms to catch a batch of sites using a JavaScript technique known as 'onmouseover.' "
Google Alters AdWords; Sharpens Relevancy Matching - The Search Engine Marketing Weblog - sem.weblogsinc.com
June 30th 2004 Google Alters AdWords; Sharpens Relevancy Matching - The Search Engine Marketing Weblog - sem.weblogsinc.com: "Google sent an e-mail to AdWords advertisers today announcing an algorithmic change to the matching of ads and keywords. For now, the change affects only broad-matched keywords associated with AdWords ads displayed on Google.com, but Google promises the improved relevance will ultimately affect other matching options and other portions of the advertising network... it is clear that broad matching will no longer be as broad. Although Google expresses a somewhat defensive tone in the e-mail and related FAQ, the truth is that this change should be good news for advertisers, especially those who rely on broad matching to eliminate the work of fine-tuning keywords. Here is the example Google uses:"
"For example, an advertiser specializing in Alaskan cruises may have selected cruises (broad-matched) for their campaign. Previously, this keyword may have been disabled due to poor performance on more popular queries such as Hawaiian cruises. Instead of disabling all broad match variations of cruises, we will now show this ad for specific query variations that are more relevant to the ad, such as Alaskan cruises."
The big question is HOW—how will Google determine niched relevancy in an ad? Is the change really algorithmic as I stated above, or will human resources be used to evaluate ads? (The second possibility is far less likely than the first.)
One implication seems clear: Google is de-emphasizing broad matching in favor of exact matching.
If the keywords are matched more intelligently to start with, the chances of poor performance are lessened. Is it possible that Google is using this after-the-fact determination to refine the matching relevancy? Nobody outside Google knows, but some posters at WebmasterWorld are already seeing changes in ad positioning.
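To see the distinction the e-mail is drawing, here is a toy TypeScript contrast of exact versus broad matching (purely my own illustration; Google's actual matching logic is not public):

```typescript
// Toy versions of the two matching modes discussed above.
function exactMatch(keyword: string, query: string): boolean {
  return keyword.toLowerCase() === query.toLowerCase();
}

function broadMatch(keyword: string, query: string): boolean {
  // Broad: every keyword term appears somewhere in the query, in any order.
  const queryTerms = new Set(query.toLowerCase().split(/\s+/));
  return keyword.toLowerCase().split(/\s+/).every(t => queryTerms.has(t));
}

console.log(broadMatch("cruises", "alaskan cruises")); // true: the ad is eligible
console.log(exactMatch("cruises", "alaskan cruises")); // false: exact match misses
```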
Includes comments by "adwords advisor"
AdWords Announce Improved Ad Relevancy: "Power posting enables you to put in a unique max CPC and URL for a bunch of different keywords tied to one ad. It helps to make things a little faster on the setup, but if you want to have unique ads for each keyword you still have to type all that in."
Wednesday, June 16, 2004
Google invests in Chinese search | CNET News.com
Google invests in Chinese search | CNET News.com: "Search giant Google and venture capital firm Draper Fisher Jurvetson ePlanet Ventures are among the new investors in Chinese search engine Baidu.com."
Google added to its presence in the Chinese search market earlier this year with its AdWords advertising product in two Chinese dialects. China has a thriving Internet market with 80 million Internet users, a figure that is projected to jump to 100 million by the end of this year, according to the China Internet Network Information Center.
Beijing-based Baidu, which launched its search service in 2001, is the most frequently used search engine in China, the company said. It offers a Chinese Web page index of more than 300 million pages, in addition to paid search services.
The Sandbox Effect: Not a Nice Place to Play
The Sandbox Effect: Not a Nice Place to Play: "Named for the phenomenon of new websites being held back, in the search engine results pages (SERPs) by leading search engine Google, the Sandbox Effect has many website owners upset. When a new website is indexed in Google, it often receives what many observers consider to be a new site bonus. The brand new site will rocket to the top of the SERPs charts for a brief, shining moment. From there, it's all downhill."
Many observers believe the purpose of the Sandbox filter is to discourage unscrupulous webmasters from using practices that are against Google's Webmaster Guidelines. Among the techniques Google appears to be disrupting are the use of throwaway spam sites to build early traffic and the purchase of expired domain names to get a jump start from pre-existing Google PageRank.
One technique to avoid spending too long in the sand is to purchase and register a domain name and park it. While the domain sits parked, some of the sand runs through the Google hourglass before your site is even ready for launch, and you can use that period to prepare content for when the site goes public.
Conclusion
Google has placed a dampening filter in its search algorithm, which holds back new sites for three or four months, following the initial fresh site bonus.
The so-called Sandbox Effect places new websites into a brief moratorium, where they will not rank well, if they appear at all, in the SERPs.
Because the Sandbox appears to affect every new website, the onus is on webmasters to plan for it. To counter the Sandbox, website owners are encouraged to choose their launch date carefully to limit the damage.
During the sojourn in the sand, a tremendous opportunity exists to add content and incoming links to your website. That time period affords you the chance to find additional links, optimize your site, and submit to the many Internet directories.
With a little planning, and using your time in Sandbox hiatus wisely, your site can burst forth from the filter to the top of the rankings for your keywords.
NB UK SITE STARTED SHOWING PR 28 April 2004
Tuesday, June 15, 2004
AdSense income... Techno Files: How Google Took the Work Out of Selling Advertising
The New York Times > Business > Your Money > Techno Files: How Google Took the Work Out of Selling Advertising: "The highest returns seem to go to sites devoted to very specific tasks - like SeatGuru.com, which rates best and worst seats on airplanes. Though he was not specific, Matthew Daimler, 26, the founder of SeatGuru, said the program brought him thousands of dollars a month. This new revenue, he said, let him 'change from a hobby site to a business, and the best part is that I don't have to have an ad sales force but am still exposed to hundreds of thousands of advertisers.'"
Friday, June 11, 2004
Froogle Search: manchester airport flights airlines guide
Froogle Search:
From: 'searchenginejournal@yahoogroups.com' "I don't know if it helps the search engine visibility, but it can't hurt. I know that when someone does a search, Froogle will only display 1 listing from each vendor in the results, but customers can see more listings for a certain vendor by clicking a 'more listings' link under each image. My company has been in Froogle almost since its inception, and the only challenge we've experienced is that no one really has any control over WHICH products are displayed for a particular search."
Slashdot | Google Finally Moves Toward RSS Standard
Slashdot | Google Finally Moves Toward RSS Standard:
Posted by CmdrTaco on Thursday June 10, @05:19PM
from the i-miss-ultramode dept.
declan writes: 'My News.com colleague Evan Hansen just got his hands on an internal email thread revealing that Google is planning to embrace RSS. Evan's co-authored News.com article quotes from the email (sent to Sergey Brin, Larry Page, and Eric Schmidt) confirming that Google is rethinking only supporting Atom.'
Thursday, June 10, 2004
China & Google purchases
Instant Position Search Engine Marketing Forums: "Google has been trying to go strong in China for some time, with lots of problems.
Now they are going for a different strategy:
'Now, Google is making an investment in one of China's largest search engines, Baidu, which means "hundred times." Google won't comment, but the investment was confirmed by U.S. investor Tim Draper of Menlo Park's Draper Fisher Jurvetson, which invested $10 million in Baidu. The move might give Google a way to secure its position in China and gain leverage over its chief competitor there.'
Full story: http://www.miami.com/mld/miamiherald/business/technology/8867532.htm?1c "
Wednesday, June 09, 2004
Yahoo tests new home page | CNET News.com
Yahoo tests new home page | CNET News.com: "Yahoo has begun quietly testing a makeover of its home page, as evidenced by trial mock-ups seen by some Web users over the past few weeks when they accessed the portal... Yahoo's home page changes are few and far between. The company has long remained militant about maintaining the site's speedy load time and visual style, resisting a heavy use of graphics and animation. Before its 2002 makeover, Yahoo's home page had maintained its basic framework since 1995. "
Yahoo email: The e-mail revamp, slated to launch this summer, will include a boost in storage to 100MB for free users and "virtually unlimited" storage for its paid customers. The move is meant to counter archrival Google, which plans to launch a free e-mail service called Gmail that offers 1GB of storage.
BBC NEWS | Technology | Inside the Google search machine
BBC NEWS | Technology | Inside the Google search machine: "Google used to update its web index every month which, because it caused results to jump around a little, was dubbed the Google Dance. But not anymore, says Mr Cutts.
'Within the last year we have improved our way of processing and indexing the web,' he says. 'You are not going to see Google dances.' 'Now we crawl a percentage of the web every day,' he says, 'so after a relatively small time frame we hit every page.' "
Thursday, June 03, 2004
Google Gains Overall, Competition Builds Niches
Google Gains Overall, Competition Builds Niches: "Hitwise placed Google at the top for search in April 2004, with Yahoo! as the number one portal. Google registered more than a 21 percent gain in market share from August 2003 to April 2004, while Yahoo!'s search engine lost nearly 1 full percentage point, and MSN's search function registered a nearly 9 percent decrease in market share during the same period."
Just as an iProspect study revealed that searchers viewed results differently across the top search engines, Hitwise also found differentiators among the top three search applications. MSN Search had the highest percentage of U.S. visits from the sites Hitwise categorized as Shopping and Classifieds; Business and Finance; and Travel, while Yahoo! Search and Google received more U.S. visits from the sites in the Education; News and Media; and Entertainment categories.
Women found paid ads to be more relevant than men did when searching across Google, Yahoo!, MSN, and AOL. College graduates and Internet veterans found organic results to be more relevant than their non-graduate and Internet-novice counterparts.
Demographically, Hitwise found that Google was the preferred search tool for males, while MSN Search appealed to females. Yahoo! was the more popular engine for 18-to-34-year-old searchers, and MSN Search captured the over-55 crowd.
Google primer - if anyone out there hasn't a clue how it works still
Google: World's Best Search Engine?:
Evaluating Results
Google's web-page-ranking system, PageRank, tends to give priority to better-respected and trusted information (a toy sketch of the idea appears after the checklist below). Well-respected sites link to other well-respected sites. This linking boosts the PageRank of high-quality sites. Consequently, more accurate pages are typically listed before sites that include unreliable and erroneous material. Nevertheless, evaluate carefully whatever you find on the web, since anyone can:
Create pages
Exchange ideas
Copy, falsify, or omit information intentionally or accidentally
Many people publish pages to get you to buy something or accept a point of view. Google makes no effort to discover or eliminate unreliable and erroneous material. It's up to you to cultivate the habit of healthy skepticism. When evaluating the credibility of a page, consider the following AAOCC (Authority, Accuracy, Objectivity, Currency, and Coverage) criteria and questions, which are adapted from www.lib.berkeley.edu/ENGI/eval-criteria1001.html.
Authority
Who are the authors? Are they qualified? Are they credible?
With whom are they affiliated? Do their affiliations affect their credibility?
Who is the publisher? What is the publisher's reputation?
Accuracy
Is the information accurate? Is it reliable and error-free?
Are the interpretations and implications reasonable?
Is there evidence to support conclusions? Is the evidence verifiable?
Do the authors properly list their sources, references or citations with dates, page numbers or web addresses, etc.?
Objectivity
What is the purpose? What do the authors want to accomplish?
Does this purpose affect the presentation?
Is there an implicit or explicit bias?
Is the information fact, opinion, spoof, or satirical?
Currency
Is the information current? Is it still valid?
When was the site last updated?
Is the site well maintained? Are there any broken links?
Coverage
Is the information relevant to your topic and assignment?
What is the intended audience?
Is the material presented at an appropriate level?
Is the information complete? Is it unique?
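As promised above, here is a toy power-iteration sketch of the PageRank idea in TypeScript (an illustration of the published algorithm, not Google's code; the 0.85 damping factor is the value from the original PageRank paper):

```typescript
// Toy PageRank: links[i] lists the pages that page i links to.
function pageRank(links: number[][], damping = 0.85, iterations = 50): number[] {
  const n = links.length;
  let rank: number[] = new Array(n).fill(1 / n);
  for (let it = 0; it < iterations; it++) {
    const next: number[] = new Array(n).fill((1 - damping) / n);
    for (let i = 0; i < n; i++) {
      if (links[i].length === 0) {
        // Dangling page: spread its rank evenly across all pages.
        for (let j = 0; j < n; j++) next[j] += (damping * rank[i]) / n;
      } else {
        for (const j of links[i]) next[j] += (damping * rank[i]) / links[i].length;
      }
    }
    rank = next;
  }
  return rank;
}

// Three pages: 0 and 1 both link to 2; 2 links back to 0.
// Page 2, with two inbound links, ends up with the highest rank.
console.log(pageRank([[2], [2], [0]]));
```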
Saturday, May 29, 2004
GMAIL & PRIVACY LEGISLATION
Wired News: Tightening the Reins on Gmail: "California's Senate voted on Thursday to support a bill to limit a new e-mail service by No. 1 Web search company, Google, over concerns it could threaten the privacy of users. "