Saturday, January 10, 2004

Cool tool: BuzzToolBox Blog

BuzzToolBox Blog: "Goofresh
Google offers a date-based syntax, but you can only access it via the advanced search, which limits your time options, or the daterange: syntax, which uses Julian dates and is a bit difficult to use. Goofresh is a way to search for sites added today, yesterday, within the last seven days, or last 30 days."

Friday, January 09, 2004

How to Cheat Google AdWords Select - Part 2

ADWORDS:
How to Cheat Google AdWords Select - Part 2: "if you're not using all the tools and tricks that AdWords provides, then you're definitely leaving money on the table. And, when your competitors see your results, they'll think you've somehow cheated the AdWords system.

This series of articles will systematically show you how to:

  • Generate the absolute best keyword list for your target market
  • Use simple techniques to vastly expand your productive keyword list
  • Create highly clickable copy for your ad
  • Precisely limit the distribution of your ad to only those prospects who are most interested
  • Beat the competition with creative bidding strategies
  • Dramatically reduce costs of your campaigns while increasing clickthroughs"

Wednesday, January 07, 2004

    Google Dance Case Studies - hugely useful

    Google Dance Case Studies: "By Danny Sullivan, Editor
    December 7, 2003
    While various webmaster forums have lots of discussion about the recent Google changes, specifics about particular situations are often lacking. It's much more useful to look at real life examples rather than deal with generic situations. So, as people have been contacting me, I've asked permission to use their stories to illustrate in real life how the changes at Google have made impacts. Here are some stories. You can read them all by scrolling through the article or use the links below to jump directly to sections of interest. Also be sure to see other articles about the recent changes on the Florida Google Dance Resources page."

Report on: Free Report On Google - Topic Sensitive PageRank? | SEO Research Labs

    To edit & annotate;
    Free Report On Google - Topic Sensitive PageRank? | SEO Research Labs: "How To Prosper With The New Google"

    Core of new algo:

    "Applied Semantics' CIRCA Technology is based on a language-independent, highly scalable ontology that consists of millions of words, their meanings, and their conceptual relationships to other meanings in the human language. The ontology, aided by sophisticated search technology, is the basis for a conceptual understanding of the multiplicity of word meanings, enabling computers to more effectively manage and retrieve information which results in improved knowledge discovery opportunities for searchers."

    What CIRCA allows Applied Semantics (and Google) to do, is to identify concepts related to specific words and phrases. They use this technology right now to serve up relevant advertising in a variety of contexts. Applied Semantics technology may also be involved in Google's
    keyword stemming system.

    Among other things, CIRCA can calculate how closely related or similar "phrase A" is to "concept B." If you search for "Colorado bicycle trips," CIRCA can relate that conceptually to a region (Colorado, which is in the Rocky Mountains), to concepts like bicycling and travel, etc. This is important, because it means that they can calculate the "distance" between your search query and various concepts in their database.
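CIRCA itself is proprietary, but the "distance between a query and a concept" idea can be illustrated with a toy concept map and a cosine-style score. Everything here is invented for illustration (the term-to-concept weights, the function names); the real ontology holds millions of meanings and relationships:

```python
from collections import Counter
from math import sqrt

# Toy concept map: each term maps to weighted related concepts.
# These associations are made up for illustration only.
CONCEPTS = {
    "colorado": {"rocky mountains": 1.0, "travel": 0.4},
    "bicycle":  {"bicycling": 1.0, "sport": 0.6, "travel": 0.3},
    "trips":    {"travel": 1.0, "vacation": 0.7},
}

def concept_vector(query):
    """Sum the concept weights for every known term in the query."""
    vec = Counter()
    for term in query.lower().split():
        for concept, weight in CONCEPTS.get(term, {}).items():
            vec[concept] += weight
    return vec

def similarity(query, concept):
    """Cosine-style score between a query's concept vector and one concept."""
    vec = concept_vector(query)
    norm = sqrt(sum(w * w for w in vec.values()))
    return vec[concept] / norm if norm else 0.0

# "Colorado bicycle trips" sits closer to "travel" than to "vacation",
# because all three terms contribute some travel weight.
print(similarity("Colorado bicycle trips", "travel"))
```

The point of the sketch: once queries and concepts live in the same vector space, "how far is this query from that concept" becomes plain arithmetic, which is what makes topic-sensitive ranking computable at scale.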

    Understanding The Changes, Ignoring The Noise: For some search queries, the results have been radically changed – in a few cases, the top 100
    listed pages have all dropped out. The folks at Google Watch have compiled a listing of affected search terms (see www.scroogle.org), and the amount of change in each, which has proven very valuable in conducting parts of our research.

    When we looked at hundreds of unbiased real world search queries and mapped out the amount that each has changed, there is a very clean distribution in terms of how much they have changed. In the real world, radical changes are the exception, not the rule.

Looking at "real estate," according to Scroogle.org's methodology, 77 of the top 100 pages dropped out of the top 100. Looking at the more specific "colorado real estate," 24 of the top 100 dropped out. You can see this pattern repeated over and over again. The more generic searches show more changes in the top results. Look at the pages that dropped out of the "real estate" top 100. You will see a whole lot of local realtors who managed to link their way (using PageRank and link text) into enviable positions, but not too many are really among the 100 most relevant pages for that query.

    The first page I see listed among the "missing" is titled "Southern California Real Estate." Interestingly enough, that page shows up at #2 for the more specific search "Southern California Real Estate." In other words, they haven't been penalized, they just don't show up where they
    don't belong any more.

Not so different for TT if you have followed the tips for links & content...
    Those sites that have a lot of content and lots of relevant links (both incoming and outbound) have done well. Those that have gotten by with doorway pages and link swaps are no longer quite so successful.



    Nothing really new in conclusions, just the usual warnings:

    If you have mapped out a lot of content in your site plan, the task of writing it all can be daunting. So much so, in fact, that some folks never start, and try to get by with cheap tricks like machine-generated "doorway pages." Don't fall for these "quick fixes." The risk exceeds the
    potential reward, and it's not that hard to develop content.


    So you've got a beautiful, useful, content-rich website, perfectly targeting your desired visitor with the search terms he/she is going to use. Every page is a shining example of optimized content, it's all linked together perfectly... you're done, right? Wrong! If you stop after step 3, you're going to be very disappointed. Search engines aren't terribly impressed with a website that nobody else has linked to.

    Above all: Don't Stop When You Hit The Top!
    A lot of folks make a very critical error, when they start seeing good search engine rankings and the nice increase in traffic that good rankings can bring. They get busy with their new visitors, then they stop. They stop working on content, they stop working on links, and they eventually
    stop seeing good rankings.

    Then, they start complaining about Google, or Inktomi, or whatever search engine dropped them first. They'll shout to the rafters that the search results have gone to heck in a hand basket, that the search engines are persecuting them, etc. etc. My advice to you is simple – don't stop when you hit the top. Keep forging new relationships, keep building links, keep adding content, and keep your web site up to date. Your competition
    isn't going to stop trying to beat you, and that includes the folks who are still ahead of you. The effects of your efforts usually show up in the search engines a couple months later. Some folks give up after a few weeks, because they haven't seen instant results...

    If you're going to make a commitment to a search engine strategy for your website, and make the substantial investments it takes (both time and money), then stick with your commitment and execute your strategy. Don't give up too early, and don't stop when you hit the top.

    Why I wouldn't buy Google :: Search Engine Optimization at ChriSEO.com :: Search Engine optimization and web site ranking tips, articles and news

    Google vs Yahoo - I see a head to head battle emerging....
Why I wouldn't buy Google :: Search Engine Optimization at ChriSEO.com :: Search Engine optimization and web site ranking tips, articles and news: "1. Any JimWorld regulars will recognize my first point, sagely given by jcokos: PageRank is patented by Stanford. A number of other acquisitions by Google also have this drawback. Meaning that, effectively, what Stanford wants it's likely to get. These aren't technologies that can be quickly replaced, so even if there is some sort of agreement, the balance of power is always going to be in Stanford's favour.

2. Google is inherently non-commercial. It comes with an expectation of being free to its users. Any attempt to change that would lead to the mass flocking of its users to elsewhere. The methodology that Google uses to draw revenue is its text ads system. The internet has seen systems work well time and time again and then decline into uselessness. Works now, as effective next year? Probably not. With competitors catching up, text ads are going to have less and less effect on the viewer and therefore be worth less to the advertiser. What, exactly, is going to replace those revenues and shouldn't Google be implementing that now?

3. With Yahoo buying pretty much every decent search engine and Microsoft throwing its dollars at building one, Google is set to face the strongest competition it has had to face in recent years. With the gap between technologies and capabilities closing, with one engine being largely similar to another, you can no longer compete on being 'better'. It becomes a marketing game. Does Google have the skills? I doubt it. Even if it does, is it as good at it as Microsoft? Definitely not.

4. Google exists and promotes itself on secrecy and rumours. Buying in would be a bit like buying a car that works"

    Tuesday, January 06, 2004

    IPO news

    Bloomberg.com: Top Worldwide: "Jan. 5 (Bloomberg) -- Google Inc. hired Morgan Stanley and Goldman Sachs Group Inc. to arrange its initial public offering, a sale that may raise as much as $4 billion,"

    Google's House of Cards

    Problems inherent in the Adsense system...

    Google's House of Cards: "The true reason Google refuses to budge is the program is built on a shaky foundation. To secure the large distribution network AdSense needs, Google must pay publishers good prices. But if advertisers make separate bids for contextual ads, most would bid significantly less than they do for paid listings. Publishers would see revenue fall dramatically. This, in turn, would lead many sites to either remove the AdSense tags from their pages or demand a better revenue share.
    AdSense is a house of cards, built on a foundation that forces advertisers to overpay for contextual ads. If Google allowed separate bids, it would risk losing revenue on multiple fronts: from the lower bids, the loss of distribution, and the loss of revenue share. The company bet it could keep prices high and revenue shares low as it built this program. But with smart advertisers turning off their AdSense ads, it's time for Google to give advertisers what they want"

    Slashdot | Better Search Results Than Google?

    How people search...
Slashdot | Better Search Results Than Google?: "Searching for info about electronic products is the worst on Google. I use the following along with anything I want to search and it usually does the trick: '-shop -shopping -price -buy -order -shipping'. This no doubt subtracts one or two sites which are good but at least filters out most of the shopping sites."...
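That tip is easy to automate. A trivial helper (names are my own; the exclusion list is the one from the comment above) that appends the minus operator for each noise word:

```python
# Exclusion list taken from the Slashdot tip quoted above.
SHOPPING_NOISE = ["shop", "shopping", "price", "buy", "order", "shipping"]

def exclude_terms(query, noise=SHOPPING_NOISE):
    """Append a Google minus-operator for each noise word."""
    return query + " " + " ".join("-" + word for word in noise)

print(exclude_terms("canon powershot review"))
# → canon powershot review -shop -shopping -price -buy -order -shipping
```

Swap in a different `noise` list per topic (e.g. `-lyrics -mp3` for band research) and the same one-liner cleans up any commercial-heavy query.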

    The thing I have noticed to be the greatest single limit on web searching is the operator. I can regularly find things on the net that my co-workers cannot. This is because I understand keyword boolean searching at a deeper level than most people.

    I blame this on the level of education of the common population, as opposed to being evidence of my own superiority. 8-)

    In a world where most people have never actually met or "dealt with" a librarian (archivist, whatever 8-) it should surprise nobody that these self-same people have no idea what it means to take personal responsibility for organizing their own approach to knowing things.

    Having grown up near and actually talked to librarians all my life I actually understand how to group information. Applying that knowledge to a search for some words and against others isn't that far a stretch.

It is a personal pet peeve of mine to have to listen to people bemoan Google (etc.) when these self-same people have never even *noticed* the advanced search link, nor even learned the power of the minus ("-") in the standard search bar.

    There is no technology that can "fix" bad user inquiries that won't in turn "ruin" good ones.

From my own experience with developing search technologies for an e-content site, these guys are on the right track. Compared to a lot of search technologies out there, Google is dumb. But it is blazing fast, general purpose, and smarter than most of its (former) competitors. Part of why it is dumb is that it is so general purpose. To make a search engine smarter, you have to add context. Specialized search engines can do this by standardizing their inputs. Google could do this too, but it would require complex parsing of everything that it spiders.

    Another thing that Google really lacks is detection of duplicates. Google tries to do this, but does it poorly. I remember recently doing a search on Google for an obscure DB2 error code, and getting the same page out of the IBM manual over and over again, all on different college websites.
This is another area where linguistic/statistical analysis could really help. Most knowledge-base products offer a "More Like This" feature that is an index of linguistic similarities between items. An easy way to detect duplicates with such a system is to have a fine scale and place an upper limit on similarities, i.e. any two items with a similarity > N are likely to be duplicates.


All of this being said, I would be surprised if Google does not address these issues in the very near future. I do not think they have gone down the path that many large companies go down, where they stop trying to innovate and instead just try to protect their turf.

    Monday, January 05, 2004

    MarketingWonk: November 2003 Archives (internet advertising & online marketing industry news)

    Re PR:
    MarketingWonk: November 2003 Archives (internet advertising & online marketing industry news)
> What is a good page rank? Google has my home page listed as a six.
> The other pages are fives, some fours.
>
> Is six an OK rank? A bad rank? Will a one-person site ever get a
> rating of nine or ten? Should I try for a higher number or be
> satisfied with my current page rank?

Hmm, just a six. Really, you should give up on ever hoping to do well with Google. If you're not a nine or a ten, you'll simply never appear for anything.

    NOT!

    Honestly, do yourself a favor and uninstall the Google Toolbar. A PR6 value is actually pretty high, but in and of itself, it doesn't predict how well you'll do with Google.

    Each page within your web site will have its own PR value. That's only one part of many factors that influences if each and every page will rank well for different terms.

Another key factor is link context. Amazon, last time I looked, was a PR9 site. Search for "books," and it's number one. Search for "cars," and it doesn't appear. Why? No one links to Amazon and in or near the link uses the word "cars." The links are not contextually relevant for that term.

Try something else. Search for something at Google, anything. The results that come up should all be listed by PR value, right -- those with the highest PR scores first. Not so. Start visiting sites. You'll be surprised to see things like a PR4 site may come first, then a PR6 site, then PR5, then PR6, then PR4 and so on.

To recap, the PR value you have is not the primary factor to concern yourself with. Instead of trying to raise your PR score, ensure that you are offering great content and the best possible user experience to your users. That will bring in some links to you naturally, which should help both the PR score and your contextual linking. Also, seek out sites related to your topic and audience that are non-competitive. Ask for links, and be free to link out in return, if your audience will gain from that.

    And Andrew, excellent job with I-Search! I've greatly enjoyed reading the new posts coming in under your helmsmanship, as well as the ample stage-setting comments you've often provided.

    cheers,
    danny

    Google email response to Florida enq List archives at Adventive

List archives at Adventive

    reply:

    Thank you for your note. Please be assured that Google does not accept payment for inclusion of sites in our index, nor for improving the rank of sites in our results. As you may know, results in our index change regularly based on ongoing, automated processes aimed at improving the quality and content of our search results. Changes you observe may include, but are not limited to, addition of new sites, changes in the ranking of existing sites, sites falling out of the index or getting
    dropped and sites' content fluctuating between old and new content.

    We realize these changes can be confusing. However, these processes are completely automated and not indicative of wrong-doing or penalization of individual sites. We currently include over three billion pages in our index and it is certainly our intent to represent the content of the Internet fairly and accurately. The ongoing changes you have observed are part of this effort.

    While we cannot guarantee that your clients' pages will consistently appear in our index or appear with a particular rank, we do offer guidelines for building a "crawler-friendly" site. You can find these guidelines at http://www.google.com/webmasters/guidelines.html . Following
    these recommendations may increase the likelihood that your clients' sites will show up consistently in Google search results.

    We appreciate your feedback and will pass it on to our engineers.

    Regards,
    The Google Team

    In the Wake of the Florida Update - High Rankings Advisor

    In the Wake of the Florida Update - High Rankings Advisor: "++In the Wake of the 'Florida' Update++"

    Concludes: Overall, it will take some time for any definite/solid information to filter down about the true effects of the "Florida" update. Theories
    will continue to swirl around the 'Net. So will rankings! But the fact remains that "common-sense" SEO copywriting wins out in the long run.

Google Dance Syndrome Strikes Again, Parts 1, 2, 3

    Google Dance Syndrome Strikes Again, Part 2: "Google Dance Syndrome Strikes Again,"

    Q. Does this mean Google no longer uses the PageRank algorithm?

Google never used the PageRank algorithm alone to rank Web pages. PageRank is a component of the overall algorithm, a system Google uses to measure how important a page is based on links to it. It's always been the case that link context was considered, as well as page content.
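For reference, the link-counting component itself is public: the published PageRank formula is just repeated redistribution of scores along links. A minimal power-iteration sketch (this is the textbook algorithm, not Google's production system, and the example graph is made up):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict of page -> list of outbound links."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a (1 - damping) teleport share...
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            if outs:
                # ...and passes the damped remainder to pages it links to.
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
        rank = new
    return rank

# Tiny made-up web: everyone links to "hub", so it scores highest.
ranks = pagerank({"a": ["hub"], "b": ["hub"], "hub": ["a"]})
print(max(ranks, key=ranks.get))
```

Notice what the sketch does NOT look at: page content, link text, query terms. That gap is exactly why PageRank can only ever be one signal among many, which is Danny's point.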

    Unfortunately, some call Google's system of ranking "PageRank." Google itself can make this mistake, as on its Webmaster's information page:

    The method by which we find pages and rank them as search results is determined by the PageRank technology developed by our founders, Larry Page and Sergey Brin.
    The page describing Google's technology more accurately puts PageRank at the "heart" of the overall system, rather than give the system that name.

PageRank was never the factor that beat all others. It's been, and continues to be, the case that a page with lower PageRank might rank higher than a page with a higher PageRank. Search for books, and if you have the PageRank meter switched on in the Google Toolbar, you'll see how The Online Books Page with a PageRank of eight comes above O'Reilly, although O'Reilly has a PageRank of nine. That's a quick example. You can see more by checking yourself.

    Google IPO News

    Google IPO News: "News of Google's suspected IPO is everywhere as Bloggers, industry experts and journalists seek to give the subject a new spin. Check out this collection of articles on Google's situation and decide for yourself what you think."