Please comment below with your thoughts. I'm not so old a dog that I can't learn a few new tricks!

Filed under: Link Popularity, Linking, SEO, Visitor Factors

Comments

  1. Angie Haggstrom wrote on

    I read a blog post a short while ago (Sorry, I can’t remember where, but I’m sure there were thousands) that mentioned Google would soon be taking bounce rate into consideration. Rumor has it that Mr. Cutts all but admitted this to be true.

    Even if more emphasis was placed on a site’s bounce rate, wouldn’t that still be biased against new sites? I would think so. At the same time, links are no guarantee that a site will have a good bounce rate or be more relevant.

    In my opinion, I don’t see visitor factors dominating links in terms of value. It’s a cyclical pattern.

    If it’s a good website, it will have good content, and therefore, more visitors and link love (a hub). If it’s a bad site, there will be very few connections to the rest of the virtual world.

    What would happen if it depended on the idea of consistent balance? Search engines would favor sites whose visitor quality and links match. If links rose far above visitor factors, the site’s placement would fall, and vice versa.

    What would happen if a new site with a perfect visitor/link ratio were compared to an established site with the same qualities, just higher numbers? Wouldn’t this scenario still benefit the older site?

    I’m really quite unsure, but I will certainly be watching curiously. In the meantime, I’m going to build on both sides of the equation.

    Certainly makes you think…

    Angie
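
    As a purely hypothetical illustration of the balance idea Angie floats above: a scoring sketch along these lines would discount a page whenever its link signal and visitor signal drift apart, while still rewarding sheer volume, which is why the established site in her comparison would likely keep its edge. The function, the penalty weight, and the numbers below are invented for this example and are not anything a search engine has confirmed.

        # Illustrative only: a made-up "balance" score, not a known ranking formula.
        def balance_score(link_signal, visitor_signal, penalty=0.5):
            """Combined strength, discounted when the two signals diverge."""
            combined = link_signal + visitor_signal
            if combined == 0:
                return 0.0
            # Imbalance is the gap between the two signals relative to their total.
            imbalance = abs(link_signal - visitor_signal) / combined
            return combined * (1.0 - penalty * imbalance)

        print(balance_score(10, 10))    # new site, perfect ratio      -> 20.0
        print(balance_score(100, 100))  # older site, same ratio       -> 200.0 (still wins on volume)
        print(balance_score(100, 10))   # links far ahead of visitors  -> discounted to 65.0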

  2. Barry Welford wrote on

    Perhaps the post I wrote on Bounce Rate recently, Angie, is the one you’re thinking of. I’m certainly in the camp that says this is inevitably what Google will be including. Good article on an important issue. Thanks.

  3. Angie Haggstrom wrote on

    That isn’t the one, but it’s precisely what I was talking about. Thank you for sharing! (You’ve gained another subscriber.)

  4. Chris Allison wrote on

    Ultimately the Matthew Effect is the foundation of the network effects O’Reilly described when outlining what Web 2.0 is. To be a successful site online, you just need to secure a bit of upward momentum; the users will network you up after that. On the other hand, if you pick up any amount of downward momentum, you will be networked down very quickly as well. This obviously makes it very difficult for new players to enter the scene, and to my mind it is one of the pitfalls, or ugly points, of the web. In one sense it is very easy to join in what’s going on online; in another, it is very difficult to gain any credibility, voice, or market share.

  5. Glenn (Owner) wrote on

    Thanks for your comments, guys. Excellent! I’d like to reply to a couple, but it’s the weekend here, so I’ll have to come back to them on Monday.

    In the meantime, Chris, I think someone has poached your comment and used it to comment on my post as submitted to Sphinn. Check it out: http://sphinn.com/story/96071

    Not sure what to do about this.

  6. Chris Allison wrote on

    Wow, that’s low. Thanks for noticing, Glenn. I left a reply on the Sphinn article; I think that should suffice.

  7. Brett Tabke wrote on

    Very nice rundown. I agree with 99% of the article. So let me uphold web tradition and talk about the 1% I have a problem with (lol).

    I get worried when people talk about “metrics” like on-page time, back-click time, click-to-conversion time, or “session time”. I worry that people are going to take these “Direct Hit”-type metrics (Google it) as gospel. Let’s look at just a few of the inbound data sources Google has available:

    – on-site searches (and all the related data: spelling clicks, cached clicks, clicks into the various services from YouTube to Images to News, etc.).
    – clicks off SERPs. They’ve had that click tracker running since 2002.
    – Google Checkout conversion tracking.
    – the toolbar, although my gut tells me most of this data isn’t worth much except for demographic and psychographic data.

    The crown jewel of Google data intel is Google Analytics. It now runs on more sites than the rest of the top trackers combined. It completes the Google picture at the site level.

    Given that vast data mine, only Google could tell you what they would do with the gathered data and how it would apply to the algo. Anything we come up with is nothing but a guess and must be discounted as such.

    The reason it tasks me is that there are a lot of bloggers spreading the notion that visitor “on-site time” (à la Direct Hit) is something Google would factor in. There are a lot of variations on that thinking. Without the same data Google has in front of them, I don’t think we can even make those kinds of guesses.
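
    Just to make the vocabulary concrete (and only that), here is a toy sketch of how “Direct Hit”-style numbers like bounce rate and session time can be derived from a pageview log. The data and code are invented for illustration; as Brett says, how Google would actually weigh such signals is anyone’s guess.

        # Toy pageview log: (session_id, seconds_since_session_start, url). Invented data.
        from collections import defaultdict

        pageviews = [
            ("s1", 0,  "/article"),
            ("s1", 40, "/related-post"),
            ("s1", 95, "/contact"),
            ("s2", 0,  "/article"),   # single-page visit counts as a "bounce"
            ("s3", 0,  "/article"),
            ("s3", 20, "/pricing"),
        ]

        # Group hit timestamps by session.
        sessions = defaultdict(list)
        for session_id, ts, _url in pageviews:
            sessions[session_id].append(ts)

        # Bounce rate: share of sessions with exactly one pageview.
        bounce_rate = sum(len(t) == 1 for t in sessions.values()) / len(sessions)

        # Session time: last hit minus first hit; single-hit sessions register as 0 seconds,
        # which is one reason these metrics are so easy to misread.
        avg_session_time = sum(max(t) - min(t) for t in sessions.values()) / len(sessions)

        print(f"bounce rate: {bounce_rate:.0%}")              # 33%
        print(f"avg session time: {avg_session_time:.0f}s")   # (95 + 0 + 20) / 3 = 38s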

  8. Glenn (Owner) wrote on

    Thanks for your comment, Brett!

    I agree that we’re guessing, at best, but it does seem logical. Google has invested a lot in a host of services and products that track visitor behavior. I don’t think the question is IF they’ll use visitor behavior as a factor, but HOW they’ll use it.

    Cheers!

  9. Easton Ellsworth wrote on

    Great Problogger post today, Glenn!

    “The important thing to realize is that even on the Web, the word ‘connection’ doesn’t have to mean ‘hyperlink’.” – Awesome. And that’s what search engines are trying to capture.

    It’s very narrow-minded to just want to get links, links, links. You’ve gotta try to get people, people, people to connect with you in a real way.
