The Internet’s Infrastructure Needs a Revamp. Or At Least, Browsers Do.

The internet is huge. And it’s really volatile. Millions of people visit millions of websites each day. Digg, Reddit, and others create a world in which a small site can get slammed with traffic, start throwing errors, and become unable to serve its users. And it’s growing.

Take Twitter, one of the most successful social products on the web. Twitter goes down often. It’s not because they have only one server; I’m sure they have several dozen in different locations (they wouldn’t comment on the numbers).

Digg constantly takes down sites. So do other ‘article discovery’ sites. It has become a normal occurrence to turn to rorrim or Google Cache when a site goes down.

But the thing is, it’s not completely horrible. People come back. Word spreads. And the next thing you know, people are writing blog posts about your blog going down. It’s a compliment. “Oh, my site just got crushed by Reddit.” “Awesome, dude!”

So if it’s not such a bad thing, why am I writing about it? Well, we need a solution. This issue is only going to get bigger, because more and more people are going online for more than just checking their email. Because I want to be able to check out someone’s Twitter account whenever I want. Because there are solutions.

I could say that more people need to use Squarespace. After all, Squarespace is an amazing site and blog creator that’s designed to keep your site up no matter what. With virtual server space, the guys at Squarespace automatically give you more bandwidth when you need it.

But that’s a utopian idea. And not everyone wants to fork over $8 a month to keep their website up. I could also say that more people should be using Blogger or WordPress.com to host their blogs. But even those can be volatile.

No, the solution does not lie in which server or service you choose. It lies with the regular computer user. Yes, the change needs to happen in the browser. Opera has gotten close with a browser that hosts your website for you, but the answer is simpler than that. No one server or computer should host a website. No one service should be relied on as a backup for the internet. It should be every computer.

What I’m proposing is a torrent-like system of supplemental website hosting. Built into browsers should be the capacity to visit a site, save the page you just viewed to a secure location on your computer, and push that page to other computers with the same browser capability installed.
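Here’s a rough sketch of the shape of that idea, in TypeScript. Everything in it is hypothetical: PeerPageCache and its methods are made up for illustration, not a real browser API.

    // Hypothetical peer cache: every page you visit is saved locally, and
    // peers running the same browser feature can ask you for a copy.
    interface CachedPage {
      url: string;
      html: string;
      fetchedAt: number; // when this copy was last seen live (ms since epoch)
    }

    class PeerPageCache {
      private pages = new Map<string, CachedPage>();

      // Called after a normal, successful page load: keep a local copy.
      store(url: string, html: string): void {
        this.pages.set(url, { url, html, fetchedAt: Date.now() });
      }

      // Called when a peer asks for a page whose origin server is down.
      serve(url: string): CachedPage | undefined {
        return this.pages.get(url);
      }
    }

The fetchedAt timestamp matters: a peer-served page is stale by definition, so the browser should be able to tell you how old the copy you’re reading is.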

This would be demanding. It would also have to be set up in the most secure fashion possible. And it might not even work with some sites, especially ones that use logins and massive secure databases. But can you imagine it? I can:

Twitter goes down, again. But when you visit the site, a small pop-up notifies you that you are grabbing @joeschmoe’s tweets from another computer, which downloaded the page 5 minutes ago when Twitter was up. Nice.

All of a sudden, the world’s internet infrastructure takes a huge load off of individual servers and lets computer users help each other out. Similar to Folding@Home or The Pirate Bay, this would hand that power to computers around the world.

This is a radical idea, and one I hope to see in the future. I hope it can be built well enough that a stable internet is not far away. Because technology is amazing. We owe it to ourselves to build the means to spread it.

Andres Max Salmeron is the writer behind Squealing Rat, Lone Iguana, Empty Quotes and more. Find him on Twitter. You can also support him by donating to The Jimmy Fund Walk or just clicking one of the affiliate links here to give him some more storage space.

I’m also looking for collaborators to work with me on other, less intensive web projects, so contact me for information if you’re interested!


Responses to The Internet’s Infrastructure Needs a Revamp. Or At Least, Browsers Do.


  1. Arley says:

    Wow, that might actually work. It could be a browser extension that sits in place of the local cache, using an HTML5 local datastore, for instance. It would check every request, and in case of a timeout (say 5-10 sec) it would fall back to the local cache; if the page isn’t there, it would ask P2P. It’s peer caching, basically (a sketch follows below). Naturally you’d exclude certain websites based on URL pattern, like Adblock does: mail.google.com, mycompany.secureweb.net, etc.

    P2P Caching. Cool.
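To make Arley’s flow concrete, here is a rough sketch in TypeScript. fetchFromPeers() is hypothetical and the extension plumbing is omitted; only the timeout-then-cache-then-peers logic is shown, using Arley’s suggested timeout and exclusion patterns.

    // Hypothetical peer lookup; a real extension would query the P2P network here.
    declare function fetchFromPeers(url: string): Promise<string>;

    // Sites that should never be cached or shared, matched Adblock-style.
    const EXCLUDED = [/(^|\.)mail\.google\.com$/, /\.secureweb\.net$/];

    const localCache = new Map<string, string>(); // stand-in for an HTML5 local datastore

    async function loadPage(url: string): Promise<string> {
      const host = new URL(url).hostname;
      if (EXCLUDED.some((pattern) => pattern.test(host))) {
        const res = await fetch(url); // sensitive sites bypass caching entirely
        return res.text();
      }
      try {
        // Normal request, abandoned after 5 seconds (Arley suggests 5-10 sec).
        // AbortSignal.timeout() exists in modern browsers.
        const res = await fetch(url, { signal: AbortSignal.timeout(5000) });
        const html = await res.text();
        localCache.set(url, html); // refresh the local copy while the site is up
        return html;
      } catch {
        // The origin is down or too slow: try the local copy, then ask peers.
        return localCache.get(url) ?? fetchFromPeers(url);
      }
    }

Since excluded hosts skip the cache entirely, logins and private pages never leave your machine.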
