The Internet was founded on the concept of a distributed network: if someone blew up the servers in one place, the network would still function and data could still get where it needed to go. Now each website lives on a specific set of servers, so a single site can easily be taken down through virtual or physical means.

However, if we take a cue from BitTorrent, which is the next generation of distributed storage, and implement that idea across ALL network resources, the web would be invincible.

The problem with netbooks is that they have it backwards. Instead of being a dumb terminal that relies completely on a remote server for its storage and computing power, each netbook could be a node in a distributed version of the web. If each user had a cache of bits and pieces of websites they visit, pictures they peruse, videos they view, etc., they could connect with their fellow users and combine their collected fragments into a complete file. Thus the content providers would be relieved of their storage and bandwidth burden and could maintain a much smaller infrastructure to coordinate the transfers (trackers) and distribute the official versions and updates (seeders).
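To make that concrete, here's a rough sketch (in Python) of what a single node might do: keep a hash-addressed cache of the chunks it has already seen, fill in missing chunks from nearby peers, and only fall back to the provider's origin server when nobody nearby has the piece. All of the names here (PeerNode, assemble_page, the manifest of chunk hashes) are made up for illustration; this isn't an existing protocol.

```python
import hashlib
from typing import Dict, List, Optional


class PeerNode:
    """One user's machine acting as a node in the distributed web (illustrative)."""

    def __init__(self, node_id: str):
        self.node_id = node_id
        # chunk hash -> chunk bytes, filled in as the user browses
        self.chunk_cache: Dict[str, bytes] = {}

    def store_chunk(self, data: bytes) -> str:
        """Cache a chunk and return its content hash (how peers refer to it)."""
        digest = hashlib.sha256(data).hexdigest()
        self.chunk_cache[digest] = data
        return digest

    def get_chunk(self, digest: str) -> Optional[bytes]:
        return self.chunk_cache.get(digest)


def assemble_page(manifest: List[str], me: PeerNode,
                  neighbors: List[PeerNode], origin: Dict[str, bytes]) -> bytes:
    """Rebuild a page from a manifest of chunk hashes published by the site
    (the 'tracker' role). Prefer the local cache, then nearby peers, and only
    hit the origin 'seeder' as a last resort."""
    parts = []
    for digest in manifest:
        chunk = me.get_chunk(digest)
        if chunk is None:
            # Ask neighbors (one or two network hops away) for the missing chunk.
            for peer in neighbors:
                chunk = peer.get_chunk(digest)
                if chunk is not None:
                    break
        if chunk is None:
            # Fall back to the content provider's (much smaller) origin server.
            chunk = origin[digest]
        me.store_chunk(chunk)  # cache it so we can serve it to others later
        parts.append(chunk)
    return b"".join(parts)
```

The site itself would only need to publish the manifest and keep one authoritative copy of the chunks, which is exactly the tracker/seeder split described above.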

BitTorrent transfers alone account for almost half of all Internet traffic. Why? Because it's the fastest way to download something! If all web traffic were distributed like that, your neighbors could be sharing the websites you visit within one or two hops! You could stream music in parallel. Slashdotting anything would be impossible because the barrage of requests wouldn't even make it out of your office building!

A distributed web would not only give users faster connections and providers a lighter hardware burden, it would also make it easier for smaller organizations to build big followings without the monetary barrier of maintaining scalable server infrastructure. Eventually the entire Internet will be one giant cloud of independent users and nodes in a frenzy of information sharing. First step: down with netbooks!

Comments

igliashon_jones
Aug. 19th, 2009 05:27 am (UTC)
2nd Step: Develop software framework and/or pitch to Google (who have total BONERS over netbooks).

But yeah, I totally agree! The internet would finally be democratized for real!
(Anonymous)
Oct. 13th, 2009 09:02 pm (UTC)
Freenet
Hmm, sounds like you want Freenet: http://freenetproject.org/ You can use that on netbooks too, of course, just like BitTorrent. Don't blame the hardware; there are many netbook owners who do not use SaaS.

Regards,
Bryan
(Anonymous)
Nov. 12th, 2009 11:37 pm (UTC)
Shame many have to pay for traffic and have caps
In the real world many of us still pay for traffic and have caps. Roll on the brave new world.
(Anonymous)
Nov. 13th, 2009 03:30 am (UTC)
Missing the point
Yeah, I agree, making the entire web available through a torrent-like distributed network would be grand, but what about all those websites with databases?
Sites like Facebook wouldn't work; their entire architecture would need to be turned on its head. Torrents aren't designed for fast-changing content, they're a static thing: once it's out there it can't be changed (or am I wrong about that?). How would you manage version control of these sites? Sure, my neighbour might have a cached version of some website, but how would my computer know if his cache was up to date or not? I would need to go to some central entity to discover the latest version and compare it to what my neighbour has.

Don't get me wrong, I think the idea is brilliant; I just don't think it's practical in this situation. Maybe a whole new type of distributed network would need to be made.

You could get around the central database problem using microformats and a PubSubHubbub type of thing, pushing notifications around the web rather than requesting data. Maybe that would be a better method?
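As a toy sketch of that push model (class names made up, not the real PubSubHubbub API, which works over HTTP callbacks): the publisher notifies a hub once, and the hub fans the update out to every subscriber, so readers never have to poll the origin.

```python
from typing import Callable, Dict, List

# Toy PubSubHubbub-style fan-out: one publish call pushes an update to all
# subscribers instead of each of them polling the site.


class Hub:
    def __init__(self):
        # topic URL -> subscriber callbacks
        self.subscribers: Dict[str, List[Callable[[str, str], None]]] = {}

    def subscribe(self, topic: str, callback: Callable[[str, str], None]) -> None:
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic: str, payload: str) -> None:
        for callback in self.subscribers.get(topic, []):
            callback(topic, payload)


hub = Hub()
hub.subscribe("http://example.com/blog/feed",
              lambda topic, payload: print(f"update on {topic}: {payload}"))
# The blog gets a new comment and pings the hub once; every cached copy
# out on the network receives the delta without polling.
hub.publish("http://example.com/blog/feed", "new comment")
```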

Ross Scrivener
http://scrivna.com
(Anonymous)
Nov. 13th, 2009 09:10 pm (UTC)
Re: Missing the point
Couldn't at least the CSS of sites be a torrent-able thing? I mean, even Facebook and MySpace to a certain extent use a bit of CSS that gives them their uniform feel even though the content changes at an extremely rapid pace... for that matter, combining the power of CSS and RSS might make for a pretty good combo for torrent-distributable content. Just a thought though. What do you think?
(Anonymous)
Dec. 9th, 2009 10:08 am (UTC)
Re: Missing the point
If there were mechanisms for updating that cached version, creating distributed static clouds would make sense. It's not too dissimilar to Amazon's cloud offerings, other than operating at a network-wide scale instead of on proprietary infrastructure.
jaksprats
Nov. 13th, 2009 01:59 pm (UTC)
Torrents are not faster for small files, can't stream
Most webpages consist of small files (HTML, JS, and CSS under 5K) and pictures (GIFs and JPEGs under 100K). Torrents are slower for downloading these files because you have to fetch the torrent metadata, contact the tracker, find seeds/peers, and then start giving and getting file pieces, while getting them directly from an HTTP server is a one-step process.
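Some back-of-envelope numbers show why that overhead matters for small files and vanishes for big ones. The 50 ms round trip, the 1 MB/s rate, and the four-round-trip torrent setup are illustrative assumptions, not measurements.

```python
RTT = 0.050            # assumed round-trip time to any host, in seconds
BANDWIDTH = 1_000_000  # assumed sustained download rate, bytes per second


def http_fetch_time(size_bytes: int) -> float:
    # One request/response to the web server, then the payload.
    return 1 * RTT + size_bytes / BANDWIDTH


def torrent_fetch_time(size_bytes: int) -> float:
    # Fetch the torrent metadata, announce to the tracker, handshake with a
    # peer, request pieces: roughly four round trips before any payload.
    return 4 * RTT + size_bytes / BANDWIDTH


for size in (5_000, 100_000, 700_000_000):  # a stylesheet, an image, a movie
    print(f"{size:>11} bytes  http={http_fetch_time(size):8.3f}s  "
          f"torrent={torrent_fetch_time(size):8.3f}s")
```

For the 5K stylesheet the setup cost roughly quadruples the fetch time; for the movie it disappears into the noise.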
Torrents also do not support streaming, since file pieces arrive in random order, so they are not even fit for delivering YouTube-style video.
The idea of distributing and caching static content is a must in the near future, especially considering digital TV and movies delivered via the Internet, but it's not as straightforward as it seems.
Once this concept is implemented, I am sure netbooks (you mean cheap laptops with cloud-based OSes) will hop on the bandwagon.
phreakhead
Nov. 17th, 2009 08:33 am (UTC)
Re: torrents are not faster for small files, cant stream
You're right, the torrent protocol as it stands is a lot of overhead to maintain for the small files that make up webpages, but if torrent technology were combined with, perhaps, some kind of change-tracking system, large parts of websites could stay static and distributed amongst many users' caches while the dynamic parts could be based on a push model like Ross was talking about earlier.

Subversion comes to mind as a starting point. If a user wanted to refresh a blog, for example, he would just query the server for the latest revision number (and perhaps a hash to verify the data). If he wasn't up to date, he would query his nearest neighbors for the differences between his cached version of the page and the latest updates (probably a couple of comments or maybe a new article). It might take as many requests as getting a fresh page from the server, or more, but the overall data transferred, and the distance traveled, are much less when you think of it in terms of billions of requests traveling across the world and back.
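Here's a very rough sketch of that revision-based sync. Everything in it (RevisionServer, Neighbor, the append-only "patches") is invented for illustration, and patches are simplified to appended text so the example stays short.

```python
import hashlib
from typing import List, Tuple


class RevisionServer:
    """Tiny origin server: it only hands out the latest revision number
    and a hash of the full content."""

    def __init__(self):
        self.patches: List[str] = []  # patches[i] turns revision i into i + 1
        self.content = ""

    def commit(self, new_content: str) -> None:
        # Simplification: each revision only appends text (e.g. new comments).
        self.patches.append(new_content[len(self.content):])
        self.content = new_content

    def latest(self) -> Tuple[int, str]:
        return len(self.patches), hashlib.sha256(self.content.encode()).hexdigest()


class Neighbor:
    """A nearby peer whose cache already holds the patches from its own browsing."""

    def __init__(self, patches: List[str]):
        self.patches = patches

    def patches_since(self, revision: int) -> List[str]:
        return self.patches[revision:]


def refresh(my_revision: int, my_content: str,
            origin: RevisionServer, neighbor: Neighbor) -> Tuple[int, str]:
    latest_rev, latest_hash = origin.latest()   # cheap metadata-only query
    if my_revision == latest_rev:
        return my_revision, my_content          # already up to date
    for patch in neighbor.patches_since(my_revision):
        my_content += patch                     # apply the deltas from the peer
    if hashlib.sha256(my_content.encode()).hexdigest() != latest_hash:
        raise ValueError("neighbor was stale or corrupt; fall back to the origin")
    return latest_rev, my_content
</code>
```

In the worst case you fall back to fetching from the origin anyway, so you never do worse than the current model; most of the time the heavy bytes come from next door.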

Of course, websites like this would be completely different to program, and much more complex, unless there were a simple protocol that could handle the basics for you. We've gotten a faint idea of it from AJAX, we're moving nearer with HTML5 Web Sockets, and I think the latest developments in parallel processors will push programmers toward paradigms that distribute the workload, which will eventually spread to all aspects of computing, including the web. Perhaps even the code itself could be distributed, and users could work as a cloud, processing their own websites as they view them. But then again, that's a little far-fetched, even coming from an open-source advocate like me.
miraease
Apr. 11th, 2011 12:25 am (UTC)
Hmmm, for some reason only half the post can be seen. I tried reloading but it's still the same.

miraease
Apr. 12th, 2011 02:45 pm (UTC)
Found your site on del.icio.us today and really liked it... I bookmarked it and will be back to check it out some more later.

