P2P Content Delivery for Open Source
Orasis writes "The Open Content Network is a collaborative effort to help deliver open source, public domain, and Creative Commons-licensed content using peer-to-peer technology. The network is essentially a huge 'virtual web server' that links together thousands of computers for the purpose of helping out over-burdened/slashdotted web sites. Any existing mirror or web site can easily join the OCN by tweaking the HTML on their site."
Never mind that (Score:2, Insightful)
That's the only thing that matters!
No Fun (Score:5, Funny)
What good are these CPU hogging, network lagging programs if they aren't delivering pirated software and p0rn? I won't stand for this abomination!
Well, at least until the next time I need to download the newest slackware...
Re:No Fun (Score:4, Interesting)
At least the nicotine-less smokes still cause cancer.
You also forgot to mention all the potential dates here on slashdot. At least, you can still find someone to kiss on slashdot... oh, wait... nevermind.
What good are these CPU hogging, network lagging programs if they aren't delivering pirated software and p0rn?
Fulfilling their purpose, just like the other items you mentioned.
Just as in the above cases, part of Corporate America's master plan to remove all joy from the universe. (+1 Insightful)
Dunno (Score:2)
Re:Never mind that (Score:4, Interesting)
However, the problem with systems like Freenet is that the latency is incredibly high and there is no way to leverage the existing mirrors that host much of the open source content today.
Since the OCN is built around HTTP, it can start its downloads immediately from the network of mirrors and as peers are discovered, it moves more of the burden onto the peers.
The end result is that the OCN has about the same latency as a normal web browser, but can provide very fast parallel downloads when leveraging existing mirrors.
That said, if high latency isn't a problem, then we highly recommend using Freenet [freenetproject.org]. We think it's a great project and look forward to seeing more and more applications built on top of it.
sounds like a great concept (Score:2)
But at least when the net gets back to a usable level, slashdot readers will be able to read about it without slashdotting a site.
Call me dumb... (Score:4, Insightful)
What we need is some gateway product to get young kids hooked on programming.
Re:Call me dumb... (Score:3, Funny)
Not everywhere (Score:1, Insightful)
Also, how is this different from bittorrent?
Re:Not everywhere (Score:2)
Um, yeah... that would make email *really* efficient.
Re:Call me dumb... (Score:2, Informative)
Bittorrent (Score:5, Informative)
Re:Bittorrent (Score:4, Interesting)
Re:Bittorrent (Score:4, Informative)
Re:Bittorrent (Score:2)
What if BitTorrent came with your web browser, already built into its download manager? Or what if BitTorrent could be installed with the click of a button, much like Flash, etc.?
Rather than telling people "please leave your BitTorrent window open for as long as possible after downloading..." and making a big fuss over leechers, etc. (this kind of stuff scares people away), just have everyone use BitTorrent. Or cap the number of open slots for non-BitTorrent users downloading the file.
BitTorrent already saves bandwidth even if people don't leave the window open. If you offered files ONLY via BitTorrent, then people would use BitTorrent, and there would be enough people downloading the file that you wouldn't need anyone to leave the window open.
Re-reading, the above looks more like a ramble. But with a few tweaks, some usability features added, and the annoying parts taken out, BitTorrent could be as seamless as just clicking on the link to download.
Instead, they make you sit there and install an application, configure an application, etc., etc. ... jump through hoops... just to download a file. Joe Schmoe doesn't want to do that, and neither do I.
(PS: I do use BitTorrent when a website offers files for it)
Re:Bittorrent (Score:5, Informative)
Yes, BT is exactly this. A good site about BT: http://smiler.no-ip.org/BT/info.php [no-ip.org]
I don't see much difference between BT and OCN, by the way. Or am I missing something?
Re:Bittorrent vs. OCN (Score:2)
Also, of course, neither one really fixes the Slashdot problem, because you need to set them up _before_ you're slashdotted :-) On the other hand, once you know you're in trouble, you can update your site to use them all by yourself, as opposed to bullying Slashdot into making a cache or hoping Google has one.
Bittorrent should join the OCN (Score:2)
It has been intentionally made very easy to incorporate new clients into the OCN. All you need to do is make an XML-RPC call advertising the content you have available and you're in.
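For illustration, roughly what such a call could look like from Python; the endpoint path and method name below are placeholders, not the actual OCN interface:

    import xmlrpc.client

    # Placeholder OCN node endpoint and method name, for illustration only;
    # consult the OCN spec for the real interface.
    node = xmlrpc.client.ServerProxy("http://ocn-node.example.org/RPC2")

    # Advertise a file by its SHA-1 URN plus a URL it can be fetched from.
    node.advertise(
        "urn:sha1:PLACEHOLDERHASHGOESHERE",          # content identifier
        "http://mirror.example.org/pub/file.tar.gz",  # retrieval URL
    )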
If Bittorrent joined the OCN, it could leverage the resources of other clients in the network and give more choice to users about which software they want to use.
The eventual goal is to get this type of technology integrated directly into Mozilla, Squid, and Apache. But that will probably take some time...
Why BitTorrent is a better solution than OCN (Score:3, Interesting)
The key insight here is that when data is encoded to increase its reliability within the p2p network, it becomes useless to the person who is holding it. This is not a problem for some applications, but when you are trying to solve the Slashdot effect or serve popular content it can become a limiting factor. The advantages that BT has over this system are that it does not require the data to be encoded in a special manner by the publisher, and that data stored on the edge nodes is still useful to those nodes. A design like BT can peer data out of your browser cache and share a larger range of data from each particular peer. This is going to be a significant advantage in the long run.
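A toy illustration of the encoding point (a pure sketch, not any particular FEC scheme): split a file into two halves plus an XOR parity share. A node holding only the parity share holds bytes that are useless on their own, unlike a plain BitTorrent piece:

    # "file.bin" is a placeholder path for any local file.
    data = open("file.bin", "rb").read()
    half = len(data) // 2
    a, b = data[:half], data[half:half * 2]       # two equal-size halves
    parity = bytes(x ^ y for x, y in zip(a, b))   # a XOR b
    # A node holding (a, parity) can reconstruct b as a XOR parity, but a
    # node holding only parity has nothing it could serve to a browser.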
This is untrue (Score:4, Insightful)
This means that it can download content from a regular Apache web server w/o any modification whatsoever.
This also means that the peers are simply embedded web servers, can stream content (video) straight to the browser, and can use SSL out of the box.
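A minimal sketch of that idea: any process that answers ordinary HTTP GETs can act as a peer (illustration only, not the actual OCN peer code):

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    # Serve the files in the current directory over plain HTTP on port 8080.
    # Any client that speaks HTTP (browsers included) can fetch from this
    # "peer" directly; a real peer would restrict which paths it exposes.
    HTTPServer(("", 8080), SimpleHTTPRequestHandler).serve_forever()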
Why OCN is a better solution than BitTorrent (Score:1)
Because the OCN's techniques use minimal deltas atop HTTP, well-documented in open specifications, it has the potential to become a standard mix-in for almost all web servers and clients... transparently bringing the power of ad-hoc replication to every web transaction.
P2p web content? (Score:4, Interesting)
I'd love to see unreliable, poorly maintained, pop-up happy free websites like Angelfire and GeoCities go away in favor of a vastly superior P2P system. Someone wants to connect to your special web page? Have them connect to you via the P2P client. No need to fuss with slow FTPing into servers to upload/update web content. It's already on your system.
Re:P2p web content? (Score:3, Informative)
Perhaps Slashdot could use this only in certain cases like when linking to a pdf or linking directly to the newest iso.
Re:P2p web content? (Score:2)
That sounds like you're just saying the P2P client should integrate a normal web server. Why not just run apache on your computer? Either way equally violates your TOS with your ISP.
Do you understand that the internet is fundamentally p2p?
Jason
ProfQuotes [profquotes.com]
Re:P2p web content? (Score:2)
uh... no, it isn't. The Internet is fundamentally client/server.
the benefit of this is precisely NOT what you said. it's not the same as having a web server on your 56K line, that's why it's useful. If you've got useful content, once someone downloads it once, they make it available too. The more useful it is, the more it gets downloaded from you, and the other guy (and they tell two friends and they tell two friends, and so on and so on...).
After a while it's distributed all over the place, so the next user can download the file(s) in 30 discrete pieces from 30 different places in 1/30th the time and at 1/30th the bandwidth on an individual connection to the network (from the content provider's perspective).
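For the skeptical, a rough sketch of such a parallel fetch using plain HTTP Range requests against a few hypothetical mirrors:

    import concurrent.futures
    import urllib.request

    # Hypothetical mirror URLs, all hosting the same file.
    MIRRORS = [
        "http://mirror1.example.org/file.iso",
        "http://mirror2.example.org/file.iso",
        "http://mirror3.example.org/file.iso",
    ]
    SIZE = 30 * 1024 * 1024   # assume the total file size is already known
    PIECE = SIZE // len(MIRRORS)

    def fetch_piece(i):
        # Ask mirror i for its slice of the file via an HTTP Range header.
        start = i * PIECE
        end = SIZE - 1 if i == len(MIRRORS) - 1 else start + PIECE - 1
        req = urllib.request.Request(
            MIRRORS[i], headers={"Range": "bytes=%d-%d" % (start, end)})
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    # Fetch the pieces concurrently and reassemble them in order.
    with concurrent.futures.ThreadPoolExecutor() as pool:
        data = b"".join(pool.map(fetch_piece, range(len(MIRRORS))))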
Also, many P2P networks tend to align themselves around super-nodes, high bandwidth machines, that are capable of serving a lot more content, so the 56K lines of the world aren't slagged...
see? that's the benefit. not so bad after all...
Re:P2p web content? (Score:2)
If you're creating a website for some friends, your 56k modem will easily be able to serve it. If it gets too many hits, pay a real hosting service. It's not that expensive.
ProfQuotes will easily withstand a slashdotting; it's on a 100 Mbps pipe. It would be impossible to build a 'real' site like that on top of a distribution service like you want because there would be no way to synchronize user submissions. A P2P file distribution network is good for large files that don't change often, not for a 'live' website.
Jason
ProfQuotes [profquotes.com]
Re:P2p web content? (Score:2)
I also agree that synch is a big problem to overcome, and real-time or near real-time content distribution would not be an optimal use for something like this... but in reality most web content isn't real time, it's mostly static -- and therefore a candidate for this type of delivery.
it's nice that your site has a Big Fat Pipe coming into it, but not everyone with worthwhile content to share has one. this is a method of distributing bandwidth, and ostensibly, making a much more efficient use of our networks, collectively...
Re:P2p web content? (Score:2)
Besides the synch problem, P2P distribution has reliability problems: people shutting their computers off halfway through a download, or having corrupt copies to begin with. Many others were already mentioned in this discussion.
Another problem is that for something as small as a web page, the overhead in co-ordinating where to get the data from is greater than the size of the data itself.
My point with my site is that this is the best way to deal with possible surges in use. I'm sharing the pipe at least 1000 ways, but 99.9% of the time I'm not even using my share. However, when I need the bandwidth it's there for me; I have peaks where I hit 1-2 Mbps for a few minutes. Since it's shared so many ways by other people in the same situation, the overall price is very cheap (a small fraction of what I'm paying for my broadband connection).
Jason
ProfQuotes [profquotes.com]
Re:P2p web content? (Score:1)
Places like FilePlanet... (Score:5, Informative)
This technology's out there, but it's nice to see it gaining some fairly widespread adoption.
levine
Re:Places like FilePlanet... (Score:2)
Of course, there's the issue that some patches or demos are 'exclusive' or prevent you from redistributing unless you're a big-name site through their EULA, which would make an ad hoc P2P network unable to offer a file.
Happy Puppy (Score:3, Informative)
That said, the OCN is open to any and all applications out there, so I'd encourage them to join the OCN.
Re:Happy Puppy (Score:1)
http://happypuppy.com/usr/register/index.jsp
Re:Happy Puppy (Score:2)
Re:Places like FilePlanet... (Score:1)
Fileplanet doesn't use this kind of technology simply because it defeats their business model.
(not really relevant, of course)
Re:Places like FilePlanet... (Score:1)
FilePlanet doesn't want to lessen people's dependency on paying for their central servers, so I don't think they'll be going this route. :)
--
impressive (Score:4, Interesting)
I'm wondering if maybe this is the future of blogs like Slashdot, with design, features, and content distributed the same way moderation and commenting are today. Creative Commons licensing would be a further boon.
This sort of next generation P2P network might be the weapon we need against the forces of evil, if only we are brave enough to use it.
Re:impressive (Score:1, Funny)
Re:impressive OT (Score:2)
Boromir, son of Faramir, King of Gondor and Minas Tirith
Uh... Boromir is the son of Denethor, and brother of Faramir.
In the future, you may wish to take the following advice: identity theft is only effective when you just manage to get the fucking facts straight!
Slight change to Content Addressable Web (Score:5, Insightful)
Scenario #1:
Assuming the Originator Apache responds with HTTP headers such as those in CAW to advertise site-wide mirrors like this:
X-URI-RES: http://urnresolver.com/uri-res/N2L?urn:sha1:; N2L
When the originator Apache site updates any documents, the URN resolver (or mirror) will silently fail without realizing which document has been updated. It would need to rescan the entire website, even when only one document has changed.
Scenario #2:
The opposite problem occurs with the Originator Apache responding with HTTP headers such as this:
X-URI-RES: http://untrustedmirror.com/pub/file.zip; N2R
The mirror will respond successfully, but will give an out-of-date version of the file without the client or the mirror realizing it. The mirror would then have to manually scan the website on a regular basis (even when nothing has changed) to prevent anything getting too out of date.
Scenario #3 (Solution):
However, if the Originator Apache responds with HTTP headers such as this:
X-URI-RES: http://untrustedmirror.com/pub/file-mirrors.list; N2Ls; urn:sha1
When the URN resolver or Mirror sees the SHA-1 hash mismatch, it knows which document needs to be updated, and can respond by doing so for just that document.
I realize that CAW is mainly designed with static files in mind (images, PDFs, ISOs) where updates occur rarely (or never). And no, I don't see Apache calculating the SHA-1 for dynamic pages like Slashdot anytime soon. However, updates do occur to images, PDFs, ISOs, etc. on occasion. I do think CAW(#3) could be used (and useful) for large, heavily subscribed RSS feeds without too much trouble. Maybe elsewhere in dynamic content.
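For the curious, those urn:sha1 strings are just the base32 encoding of the raw SHA-1 digest, as far as I can tell. A quick Python sketch under that assumption:

    import base64
    import hashlib

    def sha1_urn(path):
        # Hash the file in chunks; base32-encode the 20-byte digest to get
        # a 32-character string like the ones in the headers above.
        h = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return "urn:sha1:" + base64.b32encode(h.digest()).decode("ascii")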
Re:Slight change to Content Addressable Web (Score:4, Informative)
To get a better picture of how HTTP caching semantics work, I recommend trying out the Cacheability Engine [ircache.net] and entering a couple of sites to see how cacheable their content is.
Also, the OCN uses SHA-1 hashes for all content addressing as soon as it translates the URL to the SHA-1 URN. Thereafter, the content is only referred to by its SHA-1 URN, so there is no concern about version conflicts between mirrors/peers, because a single SHA-1 URN can only ever point to a single version.
Your point re: dynamic data is a good one. The OCN really isn't designed for delivering dynamic content because it changes too frequently and the cached copies on the peers would quickly become stale. However, we are doing some work with caching RSS feeds, which provides a nice trade-off between dynamic and static content.
Re:Slight change to Content Addressable Web (Score:2)
The CAW spec explains how originator servers work, but not how mirrors should operate.
By the way, the OCN-dev list does not appear to be accepting new members when you reply to the confirmation e-mail. Don't know if there's a human in the loop or what.
Help the slashdot effect? (Score:5, Insightful)
I mean, I might get the plugins if I'm dealing a lot with sites that use this technology, but how many people will be dealing with a lot of these?
And the sites that are using this are probably the ones that are used to high volumes of traffic, so they prepare for it. The average site that can't handle a slashdotting can't handle it because it generally doesn't need to.
Re:Help the slashdot effect? (Score:2)
Re:Help the slashdot effect? (Score:1)
My worry would be in the sites serving up the plugin being
BTW, what's with
If a
How would it work? (Score:2, Insightful)
Re:How would it work? (Score:2)
Trojans? (Score:1, Insightful)
Magnificent (Score:3, Funny)
That's what the checksums are for (Score:4, Informative)
By the way, your home page is ugly.... [heineken.com]
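The point being: a client never has to trust the peer it downloads from. It recomputes the hash of whatever bytes arrive and discards anything that doesn't match the URN the origin advertised. A rough Python sketch, assuming the base32 urn:sha1 convention:

    import base64
    import hashlib

    def matches_urn(data, advertised_urn):
        # Recompute the SHA-1 of the downloaded bytes; reject on mismatch.
        digest = base64.b32encode(hashlib.sha1(data).digest()).decode("ascii")
        return advertised_urn == "urn:sha1:" + digest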
Since when? (Score:5, Insightful)
Sarcasm aside, while I can see where they're going with this, I can't see it ever seriously taking off. Most of the world is still on 56k (or less), and I know I regularly hunt for things to delete so I can squeeze something else onto my hard drive.
Reliable 150MB file delivery in Freenet today (Score:3, Informative)
To recap:
Re: Reliable 150MB file delivery in Freenet today (Score:2, Funny)
> Some guy has made a bunch of 150MB ogg video files of Star Trek Enterprise episodes available, and they are all reliably downloadable (at about 40k/sec across a broadband connection) from Freenet.
And the amazing thing is, they're 100% as crappy as they were on television!
Re:Reliable 150MB file delivery in Freenet today (Score:2, Informative)
Yeah, but Freenet is still slow as fnuck. (Score:1, Funny)
That is what bandwidth limiting is for (Score:3, Informative)
Re:Reliable 150MB file delivery in Freenet today (Score:1)
Copying and pasting is obviously not as effective as it used to be...
You've missed the point. (Score:2)
Hence the fact that the post tells you to install the software BEFORE using the link.
Do some reading on freenet, what it does and how it works. It's quite interesting.
Re:You've missed the point. (Score:1)
Re:You've missed the point. (Score:2)
Re:Reliable 150MB file delivery in Freenet today (Score:1)
Last time I installed it, a single page took anywhere from 30 seconds to 3 minutes to load (if it loaded at all).
Seriously, Freenet is an excellent idea, but until it reaches a decent speed (e.g. anywhere close to regular HTTP) it's practically useless for casual surfing.
Here's an idea... (Score:5, Interesting)
Browser caches? (Score:4, Interesting)
I guess the big problem is still with the indexing.
Re:Browser caches? (Score:1, Interesting)
Good ol' fashioned caching is almost as good as p2p.
Re:Browser caches? (Score:3, Interesting)
Sharing the browser's cache content directly, without much care, will also share webmail messages, personal data, Slashdot preferences: whatever you don't want to share.
Eh? Sounds a bit like... (Score:2, Insightful)
Any existing mirror or web site can easily join the OCN by tweaking the HTML on their site
Sounds a lot like it to me - especially the bit about it being like a virtual server. I suppose if it stored images and stuff then it'd be a bit better, but will it ever match Google's speed and breadth of content?
what about standards? (Score:3, Insightful)
In fact I'm pretty astonished none of these organisations has ever picked up P2P.
wow, I see a lot of dumb replies. (Score:3, Insightful)
What this "could" mean is, say, your favorite distro has just been updated; we all know how hard it is to download 3 ISOs while they are in high demand. The thing about OCN and Onion Networks-type software is that high demand and a high download rate actually help availability. Plus everything is authenticated and logged, so worms/trojans/fakes really aren't a problem.
As far as OCN goes, it's not for warez and DivX. I think it's intended for geeks' free software distribution. So love it, and try to innovate on it.
Re:wow, I see a lot of dumb replies. (Score:1)
That's not a butt-plug, stupid!
k2r
Re:wow, I see a lot of dumb replies. (Score:1)
If you're being sarcastic with the distro comment: You may note that the article says nothing of getting the latest distros, but of making redundant mirrors on a P2P network for sites that get overloaded or
The other thing is I don't understand how they can prevent people from using this for illegal files. Anyone can enable their website [onionnetworks.com] to support this type of downloading and there is no central moderation. People were screaming "READ THE ARTICLE" at one guy who posed the question of illegal files, but as you can see, it's really quite simple to slap that JavaScript code on any page regardless of whether it has an OSI-approved license. OCN isn't meant for DivX or warez, but it wouldn't be hard to make it that way; I see no way of preventing this.
[sarcasm]Oh, and by "gay" you mean this plug-in is happy? Because using that as a derogatory term really makes your case stronger with another dumb reply.[/sarcasm]
Codecon Conference - P2P programming (Score:3, Interesting)
How far away is this from.... (Score:5, Interesting)
p2p://www.cnn.com/
style link for Explorer/Mozilla/Opera/Konqueror?
Turn everyone's browser cache into p2p.
CNN's probably a bad example, as the content would have to be updated more frequently... And you'd need some way of having a "revision model", so that sites could be updated. I guess it would be up to the clients to ditch old versions of pages.
Might also need some sort of (eep!) central authority to verify pages were who they claimed to be (so I couldn't take over CNN, for example). Maybe just signed keys for each content provider would be good enough?
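Something like this sketch, using a modern signing library purely for illustration (the `cryptography` package and Ed25519 are my stand-ins here, not part of any deployed scheme):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The provider signs its page once with a private key it keeps secret...
    provider_key = Ed25519PrivateKey.generate()
    page = b"<html>...front page...</html>"
    signature = provider_key.sign(page)

    # ...and a peer verifies the page against the provider's published
    # public key before serving it, so nobody can impersonate the site.
    public_key = provider_key.public_key()
    try:
        public_key.verify(signature, page)
    except InvalidSignature:
        raise SystemExit("page is not from the claimed provider")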
Re:How far away is this from.... (Score:2)
I have recently performed a cursory exploration of using connected users' caches as the source of a p2p web mesh. Yes, (some) people would scream bloody murder at having their "privacy" invaded, but then again they probably aren't the target market anyway.
Re:How far away is this from.... (Score:2)
Revolutionary thinking (Score:3, Flamebait)
Re:Revolutionary thinking (Score:1)
Re:Revolutionary thinking (Score:2)
Perm node.
24/7 uptime, 384k upstream.
30GB datastore and counting.
Re:Revolutionary thinking (Score:2)
Only for "open" content and "open" software (Score:3, Interesting)
"Multimedia files are eligible to be distributed via the OCN if they are either released into the public domain or are available under a Creative Commons license that allows the content to be freely copied."
and
"Software is eligible to be distributed via the OCN if it is released into the public domain or is licensed under an Open Source license. The license must be an OSI Approved License that adheres to the Open Source Definition."
Honestly, those constraints seem to seriously restrict the real usefulness of the network. It means if I want to put up a webpage and publish the contents with OCN, I need to go through all the rigamarole to make sure that everything's copacetic with whatever the "approved" licenses are, instead of just saying "ok, stick this out there." Which I may not want to do because if I've just created some magnificent thing (music file, video file, whatever) I may not necessarily want the license to allow anyone to download it and modify it any way they want and then essentially claim it as their own.
Software is one thing, but online content is something else. Honestly, how many large "media files" have you seen that are licensed under an "Open Content" license?
Sure, it's nice to have something like this that caters explicitly to the OpenSource crowd, but with those constraints, I can't see it as used for very much other than putting new versions of GPL'd software packages online.
Re:Only for "open" content and "open" software (Score:2)
They don't have to be "Open" as in freely modifiable. They could be licensed under a Creative Commons 'redistribution without modification' licence. That would make perfect sense for movie trailers, game videos etc. It would end the stupidity of movie studios parking their trailers on apple.com, and of game sites requiring paid "premium" accounts to download files which are basically advertising.
Also, people putting out videos of their autonomous Lego stegosaurus or whatever could use this.
Re:Only for "open" content and "open" software (Score:2)
videos, rather.
How about using the CC RDF License Engine? (Score:3, Informative)
The CC License Engine framework really does all that is needed to identify a license that would allow the P2P network to redistribute the material, so why not use it?
Or am I missing something now?
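For concreteness, the sort of check I have in mind, as a rough sketch; it assumes the early CC RDF vocabulary (a cc:permits element pointing at .../Distribution) embedded in the licensed page:

    import re

    # Look for <cc:permits rdf:resource=".../Distribution"/> in the RDF
    # block that Creative Commons licenses embed in licensed pages.
    PERMITS = re.compile(r'<cc:permits\s+rdf:resource="[^"]*/(\w+)"\s*/?>')

    def allows_redistribution(page_source):
        return "Distribution" in PERMITS.findall(page_source)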
Not really surprised... (Score:5, Interesting)
I'm the type of guy who doesn't like sharing my bandwidth, but I'd be willing to make an exception for Open Source stuff just on the grounds that it helps alleviate the costs of hosting free stuff.
Re:Not really surprised... (Score:2)
I, and hundreds of other sources, are sharing OpenOffice, and all the RedHat 8.0 ISOs [slashdot.org], among other things.
--
They need to be more descriptive. (Score:5, Informative)
This is terrible.
We complain when Gator is loaded as an 'add-on' to our system, yet we don't mind that we can't download some content without loading some P2P app which then uses our disk space and Internet connection to serve others?
They need to put up a specific message that says, in effect, "This download client will significantly speed up the process of obtaining this file. Once downloaded your computer will allow other people to download this same file, or portions of it, from your computer so they can gain the same speed benefit you will get. There is no security risk, and you can stop the client from letting others download this file by moving or deleting the file, or ending the client by doing x, y and z. If you wish to simply download the file normally without installing this client, click here - otherwise click 'OK'"
Yes, we all understand what P2P means - we are donating part of our computer and network to the P2P network for as long as we are connected to the internet. But this is not common terminology - ask a non-computer expert who has spent hours downloading music from their favorite P2P app what the P2P app does, and all they know is that they can get "free" music with this cool program. They often have no idea that others are downloading music from their computer, etc.
This may slow down adoption, but the backlash that may come of it is not worth the extra adoption it may gain without full and well-explained disclosure - and without a method to download the file normally.
-Adam
Re:They need to be more descriptive. (Score:4, Informative)
We should also probably explicitly state that absolutely no adware/spyware/shitware will be installed on your machine.
Re:They need to be more descriptive. (Score:1)
Creative Commons Share Alike License and the GPL (Score:2)
I think that the CC project is a great idea and some kind of P2P distribution is great too. I do have some minor concerns though; one is over their Share Alike license.
On the FSF web site there is a short list of Licenses For Works Besides Software and Documentation [fsf.org] and the Creative Commons Share Alike license is not mentioned, so I sent the FSF an email about this and the opinion seems to be that the Share Alike license would not be considered GPL-compatible, which I think is a shame. The email about this can be found on the cc-licenses list archive [ibiblio.org]
Innovative!!! (Score:1, Redundant)
Relationship with OpenContent? (Score:2, Interesting)
Very cool... (Score:2)
I hope this works. It'd be a nice alternative to porn and mp3's on P2P networks. (Even though it is its own P2P network). Maybe distributed CVS should be next? That way we don't overload a server, but we can still download just the updates we need?
Re:Very cool... (Score:1)
Maybe distributed CVS should be next?
I think this is one of the aims of the Arch project (I can't post the link now because I can't find Arch's new home on Google... strange. I'll have to do it later when I'm on my own machine). It's a shame Arch doesn't get more publicity/support, it's an interesting project.
Re:Very cool... (Score:1)
Found it!
http://www.fifthvision.net/open/bin/view/Arch [fifthvision.net]
Keeping the content legit... (Score:1)
What we need is a P2P network that outright will not allow certain types of content. That's the only way we'll ever keep the SNR up and avoid giving universities and ISPs an excuse to restrict usage.
Ok, I now got a reason to install Squid (Score:1)
Nice to see that yet again I got a reason to fire up that compiler.
Gen toooo slow (Score:2)
That 40MB file downloads at a painful 2k because the mirror you picked is a bit busy at the moment.
And why doesn't it continue to download while compiling? P2P networks let you play a file and download at the same time.
Many more OT gripes about emerge... but I'll save them for a patch.
the WWW *is* content-addressable (Score:1)
Regarding:
The World Wide Web is "the universe of network-accessible information [w3.org]", i.e. anything with a URI, including URIs that are not tied to a particular hostname.
The Web already includes non-location-based URIs like mid: [ietf.org] (for referring to message-ids), and urn:sha1: for referring to a specific set of bits by their checksum.
This proposal seems like a decent way of bridging HTTP-space with URN-space, but please remember that the Web is more than just HTTP. (see also: URIs, URLs, and URNs [w3.org])
Anyway, it seems to me that sites that tend to suffer from slashdotting are:
those that use dynamically-generated pages for what is basically static content: this problem can be fixed by sites making sure their content is cacheable [mnot.net], and further deployment of HTTP caches. (I'm not convinced a p2p-style solution is the solution here.)
those with large bandwidth needs (kernel images, linux distribution .iso's, multimedia): as p2p software becomes more mature and widely deployed, everyone will have a urn:sha1: resolver on their desktop (pointing to their p2p software of choice), then whenever a new kernel is announced, the announcement can say:
Linux kernel version 2.4.20 has been released. It is available from:
Patch: ftp://ftp.kernel.org/pub/linux/kernel/v2.4/patch-2.4.20.gz
a.k.a. urn:sha1:OWXEOVAK2YJW3G6XSULXDWFCNWTX7B2K
Full source: ftp://ftp.kernel.org/pub/linux/kernel/v2.4/linux-2.4.20.tar.gz
a.k.a. urn:sha1:PPWXYMA32YNDNO35UD3IQTCWBVBYK5DC
and people can just fetch the files using urn:sha1 URIs instead of everyone hitting the same set of mirrors. (gtk-gnutella already supports searching on urn:sha1: URIs [sourceforge.net])
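A desktop urn:sha1 resolver along these lines could be tiny. Here's a toy sketch of RFC 2169's N2L ("URN to URL") service that 302-redirects a urn:sha1 query to a known copy; the hard-coded table stands in for a lookup against your local p2p client:

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse

    # Hypothetical in-memory table mapping URNs to known locations; a real
    # resolver would ask the local p2p software instead.
    KNOWN = {
        "urn:sha1:OWXEOVAK2YJW3G6XSULXDWFCNWTX7B2K":
            "ftp://ftp.kernel.org/pub/linux/kernel/v2.4/patch-2.4.20.gz",
    }

    class Resolver(BaseHTTPRequestHandler):
        def do_GET(self):
            # N2L: GET /uri-res/N2L?urn:sha1:<hash> -> 302 to a known URL.
            parsed = urlparse(self.path)
            if parsed.path == "/uri-res/N2L" and parsed.query in KNOWN:
                self.send_response(302)
                self.send_header("Location", KNOWN[parsed.query])
                self.end_headers()
            else:
                self.send_error(404)

    HTTPServer(("localhost", 8080), Resolver).serve_forever()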
Already discussed, and then... (Score:1)
Just my $0.02,
La Camiseta
Re:Is this the silliest thing ever? (Score:1)
OT: spelling checkers (Score:1, Offtopic)
And here's a pullet surprise winning poem too accompany it:
I have a spelling checker.
It came with my PC.
It plainly marks four my revue
Mistakes I cannot sea.
I've run this peom threw it.
I'm sure your pleased two no.
Its letter perfect in it's weigh;
My checker tolled me sew.