Packet Juggling - Floating Data Storage

Filthmaster writes "I just saw an interesting paper that has been posted to bugtraq, full-disclosure and vulnwatch. It deals with the principles of stealthily using network infrastructure as either short-term or long-term storage. Not sure if I'm ready to implement it, but it makes interesting food for thought." There's also a mirror up.
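The core trick the paper describes, keeping data "alive" only as packets in flight by continually bouncing it off remote hosts, can be sketched in a few lines of Python. This is a toy illustration rather than the authors' code: it assumes scapy is installed and raw-socket privileges are available, and RELAY_HOST is a placeholder for any host that answers pings (an ICMP echo reply returns the request's payload unchanged).

    # Toy sketch of "packet juggling": keep a chunk of data alive by
    # continuously bouncing it off a remote host inside ICMP echo payloads.
    # Requires scapy (pip install scapy) and raw-socket privileges.
    from scapy.all import IP, ICMP, Raw, sr1

    RELAY_HOST = "192.0.2.1"   # placeholder: any host that answers pings
    CHUNK_SIZE = 1400          # stay under a typical Ethernet MTU

    def juggle(data: bytes, rounds: int = 3) -> bytes:
        """Send data out in ICMP echo requests and recover it from the echoes."""
        chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
        for _ in range(rounds):
            echoed = []
            for seq, chunk in enumerate(chunks):
                reply = sr1(IP(dst=RELAY_HOST) / ICMP(seq=seq) / Raw(load=chunk),
                            timeout=2, verbose=False)
                if reply is None or not reply.haslayer(Raw):
                    raise RuntimeError("chunk %d lost in transit" % seq)
                echoed.append(bytes(reply[Raw].load))
            chunks = echoed   # between bounces the data exists only on the wire
        return b"".join(chunks)

    if __name__ == "__main__":
        print(juggle(b"bytes that spend most of their life on the wire"))

The obvious catch, raised repeatedly in the comments below, is that a single dropped echo loses that chunk for good.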
  • .. as long as the power stays up..
    • would it no longer meet its qualification of being "volatile memory" if the power went down?
      • point taken, Lord Bitman. However, the word 'reliable' is given greater prominence by the author:
        The following paper explores the possibilities of using certain properties of the Internet or any other large network to create a reliable, volatile distributed data storage of a large capacity.
        • The use of mail servers (Class B) more adequately meets the term "reliable", but is still dependent on power being available at critical times.

          These situations seem to be for those cases where the security of data far outweighs the ability to retain it. Check your mail queue. Maybe you have NSA-encrypted documents right now :)
        • I don't know, this sounds like something that, with enough forethought, could work very well.
          Of course, that would only work if nobody actually knew about it. The fool published the idea, now no one can do it!
    • Like the story where they discovered a way to create extra space (from another dimension, presumably). At first they used it to create extra closet space (everybody needs that, right?) but then people used it to create extra living space and extra apartments. And so extra space got tacked on to extra space, and pretty soon almost everybody was living in borrowed space.


      And then power failed on one of the first units.

    • This is all fine, but yet again it raises the simple question that egotistical people never ask: what are the consequences of this?

      Imagine an internet paedophile ring using these techniques to push TCP or SMTP packets out into the world, loaded with all the condemning data, but with no evidence of who did it, or even that the data itself exists. How would anyone know?

      I think this is bad science, but interesting nonetheless, and truthfully something I have already delved into myself.. :/
    • there could well be redundancy built in at both ends, and geographical algorithms to spread redundant nodes apart, to lessen the impact of wide-scale power outages.
  • Bandwidth? (Score:4, Insightful)

    by shish ( 588640 ) on Monday October 06, 2003 @07:21AM (#7142462) Homepage
    Won't everyone pinging their gigabytes of data back and forth totally screw the net, a la slammer?
    • So better to send your data to the moon and resend whatever is reflected ;-)
    • by Zocalo ( 252965 ) on Monday October 06, 2003 @07:29AM (#7142499) Homepage
      Ah, but in this context that could be a good thing since a slower Internet = more latency = longer TTL on your data. Take the ping for example; if it takes twice as long for your ping to echo back with your data, then you only need to retransmit back to storage at half the rate.

      I wasn't trying to DoS the Internet, your honour... I was trying to improve data retention times! ;)

      • That's not completely true. A ping only *seems* to take so long because it hops across a lot of routers, and the IP TTL is decremented at each router during transit anyway.

        In essence, what a DoS or 'slower internet' would do is just congest the network so much that a ping will probably not reach the next hop, but just get dropped because the ('slow') router has too much data to process.
    • It depends on the medium. Pinging data over the bongo drums [slashdot.org] wouldn't screw over the 'net, though anyone can probably think of some other adverse effects.

      However, the speed of sound [google.com] (or, rather, the lack of it compared to an electric signal) makes it an interesting option. Yes, storing data in TCP/IP packets represented in sound waves sounds cool.
    • Yes, and there is an even bigger problem with trusting ICMP: like UDP, it is connectionless and unreliable, so there are no guarantees that your data will ever get to the other host, much less get back to you.

      Have you ever tried doing a traceroute and then pinging every hop along the way? Usually once I get about 5 or 6 hops out I start getting about 2-3% packet loss on pings.

      Also, another problem is that many backbone providers may decide to drop or rate-limit ICMP traffic (since it's only supposed to be there for troubleshoot
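To put rough numbers on the latency argument above: the amount of data you can keep "stored" in flight is approximately bandwidth times round-trip time, and keeping it alive means re-sending everything once per RTT, i.e. at full link speed no matter how long the RTT is. A quick back-of-the-envelope check with illustrative figures (the 10 Mbit/s link is an assumption, not something from the article):

    # bandwidth-delay product: bytes that exist only "on the wire"
    def in_flight_bytes(bandwidth_bps: float, rtt_s: float) -> float:
        return bandwidth_bps / 8 * rtt_s

    for rtt_ms in (50, 100, 200):
        cap = in_flight_bytes(10e6, rtt_ms / 1000)   # assumed 10 Mbit/s uplink
        print(f"RTT {rtt_ms:3d} ms: about {cap / 1024:.0f} KiB held in flight")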
  • by Amiga Lover ( 708890 ) on Monday October 06, 2003 @07:23AM (#7142476)
    If it were quick enough and timed well enough, could network-bounced packets be enough for some really quick swapped-out swap space? Gigabit ethernet gets around 1 ms to my next machine, and that's 10 times quicker than my hard drive (10 ms access time), so if I could store swap in that space, would that work?

    I was thinking of tunneling ssh over sms before this, but that sounds just silly now.
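Rough numbers for the swap question, using the poster's own figures plus an assumed 1 Gbit/s link: the access-time win over the disk is real, but the wire itself can only hold a bandwidth-delay product's worth of data, so the far end still has to park the pages in its own RAM.

    link_bps, rtt_s, disk_seek_s = 1e9, 1e-3, 10e-3   # 1 Gbit/s, 1 ms RTT, 10 ms seek
    print("access-time advantage over disk:", disk_seek_s / rtt_s)             # about 10x
    print("bytes actually in flight on the wire:", int(link_bps / 8 * rtt_s))  # ~125,000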
  • by jubalj ( 324624 ) on Monday October 06, 2003 @07:24AM (#7142480) Homepage
    Why use your own network when you can..

    6seryoeyEe O.ot..>u&6eOyeUWrong loader, giving up...f1Afaef1UDf efPAMSfIr f=PAMSu e }eoACfuuEu1E1OeIr*uu uuAUfayyfAafafayyfaI1UIeS1AOA6Ee PAQuo1AOA6YoIrutEe A1AuoEe O1A AuIr!AOEe A
  • This would be very interesting with RAID 5.
    • Yes.

      1. Imagine a beowulf cluster of hosts sharing their ram via the network, possibly with supernodes acting as RAID-like controllers.

      2. Figure out a way to make it "secure" on a public network. Encryption ?

      3. ???

      4. Profit !
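A minimal sketch of the RAID-5-flavoured idea in the list above: stripe a block across N peers plus one XOR parity stripe, so any single peer dropping its share can be rebuilt from the rest. The "peers" here are plain byte strings standing in for remote hosts; the network transport and step 2's encryption are left out.

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def stripe_with_parity(block: bytes, n_peers: int):
        size = -(-len(block) // n_peers)               # ceiling division
        block = block.ljust(size * n_peers, b"\0")     # pad to an even split
        stripes = [block[i * size:(i + 1) * size] for i in range(n_peers)]
        parity = stripes[0]
        for s in stripes[1:]:
            parity = xor_bytes(parity, s)
        return stripes, parity

    def recover(stripes, parity, lost_index):
        """Rebuild the stripe at lost_index from the survivors plus parity."""
        rebuilt = parity
        for i, s in enumerate(stripes):
            if i != lost_index:
                rebuilt = xor_bytes(rebuilt, s)
        return rebuilt

    stripes, parity = stripe_with_parity(b"data parked in other people's RAM", 4)
    assert recover(stripes, parity, 2) == stripes[2]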
  • it's alive! (Score:5, Funny)

    by geoff lane ( 93738 ) on Monday October 06, 2003 @07:25AM (#7142485)
    First you give it comms, then unlimited CPU and now distributed memory.

    Can Skynet be far behind...
  • by CoolVibe ( 11466 ) on Monday October 06, 2003 @07:28AM (#7142495) Journal
    Q: Why the hell are you flooding the shit out of my network?
    A: Oh, I'm just storing data temporarily.

    Seriously, the idea is interesting, but I doubt that many network operators will like the extra network load. It would be interesting to build a SAN in this manner, just for academic's sake ;-)

    Oh, and the example with Microsoft's Exchange servers made me chuckle. Finally a reliable storage "medium" from Microsoft! Go figure :)

    • It would be interesting to build a SAN in this manner, just for academic's sake ;-)
      Is that in a "Oh shit here comes the RIAA, hide the porn and MP3's" academic sake?
        • Interesting. Using the connection latency of a connection to the RIAA webserver to store your favorite Britney Spears (bleagh) mp3s. Sure, man. But strictly academically, right? :)
        • Re:Great excuse! :) (Score:1, Interesting)

          by Anonymous Coward
          Think we could get them to sue themselves for illegally storing and distributing copyrighted material? Or would my mp3s now be legit (after all, the RIAA sent them to me)?
          • When the RIAA page was hacked (I say this as if it only happened once..) the new Linkin Park CD ended up on their webserver publicly. I believe this was a few days before it hit shelves, too. End result: it just pisses people off.

            You can pull DeCSS from Disney's nameservers if you want; it doesn't really mean anything.
  • Already there (Score:4, Informative)

    by Rogerborg ( 306625 ) on Monday October 06, 2003 @07:31AM (#7142503) Homepage

    When our network fileserver fills up (as it does twice a week or so), I start emailing things to myself through the corporate mail server. When the mail server fills up, I start adding to my intranet HTTP pages. When all else fails, I start sending (encrypted) data back to myself via my ISPs external mail servers.

    It would of course be far better for the company if they just sprang for some new drives in the fileserver, but engineer and bandwidth costs don't appear as capital expenses, so they are viewed as being effectively "free". Sigh.

    • I mean, really. If they've filled the disks then they need more space, why on earth are you fucking about in this bizarre way?

      • >If they've filled the disks then they need more space, why on earth are you fucking about in this bizarre way?

        Because the truly bizarre part is that it becomes my fault if I can't get work done because we've run out of space. Welcome to the corporate world.

        • If keeping disk space monitored is part of your job, then it *is* your fault, isn't it ?
          • No, but I use the network fileserver and one of the lunix boxen to perform builds. If it runs out of space, I can't build. My short term solution is to clear up space by sending myself enough of my own data to make room on the main fileserver.

            I never said that it made sense, just that I'm doing it.

            Of course, it would make more sense for the users with GB of data to get rid of some of it, or for the admin to implement quotas. But those users tend not to be the ones with customer deliverables, and, hey

        • I'm a sysadmin in a very large corporation and when we run out of space, I tell management. In fact, I tell them in advance, regularly, that's what capacity planning is all about.

          If they decide not to do anything, like let me buy more disk or a bigger file server, I leave the space full and suggest files which might be trimmed or people who are using more space than might be reasonable.

          It's their problem, they are the ones with the purse strings and it's their production which is being halted by their stu
          • >No it isn't, you're just the messenger

            Hey, do you work in one of those places where cowardly, incompetent spiteful management really don't shoot the messenger? Wow, what's that like?

            By the way, it sounds like I'm complaining, but I'm not really. It turns out that I get paid the same no matter what I do, so it all works out in the end.
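The mail-yourself overflow trick described at the top of this thread, as a minimal sketch using smtplib from the standard library. The server name and address are placeholders, and real corporate mail servers have size limits and retention policies that make this exactly as fragile as the thread suggests.

    import smtplib
    from email.message import EmailMessage

    SMTP_HOST = "mail.example.com"   # placeholder corporate relay
    ME = "me@example.com"            # placeholder address

    def park_file_in_mail_queue(path: str) -> None:
        """Mail a file to yourself so it sits on the mail server for a while."""
        msg = EmailMessage()
        msg["From"] = ME
        msg["To"] = ME
        msg["Subject"] = f"parked: {path}"
        with open(path, "rb") as f:
            msg.add_attachment(f.read(), maintype="application",
                               subtype="octet-stream", filename=path)
        with smtplib.SMTP(SMTP_HOST) as smtp:
            smtp.send_message(msg)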

  • by FastDownload ( 686456 ) on Monday October 06, 2003 @07:31AM (#7142506) Homepage

    While the authors try to use existing protocols to simulate temporary storage in the Internet, we are working on a scalable, shared storage resource that is open to the community.

    We currently have over 20 TB of storage around the world available in the public Logistical Networking Testbed and other groups have another 10-20 TB provisioned in private use testbeds.

    In addition to storage, we are also working on providing simple computational services at the storage nodes (work on the data in place while it is stored rather than moving it to computation centers).

    For more info, visit the LoCI Lab at http://loci.cs.utk.edu [utk.edu].

  • by ControlFreal ( 661231 ) <niek@nospAM.bergboer.net> on Monday October 06, 2003 @07:34AM (#7142512) Journal

    ... in which complete computer memories worked like this: those were called mercury delay line memories, in which pressure waves in mercury lines basically held information.

    The UNIVAC I had such an 18-channel [ed-thelen.org] memory. More information can be found here [ed-thelen.org], here [ed-thelen.org], and here [wikipedia.org].

    These channels could hold a whopping kilobit!

    • In future, we might be using light in fibre loops instead of sound waves in mercury delay lines, to act as computer memory. I googled for light loop memory --- look what turned up:

      Fiber loop makes quantum memory [trnmag.com]

    • It appears they are aware of this. In the Legal Notice section at the end of the paper, they have some weird stuff (copied below) which didn't make any sense until I read the links you provided. It still doesn't make a LOT of sense, but it does appear they're jokingly making an obtuse reference to delay-line memory.

      A do-it-yourself kit (long wire, speaker + microphone, shovel and a driver disk) will provide an affordable, portable and reusable way for extending storage memory on portable systems.

      It is e

    • We did this with X.25 [rad.com] packets in the 1980's.
    • They also used systems based on signal delays through wire and through quartz. (The wire system is also described on the Wikipedia page [wikipedia.org].)
  • by melonman ( 608440 ) on Monday October 06, 2003 @07:35AM (#7142515) Journal
    Just burn a CD a day and post it to a non-existent address on the other side of the world. That way you can probably keep a terabyte of data in the air without taking any space in your office, and, unlike TCP/IP, you may be able to reuse the wrappers.
    • yeah but the latency time is what kills ya...make sure you've got swap turned off...
  • I had this idea too once, and I told my colleagues, who thought I was a nut-case as a result.

    Someone once said that if you have a truly good idea you don't need to worry about anyone stealing it: you'll have to try very hard to get anyone to listen at all.

  • by zen parse ( 607603 ) on Monday October 06, 2003 @07:43AM (#7142537)
    Not everyone can benefit, because of the side effects of its parasitic nature.

    The amount of storage the text says this system gives would be the total available for ALL users of the system. More users, less available storage.

    Parasites do better when there is little competition from other parasites, but if the system gets infested, the host it lives off may die. Or someone may develop a cure.

    Either way, after a certain threshold, the more popular any system using this gets, the less useful it would be.

    Just some random thoughts I had when I was talking about a similar idea with someone.

    • Not everyone can benefit, because of the side effects of its parasitic nature

      "If I come back as an animal in my next lifetime, I hope it's some type of parasite, because this is the part where I take it EASY!"
      - Jack Handey
  • Did anyone else read this as Pocket Juggling?
  • Wasn't there an old (pre theregister.co.uk) BOFH article where the Bastard tricked his boss into proposing the use of the network as a storage medium?
  • ... based around the premise of an 'industry' of 'dataspace', where data is converted into radio waves and 'flung' out into space to be 'stored', retrieved on the other end, and bounced back again, repeatedly, over millions and millions of miles, as a way of getting 'cheap' storage for stuff that you want to have around only on a periodic basis.

    I guess I should've made it a 'whitepaper' instead. I got a B, though, that was nice!
  • Delay Line Memories (Score:3, Interesting)

    by Gnissem ( 656009 ) on Monday October 06, 2003 @08:11AM (#7142623)
    There's never anything new... in an electronics lab in college in the '60s I built a delay line memory, which was nothing more than a very large coil of wire and some rather simple circuitry that would shove bits into one end and 'catch' them at the other side and recirculate them. You used timing to specify the address and could read or update as the bit came by. Another commercially used variation of this used a column of mercury and cycling sound waves... see http://ed-thelen.org/comp-hist/mak-UNIVAC-I-delay-line.doc
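A toy software model of the recirculating delay line described above, for anyone who never got to build one: bits march through a fixed-length line, fall out the far end, and are fed straight back in, so reading or writing a given "address" just means waiting for that bit to come around again.

    from collections import deque

    class DelayLine:
        """n bits circulating; address a is at the output whenever tick == a."""
        def __init__(self, n_bits: int):
            self.n = n_bits
            self.line = deque([0] * n_bits)
            self.tick = 0

        def step(self, write=None) -> int:
            bit = self.line.popleft()            # the bit emerging right now
            self.line.append(bit if write is None else write)
            self.tick = (self.tick + 1) % self.n
            return bit

        def read(self, addr: int) -> int:
            while self.tick != addr:             # wait for the bit to come around
                self.step()
            return self.step()

        def write(self, addr: int, bit: int) -> None:
            while self.tick != addr:
                self.step()
            self.step(write=bit)

    mem = DelayLine(1000)                        # a whopping kilobit
    mem.write(123, 1)
    assert mem.read(123) == 1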
  • by |>>? ( 157144 ) on Monday October 06, 2003 @08:16AM (#7142641) Homepage
    Did anyone else get the Einstein reference:
    Article: would it work without oranges?
    which to my mind refers to:
    Einstein: The wireless telegraph is not difficult to understand. The ordinary telegraph is like a very long cat. You pull the tail in New York, and it meows in Los Angeles. The wireless is the same, only without the cat.
    It could just be my mind - just fell down in the bath and hit my head falling over the edge...
  • I recall seeing this mentioned in a hacker "e-zine" article many moons ago. Perhaps someone could dig it up?
  • BOFH (Score:3, Insightful)

    by i.r.id10t ( 595143 ) on Monday October 06, 2003 @08:36AM (#7142716)
    Didn't the BOFH convince the Boss that you could store data on the network cables, causing him to order quite a few spools of the stuff?
  • by MadFarmAnimalz ( 460972 ) on Monday October 06, 2003 @08:44AM (#7142756) Homepage
    While it's good to see people exercising their minds like this, it's also good to keep in mind that some things should not be regarded as more than just exercises.

    With this particular scheme, the inherent complexity (needing interfaces to all of these common network protocols) and the risks (there must be a billion ways to lose data this way) basically mean that the cost of storage under this scheme would be really high.

    Disproportionate storage costs per unit data automatically means no real-world application outside of brain exercising.
  • by red_dragon ( 1761 ) on Monday October 06, 2003 @08:46AM (#7142764) Homepage

    It was 1997 when Simon the BOFH wrote about such a contraption, which won him the IT Idiot Award for Least Intelligent Supervisor. [iinet.net.au]

    ... This year I've decided to sell the boss on using the network as a storage medium. I casually drop a couple of remarks until the boss decides to channel his massive intelligence away from tying his shoelaces and onto the matter at hand.

    "It's simplicity itself!" I cry "We've got these Gigabit Ethernet switches all around the place that we just aren't using! Instead of letting them go to waste we could be sending data continuously around them until it's needed which would actually cut down on the amount of physical disk storage we would need! And just think of the time we would save with read and write latency when the data's already on the net!"
    ...

  • "Most of us, the authors of this paper, have attempted to juggle with three or more apples"
    Try juggling three iMacs (they're fragile ballistic objects, right?) and let's see how your arms feel!
  • This is not new (Score:5, Interesting)

    by eris_crow ( 234864 ) <`moc.niadle' `ta' `worc_sire'> on Monday October 06, 2003 @08:52AM (#7142801) Homepage
    When I was in college back in the Good Old Days (tm) of the Internet, I had a friend who sometimes stored files "in transit", so to speak, by emailing them to himself with explicit routing in the To address. He would send the message on a long circuit of several machines around the country and he had a script to automatically reforward them once they got back if he didn't save them within a certain period of time. Back in the day you could do this by setting the To address to something like "@hostone.com,@hosttwo.com,me@myhost.com" (see RFC 821 sec 3.6) and since the network and the machines on it were much slower in those days, if you added enough hosts then you could introduce a significant delay and have lots of files stored in transit (actually, on the various mail servers) even though your own disk quota was nearly used.

    Explicit routing is long gone, but it is an interesting early manifestation of the same principle: the network is my hard drive.

  • Let Google and thousands of news servers archive all your data for you.
    • If anyone does it, the guys at LUFS [sf.net] will write a driver for exactly that. "Remember, everything is a file, and if it isn't, it should be!"

      I can see a driver that posts to encrypted anonymous groups [google.com] optionally via anonymous remailers and then checks google for updates. Post 2 messages per chunk: a pointer to each chunk's subject line, and each chunk. The data in the pointer-posts should be enough to formulate a directory structure. (Hey, I might be onto something here...) Now, for the UI. Something that can
  • Satellites ! (Score:4, Interesting)

    by wtarreau ( 324106 ) on Monday October 06, 2003 @08:57AM (#7142837) Homepage
    I remember that when I was at the University, I explained to someone that with satellites at 37,000 km from us, information took a quarter of a second to go there and back to earth. So if you use a 500 Mbps link, you can store 15 MBytes of data in the distance between, on an absolutely zero-cost medium, for 0.25 seconds at a time. And if you were confident enough in the reliability, you could even put a bouncer on earth whose only goal would be to resend the stream to the satellite and keep it looping. You would have 15 MBytes of free storage with an average access time of 125 ms (250 ms max). Although absolutely useless, that would be as fun as TCP/IP over pigeon routing :-)

    Willy
    I will never put a sig.
    • And if you were confident enough in the reliability, you could even put a bouncer on earth, which goal would only be to resend the stream to the satellite and keep it looping.

      I remember Scotty did this in an episode of Star Trek; he bounced back and forth as a transporter beam for 70-odd years until Georgy La Forge came along and rematerialized him.
      • That's GeorDy, and I'm afraid he used the local buffers, while keeping the beam itself disabled, forcing the buffers to refresh every [trail away into technobabble here]...
        ---

        It's nerdy, just a different kind of nerdy...
    • on an absolutely zero cost medium

      This seems a little problematic, since the actual costs are in maintaining a signal strong enough to overcome a minimum signal-to-noise ratio for the data to remain intact.

      So as storage capacity (distance) increases, it would seem that cost would increase, because the power needed to keep that signal alive would need to increase...
    • So if you use a 500 Mbps link, you can store 15 MBytes of data in the distance between, on an absolutely zero cost medium,

      I realize this is supposed to be funny (even though it is modded as "interesting"), but...

      Would somebody please explain to me how I can get a 500 Mbps link at absolutely zero cost?

  • Much simpler: just post all your stuff on Slashdot. With all the random garbage surrounding it, no one will notice anyway.

    <storage>095257baf2ba839ec8605869dd3ddbd1</storage>

  • What about loop-connecting an optical cable -- tech it up a little and use that as storage? :)

    While we're speaking of ideas -- has anyone developed a torrent-style mp3 radio? That would make it cheaper and easier to set up an mp3 radio. Or how about streaming video? That would've been cool :)

    • by Anonymous Coward
      That would be cool, but the problem is that the strength of BitTorrent is that users can share different parts of the file they are downloading. Since real-time streaming mp3 radio or video would be sequential in nature, everyone would be downloading the same thing, thus diminishing its value.
      • Thanks for a good answer
      • everyone would be downloading the same thing, thus diminishing its value.

        Just means it cannot work like BitTorrent. There are other ways to distribute data. What you need to do is exploit the fact that everybody wants the same data at the same time. Build a tree structure of clients, each repeating everything it receives to two or three other clients, possibly with some redundancy, such that leaves repeat to different locations in the tree. The tricky part will be to dynamically compute a least cost tr
        • I have been pondering this for a few weeks and really like this idea; it would almost eliminate the bandwidth cost of doing live streaming broadcasts. I started thinking of a torrent-based solution, but the torrent concept doesn't place priority on any piece of the file arriving before any other; with a broadcast stream you want the chunks arriving in sequence, or close to it. I think it would have to be designed from the ground up.

          Another big issue would be matching appropriate clients and peers based on th

          • a new client that connects to a peer that is almost finished with the stream
            What I had in mind was live broadcast like on the radio. When you turn it on, you don't get it from the start of the programme, you get what is broadcast right now.

            the broadcast is no longer live, but delayed considerably.
            There will be a delay, but keeping it below 10 seconds shouldn't be a problem. Such a delay would be unacceptable for telephone over the internet, but it is surely acceptable for internet radio.

            What about l
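The relay-tree scheme proposed a couple of comments up, in miniature: arrange the listeners in a k-ary tree so the broadcaster uploads each chunk only k times and every client forwards it to at most k children. The peer names are placeholders, and a real system would also need retransmission and handling for clients that join or leave mid-stream.

    def build_relay_tree(peers, fanout=3):
        """Return {parent: [children]} for a simple k-ary distribution tree."""
        tree = {p: [] for p in peers}
        for i, p in enumerate(peers[1:], start=1):
            parent = peers[(i - 1) // fanout]    # peers[0] is the broadcaster
            tree[parent].append(p)
        return tree

    peers = ["broadcaster"] + [f"client{i}" for i in range(1, 10)]
    print(build_relay_tree(peers, fanout=3))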
  • the heck? (Score:5, Insightful)

    by Frac ( 27516 ) on Monday October 06, 2003 @09:11AM (#7142927)
    How does using a scarce resource (bandwidth) to create an abundant resource (disk space) make any economic sense?

    Headline: How to turn gold into copper! News at 11.
    • How does using a scarce resource (bandwidth) to create an abundant resource (disk space) make any economic sense?

      Economics have nothing to do with it. At all.

      First and foremost, this is a cool hack. Second, it's an interesting way to hide information and/or make its recovery quite difficult, as well as to achieve some degree of plausible deniability.

    • How does using a scarce resource (bandwidth) to create an abundant resource (disk space) make any economic sense?

      Well, in a strict economic sense, how does using a scarce resource (diamonds) to create an abundant resource (money) make any economic sense?

      Ask the broker who sold Kobe Bryant the $4,000,000 diamond ring for Mrs. Bryant. The broker probably never spent more than $10,000 to get the ring to him, and probably less.
      And yes, in terms of abundance, money is a much larger pile than diamonds.
  • by kasperd ( 592156 ) on Monday October 06, 2003 @09:13AM (#7142935) Homepage Journal
    Or I could upgrade my internet connection to get the same amount of storage in other locations. In about three months the extra required capacity on the internet connection would have cost me as much as the hard disk. If I buy the hard disk now, in three months I will still have storage without having to keep paying. Besides, the hard disk is going to be more efficient and probably also more reliable.
  • From the article: Examples rely on sending a message that is known to result in partial or full echo of the original request, and include:

    DNS lookup responses and cache data. It is possible to store some information in a lookup request and have it bounced back with an NXDomain reply ...


    Not any longer!! (unless you're running the bind patch ...)
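The NXDOMAIN example quoted above, roughly sketched with dnspython (the library and the resolver address are assumptions of this sketch, not part of the paper): encode a few bytes into a label of a name that cannot exist, send the query, and read the bytes back out of the question section of the negative reply, which also tends to linger in resolver caches.

    import base64
    import dns.message, dns.query, dns.rcode      # pip install dnspython

    RESOLVER = "192.0.2.53"                       # placeholder resolver address

    def bounce(data: bytes) -> bytes:
        # a single DNS label is limited to 63 characters, so keep the chunk small
        label = base64.b32encode(data).decode().rstrip("=").lower()
        query = dns.message.make_query(label + ".does-not-exist.example.com.", "A")
        reply = dns.query.udp(query, RESOLVER, timeout=2)
        assert reply.rcode() == dns.rcode.NXDOMAIN
        # the name we asked about, payload and all, comes back in the question section
        echoed = reply.question[0].name.labels[0].decode()
        return base64.b32decode(echoed.upper() + "=" * (-len(echoed) % 8))

    print(bounce(b"tiny payload"))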
  • ...but I just love bringing up the fact that I'd thought up something similar, like, 4 years ago now, and every time my memory's jogged about it here on Slashdot, I like to post about it. Instead of hijacking existing services, however, I did create a new one, where basically packets were flung from one machine to another (or many) around and around again -- each machine held on to a number of packets, but didn't keep them for too long. All traffic was encrypted, and no machine knew 1) where a packet started or 2) where a packet ended up; it just kept flinging packets around.

    The main goal of the service was to create a nice, neat, encrypted, secure messaging system where neither the origin nor the destination of a particular message could be detected, such that even if a message was intercepted and decoded, you still didn't know where it came from or where it was going. (This was envisioned about 2 days after the early reports of Carnivore.)

    One of the nice side effects, however, is that you could use the service to basically store a message "on-the-wire" damn near indefinitely, broken apart into tiny packets, distributed more or less randomly to every other participating host, with those hosts having absolutely no clue what it was, who put it there, or who's going to retrieve it.

    The bandwidth usage was, in two words, potentially catastrophic. It could really hork a network. I mean, really, REALLY hork a network.

    It was kinda cool. God only knows where that paper is today, though -- I removed it from the web about 2 years back when the Justice Department was considering such papers, ideas, services, and devices as potential aids to terrorism, and fining/imprisoning the bright young minds who come up with such stuff. So, until either our government stops playing the "T" card, our citizens calm down, and/or we eradicate the likes of Hamas, Islamic Jihad, the IRA, the ELF, and many other like groups, I doubt I'll ever make it available again. *shrug*

  • Floating Data Storage. Now they'll HAVE to approve my request for a jacuzzi in the server room!
  • Oh! The irony! (Score:2, Insightful)

    by Inspector ( 38755 )
    Filthmaster writes "I just saw an interesting paper that has been posted to bugtraq, full-disclosure and vulnwatch. It deals with the principles of stealthily using network infrastructure as either short-term or long-term storage. Not sure if I'm ready to implement it, but it makes interesting food for thought."

    There's also a mirror up.

    Sneaky bastards!

  • End of the Internet predicted, and all that.

    I vote we use the paper's authors as practice bombs.
  • This is new? (Score:3, Insightful)

    by jimfrost ( 58153 ) * <jimf@frostbytes.com> on Monday October 06, 2003 @12:07PM (#7144338) Homepage
    This is just a rehashing of an April Fools' joke that went around on USENET some 15+ years ago. They were talking about using the UUCP transmission delay for archiving. I spent a few minutes trying to track down the original on deja.com, unsuccessfully, but trust me ... I remember it.

    It's also interesting that way back in the dawn of computing equipment they did use propagation delay as a way of doing storage. Mercury delay lines in particular. Not only that, the people that used them noticed that the tubes made noise and found ways to play tunes by saving the appropriate data. Google "mercury delay lines" and you'll find a few notes about the technology.

  • Scotty uses this in the TNG episode "Relics" to keep himself alive by being stuck in a transporter doing diagnostic loops for a good number of years (Hey, at least it was better than the plot hole for Kirk's TNG appearance in Generations!). I always find it amusing when I see sci-fi stuff start showing up in the real world.
  • I remember all sorts of tricks like that in college. Back in the day when many mail servers with UUCP would only call each other every few hours, one could store lots of data in them. Especially if your network's quota system let you exceed your quota for short periods of time. The best were slow mailers where you could use long bang-paths like somewherefar!somewherelsefar!sun!sono!mayer to explicitly target slow/big servers. We were inspired by an article in the 80s proposing similar with mirrors on t
  • This is absolutely true. You can actually use all the buffers and caches that the routers, switches and computers along the packet's path allocate to you to store data.
    About 4 years ago I did a test transferring approx 50 Mb of data, as an ICMP payload, taking a long route (satellite/transatlantic route, etc.).
    My main problem with it, even with the cache, translation and bouncing delay, was that I was getting the first packets back by the time I was sending packet #. (My PoC wasn't very efficient.)

    I still think it
  • I don't imagine that a Class A (TCP/IP) parasitic computing model would be particularly ideal for sensitive data, as the script/prog/implementation would be stored in RAM on the host running the exploit. If the RAM on this box were compromised, it would prove trivial to piece the data back together via the individual hosts.
  • What about those /tmp directories -- who needs those? We can store data there.
    Even disks that are full let you create empty files -- uuencode and store data in empty files as long filenames.
    Who uses all that memory these days? Store data in memory, keep a few redundant copies for silly "rebooting" incidents.
    Virtual memory is another place ripe for picking. What about screen memory -- no one uses the whole desktop
    Couldn't you send data to the speakers, and grab the data back from the microphone?
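For what it's worth, the "empty files with data in the filename" gag above does work; a sketch of the obvious approach is to base32-encode chunks into the names of zero-byte files. Filenames top out at roughly 255 bytes on common filesystems, hence the chunk size.

    import base64, os

    def stash_in_filenames(data: bytes, directory: str, chunk: int = 150) -> None:
        os.makedirs(directory, exist_ok=True)
        for i in range(0, len(data), chunk):
            name = f"{i:08d}_" + base64.b32encode(data[i:i + chunk]).decode()
            open(os.path.join(directory, name), "w").close()   # zero-byte file
        # the data now "occupies" no file contents, only directory entries

    def recover_from_filenames(directory: str) -> bytes:
        parts = sorted(os.listdir(directory))   # zero-padded offsets sort correctly
        return b"".join(base64.b32decode(p.split("_", 1)[1]) for p in parts)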
