Security

A Critical Look at Trusted Computing

mod12 writes "After just attending a two-week summer program on the theoretical foundations of security (one of the speakers was from Microsoft research), I have been interested in trying to find out if the "trusted computing" initiative was still alive. I got my answer today in the New York Times from an article that was fortunately rather critical of the concept."
  • non DRM computers? (Score:5, Insightful)

    by I Want GNU! ( 556631 ) on Monday June 30, 2003 @10:47PM (#6336321) Homepage
    Does anyone know of companies planning on building processors without DRM? In a competitive marketplace there would be no DRM, because consumers don't demand it and would surely prefer computers that aren't controlled by the vendor after the sale. But with only two major PC processor manufacturers holding a duopoly over the market, it isn't very competitive.
  • Weasel wording (Score:4, Insightful)

    by Atario ( 673917 ) on Monday June 30, 2003 @10:49PM (#6336338) Homepage
    "We think this is a huge innovation story," said Mario Juarez, Microsoft's group product manager for the company's security business unit. "This is just an extension of the way the current version of Windows has provided innovation for players up and down the broad landscape of computing."
    And that "way" would be: to the highest bidder.
  • by vegetablespork ( 575101 ) <vegetablespork@gmail.com> on Monday June 30, 2003 @10:50PM (#6336342) Homepage
    You'll be able to get a non-DRM'd computer. It'll be made illegal as a "circumvention device" in short order if it actually turns out to be useful for any sort of multimedia applications.

    I recommend not tossing systems when you upgrade--pre-ban PCs should be worth a tidy sum soon.

  • But by entwining PC software and data in an impenetrable layer of encryption

    COME ON! please, why do they make such claims?! or why do journalists make such claims? i think the establishment/private companies/whatever have been proved wrong on that issue over and over and OVER again. if there's someone who actually thinks their data is totally secure these days . . .

    another point: this initiative could be very dangerous. buying OS's with this crap already on them, limiting what you can do... so, what, should we stock up on Win2000, XP, and Linux OS's along with our CD and DVD burners?

    DRM may stop the morons, but soon enough, once a few "l33ts" circumvent it and it gets released into the wild, what's the point?

  • who do you trust (Score:5, Insightful)

    by ecalkin ( 468811 ) on Monday June 30, 2003 @10:55PM (#6336369)
    we all deal with 'trusted computing' to some extent or other. in any computer system there is a person/persons/entity that is trusted. in the simplest form it is supervisor/admin/etc. as you design a network you describe who is trusted.

    when you get a commercial digital certificate you are expressing trust.

    in a well designed (large) system you would build in multiple trusts to act as a check and balance. sort of an auditing feature. novell is real big on this.

    i find it interesting that the ms model of trust is pretty much putting all your eggs in what is mostly their basket. no auditing, no accountability, etc.

    i suspect that we will see more distributed trust as companies and isps become more involved in this.

    eric
  • trusted computing? (Score:4, Insightful)

    by jeffy124 ( 453342 ) on Monday June 30, 2003 @10:59PM (#6336390) Homepage Journal
    definition depends on who you ask.

    it originally meant protecting user keys via a secured tcpa chip (not drm). then microsoft started their trustworthy campaign and included palladium's announcement and that somehow changed the definition to include drm. so please, keep that in mind. palladium and tcpa are not the same thing.
  • by Animats ( 122034 ) on Monday June 30, 2003 @11:02PM (#6336414) Homepage
    This crap is all about DRM. It's not about real protection hardware, like support for rings or virtual machines or capabilities or channelized I/O or secure interprocess communication.

    If the Wintel crowd were serious about security, they'd push for a hardware architecture that supports secure microkernels really well and put a very partitioned OS on top of it. But no; it's all about boot-time lock-in.

  • Positive sides (Score:5, Insightful)

    by DreadSpoon ( 653424 ) on Monday June 30, 2003 @11:03PM (#6336418) Journal
    I just wish people would remember all the _good_ parts of trusted computing. So far as the TCPA goes, DRM isn't even a part of it. It's just a standard hardware interface for encryption and key storage. Whether that's used to sign OS's, implement DRM, or simply secure Apache is up to the OS. Yes, it _can_ be used for all that. But hell, a BIOS _now_ can be set to only boot an OS with a certain fingerprint - how the technology is used is independent of the technology itself. TCPA is a (possibly) good thing. Palladium/DRM, that's the real evil (from the consumer and OSS viewpoints, anyways).
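
    To make that division of labour concrete, here is a rough C sketch of the "key storage plus crypto operations" idea. The tcpa_* names and the toy math are invented for this sketch only (the real TCPA specification is different); the point is just that the secret stays on the chip, and what the signatures are used for is decided by the software above it.

        /* Sketch only: the tcpa_* names are invented stand-ins, not the real
         * TCPA/TPM interface.  The stubs let the example compile; on real
         * hardware the key would be generated and held inside the chip and
         * never exported to the host. */
        #include <stdint.h>
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        typedef struct { uint32_t handle; uint8_t secret[32]; } tcpa_key;

        /* Stand-in for "generate a keypair on-chip": returns an opaque handle
         * plus a secret that, on real hardware, would never leave the chip. */
        static tcpa_key tcpa_create_keypair(void)
        {
            tcpa_key k = { .handle = 1, .secret = {0} };
            for (size_t i = 0; i < sizeof k.secret; i++)
                k.secret[i] = (uint8_t)rand();      /* toy stand-in for a real key */
            return k;
        }

        /* Stand-in for "sign this blob with the on-chip key": only the signature
         * comes back out; callers never touch the secret directly. */
        static void tcpa_sign(const tcpa_key *k, const uint8_t *msg, size_t len,
                              uint8_t sig[32])
        {
            for (size_t i = 0; i < 32; i++)         /* toy MAC, not real crypto */
                sig[i] = k->secret[i] ^ (len ? msg[i % len] : 0);
        }

        int main(void)
        {
            const char *blob = "boot image, DRM license, or Apache TLS nonce";
            uint8_t sig[32];

            tcpa_key k = tcpa_create_keypair();
            tcpa_sign(&k, (const uint8_t *)blob, strlen(blob), sig);

            /* Whether the signature backs trusted boot, DRM, or plain old
             * web-server security is a policy choice made by the OS and the
             * applications, not by the key-storage hardware. */
            printf("signed %zu bytes with a key the host never read\n", strlen(blob));
            return 0;
        }

    The same hardware primitive supports every use listed above; which of them you actually get is a software and policy decision.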
  • "Industry leaders" (Score:4, Insightful)

    by ScuzzyTerminator ( 683387 ) on Monday June 30, 2003 @11:04PM (#6336422)
    Industry leaders also contend that none of this will stifle innovation.

    What the Industry Leaders mean is that the Industry Leaders will not be stifled. The rest of the industry should just not worry their little heads. It will all be done for us by those who know best.

  • by diabolus_in_america ( 159981 ) on Monday June 30, 2003 @11:05PM (#6336427) Journal
    The biggest argument made against Lindows was that people who bought the system would be turned off once they got it home and realized it wouldn't let them do what they expected. In this case, running MS Office, games, etc. As a result, Lindows has since abandoned much of their early claims about MS-compatibility.

    What happens when someone gets one of these new Trusted systems home and realizes that they can't use it as expected? What happens when it doesn't let them burn audio CDs or play previously burned songs on CD-R/Ws? What happens when they have trouble just opening word processing or spreadsheet files, because they are not considered "trusted"? Even email could become a problem.

    I see this whole "Trusted" initiative by Microsoft as a potential boon to open source software developers and even "white box" computer manufacturers.

    Word will get out: "Don't buy any of the new Hewlett-Packards with that new Windows. They just don't work!" Microsoft has already turned many corporations against them with the new License 6.0 scheme. "Trusted" computing could turn many home users against Microsoft and all of the hardware manufacturers who have thrown their lot in with them.
  • by UltraSkuzzi ( 682384 ) on Monday June 30, 2003 @11:06PM (#6336433) Homepage
    Large corporations have historically always gotten what they wanted, unless of course the government stepped in. I'm no longer so concerned about whether this technology will be implemented. I am now concerned about HOW the computing community will deal with it. Gates has already said he doesn't plan on deploying trusted computing technologies immediately. Why wouldn't he want to deploy a technology that can supposedly stop all forms of piracy? People will not buy computers that do not do what they ask. MS will wait until their TC-enabled OS is prevalent on most PCs, and then send a signal from Redmond enabling it. There will be no way out. People will have to learn to live with it. After all, they paid hundreds of dollars for their PC, right? You can't stop progress, but you can try.

    UltraSkuzzi

    The inherent vice of capitalism is the unequal sharing of the blessings. The inherent blessing of socialism is the equal sharing of misery. -- Winston Churchill
  • by thepacketmaster ( 574632 ) on Monday June 30, 2003 @11:08PM (#6336440) Homepage Journal
    I believe "Reliable and Secure" computing is what people want for home computers. The term "Trusted" computing is usually saved for military computers, etc, that are following the Rainbow books' criteria. Also for systems trying to get a Common Criteria rating. "Trusted" computing includes two-man controls, the kind that prevent one person from launching a bunch of nuclear missiles. The NYtimes version of trusted computing means computers that the RIAA and MPAA can trust not to let you download their stuff. It might even include letting the RIAA and MPAA destroy your computer if you do (based on what some senators want to pass as law)
  • by Delphiki ( 646425 ) on Monday June 30, 2003 @11:08PM (#6336443)
    I'm not so cynical about Apple. There's a big difference between the iTunes music store and what Microsoft and Disney want to happen. The iTunes music store lets you use those files on up to three computers and allows you to burn them as many times as you want and put them on your iPod, etc. Considering you can stream songs, listen to them as many times as you want on more than one computer, I see this as pretty reasonable. I imagine Apple had to really go to bat with the record companies to get that much too.

    Apple is typically better to their customers, because they have to be. Microsoft has shown a lack of respect for their customers fairly consistently and gets away with it because people don't see much alternative at the moment. Also, Apple is embracing the open source community, though perhaps not to the degree that some would like (though I think it's a good balance of open and closed source). Their ties to the open source community, I would think, make them more likely to refuse to implement TCPA.

    The problem Apple is going to face, though, is whether Apple users will be able to open TCPA-encrypted documents. Apple, along with Linux, the BSDs, and every other non-Microsoft platform, needs to oppose this so that Microsoft can't lock alternative platforms' users out of all documents created through Microsoft apps.

  • by Anonymous Coward on Monday June 30, 2003 @11:09PM (#6336452)
    The National Security Agency's "security-enhanced" Linux is an attempt to make Linux into a "trusted" computing platform, but that has NOTHING to do with DRM and other MPAA- and RIAA-borne stupidity.

    Security researchers are putting a lot of effort into defining trust relationships and developing guidelines for applying the term "trusted" to software. Has the software design been verified? How about the code? Who verified the design and audited the code? Have there been security problems in the past? Is the concept fundamentally compatible with security?

    Then along come the MPAA and RIAA, and they convince Microsoft (among others) to start talking about a totally fucking DIFFERENT definition of "trusted". Whereas the OLD definition of "trusted" involved concepts like integrity, secrecy, reliability, and auditability, the NEW meaning of "trusted" is essentially "crippled".

    As somebody who studies security for a living, it irritates me to see the two concepts confused. Microsoft's DRM-enabled operating systems will NOT include the features I've outlined above, and a highly "trusted" operating system could very well include software that allows you to "rip, mix, and burn" just as people are accustomed to doing today.

    Really, just who is "trusting" the DRM operating systems? Not the users-- I imagine there will be just as many viruses and exploits and bugs as before. Not software developers-- Microsoft hasn't really announced any plans to do things like, say, encrypt the swap space or integrate stack protection into their linkers, loaders, and compilers.

    In fact, the only people who are really trusting the DRM operating systems are the content industry associations. Which makes sense, as Microsoft and company are essentially doing the whole "trusted computing" thing at the behest of the MPAA's congressional whore [senate.gov].

    Please, folks, let's call a spade a spade: the DRM-enabled operating systems are NOT "trusted". They're "content-industry-friendly". They're "crippled". They're a lot of things, but they're not "trusted".

    Let's start asking for some precision of language, here.
  • by malia8888 ( 646496 ) on Monday June 30, 2003 @11:24PM (#6336523)
    There is nothing in trusted computing to benefit the consumer. I am hoping the word will get out to the average consumer in time for them to rebel by keeping their $$$'s to themselves.

    The very things that computer users want to be protected from--viruses and the tons of spam messages--are not addressed with these "improvements".

    As eloquently outlined in the Times article: In the new encrypted computing world, even the most mundane word-processing document or e-mail message would be accompanied by a software security guard controlling who can view it, where it can be sent and even when it will be erased. Also, the secure PC is specifically intended to protect digital movies and music from online piracy. But while beneficial to the entertainment industry and corporate operations, the new systems will not necessarily be immune to computer viruses or unwanted spam e-mail messages, the two most severe irritants to PC users. "Microsoft's use of the term `trusted computing' is a great piece of doublespeak," said Dan Sokol, a computer engineer based in San Jose, Calif., who was one of the original members of the Homebrew Computing Club, the pioneering PC group. "What they're really saying is, `We don't trust you, the user of this computer.' "

    In "trusted computing" the public gets no security; the FAT entertainment industry gets fatter; and the common man is unduly scrutinized.

    Let's hope our everyday "Joe Consumer" rebels. If Intel comes out with a chip with this trusted-Big-Brother component, I hope the American consumer leaves it rotting on the shelves.

    Money talks, b.s. walks. If the public refuses to buy this garbage which is hyped to protect them, perhaps the companies will look at this trusted computing issue again and drop it in the trash can where it belongs.

  • Re:Uh huh.. (Score:2, Insightful)

    by Anonymous Coward on Monday June 30, 2003 @11:30PM (#6336551)
    We break it on an old Athlon or Pentium IV and release the cracked/decrypted version on Freenet.

    The system used will always be breakable unless they can find a way to rid us of non-compliant technology, and the technology in my house will always be non-compliant.
  • by fermion ( 181285 ) on Monday June 30, 2003 @11:32PM (#6336562) Homepage Journal
    No one seriously believes that MS can create a secure OS. What can happen is that MS, along with laws that will make circumvention activities illegal, will create enough of a facade of security that people will trade certain current freedoms for safety and convenience. It always happens. People want convenience and simplicity.

    OTOH it looks like this stuff will only affect Intel and MS products. Personally, I have always used Apple products myself. It has protected me from MS viral licenses. It has protected me from Intel's occasional desire to track all users. It is now protecting me from silly DRM schemes that do nothing but protect antique business models. Apple has done more for security by allowing the user to turn off HTML in mail.app than MS could possibly hope to do in a decade.

    The same could be said for GNU/Linux and other non-MS users. For these users there are only three concerns. First, laws could be passed to require certain attributes in entire classes of software. For example, as the article suggests, all email and music might have to be signed with a CPU generated hash. Of course all advanced users know that such technology could be circumvented, and, even with laws against circumvention, such actions will routinely occur.

    Second, the makers of Intel clone chips might, and probably will, succumb to pressure and include security features. This would be bad because right now OSS is very tied to Intel class chips. The solution to this is to build open hardware platforms around non-Intel class chips, and create OSS projects that run on such platforms. Intel may be a slave to MS, but AMD and others might be more scared of lost sales due to OSS moving to Motorola and IBM chipsets. In five years if OSS is still tied to the Intel instruction set, and Intel is only making chips that spy on the user, there will be no one to blame.

    The third issue comes from a quote in the article:
    the system will also require a new generation of computer hardware, not only replacing the computer logic board but also peripherals like mice, keyboards and video cards
    From this we can infer that MS intends to push DRM to all hardware connected to the CPU, which, of course, is the logical course of action. The issue is as above. OSS runs mostly on what is essentially MS hardware. If all MS hardware requires software that is cryptographically signed and externally validated, probably by an MS-related service, one wonders if OSS will exist. If OSS does exist, one wonders if it would have any purpose if the user were still ultimately tied to MS licenses and security schemes.

    This has always been the danger of the single environment ecosystem. The OSS people seem to forget how inherently dependent on MS whims they are. One wonders if some diversification might be in order.

  • by Animats ( 122034 ) on Monday June 30, 2003 @11:35PM (#6336581) Homepage
    Where they say:
    • The TCPA chip itself has three main groups of functions:
      • public key functions
      • trusted boot functions
      • initialization and management functions

    That's stuff you need to support DRM and crypto. None of the real security features I listed are in there. It won't prevent your Windows machine from being taken over by every worm and virus that comes along. It might prevent some attacks that steal your credit card number, but that's about it. Even that protection would probably work only if you'd signed up for Microsoft Passport or something similar.
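
    As a concrete illustration of what the "trusted boot functions" group amounts to, here is a rough C sketch of the measure-and-extend idea: each boot stage is hashed into a register before it runs, so later software (or a remote party) can tell which boot chain actually executed. The register layout and the toy hash are invented for this sketch; the real TCPA design uses SHA-1 and a set of platform configuration registers.

        /* Illustration of the "trusted boot" measurement idea only.  The toy hash
         * and the single register below are invented; real TCPA hardware uses
         * SHA-1 and several platform configuration registers (PCRs). */
        #include <stdint.h>
        #include <stdio.h>

        static uint32_t pcr;   /* stand-in for one platform configuration register */

        /* Toy hash to keep the sketch small -- not real crypto. */
        static uint32_t toy_hash(uint32_t seed, const char *data)
        {
            uint32_t h = seed;
            while (*data)
                h = h * 31u + (uint8_t)*data++;
            return h;
        }

        /* "Extend": fold a new measurement into the register.  The register can
         * only be extended, never set directly, and order matters -- that is what
         * lets a verifier distinguish one boot chain from another. */
        static void pcr_extend(const char *stage_image)
        {
            pcr = toy_hash(pcr, stage_image);
            printf("measured %-12s -> pcr = %08x\n", stage_image, (unsigned)pcr);
        }

        int main(void)
        {
            pcr_extend("BIOS");
            pcr_extend("bootloader");
            pcr_extend("kernel");

            /* A party comparing this final value against an expected one learns
             * which software actually booted.  Whether that is used to protect
             * the owner or to refuse service on "unapproved" software is decided
             * above the hardware, in software and policy. */
            printf("final measurement: %08x\n", (unsigned)pcr);
            return 0;
        }

    Nothing in that primitive stops worms or stolen credit card numbers; it only answers "what code booted?", which is exactly the property DRM and remote attestation need.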

  • by SmurfButcher Bob ( 313810 ) on Monday June 30, 2003 @11:41PM (#6336606) Journal
    ...what you think.

    Face it, the software market is pretty much saturated from their perspective, and there isn't much room for growth on the desktop compared with previous years.

    What MS discovered, about two years ago, was that they could sell a completely different product. What MS discovered was Radio.

    Radio doesn't make money by playing songs. Radio makes money by selling its listeners. Now re-think the Trusted Platform from that perspective, and its purpose becomes completely obvious.
  • by Reziac ( 43301 ) on Monday June 30, 2003 @11:59PM (#6336721) Homepage Journal
    And that's what it's going to take -- a backlash at the level of Corporate Suit, and to a lesser extent Joe User (who has far less financial clout). When the CEO of some major corp discovers that he can't do what he's *used to doing* with email due to DRM enforced by the machine, there will be very loud hell to pay.

    Unfortunately, that's liable to come too late for most of the market, especially for the tiny fraction comprised of us geeks. Once DRM-in-hardware gets entrenched and Average Joe gets used to it, it'll be damned hard to displace. :(

  • by Graabein ( 96715 ) on Tuesday July 01, 2003 @12:02AM (#6336734) Journal
    Didn't I read right here on /. that the Chinese have started to develop and test their own CPU? Yup: The Dragon Chip [slashdot.org]. They've already got Linux booting on it.

    With most of the world's electronics manufacturing business in China anyway, I guess this means we'll all be running Linux on Chinese developed and manufactured hardware in a few years, while Microsoft, Intel and AMD all sit around in the wreckage of their once profitable empires wondering what went wrong.

    Here's a hint guys: You forgot what made the PC platform great in the first place: Freedom.

    Call it freedom to innovate, freedom to fsck up a computer beyond repair, freedom to write a virus or freedom to swap files. Whatever. But try taking our freedom away and you will face the consequences.

    Now that would be a delicious irony, wouldn't it? America and the West taking away the freedom of all computer users, and the Chinese coming to the rescue and restoring our freedom.

  • Amen (Score:2, Insightful)

    by Anonymous Coward on Tuesday July 01, 2003 @12:36AM (#6336904)
    "Here's a hint guys: You forgot what made the PC platform great in the first place: Freedom."

    You're right.

    I was there for the beginnings of the PC. We built them and bought them, even when they couldn't do much because we believed in the dream of freedom and computing and saying "fuck you" to big companies with their vision of how we should use their computers.

    Now, two generations later, we seem bound and determined to give it all away, just so we can watch "Star Wars" on our PC. And pay every time. And throw people into jail if they refuse.

    It's very upsetting to those of us who started the revolution.
  • Re:Positive sides (Score:5, Insightful)

    by firewrought ( 36952 ) on Tuesday July 01, 2003 @01:41AM (#6337183)
    I just wish people would remember all the _good_ parts of trusted computing.

    TCPA is going to be bad for more reasons than just Palladium... it's going to be a major headache for IT departments trying to cope with software that is actively unfriendly. Why? It's about visibility. When an IT department needs to replace a legacy app, write bridge code to shuffle data between two different software systems, or make revisions to a relic in-house app, the amount of visibility determines how quickly and cheaply the change can be accomplished.

    Visible things include: good documentation, available source code, standard protocols, open data formats, strongly defined interfaces, generous/lax security, unencrypted traffic, non-regulated/classified data, informative error messages, enthusiastic vendor support, open bug databases, and software-oriented community forums (yay Google Groups!).

    Invisible things include: missing/shoddy/incomplete documentation, overly-flexible products, binary network protocols and file formats, marketing-centric websites [heh... just try to find technical info about Crystal Reports [crystaldecisions.com]], "friendly" error messages, abandoned development platforms, and (getting to the point)... stuff that's too locked down.

    DRM and trusted computing will add yet another layer of flaky security that prevents casual intrusion while seriously hindering IT. Businesses will be tantalized by the idea that they can precisely control how a memo gets distributed, archived, and destroyed. They will ooh and aah at being able to enforce their "email retention policies" through the use of TCPA. But this will come with some heavy costs... of which visibility is one of the major ones. I can see it now:

    • Client: "Here's that email you needed to hook up system A to system B, but I can't send it to you. It says it's protected. I tried taking a screenshot, but it came out all black. I can't seem to print it out either. We could probably call Ginger and find out who could give the authorization to transfer this, but she's not here today. How about I just read it to you over the phone?" [Stupid DRMish Feature]
    • Product Expert: "Oh yeah... to import text records into RiskModeller3000, you have to create an executable and pay the vendor a wad of cash to sign it. Only then will RiskModeller be willing to execute your binary and munch in the text it produces." [Stupid Licensing Scheme]
    • Packaging Expert: "To transfer this program from our testing environment to the production environment, you'll need to recompile the binary and sign it with this 'production certificate'... hope your build environment hasn't shifted around much or you'll blow the integrity of all that 'final release testing' your clients just spent four weeks on." [Stupid Security Requirement]
    Visibility affects the agility of business and the cost of IT. It's not just an abstract good... it provides lubrication for business IT and reduces real cost. A company with a lot of visibility will be more agile and flexible than one without it. And, in the final analysis, a society with visibility will generate more wealth than one that gets too tangled up in an artificial form of security. TCPA is basically bad, because--while it could have good uses--it will ultimately reduce visibility and harm society.

    It's not just about pirating MP3's... it's about the creation of real wealth and new technologies.

  • by Anal Surprise ( 178723 ) on Tuesday July 01, 2003 @01:44AM (#6337195)
    There's a reason for the outrage.

    The "Oh, the consumer can switch it off" line is utter and complete fucking bullshit.

    Yes, you can turn off DRM. Yes, Zion can shut down the machines in the basement. What happens then? Applications that used to work stop, asking you politely to "Please enable DRM" and offering to tell you how. More polite dialog boxes pop up: "You need to be running DRM to use this application" or "This feature requires DRM support (where available)".

    You're given the choice between owning your own computer and being owned. Think this is paranoid fantasy? Try turning off cookies and javascript on your average user's machine. They'd be completely fucked, with a big cloud of "turn cookies on" sites that simply do not work. Compliance or Else: that is the promise of DRM.

  • by Igmuth ( 146229 ) on Tuesday July 01, 2003 @01:46AM (#6337202)
    Well, if it is incorruptible so that it is immune to a virus, then it can't be patched or upgraded. If there is a method for patching, there is (technically) a method for viruses to enter.

    In other words: you can't have a door and guarantee that only one person can enter said door.

    (Ya I realize that wasn't exactly what you were saying...)
  • by ebyrob ( 165903 ) on Tuesday July 01, 2003 @02:00AM (#6337257)
    From the article:

    Bill Gates, Microsoft's chairman, told a technology conference in Washington on Wednesday. "This technology can make our country more secure and prevent the nightmare vision of George Orwell at the same time."

    Yes Bill that's right. You can usher in the technology that may bring about Orwell's vision and at the same time help it slide through by simply claiming the exact opposite from the other side of your mouth.

    Dyuh... It's somehow related to the truth, perhaps that means I should believe it.
  • Re:Doublethink (Score:2, Insightful)

    by Kenard ( 540102 ) on Tuesday July 01, 2003 @04:44AM (#6337690)
    Trusted computing means Big Brother is no longer watching; he knows you can't do shit.
  • Absolutely agree! (Score:3, Insightful)

    by spitzak ( 4019 ) on Tuesday July 01, 2003 @02:25PM (#6341397) Homepage
    The idea that this has *anything* to do with what most people call "computer security" is rubbish.

    To counter your point, modern versions of Windows do use the CPU protections to stop programs from doing anything they want. They cannot randomly jump into the kernel or change it.

    However this reinforces your point:

    1. The CPU protections are hardware protections that stop "bad" programs (outside the kernel) from messing with "good" ones (inside the kernel).

    2. This hardware protection is absolutely bulletproof, far more reliable than the more complex Palladium. As far as I know there are no viruses that rely on a bug in the microcode to turn off the CPU's hardware protection state.

    3. It is obvious that despite this demonstrably perfect hardware protection, Windows systems (and Linux ones) are not immune to viruses.

    The reason it fails is that such hardware protection does not stop bugs. Every single virus and attack relies on getting some piece of software that somebody "trusted" to do something it was not expected to do. The fact that the software is "trusted by Palladium" and by 1024-bit one-way encryption does ZERO to make it less likely that it will do something unexpected.

    In fact Palladium may make it worse, by encouraging far more stuff to be "trusted" (just as one current security problem is that there is too much in the kernel). Claiming Palladium is a "micro" kernel is rubbish: the current CPU hardware protection is probably a few hundred transistors in a tiny dot buried inside the processor chip, which is more micro than anything Microsoft is dreaming up, and it has already been proven to do squat for protecting your machine.

    The other bad effect of Palladium is that it may make it impossible to fix the problems, especially if it prevents unsigned filters from being installed between the network and executables.

    Palladium is 100% designed for DRM, and that is 100% of its purpose. On current machines a virus writer can probably get Outlook to do all kinds of nasty things, but most involve email; they cannot get it to decrypt and play a DVD. Right now you can play a DVD by running another program. Palladium will not allow that program to run, so the only possible way to play a DVD would be the equivalent of fooling Outlook into doing it, and Microsoft and the RIAA know that is impossible.

    Some Palladium defenders keep pointing out that the chip will provide hardware encryption calculations. It has to, so that trusted stuff can be decrypted without anybody being able to access the secret decryption key, which makes it trivial to add a little extra access to hardware that is already there. Considering this is the same industry that thinks it is a good idea to have the actual waveforms produced by modems and speakers generated in real time by the processor rather than add a $5 chip to the machine to do it, any suggestion that they are adding this expensive chip for any benevolent reason should be dismissed immediately.

  • Nuh Uh! (Score:2, Insightful)

    by some old guy ( 674482 ) on Tuesday July 01, 2003 @03:14PM (#6341985)
    Without a DRM-compliant public key, you won't even be able to log on to your ISP. No Usenet, no BBS, no telnet, no nuthin'.
  • by The Bungi ( 221687 ) <thebungi@gmail.com> on Tuesday July 01, 2003 @03:50PM (#6342334) Homepage
    Yes, let's extrapolate stupid company policies designed to keep stupid users from hurting themselves into what the world of computing will look like in a few years.

    Your rant is understandable to a certain extent - I've had to get around proxy restrictions on some client sites to read my corporate email. But that's how it is. Their network, their pipe, their computers, their money, their rules. Work at home or go into landscaping if you don't like that sort of thing. Further, your post implies that, since this is a "pure Windows shop," your company's policies are somehow dictated by the evil Microsoft borg. Tell you what - get the domain administrator's password, or your own box's, and override the policy settings. What? You don't have the password? Well, I'm sure there's a reason for that.

    Just don't whine and make assumptions about how "this is teh sux and it gets worse and it's all m$ fault". Thanks.

  • by mr_e_cat ( 611996 ) on Wednesday July 02, 2003 @01:18PM (#6351109)
    Of course people will just be able to record the analog output anyway. Then the RIAA can bring back the pointless "home taping is killing music" campaign from the '80s. In those days every kid on the block had tens to hundreds of home-taped albums.

    The RIAA really should just face the fact that there is nothing they can do. Most people wouldn't have paid for the music they download for free. Those who pirate music are usually high school and college students who have time and no money. Most people who work can't be bothered with the hassle of pirating music when they can just buy it.
  • by Diomidis Spinellis ( 661697 ) on Wednesday July 02, 2003 @02:37PM (#6351787) Homepage
    In my recent column in the Communications of the ACM (Inside risks: Reflections on trusting trust revisited [dmst.aueb.gr] 46(6):112, June 2003) I describe two parallels: twenty years ago Ken Thompson showed us that one cannot trust an application's security policy by examining its source code if the platform's compiler (and presumably also its execution environment) is not trusted. The recent 007 Xbox attack [slashdot.org] demonstrated that one cannot trust a platform's security policy if the applications running on it cannot be trusted. (The Xbox is a specialized trusted computing platform.) The moral of the Xbox attack is that implementing, on a trusted computing platform, a robust DRM scheme, mandatory access control, or an even more sinister security policy involving outright censorship will not be easy. It is not enough to certify the hardware and have a secure operating system; even a single carelessly written but certified application can be enough to undermine a system's security policy. As an example, a media player could be tricked into saving encrypted content in an unprotected format by exploiting a buffer overflow in its (unrelated) GUI customization (skin) code.
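
    To make that last example concrete, here is a deliberately broken C sketch of the failure mode (the player and the bug are hypothetical, not taken from any real product): the DRM path may be perfectly certified, but an unchecked copy in the unrelated skin-loading code is enough to let hostile input run code inside the "trusted" process and dump decrypted content.

        /* Hypothetical, deliberately flawed example -- not from any real player.
         * Assume the DRM/playback path is certified; the unchecked strcpy() in
         * the skin parser is the unrelated bug that undermines it. */
        #include <stdio.h>
        #include <string.h>

        static void play_protected_content(void)
        {
            /* Imagine this decrypts and renders DRM'd media inside the certified
             * process, with access to the plaintext. */
            puts("playing decrypted content...");
        }

        static void load_skin(const char *name_from_skin_file)
        {
            char name[32];
            /* BUG: no length check.  A "skin name" longer than 31 bytes in a
             * downloaded skin file overflows name[] and can overwrite the return
             * address, redirecting execution inside the certified player -- at
             * which point the attacker, not the DRM policy, decides what happens
             * to the decrypted content. */
            strcpy(name, name_from_skin_file);
            printf("loaded skin: %s\n", name);
        }

        int main(void)
        {
            load_skin("default");        /* benign input: behaves as intended     */
            play_protected_content();    /* hostile skin input would instead run
                                            attacker code with the player's rights */
            return 0;
        }

    Certifying the hardware and the playback path does nothing about this class of bug; the whole application sits inside the trust boundary.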

    Diomidis Spinellis
    Code Reading: The Open Source Perspective [spinellis.gr]
    #include "/dev/tty"
