
New Programming Language Weaves Security Into Code 216

Ponca City writes "Until now, computer security has been reactive. 'Our defenses improve only after they have been successfully penetrated,' says security expert Fred Schneider. But now Dr. Dobb's reports that researchers at Cornell are developing a programming platform called 'Fabric,' an extension to the Java language that builds security into a program as it is written. Fabric is designed to create secure systems for distributed computing, where many interconnected nodes — not all of them necessarily trustworthy — are involved, as in systems that move money around or maintain medical records. Everything in Fabric is an 'object' labeled with a set of policies on how and by whom data can be accessed and what operations can be performed on it. Even blocks of program code have built-in policies about when and where they can be run. The compiler enforces the security policies and will not allow the programmer to write insecure code (PDF). The initial release of Fabric is now available at the Cornell website."
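The labeled-object model the summary describes can be sketched in ordinary Python. This is an illustration of the idea only: Fabric extends Java and enforces its labels at compile time, and every name below (`Labeled`, `read`, `write`, the principals) is invented for this sketch, not Fabric's actual API.

```python
# Runtime sketch of policy-labeled objects, loosely modeling Fabric's
# idea of attaching access policies to every object. All names here are
# hypothetical; Fabric checks its labels statically in the compiler.

class Labeled:
    """A value wrapped with a policy saying who may read or write it."""
    def __init__(self, value, readers, writers):
        self._value = value
        self._readers = set(readers)   # principals allowed to read
        self._writers = set(writers)   # principals allowed to write

    def read(self, principal):
        if principal not in self._readers:
            raise PermissionError(f"{principal} may not read this object")
        return self._value

    def write(self, principal, value):
        if principal not in self._writers:
            raise PermissionError(f"{principal} may not write this object")
        self._value = value

# A bank balance that an auditor may inspect but only its owner may change.
balance = Labeled(100, readers={"alice", "auditor"}, writers={"alice"})
print(balance.read("auditor"))   # allowed by the label
try:
    balance.write("mallory", 0)  # denied by the label
except PermissionError as e:
    print(e)
```

The difference in Fabric is that such a violation is rejected by the compiler before the program ever runs, rather than raising an error at runtime.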
This discussion has been archived. No new comments can be posted.

  • Re:Tall statement (Score:3, Interesting)

    by owlstead ( 636356 ) on Monday October 25, 2010 @05:54PM (#34018328)

    Yes, and it does not prevent burglary either. If you mess up the transport and application protocols you are in trouble, but what has that to do with secure *programming*? Christ, I bet you can make programs with it that display your password in ten-foot-high numbers as well (given a large enough monitor).

  • Re:Yawn (Score:3, Interesting)

    by adisakp ( 705706 ) on Monday October 25, 2010 @05:59PM (#34018400) Journal

    Yet another programming language aimed at making it easier for novice programmers to do the right thing by protecting against the mistakes that novices have made in the past.

    The problem is that past mistakes are not a predictor of future mistakes.

    I know some less gifted programmers who code primarily by copy-and-paste. In that group, past mistakes are a very good predictor of future mistakes.

  • Re:Tall statement (Score:3, Interesting)

    by Alain Williams ( 2972 ) on Monday October 25, 2010 @06:34PM (#34018840) Homepage

    And then some old person will implement an email client in C using only the oldest and slimmest of libraries and everybody's heads will explode with shock at the speed of it.

    They already have, and I use it: it is called mutt.

  • by Anonymous Coward on Monday October 25, 2010 @06:42PM (#34018922)

    Unfortunately, you're 100% right. I just had a somewhat similar experience: a library for processing credit cards has an option to check that a card is valid before even trying to process it. Using my personal card as a test, it reported the card invalid. Take that check out and everything works perfectly, so I'm not sure what it is actually checking; according to the docs everything required was there. What should have been a useful feature just gave me a false negative, so I can't trust it.

  • Not that bad an idea (Score:2, Interesting)

    by HiThere ( 15173 ) on Monday October 25, 2010 @06:55PM (#34019044)

    They'd have needed to make unbounded strings the default string type, and to give it a better name; say, just "string". They'd have needed to make the heap easier to use. Garbage collection would need to be built in (optionally disableable) rather than optional and, in practice, never implemented.

    And they should start from the Spark subset of Ada.

    But Ada won't ever go anywhere, and wishing it would is futile. It's been consigned to the embedded-systems market, and that's not likely to change.

    Oh, and they should also define a built-in B+Tree data type. (Generic, so it could be adapted to the particular types you want to work with.)

    A problem is in the GUI. If you allow C callbacks you don't have a secure system. If you don't, you need to maintain your own GUI system. And it will never look like the other systems. Probably this would mean you need to partition the compilation into secure modules and non-secure modules. (Allowing you to call C libraries eases all kinds of problems...but it's death on a secure system. This means that you need to be able to partition things into parts that you can't really trust, and parts that you can.)

    It's a pity this is a waste of speculation, because it's not going to happen. Most people hear the term Ada and they think of scare stories, not realizing how much worse C++ is than Ada ever was in terms of complexity and difficulty to master.

  • by owlstead ( 636356 ) on Monday October 25, 2010 @06:56PM (#34019056)

    They partition it into several pieces so that you have modular access conditions. Java is already built in such a way that you cannot directly access the hardware; you can only run byte code. Of course, there may be bugs in native libraries or in the byte-code execution, but that is a rather small attack surface. Basically, that's what you always try to do: limit the exposure of security-relevant features. There will of course still be bugs, but they should be much more localized.

    Building this on C++ would not make sense, since you cannot have modular security if your application logic runs in a single memory space. The only thing you can do there is try to mitigate the fact that you *do* have access; examples are the no-execute bit, randomized memory layouts with "no-go areas", and static analysis of code.

    So sure, there may still be holes. But I can at least be sure that a bug in the TCP socket library is not exposed to the part of the code that verifies user input, or to badly written code in library X.

  • Snakeoil (Score:5, Interesting)

    by A beautiful mind ( 821714 ) on Monday October 25, 2010 @08:11PM (#34019688)
    The language is either not Turing complete and then mostly useless for practical general computing, or it is Turing complete and then it provides no real security.

    It might avoid some classes of problems, but it will never free programmers from having to clarify their intentions. Security is an abstraction-level-free problem: it can be an issue at the x86_64 instruction-set level just as much as at the level of high-level contractual/social agreements that code has to handle.

    As Bruce Schneier said long ago: Security is not a product; it's a process.

    Security is also a tradeoff between a system being secure and being usable: you can make things more secure by allowing a system to do less. I'm not saying that this new programming language is useless, but it all comes down to a careful description of the language. If the creators advocate it as a secure programming language that makes code written in it secure by default, then they are almost certainly wrong and will quickly become a laughingstock. On the other hand, if they market it as a language that avoids, or makes it impossible to commit, certain classes of security problems, as a language whose core code gets scrutiny for security issues, and as a language that makes it clear that security is a mindset, then I see it being useful.
  • Re:Snakeoil (Score:3, Interesting)

    by Just Some Guy ( 3352 ) on Tuesday October 26, 2010 @09:42AM (#34024006) Homepage Journal

    The language is either not Turing complete and then mostly useless for practical general computing, or it is Turing complete and then it provides no real security.

    It doesn't have to be all-or-nothing, though. You could make a similar argument about the usefulness of ACLs or Unix permissions or virtual memory: "these won't fix everything!" And yet all of those do make our systems more secure, and we certainly wouldn't want to get rid of them just because they're not perfect.

    Speaking of ACLs - you could bolt something like that onto an existing language pretty easily. Imagine a "hardened Python" (that's what she said!) that runs in a Java-like sandbox with no access to the outside world by default, and where you had to decorate functions with the list of permissions they need:

    @needs("network")
    def dosomething(): ...

    @needs()
    def expensivecalculation(): ...

    Obviously that wouldn't solve every security problem, because you'll inevitably get idiot-savant programmers who figure out how to let web clients pass in SQL strings and then execute them. It'd go a long way toward eradicating a lot of the more common coding-with-insufficient-coffee errors, though, and I think that would be pretty valuable even if it's not perfect.
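    The decorator idea in this comment can be sketched as runnable Python. Everything here is invented for illustration (the `needs` decorator, the `GRANTED` set, the permission strings); a real sandbox would enforce this below the language level rather than in pure Python, where nothing stops code from bypassing the check.

```python
import functools

# Hypothetical permission decorator: each function declares what it
# needs, and calls are checked against what the host has granted.
GRANTED = {"network"}  # permissions granted to this module (illustrative)

def needs(*perms):
    """Declare required permissions; refuse the call if any is missing."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            missing = set(perms) - GRANTED
            if missing:
                raise PermissionError(
                    f"{fn.__name__} lacks permission(s): {sorted(missing)}")
            return fn(*args, **kwargs)
        return inner
    return wrap

@needs("network")
def dosomething():
    return "sent request"

@needs()  # pure computation: requires nothing
def expensivecalculation():
    return 2 ** 20

@needs("filesystem")
def write_cache():
    return "cached"

print(dosomething())           # allowed: "network" is granted
print(expensivecalculation())  # allowed: no permissions required
try:
    write_cache()              # denied: "filesystem" was never granted
except PermissionError as e:
    print(e)
```

    The default-deny shape is the point: a function that declares nothing can touch nothing, so forgetting a declaration fails closed rather than open.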
