
Deserialization Issues Also Affect Ruby -- Not Just Java, PHP, and .NET (zdnet.com) 62

An anonymous reader writes: The Ruby programming language is affected by a deserialization issue similar to the one that wreaked havoc in the Java ecosystem in 2016, and that later proved to be a problem for .NET and PHP applications as well. Researchers published proof-of-concept code this week showing how to exploit serialization/deserialization operations supported by built-in features of the Ruby programming language itself.

"Versions 2.0 to 2.5 are affected," researchers said. "There is a lot of opportunity for future work including having the technique cover Ruby versions 1.8 and 1.9 as well as covering instances where the Ruby process is invoked with the command line argument --disable-all," the elttam team added. "Alternate Ruby implementations such as JRuby and Rubinius could also be investigated."

The deserialization issues can be used for remote code execution and for taking over vulnerable servers. While .NET and PHP have been affected, it is Java that has faced the biggest deserialization problems until now; earlier this year, Oracle announced it was dropping serialization support from the Java language's standard package.
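
For context, here is a minimal sketch of the mechanism in question, assuming Marshal, Ruby's built-in serializer, and a made-up Session class: Marshal.load reconstructs whatever object graph the byte stream describes, which is why feeding it untrusted input is dangerous.

    # Hypothetical example: round-tripping an object through Ruby's built-in Marshal.
    Session = Struct.new(:user, :admin)

    blob = Marshal.dump(Session.new("alice", false))   # serialize to a byte string
    restored = Marshal.load(blob)                      # safe only because we produced the blob ourselves
    puts restored.user                                 # => "alice"

    # For untrusted data, prefer a data-only format plus explicit validation:
    require "json"
    params = JSON.parse('{"user":"alice","admin":false}')  # plain Hash/Array/String values only

The published proof of concept reportedly goes further by chaining together classes that ship with Ruby itself, but the risky primitive is the same: a Marshal.load call on attacker-controlled bytes.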

  • by Anonymous Coward

    XML was supposed to fix all this!

    And we all know, XML is like violence: if it doesn't fix the problem you're not using enough of it.

    • by gweihir ( 88907 )

      If XML had been designed by people who actually understand security, it _could_ have fixed the issue. But all the security-ignorant masses of developers ever want is features, features, and more features. They would probably have rejected an actually secure XML.

        • The problem has nothing to do with XML. It is a problem with serialization. Even if you use JSON or YAML or whatever, there are still security issues.
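
          A minimal Ruby sketch of that point, using YAML (Psych) with a made-up class tag; the format is text, yet it is the loader that decides whether arbitrary objects get built:

            require "yaml"

            untrusted = "--- !ruby/object:SomeAppClass {}"   # hypothetical tagged payload

            YAML.safe_load(untrusted)   # raises Psych::DisallowedClass: only plain scalars/arrays/hashes come back
            # YAML.load(untrusted)      # on Ruby 2.x (Psych 3) this would try to instantiate SomeAppClass
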
  • Serialisation and deserialisation happen when developers get lazy and/or the original architects of the system designed a shitty app model. Or none at all. You see this nice and clearly in PHP CMSes such as Expression Engine or WordPress.

    It goes like this:
    Check out the model, see a bunch of crap, and think: "Oh I know, I'll just serialize my stuff and dump it into a single field." Newer stuff in WP is full of this, and it doesn't help that it's tacked onto a baaad application model with some really shitty DBAL mechanisms that quickly grow to two-digit numbers of SQL statements executed per API call, and an ERD designed on crack.

    The truth of the matter is: If you don't take total control of your data every step of the way you are bound to be screwed when an exploit like this crops up. Simply serializing is the exact opposite of taking control. And taking control is basically impossible if you don't know how to design your app or its DB.

    Whenever I see serialized data lying around in persistence, I know that someone further up didn't do his job.

    My 2 eurocents.

    • by gweihir ( 88907 )

      I don't think it is laziness. I think it is plain incompetence. These people cannot really code and hence they need all these features to get anything to work at all. Of course, they also do not know how things actually work under the hood and they have zero understanding of security, nicely leading to the mess we can observe almost everywhere.

      But then, if you get competent coders, you may actually have to pay them like the qualified engineers they are. "Management" cannot have that. A coder may actually ea

      • by sphealey ( 2855 )

        Most of what I have seen is (1) lack of understanding of what databases are, what they were designed to do, and what they are capable of (2) fear of using databases based on a shared cultural non-understanding stemming from 1 (3) broken workaround after broken workaround to allow the architect and developer... not to use databases.

        • Most of what I have seen is (1) lack of understanding of what databases are, what they were designed to do, and what they are capable of (2) fear of using databases based on a shared cultural non-understanding stemming from 1 (3) broken workaround after broken workaround to allow the architect and developer... not to use databases.

          That's because databases are heavy lifting for light problems. Avoidance is rational.

      • have zero understanding of security
        Unless the programmer of the deserialization code planted an easter egg, aka a trojan, into it, there is no security issue at all!

        WTF ... what is next? An SQL select from a database is a security issue?

    • by Luthair ( 847766 ) on Sunday November 11, 2018 @03:06PM (#57626708)

      I've been attempting to write this without sounding like a jerk... but serialization simply means translating what's in memory into a format that can be stored. Even the scenario you're complaining about isn't necessarily "bad"; it sounds like they're using it as an alternative to disk storage, and as long as they aren't running queries on the contents of the field, that isn't a problem.

      The issue with Java (and presumably Ruby, though I don't care enough about Ruby to check) is that it turned out to be possible to craft serialized objects where simply deserializing them would cause code execution. In the case of Java, most development had long since switched to other formats instead of native binary serialization before the vulnerabilities were discovered, but since there are a ton of legacy applications and frameworks, people still had problems.
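
      A rough Ruby sketch of why "simply deserializing" can run code, with a made-up CacheEntry class: Marshal invokes per-class load hooks while it rebuilds an object, and gadget chains string together hooks like this in classes an application already has loaded.

        class CacheEntry
          attr_reader :value

          def marshal_dump
            @value
          end

          def marshal_load(data)   # invoked automatically by Marshal.load
            @value = data          # a real gadget abuses hooks that do far more than assign a field
          end
        end

        blob = Marshal.dump(CacheEntry.new.tap { |e| e.marshal_load("cached") })
        Marshal.load(blob).value   # => "cached" -- the hook ran without anyone calling it explicitly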

      • If you think native binary serialization is the problem, then you are ignorant (either that, or you expressed yourself in a way I didn't understand). It doesn't matter whether you use binary, XML, JSON, or YAML; these vulnerabilities are going to be there. It's not just Java either; it's basically any language that does deserialization automatically on untrusted data. The reason is simple: untrusted data needs to be sanitized and treated as dangerous every step of the way. Sanitizing can't be done in an auto
      • is that it turned out to be possible to craft serialized objects where simply deserializing them would cause code execution.
        Actually, it does not, unless the writer of the relevant classes deliberately put some special "deserialize()" methods into the classes on the server. Exploiting this from the outside is completely impossible unless a programmer deliberately put in a back door.

    • nonsense

      moving a program from one place to another can face identical issues.

      the path is secure, or it is not secure.

      serialization irrelevant.

    • wrong

      serialization of an object is very useful.

      the path it moves on must be secure. if it is not secure, then you can have the same issues moving information or code by other means.

      serialization not the problem if the path is compromised.

    • Using serialization (properly, to a non-binary format) makes things more resilient to changes. You can roll your own, but then again, that seems more likely to be prone to errors. NIH syndrome is a bitch.
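
      A small Ruby sketch of the resilience point (the field names are invented): a data-only format tolerates schema drift, whereas native Marshal output is tied to the exact class layout that produced it.

        require "json"

        v1 = { "name" => "alice" }.to_json   # written by an older version of the app
        record = JSON.parse(v1)              # a newer version still reads it...
        record["role"] ||= "user"            # ...and can backfill fields it added later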

    • This has nothing to do with serialization, which is just as much serialization when written by hand. This has something to do with trusting inputs. One should not trust input that the user can control. If one only saves internal records to disk, automated serialization is welcome.

      And in general, it would be good if programmers stopped hating processes that automate code. The whole point of coding, and of programming, is automating things. If you insist on doing things by hand, why not try your hands at doing con

    • Whenever I see serialized data lying around in persistence, I know that someone further up didn't do his job.
      You mean inside of a database?

      There is nothing wrong with serialization. Use it when it is appropriate; don't use it when it is not.

      My 2 eurocents.

      Luckily worth a bit more than 2 US cents :D but your opinion: nope.

  • It boils down to (Score:5, Insightful)

    by jabberw0k ( 62554 ) on Sunday November 11, 2018 @02:20PM (#57626428) Homepage Journal
    • Data read from a file or network is only as secure as that file or network.
    • There are reasons why the Unix Way abolished binary data files wherever possible in favor of plain text files; read The Art of UNIX Programming by Eric S. Raymond.
    • Binary formats can be perfectly safe; it just takes a competent person to write the processors for them. They are not more difficult to write than those for a text-based format, so it's not as if text-based formats are safer. They can also save some CPU cycles: with length-field demarcation rather than delimiter characters, you don't have to inspect every single character.

      • Agreed. The simplest binary formats are very simple to store and retrieve: if you think about it, you just create a struct with non-pointer/reference types and write your fields. Serialisation is a write of the memory to a file; deserialisation is a read. All in all, it'll be 10 lines of blindingly obvious code without any chance of programming errors, buffer overruns, etc.

        However, and this is a big however, it's completely inflexible, and has a number of massive downsides. It's going to be endian specific,
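
        For what it's worth, a tiny Ruby sketch of the fixed-layout idea discussed above, with the byte order pinned explicitly (the record format here is invented):

          # A 32-bit id, a 32-bit score, then a length-prefixed name, all little-endian.
          record = [42, 7, "alice".bytesize, "alice"].pack("l<l<l<a*")

          id, score, len = record.unpack("l<l<l<")
          name = record.byteslice(12, len)   # the three integers occupy the first 12 bytes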

"What man has done, man can aspire to do." -- Jerry Pournelle, about space flight

Working...