Oracle Calls Java Serialization 'A Horrible Mistake', Plans to Dump It (infoworld.com)
An anonymous reader quotes InfoWorld:
Oracle plans to drop from Java its serialization feature that has been a thorn in the side when it comes to security. Also known as Java object serialization, the feature is used for encoding objects into streams of bytes... Removing serialization is a long-term goal and is part of Project Amber, which is focused on productivity-oriented Java language features, says Mark Reinhold, chief architect of the Java platform group at Oracle.
To replace the current serialization technology, a small serialization framework would be placed in the platform once records, the Java version of data classes, are supported. The framework could support a graph of records, and developers could plug in a serialization engine of their choice, supporting formats such as JSON or XML, enabling serialization of records in a safe way. But Reinhold cannot yet say which release of Java will have the records capability. Serialization was a "horrible mistake" made in 1997, Reinhold says. He estimates that at least a third -- maybe even half -- of Java vulnerabilities have involved serialization. Serialization overall is brittle but holds the appeal of being easy to use in simple use cases, Reinhold says.
Was very obvious back then (Score:5, Insightful)
But the Java fanatics just put in more and more features, regardless of whether sane languages had them or not.
Re: (Score:3, Informative)
But the Java fanatics just put in more and more features, regardless of whether sane languages had them or not.
Obvious?
Well, given the abstraction from actual hardware that is Java's goal, how would you create a way to pass data from machine to machine without worrying about things like word size and endianness?
Got any objective reasons? Because what you've posted is just an opinion. And just like that other thing everyone else has, frankly it stinks.
Re:Was very obvious back then (Score:5, Funny)
Re: Was very obvious back then (Score:5, Insightful)
The disadvantage with XML is that it creates a lot of overhead, which can be a problem in embedded applications and large-scale solutions.
Don't you just use JSON (Score:2)
Re: (Score:2)
JSON doesn't entirely solve the serialization/deserialization problem; it just uses a text format instead of a binary one. It appeared after Java had created its serialization mechanism, and even though it's better, it is not the complete answer. It also adds a footprint penalty (bad for embedded) and a performance penalty (bad for large-scale solutions and embedded). So even though JSON at least ensures that a numeric value isn't suddenly a character value, it's only coming half-way and I expect th
Re: (Score:3)
The disadvantage with XML is that it creates a lot of overhead
They're already using Java, obviously they're not concerned about overhead.
Re: Was very obvious back then (Score:4, Informative)
Re: (Score:2)
XML is strictly hierarchical. Objects are cyclical graphs. I know you can squeeze one into the other, but it's an ugly mess.
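A minimal sketch of such a cycle (class names hypothetical). Java's binary serialization copes by writing back-references into the stream; a naive tree-shaped XML mapping would recurse forever unless it adds explicit IDs:

    import java.io.Serializable;
    import java.util.ArrayList;
    import java.util.List;

    class Department implements Serializable {
        String name;
        List<Employee> staff = new ArrayList<>();  // parent -> children
    }

    class Employee implements Serializable {
        String name;
        Department dept;  // child -> parent: closes the cycle
    }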
Re:Was very obvious back then (Score:4, Informative)
It was solved even earlier with XDR (R.I.P. Dr Bruce Nelson..)
Re: Was very obvious back then (Score:2)
That link you posted is pretty much exactly the same as the readObject and writeObject methods in Java serialisation.
Re: (Score:3)
The link is about structs, not about objects.
So when you deserialize them, you have no vtable.
And if you had read (and comprehended) your link, you would have realized that the author points out all the problems with serialization. He does not really propose portable solutions.
So: what exactly was your point?
Re: (Score:3)
Oh please. This wasn't a failure of their implementation; it's an issue with the concept, which is still a good thing because the positives outweigh the negatives.
It just sucks, though, going through 100+ projects to add JAXB to their pom files to prepare for the Java 11 LTS that's coming in September.
Re: (Score:2)
The good is minuscule, the bad is massive. And that was obvious back then. We joked "Now Java even supports malicious mobile code!" when the feature was announced and wondered how this could ever be secured. Of course, many people thought it was great because they did not get it. Just as many people do today.
Re: (Score:3)
No. The problem is with the concept. Persistent objects are dangerous.
Insanity.
I like files. They are objects that are persistent.
Just because people are too weak to sanitize inputs doesn't mean that storing bytes persistently is a bad idea.
Re: (Score:3)
The issue is that, in practice, 99% of the time the programmer intends this feature (and features like it in other languages) to be used for persisting "boring old data" in the laziest way possible. Having the data be evaluated as executable instructions is just a huge liability.
The 1% of the time when the programmer explicitly does use the "data can have code-to-eval" capability, it has, in my experience, always been done better another way (such code is generally a pain to debu
Re: (Score:2)
I like files. They are objects that are persistent.
I like files, too, but they are not persistent objects. Files are persistent data.
Re: (Score:2)
This has nothing to do with "Java fanatics".
Java has serialization because it has RMI. Java has RMI because it was designed by Sun in the early 90s, when everyone except Sun had already realised that sunrpc was a bad idea. The network is the computer, right?
Re: (Score:2)
Ada is a design catastrophe for other reasons. That is why it is a niche language and will remain so. It is barely usable.
Re: (Score:3)
Ada is a subset of an HDL language used to design the CPU you are running your web browser on right now.
Just because software engineers find it hard, it doesn't stop hardware engineers managing just fine with it.
Re: (Score:2)
Considering how many bugs exist in CPUs this statement doesn’t inspire a lot of confidence. Doubly so when taking into account Meltdown and Spectre.
Re: (Score:2)
Meltdown and Spectre are not the results of bugs, but of design flaws.
If you tell me to build a house 10m x 10m and when I'm finished you realize you actually wanted a 12m x 10m then this is not a bug: you made a mistake in planning.
Re: Was very obvious back then (Score:3)
The number of bugs in CPUs is an order of magnitude less than in most software. It has to be, because recalling a million CPUs is economically unfeasible. "Recalling" a million software installs (via auto-update), OTOH, is so commonplace as to be unremarkable.
Re: (Score:3)
Ada is a subset of an HDL language used to design the CPU you are running your web browser on right now.
Both VHDL and Verilog are like the mafia. Hardware designers don't do business with those languages because they want to.
Re: (Score:2)
Ada is a subset of an HDL language used to design the CPU you are running your web browser on right now.
Both VHDL and Verilog are like the mafia. Hardware designers don't do business with those languages because they want to.
Designing logic for chips is on the order of 95% designing and 5% coding. We can use better front ends like myhdl, but it doesn't save much.
Re: (Score:2)
Yep. Ada was a construct created by the US Dept of Defense in the late 1970s, ostensibly for programming embedded computer systems -- a task for which it was monumentally unsuited, because the embedded computer hardware of the time had low clock speeds and very limited memory, and Ada demands a lot of resources.
The DoD then decided that a single computer language -- Ada -- should be used across all its applications. The problem was that no one was able to create an Ada ecosystem -- compiler, libraries, etc
Re: (Score:3)
Ada did not take off because, when it came into the industry, compilers were absurdly expensive and every "Ada vendor" wanted your leg and your firstborn.
Besides that, Ada is a nice language, very well designed. I would love to program in Ada, but because of the idiots who made it prohibitively expensive, most Ada projects switched to C++.
It is barely usable.
If you cannot use Ada effectively, you likely cannot use any other programming language either.
Re: (Score:3)
Well, just another aspect of why so many coders are so bad: They cannot recognize whether a tool is good or bad.
Re: (Score:2)
And the only way out of that is gathering experience and continuous learning and improvement. Obviously that only works if people want to improve or are forced to improve.
Re: (Score:3)
You think older mistakes should not be corrected?
Records? Is that a thing? (Score:2)
Re: (Score:3)
Cobol anyone?
I thought I was going to old-school school people by mentioning QBasic's "type" structures, but you punked me with Cobol.
But then again, not even Python does this well if you need a structure with specific data types to match a binary stream you need to read or write.
Re: (Score:2)
Re: (Score:2)
A record is basically what in C you would call a struct. The reason Java desperately needs them is that there is currently no way to efficiently store an "array of structs". Yes, you could use a structure-of-arrays (SoA) layout instead, but that isn't what you want some of the time.
The inability to control memory layout more finely is the main thing that people trying to write high-performance Java complain about. This will help, at least a bit.
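A hedged sketch, assuming the record syntax that Project Amber later shipped (Java 16). Note that a record alone still gives you an array of references; the flat "array of structs" layout hoped for above needs the follow-on value-types work (Project Valhalla):

    // Compact nominal tuple: final fields plus a generated
    // constructor, accessors, equals/hashCode/toString.
    record Point(int x, int y) { }

    public class RecordDemo {
        public static void main(String[] args) {
            // Still an array of *pointers*: each Point is a separate
            // heap object, so iteration chases references.
            Point[] pts = new Point[1_000_000];
            for (int i = 0; i < pts.length; i++) pts[i] = new Point(i, -i);
            long sum = 0;
            for (Point p : pts) sum += p.x();  // record accessor method
            System.out.println(sum);
        }
    }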
Re: (Score:2)
I agree, although to be fair, anything that makes Java less object-oriented is a clear improvement.
Re: (Score:2)
Efficiency isn't a goal or use case of Java? Well maybe it damned well should be then.
Re: (Score:2)
No, it isn't. If you want efficiency, write in a language that targets bare metal. Especially if you're talking about binary layouts, which depend on a dozen factors like padding and endianness which are supposed to be abstracted away in a language like Java.
Re: (Score:3)
This isn't about having bit-perfect layout, it's just about having a way to build an array of structs that doesn't require each individual struct to be individually allocated.
Re: (Score:2)
Efficiency is actually a goal of every programming language on the planet. Some more than others. There is no reason why a language must be either pure bare metal, or else completely oblivious to performance. One other example I can think of is Apple's Swift language which didn't opt for garbage collection, but rather reference counting. Maybe that's a good trade off for writing apps for phones that don't have much memory.
Re: (Score:3)
Better than Joy Division!!
It will return your data in a new order.
Object serialization is dangerous. (Score:3, Insightful)
Regardless of language, object serialization is a dangerous idea. While it may seem like a nice idea at first, loading objects from unverified mutable data is an invitation for someone to tinker with that data. The situation only gets worse when your object structure changes because now your object data is invalid or incomplete.
Much like goto, I'm not arguing that it's not useful but rather that its use is inherently dangerous.
Re: (Score:2)
> Much like goto
Or worse, eval().
Re:Object serialization is dangerous. (Score:4, Interesting)
Fortran had (has?) the computed goto. Not goto pointerVar, but goto intVar, where intVar contains _LINE_NUMBER_.
I've seen it used. Integer NextIter. Then you use the middle bits of that Int as binary option flags. At least that's what you do if you have an applied math PhD and a case of cranial rectosis...
On point: You're not supposed to deserialize from untrusted sources, in any language. Might as well execute SQL right from a web form.
Re: Object serialization is dangerous. (Score:5, Funny)
Thank Go for weaning me off ruby's eval().
That's because Google's motto is "Do no eval".
Ruby eval (Score:2)
Why would you ever use eval in Ruby?
Re:Object serialization is dangerous. (Score:5, Interesting)
Regardless of language, object serialization is a dangerous idea. While it may seem like a nice idea at first, loading objects from unverified mutable data is an invitation for someone to tinker with that data.
Okay then, smartypants, what do you propose for persisting fields of an object? Anything you propose is, by definition, "serialisation". The only alternative to serialisation is non-persistent objects.
(TBH, I kinda like the thought of signed serialised blobs)
Re: (Score:2)
Put it in the cloud, of course!
Re: (Score:3, Insightful)
Regardless of language, object serialization is a dangerous idea.
Okay then, smartypants, what do you propose for persisting fields of an object?
I was speaking specifically about object serialization. There's nothing wrong with data serialization but using it for object serialization is asking for trouble. If you don't understand the difference then you should excuse yourself.
Re: (Score:2)
Gravis Zero: Oh I guess all those guys making object databases are damned fools and you know better than them. We're not worthy, we're not worthy.
Re: (Score:2)
If you have to ask that, you don't belong in this conversation. It's a well-defined term.
Re: (Score:2)
If you have to ask that, you don't belong in this conversation. It's a well-defined term.
OP clearly stated that the problem with object serialisation was reading in fields that could be tampered with. I'm genuinely curious about what alternatives to persistence there are that overcome this "problem".
I think the problem is that you and the OP are clearly newbies. That "problem" you think exists with object serialisation exists with all data serialisation. If you weren't newbies you'd know that.
Re: (Score:2)
OP clearly stated that the problem with object serialisation was reading in fields that could be tampered with. I'm genuinely curious about what alternatives to persistence there are that overcome this "problem".
When data fields are filled in an object, they are validated by the methods that set them. However, object serialization by its very nature bypasses those setters, which makes it possible to restore object state that would otherwise have been rejected by the method setting it.
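A minimal sketch of that bypass, using a hypothetical Account class; the point is that nothing on the read path ever calls the validating setter:

    import java.io.*;

    class Account implements Serializable {
        private static final long serialVersionUID = 1L;
        private int balance;

        public void setBalance(int b) {               // the only "front door"
            if (b < 0) throw new IllegalArgumentException("negative balance");
            this.balance = b;
        }
        public int getBalance() { return balance; }
    }

    public class BypassDemo {
        public static void main(String[] args) throws Exception {
            Account a = new Account();
            a.setBalance(100);
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(bos);
            oos.writeObject(a);
            oos.flush();
            // Deserialization sets 'balance' reflectively; setBalance()
            // never runs, so a tampered stream could restore a value
            // the setter would have rejected.
            Account b = (Account) new ObjectInputStream(
                    new ByteArrayInputStream(bos.toByteArray())).readObject();
            System.out.println(b.getBalance());  // 100, no setter involved
        }
    }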
Like AuMatar wrote, you don't belong in this conversation.
Re: (Score:2)
Oh I can answer it. But all it will do is cause him to ask another dozen questions about it, because he's ignorant. Kind of like this response to you.
Re: (Score:2)
So you're an elitist wad of shit. Perhaps you belong here, but it would still be better if someone would shoot you in the face.
Re: (Score:2)
How do you propose to persist an object...?
You don't, which is the point.
So each time a program starts up it must prompt the user to manually enter all those values it had when it shut down? Frankly that sounds like a stupid solution.
Re: (Score:2)
You store off program state as data not as a serialized binary object.
Re: (Score:2)
You store off program state as data not as a serialized binary object.
How does that avoid the problem OP has with object serialisation (data being tampered with while persisted)?
Re: (Score:2)
Not unless you think everything is an object, which Java (to its credit) never believed.
Re: (Score:2)
OK, help me out here. If you save an object (or any data) to a file, as long as you validate the data when you open the file and load it... what's the problem?
Because we have "persistent objects", they're called files.
signed serialized blobs
We call those CRC bits, or checksums. Usually there's one per record and/or one for the whole file / stream / whatnot.
Also, those round things are called "wheels"; there's really no need to re-invent them. And PLEASE don't try patenting them; that has the potential for a big headache fo
Re: (Score:2)
Re: (Score:2)
No, you don't. But you allow a hacker to modify the persisted bytes and thus make the production code load objects that have a state they should never, ever have, breaking their invariants, and possibly making them call constructors of classes that they should never call.
Re: (Score:2)
Serialization is quite important, however. My preference is that it contain some "signing bytes" to identify what it is, including version number, and a checksum. This still doesn't protect against hostile action, of course, but is more for detecting that you can handle the version and you know what it is you're deserializing. It might also identify the word-length and byte order, to make it more portable, but in my typical use case having it match my native processor is more important than portability.
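A sketch of that framing idea; the magic number and layout here are invented, and note that a CRC only catches corruption - tamper-resistance would need an HMAC or signature:

    import java.io.*;
    import java.util.zip.CRC32;

    public class Framer {
        static final int MAGIC = 0xCAFED00D;  // hypothetical "signing bytes"
        static final short VERSION = 1;

        // Wrap a payload as: magic | version | length | payload | crc32.
        static byte[] frame(byte[] payload) throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(bos);
            out.writeInt(MAGIC);
            out.writeShort(VERSION);
            out.writeInt(payload.length);
            out.write(payload);
            CRC32 crc = new CRC32();
            crc.update(payload);
            out.writeLong(crc.getValue());
            out.flush();
            return bos.toByteArray();
        }
    }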
Re: (Score:2)
I hope I never have to use your code. You're an idiot.
That is nonsense ... (Score:4, Insightful)
Why would serialization be a security risk?
Huh? Can't ... you write to a disk or to a socket and that's it.
Sure, I'm nitpicking, because deserialization might be a security risk.
However, only if you actually do it and, e.g., leave open paths by which bad files can end up on your disk (which you then read), or open a socket and accept incoming serialized objects.
A typical Java program is absolutely not vulnerable to anything regarding serialization unless the programmer (intentionally?) made it so.
Articles about this (and basically every post here in the story as I type this) are simply wrong.
Java serialization was once one of its strongest points of success. Many GUI builders let you edit "beans" and simply serialize the GUI as a graph of objects that simply gets read in again when the application starts; you then call the setVisible(true) method to show your window.
Not needing to write any boilerplate code for writing and reading objects is a huge time saver and simplification.
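The appeal in a nutshell: one call persists an entire object graph, with no per-class I/O code. Window and Button here are hypothetical stand-ins for the beans a GUI builder might emit:

    import java.io.*;
    import java.util.*;

    class Button implements Serializable { String label = "OK"; }

    class Window implements Serializable {
        String title = "Main";
        List<Button> children = new ArrayList<>(List.of(new Button()));
    }

    public class SaveGui {
        public static void main(String[] args) throws Exception {
            try (ObjectOutputStream out = new ObjectOutputStream(
                    new FileOutputStream("gui.ser"))) {
                out.writeObject(new Window());  // whole graph, one line
            }
        }
    }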
Re: (Score:2)
Serialization of data only is not a risk. Serialization of objects that can contain references to other objects is the problem, because an attacker can tamper with the raw binary object so that it can still be deserialized but now has different contents and will run differently on the other end, in a manner not expected or possibly controlled.
Basically, serializing anything that can be acted upon is dangerous. It is like sending you a package and then immediately telling you to pick the first thing out of th
Re: (Score:2)
You could make the same argument about using Hibernate and constructing objects from a relational database. You could use the same argument about basically any program that does IO. It's a damned stupid argument. Whether and how much validation you need to do on your IO is application specific. If your application needs to validate the entire state of the object graph before doing anything with it, then you need to do that. It's not a problem with the concept of deserialisation.
Re: (Score:2)
An attacker can tamper with the raw binary object so that it can still be deserialized but now has different contents and will run differently on the other end, in a manner not expected or possibly controlled.
Yeah, and he can use an SQL statement to change a row in the database ... or a Perl script to change a line in a text file ... what exactly is the difference?
And it has nothing to do with graphs anyway. It can be a single object, consisting only of primitive types.
Hint: the problem is code,
Re: (Score:3)
Deserialization is a risk, it's a risk in every language.
Are Java coders just letting users browse for objects in the file system? Accepting objects in web form inputs or unvalidated webmethod parameters? Turning around and running those objects as root?
The problem isn't Java per se...it's coders who only know one language. They really do need a language that 'bubble wraps' the OS. But at some point, they have to get things done.
Re: (Score:2)
Of course it is. If you were reading stuff off magnetic tape or even sodding punch cards it was still possible that it had been fucked around with.
As the nuns said: if you don't know where it's been, don't stick it in your mouth.
I don't get it (Score:2)
OK, I don't get it. Serialization is just saving the field values to a file and then reading them back. If you just read a file without any validation, and you don't know whether it has been tampered with, then of course you can have security issues. But this applies to any file format or data anywhere; Java serialization is not a unique case. Serialization is an easy way to load values without going through an intermediary format. Replacing it with JSON or XML doesn't change the issue one bit.
Re:I don't get it (Score:5, Informative)
Java is unique insofar as, when you use the built-in serialization, you can also end up serializing the class files.
There are two interfaces that make a Java class serializable: Serializable (a marker interface) and Externalizable.
In the case of the first one, the Java framework/VM uses reflection to serialize and deserialize objects.
In the case of the second one, you are required to implement the methods writeExternal() and readExternal().
As the class files are in the serialized data stream, a program reading "untrusted" serialized data might also load classes, i.e. code, from that stream. If that code implements Externalizable and thus has an "unknown, foreign" method readExternal(), the deserialization framework will call that unknown/untrusted readExternal(), which means you run code coming from outside, which can do whatever it wants besides reading the object from the object stream.
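A hedged sketch of the two mechanisms (classes hypothetical). The reflective path moves fields for you; the Externalizable path runs *your* readExternal() during deserialization, which is exactly the hook described above:

    import java.io.*;

    class Plain implements Serializable {      // fields handled reflectively
        int x;
    }

    class Custom implements Externalizable {
        int x;
        public Custom() { }                    // Externalizable needs a
                                               // public no-arg constructor
        @Override
        public void writeExternal(ObjectOutput out) throws IOException {
            out.writeInt(x);
        }
        @Override
        public void readExternal(ObjectInput in) throws IOException {
            x = in.readInt();                  // arbitrary code can run here
        }
    }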
Re: (Score:2)
If you're reading the code from an untrusted stream as well, then yes, of course you will have security issues. But that is a completely separate matter. You don't ever load code from an untrusted source.
I don't see anywhere in the Java specification that code is also read in when deserializing; can you point me to that spec?
Re: (Score:2)
You are right; normal serialization to files does not include the code. Only via RMI is the code included (or requested by the recipient) when the recipient does not have the classes on the classpath.
Re: (Score:2)
Not just "field values" but executable code!
Re: (Score:2)
Errrm, I thought Java only serialised objects that implement Serializable, and if you don't implement it, the object isn't serialised. Not that I find that a great design, but technically, yes, you can mark what is serialised. And you can also override readObject and writeObject to customise it further.
The reason why it is dangerous (Score:5, Interesting)
1. No validation. You might have a nicely designed object, well tested, with all sorts of validation checks to ensure that the internal state is never broken. Java object serialization bypasses all validation, permitting an attacker to construct a malformed object. Exactly how that would cause a problem requires a bit more work on the attacker's part, by studying how the application reacts to the malformed object. Adding validation is supported with Java serialization, but it's not used by default; the designers favored simplicity over safety. Does switching to JSON magically fix the validation problem? Nope.
2. Loading of classes that you didn't expect to load. If I expect to receive a serialized list of strings, there's nothing to prevent an attacker from providing a list of any other kind of object instead, due to type erasure. The application might fail to process the list because of a ClassCastException, but the potential damage is done. Java serialization /does/ support filtering out classes that aren't expected, but this is off by default; you need to define the blacklist yourself. Why is loading other classes a problem? See the next reasons:
3. Custom code during deserialization, which is actually necessary for performing your own validation. You can define your own code which runs when the object is deserialized, and the code can do pretty much anything. An attacker might be able to trick the code (using malformed input) into doing something harmful.
4. Additional classes on the classpath. Even if all of your code is well behaved, and has proper validation checks, and proper custom code, you're still vulnerable because additional classes exist that you're not aware of. You had no idea that there's this class 'Q' which has broken custom code, because Q was sucked in as a dependency of something else. That popular open source library you're using might be exposing your application to attack, and you didn't even know it.
For anyone designing an object serialization mechanism, always consider the tradeoffs when trying to make the system easier to use. Always use whitelists for trusted code instead of blacklists. Always construct objects using the object's public API. Favor the use of standard representations (maps, lists, tuples) instead of supporting full-blown customization. A little bit of friction can be a good thing.
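On the whitelist point: Java 9 later added a serialization filter (JEP 290) that can express exactly that. A sketch, with hypothetical package names in the filter pattern:

    import java.io.*;

    public class SafeRead {
        static Object read(InputStream raw)
                throws IOException, ClassNotFoundException {
            ObjectInputFilter allowList = ObjectInputFilter.Config.createFilter(
                    "com.example.dto.*;java.util.*;java.lang.*;maxdepth=10;!*");
            try (ObjectInputStream in = new ObjectInputStream(raw)) {
                in.setObjectInputFilter(allowList);  // reject anything unlisted
                return in.readObject();
            }
        }
    }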
Re: (Score:2, Insightful)
To answer your points with the obvious:
1) Use the validation supported by Java, just like you would with XML, JSON, etc. Problem solved. Serialization isn't the issue here; the app dev is. The app dev can be lazy with XML or any other serialization classes too.
2) Same as point 1. The facility is there; use it. "Off by default" isn't an excuse for it being "bad".
3) Unit tests, and write proper code. Again, this problem is no different from any other XML/JSON serialisation mechanism.
4) You have the same issue with any XM
Re: (Score:2)
Unit tests will never catch the cases you didn't think of. Attackers are going to exploit the cases you didn't think of. Ergo, unit tests cannot ensure security.
Re:The reason why it is dangerous (Score:5, Insightful)
If I'm following you correctly, the problem isn't serialization per se but rather the fact that the deserialization is being done by the Java runtime (which has no way to validate the resulting objects against the application's requirements, since its deserialization code is application-independent, and also has the power to instantiate any kind of object, even those that are totally irrelevant to the task at hand), rather than by the application itself.
A user-supplied deserialization routine, OTOH, has at least a chance of being secure in the face of invalid source data, since it can check to make sure that its constraints are correctly satisfied and reject the data if they aren't.
Of course, avoiding making every application developer write his own application-specific serialization/deserialization routines was largely the point of this Java feature, but in hindsight it appears that was a bad decision.
Re: (Score:3)
Of course, avoiding making every application developer write his own application-specific serialization/deserialization routines was largely the point of this Java feature, but in hindsight it appears that was a bad decision.
And this decision is just further evidence of Oracle's incompetence. Instead of keeping it but requiring every application developer to write his own object verifier, they're simply removing it because doing it right is hard.
Re:The reason why it is dangerous (Score:5, Informative)
If I'm following you correctly, the problem isn't serialization per se but rather the fact that the deserialization is being done by the Java runtime (which has no way to validate the resulting objects against the application's requirements, since its deserialization code is application-independent, and also has the power to instantiate any kind of object, even those that are totally irrelevant to the task at hand), rather than by the application itself.
Java deserialization is magic. By which I mean it behaves in several ways that user code pretty much can't.
The default system effectively loads a binary blob off the input stream and then creates each object without calling a constructor*. You can't just skip calling a constructor in Java code, but Java deserialization does exactly that. All the fields are set by magic, by which I mean it ignores getters and setters and whatever access level might be on the fields. Any field marked as "not serialized" (transient) is left with default values - but those may not be the default values you think! If you write private transient int foo = 3; then foo won't be serialized, and when the object is deserialized, it will instead be ... 0. Because 0 is the default for ints.
How does Java deserialization know if it's loading the right fields for a given object? Well, it's magic, but not that magic - you're supposed to let it know by setting the serialization ID for the class. And how do you do that? By declaring a static final long serialVersionUID, and making sure you update it whenever your class structure changes. Don't do that and the deserialization logic might not notice that the structure doesn't quite match. No, you can't just have it autogenerate one - if it's not set, the serialization/deserialization code will create one, but it may be compiler-dependent and randomly break across identical code bases. Surprise!
But in any case, the serialization system is magic. How do you write a custom serializer/deserializer? By creating the private methods writeObject(ObjectOutputStream) and readObject(ObjectInputStream). Because the serializer is magic, it can access these private methods. (Note that readObject(ObjectInputStream) gets called on a magically created object that has never had a constructor called on it, so all fields will have their default values! How does that work with final fields? Well... the short answer is "like shit." The longer answer is that the default deserializer just ignores the final modifier (which you can't do in generic code), and if you want to do the same, there's some reflection magic or non-standard APIs you can use.)
So anyway, there's a basic overview of how Java serialization defies expectations and basically guarantees that anyone writing code that involves serialization will do it wrong.
* This is false. What it really does is go up the object hierarchy and look for the first parent class that does not declare itself serializable and calls its default no-args constructor. But that means that your class that you declared serializable therefore, by definition, does not get its constructor called. Surprise!
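A throwaway sketch demonstrating both surprises above - the skipped constructor and the transient default:

    import java.io.*;

    class Foo implements Serializable {
        private static final long serialVersionUID = 1L;
        transient int bar = 3;                      // excluded from the stream
        Foo() { System.out.println("constructor ran"); }
    }

    public class MagicDemo {
        public static void main(String[] args) throws Exception {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(bos);
            oos.writeObject(new Foo());             // prints "constructor ran"
            oos.flush();
            Foo copy = (Foo) new ObjectInputStream(
                    new ByteArrayInputStream(bos.toByteArray())).readObject();
            // Nothing printed on the read side: Foo's constructor (and its
            // field initializers) were skipped; only Object's no-arg
            // constructor ran.
            System.out.println(copy.bar);           // 0, not 3
        }
    }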
Re: (Score:2)
But that means that your class that you declared serializable therefore, by definition, does not get its constructor called. Surprise!
No surprise. Calling a constructor when you deserialize an object makes no sense. That is why Java rightfully does not do that.
RAS Syndrome (Score:2)
EMP pulse, ATM machine, HIV virus, PIN number, MSDS sheet, NIC card, IRA account, ISBN number, LCD display, PDF format, SPF factor, AC current. English does this, deal with it.
Re: (Score:2)
Of course, avoiding making every application developer write his own application-specific serialization/deserialization routines was largely the point of this Java feature, but in hindsight it appears that was a bad decision.
Yeah. It seems like there's no really good way to make this feature work. Whitelists can help, but ultimately there is no way to avoid thinking about security when you read things off the wire.
Re: (Score:2)
What I've gathered is that Java allows the actual bytecode of the class implementation to be included in the serialized data. This allows code to be injected, not just malformed data. I would think the fix would be to just remove the part of the JVM that creates new classes from serialized data.
It won't change anything (Score:5, Insightful)
Java infantilization (Score:2)
In some ways, Java started as a toy language and has headed downhill since. Multiple inheritance and deterministic object destruction are hard but useful; Java never had those, but it did have an option for full-featured, grown-up applications on the desktop and in the web browser. Of course, native look and feel for the former and security for the latter are hard, so out these features go. Sun couldn't make J2ME on mobile phones work either, so it took another company to productize Java for mobile apps. Instead of appreciatin
Comment removed (Score:3)
Object serialization, bad for perf, sec and ops. (Score:2)
Object serialization was horrible from day one. It was the tool of lazy programmers for years.
Performance-wise it was a disaster. People would pass objects between JVM instances with no regard for the size of the data blob they were sending, when in reality very little of an object needs to be sent in most use cases. Example: when you have a Java cluster and you sync session objects between instances. (Note: this is just dumb anyway; there are far better patterns for this.) But your low cost dev
Re: RMI and serialization was useful (Score:2)
The problem lies in that there's no validation of who's submitting or fetching the data, nor that the data is correct when deserialized. Someone can compose a binary stream crafted to produce something unexpected when deserialized.
Exchange formats like XML can be validated before parsing.
Re: (Score:3)
The Java serialization feature was fine; it just wasn't meant to be robust against adversaries. Java RMI uses the serialization feature and thus has the same problem. It is fine in trusted environments.
It is a bad idea to use them with untrusted sources.
I didn't RTFA, so I don't know if there are reasons other than security for removing the feature.
Re: (Score:2)
I think it's got more problems than just the security thing. Java serialisation is a pain in the ass.
Re: (Score:2)
Compared to having to write your own non-standard serialization/deserialization, it's not bad. At the time this was created there wasn't much XML support in Java or other languages either.
A lot of computer languages and design paradigms have been created through history - some of them were fine at the time, a few have survived, and some have been overdue for replacement for decades, but there is no way to migrate away from them.
If you want a real pain in the ass - Cobol fixed-size records where the sig
Re: (Score:2)
Oh, I don't know; it's pretty bad. It doesn't solve any of the things that inevitably come up, like what happens when you change your classes: how do you migrate the legacy data? It's not bad if you're a school student hacking together some rubbish, but in the real world it's pretty useless.
Re: (Score:2)
Java's RMI was not cross-language, so it was only useful if both ends were written in Java.
Corba was cross-language, but had several problems:
- "Design by committee". Designed in the OMG, and it wasn't until version 2 that they bothered to specify the on-wire format.
- No versioning of interfaces. Unless the developers took care not to break things, you had to upgrade both ends.
- Cross-platform exceptions didn't work too well. I remember spending a week tracking do
Re: (Score:2)
While I've dabbled in several other languages, Java has 'paid my bills' for most of the last twenty years, so my view may be slanted. However, I can say that I've never written nor encountered a 'buffer overflow' hidden away in any code I was directly responsible for, and while there have been a few over the years in the core and some well-used libraries, they have usually been related to serialization. This is a good move.
I'd argue that Java is the reason why so many Linux servers are used in the corporate e
Re: (Score:2)
Yes, although technically you're talking about the portability of the Java VM rather than Java the language. The language I find adequate and mediocre, but I agree the VM and the "it just works" thing make it compelling.
Re: (Score:2)
and fucking JEP 286 (yay, let's make Java like JavaScript!) is a good idea,
What’s wrong with type inference, and how would that make Java like Javascript? Just because both would use the keyword “var” does not make the concepts the same, especially as, like in C# and C++11, everything would still be statically typed, unlike Javascript.
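A minimal sketch of what JEP 286 actually does (Java 10+); the static type is still fixed at compile time, it's just not written out:

    import java.util.ArrayList;

    public class VarDemo {
        public static void main(String[] args) {
            var names = new ArrayList<String>();  // inferred: ArrayList<String>
            names.add("duke");
            // names.add(42);   // compile error: still statically typed,
            //                  // unlike JavaScript
            System.out.println(names);
        }
    }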
Re: (Score:2)
Yes, it is: to save keystrokes in writing useless boilerplate that the compiler can infer on its own.
Re: (Score:2)
The entire purpose of computers is to automate (usually calculations in some form). If you're telling the compiler stuff it already knows, you're doing it wrong.