Java Programming Python The Internet

Thoughts On the State of Web Development

rmoskal recommends his blog post up at Most Media on finding the right level of abstraction, Grails, and SOFEA. "[Three years ago] I was very excited about Apache Wicket as the way to develop line-of-business applications with a domain model, CRUD [create-read-update-delete] screens for maintaining the model, and in the most interesting cases, doing something else useful besides. I still like Wicket. It has, as its website says, a 'small conceptual surface area.' It reminds me of Python in that 'you try something and it usually just works.' In many respects, though, Wicket seems to be at the wrong level of abstraction for the sorts of line-of-business applications described above. If your team is spending any time at all writing code to produce listing, filtering, and sorting behavior, not to mention creating CRUD screens and the back-end logic for these operations, they are probably working at the wrong level of abstraction. ... Recently I did a small project using Grails and was quite pleased. Grails uses Groovy, a dynamic language compatible with Java, and is based on the proven technologies that I know and love well: Spring, Hibernate, SiteMesh, Maven, etc. ... I get all the power of the Java ecosystem without the fustiness and lack of expressivity of the core language (no more getters and setters, ever!)."
  • getters setters :) (Score:5, Insightful)

    by roman_mir ( 125474 ) on Sunday April 18, 2010 @06:17PM (#31890510) Homepage Journal

    First, getters/setters are generated by your IDE, of course, so you never have to write them by hand. More to the point, though, I have avoided various parts of the 'Java ecosystem' while still using the language for all sorts of development, and really, you don't have to use getters/setters. I use many Java classes as simple data structures, just as the C gods intended: no getters or setters there, just public or protected fields.
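    For what it's worth, a minimal sketch of the kind of plain data holder described here, with public fields and no accessors (the Point class is invented for the example):

        // A plain data structure in the spirit described above:
        // public fields, no getters or setters.
        public class Point {
            public int x;
            public int y;

            public Point(int x, int y) {
                this.x = x;
                this.y = y;
            }
        }

        // Usage: Point p = new Point(3, 4); p.x += 1;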

  • by Banacek ( 994201 ) on Sunday April 18, 2010 @06:27PM (#31890598)
    Using Java for web development is like using a wrecking ball to hammer in a nail. Use something that fits the job better, like Zend Framework, Django or Ruby on Rails. In web development, time to market is everything. Build your application and hopefully you get a large user base. Then when performance is an issue, you should already be working on a rewrite that can incorporate something like Java on the backend.
  • Need a New UI Tool (Score:5, Insightful)

    by Tablizer ( 95088 ) on Sunday April 18, 2010 @06:29PM (#31890610) Journal

    As I've said before on Slashdot, the open-source movement should look at creating a new GUI browser that does desktop-like and CRUD GUIs well. Forcing the e-brochure-like HTML-based browsers to act like desktop/CRUD GUIs is like trying to roll Pluto up Mt. Everest: people have kept trying to pull it off with AJAX and whatnot for more than a decade, but it's still kludgy, bloated, buggy, security-challenged, and version-sensitive.

    It's time to throw in the towel and start a new tool and markup language.
       

  • by hessian ( 467078 ) on Sunday April 18, 2010 @06:49PM (#31890758) Homepage Journal

    Only Perl is true!

  • by paziek ( 1329929 ) on Sunday April 18, 2010 @06:51PM (#31890768)

    The title of the summary is wrong; it's not about web development.
    What is the purpose of this? Don't we want to move away from Java on the web? It was slow back in the day, and now it's just yet another security risk that can compromise ALL browsers at once if Oracle/Sun screws up. And now some spin-off or whatever? No thanks.
    What we really need is some kind of consistency in the output of HTML/JS engines, as well as CSS, so that GUIs "just work". There is nothing wrong with those languages/markups, just with the implementations. While I'm all for competition and browsers trying to be better than one another, it seems to me that in this area they should cooperate more. It was IE6 back in the day; now it's vendor-specific extensions all over the place that, instead of going through the W3C first, are just added to the browser with a -moz/-webkit/-whatever prefix, as if that makes it okay. Or perhaps they want to take the approach of "Hey, we implemented X and it's popular, can we force this into the W3C now?", which I hope won't work very well for them.

  • by icebraining ( 1313345 ) on Sunday April 18, 2010 @07:19PM (#31890924) Homepage

    The biggest problem with Java applets is that they pretty much force you to use Java as your programming language. Many of us want language choice.

    Well, there are at least proofs of concept for Ruby (JRuby), Python (Jython), and Groovy.
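    Those JVM languages can also be driven from plain Java through the standard javax.script (JSR-223) API. A minimal sketch, assuming the Groovy (or Jython/JRuby) engine jar is on the classpath; otherwise getEngineByName returns null:

        import javax.script.ScriptEngine;
        import javax.script.ScriptEngineManager;

        public class PolyglotSketch {
            public static void main(String[] args) throws Exception {
                ScriptEngineManager manager = new ScriptEngineManager();
                // "groovy" could be swapped for "python" (Jython) or "jruby" (JRuby)
                // when the corresponding engine is on the classpath.
                ScriptEngine engine = manager.getEngineByName("groovy");
                if (engine == null) {
                    System.err.println("No engine found; add the engine jar to the classpath.");
                    return;
                }
                engine.put("name", "Slashdot");
                Object result = engine.eval("'Hello, ' + name + '!'");
                System.out.println(result); // Hello, Slashdot!
            }
        }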

  • by Kenneth Stephen ( 1950 ) on Sunday April 18, 2010 @07:20PM (#31890926) Journal
    ...are doomed to reinvent J2EE. Badly.

    I confess that I haven't tried any of the frameworks mentioned by the parent. But I have had conversations with people who have, and here are some questions I have for the folks who think that the people who came before them (and invented J2EE) are stupid:
    1. Can your framework handle two-phase transactional commits when it interfaces with other applications? (A minimal JTA sketch follows this list.)
    2. How well does it support single sign-on across apps deployed on different servers but behind a reverse proxy that unifies them under a single domain?
    3. Can you cluster multiple hosting servers for your app to minimize downtime during app upgrades? If so, do your application sessions fail over correctly to the other members of the cluster?
    4. Can you take legacy code and layer your app around it without needing to rewrite the legacy app? Can you do this even if the programming team that wrote the legacy app is no longer around?
    5. When you discover that you are having intermittent glitches (slow responses / server 500 response codes / etc.), do you (a) reinstall, (b) upgrade to a newer version of your framework / OS / whatever, (c) troll the user forums for your product / framework and hope that someone has seen your problem before, or (d) pull three all-nighters reading the source code to your product / framework? [Hint: the right answer is (e) put your product into a supported trace mode and get your vendor to support you.]
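    To make question 1 concrete, here is a minimal, hypothetical sketch of the J2EE route: a JTA UserTransaction obtained from the container via JNDI, with two XA-capable DataSources enlisted in the same global transaction (the JNDI names jdbc/OrdersDS and jdbc/InventoryDS and the SQL are invented for the example). With both resources enlisted, the container's transaction manager drives the two-phase commit.

        import javax.naming.InitialContext;
        import javax.sql.DataSource;
        import javax.transaction.UserTransaction;
        import java.sql.Connection;

        public class TwoPhaseCommitSketch {
            public void shipOrder() throws Exception {
                InitialContext ctx = new InitialContext();
                UserTransaction utx = (UserTransaction) ctx.lookup("java:comp/UserTransaction");
                DataSource orders = (DataSource) ctx.lookup("jdbc/OrdersDS");       // hypothetical JNDI name
                DataSource inventory = (DataSource) ctx.lookup("jdbc/InventoryDS"); // hypothetical JNDI name

                utx.begin();
                try (Connection c1 = orders.getConnection();
                     Connection c2 = inventory.getConnection()) {
                    // Both connections are enlisted in the same global (XA) transaction,
                    // so the transaction manager runs prepare/commit across both databases.
                    c1.createStatement().executeUpdate("UPDATE orders SET status = 'SHIPPED' WHERE id = 42");
                    c2.createStatement().executeUpdate("UPDATE stock SET qty = qty - 1 WHERE sku = 'ABC'");
                    utx.commit();   // two-phase commit coordinated by the container
                } catch (Exception e) {
                    utx.rollback(); // both resources roll back together
                    throw e;
                }
            }
        }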

    IMHO, programming language wars are silly. The proof of the pudding is in what you can achieve with your framework of choice. After many years of observing the competitors to J2EE, I have yet to see a professional-grade alternative to it.

  • by binarylarry ( 1338699 ) on Sunday April 18, 2010 @07:48PM (#31891072)

    It rocks until you need to support more than a dozen users. Then you need something like Java, which has proven to scale to meet any demand.

    Just ask the folks at Twitter.

  • by AuMatar ( 183847 ) on Sunday April 18, 2010 @07:49PM (#31891080)

    In Java, it's a weakness of interfaces: you can't have a data member in an interface, so any interface needs to use getters/setters.

    In most languages, it's so you don't give direct access to internal variables. In C++, you can just make the data member public and assign to it like a normal variable, but then you can't protect what is stored there or do sanity checks. Or you can write a getter/setter to do so. And if you decide to change from one to the other, you need to go back and change every access everywhere, which is a pain. Some languages like C# provide syntactic sugar that lets you write foo.bar = 9 and have that call a function, but there are problems with this: it's impossible to tell by inspection whether you're calling a function or not, so you can have bugs that are difficult to find by code inspection; = does not always do what you expect it to. I prefer writing getters and setters to that.
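    For illustration, a minimal Java sketch of that trade-off: a bare public field offers no place to hook a sanity check, while a setter does (the Account class and its non-negative-balance rule are invented for the example):

        public class Account {
            // Option 1: bare public field. Simple, but nothing stops a caller
            // from assigning any value at all.
            public String owner;

            // Option 2: private field behind a getter/setter, so the class can
            // enforce an invariant whenever the value changes.
            private long balanceCents;

            public long getBalanceCents() {
                return balanceCents;
            }

            public void setBalanceCents(long balanceCents) {
                if (balanceCents < 0) {
                    throw new IllegalArgumentException("balance cannot be negative");
                }
                this.balanceCents = balanceCents;
            }
        }

    Changing owner from a public field to the getter/setter style later would mean touching every call site, which is exactly the pain described above.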

  • by Rival ( 14861 ) on Monday April 19, 2010 @02:42AM (#31893304) Homepage Journal

    With the amount of abstraction in software development these days, very few people seem to really know what they're doing anymore. This should concern you -- if it doesn't, you haven't thought about it enough yet.

    We regularly see new exploits that affect systems that have been live for years, oft-times spanning multiple major versions and platforms. In retrospect these flaws are often painfully obvious, but because they have been buried in the layers of sediment of "best practices", "boilerplate" code and underlying platforms, they aren't seen. At least, not until a curious or malicious mind starts poking around.

    While this is in part a problem with QA, the deception of abstraction is that it provides a Black Box that is very easy to trust. This affects developers as much as QA.

    Are we really wise to keep building on these layers of abstraction? Toolkits on top of frameworks on top of virtual machines on top of operating systems on top of hardware -- even device manufacturers can't keep their locked-down devices from being rooted in a matter of days, sometimes even before release. While many of the Slashdot crowd laugh because there is a sense of social justice in seeing DRM broken, the same exploits may some day be used against systems we rely on. I don't consider myself a fearmonger, but I wouldn't be surprised to see significant digital infrastructure fail at some point in my lifetime, either due to malicious intent or simply instability. Poor software quality hurts us all.

    I realize that I sound like an old man yearning for the better days, but I learned to program in assembly on 8088s, and I knew exactly what my programs were doing. I'm not saying I want to go back to that, but the idea that most developers these days don't even understand memory management or garbage collection blows my mind. Asking for a new language because getters and setters are too much of a hassle? Somebody get this kid a lollipop, please.

    I read the article (no, I'm not new here) and the author's main point, emphasis original, is this:

    If your team is spending any time at all writing code to produce listing, filtering, and sorting behavior, not to mention creating CRUD screens and the back end logic for these operations, they are probably working at the wrong level of abstraction.

    Where does he draw the line at "wasting time writing code"? This is exactly the mindset that leads us to buffer overruns, SQL injections, and many other problems which should not make it into production software. He wants his developers to abstract as much as possible, but code reuse all too easily leads to blind acceptance and a failure to understand what is being imported. If he trusts that all those acronyms in the blog post he wrote are bug-free, then I would hate to be one of his customers. Not that there seem to be many categorically better options available.

    In the end, I think we need to abandon the cycle of "software bloat to more powerful hardware to software bloat..." and figure out what we can do with what we have. Good grief -- look at CUDA! We have orders of magnitude more processing power in a single video game console than all the world's computers before World War II, and available memory is simply insane. Take a look at what Farbrausch [wikipedia.org] has done, and you will see what dedicated focus on efficiency can do.

    Stop being lazy, understand what you are doing, understand what you have available, and use it well.

  • by Daishiman ( 698845 ) on Monday April 19, 2010 @04:10AM (#31893620)
    The fact that you even NEED an add() or increment() method shows the idiocy of the programming language.
  • Most can't (Score:1, Insightful)

    by Anonymous Coward on Monday April 19, 2010 @10:22AM (#31895726)

    Most people can't be experts at all things. It is usually poor practice to try to implement everything yourself, as most often those who wrote the library know better what they are doing. Sure, if you are perfect and have infinite time, doing what you propose makes sense, but it is almost always faster and usually safer to trust that the writers of a library know what they are doing.

    Do you wish to write your own sort algorithm? Your own SQL interface? Your own filtering implementations? If so, I would hate to be your employer, as you will spend much more time writing less efficient code that probably has more exploits than some mouth-breather who just used the default libraries.

    I understand your point, but abstraction is really important. An electrical engineer could never design a computer if he had to worry about what every transistor in his circuit did; he trusts that the amplifier provided works more or less as specified, and the same goes for the phase detector and the variable oscillator. If he were to try to build it from transistors (or worse, from dopings in silicon), he would never get anywhere. The abstractions allow him to function, building on the shoulders of those who created the various parts and trusting that they did their jobs well. It isn't perfect, but it is better than everyone starting from square one.
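    To make the sort example concrete, a trivial sketch of leaning on the JDK's long-tested sort and comparator rather than writing either by hand:

        import java.util.Arrays;

        public class SortSketch {
            public static void main(String[] args) {
                String[] names = { "mallory", "Alice", "bob" };
                // The library sort and the built-in case-insensitive comparator
                // have been exercised for years; no need to hand-roll them.
                Arrays.sort(names, String.CASE_INSENSITIVE_ORDER);
                System.out.println(Arrays.toString(names)); // [Alice, bob, mallory]
            }
        }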

  • by wurp ( 51446 ) on Monday April 19, 2010 @10:26AM (#31895808) Homepage

    I haven't read your whole post in depth, but this stood out:

    Where does he draw the line at "wasting time writing code"? This is exactly the mindset that leads us to buffer overruns, SQL injections, and many other problems which should not make it into production software.

    No, rewriting functionality that already exists in stable, tested libraries is the mindset that leads us to buffer overruns, etc.
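    For the SQL injection case specifically, the stable, tested route is JDBC's PreparedStatement, which keeps user input out of the query text. A minimal sketch (the jdbc:h2:mem:demo URL and the users table are invented for the example; the corresponding driver would need to be on the classpath):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;

        public class QuerySketch {
            public static void main(String[] args) throws Exception {
                String userInput = "o'brien"; // would break naive string concatenation
                try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:demo"); // hypothetical URL
                     PreparedStatement ps = conn.prepareStatement(
                             "SELECT id, email FROM users WHERE last_name = ?")) {
                    ps.setString(1, userInput); // the driver handles quoting/escaping
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            System.out.println(rs.getLong("id") + " " + rs.getString("email"));
                        }
                    }
                }
            }
        }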
