Programming

Web-based IDEs Edge Closer To the Mainstream 244

snitch writes "Last week Mozilla released Bespin, their web-based framework for code editing, and only a few days later Boris Bokowski and Simon Kaegi implemented an Eclipse-based Bespin server using headless Eclipse plug-ins. With the presentation of the web-based Eclipse workbench at EclipseCon and the release of products like Heroku, a web-based IDE and hosting environment for RoR apps, it seems that web-based IDEs might soon become mainstream."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Why not? (Score:2, Insightful)

    by Seth Kriticos ( 1227934 ) on Thursday February 19, 2009 @11:09AM (#26916637)
    Sure, they will not replace local editing tools for the main development of applications, but for remote access and small stuff it sounds nice.
  • by puppetluva ( 46903 ) on Thursday February 19, 2009 @11:27AM (#26916921)

    Don't get me wrong... I think it is an amazing technical feat, but is it really practical to require internet access for this?

    I think it is time that we as a community get behind a project that allows these remote apps to be cached locally for fully disconnected use (with a desktop runtime -- something akin to Adobe AIR). It would be great to visit the site once and thereafter run it locally (and get updates later while connected). As long as I'm fantasizing, I think we should try to make this a standard for new desktop apps -- written like gadgets, but full-blown apps.
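    (One emerging answer along these lines is HTML5's offline application cache, which lets a site declare which resources the browser should keep locally and serve while disconnected. A minimal sketch with hypothetical file names -- the page opts in via <html manifest="bespin.appcache">, and the manifest has to be served as text/cache-manifest:)

        CACHE MANIFEST
        # bespin.appcache -- v1 (change this comment to force clients to re-download)

        CACHE:
        index.html
        editor.js
        editor.css

        NETWORK:
        *

    After the first visit, the files under CACHE: are served from local storage even with no connection, and the browser re-fetches them when the manifest changes and a connection is available.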

    What do you think? Are there projects out there that are working on this already?

  • Offline coding??? (Score:3, Insightful)

    by Foofoobar ( 318279 ) on Thursday February 19, 2009 @11:36AM (#26917063)
    I often find myself without an internet connection and will just pull up Eclipse on my laptop and work on my checked-out copy of the codeline. I don't need the connection except to check code back in, and version control systems (if set up and used properly) already allow for collaboration (to an extent). So why should I require a connection to code? I want to work on code whenever I want, regardless of whether I can find a wifi hotspot or not.
  • by TerranFury ( 726743 ) on Thursday February 19, 2009 @11:39AM (#26917105)
    Apart from "xhost +" (which is a bad security move), I wholeheartedly agree. This is what X was designed for.
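    (For reference, the safer pattern here is plain SSH X11 forwarding rather than opening up the display with "xhost +"; the host name below is a placeholder:)

        ssh -X user@devbox     # forward X11 over the SSH connection
        eclipse &              # remote IDE, window rendered on the local display
        ssh -XC user@devbox    # -C adds compression, which helps on slower links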
  • by ktstzo ( 885924 ) on Thursday February 19, 2009 @11:45AM (#26917233)
    Applets
  • As people romance the scale and stability of the mainframe and move towards centralized, mainframe approaches, they forget the reasons that gave birth to the PC revolution to begin with.

    Having your stuff on your computer is an immensely liberating act. No matter what the terms of service say, your data is in someone else's charge when it's on yonder mainframe, and you are at the mercy of their data center when it comes to performance, user interface, and virtually every other aspect of the system.

    On the other hand, with a PC, particularly as applications move towards more open file designs, you get much more control, more choice, and as much power as you would like to invest in.

  • I'm skeptical (Score:5, Insightful)

    by IdahoEv ( 195056 ) on Thursday February 19, 2009 @11:52AM (#26917361) Homepage

    I'm going to remain skeptical.

    Net apps are great, but their performance in many areas is unavoidably way below that of native apps. When you can do everything with JS, you can be reasonably speedy if the processing requirements aren't huge and your browser doesn't leak memory too badly. (Dammit, Firefox!)

    But when you need to persist data, you have to spawn an ajax query and that 1/10 to 1/4 second (even over a fast network connection) just isn't comparable from the user perspective to hitting a local HD. As local mass storage switches from HD to solid-state over the next couple of years, the difference between native and web apps is going to increase, not decrease.

    Besides, half of these things are going to be ad-supported, right? At least in my experience, the performance of most websites has decreased over the last 3 years or so as they hit an increasing number of different servers. It's typical for a single page to load content, ads, local javascript, stylesheets, and analytics from 10 or more different hosts. Each of these connections triggers its own DNS query. Every connection and every DNS lookup has a percentage chance of hanging for a few seconds due to network traffic, server load, or what have you - as a result, almost 10% of web pages I try to load these days stall for a few seconds. Do you really want that kind of crap going on in the background while you're developing? I don't.

    Hah! I was just reminded of a most annoying example! Slashdot, for me, loads pretty much instantly. But every time I post and click that "preview" button, there's a five-second wait before the preview actually shows up. That'll be fun: an additional five seconds for every class-file save in my IDE...

  • by godrik ( 1287354 ) on Thursday February 19, 2009 @12:08PM (#26917601)

    What do you think?

    I think that I do not need web-based applications. They are slow and do not do what I want them to do. They trap you in a given view of your data without giving you fine-grained control over it. Can you search your mail on Gmail for a message containing a URL that matches a given list from a web calendar? No, you cannot, because you have no raw access to the data.

    I only use web-based applications for remote access when I am using a Windows machine (otherwise, I can use ssh and X11 forwarding and everything is fine).

  • Re:Why not? (Score:5, Insightful)

    by Marxist Hacker 42 ( 638312 ) * <seebert42@gmail.com> on Thursday February 19, 2009 @12:16PM (#26917695) Homepage Journal

    The one does not have anything to do with the other. One provides a managed place for you to put your code, the other lets you write code in a comfortable unified environment. Why would you want to integrate those two?
     
    When working in a team environment, integrating the two makes for another channel of communication, especially between geographically separated team members (which seems to be an increasing trend in my personal contracts; I've gone from originally working in a company where source code control was done by shouting over the cubicle walls, to a situation where I'm getting up 3 hours earlier than normal to collaborate with coders in European time zones). It also seems to me that integrating the two would greatly simplify the autosave process, especially in a web environment -- thus capturing all branches of the code automatically server-side, for the project manager to integrate the final code for the build.
     
    Of course, this all would require at least two major advancements to the current codesets in TFA:
     
    1. enough speed for professional software development (something that even current client-side IDEs sometimes lack for me, though the problem might be more of a PEBCAK, or more precisely, a PEBBAF: Problem Exists Between Brain And Fingers, rather than between chair and keyboard).
    2. sufficient integration between the data entered in the client-side web interface and the code repository to show changes when two team members are working on the same source code file.

  • Re:Why not? (Score:3, Insightful)

    by JoeMerchant ( 803320 ) on Thursday February 19, 2009 @01:37PM (#26918899)

    Focus on the business at hand (e.g. coding) and quit wasting time on infrastructure (version control, defect tracking, build systems, backup & recovery, server sizing, etc...).

    In the past 2 years, I have spent 3490 hours on the "business at hand" (e.g. coding, documentation, meetings, etc.), 10 hours on infrastructure (setup and maintenance of trac, svn, backups, VPN) and 500 hours on lunch. It's a small company, I'm a programmer and the nominal back-end sys-admin.

    We easily spend more time configuring people's POP & SMTP settings on their e-mail than we do on our trac and svn servers, which are used daily by 75% of the company.

    The real infrastructure inefficiency happens when you give somebody the title "infrastructure guy" with no other responsibilities. Based on our experience, you should need one full time guy for roughly every 300 programmers. Problem is that he also configures people's e-mail, printer connectivity, etc. so when that 1/1000 programmer's support call comes, he's clueless.

  • by Bill, Shooter of Bul ( 629286 ) on Thursday February 19, 2009 @01:46PM (#26919029) Journal
    Why is that tempting? It seems like the equivalent of paying second graders to do your taxes.
  • Partial Bullshit (Score:3, Insightful)

    by jonaskoelker ( 922170 ) <jonaskoelkerNO@SPAMyahoo.com> on Thursday February 19, 2009 @02:00PM (#26919227)

    Let's see.
    As a user, you get:

    • more power - no, the server controls your data and you have to be the kind of person who knows what a web scraper is and how to write one to get it out without spending $BIGNUM hours.
    • reliability - depends on whether you manage your local app better than the combination of you always managing to have a connection and them managing the app.
    • centralized support - whether that's good depends on how good the support you can actually get is. Your mom is going to prefer good support from you rather than mediocre support from "Robert" who works in Calcutta.
    • automated backup - if you delete a mail from your yahoo mail account, can you get it back? It's only really a guard against hardware failure, which I haven't seen on my computers except for a single DVD burner (big deal: I lost only money and no data).
    • universal access - true.

    As a company, you get:

    • more billing options - great, except consumers hate having to optimize depending on their changing usage scenarios.
    • control over your data - unless you consider your customers' data yours, I don't see how.
    • better security - you are running more potentially vulnerable applications than otherwise. How's that better security?
    • licensing - Blizzard seemed to fail (there are (or were) no-pay servers working well with the no-pay trial client), despite slashdot saying companies should go that route.

    I'm not saying I hold the objective truth, just some counterpoints which seem to justify a deeper investigation.

  • by TerranFury ( 726743 ) on Thursday February 19, 2009 @03:00PM (#26920141)

    universal access

    If the Internet fsking worked the way it was supposed to, I wouldn't need some other server; my own machine would be a first-class citizen, and so long as I could remember its IP address I could SSH in.

    I used to do just this. I was at a university which had a very nice, rather open network, and I could access my machine from anywhere in the world. Why bother even carrying a laptop around when you can x-forward your machine to any of a thousand terminals scattered around campus? But these days I'm at another university, and their network is locked down in arcane and nondeterministic ways, so that sometimes I can access my machine, sometimes I can't, and god only knows why. The one thing you can reliably do is surf the web.

    ...which is why we're cramming all this bullshit into web browsers to begin with. We've kept the Web working, but broken the Internet.

  • Re:Why not? (Score:2, Insightful)

    by amoeba1911 ( 978485 ) on Thursday February 19, 2009 @03:09PM (#26920275) Homepage
    "it's not quite as good as Photoshop"

    Photoshop: $500
    Sumo-Paint: $0

    Value = Quality / Price
    Sumo-Paint = infinite value.

    Nice flash program.
  • by Mechanik ( 104328 ) on Thursday February 19, 2009 @03:30PM (#26920545) Homepage

    Another option would be to run your IDE on your netbook but alias make to "ssh make", or use a well-configured distcc. The last point would be transferring data. Two options are available here: either you rsync it to the server, or you mount the code directory on your local machine using ssh, FUSE, and sshfs.
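    (For concreteness, the parent's setup amounts to something like the following; the host name and paths are hypothetical:)

        # option 1: mount the remote source tree locally over SSH/FUSE
        mkdir -p ~/remote-project
        sshfs user@devbox:/home/user/project ~/remote-project

        # either way, run builds on the server instead of the netbook
        alias make='ssh user@devbox "make -C /home/user/project"'

        # option 2: keep the sources local and push them over before each remote build
        rsync -az ~/project/ user@devbox:project/ && ssh user@devbox 'make -C project'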

    Mounting via NFS or SMB is generally dog slow and not recommended. Using an intelligent IDE like Eclipse, which wants to parse and index all your source code, on a mounted drive makes things even worse, as it crawls through every single file in your source tree.

    A better solution is an IDE that leaves the files where they are, and offloads all the heavy lifting to that machine, where disk accesses are fast, or are at least fast compared to sending everything over the wire.

    I mentioned this in another post, but due to my screwup of not applying good formatting to that one, I'll repost it here, as it's just as relevant. (mods, please ignore the other one)

    Shameless plug... I work on an Eclipse project (Remote Development Tools, aka RDT) that allows you to have a local Eclipse that targets a C/C++ project living on the remote machine. No SMB or NFS mounting required, nor do you have to deal with the slowness of running Eclipse over X. Nor do you have to run your IDE in a browser. The UI runs natively on your local machine, so all that nice stuff like copy/paste and drag 'n' drop just works (mostly...).

    When you do something such as build your project (e.g. "make all", but it can be whatever you wish), the build commands are sent to the remote machine via the remote protocol of your choice. This can be via SSH if that's what you want. The output of the build is populated back to the local machine over the wire, and errors and warnings in your code are mapped onto those remote files so you can do lovely things such as clicking on the errors and being taken to the right location in that file.
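    (Roughly the manual equivalent of what's described above, which the plug-in automates and wires into the editor; the host and paths are placeholders:)

        ssh user@devbox 'cd /home/user/project && make all' 2>&1 | tee build.log
        # gcc-style "file:line: error/warning" markers then have to be matched back
        # to the remote files by hand -- the step RDT does for you:
        grep -E ':[0-9]+: (error|warning):' build.log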

    If you want all the whizz-bang features like intelligent search, content assist, call hierarchy, etc. to work in a secure environment, then tunnel the mandatory dstore connection over SSH, or set up SSL (I'd recommend the SSH tunnel route).

    Here are the release notes [eclipse.org] for our first release. I'd recommend downloading the latest 2.1.1 nightly build over the 2.1 build to get some key bug fixes.
