Firefox 15 Coming With Souped-Up, Faster Debugger
StormDriver writes "Firefox 15 has hit the Mozilla pre-beta Aurora channel, and it features a redesigned, built-in debugger."
The original weblog post has more. Thanks to improved debugger internals in SpiderMonkey, code should supposedly run just as fast with debugging enabled as without (ever try loading Slashdot with Firebug accidentally enabled?). There are also new tools for testing mobile layouts from the comfort of your workstation, and the debugger can attach to remote processes (something Emacs users have enjoyed for years, albeit in a hackish manner and without support for mobile Firefox).
Firebug or Built in Web Console? (Score:4, Informative)
(ever try loading Slashdot with firebug accidentally enabled?)
Yeah, it takes forever. But what is much faster is the built-in Web Console in the Tools menu of newer versions of Firefox [mozilla.org]. I forget which version started natively supporting debugging, but it got a lot better (4, I think?). I'm very excited to see these improvements, but my JavaScript has to support versions of Firefox all the way back to 3.6, so I'm still using Firebug, and I'm still super grateful that Firebug came around. It revolutionized debugging web applications for me. There may have been tools before it but, man, that was the final nail in IE's coffin for support from us. Hell, even Chrome's built-in debugging is way better than anything I can find for IE. I know the latest IE versions have gotten better, but it's my strong opinion that every single person who uses the internet should be thankful for Chrome, Mozilla, Venkman, and these debugging tools. They made the web experience a hell of a lot better and more open by empowering developers.
Debugging Is the Next Frontier in Faster Browsing (Score:5, Informative)
So why not focus on faster browsing rather than debugging?!?
As a web developer, I've watched most browsers (yes, even IE) get into the sub-millisecond rendering range. I mean, we're getting to the point where the browser is negligible compared to your network. Yes, you have broadband and it should be lightning fast, but there are little unavoidable delays for every GET or POST. So the next best thing is to empower the developers who write the JavaScript to find out where their delays are. As debugging improves, we can break down the experience for each resource (images, CSS, JS, etc.) on a page and display it to the developer right in the browser; then the developer can think about turning all those images into a sprite sheet or improving some code. This actually makes browsing faster for everybody by putting the right tools in developers' hands. You can spend forever optimizing the backend, but it doesn't mean jack squat when you're requesting 99 separate little images the first time a user hits the page.
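The kind of per-resource breakdown described above can be sketched with the (now-standard) Resource Timing API. The grouping function below is a made-up illustration, not anything shipped by a browser; `extensionOf` is a hypothetical helper:

```javascript
// Sketch: summarize per-resource load time by file extension, using
// entries shaped like those from performance.getEntriesByType("resource").
// extensionOf() is a hypothetical helper, not a browser API.
function extensionOf(url) {
  const path = url.split(/[?#]/)[0]; // drop query string / fragment
  const dot = path.lastIndexOf(".");
  return dot === -1 ? "(none)" : path.slice(dot + 1).toLowerCase();
}

function summarizeByType(entries) {
  // entries: array of { name, duration } objects
  const totals = {};
  for (const { name, duration } of entries) {
    const ext = extensionOf(name);
    totals[ext] = (totals[ext] || 0) + duration; // accumulate ms per type
  }
  return totals;
}
```

In a page's console you could feed it real data with `summarizeByType(performance.getEntriesByType("resource"))` and immediately see whether images, scripts, or stylesheets dominate the load.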
Re:wait isnt it firefox 150 ? (Score:5, Informative)
Re:Oh shut up already. (Score:3, Informative)
As always, Opera did it first (full disclosure: I used to be a major Opera fanboy). I don't think Firefox is trying to beat anyone. Once Opera got to version 10 and worked out the bugs for everyone (stupid websites at the time were looking at only a single digit of the version number, so Opera coded the UA string to report something like 9.6), everyone started doing it. For some reason the Firefox team finds this versioning best for their development process, and so be it. It doesn't really matter what the version numbers are; web browsers are constantly being updated, regardless of version number.
Re:ever try loading Slashdot without firebug enabl (Score:5, Informative)
The web 10 years ago was not fine. People were still supporting Netscape 4, which in practical terms meant that everybody was stuck with inaccessible, inefficient, inflexible table layouts that had to transmit style information on every page load. Mobile websites were practically nonexistent; where they did exist, they were severely cut-back versions. Using a single responsive design to cater to desktop and mobile users would have been impractical even assuming today's mobile hardware. Lots of JavaScript was essentially written twice - once for Netscape and once for Internet Explorer - because the various DHTML and layout methods were different and incompatible. Netscape transcoded from CSS to JSSS internally, and lots of websites only supported Internet Explorer on Windows: a single browser on a single platform, both from the same corporation.
From a content point of view, it was still difficult to produce and manage content. Anything beyond basic stuff usually involved a very limited CMS and writing code. The "WYSIWYG" editors generated terrible, inefficient code that often only worked in one browser. Security was far worse than it is now, developers were largely clueless about even the most basic vulnerabilities, and things like the PCI standard weren't put in place yet.
These days, people are paying more and more attention to content because the technology is largely at a point where they can. Consider YouTube, Wordpress or Facebook - people generating content at phenomenal rates. Efficiency is still a prime concern due to mobile browsing, and techniques such as CSS, caching and CDNs have improved efficiency immensely. User-empowering features such as user stylesheets, user JavaScript and add-ons have grown into a thriving ecosystem, and accessibility support continues to grow.
Ten years ago was a really low point for the web. It lacked the client diversity that came before it, it was rife with incompatibilities and the inefficient designs necessary to compensate for them, and it lacked the compatibility and accessibility that mostly came afterwards. In all of the history of the web, that is probably the one point I'd least like to be stuck in.
Re:wait isnt it firefox 150 ? (Score:4, Informative)
ftp://ftp.mozilla.org/pub/firefox/releases/10.0.5esr [mozilla.org]
Re:Oh shut up already. (Score:3, Informative)
I think you're missing the point. Mozilla hasn't changed the way version numbers work.
Because of demand from those corporate types, Mozilla now provides extended support releases. Both those and the standard releases use the widely recognised major.minor.patch versioning numbers. The current mainline release is 13.0.1, which is a patch update to version 13.
Everything you used to learn from version numbers, you still do. The difference is that each new release introduces fewer new features (which makes it more stable!), and releases happen more quickly.
The alternative is infrequent releases with many significant changes which haven't been widely tested. This is exactly what makes those .0 releases something that users avoid. Making numerous major changes simultaneously drastically increases the amount of testing required. Users are going to find bugs under either release strategy, but with fewer new features per release, the number of bugs they'll find is drastically reduced, and typically so is the impact of those bugs.
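The major.minor.patch scheme described above can be sketched in a few lines. `parseVersion` is just an illustration, not Mozilla's actual release tooling:

```javascript
// Sketch: reading a major.minor.patch version string, as described for
// Firefox releases (e.g. "13.0.1" is a patch update to major version 13).
// parseVersion() is a made-up helper for illustration only.
function parseVersion(v) {
  const [major = 0, minor = 0, patch = 0] = v.split(".").map(Number);
  return { major, minor, patch };
}
```

Under this scheme, comparing `major` tells you whether new features landed, while a bumped `patch` signals bug fixes only.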
But more important in my mind is this: The slow-release system is a symptom of the software sales revenue model. Businesses drive revenue by bundling up new features until there's enough to make a sales pitch. Users who buy a release get only bug fixes to that release, and no guarantee that all of the bugs will be addressed. Very often, buying the current version is the only way to get a bug fix. In Free Software, we don't have to deal with that garbage. The vendor isn't trying to sell us each upgrade, so they don't need to hold back new features.
No, it isn't. It's the same as it used to be.
The same way you always have: by checking the release notes. There's no additional complexity.
Firefox has been repeatedly demonstrated to use less memory than Chrome, for a long time. Chrome's advantages have been in a somewhat faster JavaScript engine and faster startup/new tab time. Firefox has a better plugin API (AdBlock can prevent advertisements from being loaded rather than merely preventing them from being displayed) and lower memory use. I prefer Firefox.
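The difference this comment draws between blocking a load and merely hiding it can be sketched as follows. `blockedPatterns` and `shouldBlock` are made-up illustrations, not AdBlock's real filter engine:

```javascript
// Sketch of the two ad-blocking strategies contrasted above.
// blockedPatterns and shouldBlock() are hypothetical, for illustration.
const blockedPatterns = ["doubleclick.net", "/ads/"];

function shouldBlock(url) {
  return blockedPatterns.some((pattern) => url.includes(pattern));
}

// Load blocking (what Firefox's richer add-on API allows): veto the
// request before any bytes are downloaded, so the ad never costs
// bandwidth or load time.
//
// Display hiding (the weaker approach): the ad still downloads in full;
// the blocker only injects CSS such as `.ad-banner { display: none; }`.
```

The matcher itself is identical in both cases; the advantage lies entirely in where the browser lets the extension hook in.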