Firefox 4 Beta 1 Shines On HTML5

Posted by timothy
from the acknowledge-over-repeat dept.
snydeq writes "InfoWorld's Peter Wayner takes a first look at Firefox 4 Beta 1 and sees several noteworthy HTML5 integrations that bring Firefox 4 'that much closer to taking over everything on the desktop.' Beyond the Chrome-like UI, Firefox 4 adds several new features that 'open up new opportunities for AJAX and JavaScript programmers to add more razzle-dazzle and catch up with Adobe Flash, Adobe AIR, Microsoft Silverlight, and other plug-ins,' Wayner writes. 'Firefox 4 also adds an implementation of the Websockets API, a tool for enabling the browser and the server to pass data back and forth as needed, making it unnecessary for the browser to keep asking the server if there's anything new to report.'"
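The polling-versus-push difference Wayner describes can be sketched with the standard browser WebSocket API. The endpoint URL and the JSON message shape below are illustrative assumptions, not details from the article:

```javascript
// Parse a server-pushed message; returns null on malformed input.
// The message shape is an assumption for illustration only.
function handleUpdate(raw) {
  try {
    return JSON.parse(raw);
  } catch (e) {
    return null; // ignore garbage frames
  }
}

// In the browser, the wiring would look roughly like this: the server
// pushes data whenever it has news, so no polling loop is needed.
//
//   const sock = new WebSocket("ws://example.com/updates"); // hypothetical URL
//   sock.onmessage = (event) => {
//     const msg = handleUpdate(event.data);
//     if (msg) { /* update the page in place */ }
//   };
```

The contrast is with the older approach of a `setInterval` loop issuing an XMLHttpRequest every few seconds just to ask "anything new?"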
This discussion has been archived. No new comments can be posted.

  • by elucido (870205) * on Thursday July 08, 2010 @04:24PM (#32844172)

    Firefox needs better built-in support for IronKey, smartcards, and security tokens, so we can once and for all switch away from passwords.

    If Firefox actually supports security tokens, it's not very intuitive.

  • by beelsebob (529313) on Thursday July 08, 2010 @04:33PM (#32844264)

    Yes, let's all live in 1999, so that you can continue to use your shitfest of a computer.

  • by Burz (138833) on Thursday July 08, 2010 @04:37PM (#32844296) Journal

    Mozilla and Google have got this one WRONG:

    Merging the address and search fields is a big drawback. It further confuses people about what a URL is, and it encourages them and others (esp. advertisers) to give directions to web sites as if the keywords == addresses. (Hey, like AOL!)

    If this trend continues, we'll have shenanigans and lawsuits claiming that "squatters" are using keywords on their pages that "belong to us". It will open another "IP" can of worms.

    Encouraging people to rely on keywords also opens them up to phishing big time. It's like having them clean their teeth with their enema: Very semantically dirty!

  • by Pojut (1027544) on Thursday July 08, 2010 @04:40PM (#32844348) Homepage

    I love this country as much as the next patriotic guy...and love means being able to view things honestly. Face it: as a country, we throw out a MASSIVE amount of stuff.

    Come on, mods: if you can't be honest about yourself, what can you be honest about? Shut off Olbermann and Beck, accept what our country is, and just deal with it. Seriously.

  • Re:Peter Wayner (Score:5, Insightful)

    by Hatta (162192) on Thursday July 08, 2010 @04:48PM (#32844430) Journal

    Because the hypertext transfer protocol was designed to transfer hypertext documents. It was not designed to be a remote application protocol.

  • by blair1q (305137) on Thursday July 08, 2010 @04:50PM (#32844448) Journal

    well, no, actually, that's a good thing.

    URIs have become cumbersome. Making the net content-addressable is a big efficiency measure.

    You can still give out a key that will only map to you, and return a URI that is clearly you. Or at least as clearly as happens now when someone does a Google search.

    But now you're not constrained to identifying yourself with some bogus fqdn with a limiting TLD stuck on it.

    As for Phishing, banks have moved to authentication systems that use graphics on the page to tell you that the password-entry box you're looking at is legit. If you don't see your predetermined secret glyph, you don't enter your password. And the glyph isn't sent until your browser and the server are connected by SSL, so it can't be sniffed and hacked into a phishing site. And it isn't sent unless your browser already has a cookie identifying it as having been validated previously, using a secret-question protocol. If you deleted the cookie, you go through the secret-question routine again.

    Short of adding more layers of such things, or using in-person pre-validated biometrics over secure links, you're not getting much more security than that on the internet. Using simple, recognizable URIs won't help you, and really, just invites social engineering based on URIs that look almost legit.
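The cookie-then-glyph flow described above can be sketched as a small state decision. All names here are hypothetical; no real bank's system is being described:

```javascript
// Decide the next step of a SiteKey-style login flow, per the comment above.
// Purely illustrative; field names and step names are invented.
function nextLoginStep(state) {
  if (!state.overSsl) return "refuse"; // the glyph is only sent over SSL
  if (!state.hasValidCookie) return "ask_secret_question"; // re-validate this browser
  return "show_glyph"; // user verifies their secret image, then enters the password
}
```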

  • by blair1q (305137) on Thursday July 08, 2010 @04:54PM (#32844492) Journal

    Hmm. Have ten million users doing the same ten million calculations each on different data on the server, or have the ten million users download their data and do the calculations on their own machine...which one will complete faster?

    Server-side scripting is a massive bottleneck if the page has any complexity at all.

    What you should be complaining about is the disastrous state of the code sent to the client side. Most of it is painfully bad.
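The trade-off blair1q describes (ship the data once, let each client compute) can be sketched like this; the row shape is a hypothetical:

```javascript
// Client-side aggregation: each of the ten million users runs this on
// their own machine over their own downloaded rows, instead of the
// server repeating the same computation for every user.
function clientSideTotal(rows) {
  return rows.reduce((sum, row) => sum + row.amount, 0);
}
```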

  • Your post defines two distinct categories: URLs and search terms. Most people don't think about those things as separate ideas. They're just means of telling the internet to show a website.

    The key distinction between a URL and a search term is that URLs are hard to remember and prone to typos. Search terms are far easier (and tend to be helpful even if you spell them wrong). Why would I want to type in "http://krugman.blogs.nytimes.com/" when I can just type in "krugman [google.com]" (or "krugrman [google.com]") and get my daily Keynesian economic analysis that way?

    For the browser, the URL and the search term are completely distinct, and for an engineer or a software programmer it's clear why there would be separate fields for entering one or the other.

    But for a user (even a technically savvy user), semantic cleanliness doesn't make any sense and causes more problems than it solves.
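A unified bar has to guess which category the input falls into. A rough sketch of such a heuristic follows; it is greatly simplified, and real browsers use many more signals (history, bookmarks, DNS probes):

```javascript
// Crude guess at whether typed input is a URL or search terms.
// Illustrative only; actual location-bar logic is far richer.
function looksLikeUrl(input) {
  const s = input.trim();
  if (/^https?:\/\//i.test(s)) return true;      // explicit scheme: definitely a URL
  if (/\s/.test(s)) return false;                // spaces read as search terms
  return /^[^\s/]+\.[a-z]{2,}([/:]|$)/i.test(s); // bare host like nytimes.com
}
```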

  • by Anonymous Coward on Thursday July 08, 2010 @05:01PM (#32844562)

    All is good as long as I can disable search from the toolbar.

  • by Anonymous Coward on Thursday July 08, 2010 @05:01PM (#32844572)
    No, he doesn't. He means what he said, regardless of the fact that *nix firefox has a different menu layout.
  • by Anonymous Coward on Thursday July 08, 2010 @05:09PM (#32844636)
    Uh, the method you described does almost nothing to stop phishing. Doing a man-in-the-middle on it is trivial, so really all it does is require the phisher to handle each bank separately... which they probably have to do anyway in order to make their sites look the same. The only tip-off to the user would be an extra security question being asked, which no one will notice because banks randomly ask those security questions anyway.
  • by SatanicPuppy (611928) * <Satanicpuppy@ g m a i l . c om> on Thursday July 08, 2010 @05:15PM (#32844686) Journal

    Yea, and it'll also reduce the incentive for people to squat and typo-squat domain names.

    I'm frankly tired of all that crap: if ICANN wants to deal with the rampant squatting, I'll start supporting "address bar for addresses only" thinking. Until then, I'd rather google hijack me to a meaningful result than accidentally direct myself to some damn squatter site.

  • by PhxBlue (562201) on Thursday July 08, 2010 @05:23PM (#32844750) Homepage Journal
    Grandma doesn't care what a URL is, only that she can get to the sites she needs. If Firefox 4 is intuitive to her, then it doesn't owe me any apology.
  • by jsebrech (525647) on Thursday July 08, 2010 @05:51PM (#32845014)

    You do realize that Flash internally manages a display object hierarchy not unlike the DOM? There isn't much difference between writing apps in Flex/Flash and writing apps in JavaScript with something like the ExtJS toolkit. All the rich app frameworks I know of, on any platform, use the HTML-like approach of an element hierarchy and a set of layout rules that are constantly re-calculated.

    HTML may be ill-suited to rich app development, but so is everything else. Win32 and X11 are both truly horrible APIs, arguably much worse than HTML+JS+CSS, but combined they hold the majority share of native apps.

    And by the way, the browsers of today are designed for rich applications. They have been for a few years now. Cars were originally designed to make it up to a brisk walking pace at best. Things change.
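The "element hierarchy plus constantly re-calculated layout rules" pattern jsebrech describes can be sketched in miniature; the class and its layout rule are invented for illustration:

```javascript
// A toy display-object tree with a recomputed layout pass, loosely
// mimicking the DOM/Flash display-list approach. Purely illustrative.
class Box {
  constructor(width) {
    this.width = width;
    this.children = [];
    this.x = 0;
  }
  add(child) {
    this.children.push(child);
    return this; // allow chaining
  }
  layout(x = 0) {
    // Simple horizontal flow: place children left to right. A real
    // engine re-runs a pass like this whenever the tree or styles change.
    this.x = x;
    let cursor = x;
    for (const child of this.children) {
      child.layout(cursor);
      cursor += child.width;
    }
  }
}
```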

  • by theswimmingbird (1746180) on Thursday July 08, 2010 @06:09PM (#32845200)
    Definitely. I love having them separate. Besides, even my netbook has a resolution of 1366x768. Who needs an address bar that's over a thousand pixels wide? I mean, really. So much of their effort goes into optimizing screen space usage, but I feel that a unified bar that's mostly blank really defeats the purpose.
  • by Nethead (1563) <joe@nethead.com> on Thursday July 08, 2010 @06:16PM (#32845286) Homepage Journal

    Ya know, 1999 wasn't all that bad for me. Dot-com boom, making big bucks at an internet porn company, got married, had a nice car, nice house... yeah, I'll go back there.

  • by Anonymous Coward on Thursday July 08, 2010 @06:26PM (#32845374)

    As a developer, sysadmin, and end user, I would like to tell you that HTTP is not for this; there are ports other than 80, and the web browser is not a virtual machine.

    With the addition of canvas and now websockets... it is now.

  • Re:Peter Wayner (Score:3, Insightful)

    by Requiem18th (742389) on Thursday July 08, 2010 @06:39PM (#32845476)

    I'd argue that MathML and SVG have a very proper place as components of a hypertext document; I don't know why you're talking about XPath.

  • Re:Peter Wayner (Score:4, Insightful)

    by holloway (46404) on Thursday July 08, 2010 @06:41PM (#32845490) Homepage

    Because the hypertext transfer protocol was designed to transfer hypertext documents. It was not designed to be a remote application protocol.

    Irrelevant. If it can be evolved to work well enough for people, then it is suitable. The Type III secretion system evolved into the bacterial flagellum without any design, but it happened to work well enough to survive, and so it did.

    Design helps cause effects but it doesn't prevent useful side-effects.

  • Re:Peter Wayner (Score:3, Insightful)

    by icebraining (1313345) on Thursday July 08, 2010 @07:31PM (#32845914) Homepage

    or uses it as a tool to push people to .NET instead

    .NET runs in the browser only via Silverlight, and that already had normal sockets.

  • by boxwood (1742976) on Thursday July 08, 2010 @07:58PM (#32846142)

    I think the opposite. DNS has gone to shit because of the squatters, to the point that it's pretty much useless now.

    And with all the phishing sites... well, we should be discouraging people from typing in $COMPANY_NAME.com to get information they need. If they make one typo, or if the site they want is under a TLD other than .com, then at best they're going to be inconvenienced by loading the wrong page, and at worst they've entered their banking login into a phishing site.

    It's far better for people to simply enter a reasonable approximation into a search bar and have a search engine return the site that's most likely what they wanted. Google is much more forgiving of typos than DNS.

    And if you actually know the exact URL, then the functionality is still there for you to bypass the search engine and go directly there. I don't really see a downside.
