Safari 10 In macOS Sierra Deactivates Flash, Silverlight and Other Plug-Ins by Default (webkit.org) 114

Apple's web browser Safari 10, which will ship with macOS Sierra, will disable Flash, Java, Silverlight, QuickTime and other plug-ins by default. The move will help the company improve the overall web browsing experience by focusing on HTML5 content. From a post on the WebKit blog, authored by Apple's Safari team: When a website directly embeds a visible plug-in object, Safari instead presents a placeholder element with a "Click to use" button. When that's clicked, Safari offers the user the options of activating the plug-in just one time or every time the user visits that website. Here too, the default option is to activate the plug-in only once.
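The activation model described in the WebKit post amounts to a small per-site state machine: plug-ins start disabled, and a user grant is either one-shot or permanent. A minimal sketch of that policy (hypothetical code, not Safari's actual implementation):

```python
class PluginPolicy:
    """Toy model of per-site plug-in activation: off by default,
    with one-time or permanent user grants."""

    def __init__(self):
        self.always_allowed = set()   # "every time" grants
        self.pending_once = set()     # "just once" grants, consumed on use

    def allow_once(self, site):
        self.pending_once.add(site)

    def allow_always(self, site):
        self.always_allowed.add(site)

    def is_active(self, site):
        if site in self.always_allowed:
            return True
        if site in self.pending_once:
            self.pending_once.discard(site)  # one-time grant is used up
            return True
        return False  # default: the plug-in stays deactivated
```

The idea is that a site probing for a plug-in that hasn't been activated sees it as not installed, and so falls back to its HTML5 path where one exists.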
This discussion has been archived. No new comments can be posted.


  • For the best (Score:5, Informative)

    by LichtSpektren ( 4201985 ) on Wednesday June 15, 2016 @12:31PM (#52323237)
    Not only is this the optimal security practice, it also convinces corps still using that proprietary legacy crap to move to HTML5.
    • It'd be nice if it would check whether a site offers HTML5 video and automatically substitute it for the Flash version.

      Reuters is my go-to news site, but it drives me nuts that they apparently sniff your browser and offer Flash if you're on a desktop and H.264 if you're on mobile. I could certainly spoof my browser identifier (and I have, occasionally), but then I get stuck with the gosh-awful "mobile" experiences some other sites offer up.
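The sniffing described above is often nothing more than a substring match on the User-Agent header. A minimal sketch of that server-side logic (hypothetical, not Reuters' actual code), which also shows why spoofing the identifier as an iPad flips the result:

```python
# Hypothetical UA-sniffing logic: serve H.264 to "mobile" browsers,
# Flash to everything else. The token list is illustrative only.
MOBILE_TOKENS = ("iPhone", "iPad", "Android", "Mobile")

def pick_player(user_agent: str) -> str:
    if any(token in user_agent for token in MOBILE_TOKENS):
        return "h264"   # HTML5 <video> path
    return "flash"      # desktop path: "Install Adobe Flash Player"
```

A desktop UA with no Flash installed still lands on the Flash path, hence the "missing plugin" placeholder.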

      • by jandrese ( 485 )
        Maybe you could force it to use h.264 by uninstalling Flash?

        I only leave Flash installed on IE, and only fire up IE when I run across that rare website where there is no option other than to enable Flash. Websites like that are exceedingly rare these days; it's usually just internal corporate crap that is only happy with IE anyway.
        • Maybe you could force it to use h.264 by uninstalling Flash?

          I don't have Flash installed - so what I see is "missing plugin".

          Actually I do have Chrome around for those occasions when I truly need to go to some site which is, for some reason, still Flash-based. Fortunately those cases are few and far between. But I don't consider wanting to see a news story a "need"... so I either just skip it, or tell Safari to fake the User-Agent and pretend to be an iPad (I wish you could do that on a per-site basis).

        • by tepples ( 727027 )

          Maybe you could force it to use h.264 by uninstalling Flash?

          Users with "mobile" User-agent values get the H.264 video. Users with "desktop" User-agent values get "Install Adobe Flash Player".

          • Maybe you could force it to use h.264 by uninstalling Flash?

            Users with "mobile" User-agent values get the H.264 video. Users with "desktop" User-agent values get "Install Adobe Flash Player".

            This has always annoyed me.

            If I can go to a site with my iPad and it happily serves up content, then that means the web monkeys have already "solved" that site's Flash-dependency problem.

            So why in HELL do they feel justified in PUNISHING me for not wanting Flash on my laptop? After all, even ADOBE wants people off of Flash, FFS!

            • Probably for copy deterrence. Because major mobile browsers don't support the sort of extensions that desktop browsers such as Firefox support, it's easier for a user to run a stream recording extension on Firefox than in, say, Safari for iOS. So users of desktop browsers are required to do the streaming inside a piece of proprietary software that is opaque to browser extensions, namely Flash Player.

              • Probably for copy deterrence. Because major mobile browsers don't support the sort of extensions that desktop browsers such as Firefox support, it's easier for a user to run a stream recording extension on Firefox than in, say, Safari for iOS. So users of desktop browsers are required to do the streaming inside a piece of proprietary software that is opaque to browser extensions, namely Flash Player.

                Yeah, because that stopped everyone from creating applications that (I assume) identified as browsers, which you could point at a URL serving up FLV to capture a copy of virtually any streamed Flash video, right?

                Oh, wait...

                As I said, PUNISHMENT for not bowing to the Demon of Flash...

                Fuck 'em.

        • Maybe you could force it to use h.264 by uninstalling Flash?

          I only leave Flash installed on IE, and only fire up IE when I run across that rare website where there is no option other than to enable Flash. Websites like that are exceedingly rare these days; it's usually just internal corporate crap that is only happy with IE anyway.

          When I bought my MacBook Pro in 2013, it didn't come with Flash installed. I decided to see how long I could resist installing Flash.

          I am still Flash-free on that machine. Once in a while I run into a site that won't show content; but the internet is big, and I would rather give my visits to sites without lazy web developers.

    • by _xeno_ ( 155264 )

      Yeah, about that, as someone who works as a web developer, my reaction to reading this story was "Safari is at version 10 already? When did that happen?"

      Now granted none of the websites I work on require any plugins at all, and some of them even use things like HTML 5's <canvas> to render pretty charts and graphs.

      But they all have one thing in common: no one tests in Safari. If it's broken in Safari, the answer is "download Chrome." Fortunately I can't remember a time when something was broken in Safari.

      • But they all have one thing in common: no one tests in Safari.

        That became the case once Apple terminated development of Safari for Windows. This meant it suddenly cost $500 to $600 to buy a second computer [apple.com] on which to run a copy of Safari in which to test your site. And then you have to pay $500 to $600 more four to six years later when Apple stops porting new versions of Safari to your version of OS X or new versions of macOS to your Mac. For example, a 2009 Mac mini running Mac OS X 10.6 "Snow Leopard" can't be upgraded past OS X 10.11 "El Capitan".

        • You should look into virtual machines. They're swell.
          • by tepples ( 727027 )

            But you still have to buy the $500+ Mac on which to run macOS, even if you do run Linux on your Mac's hardware and macOS in a VM.

        • And then you have to pay $500 to $600 more four to six years later when Apple stops porting new versions of Safari to your version of OS X or new versions of macOS to your Mac. For example, a 2009 Mac mini running Mac OS X 10.6 "Snow Leopard" can't be upgraded past OS X 10.11 "El Capitan".

          As opposed to all those 6-year-old PCs that are running Windows 10 ever so reliably? Hell, I've worked on 5-year-old machines that were upgraded to Windows 10 and had all sorts of issues, mostly driver-related.

          A 2009 iMac is actually 7 years old and still capable of running macOS. That's actually a massive achievement.

          • My MacBook Pro 17", which is comparable to that 2009 iMac, isn't supported by "macOS", and neither are more capable Mac Pros. Why is that?

            And I've upgraded two desktops (one from 2010), a Wacom Companion, and two notebooks (both from about 2010) to Windows 10, and it was an improvement in all cases. Outside of a USB Wi-Fi adapter not having official support early on, which I fixed by simply installing the drivers for its chipset that are included with Windows, I've not had any other driver issues.
        • But they all have one thing in common: no one tests in Safari.

          That became the case once Apple terminated development of Safari for Windows. This meant it suddenly cost $500 to $600 to buy a second computer [apple.com] on which to run a copy of Safari in which to test your site. And then you have to pay $500 to $600 more four to six years later when Apple stops porting new versions of Safari to your version of OS X or new versions of macOS to your Mac. For example, a 2009 Mac mini running Mac OS X 10.6 "Snow Leopard" can't be upgraded past OS X 10.11 "El Capitan".

          News Flash: Use a Mac, and you only NEED one computer for web development, because you can run all the OSes you'd ever want on it.

          News Flash #2: Get over it. Sometimes work requires the purchasing of tools. Ask any mechanic how much he gives to the Snap-On or Mac Tools man. It will make the cost of ANY computer seem like a trivial expense.

          • by tepples ( 727027 )

            News Flash: Use a Mac, and you only NEED one computer for web development, because you can run all the OSes you'd ever want on it.

            Apple doesn't make a Mac in the size of the laptop on which I'm typing this comment and on which I do much of my web development and retro game development. Besides, you then have to buy a retail Windows license for $120 instead of having it included with your computer. Furthermore, you have to buy more RAM in order to run Windows on your Mac because a VM requires both the host OS and guest OS to be in RAM at the same time, and Apple overcharges for RAM. Furthermore, even if I bought a Mac used, it still wo

      • by gtall ( 79522 )

        The version of Safari I have is 9.0.x and has been for a while. I recall 8.0.x and 7.0.x; it seems they've been updating it regularly.

        I don't care for it; I like SeaMonkey. The interface doesn't change, and I get a sidebar full of links, folders of links, folders of folders of links, etc. It also works with my Smartcard, where Safari seems to fuck it up for some sites, but not all. I guess they like to keep me guessing.

    • Re:For the best (Score:5, Interesting)

      by Darinbob ( 1142669 ) on Wednesday June 15, 2016 @01:50PM (#52323795)

      To make HTML5 work you also require additional components not specified by HTML5. You hope they're all supplied by default by the browser maker, but ultimately all you're getting is one company vetting their video component instead of a different company vetting a different one. And it's still proprietary.

  • Here, I broke your crutches so you can focus on your leg recovering. =P

    Seriously, this is a good thing - but the way it is advertised is, frankly, ridiculous.

    • by jandrese ( 485 ) <kensama@vt.edu> on Wednesday June 15, 2016 @12:55PM (#52323397) Homepage Journal
      Apple is notorious for this. They ditched floppy drives back when most hardware still shipped drivers on floppies. They switched to USB before most vendors were ready. Then they more or less abandoned optical drives when the world was awash in discs. Sometimes it seems like if someone like Apple doesn't come along and force the issue, the industry will happily sit on old technology well past its use-by date.
      • by __aaclcg7560 ( 824291 ) on Wednesday June 15, 2016 @01:01PM (#52323445)

        Sometimes it seems like if someone like Apple doesn't come along and force the issue the industry will happily sit on old technology for well past its use by date.

        Without Apple, we would be drowning in AOL coasters.

        • Without Apple, we would be drowning in AOL coasters.

          Actually they'd be CompuServe coasters, Prodigy coasters, or maybe GEnie coasters. AOL started out as AppleLink Personal Edition, which means no Apple, no AOL.

      • by Lisias ( 447563 )

        Sitting on old technology that still works is far from being a demerit.

        Floppy disks had their day long ago, but optical discs are still the best option for some applications: you don't have to worry about your boot DVD's "firmware" being hacked by plugging it into an infected machine. Given the relatively easy process of hacking a pen drive's firmware, my DVD-ROMs are still around.

        There's also a huge amount of still usable software around the web that lacks the resources to be updated and will simply be lost. For sure a l

        • by jandrese ( 485 )
          The thing I hate about optical discs isn't the disc itself; it's that optical disc readers in computers are by far the least reliable piece of hardware in the box. They're just crap. Even $1 case fans are more reliable! I would buy reliable brand-name drives to avoid this problem, but there really aren't any left. The entire market has finished the race to the bottom. The only good thing is that I almost never use them anymore, so it's not a big problem when they suddenly can't read discs anymore.
          • The thing I hate about Optical disks isn't the disk itself, it's that optical disc readers in computers are by far the least reliable piece of hardware on the box.

            Even Apple's optical drives. The DVD drive on my iMac was replaced once under warranty, and the new one quit working about 6 months after my AppleCare ran out. I think the one in my MacBook Pro still works, but I can't even remember the last time I inserted a disc.

      • Apple is notorious for this. They ditched floppy drives back when most hardware still shipped drivers on floppies. They switched to USB before most vendors were ready. Then they more or less abandoned optical drives when the world was awash in discs. Sometimes it seems like if someone like Apple doesn't come along and force the issue, the industry will happily sit on old technology well past its use-by date.

        Exactly!

    • Here, I broke your crutches so you can focus on your leg recovering. =P

      Lol, well said. :)

  • by pecosdave ( 536896 ) on Wednesday June 15, 2016 @12:37PM (#52323265) Homepage Journal

    to kick their Silverlight habit?

    It's a plugin that should never have existed, created to compete where no competition was needed, and it sucks all around. I don't like Chrome either, and for some ungodly reason Chrome is the only thing those two will respect where Linux is concerned, despite the fact that Firefox will do HTML5 video.

    • by Anonymous Coward on Wednesday June 15, 2016 @01:02PM (#52323447)

      I don't know about the companies you listed, but many other web developers no longer consider Firefox to be a relevant browser. That means they don't even bother testing their sites with it. Maybe the sites will work, maybe they won't.

      The latest web browser market share stats [caniuse.com] paint a very unfortunate picture for Firefox. It's now only about 6% to 7% of the browser market, across all platforms and all versions of Firefox.

      To put that into perspective, Firefox now has roughly the same number of users in total as individual versions of other browsers (like IE 11 and iOS Safari 9.3) have. Even Opera Mini has nearly as many users as Firefox!

      Firefox has only about one-third the number of users that Chrome for Android has, and even Chrome for Android has fewer users than desktop Chrome. Even UC Browser for Android has more users than Firefox.

      Yes, Firefox was once a significant player. But that's no longer the case, now that Mozilla has driven away so many Firefox users by making one unwanted change after another. Firefox essentially cloned the worst parts of Chrome (its UI and soon its extension system) while ignoring the best parts of Chrome (its excellent performance and low memory usage).

      Some people will wrongly blame "Google advertising" or claim that Firefox still has a "large absolute number of users", but those are both just excuses.

      People use Chrome because, despite its bad UI, it's a lot faster than Firefox is.

      Firefox's absolute number of users is still so proportionally small that it's not worth spending time and effort to support these users. It makes a lot more sense to ignore a few million Firefox users and instead focus on providing a better experience for the billions of people who use Chrome.

      Based on the current trends, Firefox will continue to see its market share shrink each month. If you think it's being ignored now, just wait until it's down to 1% or 2% of the market. At that point even the big players with resources to waste on supporting Firefox won't even bother trying to.

      • I stopped using FF about a year ago now. I guess when I got my new laptop. I just never bothered adding it. I don't like Chrome as much as I liked Firefox years ago, but Firefox kept morphing into Chrome. So why bother with both programs?

      • Then I'll stop visiting web sites that don't work with firefox.

      • Why is Android mentioned? That's not a full web browser; it's a phone/tablet that's going to be served the mobile versions of web sites, and Firefox on mobile is always going to be used less than the built-in browsers.

        Web devs have been whining for years that they only want to support one browser ever, and now that IE is declining they're scrambling to find the next half-assed browser to support. If it's Chrome, we'll be stuck with Google's strange extensions to HTML instead of Microsoft's.

        • by _xeno_ ( 155264 )

          Depends if management can do math.

          If the cost of testing in Firefox is higher than the cost of driving away Firefox users (and the cost of losing them is almost always low, since the type of people who use Firefox will just switch to a different browser if the site fails), then it's simple: don't support Firefox.

          Right now, we're at the point where the decision is easy: don't support Firefox. It isn't worth the effort, and generally if things work in Chrome, they'll probably work in Firefox. So Firefox isn't supported.

          • by pecosdave ( 536896 ) on Wednesday June 15, 2016 @04:05PM (#52324931) Homepage Journal

            In our company Firefox is the official browser.

            We don't want our users on Chrome. Chrome runs background processes that can peg a CPU even when the browser isn't open; we've seen it happen. We don't trust what it's doing, especially while it's not running. Chrome is out for security reasons.

            Also, certain client pages require real versions of plugins like Flash and Java that Chrome won't use. It's easier to keep the users corralled into one arena.

          • I see Firefox at 14% (gs.statcounter.com). But you can shop around for stats, I still see one site showing IE at 33%...

            Right now most web sites don't work at all in Firefox with NoScript anyway. It's no big loss though; they're not web sites worth visiting.

      • Holy cow. But I shouldn't be surprised. 2010 was when I noticed it was getting kind of bloated. This was especially apparent when one of the computers in the school lab I worked in was found running a 1.5 release, which of course was blisteringly fast and lightweight. That was around the time I started using Chrome too.
      • Low memory usage? Look, it's a fast browser, but it's a memory hog. Right now, for me, 3 tabs means 6 instances of chrome32 totalling 200 MB.

      • by Anonymous Coward

        The day Firefox won't render a site is the day that site will lose visitors.

    • by Anonymous Coward on Wednesday June 15, 2016 @01:07PM (#52323477)

      On OSX (which this story is about), you don't need Silverlight. I have Netflix and use Safari to view it, and I have neither Silverlight nor Flash installed.

      • by guruevi ( 827432 )

        Depends on your hardware. I think it has to do with DRM (Digital Restrictions Management) so if your video card doesn't support certain DRM extensions, you're forced onto Silverlight. I think the same happens on Linux - if you turn off DRM, you can't use Netflix.

    • by Solandri ( 704621 ) on Wednesday June 15, 2016 @01:23PM (#52323605)
      Hollywood doesn't want you to capture the video stream and save it to generate a digital copy of the movie, so the stream is encrypted. But obviously the computer doing the displaying has to decrypt it. With hardware players like Smart TVs and Rokus, the manufacturer just has to demonstrate that the decrypted stream is sent directly to the display with no chance for the user to intercept it, and Hollywood is satisfied.

      Software players are tougher, especially if you're playing the movie in a browser. So Netflix, Amazon, etc. create an encrypted virtual machine in Flash or Silverlight which decrypts the stream, and sends the resulting video directly to the computer's display. That's the only way Hollywood will approve software streaming video players.

      This is why streaming video players drain your laptop's battery a lot faster than playing a local cracked copy of the movie, and why you need a Pentium-class CPU (used to be i3) to play 1080p Netflix or Amazon. Because the decryption is done in a software virtual machine, it can't take advantage of any video decoding hardware built into the device's graphics hardware - the CPU has to do everything. This is also why iOS got the Netflix app before Android. Apple only had a few iOS devices at the time, so Netflix could get the app approved as a hardware player. Android had hundreds of different hardware configurations, and the ARM CPUs weren't powerful enough to decrypt and decode the video stream in a virtual machine. So Netflix had to get Hollywood's approval one Android device at a time as a hardware player.
      • Back when I had a Mac Mini I could play HD South Park from the website (before the contract put it on Hulu) and it looked beautiful. When I tried to use Hulu I couldn't go full-screen because that puny little processor (a Core 2, I think; it was an early Intel Mac Mini) couldn't keep up. I blamed the encryption 100%, and I'm fairly sure I was right.

      • The funny part of that is I am still able to capture encrypted signals where they go to my TV, via the video out on a component output. I use that to capture shows I want to watch later or missed, via their GO apps or other streams. It's a bit of a pain because I have to capture them in real time, but at least I can time-shift them. Quality may not be as good as a d/l, but that is OK since I can still watch them when I want while traveling, or grab one that for whatever reason my DVR missed. I'm not interested
        • by jabuzz ( 182671 )

          You don't even need to mess about with component signals. The thing is, all 1.x versions of HDCP have been a complete busted flush since the release of the master key six years ago.

          Go onto eBay, search for "hdmi capture", and someone will happily sell you a standalone device that records even an HDCP-protected 1080p HDMI stream as an H.264 file to a USB drive plugged into the device.

          DRM on 1080p content is as useful as CSS on a DVD; that is, a total waste of time.

    • Netflix uses HTML5 in Chrome by default... they have been for a while. Both methods work fine.
    • I have tried for the past year to get the Flash and HTML5 video block to work in Firefox. Right now surfing the web is as painful as it was ten years ago, even though we have built-in plugin blockers. Nothing works. If Safari can restore my ability to browse, it will become my primary tool.
    • despite the fact Firefox will do HTML 5 video.

      Developers don't bother with Firefox these days because it is a niche browser on the PC, which is an ever-shrinking segment of the web browsing market. People test on Chrome because that covers the PC, Android phones and tablets, as well as Chromebooks. The same goes for Safari, because it is on the Mac, iPhones and iPads. Of course IE & Edge get some exposure by virtue of being available on 90%+ of PCs.

  • I think it would have been better if Safari actually supported as much HTML5 and related features as other browsers before making a move like this.
    • by guruevi ( 827432 )

      Most browsers these days are WebKit-based, so they'll support just as much HTML5 as Safari does.

      • by tepples ( 727027 )

        Chrome is Blink based, not WebKit based. Blink gets features before WebKit gets them. Chrome got WebGL before Safari, and it got ServiceWorker (needed for offline use of web applications) before Safari.
