Apple Announces New Programming Language Called Swift

jmcbain (1233044) writes "At WWDC 2014 today, Apple announced Swift, a new programming language. According to a report by Ars Technica: 'Swift seems to get rid of Objective C's reliance on defined pointers; instead, the compiler infers the variable type, just as many scripting languages do. ... The new language will rely on the automatic reference counting that Apple introduced to replace its garbage-collected version of Objective C. It will also be able to leverage the compiler technologies developed in LLVM for current development, such as autovectorization. ... Apple showed off a couple of cases where implementing the same algorithm in Swift provided a speedup of about 1.3X compared to the same code implemented in Objective C.'" Language basics, and a few worthwhile comments on LtU.
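
To make the type-inference point concrete, here is a minimal sketch (illustrative only, not taken from the announcement):

    let greeting = "Hello, WWDC"  // inferred as String; no type annotation needed
    var answer = 42               // inferred as Int; mutable because it's declared var
    answer += 1
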
This discussion has been archived. No new comments can be posted.

  • First Rhyme (Score:4, Funny)

    by Art3x ( 973401 ) on Monday June 02, 2014 @07:10PM (#47151017)

    AAPL's YAPL

  • by ugen ( 93902 ) on Monday June 02, 2014 @07:10PM (#47151023)

    Good bye source compatibility. We hardly knew ye.
    First Windows, and now OS X. I am still maintaining applications that are built cross-platform (Windows/Mac/Linux, with a unified GUI look), but it's getting harder every year and, by the looks of it, will be impossible soon.
    Which means that an individual developer (like myself) or a smaller shop would have to choose one big player/OS vendor and stick with it. That increases risk and makes small players that much less viable (while, of course, helping the big ones consolidate user base and profit).
    Funny how the world works.

    • Since when does Qt not work X-platform anymore?

      • by ugen ( 93902 ) on Monday June 02, 2014 @07:59PM (#47151349)

        Qt does not (and cannot) support Windows "Metro" (or whatever the name is for the C#/event-driven/non-Win32 environment now)
        By the same token it won't be able to support this new environment.

        Qt, wxWidgets and others like them rely on basic C compatibility and certain common UI themes and primitives to be able to build cross-platform libraries and applications. With proprietary, non-portable, and non-overlapping languages, vendors make sure that any development has to target their platform specifically.

        Aside from that, if the new development environment does not support linking against "old" binary libraries, developers also lose the benefit of code reuse (since they won't be able to use existing libraries for things like image handling, graphics, sound, networking, you name it).

        • Re: (Score:3, Insightful)

          by Frosty Piss ( 770223 ) *

          Qt does not (and cannot) support Windows "Metro"

          "Windows Metro" is dead and irrelevant to this discussion. Qt will continue to be available for Apple's little garden. Your comment constitutes "fear mongering".

    • by Lunix Nutcase ( 1092239 ) on Monday June 02, 2014 @07:34PM (#47151179)

      Why is this dumb post modded insightful? You can still use all the same languages you did before.

      • Re: (Score:3, Funny)

        by Frosty Piss ( 770223 ) *

        Why is this dumb post modded insightful? You can still use all the same languages you did before.

        Because Slashdot Sheep have a childish hate for Apple, as they post comments from their iPhones?

    • by ddtstudio ( 61065 ) on Monday June 02, 2014 @08:09PM (#47151429)

      Hey, I'm not a developer/coder/programmer, so I honor and respect the work you've put into things in the past. But if you've been tying yourself to a "unified GUI look" across platforms, you've long been dooming your products and yourself.

      As a UX person, I can throw data at you all day long that shows device/OS specificity and familiarity are key elements in making something work for the user. I'm sure you don't literally mean you chose either a menu bar in all app windows on every platform or a top-of-the-screen menu bar on every platform, but the obvious reason why that would be wrong also holds for controls, colors, placement, text sizes, and so on to the last turtle at the bottom.

      • by ugen ( 93902 ) on Monday June 02, 2014 @10:22PM (#47152165)

        That's not what cross-platform compatibility implies. The placement and appearance of specific elements is a matter of "themes" and is readily customizable.
        As a developer I care about the underlying primitives - things like "windows", "buttons", "menus", or more generically "events", "inputs", etc. Once those underlying things can no longer be shared, you have to write a new product from scratch for every platform.

        Think of something like Adobe Photoshop (I assume as a UX person you are using it?). It is possible to have a version for Windows, and one for Mac precisely because you have those common underlying primitives and APIs, even though they don't necessarily look the same in all respects.

        If commonality of platforms is gone, even a company like Adobe will have a really hard time building products for both platforms. That will eventually affect users too, since they will likely have to select different (and no longer compatible) products for each platform. For now that's not the case - but given where things are going, it probably will be.

      • by R3d M3rcury ( 871886 ) on Monday June 02, 2014 @11:20PM (#47152363) Journal

        It's a good point.

        Consider the menu bar. It's a pretty handy place for commands. On the Mac, it sits at the top of the screen. On Windows, it sits along the top of your window. Now if we consider Fitts' Law [wikipedia.org] for a moment and compare Mac and Windows, the menu bar is much easier to access on the Mac than it is on Windows because it's sitting at the top of the screen.

        So, putting things that people access somewhat frequently into a menu item on the menu bar isn't a horrible thing on the Mac. But on Windows--because the menu bar is harder to access--it will frustrate your users. You probably want to set up some kind of contextual menu on Windows.

        Do it the Mac way, you've annoyed your Windows users. Do it the Windows way and you confuse your Mac users (who are used to searching the menu bar to find things). Or devote the time and effort to doing it both ways.
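
        For reference, Fitts' Law models pointing time as

            T = a + b * log2(1 + D / W)

        where D is the distance to the target and W is the target's size along the axis of motion. A menu bar at the screen edge cannot be overshot, so its effective W is enormous, which mostly cancels out the distance penalty.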

        • Consider the menu bar. It's a pretty handy place for commands. On the Mac, it sits at the top of the screen. On Windows, it sits along the top of your window. Now if we consider Fitts' Law [wikipedia.org] for a moment and compare Mac and Windows, the menu bar is much easier to access on the Mac than it is on Windows because it's sitting at the top of the screen.

          And if we consider the real world, with 24"+ screens being very common, putting the menu bar at the top of the screen is ridiculous because it's so far away (and even if you just "swipe" up, said swipe still takes time to reach the top).

          By the way, for the same reason, you don't want to skimp on the context menu on a Mac.

    • by StikyPad ( 445176 ) on Monday June 02, 2014 @08:24PM (#47151527) Homepage

      Whatever source compatibility existed before Swift (and the degree to which it existed is surely debatable), it was not removed by Swift. Objective-C, C/C++, and Swift can coexist in the same project. I believe they can even coexist inline, which makes me shudder to think, but there it is. Still, you could in principle have a UI in Swift and your core business logic in C, if your architecture is solid. (Obviously YMMV, and there are bugs to be discovered, to be sure.)
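
      As a concrete (and hypothetical) sketch of that split: a plain C function exposed through the project's bridging header can be called from Swift with no wrappers at all.

          // core.h, listed in the target's bridging header (hypothetical file):
          //     int add_totals(int a, int b);
          // From Swift, the call looks native; C's int arrives as Int32:
          let combined = add_totals(2, 3)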

    • by perpenso ( 1613749 ) on Monday June 02, 2014 @09:35PM (#47151895)

      Good bye source compatibility. We hardly knew ye.

      I have absolutely no compatibility problems. I use Objective-C strictly for user interface code. The core functional application code is written in C/C++. I have multiple iOS/Android apps whose core code is shared and can even be compiled into a console app under Mac OS X or Linux; I use this for regression testing and fuzzing. A headless Linux box in the closet exercises this core code. Similar story for Mac OS X and Windows.

      Swift code can replace Objective-C code and it matters little to me. It has zero impact on the other platforms I target.

      Admittedly, I've ported other people's code between Windows, Mac, and Linux for years, and written my own code for all three for years, and as a result I am extremely aggressive about separating UI code from functional code.

      For those using some sort of cross-platform wrapper for their project: if it supports Objective-C on Mac OS X, it will probably support Swift. Even if it takes the wrapper developers a while, so what - the use of Swift is entirely optional.

    • by tlhIngan ( 30335 )

      Good bye source compatibility. We hardly knew ye.
      First Windows, and now OS X. I am still maintaining applications that are built cross-platform (Windows/Mac/Linux, with a unified GUI look), but it's getting harder every year and, by the looks of it, will be impossible soon.
      Which means that an individual developer (like myself) or a smaller shop would have to choose one big player/OS vendor and stick with it. That increases risk and makes small players that much less viable (while, of course, helping the big ones

  • Whoa 1.3x (Score:5, Funny)

    by the eric conspiracy ( 20178 ) on Monday June 02, 2014 @07:14PM (#47151033)

    That's about 1/10,000th of what hiring a good programmer would get you, at the price of not being able to find any programmers.

  • by garote ( 682822 ) on Monday June 02, 2014 @07:14PM (#47151039) Homepage

    I was particularly surprised to see closures appear. So far I've only been using them in Javascript and Perl, but my experience has been that they are about 15% added flexibility for about -40% readability. That is, they make it harder to tell what's going on, more than they reduce development time.
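
    For context, Swift's closures look roughly like this (a trivial sketch):

        let doubled = [1, 2, 3].map { $0 * 2 }          // trailing-closure shorthand
        let add = { (a: Int, b: Int) -> Int in a + b }  // a closure stored in a constant
        let five = add(2, 3)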

  • by slacka ( 713188 ) on Monday June 02, 2014 @07:15PM (#47151049)

    The live REPL reminds me of Bret Victor, who used to work for Apple.
    http://worrydream.com/Apple/ [worrydream.com]

    I hope they take advantage of some of his ideas:
    https://www.youtube.com/watch?... [youtube.com]

  • by shutdown -p now ( 807394 ) on Monday June 02, 2014 @07:16PM (#47151057) Journal

    "Immutability has a slightly different meaning for arrays, however. You are still not allowed to perform any action that has the potential to change the size of an immutable array, but you are allowed to set a new value for an existing index in the array. This enables Swift’s Array type to provide optimal performance for array operations when the size of an array is fixed."

    i.e. Swift arrays that are "immutable" actually aren't. Way to rewrite the dictionary. But wait, it gets worse. Now for some schizophrenia.

    "Structures and Enumerations Are Value Types. A value type is a type that is copied when it is assigned to a variable or constant, or when it is passed to a function. Swift’s Array and Dictionary types are implemented as structures."

    So far so good. I always liked collections that don't pretend to be any more than an aggregate of values, and copy semantics is a good thing in that context (so long as you still provide a way to share a single instance). But wait, it's all lies:

    "If you assign an Array instance to a constant or variable, or pass an Array instance as an argument to a function or method call, the contents of the array are not copied at the point that the assignment or call takes place. Instead, both arrays share the same sequence of element values. When you modify an element value through one array, the result is observable through the other. For arrays, copying only takes place when you perform an action that has the potential to modify the length of the array. This includes appending, inserting, or removing items, or using a ranged subscript to replace a range of items in the array"

    Swift, a language that is naturally designed to let you shoot yourself in the foot in the most elegant way possible, courtesy of Apple.
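
    To make the foot-shooting concrete, here is roughly what the quoted documentation implies (a sketch derived from the excerpts above, describing the beta semantics rather than tested compiler output):

        let fixed = [10, 20, 30]
        fixed[0] = 99       // allowed: "immutable" arrays still permit element assignment
        // fixed.append(40) // rejected: appending could change the array's length

        var a = [1, 2, 3]
        var b = a           // no copy yet: both names share the same element storage
        b[0] = 99           // per the quoted docs, the change is observable through a
        b.append(4)         // the length changes, so b is copied and detaches from a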

    • by mbkennel ( 97636 ) on Monday June 02, 2014 @07:31PM (#47151157)

      That is bizarre. So if you see a function signature that takes an array as a parameter, you can't tell whether the elements will be changed or not - it depends on the potentially hidden implementation of that function?

      And which things have the 'potential to modify' the length of an array? Implementation defined?

      Fortran 90+ had it right. You just say for each argument whether the intent is for data to go 'in' (the callee can't change it), 'out' (set by the callee), or 'inout' (values go in, and may be modified).
      • by shutdown -p now ( 807394 ) on Monday June 02, 2014 @07:37PM (#47151203) Journal

        And which things have the 'potential to modify' the length of an array? Implementation defined?

        It's defined by the operations on the array. Basically, appending, inserting or removing an element would do that, but subscript-assigning to an element or a range will not.

        Fortran 90+ had it right. You just say for each argument whether the intent is for data to go 'in' (the callee can't change it), 'out' (set by the callee), or 'inout' (values go in, and may be modified).

        Funnily enough, they do actually have in/out/inout parameters in the language.

        Note however that the story for arrays here does not apply only to parameters. It's also the behavior if you alias the array by e.g. assigning it to a different variable. So it's not fully covered by parameter passing qualifiers.
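
        For reference, Swift's spelling of Fortran's intent(inout) looks something like this (current syntax; the beta's parameter syntax may differ):

            func doubleInPlace(_ value: inout Int) {
                value *= 2
            }
            var n = 21
            doubleInPlace(&n)   // the & at the call site makes the mutation explicit; n is now 42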

    • I don't agree with the decisions either. However, it is consistent with Java. Like it or don't like it, Java is popular and its semantics are well-known.

      • No, it is not consistent with Java. In Java, you can't change the size of the array, period. If you want a dynamically resizable collection, you use an ArrayList. Here, they have conflated the two together, and then added strange copy-on-write semantics that is triggered by operations that are unique to ArrayList, but not by those that are shared between it and array. There's nothing even remotely similar to that in Java - arrays are just passed by reference, and so are ArrayLists, and if you can mutate it,

    • by Tom ( 822 )

      I completely fail to see what your problem is.

      Immutable arrays are defined exactly the same way in several other languages. If you want an array of constants, you need to define its contents as constants, not just the array itself. It's good behaviour to give you this choice.

      Same for collections passed by reference. Again, several other programming languages do it exactly this way, implicitly passing collections by reference because collections can become large and implicitly copying them every time you touch

      • by shutdown -p now ( 807394 ) on Tuesday June 03, 2014 @01:47AM (#47152763) Journal

        You completely miss the point.

        Regarding immutability, it's not about an array of constants. It's about an immutable array - as in, an array which has its content defined once and not changed afterwards. They actually do use the term "immutable" for this, and this is what it means in any other language. It's also what it means in Swift - for example, an immutable dictionary cannot be mutated at all, neither by adding or removing elements, nor by changing a value associated with some existing key. The only special type here is array, for which immutability is effectively redefined to mean "immutable length, mutable contents" - which is a very weird and counter-intuitive definition when the word "immutable" is involved (e.g. in Java you can also change the elements of an array but cannot add new ones - but that is the rule for all arrays in Java, and Java doesn't call them "immutable"). The fact that there's no way to have a truly immutable array is just icing on the cake.

        And they don't pass collections by reference. They say that value types are passed by value (duh), and that both dictionaries and arrays are value types (unusual, but ok). But then they completely redefine what copying an array means, with very strange copy-on-write semantics whereby they do implicitly copy it if you touch it "in the wrong way" (e.g. by appending an element), but not if you touch it in the "right way" (e.g. by mutating an existing element). Again, this magic is utterly specific to arrays - dictionaries behave like true value types at all times, and use full-fledged copy-on-write under the hood to avoid unnecessary copies - so if you "touch" a dictionary in a way that mutates it, it makes a copy, and it doesn't matter how you do so - by inserting a new key-value pair or by changing a value for an existing key. Not only is this very much non-orthogonal (why make copying of arrays and dictionaries so different?), the behavior that's defined for arrays just doesn't make any sense in distinguishing between various ways to mutate them.
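
        Concretely, the asymmetry described above plays out like this (again a sketch of the quoted beta semantics):

            let table = ["a": 1]
            // table["a"] = 2   // rejected: an immutable dictionary refuses any mutation
            let list = [1, 2]
            list[0] = 2         // accepted: an "immutable" array still allows element writes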

    • I don't see what the big deal is. If you modify the size of an array, regardless of context, you get a different array. Not exactly a brain buster.

  • by the eric conspiracy ( 20178 ) on Monday June 02, 2014 @07:19PM (#47151081)

    Must have 5 years experience.

  • SWIFT programmers (Score:5, Interesting)

    by msobkow ( 48369 ) on Monday June 02, 2014 @07:24PM (#47151105) Homepage Journal

    They could have chosen a name other than that of the international banking protocols. Asking for SWIFT programmers is going to get them a bevy of COBOL coders who know the protocol.

    • by Ecuador ( 740021 ) on Monday June 02, 2014 @07:36PM (#47151195) Homepage
      I mean, there is already a Swift programming language [swift-lang.org]. Yes, it is not popular, but when you decide on a name for your language, don't you at least google it first? Is "swift" so unbelievably cool that a name collision (even with a "small" entity) does not matter? But, yeah, it is Apple we are talking about; they probably invented the word "swift", which people and companies like Suzuki have been copying for other uses here and there...
      • by msobkow ( 48369 )

        I just look forward to the organization behind the SWIFT protocols suing Apple into the ground for trying to appropriate a name that already has a well-established meaning in computing. They've certainly got the budget to do it... :)

      • When Steve Jobs announced the iPhone, Cisco owned the trademark to iPhone as I recall. And he didn't care.

        Apple has enough $$$ to pay off virtually any name holder they set their mind to, just like they did with the iPhone.

        http://www.idownloadblog.com/2012/01/27/apple-cisco-iphone-trademark/
      • Comment removed based on user account deletion
  • by bradrum ( 1639141 ) on Monday June 02, 2014 @07:27PM (#47151129)

    I find these two aspects interesting and wonder what the trade-off is. Longer compile times?

    "Designed for Safety
    Swift eliminates entire classes of unsafe code. Variables are always initialized before use, arrays and integers are checked for overflow, and memory is managed automatically. Syntax is tuned to make it easy to define your intent — for example, simple three-character keywords define a variable (var) or constant (let)."

    "Swift code is transformed into optimized native code."

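    The var/let distinction from that blurb is easy to picture (a trivial sketch, not from Apple's documentation):

        var retries = 0     // variable: type inferred as Int, free to change
        let maxRetries = 3  // constant: reassigning it is a compile-time error
        retries += 1
        // maxRetries += 1  // error: cannot assign to a 'let' constant
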
  • I wanted to write apps and tried to learn Objective-C, but as a coder who started with C and then moved on to C++ and Perl (the Swiss Army chainsaw), the language syntax hurt my ability to read it. In case you don't know what I am talking about, here are some of my learning notes:

            myObject.someMethod(); // old school
            [myObject someMethod]; // send a message or call a method

            result = myObject.someMethod(); // old school
            result = [myObject someMethod]; // method returns a result

            result = myObject.someMethod(arg); // old school
            result = [myObject someMethod:arg]; // pass an argument

    You can see the old-school syntax above (which works in Objective-C) and the Objective-C standard syntax below it. The square brackets [ ] and colons : just hurt my mental debugger... [ ] And yes, I know Objective-C is a superset of C, so they had to steer clear of C syntax, but it just looks wrong. Further, I know that I could write my own style of Objective-C, but then I wouldn't be able to read the code of others. Apple had to start somewhere, and Steve had the NeXT languages ready to go, but to me the syntax is ugly and offensive. However, I am ready for a better Apple language.

    I can't wait to see a Swift code example; if it gets rid of the NeXT Objective-C superset syntax, I might be coding for iPad and Mac sooner than I thought. If anyone has a code example, please share it - I would like to see what a function, method, or message call looks like. Hoping for parentheses and a Stanford iTunesU class. Guardedly excited!
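
    As it happens, Swift answers this directly: calls go back to dot syntax and parentheses. A sketch mirroring the notes above, with the same hypothetical names:

        myObject.someMethod()                   // call a method
        let result = myObject.someMethod()      // method returns a result
        let result2 = myObject.someMethod(arg)  // pass an argument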

    • by Proudrooster ( 580120 ) on Monday June 02, 2014 @07:55PM (#47151323) Homepage

      Ok, you guys are too slow. I RTFA and downloaded the iBook. So far I am very much liking the SYNTAX, especially OBJECTS and FUNCTIONS; they even brought the LET keyword in from BASIC. Swift will make programming Apple products much easier for the C-loving syntax crowd, from what I can see. Ahhh... what a breath of fresh air. Code snippet below of creating an object and exercising it. I feel bad for those who suffered through Objective-C.


      class Square: NamedShape {
              var sideLength: Double

              init(sideLength: Double, name: String) {
                      self.sideLength = sideLength
                      super.init(name: name)
                      numberOfSides = 4
              }

              func area() -> Double {
                      return sideLength * sideLength
              }

              override func simpleDescription() -> String {
                      return "A square with sides of length \(sideLength)."
              }
      }
      let test = Square(sideLength: 5.2, name: "my test square")
      test.area()
      test.simpleDescription()

      Excerpt From: Apple Inc. “The Swift Programming Language.” iBooks. https://itun.es/us/jEUH0.l [itun.es]
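
      For context, NamedShape is defined earlier in the same chapter of the book; a minimal sketch consistent with its use here (not the book's exact code) would be:

          class NamedShape {
              var numberOfSides = 0          // subclasses set this in their init
              var name: String
              init(name: String) {
                  self.name = name
              }
              func simpleDescription() -> String {
                  return "A shape with \(numberOfSides) sides."
              }
          }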

      • by maccodemonkey ( 1438585 ) on Monday June 02, 2014 @09:47PM (#47151965)

        Ok, you guys are too slow. I RTFA and downloaded the iBook. So far I am very much liking the SYNTAX, especially OBJECTS and FUNCTIONS; they even brought the LET keyword in from BASIC. Swift will make programming Apple products much easier for the C-loving syntax crowd, from what I can see. Ahhh... what a breath of fresh air. Code snippet below of creating an object and exercising it. I feel bad for those who suffered through Objective-C.

        To be honest, while this snippet is a few lines shorter, it's arguably more complicated than the corresponding Obj-C. It drops returning self in the init, and drops a few lines that would have had to go into the class definition, but you gain a few unsightly keywords like "override", having to add the keyword "func" to every function, and you gain some more syntactical mess like "->".

        It's not horrible, but I'm not sure this sample is more readable than Obj-C. As others have noted, Swift has the habit of taking the important parts of a function (like what it's named and what it returns, or what a class is named and what it subclasses) and shoving them off to entirely different sides of the declaration.

        • To be honest, while this snippet is a few lines shorter, it's arguably more complicated than the corresponding Obj-C. It drops returning self in the init, and drops a few lines that would have had to go into the class definition, but you gain a few unsightly keywords like "override", having to add the keyword "func" to every function, and you gain some more syntactical mess like "->".

          "override" is a _massive_ improvement. It means you cannot override a superclass method by accident. And you can't try to override a non-existing superclass method by accident.

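          A quick sketch of both protections:

              class Shape {
                  func describe() -> String { return "a shape" }
              }
              class Circle: Shape {
                  override func describe() -> String { return "a circle" }
                  // Omitting `override` above is a compile-time error (accidental overrides caught).
                  // Declaring `override func describee()` would also fail to compile,
                  // since no superclass method matches (typos caught).
              }
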
        • Re: (Score:3, Insightful)

          by countach ( 534280 )

          I haven't checked, but it's a great idea to have override as a mandatory descriptor (if it is). Java now has @Override, but code quality suffers from it not being compulsory, leading later to subtle bugs. As for func and let, I imagine it makes it easier for a scripting language to have less ambiguity about what you are trying to declare up front. I mean, without func, would the line "area()" be the start of a declaration, or a call to a function? Sure, you could wait for a semi-colon to finalise the dec

  • by ChunderDownunder ( 709234 ) on Monday June 02, 2014 @07:58PM (#47151343)

    Apple had a fine language 20 years ago. It was said to influence the design of Ruby and Python. They butchered it into an Algol-like syntax because 'real programmers' can't grok s-expressions. Then they abandoned Dylan.

    Next, they created a language for mobile devices. Its programming model was said to influence the design of JavaScript. Then they abandoned NewtonScript.

  • by phantomfive ( 622387 ) on Monday June 02, 2014 @08:06PM (#47151413) Journal
    I knew that Swift was trouble when it walked in, trouble trouble trouble
  • Viva Eco (Score:5, Insightful)

    by fulldecent ( 598482 ) on Monday June 02, 2014 @09:44PM (#47151949) Homepage

    Ok, so now you'll be developing software using Apple's frameworks and Apple's language to run on Apple's runtime, after passing Apple's compiler (i.e. LLVM) for download using Apple's store (after finding your product with Apple's iAd) directly onto Apple's products built with Apple's custom processors, after you register as an Apple Developer. If your app needs to do something outside this environment, you can use said APIs to reach out to Apple's Cloud and Apple's database servers. And if your app is really successful as measured by Apple Crash Reporting and Apple Usage statistics or Apple's iTunes Connect, then they'll just straight out fucking copy you.

    Something about the new "language" is what makes that summary start sounding ridiculous.

  • Bjarne Stroustrup (Score:5, Interesting)

    by phantomfive ( 622387 ) on Monday June 02, 2014 @10:14PM (#47152125) Journal
    Bjarne Stroustrup once gave some ideas on what requirements should be met before he would consider designing a new programming language. This was his list:

    * What problem would the new language solve?
    * Who would it solve problems for?
    * What dramatically new could be provided (compared to every existing language)?
    * Could the new language be effectively deployed (in a world with many well-supported languages)?
    * Would designing a new language simply be a pleasant distraction from the hard work of helping people build better real-world tools and systems?

    Apple can definitely deploy the new language effectively, but I'm not sure it solves any problems.
    • Re: (Score:3, Interesting)

      It solves a problem ... not your problem, but Apple's problem. I think Apple created Swift to be a common language throughout all their frameworks. I believe Python was originally filling this role, but Apple doesn't control Python. I believe they intend to use this in the server as well, that way, you have one language used throughout the entire stack - app, server, and even in the debugger.
    • Also see http://lambda-the-ultimate.org... [lambda-the-ultimate.org]
      1. What problem does this language solve? How can I make it precise?
      2. How can I show it solves this problem? How can I make it precise?
      3. Is there another solution? Do other languages solve this problem? How? What are the advantages of my solution? of their solution? What are the disadvantages of my solution? of their solution?
      4. How can I show that my solution cannot be expressed in some other language? That is, what is the unique property of my language which is lacking i
    • by mrxak ( 727974 )

      It gives Apple complete control over their own destiny, which is something Apple likes to have (not exactly news). They now have a language they can tinker with to their hearts' content and no external group or standards body can restrict what they do with it. They've made it very clear they intend to listen to developer feedback and tinker with it, at least in the near future. Certainly even if they do eventually open it up, they'll still be able to extend it however they like and whenever they like in the

  • by surfcow ( 169572 ) on Tuesday June 03, 2014 @04:07AM (#47153081) Homepage

    ... HR departments began advertising for programmers with 3+ years of Swift programming experience.