Apple Announces New Programming Language Called Swift 636
jmcbain (1233044) writes "At WWDC 2014 today, Apple announced Swift, a new programming language. According to a report by Ars Technica: 'Swift seems to get rid of Objective C's reliance on defined pointers; instead, the compiler infers the variable type, just as many scripting languages do. ... The new language will rely on the automatic reference counting that Apple introduced to replace its garbage-collected version of Objective C. It will also be able to leverage the compiler technologies developed in LLVM for current development, such as autovectorization. ... Apple showed off a couple of cases where implementing the same algorithm in Swift provided a speedup of about 1.3X compared to the same code implemented in Objective C.'"
Language basics, and a few worthwhile comments on LtU.
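For anyone curious what the type inference and var/let distinction mentioned in the summary actually look like, here is a minimal sketch based on the announced syntax (2014-era Swift, hence `println`; the names are my own):

```swift
// Type inference: no explicit types needed; the compiler infers them.
var greeting = "Hello, WWDC"   // inferred as String; 'var' is mutable
let maxRetries = 3             // inferred as Int; 'let' makes it a constant

greeting += "!"                // fine, greeting is a 'var'
// maxRetries += 1             // compile error: cannot assign to a 'let' constant

// String interpolation uses \( ) rather than format specifiers.
println("\(greeting) attempts: \(maxRetries)")
```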
First Rhyme (Score:4, Funny)
AAPL's YAPL
Good bye source compatibility (Score:4, Interesting)
Good bye source compatibility. We hardly knew ye.
First Windows, and now OSX. I am still maintaining applications that are built cross-platform (Windows/Mac/Linux, with unified GUI look) but it's getting harder every year and, by the looks of it, will be impossible soon.
Which means that an individual developer (like myself) or a smaller shop would have to choose one big player/OS vendor and stick with it. That increases risk and makes small players that much less viable (while, of course, helping the big ones consolidate user base and profit).
Funny how the world works.
Re:Good bye source compatibility (Score:5, Insightful)
Since when does Qt not work X-platform anymore?
Re:Good bye source compatibility (Score:5, Interesting)
Qt does not (and cannot) support Windows "Metro" (or whatever the name is for the C#/event driven/non Win32 environment now)
By the same token it won't be able to support this new environment.
Qt, wxWidgets and others like them rely on basic C compatibility and certain common UI themes and primitives to be able to build cross-platform libraries and applications. With proprietary, non-portable and non-overlapping languages, vendors make sure that any development has to target their platform specifically.
Aside from that, if the new development environment does not support linking against "old" binary libraries, developers also don't get the benefit of code reuse (since they won't be able to use existing libraries for things like image handling, graphics, sound, networking, you name it).
Re: (Score:3, Insightful)
Qt does not (and cannot) support Windows "Metro"
"Windows Metro" is dead, irrelevant to this discussion. Qt will continue to be available for Apple's little garden. Your comment constitutes "fear mongering".
Windows Phone and RT do not require C# (Score:5, Informative)
I was under the impression that all new Windows "apps" had to be written in C# against a new SDK that has neither binary nor source compatibility with Win32/posix/C/C++. I'd be glad to be wrong, but that's what I've seen so far.
Only Windows Phone 7 and Xbox Live Indie Games required C#.* C++ works on Windows Phone 8 and Windows RT, though they do require use of the Windows Runtime API. For actual Windows on x86, you can continue developing desktop applications without having to deal with Windows Runtime (the "Metro" crap).
* In theory, they required verifiably type-safe CIL compatible with the .NET Compact Framework. In practice, they required C#, as standard C++ is not verifiably type-safe, and DLR languages require functionality not in .NET CF.
Re:Good bye source compatibility (Score:5, Informative)
Are you sure about the "metro"? The name is dead, but I was under the impression that all new Windows "apps" had to be written in C# against a new SDK that has neither binary nor source compatibility with Win32/posix/C/C++. I'd be glad to be wrong.
You're wrong! You can write Windows Store apps in C++ just fine. This C++ can use win32, posix, STL etc just fine. (however, all apps run in a sandbox, so some win32/posix functions aren't as powerful as normal, e.g. the sandbox doesn't let a Windows Store app enumerate files on the hard disk outside of the files it's allowed to touch).
You can also write Windows Phone apps in the same C++. So a mobile app developer could write their core in posix platform-neutral C++, and have it portable across Android+iOS+Windows. I know a few who do.
Of course posix doesn't have anything to do with windowing systems or touch, and win32 APIs (e.g. gdi32.dll, user32.dll) don't apply to windows store or phone apps since they don't have GDI or the traditional windowing system. So your C++ code will invoke new WinRT APIs to access the new functionalities. WinRT is perfectly compatible with posix/win32 APIs. Indeed, for some functionality (e.g. audio), you're REQUIRED to use win32 APIs because they're not exposed in WinRT.
Here's some example code that shows how you'd mix the new WinRT APIs to get hold of sandbox-specific stuff, and then interop with traditional posix APIs. It also shows how the asynchronous nature of WinRT APIs combine with the synchronous traditional posix APIs:
auto f1 = Windows::ApplicationModel::Package::Current->InstalledLocation;
create_task(f1->GetFolderAsync("Assets")).then([this](StorageFolder ^f2)
{
    create_task(f2->GetFileAsync("Logo.scale-100.png")).then([this](StorageFile ^f3)
    {
        auto path = f3->Path;
        FILE *f = _wfopen(path->Data(), L"r");
        byte buf[100];
        fread(buf, 1, 100, f);
        fclose(f);
    });
});
If you care about Windows Phone or Windows RT (Score:5, Informative)
It doesn't use Metro's libraries.
Anything that doesn't use the Windows Runtime API (what you call "Metro's libraries") will not be approved for the Windows Store and will not run on Windows RT tablets or Windows Phone 8 smartphones.
Re:Since when does Qt "work" with OS X? (Score:5, Informative)
There is VLC
There is CMake
There is my project -- https://sourceforge.net/projec... [sourceforge.net]
There is Sorenson Squeeze -- http://www.sorensonmedia.com/s... [sorensonmedia.com]
I am sure there are others
Re:Since when does Qt "work" with OS X? (Score:5, Informative)
No this is NOT a troll, please read.
A claim of cross-platform is one thing. But in practice I know of no significant apps using Qt that exist in the wild that work on OS X.
Please provide a link to any mainstream working application for Mac OS X that uses Qt.
I don't know of a single one because Qt's support for XCode is incredibly poor.
Do you have to use Xcode, the IDE, to develop OS X apps? Or by "Xcode" do you mean "Xcode the IDE, plus the command-line tools"?
Re: (Score:3)
The scientific method is providing proof --- rhetoric and hot air is great, but science is providing evidence
Huh? So which one is it, proof, or evidence? Surely the two of them are not the same. Also, scientific evidence is quantitative and statistical in nature, and you're asking for an anecdote. Please stop pulling the term "scientific" into places it doesn't belong; we're not formulating theories of software here.
Re:Since when does Qt "work" with OS X? (Score:4, Insightful)
Then compile your code with Eclipse or from the command line.
What has XCode to do with developing Qt?
Re: (Score:3)
you haven't heard of Google Earth or VLC ?
Re:Since when does Qt "work" with OS X? (Score:5, Insightful)
There are plenty of apps that use Qt--probably the most mainstream one is Google Earth.
Now, look at me with a straight face and say, "And Google Earth has a great UI!"
To me, this is the problem with cross-platform UI. It starts from a mistaken premise: that Windows and Mac, or iOS and Android, have the same basic UI. There's even a grain of truth to it. But it doesn't really work.
The example I love to use is French and English. They are, basically, the same language, right? They both have words, sentences, and paragraphs. They both have nouns, verbs, and adjectives. So if you just translate the words and move around the adjectives, you've got a French/English translator! It's that simple!
No, not really. If it's 100 degrees outside and you've just come from the outside and remark to a pretty girl "Je suis chaud" (literally, I am hot), she might very well slap your face. Because you've just said that you are hot as in, "Oh, baby, you make me so hot."
And those are the silly mistakes that cross-platform UIs make.
Take a simple one from Mac versus Windows: On the Mac, in a dialog box, the default button is always the right-most button. So you have a dialog box that says, "Are you sure you want to do this?" and the right-most button would say, "OK" and the button to the left of it would say, "Cancel." On Windows, the default "OK" button would be on the left with the "Cancel" button the right of it.
Alignment, again, is a question. I'm not sure there's a standard on Windows--I've seen things centered [spaanjaars.com] and I've seen them aligned right. [samba.org] On Mac OS X, there's a standard. Which means when Windows aligns them on the right like on the Mac, I'm always pressing the Cancel button.
So, yeah, you can use Qt to have a cross-platform application and it will work fine. And it's great, if you have an application like Google Earth, which has lots of great GIS capabilities so that the result is worth the pain. But, frankly, if Microsoft did an equivalent to Google Earth but made a Mac application that was "correct," I'd use it in a heartbeat. Because, all else being equal, I'd rather have an application that "speaks my language" to one that only sort of does.
Have you ever spoken to a tech support person from another country with a thick accent? That's the equivalent of using Google Earth on a Mac.
Re:Since when does Qt "work" with OS X? (Score:5, Informative)
Oh, stop trolling. You have obviously never used Qt, it will automatically fix the order of the dialog buttons for you. You can even launch the same application under GNOME and get one order, and under KDE and get another. It is controlled by the widget-style it uses. And it does more than that, it also matches the reading direction of the language you are using so that it reverses for Hebrew, Arabic or other right-to-left languages.
There are things that you need to handle yourself in a crossplatform application, but that is not one of them.
Re:Good bye source compatibility (Score:5, Insightful)
Why is this dumb post modded insightful? You can still use all the same languages you did before.
Re: (Score:3, Funny)
Why is this dumb post modded insightful? You can still use all the same languages you did before.
Because Slashdot Sheep have a childish hate for apple, as they post comments from their iPhones?
Re: (Score:3)
You can't post from the iPhone anymore. Mobile Slashdot on safari is horribly broken. At least for me I can't log in or post once I do login. I switch to classic.Slashdot and everything works as normal. Switch back to mobile and it breaks. I don't know why they can't test it. It is not like you can extend mobile safari.
That's not Apple's issue, that's a Slashdot issue. Just like Unicode...
Re: (Score:3)
That's why you use the desktop site and not the mobile one. ...
I never use the mobile version of any website on my iPhone or my iPad
Re:Good bye source compatibility (Score:5, Insightful)
Hey, I'm not a developer/coder/programmer, so I honor and respect the work you've put in to things in the past. But if you've been tying yourself to a "unified GUI look" across platforms, you've long been dooming your products and yourself.
As a UX person, I can throw data at you all day long that shows device/OS specificity and familiarity are key elements in making something work for the user. I'm sure you don't literally mean you chose either a menu bar in all app windows on every platform or a top-of-the-screen menu bar on every platform, but the obvious reason why that would be wrong also holds for controls, colors, placement, text sizes, and so on to the last turtle at the bottom.
Re:Good bye source compatibility (Score:4, Insightful)
That's not what cross-platform compatibility implies. Placement of specific elements and their view is a subject of "themes" and is readily customizable.
As a developer I care about underlying primitives - things like "windows", "buttons", "menus" or more generically "events", "inputs" etc. Once those underlying things can no longer be shared - you have to write a new product from scratch for every platform.
Think of something like Adobe Photoshop (I assume as a UX person you are using it?). It is possible to have a version for Windows, and one for Mac precisely because you have those common underlying primitives and APIs, even though they don't necessarily look the same in all respects.
If commonality of platforms is gone - even a company like Adobe will have a really hard time building products for both platforms. That will eventually affect users too, since they will likely have to select different (and no longer compatible) products for each platform as well. For now that's not the case - but given where things are going, it probably will be.
Re:Good bye source compatibility (Score:4, Insightful)
It's a good point.
Consider the menu bar. It's a pretty handy place for commands. On the Mac, it sits at the top of the screen. On Windows, it sits along the top of your window. Now if we consider Fitts' Law [wikipedia.org] for a moment and compare Mac and Windows, the menu bar is much easier to access on the Mac than it is on Windows because it's sitting at the top of the screen.
So, putting things that people access somewhat frequently into a menu item on the menu bar isn't a horrible thing on the Mac. But on Windows--because the menu bar is harder to access--it will frustrate your users. You probably want to set up some kind of contextual menu on Windows.
Do it the Mac way, you've annoyed your Windows users. Do it the Windows way and you confuse your Mac users (who are used to searching the menu bar to find things). Or devote the time and effort to doing it both ways.
Re: (Score:3)
Consider the menu bar. It's a pretty handy place for commands. On the Mac, it sits at the top of the screen. On Windows, it sits along the top of your window. Now if we consider Fitts' Law [wikipedia.org] for a moment and compare Mac and Windows, the menu bar is much easier to access on the Mac than it is on Windows because it's sitting at the top of the screen.
And if we consider the real world, with 24"+ screens being very common, putting the menu bar on top is ridiculous because it's so far away (and even if you just "swipe" up, said swipe still takes time to reach the top).
By the way, for the same reason, you don't want to skimp on the context menu on a Mac.
Re: (Score:3)
If the applications don't follow the OS norms, they are not fine applications.
Re:Good bye source compatibility (Score:4, Informative)
Whatever source compatibility existed before Swift (and the degree to which that exists is surely debatable), it was not removed by Swift. Objective-C, C/C++, and Swift can coexist in the same project. I believe they can even coexist inline, which makes me shudder to think, but there it is. Still, you could ostensibly have a UI in Swift and your core business logic in C, if your architecture is solid. (Obviously YMMV, and there are bugs to be discovered, to be sure.)
Re:Good bye source compatibility (Score:4, Funny)
I believe they can even coexist inline, which makes me shudder to think, but there it is.
If that makes you shudder, then this will terrify you [ideology.com.au]. It compiles in eight different languages.
Compatibility is no problem, before or after swift (Score:5, Informative)
Good bye source compatibility. We hardly knew ye.
I have absolutely no compatibility problems. I strictly use objective-c for only user interface code. The core functional application code is written in c/c++. I have multiple iOS/Android apps whose core code is shared and can even be compiled with a console app under Mac OS X or Linux, I use this for regression testing and fuzzing. A headless Linux box in the closet exercises this core code. Similar story for Mac OS X and Windows.
Swift code can replace objective-c code and it matters little to me. Has zero impact on other platforms I target.
Admittedly I've ported other people's code between Windows, Mac and Linux for years and written my own code for Windows, Mac and Linux for years and as a result I am extremely aggressive about separating UI code from functional code.
For those people using some sort of cross-platform wrapper for their project: if it supports Mac OS X Objective-C, it will probably support Swift. Even if it takes time for the wrapper developers, so what? The use of Swift is entirely optional.
Re:Compatibility is no problem, before or after sw (Score:5, Informative)
// main.m
#import <Cocoa/Cocoa.h>

int main(int argc, const char * argv[]) {
    return NSApplicationMain(argc, argv);
}

// AppDelegate.h
#import <Cocoa/Cocoa.h>

@interface AppDelegate : NSObject <NSApplicationDelegate>
@property (assign) IBOutlet NSWindow *window;
@end

// AppDelegate.m
#import "AppDelegate.h"
#include "Work.h"

@implementation AppDelegate

- (void)applicationDidFinishLaunching:(NSNotification *)aNotification {
}

- (IBAction)buttonPressed:(id)sender {
    some_work();
}

@end

// Work.h
void some_work(void);

// Work.c
#include <stdio.h>
#include "Work.h"

void some_work(void) {
    FILE *fp = fopen("/tmp/work.txt", "w");
    if (fp != NULL) {
        fprintf(fp, "Hello World\n");
        fclose(fp);
    }
}
Re: (Score:3)
Re:Good bye source compatibility (Score:4, Informative)
What possible application could ever require GDB as a dependency?
LLDB is a far superior debugger anyways.
Clearly his team needs it. Don't question why. (Score:3, Insightful)
My god, it's like I'm at Stack Overflow, reading the typical stupid and cocky "WHY ARE YOU DOING IT THAT WAY?!?#?!?@?!@?!?" responses that are so prevalent over there.
His team probably has its reasons for using or requiring GDB. And you know what? They're probably pretty damn legitimate reasons, too.
I'm sure they know about LLDB. But it's probably not what they need, and thus they do not use it.
If they need GDB, then that's what they need. It's that simple.
What we don't need is somebody like you questioning
Re: (Score:3, Informative)
So then install GDB. There is no reason to stop supporting Mavericks because it doesn't come with GDB preinstalled.
Re:Good bye source compatibility (Score:5, Funny)
Creating GUI's in OSX is currently problematic because of font issues.
Obviously this must be the case, as no-one else is creating GUIs in OS X either. That, and the fact that OS X is hated the world over by designers for its awful handling of fonts.
Re:Good bye source compatibility (Score:5, Funny)
That, and the fact that OS X is hated the world over by designers for its awful handling of fonts.
As a programmer, I can tell you that designers are hated the world over for their awful handling of fonts.
Re:Good bye source compatibility (Score:4, Funny)
Could you tell me who your team is? I have a tumblr for really shitty software and I'd love to feature them!
Re:Good bye source compatibility (Score:4, Informative)
Apple has always been hostile to unified look on their platform.
You do realize, of course, that you are talking about the company that literally wrote the book [scripting.com] on good, consistent UI design, right?
The above, linked pdf copy dates from 1995 (the earliest actual copy I could find in a 2 minute search), but Apple first published their most-excellent HIG manual on or around 1985 [goodreads.com], before most slashdotters were even born.
Now, get off my lawn!
Whoa 1.3x (Score:5, Funny)
That's about 1/10,000th of what hiring a good programmer would get you, at the price of not being able to find any programmers.
Re:Whoa 1.3x (Score:5, Funny)
Now hiring Swift programmers. 10 years experience required. /sarcasm.
New bells and whistles (Score:3, Interesting)
I was particularly surprised to see closures appear. So far I've only been using them in Javascript and Perl, but my experience has been that they are about 15% added flexibility for about -40% readability. That is, they make it harder to tell what's going on, more than they reduce development time.
Re:New bells and whistles (Score:4, Interesting)
Depends on the language.
In Groovy closures are perfectly readable, as they are in Smalltalk.
Problem is that closures are often considered second-class citizens, hence they get bolted on later and then they look weird.
Re: (Score:3)
Re: (Score:3)
If you use them right, they increase readability. Unfortunately they are very, very easy to use in a way that decreases readability.
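For comparison, here is roughly what a closure looks like in Swift's announced syntax, including the trailing-closure shorthand that is arguably on the readable end of the spectrum (a sketch against the 2014 beta's global `sorted` function; names are my own):

```swift
let names = ["Chris", "Alex", "Barry"]

// Full closure syntax: explicit parameter types and return type.
let sorted1 = sorted(names, { (a: String, b: String) -> Bool in
    return a < b
})

// Trailing-closure shorthand with inferred types and $0/$1 positional arguments.
let sorted2 = sorted(names) { $0 < $1 }
```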
Bret Victor influence? (Score:3)
The live REPL reminds me of Bret Victor, who used to work for apple.
http://worrydream.com/Apple/ [worrydream.com]
I hope they take advantage of some of his ideas?
https://www.youtube.com/watch?... [youtube.com]
Who designed this, and what drugs were they on? (Score:5, Interesting)
"Immutability has a slightly different meaning for arrays, however. You are still not allowed to perform any action that has the potential to change the size of an immutable array, but you are allowed to set a new value for an existing index in the array. This enables Swift’s Array type to provide optimal performance for array operations when the size of an array is fixed."
i.e. Swift arrays that are "immutable" actually aren't. Way to rewrite the dictionary. But wait, it gets worse. Here's for some schizophrenia.
"Structures and Enumerations Are Value Types. A value type is a type that is copied when it is assigned to a variable or constant, or when it is passed to a function. Swift’s Array and Dictionary types are implemented as structures."
So far so good. I always liked collections that don't pretend to be any more than an aggregate of values, and copy semantics is a good thing in that context (so long as you still provide a way to share a single instance). But wait, it's all lies:
"If you assign an Array instance to a constant or variable, or pass an Array instance as an argument to a function or method call, the contents of the array are not copied at the point that the assignment or call takes place. Instead, both arrays share the same sequence of element values. When you modify an element value through one array, the result is observable through the other. For arrays, copying only takes place when you perform an action that has the potential to modify the length of the array. This includes appending, inserting, or removing items, or using a ranged subscript to replace a range of items in the array"
Swift, a language that is naturally designed to let you shoot your foot in the most elegant way possible, courtesy of Apple.
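To make the complaint concrete, here is a sketch of the behavior the quoted beta documentation describes (note this reflects the 2014 beta docs only; Apple later changed arrays to full value semantics, so current Swift does not behave this way):

```swift
var a = [1, 2, 3]
var b = a        // no copy yet: per the beta docs, a and b share storage

b[0] = 99        // in-place element mutation does not trigger a copy,
                 // so the change is observable through a as well

b.append(4)      // a length-changing operation copies first,
                 // so a keeps 3 elements while b now has 4
```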
Re:Who designed this, and what drugs were they on? (Score:4, Interesting)
That is bizarre. So if you see a function signature that takes an array as a parameter, you can't tell whether its elements will or won't be changed: it depends on the potentially hidden implementation of that function?
And which things have the 'potential to modify' the length of an array? Implementation defined?
Fortran 90+ had it right. You just say for each argument whether the intent is data to go 'in' (can't change it), 'out' (set by implementation), or 'inout', values go in, and may be modified.
Re:Who designed this, and what drugs were they on? (Score:4, Informative)
And which things have the 'potential to modify' the length of an array? Implementation defined?
It's defined by the operations on the array. Basically, appending, inserting or removing an element would do that, but subscript-assigning to an element or a range will not.
Fortran 90+ had it right. You just say for each argument whether the intent is data to go 'in' (can't change it), 'out' (set by implementation), or 'inout', values go in, and may be modified.
Funnily enough, they do actually have in/out/inout parameters in the language.
Note however that the story for arrays here does not apply only to parameters. It's also the behavior if you alias the array by e.g. assigning it to a different variable. So it's not fully covered by parameter passing qualifiers.
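For reference, the parameter-passing qualifier mentioned above looks like this in Swift's announced syntax (a minimal sketch; as noted, this mechanism is separate from the array aliasing behavior):

```swift
// 'inout' lets the function modify the caller's variable;
// changes made inside the function are visible to the caller.
func doubleInPlace(inout value: Int) {
    value *= 2
}

var n = 21
doubleInPlace(&n)   // the caller must mark the argument with &
// n is now 42
```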
Re: (Score:3)
I don't agree with the decisions either. However, it is consistent with Java. Like it or don't like it, Java is popular and its semantics are well-known.
Re: (Score:3)
No, it is not consistent with Java. In Java, you can't change the size of the array, period. If you want a dynamically resizable collection, you use an ArrayList. Here, they have conflated the two together, and then added strange copy-on-write semantics that is triggered by operations that are unique to ArrayList, but not by those that are shared between it and array. There's nothing even remotely similar to that in Java - arrays are just passed by reference, and so are ArrayLists, and if you can mutate it,
Re: (Score:3)
Java array is a reference type, so you're not declaring the array as final - you're declaring the reference to that array as final. Here, they're claiming that their arrays are value types.
And there's still nothing equivalent to copy-on-write behavior that they do for arrays when they're copied. Again, in Java, if you copy a value of array type, you just copied a reference - any changes to the actual array object will be seen through either reference. Ditto with ArrayList. In Swift, though, if you copy an a
Re: (Score:3)
I completely fail to see what your problem is.
Immutable arrays are defined exactly the same way in several other languages. If you want an array of constants, you need to define its contents as constants, not just the array itself. It's good behaviour to give you this choice.
Same for collections passed by reference. Again, several other programming languages do it exactly this way, implicitly passing collections by reference because collections can become large and implicitly copying them every time you touch
Re:Who designed this, and what drugs were they on? (Score:4, Informative)
You completely miss the point.
Regarding immutability, it's not about an array of constants. It's about an immutable array - as in, an array which has its content defined once, and not changed afterwards. They actually do use the term "immutable" for this, and this is what it means in any other language. It's also what it means in Swift - for example, an immutable dictionary cannot be mutated at all, neither by adding or removing elements to it, nor by changing a value associated with some existing key. The only special type here is array, for which immutability is effectively redefined to mean "immutable length, mutable contents" - which is a very weird and counter-intuitive definition when the word "immutable" is involved (e.g. in Java you also can change elements of an array but cannot add new ones - but this is the rule for all arrays in Java, and it doesn't call that "immutable"). The fact that there's no way to have a truly immutable array is just icing on the cake.
And they don't pass collections by reference. They say that value types are passed by value (duh), and that both dictionaries and arrays are value types (unusual, but ok). But then they completely redefine what copying an array means, with a very strange copy-on-write semantics whereby they do implicitly copy them if you touch them "in the wrong way" (e.g. by appending an element), but not if you touch them in the "right way" (e.g. by mutating an existing element). Again, this magic is utterly specific to arrays - for dictionaries, they behave like true value types at all times, and use full-fledged copy-on-write under the hood to avoid unnecessary copies - so if you "touch" a dictionary in a way that mutates it, it makes a copy, and it doesn't matter how you do so - by inserting a new key-value pair or by changing a value for an existing key. Not only is this very much non-orthogonal (why make copying of arrays and dictionaries so different?), the behavior that's defined for arrays just doesn't make any sense in distinguishing between various ways to mutate them.
Re: (Score:3)
By the way, it might be the case where showing how it behaves explains the strangeness better than trying to describe it. Have a look [imgur.com], and note how changes either are or aren't reflected across "copies".
Re: (Score:3)
I don't see what the big deal is. If you modify the size of an array, regardless of context, you get a different array. Not exactly a brain buster.
Re: (Score:3)
VS has allowed for the worst programmers to get away with egregious stupidity for a long time because the preprocessor would "fix" garbage code
I don't know what your definition of "preprocessor" is, but it clearly isn't the common one because no matter how I try to parse the above, it makes zero sense.
Of course, the fact that VS is not a language, whereas Swift is, is also kinda telling.
Re: (Score:3)
That part of it kinda sorta makes sense for someone coming from ML background, I suppose. I've seen other languages do similar.
Though I think that Scala really did arrive at the clearest concise syntax for this distinction: "var" is a (mutable) variable; "val" is an (immutable) value. They're also similar enough that it's clear that the concepts are closely related in practice.
Swift Programmers Wanted (Score:5, Funny)
Must have 5 years experience.
Re:Swift Programmers Wanted (Score:4, Funny)
Good news; I've got over 20 years experience. (bullshitting my way into positions with languages I don't know. Then learning fast.)
Guaranteed results...'I can guarantee anything you want.' Bender
Re: (Score:3)
Re: (Score:3)
I'm sure in a month, some teams of 60 people will claim to have 5 years combined experience in swift....
Re: (Score:3)
The parallel scripting language described at swift-lang.org is NOT the swift language referred to by this article.
So no, you cannot have 5 years experience in this.
Re: (Score:3)
Apple was aware of swift and gave the project leaders a heads up before WWDC.
SWIFT programmers (Score:5, Interesting)
They could have chosen a name other than that of the international banking protocols. Asking for SWIFT programmers is going to get them a bevy of COBOL coders who know the protocol.
You think that is the problem? (Score:5, Informative)
Re: (Score:3)
I just look forward to the organization behind the SWIFT protocols suing Apple into the ground for trying to appropriate a name that already has a well-established meaning in computing. They've certainly got the budget to do it... :)
iPhone announcement (Score:3)
Apple has enough $$$ to pay for virtually any name they set their mind to, just like they did with the iPhone.
http://www.idownloadblog.com/2012/01/27/apple-cisco-iphone-trademark/
Re: (Score:3)
Actually, Cisco was actively using the brand at the time Apple released theirs, and Cisco sued but settled out of court without releasing any details other than that both companies would use the brand.
Re: (Score:3)
Re:You think that is the problem? (Score:5, Funny)
Designed for safety & performance (Score:3, Interesting)
I find these two aspects interesting and wonder what the trade-off is. Longer compile times?
"Designed for Safety
Swift eliminates entire classes of unsafe code. Variables are always initialized before use, arrays and integers are checked for overflow, and memory is managed automatically. Syntax is tuned to make it easy to define your intent — for example, simple three-character keywords define a variable (var) or constant (let)."
" Swift code is transformed into optimized native code, "
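The overflow checking mentioned above is observable at runtime: per the language guide, ordinary arithmetic operators trap on overflow, while a separate set of overflow operators (prefixed with &) deliberately wraps. A sketch of the distinction:

```swift
let big = Int32.max

// let boom = big + 1    // ordinary '+' traps at runtime on overflow

let wrapped = big &+ 1   // overflow operator '&+' wraps around
// wrapped is Int32.min
```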
Somebody post a SWIFT example PLEASE! (Score:3)
I wanted to write apps and tried to learn Objective-C, but as a coder who started with C and then moved on to C++ and Perl (the Swiss Army chainsaw), the language syntax hurt my ability to read it. In case you don't know what I am talking about, here are some of my learning notes:
// send a message or call a method
myObject.someMethod();              // old school
[myObject someMethod];

// method returns a result
result = myObject.someMethod();     // old school
result = [myObject someMethod];

// pass an argument
result = myObject.someMethod(arg);  // old school
result = [myObject someMethod:arg];
You can see the old-school syntax above (dot syntax does work in Objective-C for property access) and the standard Objective-C syntax below it. The square brackets [ ] and colons : just hurt my mental debugger... and yes, I know Objective-C is a superset of C, so they had to steer clear of the C syntax, but it still looks wrong. Further, I know I could write my own style of Objective-C, but then I wouldn't be able to read the code of others. Apple had to start somewhere, and Steve had the NeXT languages ready to go, but to me the syntax is ugly and offensive. However, I am ready for a better Apple language.
I can't wait to see a Swift code example; if it gets rid of the NeXT-era Objective-C superset syntax, I might be coding for iPad and Mac sooner than I thought. If anyone has a code example, please share it; I would like to see what a function, method, or message call looks like. Hoping for parentheses and a Stanford iTunes U class. Guardedly excited!
Re:Somebody post a SWIFT example PLEASE! (Score:5, Informative)
Ok, you guys are too slow, I RTFA and downloaded the iBook. So far, I am very much liking the SYNTAX, especially OBJECTS and FUNCTIONS, they even brought the LET keyword in from BASIC. SWIFT will make programming Apple products much easier for the C loving syntax crowd, from what I can see. Ahhh... what a breath of fresh air. Code snippet below of creating an object and exercising it. I feel bad for those that suffered through Objective-C.
class Square: NamedShape {
    var sideLength: Double

    init(sideLength: Double, name: String) {
        self.sideLength = sideLength
        super.init(name: name)
        numberOfSides = 4
    }

    func area() -> Double {
        return sideLength * sideLength
    }

    override func simpleDescription() -> String {
        return "A square with sides of length \(sideLength)."
    }
}

let test = Square(sideLength: 5.2, name: "my test square")
test.area()
test.simpleDescription()
Excerpt From: Apple Inc. “The Swift Programming Language.” iBooks. https://itun.es/us/jEUH0.l [itun.es]
Re:Somebody post a SWIFT example PLEASE! (Score:4, Insightful)
Ok, you guys are too slow, I RTFA and downloaded the iBook. So far, I am very much liking the SYNTAX, especially OBJECTS and FUNCTIONS, they even brought the LET keyword in from BASIC. SWIFT will make programming Apple products much easier for the C loving syntax crowd, from what I can see. Ahhh... what a breath of fresh air. Code snippet below of creating an object and exercising it. I feel bad for those that suffered through Objective-C.
To be honest, while this snippet is a few lines shorter, it's arguably more complicated than the corresponding Obj-C. It drops returning self in the init, and drops a few lines that would have had to go into the class definition, but you gain a few unsightly keywords like "override", have to add the keyword "func" to every function, and pick up some more syntactic mess like "->".
It's not horrible, but I'm not sure this sample is more readable than the Obj-C. As others have noted, Swift has a habit of taking the important parts of a declaration (what a function is named and what it returns, or what a class is named and what it subclasses) and shoving them off to entirely opposite ends of the line.
Re: (Score:3)
To be honest, while this snippet is a few lines shorter, it's arguably more complicated than the corresponding Obj-C. It drops returning self in the init, and drops a few lines that would have had to go into the class definition, but you gain a few unsightly keywords like "override", have to add the keyword "func" to every function, and pick up some more syntactic mess like "->".
"override" is a _massive_ improvement. It means you cannot override a superclass method by accident. And you can't try to override a non-existing superclass method by accident.
Re: (Score:3, Insightful)
I haven't checked, but it's a great idea to have override as a mandatory modifier (if it is). Java now has @Override, but code quality suffers from its not being compulsory, leading later to subtle bugs. As for func and let, I imagine they make it easier for a scripting-style language to have less ambiguity about what you are trying to declare up front. I mean, without func, would the line "area()" be the start of a declaration, or a call to a function? Sure, you could wait for a semi-colon to finalise the dec
Re: (Score:3)
It may not measure up to whatever fly-by-night languages to which you might compare it, but it's a MAJOR and LONG OVERDUE replacement for Objective-C, which is the only language any serious Mac developer has had to put up with. I for one welcome our new Swift overlords.
Re: (Score:3)
Oh come on. It is pretty obvious that you can add named parameters to a C-like syntax without having this weird square bracket stuff:
someObject->setColor(red:0.4, green:0.3, blue:1.0, alpha:0.5);
The square brackets are there because the original Objective-C compiler was very primitive. It basically looked for the square brackets, did some manipulation of the text, and passed everything to the C compiler. Pretty much it turned this:
[someObject method:x]
into this:
objc_msgSend(someObject, @selector(method:), x)
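For what it's worth, Swift keeps the named-argument readability of Obj-C message sends without the brackets. A rough sketch of the grandparent's setColor example (the types and method here are made up for illustration, not a real API):

```swift
struct Color {
    var red, green, blue, alpha: Double
}

class Layer {
    var color = Color(red: 0, green: 0, blue: 0, alpha: 1)

    // Parameter labels are part of the method's name at the call site,
    // much like the pieces of an Obj-C selector.
    func setColor(red: Double, green: Double, blue: Double, alpha: Double) {
        color = Color(red: red, green: green, blue: blue, alpha: alpha)
    }
}

let layer = Layer()
layer.setColor(red: 0.4, green: 0.3, blue: 1.0, alpha: 0.5)
```

So the call reads almost exactly like the hypothetical C-like syntax above, just with dot notation instead of `->`.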
Re: (Score:3)
Colour me skeptical... (Score:4, Insightful)
Apple had a fine language 20 years ago. It was said to influence the design of Ruby and Python. They butchered it into an Algol-like syntax because 'real programmers' can't grok s-expressions. Then they abandoned Dylan.
Next, they created a language for mobile devices. Its programming model was said to influence the design of JavaScript. Then they abandoned NewtonScript.
Swift (Score:3)
Viva Eco (Score:5, Insightful)
Ok, so now you'll be developing software using Apple's frameworks and Apple's language to run on Apple's runtime, after passing Apple's compiler (i.e. LLVM), for download through Apple's store (after users find your product via Apple's iAd), directly onto Apple's products built with Apple's custom processors, after you register as an Apple Developer. If your app needs to do something outside this environment, you can use said APIs to reach out to Apple's Cloud and Apple's database servers. And if your app is really successful as measured by Apple Crash Reporting and Apple Usage statistics or Apple's iTunes Connect, then they'll just straight out fucking copy you.
Something about the new "language" is what makes that summary start sounding ridiculous.
Bjarne Stroustrup (Score:5, Interesting)
* What problem would the new language solve?
* Who would it solve problems for?
* What dramatically new could be provided (compared to every existing language)?
* Could the new language be effectively deployed (in a world with many well-supported languages)?
* Would designing a new language simply be a pleasant distraction from the hard work of helping people build better real-world tools and systems?
Apple can definitely deploy the new language effectively, but I'm not sure it solves any problems.
Re: (Score:3, Interesting)
Re: (Score:3)
Re: (Score:3)
It gives Apple complete control over their own destiny, which is something Apple likes to have (not exactly news). They now have a language they can tinker with to their hearts' content and no external group or standards body can restrict what they do with it. They've made it very clear they intend to listen to developer feedback and tinker with it, at least in the near future. Certainly even if they do eventually open it up, they'll still be able to extend it however they like and whenever they like in the
In related news... (Score:5, Funny)
... HR departments began advertising for programmers with 3+ years of Swift programming experience.
Re:It's about time (Score:5, Funny)
I can't wait to add this to my résumé.... I already have 2 years of experience with Swift!
Re: (Score:3, Funny)
Oooh, sorry. We're only looking for candidates with at least 5 years of experience with Swift.
Re:It's about time (Score:4, Insightful)
Wow, I happen to meet that requirement. I've been using SWIFT [swift-lang.org] for quite a few years and have done image processing and molecular docking workflows in it.
Re: (Score:3)
Re:and it needs an new OS the mess up other apps (Score:5, Funny)
and the comment grammar no sense slashdot article read.
captcha: verbally. Seriously?
Re: (Score:3)
Re:and it needs an new OS the mess up other apps (Score:4, Funny)
*that poorly.
Re: (Score:3, Informative)
They've already got LLVM and Clang, no? Or did you mean better than those?
Re: (Score:3)
Just what we don't need, an easier way to write buggy code.
Exactly! Who, besides the Amish, needs buggy code? What we need is self-driving automobile code.
Re: (Score:3, Informative)
Yes, Swift itself does not have the baggage of C just like Python does not have the baggage of C. The fact that both languages can interoperate with C does not change that.
Re: (Score:3)
The statement is talking about if you only write pure Swift. What you describe is really no different than using C code with Java through JNI. But that does not mean Java itself has any C baggage.
Re: (Score:3)
From what I can tell (I just got out of WWDC and am reading through the docs) it can be bridged to, but not directly called. You can directly call Obj-C methods through the bridge, but not C methods. You'd have to bridge to the Obj-C methods which then call C methods.
I don't know what happens when that Obj-C method calls malloc and returns some memory for leak-tastic behavior. I still haven't read if or how Swift handles raw memory buffers.
Re: (Score:3)
Really, it is not the fault of MS, Google, or Apple but of academia. In the CS curriculum they still teach the "compiler" class and as long as you keep teaching kids how to write compilers, they will keep writing languages. SWIFT is definitely a variation on a C theme, but much better than the Objective-C (superset of C) syntax, at least at first glance.
Re:Just what we need, another C++ clone (Score:5, Funny)
You mean like C++?
Re: (Score:3)