Cross With the Platform
Tim Bray tweeted, "No platform has hit the big time till it's been flamed by JWZ." He was referring to this rant in which Zawinski systematically dismantles any claim the iPhone has to cross-platform compatibility. "I finally got the iPhone/iPad port [of Dali Clock] working. It was ridiculously difficult, because I refused to fork the MacOS X code base: the desktop and the phone are both supposedly within spitting distance of being the same operating system, so it should be a small matter of ifdefs to have the same app compile as a desktop application and an iPhone application, right? Oh ho ho ho. I think it's safe to say that MacOS is more source-code-compatible with NextStep than the iPhone is with MacOS. ... they got some intern who was completely unfamiliar with the old library to just write a new one from scratch without looking at what already existed. It's 2010, and we're still innovating on how you pass color components around. Seriously?"
UIKit != AppKit (Score:5, Interesting)
The OS is the same, but UIKit is NOT AppKit. It's like bitching about Linux when you're trying to build your Qt code against GTK.
Re:UIKit != AppKit (Score:5, Insightful)
His example there is pretty clear: instead of reusing the perfectly good NSColor class, they rewrote it as UIColor and left some important functionality out. You can deal with it, sure, but it's kind of annoying.
Still, I don't know who was expecting any sort of compatibility on the GUI portion of the iPhone, since the paradigm is completely different. It doesn't even make sense to think that you wouldn't have to rewrite the front end. On the other hand, I haven't found any problem porting C code or C++ code to the iPhone; I don't claim to be an expert but it does use GCC. In other words, it is highly compatible with existing code, but you'll have to rewrite your user interface. Which is probably what you were planning on doing anyway.
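[For context, here is a rough sketch of the color-components gripe. It is illustrative only: PlatformColor and GetRGBA are invented names, and it assumes the circa-2010 situation in which UIColor had no direct component accessor, so on the phone you had to drop down to Core Graphics.]

    #import <TargetConditionals.h>
    #if TARGET_OS_IPHONE
    #import <UIKit/UIKit.h>
    typedef UIColor PlatformColor;
    #else
    #import <AppKit/AppKit.h>
    typedef NSColor PlatformColor;
    #endif

    /* Pull RGBA components out of a color on either platform.
       Assumes the color is already in an RGB colorspace. */
    static void GetRGBA(PlatformColor *c,
                        CGFloat *r, CGFloat *g, CGFloat *b, CGFloat *a)
    {
    #if TARGET_OS_IPHONE
        /* Early UIKit had no -getRed:green:blue:alpha:, so read the raw
           component array through Core Graphics instead. */
        const CGFloat *comp = CGColorGetComponents(c.CGColor);
        *r = comp[0]; *g = comp[1]; *b = comp[2]; *a = comp[3];
    #else
        /* AppKit's NSColor hands the components back directly once the
           color is converted to a calibrated RGB colorspace. */
        [[c colorUsingColorSpaceName:NSCalibratedRGBColorSpace]
            getRed:r green:g blue:b alpha:a];
    #endif
    }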
Re: (Score:3, Insightful)
*head kersplode*
Re:UIKit != AppKit (Score:5, Funny)
Meh - the fix to get the Dali clock working is trivial - rename all pointers to smell like the colour yellow, and change all LONGINTs to SURREALs.
Re:UIKit != AppKit (Score:4, Interesting)
Thank you for that :). I was having a shitty day - it's a little bit better now.
Re: (Score:2)
The only difference between you and a madman is that you are not mad.
Re:UIKit != AppKit (Score:5, Insightful)
It's like bitching about something billing itself as "Linux desktop compatible" when your Qt code isn't supported on it, only GTK. Which would be a legitimate gripe; "Linux desktop compatible" should support Qt as well as GTK.
Re: (Score:2)
It should also support native X11, Blackbox, Enlightenment, perlwm, PLWM and WindowMaker.
Re: (Score:2)
Not necessarily directly, but it should not ban the users or developers from installing these if they wish so.
OTOH, it could just claim it's Linux Gnome and GTK compatible.
#ifdef APPLE_HARDWARE (Score:4, Insightful)
#ifdef APPLE_HARDWARE
doItTheirWayOrHitTheRoad();
#endif
You're complaining about a company that retains control over whether or not you can release your app to the device, even if it conforms perfectly to their APIs. If that's not a deal breaker for you, why do you think that complaining about shitty incompatible frameworks, or about colour components being passed around slightly differently, is going to worry them? You're wasting your breath.
Re:#ifdef APPLE_HARDWARE (Score:5, Interesting)
Um, not quite. The company doesn't control whether you can release the app to a device. The company controls whether the app will run on a device (either because the app was bought through the App Store or because a set fee was paid to the company). This isn't too far off from the XBox 360, either. To some extent, it's not that far off from most any commercial library/OS (the main difference is whether you effectively pay the fee upfront or whether they try to nickel and dime you later).
Apparently the Dali Clock is a rather old program (nearly 20 years) that's been ported to a variety of platforms. Presumably, the author chose to port the Dali Clock to the iPhone precisely because it was supposed to be relatively trivial to port from a Mac OS X version. The blog highlights how untrue that ended up being; comments on the blog suggest it's because Apple provided multiple graphical APIs and if the author had been lucky several years ago, he would have chosen the one that worked on the iPhone.
In short, it doesn't sound like the author bought his iPhone to write apps for it. It was more a porting exercise to see just how trivial the task would be.
No doubt. But, then, most blogs are a "[waste of breath]". These comments, both yours and mine, would likely qualify as well. I don't think that'll stop me from commenting or considering the blog for what it is, a recognition of Apple having the same sort of failings that Microsoft does: designing too many APIs/interfaces/file formats, dropping support for them whenever they can, and generally being about as bad as any other platform when it comes to having a unified, solid solution to the many problems that exist for the developers. I will give Microsoft some credit, though, for generally waiting longer than most public, commercial software companies in maintaining strict backwards compatibility.
Re:#ifdef APPLE_HARDWARE (Score:4, Insightful)
"I will give Microsoft some credit, though, for generally waiting longer than most public, commercial software companies in maintaining strict backwards compatibility."
I no longer program; I moved on to a field where computers are ancillary to my line of work, and I'm happy about the reboot. But I remember this being the case even a few years back: Microsoft maintains strict backwards compatibility at all costs.
And this is the big difference between Microsoft and Apple. Microsoft cares more for their developers and the companies that make money off of them than they do their users. Apple cares more about the users than they do about the developers.
Microsoft has routinely left holes in their OS that can't easily be fixed, because a major software developer can't be bothered to fix their software.
Apple, on the other hand: I've seen them send out pretty terse notes to their major developers letting them know they have to change their use of an unexposed API (one Apple found has a hole in it, which is generally why Apple doesn't publish an API until it is ready: they want to make certain it is ready for the public, and apparently that applies to the iPhone as well). Apple will specifically tell major software houses that if their software isn't fixed in 30 days, it will cease working for anyone that updates their computer.
That said, I don't really care for Apple's walled garden approach to the iPhone, and for those of us who are nerds it is a major problem (I've had to jailbreak just for simple things like Google Voice front ends...or tethering)...but for the average user? Not a problem. The point is, Apple cares far more for the user than the developer. Microsoft doesn't give a fuck about the user so long as the developers are happy.
So give credit to Microsoft for maintaining backwards compatibility, but you are just thanking them for providing a buggy OS that allows viruses to run rampant.
Something tells me he orders BigMac at Burger King (Score:2)
the desktop and the phone are both supposedly within spitting distance of being the same operating system, so it should be a small matter of ifdefs to have the same app compile as a desktop application and an iPhone application, right?
And at the core, they are - they share a large amount of code, with iPhone OS running a slightly modified version of the Darwin kernel. Where they diverge, though, might have something to do with the whole UI being completely different. I assume he just didn't realise that the UI was different, since that seems to be the level of discourse available. So an app written for OS X, using a window manager with a point-and-click mouse and variable screen res, just dropped right onto a fixed resolution, touchscreen
Re:Something tells me he orders BigMac at Burger K (Score:5, Interesting)
The problem is not that the UI is -completely- different.
It's a UI that is massively the same, just run through a bulk rename, with parameter order shuffled around in function calls and some methods / typical call sequences exploded or imploded.
The UI -could- have been VERY similar, with only minimal differences that would be easy to #ifdef through - the underlying philosophy is the same. Instead, some active effort was put into making it totally incompatible, where making it compatible would have been easier and more obvious.
A typical case of "an extra week of writing code can save you an hour you'd spend on reading documentation instead."
Re: (Score:3, Insightful)
No, the UI is completely different. Events are completely different, because of multi-touch-related stuff, and consequently, everything else needed to be rewr
Re: (Score:2)
Re: (Score:2)
Wow, there's a big jump from "likes Apple" to "supports child rape" - I can see why you posted AC.
For the record, I hate that the built-in calendar app on the iPhone doesn't carry across the ToDo list from iCal, only the event calendars, and that I have to use a third party app to sync my ToDo list.
I also dislike the slight pause the phone has sometimes when I answer it and it says it has picked up, but there's no connection for a second or so. I'm not sure if this is the network or the phone, but it's probabl
Let's look at what JWZ said... (Score:5, Interesting)
In TFA, JWZ said "It was ridiculously difficult, because I refused to fork the MacOS X code base: the desktop and the phone are both supposedly within spitting distance of being the same operating system, so it should be a small matter of ifdefs to have the same app compile as a desktop application and an iPhone application, right?"
FLAMESUIT ON
At the risk of being shot down by every MacOS/iPhone hacker here... There are two main points that JWZ makes which are quite interesting:
1) I refused to fork the MacOS X code base
2) the desktop and the phone are both supposedly within spitting distance of being the same operating system
So the beef he has, while totally valid, is because of:
a) refusal to fork the codebase
b) assumed that iPhone OS == MacOS X
Hmm. I understand the refusal to fork the codebase, but if that's what's _required_ then that's what needs to be done to have the app on the iPhone. And what's the other bit about "assume" making an ass out of you and me? Ditto for the OpenGL/OpenGLES rant...
FLAMESUIT OFF
Re:Let's look at what JWZ said... (Score:5, Insightful)
IF the code requires forking, THEN it should have no pretenses about being cross-platform compatible.
Which was the original point.
It's not a complaint that iPhone is devilishly difficult to program. It is not. The complaint is that it's devilishly difficult to write an iPhone/desktop cross-platform compatible app, which should have been easy if the device actually was cross-platform compatible.
Re:Let's look at what JWZ said... (Score:4, Interesting)
I don't really care about developing for any Apple product myself, so I haven't looked into it in depth, but does Apple actually claim OS-X and iPhone development is cross-platform compatible?
Re: (Score:2)
Re: (Score:2)
So after the decades of the Slashdotters complaining about MacOS being MacOS, their complaint about the iPhone OS is that it's NOT MacOS?
Reading comprehension fail. The complaint about the iPhone OS is that it IS MacOS, but that it's castrated in completely inexplicable ways, and the changes smack of incompetence.
Re: (Score:2)
Incompetence, yes that's what you call selling 85 million+ devices running the OS.
Re: (Score:2)
Incompetence, yes that's what you call selling 85 million+ devices running the OS.
By the logical extrapolation of your argument, Windows NT is the finest operating system on the planet.
Re: (Score:2)
No, that would be EMACS.
Re: (Score:2)
Windows NT is a damned good OS for the time it was made.
I don't think the changes were made out of incompetence either; the bunch at Apple seem to be fairly competent in what they are doing. To me it seems like something that was necessary to go from a mouse/keyboard based UI to a multitouch one.
Having used both the old and new Palm dev tools and the Windows Mobile dev tools back when it was pocket pc, developing on the iPhone is much less painful.
Re: (Score:2)
The GP said "incompetence" because of the API changes. It has nothing to do with units sold.
If you can't read comments please don't make your own.
Re: (Score:2)
Incompetence, yes that's what you call selling 85 million+ devices running the OS.
Right... because popularity always equals competence. You must be a Britney Spears fan.
Re: (Score:3, Insightful)
So, in your world, the API for a variable-resolution, mouse+keyboard driven GUI should be the same API as for a fixed-resolution touchscreen? And you think it's "incompetence" that the APIs are different for two interfaces that are different in size, input device and resolution, one of which can be rotated on the fly into different orientations?
You are surprised that an app that has existed for nearly 20 years on multiple platforms wasn't trivially easy to port to the iPhone because the developer was just too s
Re: (Score:2)
I don't get it though...why would you want to write a cross-platform app between a desktop OS and a smartphone with the same UI? That's what made Windows Mobile a disaster: having to navigate apps designed for a desktop UI on a small screen with a stylus.
Re: (Score:2)
Okay, but what's so bad about having a good phone app ported to the desktop, say, as a desktop widget, with support added for a full keyboard and using the extra CPU power? A weather gadget, an RSS ticker, a clock, a post-it notes app, that kind of thing. It should be trivial, shouldn't it?
Re: (Score:2)
It's not a complaint that iPhone is devilishly difficult to program. It is not. The complaint is that it's devilishly difficult to write an iPhone/desktop cross-platform compatible app, which should have been easy if the device actually was cross-platform compatible.
Anybody expecting it to be easy to take an app which was written for a desktop system and port it to a mobile system is going to be unpleasantly surprised. Porting is always a bitch, and if the platform you want to port to is less capable than the one you originally wrote the app for, you'll have a lot of work ahead of you. Even in CP languages like Java you'll have problems, since the Java implementations for mobile platforms aren't as capable as their desktop counterparts and there are always issues with impl
Re: (Score:2)
Porting a J2ME phone midlet to the J2SE runtime is not hard at all.
Sure you must take device limitations into account. But that's not the case here!
First, the app was not really being "ported"; it was being written for both simultaneously. Then, they were supposedly the same OS. And last but not least, the app in question was dead simple: display a monochrome bitmap on screen, animate it.
The problem was that Apple developers went to some lengths to make the two deeply incompatible - in parts that could be easily com
Re: (Score:2)
Exactly. The issue is not that the tools are not cross-platform or that the iPhone is not compatible with the Mac; that is not really surprising. The problem is that Apple marketing said they were, when they are not. It's like Microsoft's claim that .NET is cross-platform; sounds nice, but let's be honest: .NET runs on what, Windows or Windows x64? A small subset of what is considered "platform" on Windows is there for use on Windows Mobile, such that unless your app is trivial you are probably looking at a serious porting
Re: (Score:2)
The complaint is that it's devilishly difficult to write an iPhone/desktop cross-platform compatible app, which should have been easy if the device actually was cross-platform compatible.
Which, at a guess, was Apple's intention in making a completely new UI framework for the iPhone. What they absolutely did not want, under any circumstances, was a bunch of poorly ported desktop apps with the exact same user interface, in such a way that nothing worked right.
Re: (Score:3, Informative)
JWZ's rant hinges on two points, based on assumptions that are false.
The first is that iPhone OS is (or should be) identical to OS X desktop/Cocoa. I've been developing on OS X desktop for about three years, and iPhone for about a year. Never have I heard the claim (by Apple or anyone else) that the code is portable. It simply is not. In fact, Apple's iPhone introductory videos explicitly mention that developers must think differently about a portable device in terms of what kinds of apps are good for port
Who is JWZ? (Score:4, Insightful)
Re: (Score:2)
geek card, hand it over, now
Re: (Score:2)
Re: (Score:2)
You realize how silly this sounds, right?
Anyways, the guy is a hacker and nightclub owner, well liked by many Slashdotters for making a gazillion X screensavers, for his awesome occasional rants, and for proving he can have a successful life outside of the usual geekery. He also did some netscape/mozilla shit a while
Re: (Score:2)
Re: (Score:3, Interesting)
IOW, he's an archetypical cyberpunk character. Former hacker and coder who now runs a bar/nightclub, who sometimes dispenses wisdom from on high, and whom newbs deride and experienced people listen to.
The only thing missing is moving stolen data or cyberware, and/or arranging squads of disparate professionals to perform quasi-legal or illegal actions against corporations, generally on the behalf of other corporations.
Re: (Score:3, Funny)
Oh damn, I never even thought of it like that but you're incredibly right. Next we're going to find out he carries a katana and delivers pizza for the mob in his free-time... :O
Re: (Score:2)
JWZ is an important figure in the history of (Score:5, Informative)
web (specifically, web browser) development, with Major (capital M) contributions to the mozilla/netscape/firefox ecosystem since before mozilla/firefox existed as projects in their own right (going all the way back to Netscape 1.0), as well as fingers in things like Emacs and popular X applications.
Re: (Score:3, Interesting)
And why should we care?
web (specifically, web browser) development, with Major (capital M) contributions to the mozilla/netscape/firefox ecosystem since before mozilla/firefox existed as projects in their own right (going all the way back to Netscape 1.0), as well as fingers in things like Emacs and popular X applications.
Yes, we know he is a smart cookie, but that still doesn't answer the OPs question.
After looking at the webpage of the app in question (as posted by someone else here - I had never heard of the app before) all I see is some nostalgic clock App that seems to be being forced into a cross-platform test case where it doesn't really fit, and then complaining about the process. And then gratuitously throwing in some rant about the $100 developer cost. Yet nowhere have I seen any claims that a) OS-X and iPhone
Re: (Score:3, Funny)
Want Real Cross-Platform? Try ZooLib! (Score:3, Funny)
All with one set of C++ client code, compiled to native executables for each platform.
If you want iPhone support, you'll need the Subversion source base; the code works, but we haven't rolled a release for a while.
It's Open Source under the MIT License, chosen specifically to be compatible with both GNU GPL and proprietary development.
That's my fault, not ZooLib's. Check Andy's page! (Score:2)
He's been developing ZooLib for twenty years, but it has only been Open Source for ten.
Andy is a consultant; whenever he gets a new contract, if ZooLib doesn't already provide some functionality he needs for his gig, he adds it into ZooLib.
As for the website not listing many apps - I'm the webmaster, but not a very diligent one. Most of ZooLib's apps are sold by the various clients that Andy and I have had over the years.
Properly cross-platform code cares not for the UI (Score:2)
There are some important architectural points to keep in mind if you ever hope to take your application cross-platform. One is to separate the "engine" or core code cleanly from any kind of user interface. That way you can keep what is most fundamental about your application constant, then write a new UI to exercise it for
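[To illustrate that separation with a sketch (hypothetical names; this is not how xdaliclock is actually structured): the "engine" of a clock app can be plain C that knows nothing about any toolkit, so that only a thin front end differs per platform.]

    #include <stdint.h>
    #include <string.h>
    #include <time.h>

    /* Platform-independent engine: no AppKit, UIKit or X11 types anywhere. */
    typedef struct {
        int width, height;
        uint8_t *pixels;   /* one byte per pixel: 0 = background, 255 = digit */
    } clock_frame;

    /* Advance the animation to the given wall-clock time.
       (Stub: a real engine would render morphing digits into frame->pixels.) */
    void clock_core_tick(clock_frame *frame, time_t now)
    {
        (void)now;
        memset(frame->pixels, 0, (size_t)frame->width * (size_t)frame->height);
    }

Each front end - an NSView's drawRect:, a UIView, an X11 expose handler - then only has to blit frame->pixels to the screen, and all of them link against the same engine.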
Re: (Score:2)
I haven't actually read TFA
You needn't point that out, because it's painfully obvious that you know absolutely nothing about Dali Clock, its cross-platform history, or JWZ's own well-documented history and experience.
Educate yourself first, then speak.
http://www.jwz.org/xdaliclock/ [jwz.org]
http://en.wikipedia.org/wiki/JWZ [wikipedia.org]
If it's already cross-platform, why the grief? (Score:3, Insightful)
While there are many conceptual similarities between the two operating systems, they are different enough that they really should have been considered separate platforms from the very start.
I've been doing cross-platform development for twenty years. Don't Even Get Me Started.
Defining moment (Score:2)
I'm also tempted to comment on his choice of developm
Re:Could be worse (Score:4, Insightful)
Windows Mobile probably has more of a backwards compatibility problem than the iPhone... The core OS of the iPhone is the same as normal OS X, and it's only the interface APIs which are different - and rightly so; the whole interface is fundamentally different in how you interact with it.
Windows Mobile, on the other hand, is a whole different OS with a completely different kernel.
Re:Could be worse (Score:5, Informative)
The only valid complaint I see in this whole article is with NSColor/UIColor – NSColor really should be in the Foundation API (common to both Mac OS and iPhone OS), not the AppKit/UIKit APIs.
His OpenGL example is hilarious. "Oh my god, I can't use glBegin and glVertex"... Function calls which have been deprecated in OpenGL since version 2, that was 15 years ago!
As for UIKit being very different from AppKit... Well of course it is! UIKit is for building touch-based UIs; if you transfer the exact same things as you have on Mac OS straight over, you end up with a shit mishmash of rubbish. The important thing here is that both APIs share their Foundation API (the basic programmery stuff you need like dictionaries, arrays, strings, etc).
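[A quick illustration of the shared-Foundation point. Nothing below is Mac- or iPhone-specific; the assumption is simply that era-appropriate Foundation code like this builds unchanged against either SDK.]

    #import <Foundation/Foundation.h>

    int main(void)
    {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        /* Plain Foundation: dictionaries, numbers, strings, logging. */
        NSDictionary *info = [NSDictionary dictionaryWithObjectsAndKeys:
                                  @"Dali Clock", @"name",
                                  [NSNumber numberWithInt:2], @"version",
                                  nil];
        NSLog(@"%@ version %@",
              [info objectForKey:@"name"], [info objectForKey:@"version"]);

        [pool drain];
        return 0;
    }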
Re:Could be worse (Score:5, Informative)
Function calls which have been deprecated in OpenGL since version 2, that was 15 years ago!
Unless I'm missing something, or you're living in 2020, OpenGL version 2 was released in 2005, and you're 10 years off.
Re:Could be worse (Score:4, Informative)
Re: (Score:2)
Oh but don't worry, if you use the new enlightened way [duriansoftware.com] of doing OpenGL programming, you can write a "Hello World" app that fades between two images in:
Re: (Score:2)
As a non-OpenGL programmer, help me out here.
Is that sarcasm or are you impressed? I'm having a hard time telling. 380 lines of C for "Hello World" is a tad much, but using OpenGL to fade between images...
Re: (Score:2)
Not much of an OpenGL programmer either, but I'm pretty sure that's sarcasm.
OTOH, fading between images using compositing to allow that operation to be GPU-accelerated is a pretty legitimate use of OpenGL, though.
Re: (Score:2)
Re: (Score:3, Insightful)
If you mean, are making an AAA game title, sure, but then your job is probably "3d graphics programming specialist" or something, so you can jump through whatever hoops are necessary. There's a huge range of apps for which performance is not really a concern; they ran fine on hardware of 10 years ago, so they ought to be able to run fine on today's. xDaliClock is one of those.
Re:Could be worse (Score:4, Insightful)
No, performance is *always* a concern on a battery powered device. Every single instruction has a cost in ergs. You don't want to waste them.
Re: (Score:3, Insightful)
The problem with this "explanation" is that the application's effort to use vertex buffers is significantly higher than the effort to use immediate mode.
A hardware implementation of IM (like the one in Silicon Graphics machines) would probably bring much higher energy efficiency than carefully packing up VBOs with software. Even when there's no hardware implementation, the packing up can be equally well performed by a driver, thus just shifting the energy consumption around, not increasing it.
Thus, immediat
Re: (Score:3, Informative)
The problem with this "explanation" is that the application's effort to use vertex buffers is significantly higher than the effort to use immediate mode.
No, no it's not.
Immediate mode requires at least as many (usually 3 or more times more) calls as you have vertices in your model, during which the GPU is wasting time and the driver is doing complex things to pack data into buffers in graphics memory.
Meanwhile, vertex arrays require a single upload of a constant array to graphics memory, which happens quickly as a single memcpy, and then frees the graphics card to get on with it. After that point, all the CPU needs to do is yell at the graphics card "now render
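[A rough sketch of the two styles being argued about, drawing a single triangle. The immediate-mode path is the old desktop-GL style that OpenGL ES dropped; the buffer path uploads once and reuses. Illustrative fixed-function-era code, assuming a GL context already exists - nothing here is taken from Dali Clock.]

    #include <OpenGL/gl.h>   /* <GL/gl.h> outside the Mac; GL 1.5-era API assumed */

    /* Immediate mode: one call per vertex, re-issued every single frame. */
    static void draw_triangle_immediate(void)
    {
        glBegin(GL_TRIANGLES);
        glVertex3f( 0.0f,  1.0f, 0.0f);
        glVertex3f(-1.0f, -1.0f, 0.0f);
        glVertex3f( 1.0f, -1.0f, 0.0f);
        glEnd();
    }

    /* Buffer objects: upload the vertex data to graphics memory once... */
    static GLuint upload_triangle(void)
    {
        static const GLfloat verts[] = { 0,1,0,  -1,-1,0,  1,-1,0 };
        GLuint vbo = 0;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof verts, verts, GL_STATIC_DRAW);
        return vbo;
    }

    /* ...then each frame just points the fixed-function pipeline at it. */
    static void draw_triangle_buffered(GLuint vbo)
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, (const void *)0);
        glDrawArrays(GL_TRIANGLES, 0, 3);
    }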
Re:Could be worse (Score:4, Interesting)
Can I take this opportunity to mention that I find programmer-fights fascinating?
Carry on, guys.
Re:Could be worse (Score:5, Interesting)
Entirely correct @ shaders.
However, I have to take exception with your description of immediate mode - the reason it performs so poorly now is that modern graphics chips are designed pretty much exclusively for DirectX (at least, this goes for ATI).
On machines where immediate mode performance was actually some kind of a priority (for instance, SGI Octane IMPACTSR and relatives), executing a glVertex command amounted to 3 memory writes into a command FIFO that was mapped into a fixed address in userspace which was accessible with a short form of a SW opcode (remember, this is MIPS, there is a range of 64k addresses that can be accessed without loading a base register: -32768 to 32767).
The hardware even managed the hiwater/lowater status of the fifo, and notified the kernel to perform a context switch to a non-gfx process when the gfx process was filling up the command FIFO. Those switches were as a matter of fact "virtualized" (before it was cool) by a combination of hardware, kernel (if hardware contexts are exceeded) and userspace - not entirely unlike what DX10 ADM was supposed to be, except this was in 1995.
For large static meshes (only transforms applied with Vertex Shaders), buffers are definitely going to perform better, because the meshes can be located in local memory (VRAM). However, if something is dynamically generated, immediate mode in a good implementation is no slower than a memcpy, and it does not require a kernel transition to submit a command buffer to card's ring (like modern cards like to do).
Re: (Score:2)
Immediate mode requires at least as many (usually 3 or more times more) calls as you have vertices in your model, during which the GPU is wasting time and the driver is doing complex things to pack data into buffers in graphics memory.
Meanwhile, vertex arrays require a single upload of a constant array to graphics memory, which happens quickly as a single memcpy, and then frees the graphics card to get on with it.
Unless I'm very much mistaken, vertex arrays (in client memory) are what the driver uses internally to implement immediate mode anyway. I.e. all your glVertex etc. calls are collected in an array, which is sent to the GPU on glEnd(). So, if you just do all that yourself to avoid using immediate mode, you won't gain any performance. What IS faster is using vertex buffer objects, because they reside in graphics memory and thus DON'T "require a single upload of a constant array to graphics memory" -- which mi
Re: (Score:2)
That I agree with, which is why I purposely quoted the part of his sentence including "on the desktop". On the desktop, the vast majority of OpenGL apps, i.e. anything except state-of-the-art AAA game, does not really need to squeeze that last bit of performance out.
Re: (Score:2)
You missed the part where this is the Dali Clock. The iPhone needs to melt in order for the program to be completely functional.
Re: (Score:2)
"His OpenGL example is hilarious. "Oh my god, I can't use glBegin and glVertex"... Function calls which have been deprecated in OpenGL since version 2, that was 15 years ago!"
What?
glBegin() was deprecated as of OpenGL 3.0.
OpenGL 2.0 was released in September 2004 (5 years ago.)
If you meant OpenGL 1.1, that was 13 years ago, and glBegin() certainly wasn't deprecated.
Re: (Score:2)
Well, we get three stories a day every time someone says they do like something about the iPhone/iPad, so fair's fair.
Re:We get it already (Score:5, Insightful)
No. People like making money with the iPhone. But development in the classical sense, i.e. "growth; progress", does not occur on iPhone.
Re: (Score:2, Interesting)
Re:We get it already (Score:5, Insightful)
Objective-C is what C++ could have been if they had done it right.
No, there is no real way of objectifying C well, because C is essentially a low-level systems language and high-performance macro assembler, designed for people who want to and need to care about the underlying system. Now, C# is a fairly good language with C-type constructs, and Java is ok-ish, but they are managed languages more abstracted from the underlying hardware.
Objective C is an attempt to mix macro assembler with the beautifully pure OO language that is Smalltalk, giving the advantages of neither.
I did like Objective C when I first learnt about it, about 16 years ago. I was a teen and my knowledge of languages extended little beyond BASIC, C, C++, Forth and a vague understanding of LISP. I craved something fit for a more high level purpose. Objective C is an experimental half way house which has been hanging around because C++ is so bad and Jobs happened to run NeXT, but it's no pleasure.
Re: (Score:2)
I thin
Re: (Score:2)
Rather than giving good feedback on your code functionality Obj-C fans just go on about how your coding style isn't the one true way.
This is true. I've found that programming for the iPhone a lot revolves around following convention and best practices. I kind of like it, maybe because I'm not that experienced a programmer, it provides a guide.
Don't believe that they'll have you coding iPhone apps in a week. Even as an experienced programmer, it took me longer than that to get familiar with Obj-C, and learning Cocoa and Touch is more involved than they tell you. XCode is okay, but stay away from their crappy Interface Builder as it's a complete waste of time.
Check out the free Stanford course [stanford.edu] on iTunesU (videos available through iTunes). It's probably the best resource out there if you've already got the necessary background (some C and OOP and design pattern knowledge). You won't be doing The Next Big Thing in under a week, but you can be creating simp
Re: (Score:3, Insightful)
Objective-C is what C++ could have been if they had done it right.
You are kidding, right?
Re: (Score:2)
Re: (Score:2)
Objective-C is what C++ could have been if they had done it right.
ObjC and C++ have totally different design goals.
For some applications I would prefer C++, for others ObjC. They do not compete against each other, but rather complement each other.
Why in the end the Objective-C++ was actually born: to get the weak typing and messaging where it is needed it - without loosing compile time binding and strict typing where it counts.
Re: (Score:2)
Why in the end the Objective-C++ was actually born:
"Just because you shouldn't, it doesn't mean you can't."
to get the weak typing and messaging where it is needed it - without loosing compile time binding and strict typing where it counts.
With the utmost respect, anyone who fails to recognise the very basic difference between `loose' and `lose' is unlikely to have a proper appreciation of when (or indeed whether) weak or strong typing is needed. As for "messaging", well, just because Objective C calls it a message and C++ calls it a polymorphic method call, it doesn't mean there's a relevant difference.
Re: (Score:2)
The difference is very relevant. I don't think there is a nice way to pack the kind of information that exists in a NIB file using C++ as a development language. Certainly none of the Microsoft, Gnome or KDE designers have done it. You basically have to specify all the callbacks by hand somehow in the interface file in C++, and compile the interface in. Compare this with the NIB+objective C way. The NIB file contains your whole interface, you can change almost everything about the appearance of your applica
Re: (Score:2)
What on earth has the availability of an interface builder app got to do with language function call semantics / implementation differences?
Are you trying to imply that something about C++ means that I have to recompile backend.cpp when I change the implementation detail of called routines in gui.cpp, even when the interface remains the same? Because no.
Re:We get it already (Score:4, Interesting)
As for "messaging", well, just because Objective C calls it a message and C++ calls it a polymorphic method call, it doesn't mean there's a relevant difference.
You apparently never tried to implement message passing in general, or in C++ in particular. I unfortunately did.
It all boils down to the trivial problem: given an object, one should be able to call a random method on it.
C++ forbids this due to strict typing and compile time binding. Nor can you represent a method as a variable. Nor is implementing hundreds/thousands of abstract classes practical or sane.
Objective-C has that as a feature. Selector is a basic data type.
You can queue up selector/object pairs in ObjC for later calling - you can't do anything close to that in C++. Thus no native messaging in C++.
P.S. One can implement that also in C++ - see all the insanities Trolltech had to go through to do it in Qt. They use strings to identify methods at compile time, create class-to-method tables at link time, and at run time perform a look-up on the table to identify the method's entry point.
P.P.S. loosing v. losing. Give me a break, man. It's Monday. I obviously meant "losing".
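[A small sketch of the selector-queueing idea described above, in pre-ARC Objective-C. The MessageQueue class and its method names are invented for illustration; the Foundation facilities it relies on (SEL, NSStringFromSelector, performSelector:) are standard.]

    #import <Foundation/Foundation.h>

    /* Queue up (target, selector) pairs now, deliver them later.  A SEL is an
       ordinary value you can store; the target's class is never mentioned. */
    @interface MessageQueue : NSObject {
        NSMutableArray *queued;
    }
    - (void)enqueueTarget:(id)target selector:(SEL)sel;
    - (void)fire;
    @end

    @implementation MessageQueue
    - (id)init
    {
        if ((self = [super init]))
            queued = [[NSMutableArray alloc] init];
        return self;
    }
    - (void)dealloc { [queued release]; [super dealloc]; }

    - (void)enqueueTarget:(id)target selector:(SEL)sel
    {
        /* Store the selector as a string so it fits in a Foundation array;
           a plain struct holding the raw SEL would work just as well. */
        [queued addObject:[NSArray arrayWithObjects:
                              target, NSStringFromSelector(sel), nil]];
    }

    - (void)fire
    {
        for (NSArray *pair in queued)
            [[pair objectAtIndex:0]
                performSelector:NSSelectorFromString([pair objectAtIndex:1])];
        [queued removeAllObjects];
    }
    @end

The closest C++ gets without help is member-function pointers or an interface per message, which is the gap the Qt/moc machinery mentioned above exists to fill.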
Re: (Score:3, Insightful)
It all boils down to the trivial problem: given an object, one should be able to call a random method on it.
C++ forbids this
What you might mean is, "I can't build up a random method call at runtime in an ANSI standard way". Your "trivial problem" is soluble at compile time, as is the intention for statically typed languages.
Nor can you represent a method as a variable.
However "insane" you like to think it is, a method isn't a variable (although you can indicate a particular non-static method of a class in a variable using member function pointers). You probably want to use a pointer of abstract base class type, i.e. an interface. Why do you keep wanting to defeat static typ
Re:We get it already (Score:4, Insightful)
However "insane" you like to think it is, a method isn't a variable (although you can indicate a particular non-static method of a class in a variable using member function pointers).
Method pointers are bound to a class.
That means the code needs to know the interface explicitly in order to call a method.
You probably want to use an pointer of abstract base class type, i.e. interface.
When you get a pointer to the object of a base class, you can't upgrade it inside of the message dispatch - because that would require the message dispatch to know all the hundreds/thousands of interfaces used all over the program. And that's simply impractical, most of the time impossible.
Why do you keep wanting to defeat static typing?
I'm not.
It was you who tried to indicate that the messaging is somehow implementable with polymorphism. And it is not. As you yourself point out between the lines: it is simply incompatible with static typing.
If you want to be able to queue random calls to /anything/, represented in some language-defined way as calls with all their parameters [...]
That's what messaging often boils down to.
Constantly serializing/deserializing is way too expensive.
[...] you are probably looking for a completely dynamically typed language like Smalltalk.
The End.
P.S. I have tried to implement messaging in C++ at least twice. Once by serializing the calls, the second time by trying to have interfaces for all used methods. The first failed due to miserable performance. The second failed when more people were assigned to the project and it became impossible to maintain - in any sane fashion - the list of used interfaces, and the dispatch code was constantly broken by changes to the other parts of the software.
Re: (Score:2)
With the utmost respect, anyone who fails to recognize the very basic difference between having a unique online personality and just typing "FuckingNickName" because they could not be arsed to come up with something unique is unlikely to have a clue or valued opinion about anything, at all, ever.
Or we could proceed the usual way, where I don't know your name and I evaluate your ideas instead on their merit, and you don't know other people's typos because we're speaking to each other. Civilly. Like grown
Re: (Score:2)
Steve, is that you?..
Re: (Score:2)
Indeed - not to mention that there are plenty of other mainstream widely used alternatives too (e.g., Nokia).
Jailbreaking is NOT a solution. It just isn't.
Exactly. It's interesting that it's considered an acceptable workaround for a phone that's meant to "Just Work", and is supposed to be easy to use and good for non-geeks. The irony is that whenever a similar workaround is suggested for another platform, the iPhone fans ridicule it.
Then there's the point that when we get stories about iPhone viruses, peopl
Re:Apple is like... (Score:5, Insightful)
More like Audi/BMW putting a 250 km/h speed limiter on the car you just bought. Sure, you can go ahead and remove the limiter yourself, and why the hell not change the fuel mappings on the ECU while you're at it? Audi/BMW will not support the modifications nor honor the warranty on your car, but there's nothing 'physically' stopping you from making the modifications. They are by no means obligated nor legally required to tell you how to circumvent their limitations and reverse engineer their software.
When an engine suddenly catches fire doing 270 km/h+, or you suddenly lose control of the car, the last thing they want is for you to point the finger at them and say: "Well, you technically allowed us to do this." They are just doing everything possible to cover their asses.
Look at Windows Mobile for a minute. Stock installs are actually quite decent. But when Joe Sixpack starts installing "Bubble Popper 2.0", and "FREE XXX PIX" on his phone, and the phone shits a brick, guess who takes the blame? Yeah, Microsoft and their "damn unreliable OS".
Re:Stop making apps, start making web-apps (Score:5, Insightful)
No thanks.
Personally, I hate web apps. They're still vastly inferior to desktop applications. They need a constant connection, are less responsive than a desktop app, are limited in the GUI they can have, work or not depending on the browser, and are in many cases outside of my control, which is excellent for lock-in.
There still are many places where I have no internet connection. It happens when travelling in the underground. It's frequent above the ground in a train in some areas. It's unaffordable when roaming. It doesn't work in the middle of nowhere. I find it unacceptable to lose access to my stuff just because I happen to be somewhere without a cell tower.
What we need is more open architectures, where anybody can make anything they want without interference.
Re: (Score:3, Insightful)
Hear, hear!
In addition, they're not future-proof. I predict that any data will be inaccessible in a mere decade or two, and you can't just boot up a 15-year-old but compatible version of your web app, e
Re: (Score:3, Interesting)
Not sure it answers all your concerns, but on the iPhone at least, you can package up a web app so it installs locally. Then it's basically a local app that happens to be written in JavaScript and rendered via the WebKit toolkit.
Web Apps Don't Work When You're Not Online (Score:5, Insightful)
Re: (Score:2)
Web apps designed in HTML5 can indeed work when you're offline. However, you're definitely right about being at the mercy of the website.
Re: (Score:2)
web apps are slow, lack features easily implemented in an exe, and worst of all leave you even more at the mercy of some other asshole who could pull the plug on your app at any time.
web apps have their place, but they aren't a solution to the apple syndrome.
Re: (Score:2)
this post wins fail of the day
But I still got modded +4 interesting at the moment, haha.
However, I do agree with you to some point.
web apps are slow, lack features easily implemented in an exe
True; however, with HTML5 things will improve. The current JavaScript engines are sufficiently fast to handle most complicated tasks. Further, there are technologies in beta which allow execution of machine code in the browser (in a sandboxed environment). See Google Code/Labs (I forgot the name of that project). This may eliminate problems with the performance of web apps altogether, in the (hopefully near) futu
Re: (Score:2)
Almost all web apps, by their very nature, are proprietary. How many websites make their source code available? How many websites can you set up on your own web server? How many websites allow you to opt out of software updates (i.e. updates to the website itself)? Such examples certainly exist, but they are few and far between.
Re: (Score:3, Funny)
Blame and credit alike... it does claim that if you read C code into memory, that it can then parse the read C, but I don't think it could parse the read C, which is why the java is being used for things it really doesn't want to be used for in Egypt to this day.
Re: (Score:2)
Wow dude man.