For those not familiar with Objective-C, the title is an allusion to manual memory management (i.e. no garbage collection).
When you are done using an object, you call its 'release' method. Once everyone using the object has called 'release' on the object, it is deallocated.
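The retain/release dance described above can be sketched in plain C. This is a minimal illustration, not Apple's implementation; the `Object` struct and `object_*` function names are made up, and they mirror what NSObject's `-retain`, `-release`, and `-dealloc` do:

```c
#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

/* A hypothetical reference-counted object, Cocoa-style. */
typedef struct {
    int refcount;
    const char *name;
} Object;

Object *object_new(const char *name) {
    Object *obj = malloc(sizeof *obj);
    obj->refcount = 1;          /* the creator owns one reference */
    obj->name = name;
    return obj;
}

void object_retain(Object *obj) {
    obj->refcount++;            /* a new owner claims the object */
}

/* Returns 1 if this release deallocated the object, 0 otherwise. */
int object_release(Object *obj) {
    if (--obj->refcount == 0) { /* last owner gone: deallocate */
        printf("deallocating %s\n", obj->name);
        free(obj);
        return 1;
    }
    return 0;
}
```

So with two owners, the object survives the first release and dies on the second — which is exactly the bookkeeping you do by hand in pre-ARC Objective-C.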
No, it's reference counting, which by definition is not manual memory management. Rentzsch used this title because he, like most Mac developers, uses Objective-C, a language that has been significantly relevant to the iPhone only for the last few months.
You manually count references. It has plenty of the downsides of manual memory management, even if some silly technical distinction puts it in the same camp as the vastly superior (in terms of programmer time) solutions offered by higher-level languages (Java, Lisps, Python, Ruby, Perl, C#, etc.).
"but I can count the total number of software engineering advances they’ve made on one hand."
I was just thinking the other day about the number of software engineering advances they've made that I wish would fan out:
- HyperCard
- Dylan
- MPW Shell (thank god for BBEdit)
- Newton OS and its soups
- Interface Builder (as opposed to code-generating IBs)
- WebObjects (especially the tools)
Sure, that's barely more than a single hand (and is less if you rule out NeXT things), but really, how many software engineering inventions do you want from a single firm?
What have they done recently, though? Every last thing you listed is from the 90s. And if we're looking at the 90s, I can add to that list with things that actually stuck around: ColorSync, QuickTime, TrueType, AppleScript, and more all come from that decade. But while Apple has made innovative stuff on the UI front recently, I can't name even one damn thing they've done technically that I found impressive other than the new JIT in Safari. Hell, I honestly think that what Microsoft's doing with .NET and Azure is more technically sophisticated and interesting than anything coming out of Apple--especially on the Cocoa/CocoaTouch/Objective-C front.
I think that's what's irritating Rentzsch so much, and to be honest, it really pisses me off, too. I don't mind if Apple doesn't want to be that innovative, but I take exception to being told that I'm not allowed to be innovative without their permission.
I was at WWDC last year, and I was surprised at how hardcore their compiler guys are. LLVM/Clang is some amazing stuff. And blocks have been in development since at least summer '08.
Well, if you want to talk about "hardcore compiler guys", I'd say that Google has the advantage since e.g. Ken Thompson and Rob Pike have chosen to work there and not at Apple. They are on the team that's developing Go and the pace of improvement is pretty remarkable.
Rob Pike et al also developed Sawzall, a massively parallel data analysis language that's been in use in production at Google for several years.
Google also developed a custom register based Java VM for Android. And they have a Java to JavaScript (!) compiler for Google Web Toolkit.
Microsoft is doing more interesting work than Apple in developing production-ready languages, with e.g. Haskell (Simon Peyton Jones is an MS researcher), .NET/C#, F#, and IronPython. But unfortunately for MS they are tied to the albatross of Windows...
LLVM/Clang might look impressive but AFAIK not even Apple is using it to ship production code. Proof-of-concept implementations are one thing; production-quality shipping code is another.
LLVM/Clang might look impressive but AFAIK not even Apple is using it to ship production code.
Now, see, that's just plain wrong. Apple ships some pretty big products that are built with clang/llvm: Xcode, OpenGL, and OpenCL, to name a few. And I wouldn't be surprised if parts of iPhone OS 4 are built with it, due to [REDACTED due to the NDA].
Plus, you know, MacRuby. Which is built with LLVM and just as fascinating as IronPython is, in terms of integration with a preexisting environment (only the Cocoa/ObjC runtime instead of the .NET runtime). And the few people working on that (some of whom are Apple employees) are pretty damn smart as well.
Well, if you want to talk about "hardcore compiler guys"...
I'm also going to have to disagree with you on this. Chris Lattner and the rest of the people working on LLVM at Apple are pretty damned smart (and serious) about what they're working on. They might not have as large a neckbeard as Ken Thompson does, but that doesn't mean they should be ignored outright.
Grand Central Dispatch and C blocks aren't technically interesting. Lots of languages have lambdas--the .NET framework has had them since, what, 2003? including, via an incompatible extension, in C++, which is almost exactly the relationship blocks have to C--and GCD, while fascinating for the Mac platform, isn't meaningfully different from the thread pools other platforms have had going back to Windows NT 4.
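The "blocks are just lambdas" point can be made concrete in plain C: a block is essentially a function pointer bundled with the state it captures, which you can emulate by hand with a context struct. The names here (`Env`, `Callback`, `run_task`) are invented for illustration; `run_task` runs the task inline where GCD's `dispatch_async` would hand it to a worker thread:

```c
#include <assert.h>

/* State a block would capture from its enclosing scope. */
typedef struct {
    int captured;
} Env;

/* A "block" emulated as a function pointer plus its environment. */
typedef int (*Callback)(Env *env, int x);

static int add_captured(Env *env, int x) {
    return x + env->captured;   /* uses the captured variable */
}

/* A thread-pool-style submit, minus the actual threads: GCD's
 * dispatch_async does the same hand-off onto a worker queue. */
static int run_task(Callback fn, Env *env, int arg) {
    return fn(env, arg);
}
```

Block syntax just lets the compiler generate the `Env` struct and the capture for you, instead of you threading a context pointer through every API by hand.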
clang is interesting, and I will grant you OpenCL. So there are a couple. But very, very few.
"Interface Builder first made its appearance in 1988 as part of NeXTSTEP 0.8. It was invented and developed by Jean-Marie Hullot, originally in Lisp (for the ExperLisp product by Expertelligence). It was one of the first commercial applications that allowed interface objects, such as buttons, menus, and windows, to be placed in an interface using a mouse. One notable early use of Interface Builder was the development of the WorldWideWeb web browser by Tim Berners-Lee at CERN using a NeXT workstation."
To be honest I don't care much whether they're actually innovating technically or not, but I do care that they are not picking up other people's technical innovations. Having to program in Objective-C makes me feel like I'm back in the 90s. It's embarrassingly bad.
I suppose it's ok compared to something like C++, but coming from a more dynamic/functional preference (Common Lisp, Python, etc.) I find Objective-C clunky (my experience is in using it on the iPhone, so I'm really talking about the language as it is implemented on that platform).
Reference counting memory management gets old and annoying very quickly.
I'm glad to see what are effectively lambdas in iPhone OS 4, but wow, it took a long time to get them! And the whole iPhone API requires delegate objects all over the place, so without lambdas, or the ability to make some kind of anonymous inline class, your code gets spread all over the place.
I don't like header files, nor do I see the need for them in 2010 for crying out loud - I shouldn't have to write the same thing in two places just because a compiler isn't smart enough to figure it out.
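The duplication being complained about looks like this in plain C (Objective-C inherits the same model, with `@interface` in the .h and `@implementation` in the .m). The `counter_next` name and the header/source split are a made-up example; here both halves are shown in one listing:

```c
/* --- counter.h: the declaration --- */
int counter_next(void);

/* --- counter.c: the same signature, typed out a second time --- */
int counter_next(void) {
    static int n = 0;
    return ++n;
}
```

Change the signature in one place and forget the other, and you get a compile error at best and a silent mismatch at worst — which is exactly the busywork languages with real module systems avoid.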
Overall I would prefer to just use C, if I need something really low level, or something much more high level if I don't (Common Lisp, Python, etc.). Objective-C straddles the two places badly imho.
Verbosity doesn't mean long method names. In this case it means header files, primitive types vs objects, poor collection support, and manual memory management. Smalltalk does a better job in all of those areas.
Smalltalk has equally long method names in many places... see some of the Morphic libraries... ObjC is basically naming type parameters. I don't think their (ObjC) collection support is too bad either. Also, the manual memory management only exists on the iPhone; if you use ObjC on the desktop, it comes with a garbage collector.
Which is what makes it even worse: it's not that they can't do it on the iPhone, it's that they won't.
And why not? It can't be a question of RAM: Java had GC back when the average home computer had half the memory of a first-gen iPhone. Nor can it be efficiency: most GCs these days are efficient enough for the average program, and a developer could always disable it if she prefers manual memory allocation.
My point is that it just ends up being yet another example of Apple making arbitrary restrictions that do little for performance or user experience, while causing a lot of pain for developers.
The Newton had a garbage collector with NewtonScript. And the Newton only had something like 640kB of RAM.
I don't think the decision to leave it out was arbitrary, though, or made solely to piss off developers. Apple likely wants as little code duplication as possible, so they're not going to add it until the iPhone/iPad has enough RAM free that game developers don't have to worry about when the GC will run. My guess (and lots of emphasis on guess) is that this will happen when support for the iPhone 3G is dropped.
Yes, more so than Smalltalk. While both have unary and keyword messages, the messages in the Cocoa APIs are generally much more verbose than their Smalltalk analogues. Compare insertObject:atIndex: from NSMutableArray with at:put: from Smalltalk's Array, OrderedCollection, and Dictionary.
Also, Smalltalk has binary messages (with which arithmetic is implemented), which allows for the creation of useful shortcuts like "," for collection concatenation and "@" for point creation. Smalltalk has precedence levels associated with its message syntax; Objective-C doesn't, hence [the [bracket soup]]. Smalltalk also has an extremely concise block closure syntax and no need for header and object files or special function/class declaration syntax.
Sort of... webOS, or more specifically the Mojo and Ares frameworks/tools, provides JavaScript libraries for doing HyperCard-like things. The Palm Pre's main UX is very HyperCard-like. Now that Ares is out (ares.palm.com), you get nice visual dev tools similar to what you had in HyperCard.
"But unlike previous issues such as the senseless iPhone SDK NDA, the majority of the community isn’t riled by 3.3.1"
I think the previous issues have already weeded the field. Those developers prone to righteous indignation have already left. There's another group of Apple fanboys who will never complain. There's a third group that is getting more and more fed up and looking to port to Android, but they're still working on iPhone apps because there's a market. Just like with most issues, there's a big quiet middle between the two extremes that is too busy to blog about its ambivalence.
My understanding is that C4 was a Mac development conference.
So I'm not sure why section 3.3.1 of the iPhone developer agreement (and the absence of "outrage" from iPhone developers) should prompt C4 to release itself.
Microsoft doesn't have any restrictions like that for Windows. But Xbox 360 development is restricted, like other game consoles. You have to get their permission to sell games on disc or through Live.
What new features would you want? I still use Tiger on my primary machine and haven't missed much (outside of performance boosts) in the subsequent releases.
There was a time when Apple's ability to ship a new OS every 1.5 years was touted as one of its proudest accomplishments. MS was ridiculed for years over its inability to ship a new version of Windows.
Desktop computers (and laptops) aren't advancing at the same rate that they used to. Also, OS X has matured tremendously since its reboot of the Mac OS. Maybe there's just not that much left on the queue.
When you are done using an object, you call its 'release' method. Once everyone using the object has called 'release' on the object, it is deallocated.
Depressingly poetic.