I've been watching reactions to Apple's controversial decision to prohibit the publication of iPhone applications created in environments other than Apple's own.
The policy has a number of implications, including a prohibition on iPhone apps running any interpreted code. For example, Apple removed the iPhone Scratch player from the App Store because it runs projects made in Scratch, a popular programming-education tool for kids.
One victim of the change is Flash. Adobe had created a Flash exporter for iPhone, which now clearly violates the new terms. Adobe has since scrapped the exporter. After a flood of angry responses online, Steve Jobs penned his own response.
Faced with Apple's decision, programmers have many choices. For example, Cocoa programmers will have to decide if Apple's policies sit well with them. If not, they may have to choose another platform, one that won't be programmable in Objective-C. Likewise, Flash developers who want to make software for iPhone will have to decide if they're willing to move over to Cocoa, or if they want to opt for another mobile platform like Android.
No matter what one chooses, and no matter how one reads Apple's intentions, there's something perhaps even more insidious going on among the programming public. Specifically, a large number of developers seem to think that they have the right to make software for the iPhone (or for anything else) in Flash, or in another high-level environment of their choosing. Literally, the right, not just the convenience or the opportunity. And many of them are quite churlish about the matter. (A few among many examples can be found in text and especially comments at 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
This strikes me as a very strange sort of attitude to adopt. There's no question that Flash is useful and popular, and it has a large and committed user base. There's also no question that it's often convenient to be able to program for different platforms using environments one already knows. And likewise, there's a long history of creating OS stubs or wrappers or other sorts of gizmos to make it possible to run code "alien" to a platform in a fashion that makes it feel more native.
But what does it say about the state of programming practice writ large when so many developers believe that their "rights" are trampled because they cannot write programs for a particular device in a particular language? Or that their "freedom" as creators is squelched for the same reason?
I wonder if it doesn't amount to an indictment of the state of computational literacy.
There are lots of types of computer platforms. There are embedded systems that have to be programmed in low-level or machine languages. There are scripting environments that sit inside commercial productivity software. And there are many in between. Part of understanding computation is understanding the differences between platforms—what makes each unique and how to consider and exploit that uniqueness. Such is part of the goal of the platform studies project (see, for example, Nick Montfort's and my book on the Atari, Racing the Beam).
When I teach Introduction to Computational Media at Georgia Tech, I purposely force my students to work with a large variety of platforms. Some of them are familiar, like Java. Others are less familiar, like Inform or AIML. Some are downright unusual, like the Atari VCS. And others still are just plain absurd, like the esoteric programming language Chef.
I do this to force them to touch multiple platforms, each of which requires a different way of thinking about computational creativity. Inform, for example, inspires a different sort of work (interactive fiction) than does Processing (generative, abstract visual art).
I worry that we're losing a sense of diversity in computation. This seems to be happening at both the formal and informal levels. Georgia Tech's computer science bachelor's degree doesn't require a language survey class, for example (although one is offered as an elective). This year in the Computational Media curriculum committee, we've been discussing the idea of creating a history of programming languages course as a partial salve, one that would explain how and why a number of different languages and environments evolved. Such a course would explicitly focus on how to learn new languages and environments, since that process is not always obvious. It's a wonderful and liberating feeling to become familiar with and then master different environments, and everyone truly interested in computing should experience that joy.
I am not suggesting that Flash developers are lazy or stupid. But I do think that the reaction Apple's iPhone terms have inspired should tell us something about our collective attitude toward creating things with computers. And not something good.
The computational ecosystem is burgeoning. We have more platforms today than ever before, from mobile devices to microcomputers to game consoles to specialized embedded systems. Yet a prevailing attitude toward computational creativity longs for uniformity: game engines that target multiple platforms to produce the same plain-vanilla experience; authoring tools that export to every popular device at the lowest common denominator; and, of course, the tyranny of the web, where everything that once worked well on a particular platform is remade to work poorly everywhere.
It is a kind of computational extirpation, where everything unique is crippled or cleansed in order to service a perverted belief in universality. I consider it a kind of jingoism, and I hope we can outgrow or destroy it.
This seems a tad unfair. Jobs's criticisms of Flash make a sound case for why Apple wants to keep the runtime off the device for technical, performance, and usability reasons. They could easily have blocked all Flash runtime access to the device for those reasons and earned quite a bit of sympathy from developers. But they didn't do that. They banned all middleware, interpreted code, and cross-compilation to their device.
There are many concrete negative effects of that decision. First, if you were doing a startup to support cross-platform phone-development middleware, you're probably screwed. I think it's pretty justified for those developers to be pissed about losing their jobs. Scratch, as you said. For that matter, it seems that even Apple-only middleware companies providing software like analytics libraries could suffer too.
Lastly, I don't buy your argument that cross-platform code is detrimental at all. I hate the fact that I can't run most PC games on a Mac, solely because they're all written to DirectX. It's just a pain. Are Braid or Façade cheapened by the fact that I can actually play them?
I don't understand how wanting to simulate one machine, virtual or physical, on another constitutes some form of perverted jingoism. Does the same logic apply to the universal simulation of Turing machines? Apple chose to effectively ban one of the most fundamental ideas in computer science from being realized on their device. How many new, yet-to-be-imagined platforms built on top of the iPhone has Apple written off wholesale? I am personally offended as a computer scientist.
To add yet more examples: Open Data Kit aims to make medical devices and all kinds of other sensors available to developing countries by hooking sensors up to phones. That's something I'd really like to see cross-platform support for. Maybe National Instruments could make LabVIEW available on mobile devices so scientists and engineers could test and deploy measurement programs more easily. Oh right, there's already an app for helping doctors monitor patients while they're away from the hospital. It would be nice if that technology could be standardized and made available to all doctors regardless of which company they choose to purchase their smartphone from.
I'm sorry I've been so long-winded and mildly acidic here. I totally understand the value of learning to work on multiple different intellectually and creatively challenging platforms, but I don't think that's what this is about.
I won't disagree with what you've written, but I think the sense of "Apple is doing something wrong here" interacts with other senses of right/wrong as well, especially economic ones. It seems to have a strong flavor of what antitrust law calls anticompetitive "tying": attempting to use a strong position (not necessarily a monopoly) in one market, like the hardware market for smartphones, to gain market position in another one, like that for software-development tools such as XCode, by mandating a tie between the two markets that is intended primarily for that market-leveraging purpose rather than any inherent purpose. I.e. the suspicion is that Apple is doing this for evil-capitalist reasons, not for diversity-of-platforms reasons.
It can also have a sort of anti-experimentation angle, which is oddly the opposite of the view you take. While it's true that one thing it forecloses is the lowest-common-denominator frameworks, another thing it forecloses is people building their own ways of programming the iPhone. Now they can no longer write their own Scheme interpreter that compiles to the iPhone, unless they do it covertly (maybe they could write a Scheme interpreter that compiles to Obj-C, keep it quiet, and Apple would be none the wiser). It would be as if Sun had said that only Java may target the JVM, and nobody may experiment with new languages like Scala or Clojure that also do so.
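To make the covert route concrete, here's a toy sketch of the idea (in Python rather than Objective-C, purely for brevity, and with entirely hypothetical function names): a miniature "compiler" that parses Scheme-style prefix expressions and emits ordinary host-language source, which a platform's toolchain would then see as hand-written code.

```python
# A toy compiler from a Scheme-like prefix-expression language to
# host-language source. Illustrative only: a real version of the scheme
# described above would emit Objective-C, making its output look just
# like hand-written code to Apple's toolchain.

def tokenize(src):
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return expr
    try:
        return int(tok)
    except ValueError:
        return tok  # operator symbol

def compile_expr(expr):
    """Emit an infix host-language expression for a prefix S-expression."""
    if isinstance(expr, int):
        return str(expr)
    op, *args = expr
    return "(" + f" {op} ".join(compile_expr(a) for a in args) + ")"

emitted = compile_expr(parse(tokenize("(+ 1 (* 2 3))")))
print(emitted)        # (1 + (2 * 3))
print(eval(emitted))  # 7
```

The point of the sketch: once the translation has happened, nothing in the emitted source betrays its origin, which is exactly why enforcement of the "originally written in Objective-C" clause is so fraught.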
While I think the "Flash is a Right" meme is a bit of a strawman in the argument, I think the reason the issue inspires such heated feelings is Apple has squarely chosen to take the iPad out of the realm of "computer." Not through technology, but policy limitations.
Compilation and interpretation go to the fundamental definition of a Turing machine, and Apple has chosen to make a device that is non-Turing-complete by contract.
Obviously Adobe isn't a neutral observer here, and I agree that Flash is overused in the world, but HTML5 is not here yet, not in a real way, anyway. Either way, I think focusing on Flash misses all of the other tools that Apple has cut off.
Unity3D and MonoTouch are behind some of the most popular games in the App Store. Lua is used widely by game companies (esp. EA) as a duct-tape language within iPhone games. I agree with you that people need more language exposure, but the reasons go beyond the simple viewpoint expansion you argue for. They need it so they can choose the right tool for the job. Arguing that Objective-C is "always the right tool" -- well, Apple has mandated that you only get a hammer. Hope your app is a nail.
Mandating the restriction to supported APIs is not only a good idea, it is critical. The fact that Java, for instance, did this at the language level was a big step forward. Lord knows Microsoft has been hamstrung by the lack of such a restriction in Windows development for decades now. However, limiting tooling above this level ties the hands of developers. Whether you want the game-oriented development tools that Unity gives you, the rich enterprise-y library availability that MonoTouch gives you, or, yes, the designer-friendly tooling and familiarity that Flash CS5 gives you, those decisions should be yours.
The story of computer software is a story of abstraction. Apple, here, is specifically saying, "This far, no farther." I think that is a loss for everyone.
Finally, going to the emotion here, I think a lot of it is fear. The caprice with which Apple has enforced App Store rules is notable. "Soft porn is OK once there are parental controls." "No it isn't; 5,000 apps canned." "Wait, Playboy, Victoria's Secret, and SI can stay." Even this rule change is a BIG change, coming seemingly from nowhere, that affects thousands of apps already in iTMS that, I presume, can't be updated after the iPhone OS 4 update. If Apple wants to serve as arbiter of all things on the platform but doesn't want to establish clear and reliable guidelines, of course they are going to generate animosity.
You are making an overall plea for diversity in computational environments. I support that.
(I don't particularly care about Flash, by the way.)
There is also something to be said for knowing lots of different languages and platforms.
I do think, though, that it is more valuable when platforms have major philosophical differences. If you're used to programming in C++ and Java, being forced to rewrite your code in Objective-C isn't particularly interesting from that perspective.
In the concrete case, you are arguing that _restricting_ the tool diversity on Apple's mobile platforms leads to _more_ diversity overall. Isn't this a bit of a stretch?
The reason I find Apple's policy offensive is that it is plainly meant to prevent cross-platform development, just so Apple can retain their lock on the market. I.e. less diversity.
Also worth noting -- Scratch was a violation of the OLD terms since it was a code interpreter, however limited. They were skating the edge already. Sad, but I don't think that decision was part of this policy.
I think Apple's policies are deeply troubled. I've been afflicted by the App Store approval process personally and at some cost. But this post is not about Apple or the App Store. It's about what that situation might tell us about a broader attitude toward computation.
I think you underestimate the difference between CocoaTouch and other platforms; it's not just a matter of language. But more importantly:
In the concrete case, you are arguing that _restricting_ the tool diversity on Apple's mobile platforms leads to _more_ diversity overall. Isn't this a bit of a stretch?
I'm not saying that at all. I'm trying to tease out a broader situation and attitude about programming computers via Apple's new policy. But I think you also miss the point about platform specificity. The iPhone is not just a device, it is a device + the modified residue of NextStep. It's certainly true that Apple is trying to prevent other, new interpretations of the iPhone platform, but that doesn't change my fundamental observation, which is about an overall attitude toward computational creativity (diversity writ large) rather than one limited to Apple.
I'll admit that tech startups are not at the top of my list of charity concerns. Still, they can write code for multiple platforms, yes? It's been known to happen for, hmm, the entire history of the software marketplace. Nobody's going to lose their jobs unless they bet the company on a single environment that can't ever target any platform natively. Which, if they did, rather supports my point here about an overall attitude toward computation.
I don't buy your argument that cross platform code is detrimental at all.
I never made that argument. I retold Apple's argument. I don't necessarily buy it either.
The jingoism isn't about Flash or Apple; it's related to a general attitude that computation is one big, monolithic apeiron entirely accessible from every other point within it.
... another thing it forecloses is people building their own ways of programming the iPhone...
I suppose if someone built a system that compiled down to native iPhone code, its output would be indistinguishable from that of Apple's tools. But it's clear that such a practice would be undertaken at some risk.
I suppose one could argue that Apple's policy is actually encouraging experimentation on other platforms! But that's also rather at odds with an attitude that everything should be done in Flash or HTML+CSS.
I'm not a developer, so I can't speak to whether that group is acting excessively churlish, but I do think there's a more important issue that has emerged as a result of Apple's decision to block Flash. Apple is, for better or worse, increasingly the only game in town. That means that the decisions Apple makes about what will be 'permitted' to run on its iPhones, iPods, iPads, and MacBooks--and about what will be 'permitted' to be distributed via iTunes--affect whose stories get told, and how, and when, and why. Developers are mad that they can't make Flash-based stuff for the iPad, of course, but users are also outraged because they've been shown just how much power Apple has to decide how we engage with the mediated world.
And while developers may not have a 'right' to build stuff for the ipad, consumers have the right--the absolute, fundamental right--to resist when corporations try to dictate communicative practice. And we are resisting, thank christ. Whether we have any power to effect change in this respect is questionable, but when it comes to decisions about how to market our lived experience, resistance is never futile.
Ian: "it's related to a general attitude that computation is one big, monolithic apeiron entirely accessible from every other point within it."
While I certainly understand that this isn't true, I am just not sure I buy that it's something to be accepted. Again, going to the Turing-machine definition of a computer, a computer is a device that can (possibly, anyway) execute code created for any other Turing-complete computer. In a way, what you are decrying is the definition of computing.
From a realistic standpoint, limitations on devices certainly don't make this a reality, but the industry has worked for years to make it closer to reality.
Obviously apps need to be targeted to devices with different input mechanics, but I am just not sure that is relevant.
Also re: "But that's also rather at odds with an attitude that everything should be done in Flash or HTML+CSS."
I guess what I don't see is anyone (aside from a few well-meaning zealots) arguing that Flash is the One True Way, or that HTML5 is the One True Way (though Jobs made that argument himself for a year on the iPhone). The point is, while I think most people agree that developers need exposure to other languages and other platforms, artificially restricting developers does violate something that feels like an unspoken rule in the world of computation. While there are certainly many platforms in history that, by default, had only one development toolchain, I can't think of another that mandated it. Even in the most strident days of Nintendo's platform guardianship, they never rejected a Quake-engine game because it was written with QuakeC.
Honestly, I think a lot of people would be more accepting if Apple simply said, "We are going to make a subjective call on the quality of your app, so don't give us shovelware Flash games," than with the current policy. That was Nintendo's stance and people grumbled, but it was never like this.
"This year in the Computational Media curriculum committee, we've been discussing the idea of creating a history of programming languages course as a partial salve, one that would explain how and why a number of different languages and environments evolved."
Sounds AWESOME! I once asked that the Design + Technology department at Parsons offer a course like this, not realizing at the time how antithetical to the curriculum it would be.
Apple is clearly exerting very tight control over their devices. It's not the first time this sort of thing has happened (Nintendo has always been much worse, but nobody notices ;), but it's perhaps the first time it's gotten considerable attention. I don't disagree with anything you've said, but I also don't think your comments contradict my points about computational literacy (nor do I think you meant them to).
The old "it's all just Turing machines" argument is a reductionism. The principles of universal computation don't mean that individual implementations of computer platforms lack unique features, features that cannot be reduced to the operation of Turing machines.
The book Nick and I wrote about the Atari VCS offers a concrete example of how surprising and far-reaching these platform influences can be.
I guess what I don't see is anyone (aside from a few well-meaning zealots) arguing that Flash is the One True Way, or that HTML5 is the One True Way.
This must be true. However, there are a lot of folks who do seem to believe that whatever they want to do ought to be doable in Flash or HTML5 or (whatever). It's this attitude that interests me most.
On the Nintendo/QuakeC point, again, even though all the comments here seem to think otherwise, I'm not arguing that Apple's position is correct or desirable. I'm arguing that it underscores a different, less visible attitude about programming in general, which is worth questioning. One could argue, for example, that the very use of a game engine attempts to flatten the expressive space of computation, to make the Nintendo device (or any other) a viewer for the Quake platform, rather than to treat it as a unique device on its own terms. One isn't necessarily "better" than the other, but it is certainly worth understanding their differences.
Thanks for the positive feedback. Curious: why would such a course be antithetical to the Parsons curriculum?
If I can be frank: the history of programming software has always been built on series upon series of "forced choices". Apple's constraining policy of "you either code in Objective-C first, or you can't do anything" is the latest exceptional forced choice out of an almost infinite number.
I think Ian's point here is one of attitude presupposing policy. If a somewhat ludicrous policy like Apple’s comes into play, this limitation should spawn a creative solution which exploits the structure of the forced choice, rather than a static demand for a solution.
Ian: "However, there are a lot of folks who do seem to believe that whatever they want to do ought to be doable in Flash or HTML5 or (whatever). (snip) One could argue, for example, that the very use of a game engine attempts to flatten the expressive space of computation."
I guess I don't see people arguing everything should be doable with every language/API/whatever. And you are right, when you make a choice, you eliminate options. I think the point here, though, is that for a great many applications, narrowing the expressive space doesn't matter if the language/API/whatever encompasses what you need.
HTML5, for instance, is still a great technology, but it is targeted at traditional WIMP-y computers. There is no direct support for multitouch gestures. There is still not even a cross-platform idea of a context menu. Still, the lesson of HTML3 and 4 is this: even when you narrow the UI space down to layout and 5 widgets, you can cover the space needed for a majority of applications.
I think people who want to use Appcelerator, Unity, Flash or (to a much lesser extent) MonoTouch understand that they are making a choice that might limit them. However, Apple's argument that apps built with these technologies diminish the platform seems to work under the assumption that every app needs, or should, take advantage of every feature of the platform.
You may well be right that, particularly in the Flash world, there may be a lot of people who have learned ActionScript and just don't give a damn about learning a new language and having to do their own memory management. I just look out over the whole of computing space and am amazed at the choices we have today. Selection of a toolchain best suited to your problem is part of the new problem space. Certainly this means you have to play with a LOT more languages and technologies than you did when I was first getting into the field. Still, I value that as a developer.
I find this bizarrely contorted. How can you not see that the outrage stems from the very thing you're attempting to champion? Apple has "crippled or cleansed in order to service a perverted belief in universality," yet Flash developers are the bad guys?
This isn't about an unwillingness to learn Cocoa or Objective-C (although the latter does have a couple of nauseating qualities). This is about being able to pick the right tool for the job. Yes, most of the time that will be Apple's tools, but now we'll never know whether certain components or apps could have been more elegantly derived by other means. That's "computational extirpation".
(To some extent it's also about timing. If making this change one week before CS5 shipped isn't a gigantic middle finger, I don't know what is.)
It's not about rights, either. Of course it's within their rights to do this. Just as it's within my rights to focus on Android instead.
It isn't so much that it is their right to write in any language for any platform. However, given that the technology already works, Apple's ban seems more churlish than any of the complaints lodged against it.
Potentially restating your thesis: The gentrification of computation literacy leads to suburbanization of platforms / platform studies?!?
Computational literacy is just a red herring. The real issue is one of effort. Apple's policies make it more difficult to develop cross-platform applications, which means unnecessary extra work for developers. If people were only developing apps for the iPad/iPhone they'd use Objective C like everybody else and would not complain.
It's rather arrogant of you to reduce criticism of Apple's policies to a matter of illiteracy and entitlement. You couldn't be more wrong.
Good points. On a related note, it's relatively uncommon to hear people complain about having to work in C.
Good points too. I think the fact that we've come to take APIs and high-level environments as a primary or even a sole way of thinking about creativity on computers is an interesting corollary to all this.
I'm arguing the opposite: the outrage stems from a desire for sameness, to be able to use one universal or near-universal approach to most computational problems. Apple's making a (very strong!) claim for exceptionalism in their platform. The quote you attribute to Apple wasn't meant to point to Apple at all, but to an overall attitude.
Interesting. I fear I'm getting lost in your urbanism metaphor, though. Rephrase?
Sigh. I've reduced nothing to anything else. I've used Apple's new policy as a lever for a different conversation. Unfortunately, nobody seems capable of seeing past Apple to the bigger picture. For example, consider this question: why *should* developers get to make cross-platform applications "easily" in the first place? An intriguing question worthy of some consideration. Not that I expect you to do so.
It's definitely churlish, no question. But I'll say once more: I'm less interested in Apple's policy specifically, than what it suggests about the broader programming ecosystem.
Why should developers get to make cross-platform applications "easily" in the first place?
1. Because it saves money.
2. Because it's technically possible.
3. Because artificial restrictions against it are essentially anti-competitive.
No developer has a legal right to force Apple to allow their products into the App Store, but that doesn't change the fact that ease of portability is generally a good thing for developers.
Adrian, I think you believe that I'm posing a pragmatic question when I'm actually posing a philosophical one. I'm well aware of the pragmatic benefits of cross-platform development. But what does it mean that we *expect* cross-platform development to be easy? To me, it suggests that we cultivate an attitude of computational universality rather than platform exceptionalism. An interesting situation, and one I think worth pondering in further detail.
I hate flash, and have for a long time.
I'm an Oracle guy.
Oracle requires flash for its online support (there are caveats to that statement, but true for general mainstream use, and it certainly raised a stink when they forced the issue).
Therefore, I have to use flash as an end-user whether I like it or not.
I find it odd that Steve and Larry are next-door neighbors and this restriction exists. However, it makes sense in the context of Steve arrogantly deciding that this is a consumer product, not a business product. Which seems odd to me given the overlap in end-users. But far be it from me to judge Steve's view of marketing; he's obviously way better at it than I am. Maybe he's going to create the consumer market first, then extend to business users -- I remember, before the IBM PC, helping MBAs with their regression-analysis programs on an Apple ][.
But you mention "writ large," and I think you are missing something in your position. That would be, in the larger computing environment, we need to be able to go anywhere and run anything. I think it is excellent that you make students use different paradigms, they really need to know that. They must also understand the limits of re-inventing wheels, and understand that standards and cross-platform compatibility create a whole larger than the sum of its parts.
Larry a while back was touting the idea of database as a utility, and that concept really transfers to a larger view. Think of what URL stands for - UNIFORM. Can you imagine what a hassle it would be if you had to use a different electrical plug for every device in your house? Bad enough trying to keep the chargers straight... you'd think we'd be able to set the laptop or phone down and have it charge by induction already.
Joel, you're right of course. I'm not suggesting total anarchy. Yet, sometimes the desire to avoid wheel-reinvention can blinder us too.
Flash isn't a right. Interpreted languages (Python, Lua, or just custom-made languages) are. This policy sucks.
OK. Putting practical concerns aside, it is certainly unfortunate there are any programmers who would dismiss a platform simply because it doesn't support their choice of development tools or because it doesn't bend over backwards to allow for cross-platform compatibility. I just don't like it when the reasons are artificial rather than technical.
I think software portability is a good thing, but I agree that platform diversity can also be a good thing as it avoids monoculture and is often the source of important innovations. I just prefer it when cross-platform tools coexist with platform-specific ones.
Note to Apple, Adobe, MS, et al. Having developers code for your platform and having consumers lap them up like pigs is not a "right". It is a privilege for which you should be grateful, unless you seriously intend your platform to never grow beyond the ideas of your one company. Rigid structure inevitably loses to synergy over time.
Although I'm no fan of Flash (OK, it's a good 'off' switch for all the blinking crap I don't want to see on the web), I'd defend the right to use it.
This right was granted to us by Alan Turing himself and derives from eternal law: the law of Turing-equivalence. http://en.wikipedia.org/wiki/Turing_machine
In oversimplified terms, any program that can be written in one language can be written in any other.
If Apple doesn't want certain programs to be written or run by users, they will have to prevent their systems from being Turing-complete (although they could just make them a crappy user experience). There is some evidence that this is exactly what they intend to do: make devices with no keyboard that are so limited that you can not write code on them.
IMHO, any device that fails to be Turing-complete is not a 'computer' and should not be advertised as such. It is instead something closer to an Etch-a-sketch.
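The equivalence this commenter invokes is easy to demonstrate in miniature. A complete interpreter for a tiny esoteric language fits in a couple dozen lines of any general-purpose language (Python here, purely as an illustration): any program that can read input, branch, and loop can host another language, which is why the prohibition on interpreters is contractual rather than technical.

```python
# A minimal interpreter for a tiny tape-based esoteric language
# (Brainfuck-style commands: > < + - . [ ]). The point: any
# Turing-complete host can simulate another language entirely in
# ordinary, "approved" code.

def run(program, tape_len=30000):
    tape = [0] * tape_len      # memory cells, each a byte
    out = []                   # collected output characters
    ptr = pc = 0
    # Precompute matching-bracket jump targets for loops.
    jumps, stack = {}, []
    for i, c in enumerate(program):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(program):
        c = program[pc]
        if c == ">":   ptr += 1
        elif c == "<": ptr -= 1
        elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".": out.append(chr(tape[ptr]))
        elif c == "[" and tape[ptr] == 0: pc = jumps[pc]  # skip loop
        elif c == "]" and tape[ptr] != 0: pc = jumps[pc]  # repeat loop
        pc += 1
    return "".join(out)

# 72 increments prints "H" (ASCII 72); 33 more prints "i" (ASCII 105).
print(run("+" * 72 + "." + "+" * 33 + "."))  # Hi
```

Whether smuggling such a thing into a shipping app is wise is, of course, exactly the question Apple's terms raise.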
I agree with you. Developers shouldn't consider cross-platform support a right. If we think about it, Microsoft and developers did just fine building native applications for the Windows OS for years. But as you also state in your article, there is a right "tool" for each situation. What Apple is doing is telling you that you can use only "this" tool. Even if your application, in terms of both cost and presentation, would be better handled by something different, they wouldn't allow it. Obviously Apple can do with its platform whatever it wants. But consider this example: I'm building a native application for the iPhone, but I require Silverlight to use Microsoft's Smooth Streaming technology and deliver the best possible streaming quality to my clients.
This scenario is simply not possible. Now the question I ask you is this. I've got limited resources and time. Do you really expect my company to put the resources into building a technology like smooth streaming?
I'm one of those who complain about working in C :-) I've long maintained there are so many better ways to do application development, but I don't complain too much, for a reason similar to what I stated above: there's a huge pool of talent because that's what the education system mass-produces, for good or ill.
I used to maintain that if one learned the various types of languages and the related concepts, one could easily learn similar new ones. I abandoned that position when I hit C, which I consider an inappropriate language for business applications. Since I'm not in academia or publishing, I've never had any illusion that I could be anything but a lone voice in the wilderness on this (well, I know others say various bits and pieces in places like comp.risks). But one never knows with memes.
Subbing in another of your terms: "The gentrification of computation literacy leads to the demise of platform exceptionalism."
Returning to my suburbs analogy, if every suburb in the country has essentially the same feature set (tracks housing, standard franchised businesses, etc.) and user interface (typical sprawl-esque planning), then what is the point of really living in any of them?
Similarly, if the majority of applications were developed as a cross-compiled monoculture, then what is the point of choosing one platform over another?
I didn't start out with this point, but I'm arriving at the argument that while "Developers" argue that closed platforms limit their freedom of choice, the ubiquitous neutering that cross-compilation allows actually limits the choices of the end consumer.
I, and I am realizing that I am in a minority, want platform to be a meaningful entity and an introspective choice. I acknowledge that this limits the content that may or may not be made available to me on that platform, but it also likely means that I am willing to pay more for content that validates my snobbish platform preferences.
Apple, Nintendo, etc. are built upon the ideals of boutique economics, which goes hand-in-hand with their need to maintain Ian's more academic concept of platform exceptionalism and his lightweight survey of the landscape of computational literacy.
But what does it say about the state of programming practice writ large when so many developers believe that their "rights" are trampled because they cannot write programs for a particular device in a particular language?
Instead of "cannot" I think you mean "may not." Developers can write apps in Flash and port to the Flash-free runtime of the iPhone; they're just contractually prohibited from doing so. It's Apple's legal prohibition that many find irksome.
Of course, we don't have a legal "right" to run Flash; such a law doesn't yet exist. When people talk about their "right", they are saying something different. They are saying that Apple's decision is wrong, offensive, and dangerous to the software development community. And they are saying that they are going to fight Apple's decision and that they are going to let potential buyers know about Apple's policies and their implications.
And if you think that "the computational ecosystem is burgeoning", you're quite mistaken. The diversity of programming languages and ideas today is far smaller than it was 25 years ago, in part because of monopolistic and abusive policies by companies like Apple and Microsoft. There is nothing in Xcode, Objective-C, or Cocoa that we didn't have in the 1980's already, and actually in better form.
And it is bizarre for you to argue that Apple's move to restrict the kinds of programming languages people can use to three 25 year old languages somehow increases diversity or contributes to innovation.
Microsoft and Apple have held back innovation and killed diversity in programming languages and operating systems and condemned us to sticking with tools that were obsolete decades ago. Programmers really need to take steps against that. A good start is not to develop for Apple's platforms, and instead pick platforms that do not impose such restrictions. And another good thing to do is to complain loudly in terms that non-programmers understand.
I certainly am telling people about all the software they can't run on iPhone and iPad because of Apple's restrictive policies. And you know what? They get it.
Overall I think the main point is exaggerated for the purpose of the post. I have yet to see many people discuss the topic in terms of "rights". But assuming there are those who think that way, they might have a point. Read any article on the topic and substitute Microsoft for Apple and Bill for Steve and see if your attitude changes at all.
I have a vague recollection of the investigations into Microsoft and how they were using their power to kill off others. Seems like one point was how it was unfair to have IE pre-installed in their own OS. At that same time Apple was playing the act of the little guy who was just trying to survive and needed protection. And I do think that wasn't an act and they did face some serious issues in the market.
I hardly think Apple is that little guy anymore and in fact they dominate some markets. In some ways I think they dominate like Microsoft did. Yet for some reason they are able to still use that little guy attitude in order to promote their ideals and agendas.
It seems like the Apple market is large enough that it should be looked at with the same scrutiny as IE was in the past. If Apple shuts that market off to certain developers, it's exerting a large amount of control over a market. Is that much different than IE being bundled with Windows?
In legal terms it's probably very different and my argument holds no merit. But in perception I think it's similar. For the people who think they have a "right", I bet the perception is that Apple is abusing its power over the market without a respectable reason. Remember, the key to much of this appears to be more about drama and personalities and profits than about what the official statements say, like technology or what's best for the web.
Apple and Microsoft both seem to have a lot of power in the marketplace. Yet it seems that Apple is free to exert that power without repercussion while Microsoft is never allowed to take any hard stand at all. Isn't Microsoft getting grief now for trying to end support for IE6 and XP?
When someone feels they have the "right" to do something it usually means they feel that someone is bullying them. Why is it that Apple has some kind of Teflon defenses against being seen as a bully for nearly everything they do?
I think the attitude you're describing is a natural outgrowth since the iPhone platform straddles the fuzzy border between embedded system and general-purpose computer. General-purpose developers are bringing their expectations where they may not be appropriate.
On systems that are more unequivocally "embedded", no one thinks twice about tight controls by the manufacturer, e.g.: Would you want a service technician to modify the program on your office building's elevator system controller using anything but the manufacturer's development tools?
FWIW, I think Apple has made a wise business decision, and as long as there are competitors to remind customers that the Apple way is not the only way, there is no great harm to the public good.
Are you serious? Forcing me to use one of three programming languages from the early 1980's (C, C++, Objective-C) is supposed to create "diversity in computation"? I don't want to use Objective-C, not because I'm afraid of learning it, but because I stopped programming in it two decades ago: it's an obsolete programming language that should have been retired long ago.
And who are you kidding? Apple isn't doing this because they want to drive innovation or diversity, they are doing this because they want to leverage their current lead into a monopoly; Jobs is like Gates. Let's hope for the sake of "diversity" that he is overplaying his hand.
And you're right: we don't have a legal "right" to program the iPad and iPhone in Flash or anything else. But we do have a legal right to complain about Apple's restrictions, to give Apple tons of negative press, and to explain to everybody who will listen what they are missing out on and what a threat Apple poses to the future of technology, free speech, and computer science. We also have a right to get politicians involved and get Apple investigated for violations of fair business practices.
Yup, I'm worried about computational literacy and lack thereof. It's particularly worrisome when people like you teach students and present something like Objective-C as "diversity". I suggest you go back and look at the rich and diverse landscape of programming languages and computational paradigms that existed in the early 1980's, and then compare it with the barren landscape that Microsoft, Apple, and Linux have created today.
You speak about homogeneous user experience being a Bad Thing.
If you're an app developer on multiple platforms it behooves you to NOT confuse your customer with wildly different interfaces between platform-specific versions of your app. Minor variances for platform compliance? Sure. Completely different?
While maybe not ALWAYS the best (or even "a good") choice, middleware like Flash can help provide this sort of continuity of user experience.
Decrying it because "it provides the same plain vanilla experience" is both ludicrous and shows an (at best) facile grasp of the problem.
I really don't see the point here with people disagreeing with Apple's chosen path. If you want "in" on their game, then play by their rules or find another game. How much simpler can it be? Stop acting childish and get on with more productive work.
Apple is, for better or worse, increasingly the only game in town.
The only game in town, with a 4% market share in the desktop OS market and a 25% market share in smartphone usage? How do you figure that?
And while developers may not have a 'right' to build stuff for the ipad, consumers have the right--the absolute, fundamental right--to resist when corporations try to dictate communicative practice. And we are resisting, thank christ.
Considering that Apple has sold 78 million iPhones and 1 million iPads, all sans Flash, it doesn't seem like consumers--at least the ones with no ideological axes to grind--are resisting very hard.
Apple says that I must use a particular development tool. I cannot translate code...
1 - No lex or yacc in my application.
2 - No other "little language"
3 - No higher order programming.
4 - Some severe restrictions in what I can deliver.
Of course, the "rules" are not necessarily applied -- after all, there is an HP 41 emulator available, with user programming allowed, and a Commodore 64 emulator.
Still, I am not allowed to write my code in Scheme or any other higher order language, and then translate (automatically, or manually) into Objective C for delivery.
How would Apple KNOW if that is the route I took? Maybe they will look at my code, and comment, "Gee, that looks too functional and recursive, I guess we have to reject it...".
Now, I will give Apple the "right" to disallow any app from the app store, for any reason. But this reason?
What if I prototype in Flash, and then send the source results to India to have it converted to a native app? Will that be disallowed?
I guess it will...
What if I don't bother to COMPILE the prototype, I simply code it, and send the detailed specs (including the code) to India instead: a "human compiler" will then do the necessary conversion. Assume the process works. It will STILL be disallowed.
In point of fact, the TECHNICAL reading of the restriction forbids a detailed specification... if that specification COULD be automatically processed. I guess that eliminates most formal design tools.
On a positive note, Apple HAS increased the value of Objective C skills.
Anyway, they have lost me as a developer -- I pretty much only use Scheme or Python for new code. Especially small games and utilities.
You don't even have the terms of the argument correct. No one is preventing developers from writing anything they want. I can write Flash apps for the iPhone all day long, and no one is going to say word one about it. The right that is being withheld is the right of the USER to PURCHASE and RUN apps that were written in Flash. That is an entirely different equation. Now you must ask yourself: whose device is it? Whose experience is it? This has nothing whatsoever to do with what programmers are allowed to do and is only an issue of what users are allowed to do with their own property to exert control over their own experience.
As an 11-year ActionScript veteran, and a 1.5-year Objective-C developer, who's released high-profile productions on both platforms, I still think it's a greedy d*ck move from Apple to block publishing from 3rd-party tools. Also, as someone who codes in a number of other languages every week (.js, java, php, sql, chuck), your post doesn't make much sense to me as a defense of the Apple policy. Of course it's good to know lots of languages and understand different platforms! I'm not sure who's saying it's their "right", but rather think it should be their "freedom" if someone has built the tools in a language they're already proficient in. As for me, I'll continue writing in both.
I'm probably not going to be able to keep up with responses anymore now that so many people are arriving from Slashdot, but I'll try. Sorry they're getting shorter.
I think at long last we're saying the same thing, more or less.
Please see my comment earlier about why Turing universality is too reductionist a position for platform differentiation.
I like your angle. I guess that's no surprise.
At that point in the comments, I meant not to speak about Flash/Apple specifically. Hopefully that clarifies somewhat. I agree that a legal constraint is harder to swallow than a material one, but then again law is a material force as well.
I agree that the 80s were a veritable rain forest of computational diversity. But things now are much better than they were in the DOS/Windows era.
Good point about bullying. Again, I think it's interesting that they feel cornered, if we can keep the analogy.
I think the attitude you're describing is a natural outgrowth since the iPhone platform straddles the fuzzy border between embedded system and general-purpose computer.
Good point, it might be amplifying the issue.
Forcing me to use one of three programming languages from the early 1980's (C, C++, Objective-C) is supposed to create "diversity in computation"?
No, that's a separate matter. I'm trying to point to the way Apple's policy draws our attention to issues of diversity.
We're talking about two different things. You're talking about HCI-style user expectations. I'm talking about the messy reality of a multitude of computing systems.
That's true, although presumably developers like it when people can obtain and run programs they create.
My post isn't a defense of Apple's policy. It's an attempt, perhaps unsuccessful, to understand the implications of people's reactions to it.
This is a matter of discipline. The discipline is enforced by the boss, and that is what Jobs does.
The moment you start writing public apps for the iPhone you become, somehow, an Apple employee. And as an employee you have to follow the CEO's direction. If not, you're out.
Just remember all the flame wars about C++ and OO in the Linux kernel. Linus stood strong, and today the kernel is still pure C. Guys here may place C on the obsolete shelf, but the truth is C still dominates the embedded world (because an OS is a pure embedded application).
Jobs does what Linus did: keep and reinforce some discipline in the name of industrial reliability.
The iPhone may be a toy for some of the vexed kiddies above, but for Apple it is an industrial product it has to support. And support is something you, as a customer, absolutely want without caring how expensive it is.
I read Jobs's letter about Flash and I found it absolutely reasonable. And I do not think he has to justify it. If Apple thinks it is more efficient to support only a set of tools, then developers have to comply.
i'd like to take a shit on steve jobs - stupid mother fucker.
"It's an attempt, perhaps unsuccessful, to understand the implications of people's reactions to it."
Unsuccessful -- for my part, yes.
I don't know what kind of response you were looking for by calling a large contingent of developers illiterate jingoists*, but I don't see how you could possibly have been hoping for intelligent discourse. If you really have a point to make, how about hunting down some concrete examples? Back your assertions up. How about quoting some developers who believe that they have the right to do everything in a particular language? Who are these people? Do they exist, or did you invent them? Do the legwork. Show us.
* But we're not "lazy or stupid"--thank goodness for small favors.
A few thoughts:
First, Flash is just a strawman. Apple's policies extend far beyond Flash, so making Flash the focus just detracts from your arguments.
Second, it's interesting that you make the common mistake of conflating rights and freedoms. You said:
But what does it say about the state of programming practice writ large when so many developers believe that their "rights" are trampled because they cannot write programs for a particular device in a particular language? Or that their "freedom" as creators is squelched for the same reason?
Their freedoms as creators are being squelched (or reduced, to use a less strong term). That is undeniable: if you make a rule that says "you can't do X" then you are reducing another's freedoms. Whether that person had the *right* to do that thing that you're forbidding is a separate, but easily confused, matter.
So while you might be right in saying that being able to develop for the iPhone OS in non-Apple-approved languages is not a "right", the fact that they won't let developers use other languages is reducing their freedom. This is especially true when you compare against pretty much every major platform in existence where the only limitations on what tools you can use are technical, not legal/contractual.
Finally, your argument about diversity and jingoism verges on the ironic. The diversity of programming environments for the iPhone OS is being reduced by Apple's policies. If you think about Apple's actual motives (rather than their stated motives) you'll realize that it's practically the definition of jingoism: "extreme patriotism in the form of aggressive foreign policy". They want to make it harder for developers to develop for multiple platforms, in the hopes that they'll only develop for iPhone OS.
I have a suspicion that this will backfire. The best developers don't use one tool/language to write everything. The best developers are fluent in a wide variety of tools and use the best tool for each particular job. Only mediocre developers use only one development tool/language, whether that's Flash, Objective-C, or whatever.
So now iPhone OS developers are hobbled by a lack of choice in what tools they can use while developers for competing platforms have access to whatever tools they want. Mediocre developers who use only Objective-C won't know the difference and will stick with the iPhone. The better developers have the choice of artificially handicapping themselves, or switching platforms. Either way, the iPhone OS platform will miss out.
"Just remember all the flame wars about C++ and OO in Linux kernel. Linus stood strong and today the kernel is still pure C"
Linus made a decision only for the project he maintained. If I don't like it, I can take the Linux kernel and fork it. I can also replace the Linux kernel with any of half a dozen work-alike kernels.
The analog of Apple's move would be for Linus to try and dictate what programming languages I can use to write applications on top of Linux.
"Jobs does what Linus did - keep/reinforce some discipline in the name of the industrial reliability."
Objective-C is a C derivative with an object system grafted on top of it. Its object system has horrendous type holes, and the entire language lacks fault isolation. Manual storage management is error and leak prone. How is that supposed to create "industrial reliability" compared to a safe, garbage collected language?
And--I have an iPad--it does not. The iPad OS leaks memory (at least with some apps) and needs periodic reboots; other apps already crash with regularity.
The idea that imposing Objective-C makes the iPhone/iPad reliable is a joke. Jobs is doing this to increase the cost of switching for developers. Apple is using Microsoft's strategy: if you can't win with technology (and Apple's technology is pretty mediocre), then win through marketing, legal, and business shenanigans.
"And I do not think he has to justify it."
No, he doesn't. In fact, Jobs should just shut up altogether, because every time he opens his mouth, a lie comes out.
Doing things in new and different ways just because they're fun is a luxury most real programmers don't have. If you introduce a new language into an existing, working project due to your yearning desire to broaden your horizons you should probably be severely reprimanded and / or fired.
Businesses who have supported the Flash platform because they knew these iPhone / iPad compatible tools were coming have now been screwed out of thousands of dollars in man-hours as well as real revenue lost from not being able to sell applications that were otherwise production-ready.
IMO anyone who doesn't find what Apple has done to be morally objectionable probably has a naive view of how long it takes to write code that actually works.
You're arguing here that a train company has the right to determine how I get to the station.
I'm seeing parallels with how Apple treated licensing desktop development back in the '80s, and looking at Microsoft's overwhelming market share, we can all see how well that worked out for them.
Maybe I misunderstand but I thought Ian Bogost is really raising the question of why developers think they have a right to develop how they like. Is it some mythical notion of the Internet as being free knowledge and information for all?
Using Apple as an example creates a bonfire though because
a- people who followed Apple religiously when no one else used them now follow Apple religiously even though other people use them, and are even more religious because they feel justified for their time in the wilderness.
b- some people now hate Apple because the very reasons they once followed Apple (the little guy, open access) are the very reasons they hate Apple now (come on, it is an Intel machine with BSD that runs Office and even Windows!)
If I understand Ian's argument, platform diversity is fine, to sharpen tools and creativity. And language specification is fine to improve device performance.
Where I don't agree is when a company forces you to write code for their device which is not necessarily more efficient, and where (as said above) they control the use of the device once it is bought.
If you buy an iPhone or iPad or iPod Touch, don't you have the right to run on it what you like? Apple doesn't have to support it, but why go out of their way to ban the right to purchase foreign software?
I know Java developers inside Apple who in 2008 were pushed sideways, out or made to change teams and skills. Because Java is not good enough? Or because it could not be controlled centrally by Apple? Flash may have issues with performance, but all other non-sanctioned languages etc? Really?
I've added some links to places where I've seen this conversation play out, in part. The comments in the first one are particularly indicative, I think.
It's not Flash developers who are jingoistic—it's much worse than that, it's everyone. Reread the last paragraph of the article and you'll see the net I'm really casting.
Forgive me this, but, one might say that your own obvious ire emanates partly from just the sort of universalism I'm trying to critique.
You raise important points about the business situation of software development. But, I can't help but think that very situation is a part of the scenario I'm describing, one in which we've worked our way out of the computational multitude and into expectations of universality.
As I've said a number of times in the comments here, Apple's policies are dubious at best, and downright destructive at worst. But maybe they are actually beneficial in at least one way, if indeed they can help us see some of the collective assumptions we've been making about how computational media ought to be created.
Hey everyone, thanks for reading and commenting. There are a bunch more comments over at the Slashdot post on this article.
I'm going to close comments here, since I don't have time to respond to all of them, and since I seem to be repeating myself over and over.
One final word on the matter: it was never my intention to comment one way or another about Apple and Flash. Rather, I think there are general lessons to be learned from the situation, which point to a homogenization of computational creativity, rather than a focus on a wealth of platforms and environments that operate uniquely.
Put more simply, computational media is not just content. It's also in the connections between software and hardware, constraint and creativity.