Creativity Finds a Way
Great observations
There is currently a nice little discussion on HyperCard going on in the comments on Stanislav Datskovskiy's article Why HyperCard had to Die:
The article looks at the right facts, but I think draws the wrong conclusions: Yes, HyperCard was an echo of an era where a computer was a complex machine, and its owners were tinkerers who needed to customize it before it became useful. Yes, when Steve Jobs came back, he killed a lot of projects. And the Steve Jobs biography mentions that he doesn't like other people screwing around with his designs.
But I do not think this automatically leads to the conclusion that Apple is on a grand crusade to remove the users' control over their computers. Nor does it mean what many of the commenters say, that Apple is trying to dumb down programs and that programmers are underestimating their users.
How people work
Every programmer knows how important a coherent whole is: if a button appears in the wrong context, it can easily (and unintentionally) trick the user into thinking it does the opposite of what it really does. You can add paragraph after paragraph of text telling your users otherwise, and they will not read it.
This is not because users are stupid, but because users "scan". Screens are complex, and full of data. For the user to find something without spending hours of their life on it, they tend to quickly slide their eyes across the page, looking for words that come from the same category as the thing they are trying to do next.
This is a human efficiency optimization, and a good thing: without it, we would be overwhelmed by the sheer amount of information on screen and incapable of coping with it. Once a word is found, the user reads a little around it to verify the impression that this is what they want, and then they click the button.
It seems trivial to engineer a program for that, but it is easy to overlook that a computer does not run a single application at a time. Other things are happening on the screen; other windows may be open; system alerts may pop up. Even if each is marked with its application's icon or name, chances are that most users are too busy getting actual work done to memorize application names and icons. They won't be able to tell your application apart from any other.
It is similar with haxies. Any halfway successful programmer probably has a story of tracking down a crash or oddity a user encountered in their program that was actually caused by a plug-in or haxie injecting itself into every application to modify some behaviour system-wide. And once they are installed, even I occasionally forget they are there, or don't expect them to have an effect: why should a tool that watches for my cursor hitting the edge of my screen, and then remote-controls the cursor on another computer as if it were an attached screen, cause the menu bar to randomly not show up when switching between applications?
Software is complex. Designing reliable, usable software is complex. In a comment, Stanislav had a great analogy for this (in response to someone's pipe dream that one would just have to use HTML, and the technical stuff was all already done, you just had to add the human touch):
All the pieces of the world's greatest statue are sitting inside a granite mountain. Somebody just has to come and chip away all the extra granite, adding the human touch. The technical problems are all virtually solved!
Software is hard. I don't say this because it makes me sound cooler when I say I'm a programmer, but because you're not just building a thing. You are building behaviours. HyperCard was notorious for being the tool for the creation of a number of the ugliest, least Mac-like programs ever released on the Mac. Because even with the best camera, your movie is only as good as the camera operator.
So was Steve Jobs happy to get rid of HyperCard and stop people from screwing with his design? Probably. Was he forced to let it linger instead of killing it outright because he didn't want to lose the educational market? I can't disprove it. But Steve Jobs was also known to be an idealist. He genuinely thought his work would improve the world. What would he gain by making everyone dumb and uncreative?
Why assume malice when Occam's Razor suggests a much simpler explanation?
You can't hold a good idea down
When the Mac was originally released, it was intended as a machine for everyone. To bring computers to the masses. Almost from day one, the goal of Apple Computer, Inc. has been to drop the darned "Computer" from their name. Compared to the mainframes of the time, the Apple ][ that started the home computing revolution was already a "dumbing down" of computers.
Was this the end of the world? Should we have stayed in the trees? Will people become un-creative? Look around on the net. There are people out there who have no programming skills, who dig around in the games they bought and modify them, create their own levels, use existing game engines to create a game about their favorite book or TV show. Heck, there are people out there who create a 3D game engine in Excel.
If there is one thing we can learn, it is that Creativity Finds a Way.
HyperCard was designed in the late 1980s, for the hardware of the time, for what very smart people then thought the future would look like. Being creative with a computer, at the time, meant writing code. So they gave us a better programming language, and a "Link to..." button that generated the code to change pages. Not unlike how Henry Ford's competitors would have built you a better horse, but not a car.
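That "Link to..." workflow wrote ordinary HyperTalk into the button's script. From memory, and hedged as a sketch (the card id here is invented for illustration), the generated handler looked roughly like this:

```
on mouseUp
  -- jump to the destination card picked in the Link dialog
  go to card id 3076
end mouseUp
```

That one handler was the whole "link", and editing it by hand was the gentle slope that led from clicking buttons into real programming.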
Yes, I am saying that the developers of HyperCard didn't quite anticipate the future correctly. They didn't anticipate the internet, for example. That's not a shame. It was '87 back then. I didn't get what the internet would be good for in '91. I probably wouldn't even have managed to invent a better horse. But anyway, all I am saying is that HyperCard's creators didn't know some things we know now, and probably made some compromises that wouldn't make sense now.
The world has changed: this is 2011! All our programs do so much more. You can create 3D graphs in Excel, colorful drawings and animations in Keynote, and upload it all to the web with Sandvox. So many tools are available for such low prices. Why would you bother with a low-level, rudimentary tool like HyperCard when all you want to make is a movie with some branching?
A new tool for a new world
After all that, it might surprise you that I still agree with everyone in the comments who says that we need a new HyperCard for the 2010s. However, I do not agree that any of the examples the commenters mentioned (or even HyperCard as it shipped) are this program. Yes, Xcode and the NeXT-descended dev tools, and VB and others, use Rapid Application Development-style drag-and-drop manipulation to lay out your UI. But guess what? So does Pages.
Yes, you can use Ruby and Python and Smalltalk to branch between different choices. Or you could just use links to move between web pages built using Sandvox.
Yes, you can build real, runnable applications from your work with Java or AppleScript. But why would anyone want to build an application? Movies can be uploaded to YouTube, web sites can be built with WordPress, and I don't have to transfer huge files to users. I just send my friends the link, and they know what to do. There's no installer.
Our computing world has become so much richer, so much easier, that it is more efficient, and actually smarter, to just create your stuff with the tools we old HyperCarders see as dumb. Their users stand on the shoulders of giants and spend their time creating the best possible gameplay instead of coding yet another 3D renderer. That is why HyperCard 2.4 just won't cut it, or as David Stevens commented on that very same article:
most people get on a train to go somewhere, not because they really want to lay track, which explains the shortage of track laying machines in yard sales, and the demise of HyperCard.
The new HyperCard won't be like HyperCard. Maybe the web is enough. Maybe it will just be a good "web editor", like the one that used to be included in every copy of Netscape back in the day.
Or maybe it will just be a niche product aimed at people who find that they want to do more than their tools let them. These will not be the typical movie-makers, podcasters or writers. Like the directors, radio hosts and journalists of the generations before them, those will specialize. They will be exceptional at directing, making a show or researching the truth. But they will not care how the camera transports the film, how their voice is broadcast as radio waves and re-assembled in the receiver, nor how to build a printing press.
The person a new HyperCard is aimed at will be someone like you, who saw HyperCard and at some point stood up and said: this is not enough. I want to create more. And then maybe went out and bought CompileIt!, which let her use the system APIs from the comfort of her HyperCard stack, only dealing with the scary memory management stuff when absolutely necessary. And then went and bought MPW, or THINK C, or CodeWarrior, or Xcode.
A real programmer doesn't program because she wants to use HyperCard. A real programmer programs because she wants to. Because she just has to. A real programmer doesn't limit herself to that one language and IDE she learned once so she never has to learn anything else. A real programmer learns new programming languages because each language teaches her a new way of looking at or solving a problem. A real programmer has been programming since she opened PowerPoint for the first time. She will find a way.
It was like that back in the days of HyperCard. Why shouldn't it be like that again?