> (the Mac was done in 1984, 11 years after the more powerful and capable Parc Alto, which started working in 1973).
I find this comment mildly irritating. The Alto apparently cost $40k [1] in (or around) 1973, which was equivalent to $95k in 1984 (or $232k today). Compare with the Mac which cost $2.5k in 1984.
Sure, the Alto may have been more powerful, but if it was financially out of reach of almost everybody, it's no wonder it had a limited direct impact.
I wonder if it's a trend that Alan underestimates the importance of economics for technology to actually have an impact.
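The inflation comparison above is just a ratio of price indices. As a rough sketch (the CPI figures below are my own approximations of annual averages, not values from the comment):

```python
# Rough CPI adjustment of the Alto's ~$40k 1973 price.
# Approximate CPI-U annual averages (my assumption): 1973 ~44.4, 1984 ~103.9, 2018 ~251.1.
def adjust(amount, cpi_from, cpi_to):
    """Scale a dollar amount by the ratio of two CPI index values."""
    return amount * cpi_to / cpi_from

alto_1973 = 40_000
in_1984 = adjust(alto_1973, 44.4, 103.9)   # roughly $94k
in_2018 = adjust(alto_1973, 44.4, 251.1)   # roughly $226k
```

With these index values the result lands close to the $95k / $232k figures quoted above; the small differences come from which months and index series you pick.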
We understood Moore's Law, so it made sense in the early 70s to make a personal computer in 1973 that would have the power of much less expensive personal computers 10 or more years later. Why? Because the new SW and UI takes a lot of work to invent -- this allowed us to show Steve "the 80s" in 1979.
(Worth pondering this way of going about things.)
What was disappointing is that the market couldn't value personal computing, especially the general market. For example, the Lisa with its hard disk was really the better machine to be the flagship for Apple. In the early 80s it was priced at less than an average car -- ~$10K -- a mass market price if you could see that this would be your "information and speculation vehicle".
Instead, PCs could only be sold at consumer price levels, more or less as novelties and in business as spreadsheet machines -- and this forced the Mac to be much smaller (both in RAM and disk) so that it was more like a toy than a vehicle.
So, no, we (not I - I was part of a research community) did not underestimate the importance of economics. Note that my original Dynabook paper in 1972 said these would eventually be sold for the same price as color TV sets. But we did underestimate the ability of the early market to value computing.
> In the early 80s it was priced at less than an average car -- ~$10K -- a mass market price if you could see that this would be your "information and speculation vehicle".
A car is a major purchase, and people keep them for years or get significant resale value out of them. Moore's law would mean the Lisa would be rapidly devalued. For businesses that could justify it by the revenue it generated over that time, it could make sense. For individuals at the time, I'm not sure.
That's not to say people don't spend extravagant amounts on cars beyond the base level of needed functionality. I just can't think of too many purchases at that level that get obsolete so quickly.
The economy in the early 80s was in pretty bad shape as far as the average household was concerned, and computing wasn't a "thing" yet. There wasn't yet an internet, not in anything resembling the modern sense, and there were nearly no services designed to be accessed from a computer.
So for the majority of the consumer market, computers just didn't really offer $10,000 of value.
The business market was maybe a little bit different, but still, businesses suffered badly through the early 80s too, and a computer would be an R&D expense more so than a business expense. It would be a very long time before computers could do things beneficial enough to businesses to justify high upfront costs (plus the costs of finding or training anybody to operate the things). And computers represented quite a big change in operations, and as anyone who's done any kind of business consulting has learned first-hand, businesses hate that.
With all due respect to the people who are much smarter than I am, the folks that invent and pioneer technology so often view it through a very different lens than the people who would use it, and they really struggle to bring the technology to larger society. Dean Kamen is my favorite example of a brilliant guy creating amazing technology that nobody can afford or find the right use for. Someday somebody else might come along and give the world the same thing but much cheaper and in a different package, and it'll change a lot of lives, and there will as usual be people to say, "But Dean Kamen invented it first and did it better." Well, sure, but nobody knew about it.
Despite the flaws that his detractors are always eager to bring up, that was something I respected about Jobs (and a few other industrialists) -- the combined ability to understand new technology, visualize how it could be brought to a broad market, and then do it successfully.
It's kinda fun to imagine how things might've worked out if Apple had never visited Xerox. Maybe SGI would do it instead and Apple would be that company that made the boxes we all played Oregon Trail on. But, I'm pretty sure we wouldn't be using a Xerox OS.
>that was something I respected about Jobs (and a few other industrialists) -- the combined ability to understand new technology, visualize how it could be brought to a broad market, and then do it successfully
Jobs participated in a Q&A at the MIT Sloan School of Management in 1992, and one of the topics he discussed was the marketing pivots needed for the Mac and NeXT once he figured out what real-world tasks those platforms excelled at that their competition did not (desktop publishing and enterprise rapid application development, respectively).
Video of the talk has recently been made available.
Moore’s Law was working fine. What technologists could not predict was a Republican president imposing protectionist tariffs that made advanced computers prohibitively expensive in the mid-1980s.
Although consumerism was already in full effect in the 80s, people hadn't been conditioned into anything like the crazy update cycle we're in nowadays. I suppose the business world may have been different (seems unlikely?), but at least in home/personal computing people weren't buying new computers every 2-3 years - people did keep them for years. Maybe technically they'd have been devalued, but if you're still using an item and it's doing its job, the resale value isn't a consideration (in the late 90s it was still common for people in e.g. sales offices to be using thin terminals with green-screen monitors, despite Windows PCs being readily available).
I was able to buy an Apple ][+ with my high school job at a supermarket. The Lisa was amazing (when it came out a few years later) but obviously not affordable by me. ;) Nor the Macintosh when I was on military pay a few years after that. Maybe I wasn't their target market by then. :(
But I don't think the Lisa was right for the corporate market either - since it didn't have connectivity to the firm's mainframe or midrange. On the Lisa, you had a mouse port, parallel port, and two serial ports. The serial ports could have worked, but IIRC the SNA network stack needed wasn't on the Lisa (and Apple would have been quite hostile to IBM if they'd tried to add it). So firms would have had to share data via sneakernet, and that was the same thing they'd been doing with their Apple ]['s for several years now -- so the 10x price difference wasn't worth it, even if the UI was gorgeous.
There was also the matter of the installed base of machines the company might have had. The Apple ][ used standard Shugart 5-1/4" mechanisms, but the Lisa (originally) used their new FileWare "Twiggy" drives which placed the top/bottom heads on opposite ends of the diskette. So information couldn't be shared between Lisa and Apple ][ users.
There were machines like the Commodore C64 - and later to some extent the Amiga as well - that more fully embraced and valued computing in that they booted into an environment which allowed you to create programs right away.
With the C64 in particular there was no real distinction between the operating system and the programming language, the programming language actually WAS the operating system. The machine booted into an empty canvas for you to create something with.
I still find that idea fascinating. Absurdly enough, the closest we've come to this again afterwards is Microsoft Excel. A spreadsheet application today is the closest general purpose computing equivalent to 'an empty canvas everyone can use to create something with right away'.
Arguably, modern Windows has more programming abilities preinstalled with the OS: batch scripting, PowerShell, JavaScript (command line and browser), and C# (the csc.exe compiler).
The difference is one of visibility and discoverability. The '70s and '80s were the time when "user" equated to "programmer", mostly by technical necessity. The separation between those two concepts came later, and it's evidenced by the way modern Windows hides the programming languages it bundles.
My memory on this is super vague because this was way back in the mists of time, but I think some old versions of DOS used to have a BASIC interpreter built in, where you could just start writing BASIC code at the DOS prompt.
I was disappointed that something better than Smalltalk wasn't on the Mac and iPhone (Smalltalk was truly wonderful in the context of the 70s, but we considered it just a step in a good direction).
Not understanding Hypercard was one of Apple's largest mistakes in the world of end-users. It was a real breakthrough in something that end-users could really handle and be usefully programmable by them. Besides not understanding its significance on the Mac, we (old Parc hands) pleaded until we were blue in the face to make HC the basis of a really good web browser (it was a great model of a symmetric author-consumer media tool). Missing the latter was a tragedy.
In the light of the first comment, we could then contemplate an end-user system that combined what was great about Hypercard, Smalltalk, and some other experience from the 80s (e.g. the use-cases from Ashton-Tate "Framework", etc.).
Inspired by HyperCard, we (old Sun NeWS hands) also pleaded until we were blue in the face to make HyperLook (a NeWS/PostScript/network based reinterpretation of HyperCard) the window manager / editable scriptable desktop environment!
>This is a discussion of ICCCM Window Management for X11/NeWS. One of the horrible problems of X11/NeWS was window management. The X people wanted to wrap NeWS windows up in X frames (that is, OLWM). The NeWS people wanted to do it the other way around, and prototyped an ICCCM window manager in NeWS (mostly object oriented PostScript, and a tiny bit of C), that wrapped X windows up in NeWS window frames.
>Why wrap X windows in NeWS frames? Because NeWS is much better at window management than X. On the surface, it was easy to implement lots of cool features. But deeper, NeWS is capable of synchronizing input events much more reliably than X11, so it can manage the input focus perfectly, where asynchronous X11 window managers fall flat on their face by definition.
>Our next step (if you'll pardon the allusion) was to use HyperNeWS (renamed HyperLook, a graphical user interface system like HyperCard with PostScript) to implement a totally customizable X window manager!
>What is HyperLook? It's a user interface environment for NeWS, that was designed by Arthur van Hoff, at the Turing Institute in Glasgow. HyperLook was previously known as HyperNeWS, and GoodNeWS before that.
>Open windows with HyperLook
>HyperLook is an interactive application design system, that lets you develop advanced multimedia systems, via simple direct manipulation, property sheets, and object oriented programming. It releases the full power of OpenWindows to the whole spectrum of users, ranging from casual users who want a configurable desktop and handy presentation tools, to professional programmers who want to push the limits in interactive multimedia.
>You design interfaces by taking fully functional components from an object warehouse. You lay them out in your own window, configure them with menus and property sheets, define their appearance in colorful PostScript fonts and graphics, and write scripts to customize their behavior.
>You can write applications in C or other languages, that communicate with HyperLook by sending high level messages across the network. They need not worry about details like layout, look and feel, or fonts and colors. You can edit HyperLook applications while they're running, or deliver them in an uneditable runtime form.
>HyperLook is totally extensible and open ended. It comes with a toolkit of user interface classes, property sheets, and warehouses of pre-configured components with useful behavior.
> He brought the iPhone to me, put it in my hands, and asked: “Alan, is this good enough to be criticized?”. My reply was to make a shape with my hands the size of an iPad: “Steve, make it this size and you’ll rule the world”.
I find this especially telling after Apple's recent education-focused announcement[1], where Apple continues trying to push the iPad as an educational tool. Is this Tim Cook trying to continue Steve Jobs' vision of Alan Kay's vision? Or do Apple seriously believe the iPad as it is has a business chance in the cash-strapped education market?
My son's school has a Rubbermaid tub of iPads that they wheel from classroom to classroom on a cart as needed.
He was talking today about playing games on the PCs in the computer lab, and my wife asked about the iPads, and you know what he said? "Those are for doing work."
The most common place for me to see iPads these days is point-of-sale, everything from sales at small cafes to the front desks of car mechanics and childcare.
(The second-most common place is as a distraction device given to three-year-olds.)
I agree with your observations. I can't speak for Kay, but I think his comment to Jobs about the size was based on the premise that the iPad would be a production device.
The percentage of users that use computing devices primarily for something other than content consumption is pretty small. I actually think an argument could be made that phones are used more for productive tasks than tablets.
But the deep point of the inventions of personal computing and the Internet was not primarily to make old media more convenient, but to "augment human intellect" by making the next qualitative media after writing and the printing press. The commercialization of these went after the lowest simplest properties, and this hid from most people what the computer is really about.
Computing has augmented human intellect, but right now almost exclusively for scientists, engineers, the medical profession, and other professionals. The real computer revolution will happen when the general public are able to boost their own intellects internally and to boost their reach externally with the help of computers.
Getting fixated on a poor problem can mask the problems that need to be worked on. A big one is "real education". Another is "real governance and democracy".
An analogy is to the Kennedy moonshot, which set back space travel more than 50 years. Real space travel has to be done quite differently than just relying on chemical rocketry, and the better ways to do it were swamped by the "stunt" to this day.
> Computing has augmented human intellect, but right now almost exclusively for scientists, engineers, the medical profession, and other professionals. The real computer revolution will happen when the general public are able to boost their own intellects internally and to boost their reach externally with the help of computers.
Isn't this already well under way, considering people's day-to-day use of phones and tablets for google searches, wikipedia, calendars, maps, and various 'helper' apps (for meditation, productivity, project planning, etc.)?
Whereas both Steve Jobs and Alan Kay believed in the Dynabook, I find it interesting how starkly different their views are on the stylus. One of the first things Alan Kay did after seeing the iPad was to try using a stylus [0], whereas Steve Jobs was heavily anti-stylus [0]. So I can imagine a fairly interesting back and forth over the topic. But Alan has only ever mentioned the following two sentences:
In another exchange, I pointed out to Steve that 2 year olds and 92 year olds use their fingers, but everyone else uses tools -- so it's crazy to only sell to fingers-usage just because the learning curve is lower.
Education is not about just reading, but regardless - you know what's better for reading than an iPad? A paper book. And if your concern is weight, you know what combines the best qualities of a paper book and a tablet? An e-book reader with e-Ink display.
(This is sad, really. As I understand, e-Ink tech is patented to death, so there's little progress there.)
Still, I'm not saying iPads are useless in general. The topic is the relative merit of the iPad vs. other possibilities in education. My claims are as follows:
- the iPad is a bad idea in education, partly because of the low capabilities of the device, and partly because there aren't many apps that make full use of the capabilities of the device in order to encourage learning (this is kind of similar to "edutainment" or "educational games" for PC - utter crap that only fools parents)
- given that, the iPad loses to a plethora of other tools available to teachers, most of which are low-tech
- neither of the two points above will prevent the iPad from succeeding in this market, as the hype is strong enough to overcome rational obstacles
--
Incidentally, I feel like I'm back in 2015. I thought iPads had already been sold to schools en masse, and it had already turned out to be a colossal waste of money. I'm pretty sure we already had these discussions on HN.
Good points, but if children resort to Android devices, I am not sure that makes me happy, given they fall prey to advertising and profiling.
An e-reader has some disadvantages compared to a tablet. For example, it does not have a capacitive screen, and does not have the ability to draw (low refresh rate) or use a stylus as input. It is a great tool for reading, and it does that better than any other device (if you agree with its advantage over a paper book, which is weight vs. amount of content), but not for, say, practice and exams.
Wow. As a highly educated person, I had no idea I was choosing inferior tools! Thanks for letting me know.
I gave all my nieces and nephews tablets, and they all used them, I hope I didn’t destroy their lives.
Seriously, I have a hard time believing you can quote any evidence that a paper book is much better than a tablet for everyone. Which appears to be your assertion, although it’s hard to tell.
My assertion is in context of education. If you're going to use a tablet as an electronic version of paper books, then paper books are better, because:
- they're much faster to skim through and search through
- print has better resolution than screen
- you can actually make notes on them - I mean free-form notes, made in an efficient way
- you can pull out multiple books and compare them
- they're much nicer on your eyes
Note again, I'm talking about education. That's different from recreational reading of fiction, and more akin to reading technical books - you need the ability to skim, skip, quickly move around and cross-reference stuff separated from each other by many pages.
Ultimately, my point isn't that tablets are bad. Hell, I'm happy user of such devices, and they definitely help me read on the go (they beat Kindle for technical books, due to color and better workflow with PDFs). My point is that tablets get dumped on schools and are used to view textbooks, which is something at which they're worse than regular textbooks. Both teachers and publishers have little clue on how to use these devices to their full potential, and there doesn't even seem to be a movement to help them get that clue.
Not OP, but I can second this. When I need to understand a really hard topic that requires rapid context switching to check on details, nothing beats a book. Highlighters and Post-it bookmarks make it much faster to browse through than a PDF on my iPad.
I have a printed book that's the same age as the U.S. constitution (now 231 years) and another printed book that's a few decades older (293 years), both of which are still completely legible. "Decades" feels like a big understatement to me. :-)
We mostly think of paper as a poor archival medium because of a few decades, mainly at the end of the 19th century and the first half of the 20th century, when high-acid wood pulp paper was very widely used. (Newspapers are also sometimes printed on cheap acidic newsprint.) Acidic paper can become brittle within 20 years or so, or in any case much less than a century, although this also depends on many other factors. But both before and after this period, many books were printed on acid-free paper, and these are likely to survive for centuries, even with occasional use. The observations about paper's longevity in, for instance, Nicholson Baker's polemical Double Fold, tally well with my experience as the son of a bookdealer. My father routinely sells books that are over a century old to people who fully expect to read them.
And I remember visiting an antiquarian book fair where a first folio Shakespeare was for sale for millions of dollars. It was still possible to turn the pages with no damage and all the printing was completely legible, even though the book was already 380 years old or so.
Books on low-acid paper are really a very durable technology when treated with respect.
Assuming the file format is documented, bits also remain completely legible throughout time. Perhaps we haven't yet worked out a redundantly encoded, self-documenting file format, but eventually we will. At that point - assuming all relevant content won't be copyrighted to death for 1000 years, which is a big one - we will be able to have simultaneous, unfettered access to any content.
Also, books exist in only a few states, of which just one is useful: well preserved, wet, burnt, torn apart. That's how entropy works, folks! One advantage of books is that the information density is lower, so the loss of one specimen isn't that significant, unless it's unique.
A disk will also survive some local data corruption or loss, and can sometimes even recover it and provide warnings. Catastrophic failure of a disk is akin to total combustion of the book. Listen, can we stop this false-analogy debate? It's not advancing the discussion that much.
It includes my mother's chemistry book from before I was born, as well as a few books bought second hand.
I have maybe one disk from the time I was a teenager.
Yes, fire will destroy a book (although parts will often be recoverable using low tech methods).
The same fire will certainly destroy disks and make them unrecoverable for everyone but recovery labs (if they are recoverable at all.)
Summary: books fail slower than disks. When they fail they often fail gracefully and you can still use them and even repair them using low tech tools.
Disks fail more often. In fact, you rarely see working disks from the last decade. They fail for a number of reasons, and other times seemingly for no reason at all. When they fail they might have tried to warn you, but often they just fail with no warning - and unless you work in IT, you'll rarely see those warnings anyway. Also, after they've actually failed, you more likely than not need a recovery lab to get anything back.
I bought Heisenberg's quantum theory paper from a second-hand shop. It's 90 years old and in amazing condition, with just sun damage to the spine. The pages are sewn together.
I've digitized a bunch of 16th century books so I (and everyone else, they're out of copyright) can read them on a tablet or whatever. Added a search function, too.
I am suggesting that claiming that one thing is best for everything and everyone is not true.
Undergraduates mostly have textbooks which change every couple of years. Also, electronic books that don't have DRM don't die when the device does.
I was physics/math and was lucky that my textbooks didn't change that rapidly, but still, it's a bit odd to claim that paper is the obvious choice for everyone based on widely varying circumstances.
And you can carry thousands or millions of them in a single device
And you can add a new one to your collection without having to physically go anywhere to buy
And you can draw and make marks without damaging the content forever
And you can create bookmarks in multiple ebooks and have them all available at the same ___location, also with search functionality
And the device you use to read can be lighter than an actual book, at least it will always have the same weight regardless of how many ebooks you carry
As a starting point, can you quote any evidence that a tablet is better than a paper book? Both for reading and for more general educational activities?
I assume your family doesn't use their tablets with candycrush-like apps unlike some of my acquaintances. Which apps do they use to make it worth it?
I tend to assume that the burden of the proof of value is on the new tool, otherwise we just keep changing and nothing gets ever battle-tested. Paper, pen and other boring educational tools have drawbacks and benefits that are well-known; what about tablets?
>As a starting point, can you quote any evidence that a tablet is better than a paper book? Both for reading and for more general educational activities?
Have we regressed to the point where the simplest statement requires third-party ("scientific") references?
A tablet offers 100x the functionality of a paper book.
Even just in reading mode, it can carry 1000s of books, and one can take notes, highlight, search them, and have it remember one's position in each; it has its own light (adjustable to match ambient levels or go beyond them), can increase the font size (ever thought about those without perfect eyesight? Sucks to be them with paper books, huh?), and can even speak to the reader through TTS.
Not to mention that ebooks are much cheaper than paper books to boot (and that one can even pirate the former, if they are so inclined). Or that there are also 100,000s of FREE ebooks (classics, and such), but seldom (if ever, except for charity or so) free paper books.
Sorry I should have phrased it better than I have, I hope the following makes what I meant clearer.
I do agree with your statement that a tablet has more functionalities than any given book. Indeed, as you point out, a tablet can contain thousands of them.
What I tried and failed to express was that, in an educational setting, these functionalities are sometimes poorly used. This obviously depends on the teacher, if there is one.
It is not clear to me how teachers are supposed to use a tablet to enhance their classes, and in my (admittedly limited) experience, they are left to their own devices or with poor guidance to say the least. It could be that teachers more familiar with technology (beyond just using it) know what to do with it while teaching, but to which extent? I would have assumed that using tablets instead of books is at least as expensive; is this cost worth it in terms of gains?
In the Alan Kay interview which has already been linked in this thread[0]:
[quote]
I said, “If we’re gonna do a personal computer”—and that’s what I wanted PARC to do and that’s what we wound up doing [with the Alto]—”the children have to be completely full-fledged users of this thing.”
Think about what this means in the context of say, a Mac, an iPhone, an iPad. They aren’t full-fledged users. They’re just television watchers of different kinds.
[/quote]
That doesn't sound so promising in terms of education, unless I am missing something. From time to time, I see parents leaving their phone or tablet to their kid, who invariably plays some game like candy crush. How do you transform a device that was made for entertainment into an educational tool?
There have also been stories in the past about big tech leaders sending their children to "boring" schools without much in terms of technology, and mostly pen and paper, which suggests that better technology is not necessarily better for education.
Regarding the more limited scope of "just reading", I read a lot on my phone, but it is easier for me to get distracted since I can do so many things with a phone or a tablet. Granted it's the user's fault, but you don't need any self-discipline when using a simple book, which is at least one advantage.
I was expressing a personal anecdote which, as I said one comment up, I knew reflected a minority opinion. I wasn't claiming a universal truth. Sorry that I wasn't clear.
You’re right. But it’s not about making textbooks digital. It’s about bringing more interactive software into the mix. IMO a combination of text, video, and interactive applications would be much more effective than just reading a book. And for what it’s worth, textbooks are rarely “read” anyways. Subsections of chapters are used and often not in order. And that’s mixed with a plethora of other content which is still largely being managed in a manual and ad-hoc manner. Clearly we can do better than this.
E-readers, for example. The battery lasts far, far longer, they're a lot cheaper (at least 50% of the price), and probably lighter. Source: my own experience with two Kobo devices.
Are you a K-12 student? If not, why are you replying at all? Anyone who asserts that iPads are a good choice for education is completely out of touch at best, or ignorant at worst. I'm sorry if this borders on a personal attack, but it feels like you're just looking for a reason to be offended.
I had an iPad in high school and loved it, despite not having any apps for any of my school stuff. Just the browser and the form factor was enough to be a pretty good device. Note taking was the main thing I couldn’t do with it, and now it’s a great device for note taking. That is a big enough differentiator from Chromebooks to make it the best choice IMO.
But of course this is just based on my limited personal experience, I could be totally wrong.
I'm not offended, so if that's what you're reacting to, now you know. Instead, I was giving a personal anecdote in reaction to a statement which looked like an over-generalization.
My nieces and nephews were k12 when I gave them tablets.
Apparently I missed the news that tablets were terrible 100% of the time! Which appears to be what you're saying, but I can't really tell, what with all the names you called me. "Out of touch" or "ignorant", what a choice!
Perhaps we are looking at it from two different angles. It seems you may be looking at it from the "educational to have one" angle, while I'm looking at it from feasibility at an institutional level, which I thought the OP was referring to.
I know nothing about iPads, so I'm not questioning it as a device. What I'm questioning is why anyone thinks it is realistic for schools to spend the money on them, vs. similar options at a fraction of the cost. Teachers in multiple states are walking out over pay, right now. Teachers in most states are underpaid. Most students are on free lunches. Where does the money come from to buy kids an iPad? It seems the state doesn't have it, nor the parents.
Why do you find that irritating? The Alto wasn't built for the mass market -- Xerox produced only a few thousand units, used mostly internally. It was also developed before all the necessary, affordable tech and parts existed that later made far more affordable copycats (ahem) possible. The Star was Xerox's first commercial product aimed at the mass (business) market -- but, like the Lisa which came later, it didn't succeed because of its high cost.
> I find this comment mildly irritating. The Alto apparently cost $40k [1] in (or around) 1973, which was equivalent to $95k in 1984 (or $232k today). Compare with the Mac which cost $2.5k in 1984.
Sorry, but what is really irritating is the comparison you are making. 1973 component prices are in no way comparable to 1984 component prices, and Xerox's target market and desired margins were completely different from Apple's. If you actually look at the commercial machines that Xerox developed after the Alto prototypes, you find things like the Star, that came out for $16,000 in 1981. Compare that to the Apple Lisa, which came out for $10,000 in 1983. And it is not really a fair comparison, because the Star had much better hardware in every respect. What would be a fair comparison to the Lisa in terms of hardware capabilities would be the 1978 NoteTaker, built with off-the-shelf components for about $15,000 (my estimate: https://oneofus.la/have-emacs-will-hack/2012-01-11-the-perso...). The people at Xerox were perfectly capable of designing and building low-cost computers with third party components. Xerox never entered the low-cost personal computer market. In the early 1980s Xerox sold high-margin workstations to large corporate clients, and in terms of price/performance their offerings were very competitive.
If you want to say that not selling cheap mass-market products has "limited direct impact," that is true, but is also not a very insightful observation.
> The people at Xerox were perfectly capable of designing and building low-cost computers with third party components. Xerox never entered the low-cost personal computer market. In the early 1980s Xerox sold high-margin workstations to large corporate clients, and in terms of price/performance their offerings were very competitive.
This was a rather plain CP/M computer released almost at the same time as the Star. Pretty sad when you think how much better it would have been for them to just make the three-year-old NoteTaker with no changes instead.
It's not made very gracefully, but the point is that the Mac wasn't a big leap forward in capability, it was the consumer version of the thing they had working a decade earlier.
Of course it's an open question what sort of computers Apple would have made if Jobs hadn't visited PARC.
It's no less an open question, but I think there's a good shot that Jef Raskin's original concept for the Macintosh would have made it to market under Apple's banner.
Possibly the best look we'll ever get at the type of Human-Computer Interface he wanted to build is written down in "The Humane Interface", although I believe (correct me if I'm wrong) he did develop a system for Canon that implemented some of his designs.
I felt the same, but if I were him and as self aware as he is as to his impact, it'd be hard for me to unwind the BetaMaxing of 'my' work, emotionally. (Economics aside.)
Based on other writings of his, which I've also enjoyed, he doesn't seem shy to throw in these jabs.
A very important question! I don't know the answer, but I think it would be something that included FPGAs -- Intel has been making interesting hybrid chips that combine an ARM variant with a lot of FPGA real estate, and that would likely be a good basis.
What are your thoughts on the home-grown effort to produce small, portable .. and open .. computing devices, as represented by such as the Pyra Handheld and so on?
Earlier iterations of the machines supported by this community utilized ARM-based CPUs that had smaller cores onboard that could be repurposed for other things .. audio and video codecs, 3d engines and so on .. maybe you have yet to see the Pyra/Pandora classes of devices?
The physical nature of the device needs to match up to human abilities and needs: so I always ask questions about visual angle, resolution, pointing, drawing, typing, extensions of the hardware (especially via software), capacity, speeds that match to human nervous system, etc.
The software nature of the device also needs to match up to human abilities and needs: so I always ask questions about the GUI, how does it help in learning itself and other things, what kinds of things can I see and contrast and compare, what kinds of things can I do by programming, what kinds of things can kids do by programming.
"We further showed that certain variants of DOT’s recursive self types can be integrated successfully while keeping the calculus strongly normalizing. This result is surprising, as traditional recursive types are known to make a language Turing-complete."
"What is the minimum required change to achieve Turing-completeness? Consistent with our expectations from traditional models of recursive types, we demonstrate that recursive type values are enough to encode diverging computation."
"But surprisingly, with only non-recursive type values via rule (TTYP), we can still add recursive self types to the calculus and maintain strong normalization."
"Can we still do anything useful with recursive self types if the creation of proper recursive type values is prohibited? Even in this setting, recursive self types enable a certain degree of F-bounded quantification, ..."
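The "F-bounded quantification" mentioned in that last quote is the same pattern Java's `java.lang.Comparable` relies on: a type parameter whose bound mentions the parameter itself. A minimal sketch, not taken from the paper (the `Ordered` and `Money` names are just illustrations):

```java
// F-bounded quantification: the type parameter T is constrained by a
// bound that refers to T itself ("T extends Ordered<T>").
interface Ordered<T extends Ordered<T>> {
    int compareTo(T that);
}

// A concrete type closes the recursion: Money implements Ordered<Money>,
// so compareTo only accepts other Money values, checked statically.
class Money implements Ordered<Money> {
    final long cents;
    Money(long cents) { this.cents = cents; }
    public int compareTo(Money that) {
        return Long.compare(this.cents, that.cents);
    }
}
```

The bound rules out comparing a `Money` to some unrelated `Ordered` type at compile time, which is the kind of expressiveness the paper says survives even without proper recursive type values.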
4. Software (UX) & hardware (again: UX) elements/modular components that scale using morphic numbers, i.e. the golden ratio and the plastic number (whose square Donald E. Knuth likes to refer to as "High Phi". He even suggested a special mark for it, which nobody ever bothered to include in Unicode): https://en.wikipedia.org/wiki/Plastic_number
6. Spatial Analytic Interfaces: https://mspace.lib.umanitoba.ca/handle/1993/31595 Just take a look at the figures to see where this goes. Incidentally, this kind of user interface lends itself well for Dataflow programming...
9. Xanadu, Zettelkasten, Smalltalk, Lisp Machine, Plan 9, Inferno etc. related hyperlink crowd/mesh/cloud/grid/whatever hijinks, both for social and for programmatic cooperation. I'll not link anything on that here, since, well, kind of pointless to feed Alan Kay the kool-aid he in part came up with long ago. ;)
And then there exist a million other things that should go here, too, but for which I lack the time to type them out - I already consider this far, far too terse to make any sense to someone without A LOT of preexisting knowledge.
Steve revealed at D10 that the iPad actually did get prototyped first. IIRC, he tasked a team with creating a keyless-keyboard tablet thing (forgive my terrible paraphrasing). They came back with a tablet. When Steve scrolled up and saw the rubber-band effect, he said "oh my god we can make a phone with this" and they went phone-first.
Edit: Below is the interview. He "had an idea of being able to type on a multi touch glass display"
"This guy badgered me about how Microsoft was going to completely change the world with this tablet PC software and eliminate all notebook computers, and Apple ought to license his Microsoft software. But he was doing the device all wrong. It had a stylus. As soon as you have a stylus, you're dead. This dinner was like the tenth time he talked to me about it, and I was so sick of it that I came home and said, "Fuck it, let's show him what a tablet can really be."
Jobs went into the office the next day, gathered his team, and said, "I want to make a tablet, and it can’t have a keyboard or a stylus."
I always find the “make it this size and you’ll rule the world” quote interesting. The users I know who really love their iPads/tablets are closer to @alankay1’s generation. Younger generations don’t have much use for device that can’t fit in a pocket. I wonder how much the interest in iPad-sized devices came from years of using paper about that size.
Fitting in a pocket is a great advantage for a portable device, to everyone.
The market seems to be saying: if I need something that I have to lug around, I might as well have a keyboard. i.e. a laptop.
Another effect is people are doing everything they can on a phone, because of the convenience. This leads to slightly larger phones every year, and now a 5.5" phone is no longer considered a "phablet", and Apple is rumoured to have a 6.5" phone next (their smallest tablet is 7").
So... tablets will rule the world, but they'll be "phones". The barrier to adoption was changing consumer behaviour... as always.
iOS and Android turn tablets into oversized phones, so no surprise they lose against phones - they have the same (or usually worse, at a given price point) capabilities while being larger, thus less convenient to carry and more fragile.
These days I'm not using Windows much on the desktop (HN took over the time I had for games), but I do sincerely hope this OS will live on for at least a couple more years, because Microsoft is the only company that knows how to do tablets. Seriously. A proper operating system, with real capabilities and a proper user interface that can make use of the tablet form, makes a world of difference. And I say that as someone who owned a decent Android tablet in the past. As someone whose friends and family currently own decent Android tablets. My little Windows 10 el-cheapo tablet/netbook I picked up second-hand for ~$130 is much more useful than the top-of-the-line Samsung tablet offering. Hell, it already earned me many times its own value back, as it's good enough to do some meaningful dev work on the bus/train.
Windows did try the same as Android and iOS with Windows RT, thankfully that was a disaster. Netbooks are ridiculously useful, I used to have a 15 minute bus ride to work with an Asus EEE and would manage to fill that 15 minutes with active moonlighting development time every day. The work I did on that bus became the frontend for what now 10 years later is a $50m company.
I realise I'm making a slightly different point as I'm purely talking about netbooks/laptops. But having the full power of an OS on a tablet is definitely the way to go.
Certainly that's one of the bad points about iOS and Android they are so locked down that you have to jump through hoops if you wanted to use them as a work machine.
For my current job I bought myself a $350 refurbished Thinkpad (T430, 8GB, SSD, Core i5), and it brings in all my income. Compare this to my nephew, who is paying $1000 for an iPhone X because he got bored of his iPhone 8.
Further to that, I bought exactly the same spec Thinkpad for my 5-year-old daughter. The Thinkpad T-series are great because you can pour a litre of liquid over them without problem [0], plus they're built like a brick, so basically perfect for kids. My daughter immediately covered the grey brick with shiny stickers and gave it a name (she has an iPad, but that's always just called 'iPad'). It has the full capability to do everything she'll ever need, in theory for the rest of her life/career. Further to that, it's got Ubuntu installed and I can then install Sugar [1] for her to use (the same used for One Laptop Per Child).
The possible drawback is that it doesn't have a touchscreen. But with my experiment of buying a laptop with a touch screen, I found I pretty much never wanted to use the touch screen; it's a slower interface than keyboard and mouse. You want the screen in front of you at arm's length, but then you have to reach with your arm to touch the screen.
I can now teach her over the years what it means to have real freedom with your software and hardware.
Sadly, netbooks seem to be gone. Even chromebooks are full-size, full-weight.
Do you know of termux.com, for Linux without rooting? It's basically a Linux distribution, adjusted for Android's slightly odd filesystem. The app itself is tiny, and you install from its repository -- e.g. there's LaTeX, youtube-dl, curl, vim, clang, ecj, python, ruby, perl, lua, erlang etc.
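A sketch of the workflow just described, using Termux's `pkg` front-end (a wrapper over apt; exact package availability may vary by Termux version):

```shell
# In the Termux app's shell -- no root required.
pkg update               # refresh Termux's package index
pkg install vim curl python   # install tools into Termux's own prefix
python --version         # binaries live under Termux's prefix, not /usr
```

Everything installs into Termux's private app directory, which is how it sidesteps the need for root on Android's locked-down filesystem.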
A bluetooth keyboard completes the DIY linux netbook... sadly, bluetooth keyboards are also almost gone, presumably as people adjust themselves to "touch" typing.
I have both a recent iPad and a Dell 2-in-1. For "consumption" I find my iPad far more usable. The 4:3 screen is preferable to the 16:9 screen on my laptop, and of course it's a lot lighter. The battery life of my iPad is amazing.
For the occasional typing, I use my same mini Bluetooth keyboard for my computer and my iPad.
I had an original iPad bought in mid 2011 but it didn't get much use - it was no more than a "big iPhone".
But a lot changed since then.
- Cellular connectivity is a lot cheaper. I pay $20 a month for unlimited data with T-Mobile.
- I got rid of cable and use Netflix/Hulu/DirecTv and Plex.
- PluralSight is available and I use that for watching technical videos.
- The screen resolution is a lot better for ebooks.
- better productivity software from Apple, Microsoft and Google.
- iOS has made iPad specific improvements - split screen, picture in picture, and drag and drop.
I've been using an iPad mini with Cellular as my phone-replacement mobile device since the day iPad mini came out. It fits in my pocket (not as comfortably as a phone, but well enough).
My only other device is a 15" MBP.
To be more accurate, the iPad mini is 7.9", not 7" though. And 4:3 aspect ratio. So in area, it's still much bigger than a 6.5" 16+:9 phone would be.
I know students who have written essays, put together presentations, etc solely on their phones.
If you really believe that the only thing “younger generations” do is post on social media, I suggest you spend more time around high schoolers (for example), particularly those with limited financial means.
Limited financial means is the key here. A decent smartphone costs as much as a lower-end computer; for kids, there's a good chance it was an either-or proposition in terms of which device they get to own (instead of sharing with family).
Reminds me of a friend who, a few years ago, was a teenager of limited financial means. He earned his first real $50 through a phone. His laptop broke, and he didn't have anyone to borrow one from in time, so he ssh'd into a remote machine from his phone and did the work (some data-wrangling task) that way. Where there's a will, there's a way.
But still, that in no way suggests phones are good at text editing or productive work. They're capable enough, if you have lots of will and no other option.
Until Mike Nesmith's mother changed the world :) But even then, trying to realign using that red line hovering half an inch off the surface of the paper made the mistakes obvious :/
I phrased my comment specifically trying to avoid sounding like I was saying that's the only thing they do. Guess I didn't do it right.
It's awesome that those with limited funds can still get stuff done, and phones are great for consuming information, but I don't think there's much argument against the efficiency gain that hardware keyboards, mice and larger screens with multiple windows give when you get into things that require fine tuning and multi-tasking like drawing, coding and video editing.
It read fine to me. I didn't read any implication that young people don't do anything more complicated than social media posts -- only that when they do, they could use a larger device.
(sorry I know posts like this don't make interesting reading, I just know how frustrating it can be when you put effort into writing something to avoid being misunderstood but still get taken the wrong way... so wanted to point out that at least for this reader the negative connotation was not there)
One of my students submitted an essay today entitled "On 'Democracy' and How Humans Lie" which was written entirely on their phone about inequality in the US
It has been, it's called Microsoft Surface. It's sized for creating things and is running on the OS made for productive work. But it's expensive and made by a company that's not very much liked by people.
That's an excellent point. It's funny how Microsoft has a reputation for making poor copies of things, but that hasn't really been the case for 20 years. They're often the ones to innovate now. Sadly, they don't seem to be able to get the traction with their innovations that they once did with their copies.
In Facebook's case, it was famously harvard.edu. A lot of people want to belong to or participate in exclusive clubs. Something other people are widely excluded from. The commons is Answers.com or Yahoo Answers. The people writing junk and spam questions & answers on Yahoo Answers, are not participating on Quora.
In Quora's case, it was that it originated / spread out of prestigious tech circles. A small group of connected, rather elite, techies out of Silicon Valley seeded the site in its early days. Those people were connected to other prestigious tech people, and so it went.
Adam D'Angelo is one of Quora's two founders, the former CTO of Facebook, and a near-billionaire who went to Exeter with Zuckerberg and then CIT. Elite + connected.
Then last but not least, Quora enforces fairly high standards on quality, similar to why Stack Overflow isn't overflowing with trash despite its immense scale.
> similar to why Stack Overflow isn't overflowing with trash despite its immense scale
Oh boy. You haven't checked out the newest questions page on SO, have you? There are oh so many new low-quality questions.
That said, they're mostly so badly written that they rarely show up on Google. But if you're actually trying to answer questions, then good luck finding any worth answering.
I simply gave up answering questions unless I stumble upon them while googling.
The entire Fast Company interview is gold, but a few quotes on the iPhone:
"Think about this. How stupid is this? It’s about as stupid as you can get. But how successful is the iPhone? It’s about as successful as you can get, so that matches you up with something that is the logical equivalent of television in our time."
...
"Yeah. We can eliminate the learning curve for reading by getting rid of reading and going to recordings. That’s basically what they’re doing: Basically, let’s revert back to a pre-tool time."
> "Have I got a deal for you: a Honda with a one-quart gas tank!". Steve did not like this memo, but what could he do given the history, and that it was quite true?"
Can someone explain the joke?
Also does anyone know how long Alan was with Apple?
Honda makes cars for everyone. Macintosh was a computer for the rest of us (everyone). The first Mac was short on memory (128K) for the type of apps it was meant to run. Hence the "not enough gas."
Faster computers have increased everyone's free time, which in turn gets used to eat other people's free time. Does that mean computers will never increase our free time? (Or increase it, only to burn it up on pointless video games and ad watching?) The ultimate benefit, I think, is that the millions using computers for pointless things are still investing their time and money in them, effectively crowdsourcing technology advances that otherwise would not be possible. So on that view, kudos to cat videos and ads. If that is what it takes to move humankind forward, so be it.
Doing the iPhone first helped them get the mobile OS right. Microsoft had a tablet for a decade and its OS was horrible -- a cluttered version of shrunken Windows. If Apple had shrunk the Mac OS into a tablet interface, that would have been awful too.
[1] http://www.computerhistory.org/revolution/input-output/14/34...