Alan Kay

Alan Curtis Kay (born 17 May 1940) is an American computer scientist best known for his pioneering work on object-oriented programming and windowing graphical user interface design.

Quotes

1970s

  • [Computing] is just a fabulous place for that, because it's a place where you don't have to be a Ph.D. or anything else. It's a place where you can still be an artisan. People are willing to pay you if you're any good at all, and you have plenty of time for screwing around.

1980s

  • Technology is anything that wasn't around when you were born.
    • Hong Kong press conference in the late 1980s
  • The future is not laid out on a track. It is something that we can decide, and to the extent that we do not violate any known laws of the universe, we can probably make it work the way that we want to.
    • From Alan Kay's 1984 paper "Inventing the Future", which appears in The AI Business: The Commercial Uses of Artificial Intelligence, edited by Patrick Henry Winston and Karen Prendergast. As quoted by Eugene Wallingford in a post entitled "Alan Kay's Talks at OOPSLA" (November 6, 2004) on the Computer Science website of the University of Northern Iowa.

2000s

  • If you don't fail at least 90 percent of the time, you're not aiming high enough.
  • By the time I got to school, I had already read a couple hundred books. I knew in the first grade that they were lying to me because I had already been exposed to other points of view. School is basically about one point of view — the one the teacher has or the textbooks have.
  • Simple things should be simple, complex things should be possible.

A Conversation with Alan Kay, 2004–05

  • Perl is another example of filling a tiny, short-term need, and then being a real problem in the longer term. Basically, a lot of the problems that computing has had in the last 25 years comes from systems where the designers were trying to fix some short-term thing and didn't think about whether the idea would scale if it were adopted. There should be a half-life on software so old software just melts away over 10 or 15 years.
  • Basic would never have surfaced because there was always a language better than Basic for that purpose. That language was Joss, which predated Basic and was beautiful. But Basic happened to be on a GE timesharing system that was done by Dartmouth, and when GE decided to franchise that, it started spreading Basic around just because it was there, not because it had any intrinsic merits whatsoever.
  • Computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were. So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.
  • Most creativity is a transition from one context into another where things are more surprising. There's an element of surprise, and especially in science, there is often laughter that goes along with the “Aha.” Art also has this element. Our job is to remind us that there are more contexts than the one that we're in — the one that we think is reality.

2010s

  • However, I am no big fan of Smalltalk either, even though it compares very favourably with most programming systems today (I don't like any of them, and I don't think any of them are suitable for the real programming problems of today, whether for systems or for end-users).
  • The Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.