Quotes about computers

“One form of math denial is the belief in the ability to make computers that prevent copyright infringement. Computers only ever work by making copies: restricting copying on the internet is like restricting wetness in water.”

Cory Doctorow (1971) Canadian-British blogger, journalist, and science fiction author

"The FBI wants a backdoor only it can use – but wanting it doesn't make it possible" http://theguardian.com/technology/2016/feb/24/the-fbi-wants-a-backdoor-only-it-can-use-but-wanting-it-doesnt-make-it-possible in The Guardian (24 February 2016)

“It is time to unmask the computing community as a Secret Society for the Creation and Preservation of Artificial Complexity.”

Edsger W. Dijkstra (1930–2002) Dutch computer scientist

Dijkstra (1996) "The next fifty years" https://www.cs.utexas.edu/users/EWD/transcriptions/EWD12xx/EWD1243a.html (EWD 1243a).
1990s

“I may be a huge computer nerd, but even so I don't think education should be about computers. Not as a subject, and not as a classroom resource either.”

Linus Torvalds (1969) Finnish-American software engineer and hacker

Interview with Sam Varghese, iTWire (15 September 2014) https://www.itwire.com/business-it-news/open-source/65402-torvalds-says-he-has-no-strong-opinions-on-systemd, retrieved 20 July 2018
2010s, 2014

“If you need to appear on an internet list to know whether you're someone's friend, you may have problems a computer can't solve.”

Merlin Mann (1966) American blogger

meatrobot http://meatrobot.org.uk/post/47885354/if-you-need-to-appear-on-an-internet-list-to-know
Tweeting as @hotdogsladies

“Ted Nelson: Computer Lib/Dream Machines. Self-published, 1974; revised 1987.”

Ted Nelson (1937) American information technologist, philosopher, and sociologist; coined the terms "hypertext" and "hypermedia"

“I have grown accustomed to the disrespect expressed by some of the participants for their colleagues in the other disciplines. "Why, Dan," ask the people in artificial intelligence, "do you waste your time conferring with those neuroscientists? They wave their hands about 'information processing' and worry about where it happens, and which neurotransmitters are involved, but they haven't a clue about the computational requirements of higher cognitive functions." "Why," ask the neuroscientists, "do you waste your time on the fantasies of artificial intelligence? They just invent whatever machinery they want, and say unpardonably ignorant things about the brain." The cognitive psychologists, meanwhile, are accused of concocting models with neither biological plausibility nor proven computational powers; the anthropologists wouldn't know a model if they saw one, and the philosophers, as we all know, just take in each other's laundry, warning about confusions they themselves have created, in an arena bereft of both data and empirically testable theories. With so many idiots working on the problem, no wonder consciousness is still a mystery. All these charges are true, and more besides, but I have yet to encounter any idiots. Mostly the theorists I have drawn from strike me as very smart people – even brilliant people, with the arrogance and impatience that often comes with brilliance – but with limited perspectives and agendas, trying to make progress on the hard problems by taking whatever shortcuts they can see, while deploring other people's shortcuts. No one can keep all the problems and details clear, including me, and everyone has to mumble, guess and handwave about large parts of the problem.”

Consciousness Explained (1991)

“Micro computers used as word processors complement the audio facilities, enabling the interactive teaching of all four language skills: reading, listening, speaking and writing.”

David Crystal (1941) British linguist and writer

Source: The Cambridge Encyclopedia of the English Language, 1987, p. 377

“What causes us the most misery and pain… has nothing to do with the sort of information made accessible by computers.”

Neil Postman (1931–2003) American writer and academic

Amusing Ourselves to Death: Public Discourse in the Age of Show Business (1985)
Context: What causes us the most misery and pain... has nothing to do with the sort of information made accessible by computers. The computer and its information cannot answer any of the fundamental questions we need to address to make our lives more meaningful and humane. The computer cannot provide an organizing moral framework. It cannot tell us what questions are worth asking. It cannot provide a means of understanding why we are here or why we fight each other or why decency eludes us so often, especially when we need it the most. The computer is... a magnificent toy that distracts us from facing what we most need to confront — spiritual emptiness, knowledge of ourselves, usable conceptions of the past and future.

“It was not easy to have the imagination to foresee that computers were to become one of the most important developments of the century.”

James H. Wilkinson (1919–1986) English mathematician

Oral history interview http://history.siam.org/wilkinson.htm by John C. Nash, SIAM History of Numerical Analysis and Scientific Computing Project http://history.siam.org/, 13 July 1984
Context: Very belatedly in 1947, Darwin [Sir Charles Darwin, great-grandson of the famous Charles Darwin] agreed to set up a very small electronics group [... ] It was not easy to have the imagination to foresee that computers were to become one of the most important developments of the century.

“Freedom means not having a master. And in the area of computing, freedom means not using proprietary software.”

Richard Stallman (1953) American software freedom activist, short story writer and computer programmer, founder of the GNU project

"Free Software and Beyond: Human Rights in the Use of Software", address at Göteborg, Sweden (16 May 2007)
2000s
Context: To have the choice between proprietary software packages, is being able to choose your master. Freedom means not having a master. And in the area of computing, freedom means not using proprietary software.

“To the British subject who fancies genius may be lodged in him, this liberty remains; and truly it is, if well computed, almost the only one he has.”

Thomas Carlyle (1795–1881) Scottish philosopher, satirical writer, essayist, historian and teacher

1850s, Latter-Day Pamphlets (1850), Stump Orator (May 1, 1850)
Context: If the young aspirant is not rich enough for Parliament, and is deterred by the basilisks or otherwise from entering on Law or Church, and cannot altogether reduce his human intellect to the beaverish condition, or satisfy himself with the prospect of making money,—what becomes of him in such case, which is naturally the case of very many, and ever of more? In such case there remains but one outlet for him, and notably enough that too is a talking one: the outlet of Literature, of trying to write Books. Since, owing to preliminary basilisks, want of cash, or superiority to cash, he cannot mount aloft by eloquent talking, let him try it by dexterous eloquent writing. Here happily, having three fingers, and capital to buy a quire of paper, he can try it to all lengths and in spite of all mortals: in this career there is happily no public impediment that can turn him back; nothing but private starvation—which is itself a finis or kind of goal—can pretend to hinder a British man from prosecuting Literature to the very utmost, and wringing the final secret from her: "A talent is in thee; No talent is in thee." To the British subject who fancies genius may be lodged in him, this liberty remains; and truly it is, if well computed, almost the only one he has.

“Don't you think that it's amazing that I'm singing into this silly camera with the desk lamp, and it's going through all these wires and everything else, and these computers, and you still feel what I'm feeling, and you still get what I'm trying to do?”

Ysabella Brave (1979) American singer

"This Just In!" (30 January 2007)
Context: Don't you think that it's amazing that I'm singing into this silly camera with the desk lamp, and it's going through all these wires and everything else, and these computers, and you still feel what I'm feeling, and you still get what I'm trying to do? Yeah. I think its amazing. And I think it's so nice in a period when we're very isolated people, and kind of emotionless people, I think it's great that we can still touch one another and we can still feel what we're feeling, and we can still have fun, and we can be sad, and we can be happy, and to know that someone cares about you — because I really do. I really do.
And I can't believe that I have over 10,000 subscribers. What is wrong with you people?

“I now believe that there simply is no mainstream academic or other field (as of today) that can be considered to be "the locus of relevant expertise" regarding potential risks from advanced AI. These risks involve a combination of technical and social considerations that don't pertain directly to any recognizable near-term problems in the world, and aren't naturally relevant to any particular branch of computer science.”

Holden Karnofsky (1981) American nonprofit executive

In "Three Key Issues I've Changed My Mind About" https://www.openphilanthropy.org/blog/three-key-issues-ive-changed-my-mind-about, September 2016
Context: I now believe that there simply is no mainstream academic or other field (as of today) that can be considered to be "the locus of relevant expertise" regarding potential risks from advanced AI. These risks involve a combination of technical and social considerations that don't pertain directly to any recognizable near-term problems in the world, and aren't naturally relevant to any particular branch of computer science. This is a major update for me: I've been very surprised that an issue so potentially important has, to date, commanded so little attention – and that the attention it has received has been significantly (though not exclusively) due to people in the effective altruism community.

“Nothing could be more misleading than the idea that computer technology introduced the age of information. The printing press began that age, and we have not been free of it since.”

Neil Postman (1931–2003) American writer and academic

Amusing Ourselves to Death: Public Discourse in the Age of Show Business (1985)
Context: In the Middle Ages, there was a scarcity of information but its very scarcity made it both important and usable. This began to change, as everyone knows, in the late 15th century when a goldsmith named Gutenberg, from Mainz, converted an old wine press into a printing machine, and in so doing, created what we now call an information explosion.... Nothing could be more misleading than the idea that computer technology introduced the age of information. The printing press began that age, and we have not been free of it since.

“Hardly a dollar or a euro changes hands anymore without the aid of computer systems.”

Nicholas Carr (1959) American writer

Why IT Doesn't Matter Anymore http://hbswk.hbs.edu/archive/3520.html, Harvard Business Review, June 9, 2003.
Context: Today, no one would dispute that information technology has become the backbone of commerce. It underpins the operations of individual companies, ties together far-flung supply chains, and, increasingly, links businesses to the customers they serve. Hardly a dollar or a euro changes hands anymore without the aid of computer systems.

“In looking back at this turn-of-the-century period, the rise of a worldwide network will be seen as the most significant part of the computer revolution.”

Jef Raskin (1943–2005) American computer scientist

Interview in The Guardian (21 October 2004)
Context: I am only a footnote, but proud of the footnote I have become. My subsequent work — on eliciting principles and developing the theory of interface design, so that many people will be able to do what I did — is probably also footnote-worthy. In looking back at this turn-of-the-century period, the rise of a worldwide network will be seen as the most significant part of the computer revolution.

“Computer networks, paradoxically enough, are a deeply feminizing influence on society, where, in hardware, the unconscious is actually being created.”

Terence McKenna (1946–2000) American ethnobotanist

Psychedelic Society (1984)
Context: Orient yourself towards the psychedelic experience, towards the psychedelic phenomenon, as a source of information. A mirror image of the psychedelic experience in hardware are computer networks. Computer networks, paradoxically enough, are a deeply feminizing influence on society, where, in hardware, the unconscious is actually being created. It's as though we took the Platonic bon mot about how "if God did not exist, Man would invent him", and say "if the unconscious does not exist, humanity will invent it" — in the form of these vast networks able to transfer and transform information. This is in fact what we are caught up in, is a transforming of information. We have not physically changed in the last 40,000 years; the human type was established at the end of the last glaciation. But change, which was previously operable in the biological realm, is now operable in the realm of culture.

“For personal reasons, I do not browse the web from my computer.”

Richard Stallman (1953) American software freedom activist, short story writer and computer programmer, founder of the GNU project

OpenBSD mailing list (15 December 2007) http://lwn.net/Articles/262570/
2000s
Context: For personal reasons, I do not browse the web from my computer. (I also have no net connection much of the time.) To look at a page I send mail to a daemon which runs wget and mails the page back to me. It is very efficient use of my time, but it is slow in real time.

“The divine spark leaps from the finger of God to the finger of Adam, whether it takes ultimate shape in a law of physics or a law of the land, a poem or a policy, a sonata or a mechanical computer.”

Alfred Whitney Griswold (1906–1963) American historian

Address at Yale University, New Haven, Connecticut (9 June 1957).
Context: Could Hamlet have been written by a committee, or the Mona Lisa painted by a club? Could the New Testament have been composed as a conference report? Creative ideas do not spring from groups. They spring from individuals. The divine spark leaps from the finger of God to the finger of Adam, whether it takes ultimate shape in a law of physics or a law of the land, a poem or a policy, a sonata or a mechanical computer.

“Frequently we even go so far as to assume linear relationships. Only in this way have we been able to feed our problems into the electronic computers and get mechanical answers quickly and at low cost.”

Ragnar Frisch (1895–1973) Norwegian economist

R. Frisch (1964), Theory of Production, p. v: Lead paragraph of preface
1940-60s
Context: In this feverish world of ours, where one wants the economic analyses to produce easily understandable results quickly and at the least possible cost, some of us have fallen into the habit of assuming for simplicity that the hundreds sometimes thousands of variables that enter into the analyses are linked together by very simple relationships. Frequently we even go so far as to assume linear relationships. Only in this way have we been able to feed our problems into the electronic computers and get mechanical answers quickly and at low cost.

“Mathematics became an experimental subject. Individuals could follow previously intractable problems by simply watching what happened when they were programmed into a personal computer.”

John D. Barrow (1952–2020) British scientist

Introduction
Cosmic Imagery: Key Images in the History of Science (2008)
Context: Mathematics became an experimental subject. Individuals could follow previously intractable problems by simply watching what happened when they were programmed into a personal computer.... The PC revolution has made science more visual and more immediate.... by creating films of imaginary experiences of mathematical worlds.... Words are no longer enough.

“The growth of our science and education will be enriched by new knowledge of our universe and environment, by new techniques of learning and mapping and observation, by new tools and computers for industry, medicine, the home as well as the school. Technical institutions”

John F. Kennedy (1917–1963) 35th president of the United States of America

1962, Rice University speech
Context: The growth of our science and education will be enriched by new knowledge of our universe and environment, by new techniques of learning and mapping and observation, by new tools and computers for industry, medicine, the home as well as the school. Technical institutions, such as Rice, will reap the harvest of these gains. And finally, the space effort itself, while still in its infancy, has already created a great number of new companies, and tens of thousands of new jobs. Space and related industries are generating new demands in investment and skilled personnel, and this city and this state, and this region, will share greatly in this growth.

“The rules which govern the operation of the computer are, of course, different from those that govern the behavior of the figures displayed on the screen. Moreover, like the implicate order of Bohm's model, the computer might be capable of many operations that are in no way apparent upon examination of the game itself as it progresses on the screen.”

David Bohm (1917–1992) American theoretical physicist

Source: Synchronicity: Science, Myth, and The Trickster (1990) by Allan Combs & Mark Holland
Context: The universe according to Bohm actually has two faces, or more precisely, two orders. One is the explicate order, corresponding to the physical world as we know it in day-to-day reality, the other a deeper, more fundamental order which Bohm calls the implicate order. The implicate order is the vast holomovement. We see only the surface of this movement as it presents or "explicates" itself from moment to moment in time and space. What we see in the world — the explicate order — is no more than the surface of the implicate order as it unfolds. Time and space are themselves the modes or forms of the unfolding process. They are like the screen on the video game. The displays on the screen may seem to interact directly with each other but, in fact, their interaction merely reflects what the game computer is doing. The rules which govern the operation of the computer are, of course, different from those that govern the behavior of the figures displayed on the screen. Moreover, like the implicate order of Bohm's model, the computer might be capable of many operations that are in no way apparent upon examination of the game itself as it progresses on the screen.

“Does life in some way make use of the potentiality for vast quantum superpositions, as would be required for serious quantum computation?”

Roger Penrose (1931) English mathematical physicist, recreational mathematician and philosopher

Foreword (March 2007) to Quantum Aspects of Life (2008), by Derek Abbott.
Context: Does life in some way make use of the potentiality for vast quantum superpositions, as would be required for serious quantum computation? How important are the quantum aspects of DNA molecules? Are cellular microtubules performing some essential quantum roles? Are the subtleties of quantum field theory important to biology? Shall we gain needed insights from the study of quantum toy models? Do we really need to move forward to radical new theories of physical reality, as I myself believe, before the more subtle issues of biology — most importantly conscious mentality — can be understood in physical terms? How relevant, indeed, is our present lack of understanding of physics at the quantum/classical boundary? Or is consciousness really “no big deal,” as has sometimes been expressed?
It would be too optimistic to expect to find definitive answers to all these questions, at our present state of knowledge, but there is much scope for healthy debate...

“Numerical analysis has begun to look a little square in the computer science setting, and numerical analysts are beginning to show signs of losing faith in themselves.”

James H. Wilkinson (1919–1986) English mathematician

Some Comments from a Numerical Analyst (1971)
Context: Numerical analysis has begun to look a little square in the computer science setting, and numerical analysts are beginning to show signs of losing faith in themselves. Their sense of isolation is accentuated by the present trend towards abstraction in mathematics departments which makes for an uneasy relationship. How different things might have been if the computer revolution had taken place in the 19th century! [... ] In any case "numerical analysts" may be likened to "The Establishment" in computer science and in all spheres it is fashionable to diagnose "rigor mortis" in the Establishment.

“One reason you should not use web applications to do your computing is that you lose control.”

Richard Stallman (1953) American software freedom activist, short story writer and computer programmer, founder of the GNU project

"Cloud computing is a trap, warns GNU founder Richard Stallman", in The Guardian (29 September 2008)
2000s
Context: One reason you should not use web applications to do your computing is that you lose control. It's just as bad as using a proprietary program. Do your own computing on your own computer with your copy of a freedom-respecting program. If you use a proprietary program or somebody else's web server, you're defenceless. You're putty in the hands of whoever developed that software.

“You're joining a group of people who can do incredible things. They can make the computer do anything they can imagine.”

Tim Berners-Lee (1955) British computer scientist, inventor of the World Wide Web

From An Insight, An Idea with Tim Berners-Lee http://www.weforum.org/sessions/summary/insight-idea-tim-berners-lee at 27:27 (25 January 2013)
Context: When somebody has learned how to program a computer … You're joining a group of people who can do incredible things. They can make the computer do anything they can imagine.

“But as far as I am concerned, the computer is the worst damn instrument devised by man to screw up man-management.”

Christopher Vokes (1904–1985) Canadian general

England, p. 74
Vokes - My Story (1985)
Context: I looked for certain attributes in a soldier. I know the modern method is to put the attributes into a computer and see what comes out. But as far as I am concerned, the computer is the worst damn instrument devised by man to screw up man-management.

“The computer, with its multiplying forums for spontaneous free expression from e-mail to listservs and blogs, has increased facility and fluency of language but degraded sensitivity to the individual word and reduced respect for organized argument, the process of deductive reasoning.”

Camille Paglia (1947) American writer

The Magic of Images: Word and Picture in a Media Age (2004)
Context: The computer, with its multiplying forums for spontaneous free expression from e-mail to listservs and blogs, has increased facility and fluency of language but degraded sensitivity to the individual word and reduced respect for organized argument, the process of deductive reasoning. The jump and jitter of U.S. commercial television have demonstrably reduced attention span in the young.

“Quantum computation is … nothing less than a distinctly new way of harnessing nature”

Source: The Fabric of Reality (1997), Ch. 9 : Quantum Computers
Context: Quantum computation is … nothing less than a distinctly new way of harnessing nature … It will be the first technology that allows useful tasks to be performed in collaboration between parallel universes, and then sharing the results.

“The computer industry is the only industry that is more fashion-driven than women's fashion.”

Larry Ellison (1944) American internet entrepreneur, businessman and philanthropist

Referring to the term "cloud computing" in his Oracle OpenWorld 2008 speech, as quoted in "Oracle's Ellison nails cloud computing" at cnet (26 September 2008) http://news.cnet.com/8301-13953_3-10052188-80.html?part=rss&subj=news&tag=2547-1_3-0-5.
Context: The computer industry is the only industry that is more fashion-driven than women's fashion. Maybe I'm an idiot, but I have no idea what anyone is talking about. What is it? It's complete gibberish. It's insane. When is this idiocy going to stop?

“On the most basic level, computers in my books are simply a metaphor for human memory: I'm interested in the hows and whys of memory, the ways it defines who and what we are, in how easily memory is subject to revision.”

William Gibson (1948) American-Canadian speculative fiction novelist and founder of the cyberpunk subgenre

Interview with Larry McCaffery in Storming the Reality Studio : A Casebook of Cyberpunk and Postmodern Science Fiction, Duke University Press (December 1991)
Context: On the most basic level, computers in my books are simply a metaphor for human memory: I'm interested in the hows and whys of memory, the ways it defines who and what we are, in how easily memory is subject to revision. When I was writing Neuromancer, it was wonderful to be able to tie a lot of these interests into the computer metaphor. It wasn't until I could finally afford a computer of my own that I found out there's a drive mechanism inside — this little thing that spins around. I'd been expecting an exotic crystalline thing, a cyberspace deck or something, and what I got was a little piece of a Victorian engine that made noises like a scratchy old record player. That noise took away some of the mystique for me; it made computers less sexy. My ignorance had allowed me to romanticize them.

“As a result, the topic became – primarily in the USA – prematurely known as ‘computer science’ – which, actually, is like referring to surgery as ‘knife science’ – and it was firmly implanted in people’s minds that computing science is about machines and their peripheral equipment. Quod non”

Edsger W. Dijkstra (1930–2002) Dutch computer scientist

Dijkstra (1986) On a cultural gap http://www.cs.utexas.edu/users/EWD/transcriptions/EWD09xx/EWD924.html (EWD 924).
1980s
Context: A confusion of even longer standing came from the fact that the unprepared included the electronic engineers that were supposed to design, build and maintain the machines. The job was actually beyond the electronic technology of the day, and, as a result, the question of how to get and keep the physical equipment more or less in working condition became in the early days the all-overriding concern. As a result, the topic became – primarily in the USA – prematurely known as ‘computer science’ – which, actually, is like referring to surgery as ‘knife science’ – and it was firmly implanted in people’s minds that computing science is about machines and their peripheral equipment. Quod non [Latin: "Which is not true"]. We now know that electronic technology has no more to contribute to computing than the physical equipment. We now know that the programmable computer is no more and no less than an extremely handy device for realizing any conceivable mechanism without changing a single wire, and that the core challenge for computing science is hence a conceptual one, viz., what (abstract) mechanisms we can conceive without getting lost in the complexities of our own making.

“No algorithm exists for the metaphor, nor can a metaphor be produced by means of a computer's precise instructions, no matter what the volume of organized information to be fed in.”

Ch. 3: Metaphor, 3.12 Conclusions
Semiotics and the Philosophy of Language (1984)
Context: No algorithm exists for the metaphor, nor can a metaphor be produced by means of a computer's precise instructions, no matter what the volume of organized information to be fed in. The success of a metaphor is a function of the sociocultural format of the interpreting subjects' encyclopedia. In this perspective, metaphors are produced solely on the basis of a rich cultural framework, on the basis, that is, of a universe of content that is already organized into networks of interpretants, which decide (semiotically) the identities and differences of properties. At the same time, content universe, whose format postulates itself not as rigidly hierarchized but, rather, according to Model Q, alone derives from the metaphorical production and interpretation the opportunity to restructure itself into new nodes of similarity and dissimilarity.

“The extraordinary surprise that my first pictures provoked is unlikely to be continued. Many people saw them fifteen years ago, ten years ago. Now children see it on their computers when the computers do nothing else. The surprise is not there.”

Benoît Mandelbrot (1924–2010) Polish-born, French and American mathematician

Segment 144
Peoples Archive interview
Context: The extraordinary surprise that my first pictures provoked is unlikely to be continued. Many people saw them fifteen years ago, ten years ago. Now children see it on their computers when the computers do nothing else. The surprise is not there. The shock of novelty is not there. Therefore the unity that the shock of novelty, surprise, provided to all these activities will not continue. People will know about fractals earlier and earlier, more and more progressively. I think that the best future to expect and perhaps also the best future to hope for, is that fractal ideas will remain either as a peripheral or as a central tool in very many fields.

“Let there be light, says the Bible. All the firmaments of technology, all our computers and networks, are built with light, and of light, and for light, to hasten its spread around the world.”

George Gilder (1939) technology writer

Telecosm : How Infinite Bandwidth Will Revolutionize Our World (2000), p. 31
Context: Let there be light, says the Bible. All the firmaments of technology, all our computers and networks, are built with light, and of light, and for light, to hasten its spread around the world. Light glows on the telecosm's periphery; it shines at its core; it illuminates its webs and its links. From Newton, Maxwell, and Einstein to Richard Feynman and Charles Townes, the more men have gazed at light, the more it turns out to be a phenomenon utterly different from anything else. And yet everything else — every atom and every molecule — is fraught with its oscillating intensity.

“A hacker is someone who enjoys playful cleverness — not necessarily with computers.”

Richard Stallman (1953) American software freedom activist, short story writer and computer programmer, founder of the GNU project

Words to Avoid (or Use with Care) Because They Are Loaded or Confusing (1996) http://www.gnu.org/philosophy/words-to-avoid.html
1990s
Context: A hacker is someone who enjoys playful cleverness — not necessarily with computers. The programmers in the old MIT free software community of the 60s and 70s referred to themselves as hackers. Around 1980, journalists who discovered the hacker community mistakenly took the term to mean “security breaker.”

Grace Hopper photo

“A human must turn information into intelligence or knowledge. We've tended to forget that no computer will ever ask a new question.”

Grace Hopper (1906–1992) American computer scientist and United States Navy officer

The Wit and Wisdom of Grace Hopper (1987)
Context: We're flooding people with information. We need to feed it through a processor. A human must turn information into intelligence or knowledge. We've tended to forget that no computer will ever ask a new question.

John D. Barrow photo

“The computer argues, to put it baldly, that the most serious problems confronting us at both personal and professional levels require technical solutions through fast access to information otherwise unavailable. …this is… nonsense.”

Neil Postman (1931–2003) American writer and academic

Technopoly: the Surrender of Culture to Technology (1992)
Context: Because of what computers commonly do... With the exception of the electric light, there never has been a technology that better exemplifies Marshall McLuhan's aphorism "The medium is the message." …the "message" of computer technology is comprehensive and domineering. The computer argues, to put it baldly, that the most serious problems confronting us at both personal and professional levels require technical solutions through fast access to information otherwise unavailable.... this is... nonsense. Our most serious problems are not technical, nor do they arise from inadequate information. If a nuclear catastrophe occurs, it shall not be because of inadequate information. Where people are dying of starvation, it does not occur because of inadequate information. If families break up, children are mistreated, crime terrorizes a city, education is impotent, it does not happen because of inadequate information. Mathematical equations, instantaneous communication, and vast quantities of information have nothing whatever to do with any of these problems. And the computer is useless in addressing them.

Marvin Minsky photo

“In today's computer science curricula … almost all their time is devoted to formal classification of syntactic language types, defeatist unsolvability theories, folklore about systems programming, and generally trivial fragments of "optimization of logic design"”

Marvin Minsky (1927–2016) American cognitive scientist

the latter often in situations where the art of heuristic programming has far outreached the special-case "theories" so grimly taught and tested — and invocations about programming style almost sure to be outmoded before the student graduates.
Turing Award Lecture "Form and Content in Computer Science" (1969) http://web.media.mit.edu/~minsky/papers/TuringLecture/TuringLecture.html, in Journal of the Association for Computing Machinery 17 (2) (April 1970)

Camille Paglia photo

“The new generation, raised on TV and the personal computer but deprived of a solid primary education, has become unmoored from the mother ship of culture”

Camille Paglia (1947) American writer

The Magic of Images: Word and Picture in a Media Age (2004)
Context: As a classroom teacher for over thirty years, I have become increasingly concerned about evidence of, if not cultural decline, then cultural dissipation since the 1960s, a decade that seemed to hold such heady promise of artistic and intellectual innovation. Young people today are flooded with disconnected images but lack a sympathetic instrument to analyze them as well as a historical frame of reference in which to situate them. I am reminded of an unnerving scene in Stanley Kubrick's epic film, 2001: A Space Odyssey, where an astronaut, his air hose cut by the master computer gone amok, spins helplessly off into space. The new generation, raised on TV and the personal computer but deprived of a solid primary education, has become unmoored from the mother ship of culture. Technology, like Kubrick's rogue computer, Hal, is the companionable servant turned ruthless master. The ironically self-referential or overtly politicized and jargon-ridden paradigms of higher education, far from helping the young to cope or develop, have worsened their vertigo and free fall. Today's students require not subversion of rationalist assumptions -- the childhood legacy of intellectuals born in Europe between the two World Wars -- but the most basic introduction to structure and chronology. Without that, they are riding the tail of a comet in a media starscape of explosive but evanescent images.

Scott Adams photo

“These days it seems like any idiot with a laptop computer can churn out a business book and make a few bucks. That's certainly what I'm hoping.”

Scott Adams (1957) cartoonist, writer

The Dilbert Principle (1995)
Context: These days it seems like any idiot with a laptop computer can churn out a business book and make a few bucks. That's certainly what I'm hoping. It would be a real letdown if the trend changed before this masterpiece goes to print.

John D. Barrow photo
Terry Winograd photo

“Seekers after the glitter of intelligence are misguided in trying to cast it in the base metal of computing.”

Terry Winograd (1946) American computer scientist

"Thinking Machines: Can there be? Are we?", in The Boundaries of Humanity: Humans, Animals, Machines (1991), ed. James J. Sheehan and Morton Sosna, p. 216.
Context: Seekers after the glitter of intelligence are misguided in trying to cast it in the base metal of computing. There is an amusing epilogue to this analogy: in fact, the alchemists were right. Lead can be converted into gold by a particle accelerator hurling appropriate beams at lead targets. The AI visionaries may be right in the same way, and they are likely to be wrong in the same way.

Tom Clancy photo

“Computers are going to be even bigger. TVs are one-way. You sit there and you watch it. Computers, you interact with.”

Tom Clancy (1947–2013) American author

1990s, Schafer interview (1995)
Context: I was one of the first generations to watch television. That's technology. TV is like any other kind of tool. TV exposes people to news, to information, to knowledge, to entertainment. How is it bad? Computers are going to be even bigger. TVs are one-way. You sit there and you watch it. Computers, you interact with.

Freeman Dyson photo

“Our grey technology of machines and computers will not disappear, but green technology will be moving ahead even faster.”

Freeman Dyson (1923) theoretical physicist and mathematician

Progress In Religion (2000)
Context: Our grey technology of machines and computers will not disappear, but green technology will be moving ahead even faster. Green technology can be cleaner, more flexible and less wasteful, than our existing chemical industries. A great variety of manufactured objects could be grown instead of made. Green technology could supply human needs with far less damage to the natural environment. And green technology could be a great equalizer, bringing wealth to the tropical areas of the world which have most of the sunshine, most of the human population, and most of the poverty. I am saying that green technology could do all these good things, bringing wealth to the tropics, bringing economic opportunity to the villages, narrowing the gap between rich and poor. I am not saying that green technology will do all these good things. "Could" is not the same as "will". To make these good things happen, we need not only the new technology but the political and economic conditions that will give people all over the world a chance to use it. To make these things happen, we need a powerful push from ethics. We need a consensus of public opinion around the world that the existing gross inequalities in the distribution of wealth are intolerable. In reaching such a consensus, religions must play an essential role. Neither technology alone nor religion alone is powerful enough to bring social justice to human societies, but technology and religion working together might do the job.

Katherine Paterson photo
Richard Stallman photo

“Steve Jobs, the pioneer of the computer as a jail made cool, designed to sever fools from their freedom, has died.”

Richard Stallman (1953) American software freedom activist, short story writer and computer programmer, founder of the GNU project

Steve Jobs

David Lyon photo
Robert Sheckley photo
Newton Lee photo

“To an increasing number of practitioners, computer simulations rooted in mathematics represent a third way of doing science, alongside theory and experiment.”

Ivars Peterson (1948) Canadian mathematician

Source: The Mathematical Tourist: New and Updated Snapshots of Modern Mathematics (1998), Chapter 1, “Explorations” (p. 10)

Stephen Wolfram photo

“Computational reducibility may well be the exception rather than the rule: Most physical questions may be answerable only through irreducible amounts of computation. Those that concern idealized limits of infinite time, volume, or numerical precision can require arbitrarily long computations, and so be formally undecidable.”

Stephen Wolfram (1959) British-American computer scientist, mathematician, physicist, writer and businessman

"Undecidability and Intractability in Theoretical Physics", Physical Review Letters 54 (8), 1985, pp. 735–738, doi:10.1103/PhysRevLett.54.735 https://www.stephenwolfram.com/publications/academic/undecidability-intractability-theoretical-physics.pdf

Charles Stross photo
Charles Stross photo
Charles Stross photo
Charles Stross photo
J. Howard Moore photo
Carl Sagan photo
Yuval Noah Harari photo
David Chalmers photo

“The easy problems of consciousness are those that seem directly susceptible to the standard methods of cognitive science, whereby a phenomenon is explained in terms of computational or neural mechanisms. The hard problems are those that seem to resist those methods. …The really hard problem of consciousness is the problem of experience.”

David Chalmers (1966) Australian philosopher and cognitive scientist

When we think and perceive, there is a whir of information-processing, but there is also a subjective aspect. ...When we see, for example, we experience visual sensations: the felt quality of redness, the experience of dark and light, the quality of depth in a visual field. Other experiences go along with perception in different modalities: the sound of a clarinet, the smell of mothballs. Then there are bodily sensations, from pains to orgasms; mental images that are conjured up internally; the felt quality of emotion, and the experience of a stream of conscious thought.
"Facing Up to the Problem of Consciousness," 1995

Samuel T. Cohen photo

“As you can well imagine, any nuclear bombing study that neglected to target Moscow would be laughed out of the room. (That is, no study at that time; 10 or 15 years later senior policy officials were debating how good an idea this might be. If you wiped out the political leadership of the Soviet Union in the process, who would you deal with in arranging for a truce and who would be left to run the country after the war?) Consequently, two of RAND’s brightest mathematicians were assigned the task of determining, with the help of computers, in great detail, precisely what would happen to the city were a bomb of so many megatons dropped on it. It was truly a daunting task and called for devising a mathematical model unimaginably complex; one that would deal with the exact population distribution, the precise location of various industries and government agencies, the vulnerability of all the important structures to the bomb’s effects, etc., etc. However, these two guys were up to the task and toiled in the vineyards for some months, finally coming up with the results. Naturally, they were horrendous.”

Samuel T. Cohen (1921–2010) American physicist

Harold Mitchell, a medical doctor, an expert on human vulnerability to the H-bomb’s effects, told me when the study first began: “Why are they wasting their time going through all this shit? You know goddamned well that a bomb this big is going to blow the fucking city into the next county. What more do you have to know?” I had to agree with him.
F*** You! Mr. President: Confessions of the Father of the Neutron Bomb (2006)

Gerrit Blaauw photo
Gerrit Blaauw photo

“In computer design three levels can be distinguished: architecture, implementation and realisation; for the first of them, the following working definition is given: The architecture of a system can be defined as the functional appearance of the system to the user, its phenomenology.”

Gerrit Blaauw (1924–2018) Dutch computer scientist

Although the term architecture was introduced only ten years ago in computer technology (Buchholz), the concept of architecture is as old as the use of mechanism by man. When a child is taught to look at a clock, it is taught the architecture of the clock. It is told to observe the position of the short and the long hand and to relate these to the hours and the minutes. Once it can distinguish the architecture from the visual appearance, it can tell time as easily from a wrist watch as from the clock on the church tower.
The inner structure of a system is not considered by the architecture: we do not need to know what makes the clock tick, to know what time it is. This inner structure, considered from a logical point of view, will be called the implementation, and its physical embodiment the realisation.
Source: Computer architecture (1972), p. 154

“Although I am not averse to wasting a few hours playing computer games, I have never tried my hand at Doom.”

James Berardinelli (1967) American film critic

Judging by sales figures and testimonials, playing the game has to be an infinitely preferable experience to watching this pathetic excuse for a movie.
Review http://www.reelviews.net/php_review_template.php?identifier=928 of Doom (2005).
One-star reviews

Dylan Moran photo
James Burke (science historian) photo
Steve Jobs photo
Edsger W. Dijkstra photo

“[Though computer science is a fairly new discipline, it is predominantly based on the Cartesian world view. As Edsgar W. Dijkstra has pointed out] A scientific discipline emerges with the - usually rather slow!”

Edsger W. Dijkstra (1930–2002) Dutch computer scientist

discovery of which aspects can be meaningfully 'studied in isolation for the sake of their own consistency'.
Dijkstra (1982) as cited in: Douglas Schuler and Jonathan Jacky (1989) Directions and Implications of Advanced Computing, 1987. Vol 1, p. 84.
1980s

Edsger W. Dijkstra photo

“LISP has been jokingly described as "the most intelligent way to misuse a computer."”

Edsger W. Dijkstra (1930–2002) Dutch computer scientist

I think that description a great compliment because it transmits the full flavor of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts.
1970s, The Humble Programmer (1972)

Arthur C. Clarke photo

“I am afraid that this chapter will amply demonstrate the truth of Clarke's 69th Law, viz., "Reading computer manuals without the hardware is as frustrating as reading sex manuals without the software."”

Arthur C. Clarke (1917–2008) British science fiction writer, science writer, inventor, undersea explorer, and television series host

In both cases the cure is simple though usually very expensive.
"Appendix II: MITE for Morons," The Odyssey File (1984), p. 123
1960s, Clarke's Three Laws, et al (1962; 1973…)

Joseph Weizenbaum photo
Steve Jobs photo
Daniel Abraham photo

“Computers, it seemed, could be programmed to do almost anything but sense when someone was up to no good.”

Daniel Abraham (1969) speculative fiction writer from the United States

Source: Nemesis Games (2015), Chapter 7 (p. 76)

Robert Silverberg photo
John Allen Paulos photo
John Allen Paulos photo

“Humor, since it depends on so many emotional, social, and intellectual facets of human beings, is particularly immune to computer simulation.”

John Allen Paulos (1945) American mathematician

Source: Mathematics and Humor: A Study of the Logic of Humor (1980), Chapter 3, “Self-Reference and Paradox” (p. 51)

John Allen Paulos photo

“Appreciating humor—even recognizing it—requires human skills of the highest order; no computer comes close to having them.”

John Allen Paulos (1945) American mathematician

Source: Mathematics and Humor: A Study of the Logic of Humor (1980), Chapter 3, “Self-Reference and Paradox” (p. 50)

Marilyn Ferguson photo
Howard H. Aiken photo

“Originally one thought that if there were a half dozen large computers in this country, hidden away in research laboratories, this would take care of all requirements we had throughout the country.”

Howard H. Aiken (1900–1973) pioneer in computing, original conceptual designer behind IBM's Harvard Mark I computer

1952. Quoted in I. Bernard Cohen, Howard Aiken: Portrait of a Computer Pioneer (MIT Press, 1999), p. 292; and I. Bernard Cohen, IEEE Annals of the History of Computing 20 (3), pp. 27–33 (1998)

David Pearce (philosopher) photo
Noam Chomsky photo
Ward Cunningham photo

“When I was at Tek, I was frustrated that computer hardware was being improved faster than computer software. I wanted to invent some software that was completely different, that would grow and change as it was used. That’s how wiki came about.”

Ward Cunningham (1949) American computer programmer who developed the first wiki

"Startup mines for riches in collaboration software" in The Portland Tribune (7 March 2008) http://www.portlandtribune.com/rethinking/story.php?story_id=120430910578805900

Wendell Berry photo