
"Quotations by 60 Greatest Indians" at Institute of Information and Communication Technology http://resourcecentre.daiict.ac.in/eresources/iresources/quotations.html
Remarks Recorded for the Opening of a USIA Transmitter at Greenville, North Carolina (8 February 1963) Audio at JFK Library (01:29 - 01:40) http://www.jfklibrary.org/Asset-Viewer/Archives/JFKWHA-161-010.aspx · Text of speech at The American Presidency Project http://www.presidency.ucsb.edu/ws/?pid=9551
1963
Variant: A man may die, nations may rise and fall, but an idea lives on. Ideas have endurance without death.
"Quotations by 60 Greatest Indians" at Institute of Information and Communication Technology http://resourcecentre.daiict.ac.in/eresources/iresources/quotations.html
“To die is poignantly bitter, but the idea of having to die without having lived is unbearable.”
Source: Man for Himself (1947), Ch. 4
“It is better to die for an idea that will live, than to live for an idea that will die.”
Quoted in Scott MacLeod, "South Africa: Extremes in Black and Whites" http://www.time.com/time/magazine/article/0,9171,975037,00.html, Time, March 9, 1992, p. 38
Quoted in "The Mind of Black Africa" (1996) by Dickson A. Mungazi, p. 159
Lucile (1860), Part I, Canto II.
Source: Anti-Intellectualism in American Life (1974), p. 29
Source: Computing Machinery and Intelligence (1950), p. 454.
Context: Another simile would be an atomic pile of less than critical size: an injected idea is to correspond to a neutron entering the pile from without. Each such neutron will cause a certain disturbance which eventually dies away. If, however, the size of the pile is sufficiently increased, the disturbance caused by such an incoming neutron will very likely go on and on increasing until the whole pile is destroyed. Is there a corresponding phenomenon for minds, and is there one for machines? There does seem to be one for the human mind. The majority of them seem to be "sub-critical," i.e., to correspond in this analogy to piles of sub-critical size. An idea presented to such a mind will on average give rise to less than one idea in reply. A smallish proportion are super-critical. An idea presented to such a mind may give rise to a whole "theory" consisting of secondary, tertiary and more remote ideas. Animals minds seem to be very definitely sub-critical. Adhering to this analogy we ask, "Can a machine be made to be super-critical?"