“Economic theory deals with two concepts, Value and Economy. Abstract reasoning regarding these concepts rests ultimately on mathematical concepts of quantity, time and energy. The three are inseparable, for quantity and time are dimensions of energy. The quantity relationships of energy, usually termed "statics," turn on the problem of the relation of the parts to the whole, while the time relationships, usually termed "dynamics," are the relations of a process that connects past, present and future.”

Source: Legal Foundations of Capitalism (1924), p. 1; lead paragraph of the first chapter, on Mechanism, Scarcity, and Working Rules

Adapted from Wikiquote; last updated June 3, 2021.

John R. Commons (1862–1945) United States institutional economist and labor historian

Related quotes


“I see the expression of… economy clearly in the gradual reduction of the statical laws of machines to a single one, viz., the principle of virtual work: in the replacement of Kepler's laws by Newton's single law… and in the [subsequent] reduction, simplification and clarification of the laws of dynamics. I see clearly the biological-economical adaptation of ideas, which takes place by the principles of continuity (permanence) and of adequate definition and splits the concept 'heat' into the two concepts of 'temperature' and 'quantity of heat'; and I see how the concept 'quantity of heat' leads on to 'latent heat', and to the concepts of 'energy' and 'entropy'.”

Ernst Mach (1838–1916) Austrian physicist and university educator

Mach (1910) "Die Leitgedanken meiner naturwissenschaftlichen Erkenntnislehre und ihre Aufnahme durch die Zeitgenossen", Physikalische Zeitschrift, 1, 1910, pp. 599-606. Eng. trans. as "The Guiding Principles of my Scientific Theory of Knowledge and its Reception by my Contemporaries", in S. Toulmin ed., Physical Reality, New York: Harper, 1970, pp. 28-43. Cited in: K. Mulligan & B. Smith (1988) "Mach and Ehrenfels: Foundations of Gestalt Theory" http://ontology.buffalo.edu/smith/articles/mach/mach.pdf
20th century


“To those who ask what the infinitely small quantity in mathematics is, we answer that it is actually zero. Hence there are not so many mysteries hidden in this concept as they are usually believed to be.”

Leonhard Euler (1707–1783) Swiss mathematician

As quoted in Fundamentals of Teaching Mathematics at University Level (2000) by Benjamin Baumslag, p. 214


“The quantity of energy that ceased to "fall in" is the system's entropy.”

Buckminster Fuller (1895–1983) American architect, systems theorist, author, designer, inventor and futurist

130.01 http://www.rwgrayprojects.com/synergetics/s01/p3000.html
1970s, Synergetics: Explorations in the Geometry of Thinking (1975), "Synergy" onwards
Context: Critical proximity occurs where there is angular transition from "falling back in" at 180-degree to 90-degree orbiting—which is precession. (Gravity may be described as "falling back in" at 180 degrees.) The quantity of energy that ceased to "fall in" is the system's entropy. Critical proximity is when it starts either "falling in" or going into orbit, which is the point where either entropy or antientropy begins. An aggregate of "falling ins" is a body. What we call an object or an entity is always an aggregate of interattracted entities; it is never a solid. And the critical proximity transition from being an aggregate entity to being a plurality of separate entities is precession, which is a "peeling off" into orbit rather than falling back in to the original entity aggregate. This explains entropy intimately.


“Mathematics is that form of intelligence in which we bring the objects of the phenomenal world under the control of the conception of quantity. [Provisional definition. ]”

George Holmes Howison (1834–1916) American philosopher

"The Departments of Mathematics, and their Mutual Relations," Journal of Speculative Philosophy, Vol. 5, p. 164. Reported in Moritz (1914)
Journals

“Information theory, introducing the concept of information as a quantity measurable by an expression isomorphic to negative entropy in physics, and developing the principles of its transmission.”

Ludwig von Bertalanffy (1901–1972) Austrian biologist and philosopher

General System Theory (1968), 4. Advances in General Systems Theory
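The quantity Bertalanffy refers to is Shannon entropy, which measures information in bits and is formally isomorphic (up to sign and constant) to thermodynamic entropy. A minimal sketch, using the standard formula H = -Σ p·log₂(p) rather than anything specific to Bertalanffy's text:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Terms with p == 0 contribute nothing, by the convention 0*log(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per toss.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
```

A biased source (e.g. probabilities 0.9 and 0.1) yields less than one bit, reflecting the lower "surprise" of its output.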
