By Leopold P. Mureithi
In what follows, we examine a few dilemmas facing people in the domain of technology, understood as the tools, skills, methods and processes used to produce goods and services to satisfy human wants. The choices made at such forking points set future trajectories, for good or for ill.[1]
Macrohistory
Macrohistory has been defined as "exploring the past on many different large scales up to and including the largest scales of all, those of cosmology."[2] In its "search of patterns, even laws of social change, macrohistory is thus nomothetic and diachronic," enabling macrohistorians to "use the detailed data of historians for their grand theories of individual, social and civilizational change."[3]
One significant application of macrohistory is by Yuval Noah Harari. In his book Sapiens: A Brief History of Humankind,[4] he identifies four major phenomena, namely:
- The Cognitive Revolution – c. 70,000 years before the present (YBP), when Sapiens evolved imagination, "the emergence of fictive language;"[5]
- The Agricultural Revolution – c. 12,000 YBP, the development of agriculture;
- The First Kingdoms – c. 5,000 YBP, the gradual consolidation of human political organisations towards one global empire; and
- The Scientific Revolution – c. 500 YBP, the emergence of objective science.
The most critical stage is arguably the Cognitive Revolution, since it appears to drive the stages that follow it.
The Cognitive Revolution
The cognitive revolution came about when it dawned on humans that, in addition to physical reality, there exist imagined realities. Utilizing this dual reality, humans were able to leverage the power of "shared fictions" (stories, metaphors and myths) to bring people together. This made Homo sapiens the only animal that can cooperate flexibly in large numbers. By so doing, humankind revolutionized agriculture, built empires, and exploited science and technology to achieve the industrial and information revolutions. The downside of this development is human domination of other biota, both fauna and flora, as well as abiotic forms. As a result, we are now living in the Anthropocene, a geological-scale age in which human activity has markedly driven climate change and environmental degradation.[6] It is time to rethink and reset for the sake of humanity's very survival.
Capital-Labour Substitution
Macrohistory has documented a succession of production revolutions: agrarian, industrial and cybernetic.[7] The underlying characteristic of this succession is the lightening of the labour burden, with capital taking up various tasks and people enjoying more leisure. Essentially, capital is a tool for labour, serving as "extra-corporeal limbs."[8] However, capital-labour substitution is economically incentivized by the falling price of capital relative to that of labour.[9] The question is: how far can this continue? Could there come a time when capital does all the work and people are rendered completely redundant?
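The relative-price incentive behind capital-labour substitution can be sketched as a toy cost comparison. This is a minimal illustration with hypothetical numbers, not data from the text: a cost-minimizing firm assigns a task to capital once capital's rental rate falls below the wage for equivalent output.

```python
# Toy sketch (hypothetical prices): a cost-minimizing firm assigns each task
# to whichever factor performs it more cheaply. As the rental price of capital
# falls relative to the wage, tasks shift from labour to capital.

def cheaper_factor(wage: float, rental: float) -> str:
    """Return the factor a cost-minimizing firm would hire for one task,
    assuming labour and capital are equally productive at that task."""
    return "capital" if rental < wage else "labour"

wage = 20.0  # hypothetical wage per task
for rental in (30.0, 25.0, 20.0, 15.0, 10.0):
    print(f"rental={rental:>5} -> {cheaper_factor(wage, rental)}")
```

Under these assumptions, substitution is not inevitable but price-driven: the crossover happens only when the rental rate dips below the wage, which is the mechanism the IMF evidence cited above documents at the macro level.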
The latter scenario is unlikely because capital is strictly human-made: items produced by people in order to produce goods and services. For people to become irrelevant to capital production, capital would have to produce and reproduce itself, and also program, retool and maintain itself – plausible, but improbable given the cognitive limitations of artificial intelligence (AI). The human touch remains stubbornly necessary, although technological unemployment is a real prospect. The economic ecology calls for a rethink and reset, just like the physical environment.
A Singularity
One scenario in AI development is the technological Singularity,[10] a convergence of all technologies (bio, information, nano) whereby "there will be no distinction…between human and machine"[11] in terms of brainpower. At that point, substitution, combination and permutation of various technologies and human input will be possible in vivo, in vitro and in silico. If and when this happens, it is plausible that all work could be done by machines in "algorithm-only zero-employee companies."[12]
Strategic Responses
What options present themselves in post-work futures? Human ingenuity is such that one can invest in that which replaces oneself – robots – and earn dividend and/or rental incomes. If all (or almost all) work is done by machines, people can enjoy leisure and self-actualization, placing humans on a higher level of Maslow's hierarchy of needs.[13] Robots could serve as avatar personal assistants for tasks that can be routinized. Robot, automation and other taxes could yield sufficient revenue to fund an unconditional universal basic income (UBI), education and universal health care (UHC) for all, so that no one is left behind. The future of work calls for a rethink in the face of a possible paradigm shift brought about by unstoppable technological dominance.
Moving ahead
There is a need to rethink and reset on a number of issues, because alternative, more sustainable and human futures are entirely possible. Crucial to sustainability is public discourse among stakeholders in search of consensus and a viable societal contract.
Correspondence:
Leopold P. Mureithi is a Professor at the University of Nairobi. He can be contacted at Lpmureithi@hotmail.com
[1] The diagrammatic representations are sourced from Microsoft Windows 10 via the insert-shapes command.
[2] David Christian, “Macrohistory: The Play of Scales,” Social Evolution & History, Vol. 4 No. 1, March 2005, p. 22.
[3] Sohail Inayatullah, “Macrohistory and Futures Studies,” Futures, Vol. 30, No. 5, 1998, p. 181.
[4] Yuval Noah Harari. 2019. Sapiens: A Brief History of Humankind. London: Vintage.
[5] Ibid., p. xi.
[6] See WWF. 2018. Living Planet Report – 2018: Aiming Higher. Grooten, M. and Almond, R.E.A. (Eds). WWF, Gland, Switzerland.
[7] Leonid Grinin, A. Korotayev and Arno Tausch, "Kondratieff Waves and Technological Revolutions," Chapter 5 in L. Grinin et al., Economic Cycles, Crises, and the Global Periphery. International Perspectives on Social Policy, Administration, and Practice, October 2016, p. 144. DOI 10.1007/978-3-319-41262-7_5.
[8] A phrase used by Bruce Mazlish in his chapter "The Fourth Dimensionality," in Melvin Kranzberg and William H. Davenport (Eds.), Technology and Culture: An Anthology. New York: Meridian Books, 1975, p. 226.
[9] See, for example, Mai Chi Dao, Mitali Das, Zsoka Koczan, and Weicheng Lian, “Why Is Labor Receiving a Smaller Share of Global Income? Theory and Empirical Evidence,” IMF Working Paper WP/17/169, July 2017, passim.
[10] The term was coined by John von Neumann. See Stanislaw Ulam, "Tribute to John von Neumann," Bulletin of the American Mathematical Society, Vol. 64, No. 3, Part 2, 1958, p. 5. It was popularized by Vernor Vinge in his presentation "The Coming Technological Singularity: How to Survive in the Post-Human Era," in National Aeronautics and Space Administration (NASA). 1993. Vision-21: Interdisciplinary Science and Engineering in the Era of Cyberspace, pp. 11-22.
[11] Ray Kurzweil. 2005. The Singularity is Near: When Humans Transcend Biology. New York: Penguin, p. 9. See also Hans Moravec. 1999. Robot: Mere Machine to Transcendent Mind. New York: Oxford University Press, p. 61.
[12] Rohit Talwar (Ed.) 2015. The Future of Business. FutureScapes, p. 415.
[13] Abraham H. Maslow, "A Theory of Human Motivation," Psychological Review, 50(4), 370-96, 1943.