Published on INFORMS Analytics Magazine (Joseph Byrum)
Author’s Note: This blog series Understanding Smart Technology – And Ourselves examines our relationship with advancing technologies and the fundamental choices we face. As we stand at the threshold of an uncertain future shaped by artificial intelligence, the author challenges readers to consider whether we should embrace these transformative changes or resist them in defense of our humanity. Drawing from historical patterns of technological adoption and resistance, the series promises to deliver nuanced perspectives on our technological trajectory, beginning with a comprehensive overview of our current understanding of smart technology and its implications for society. Read Part 1 where the author provides a history lesson on smart technology.
“Knowledge itself is power.”
Francis Bacon, “Meditationes sacrae,” 1597
To better understand smart technology’s impact on society, it makes sense to begin by observing that we already know with certainty its potential and limitations. There’s no simpler place to start than the smallest unit of knowledge in the modern era, the ones and zeros of a binary machine, to build our way up to the bigger picture of smart technology in general.
Although smart technology is still evolving, we are quite familiar with the computers upon which it relies. A computer can be defined as any device that executes a program – a set of operations or instructions that the machine can carry out. The computers we know today may seem complex, but in fact they perform only three fundamental functions: store data, move data and compare data. That’s why the first electronic general-purpose digital computer was not called a computer but an “electronic numerical integrator and calculator” [1].
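To make the store/move/compare idea concrete, here is a toy sketch (not from the article) of a machine whose only primitives are those three operations, used to find the larger of two numbers. The `memory`, `store`, `move` and `compare` names are illustrative inventions, not a real instruction set.

```python
# Toy machine with only three primitives: store, move, compare.
memory = {}

def store(addr, value):          # store data at an address
    memory[addr] = value

def move(src, dst):              # move data between addresses
    memory[dst] = memory[src]

def compare(a, b):               # compare data at two addresses
    return memory[a] >= memory[b]

# Finding the larger of two numbers using nothing but these primitives:
store("x", 7)
store("y", 3)
if compare("x", "y"):
    move("x", "max")
else:
    move("y", "max")
print(memory["max"])  # 7
```

Everything a modern processor does, however elaborate, decomposes into sequences of steps like these.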
Computers are limited by their programming, which is bounded by the limits of software and hardware, algorithms and language.
The limits of software and hardware
Smart technology, like all information technology, consists of two interdependent “wares.” The “software” is the programming, which as an operating reality has no physical existence. To manifest itself, it requires a physical body – the “hardware” composed of electronic circuitry. Both software and hardware have known limits. When it comes to software, the entire structure of information technology (at least until quantum computing becomes mainstream) is built on the ones and zeros that can be represented in the “on” and “off” states of digital circuitry, as opposed to analog circuitry, which can exhibit a continuum of values within a range. Hardware abilities are constrained by the laws of physics.
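A quick illustration of the point about ones and zeros: even ordinary text ultimately lives as bit patterns. The snippet below (an illustrative sketch, not from the article) shows the 8-bit patterns behind two characters.

```python
# Every character is stored as a pattern of ones and zeros.
# Here we print the 8-bit binary form of each character's code point.
text = "Hi"
bits = " ".join(format(ord(ch), "08b") for ch in text)
print(bits)  # 01001000 01101001
```

The same reduction applies to images, sound and programs themselves: all of it is bits switched on or off.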
The limits of algorithms
An algorithm is a set of instructions that enables a system (such as a computer program) to produce output from input. While we might not know what algorithms may do in the future, we already know what they can reliably do now – from data compression to random number generation.
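The two examples named above – data compression and random number generation – are both available in Python’s standard library, and a short sketch shows the input-to-output shape of an algorithm in each case (exact compressed sizes vary by library version, so none is claimed here).

```python
import random
import zlib

# Data compression: a deterministic algorithm mapping input bytes
# to a smaller output that can be exactly reversed.
data = b"abc" * 100                  # highly repetitive input
compressed = zlib.compress(data)
assert zlib.decompress(compressed) == data
print(len(data), "->", len(compressed))  # repetitive data shrinks a lot

# Pseudo-random number generation: also an algorithm, so the same
# seed always yields the same sequence of "random" numbers.
random.seed(42)
print(random.randint(1, 6))
```

The seeding behavior is worth noting: a computer’s “randomness” is itself the reliable output of a known algorithm, which is exactly the sense in which we already know what algorithms can do.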
The limits of language
In programming, a human language is reduced to ones and zeros; this reduced form is considered an artificial language, while the languages humans speak are considered natural languages. When a computer performs natural language generation (NLG), it turns code into text. When it performs natural language processing (NLP), it turns human words into code. If it understands the words, this is called natural language understanding (NLU). The programming principles for each of these functions – NLG, NLP and NLU – are now in the realm of the known.
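The direction of the NLP and NLG steps can be sketched with a toy example (an illustration only – real systems are vastly richer, and the `vocab`, `process` and `generate` names here are invented for the sketch): words are mapped to numeric codes, and codes are mapped back to words.

```python
# Toy round trip: NLP turns human words into code,
# NLG turns code back into text.
vocab = {"hello": 0, "world": 1}
inverse = {code: word for word, code in vocab.items()}

def process(sentence):            # "NLP": human words -> numeric codes
    return [vocab[w] for w in sentence.lower().split()]

def generate(codes):              # "NLG": numeric codes -> text
    return " ".join(inverse[c] for c in codes)

codes = process("Hello world")
print(codes)             # [0, 1]
print(generate(codes))   # hello world
```

Whether the machine “understands” the words in between – the NLU step – is of course the hard part, and nothing in this sketch addresses it.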
The fundamentals of life science
What is the dividing line between a living creature and an artificial one? That is, what is the difference between a human and a machine? We know a good deal about this at a scientific level. Until the 19th century, the distinction between the living and nonliving was commonly understood through two branches of chemistry: organic (with its study of cellular material) and inorganic (with its periodic table of the elements of matter). Today, some nonliving materials are considered “organic” in terms of their composition [2]. Biochemistry has since emerged as the branch of science that studies the chemical processes in living organisms at the molecular and cellular level [3]. Still, the quest to define “life” continues.
The National Aeronautics and Space Administration (NASA) has a useful definition of life on its website, namely, that to qualify as living, a creature must meet some variation of all of the criteria listed below.
Living creatures:
- tend to be complex and highly organized
- possess metabolism, taking in energy from the environment and transforming it for growth and reproduction
- seek balance and stability through homeostasis
- respond to their environment and some learn from it
This same discussion boils the definition of life down to carbon, water and isotopes: “In hopes of restricting the working definition at least terrestrially, all known organisms seem to share a carbon-based chemistry, depend on water, and leave behind fossils with carbon or sulfur isotopes that point to present or past metabolism.”
The fundamentals of intelligence
Finally, what does it even mean to be smart? There is general agreement that human beings have various levels and types of intelligence. Those who argue for levels point to a factor for general intelligence, the “g factor,” identified by British psychologist Charles E. Spearman more than a century ago [4], which correlates with tests for an intelligence quotient, or IQ. By contrast, those who focus on varieties have embraced the theory of multiple intelligences advocated by Howard Gardner, Thomas Armstrong and others [5]. These types can be classified as logical/mathematical, verbal/linguistic and six others that are less often tested in academic settings, namely visual/spatial, bodily/kinesthetic, musical, interpersonal, naturalist, and intrapersonal intelligence. So far, machines have been best at logical/mathematical skills, although programmers are trying to incorporate some others, such as verbal and spatial [6].
In the next installment, we will continue our look at what we know about smart technology and what we know about how the human mind works.
Notes & References
- This refers to John W. Mauchly’s ENIAC, Electronic Numerical Integrator and Calculator, completed in 1945 at the Moore School of Engineering, University of Pennsylvania. Source: “History of the Computer” in Computer Organization and Architecture, by Alan Clements, Cengage Learning Engineering, 2014. http://alanclements.org/COA_Students__ComputerHistoryOverview_V2.3.pdf
- Organic chemistry is the branch of chemistry concerned with carbon compounds, both those emanating from living organisms and those that are man-made. Prior to 1828, scientists believed that carbon compounds could only come from living organisms and were distinguished by a life force within their chemical makeup, but this was disproved by Friedrich Wöhler when he synthesized urea (a known component of organic material) from inorganic ammonium cyanate.
- See “What is Biochemistry,” http://www.biochemistry.org/Education/BecomingaBiochemist/Whatisbiochemistry.aspx
- The original study, which is still commonly referenced today, was “General Intelligence, Objectively Determined and Measured” by C. Spearman, American Journal of Psychology, Vol. 15, No. 2 (April 1904), https://www.jstor.org/stable/1412107?seq=1#page_scan_tab_contents
- “Learning Through Many Kinds of Intelligence,” by Dee Dickinson, Johns Hopkins School of Education, 1996, 2016. http://education.jhu.edu/PD/newhorizons/strategies/topics/mi/dickinson_mi.html. For a recent overview of current academic debate by an attorney and educator, see “Multiple Intelligences: Fact or Fiction” by Heidi Getchell Bastien, published on LinkedIn, May 14, 2017, at https://www.linkedin.com/pulse/multiple-intelligence-theory-fact-fiction-heidi-getchell-bastien
- Ibid.

Joseph Byrum is an accomplished executive leader, innovator, and cross-domain strategist with a proven track record of success across multiple industries. With a diverse background spanning biotech, finance, and data science, he has earned over 50 patents that have collectively generated more than $1 billion in revenue. Dr. Byrum’s groundbreaking contributions have been recognized with prestigious honors, including the INFORMS Franz Edelman Prize and the ANA Genius Award. His vision of the “intelligent enterprise” blends his scientific expertise with business acumen to help Fortune 500 companies transform their operations through his signature approach: “Unlearn, Transform, Reinvent.” Dr. Byrum earned a PhD in genetics from Iowa State University and an MBA from the Stephen M. Ross School of Business, University of Michigan.