Monday, December 11, 2023

Leisure, Capital, and AI


Hot Tub Thinking


            Here are a few disjointed thoughts.

December 2023.  The Israel/Hamas war has been raging for more than two months.  The Russia/Ukraine war is now nearly two years old and shows no movement toward peace.  Meanwhile, I take time out to sit in my hot tub.

            Here I am, a retired teacher, living in middle-class American luxury: a 2500 square foot house (plus detached garage) on a half-acre in Dundee, Oregon.  The house is warm and dry.  We have internet, television, gas heat, solar panels, air-conditioning in summer, and lots of devices to help us prepare food, wash clothes, and amuse ourselves.  Sarah has a car, I have a pickup, and our granddaughter drives another car (insured by me). 

            Why am I so favored?

            Capital and leisure, that’s why.  I’ve written about this before.  Human beings sometimes produce more basic goods—food, clothing, shelter, security—than they need.  Civilizations arise when people use this “excess” wealth to free some of us from the production of basic goods.  When excess wealth is “saved,” it allows the favored few to invent and produce other goods, including art (of many sorts), religion, philosophy, and science.  Saved wealth becomes “capital” when it is used to increase the productivity of workers.  Over many centuries, capital has so greatly increased human productivity that Earth now supports a population over eight billion, of which perhaps only one billion live in “absolute poverty,” one bad harvest or one bad fishing season from starvation.  Absolute poverty was the condition of most human beings over most of history and pre-history.  But in the 21st century, a large proportion of the world’s population lives in some degree of prosperity.  Billions of people in China, Japan, Europe, and North America reasonably expect to live in clean houses or apartments and enjoy many of the machines—cars, washers, televisions, computers, etc.—that I enjoy.

            I can own a hot tub because of capital and leisure.  But I could not enjoy it if I did not also have security.  I hear sirens as I sit in the hot water.  Something is wrong somewhere: a crime scene, an emergency, injuries, perhaps worse.  Someone, or some number of people, is suffering.  Who knows?  Perhaps it’s more than just one household; maybe a score of people are hurting tonight in our town.

            A score, perhaps.  In a town of a couple thousand.  As I sit in my hot tub, I realize I take local security for granted.  In Gaza and Ukraine, the situation is vastly different.  We see how desperate we are when security disappears.  War robs people of security.  Houses are bombed, children are killed, food supplies are stolen, medical systems are broken, and on and on.

            We need peace, and we pray for political leaders—not just those we approve of, but all of them.  We pray they change their minds.  We pray for political change when leaders refuse to change.  (In the 1980s, I prayed for peaceful change in the Soviet Union.  It’s okay to pray for miracles.)

            Most people think the key to security is power—coercive and, if necessary, violent power.  Over the centuries, capital has increased our power, giving us deadlier weapons.  In recent centuries, and especially in this new 21st century, the application of capital to weaponization has increased tremendously.  We invent new and more powerful weapons.  And now we are turning to artificial intelligence to coordinate our weapons, to give us security.

            ChatGPT was released in late 2022.  Now lots of people are talking about artificial intelligence.  Governments are just beginning to talk about laws to govern AI.

            Science fiction stories warn us that AI could turn on us.  Defense systems could malfunction.  In the older stories, computers gain control of missiles and rain thermonuclear destruction on us.  In newer stories, cyber-attacks take control of information systems; rather than killing us, the machines control what we think.

            AI is made possible by capital, just like my hot tub and other conveniences.  On one hand, AI promises to improve security (while also increasing the supply of basic goods).  On the other hand, AI takes control in order to deliver on its promises.

            I don’t check my word processing program to see how it transforms my keystrokes into essays.  I just use it.  The engineer at the hydropower plant can’t inspect his much-more-complex program; he just uses it.  Smart people used lots of leisure to build these programs, and I assume they built in checks so people could monitor them.  I hope so!  But when we use the programs, we just use them.  Can we design AI to win our security without giving up control?  Or do we just have to trust the program?