Saturday, April 8, 2023

What happens when a computer thinks for us? Chooses for us? Has intentions?

Stephen Hawking.

By George Templeton

Gazette Blog Columnist

Computational Mindfulness

Stephen Hawking pondered, "Scientists built an intelligent computer. The first question they asked it was, 'Is there a God?' The computer replied, 'There is now.'"

Are we continuously improving?  The Wynonna Judd song, I Just Drove By, captures it.  “The life we had might have been, but it was one we loved believing in…  In a world that's seen its better days, it's good to know some things remain the same, though standing still is not time's way…”

Is the glass half full or half empty?  As Dr. Wayne Dyer explained, you’ll see it when you believe it!  Feelings determine our choices.  We pick the thread corresponding to our perception, but the results we carry into the future are neither path-independent nor reversible.

What happens when a computer chooses for us?  Will our culture become less responsible and more gullible?  Will it reduce human interaction, get to “know” us, and engage in predatory marketing to target our narrow interests?  We already know that social media has largely replaced factual journalism with flights of fantasy and pure conjecture that capture our attention.  A distrusting society that is highly interdependent and tightly connected will not surmount the challenges necessary for survival.

You can subtract today from yesterday to get a single number.  It is just a point.  The percent change involves two measurements of similar things.  You divide the change by where you started.  The result involves not only what but also when.  A function defines one thing in terms of another, perhaps implying cause and effect.  The average implies a shape like a bell curve.  If it stays constant, we are confident that there must be reasons.  But statistics are a mathematical abstraction.  They might not be you.
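To make the distinction concrete, here is a minimal sketch in Python, with invented numbers, of the difference between a raw change, a percent change, and an average:

    # Hypothetical measurements -- the numbers are made up for illustration.
    yesterday = 50.0
    today = 47.0
    week = [50.0, 47.0, 49.0, 52.0, 48.0, 51.0, 50.0]

    difference = today - yesterday                           # a single point: -3.0
    percent_change = (today - yesterday) / yesterday * 100   # anchored to where we started: -6.0%
    average = sum(week) / len(week)                          # one number standing in for a whole shape

    print(difference, round(percent_change, 1), round(average, 1))

The percent change carries the “when” because it is measured against the earlier value; the average says nothing about any single day.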

To be human, a computer has to think mathematically.  It has to solve problems, perform derivations, and make proofs.  People have different abilities and interests.  Would the “person-machine” be the same way?  Wouldn’t it make sense to create and use forms of specialized artificial intelligence?  We would risk losing interdisciplinary connections!

You can start with the particular, the bottom, and work out the divergent consequences.  In another approach, you start with everything and try to find something convergent.  Your anxiety arises from the clash between interdependence and the disarray of knowledge.  You don’t think flexibly because you naturally fear anything different or ambiguous.  When you see things changing, you become anxious about the future.  Psychologists claim this impairs creativity and problem-solving.

It’s an interdependent world.  Its size measures complexity.  Organized complexity discriminates between a rock and an egg, between inanimate death and the trajectory of life, between the Dewey Decimal System and the Thread, between classification and just happenings.  Rapid instability is a sign of life.  Change is a series of happenings.  It is time.  Complexity is a sign of life.  But aren’t we more than complexity?

A computer never dies.  It can keep adding to its information base.  If a computer memorized the periodic table and everything written about chemistry, would it be a chemist?  If it memorized the phone book, would it know those people?  Information is not knowledge.  Isn’t understanding more than accuracy?

We know the power of compounding and the pain of inflation.  When we act abruptly, it comes from our feelings.  When we delay, we are waiting to “realize” things.  Our emotions and feelings exist even when there are no concrete things.  But love, hate, empathy, and humor never exist in themselves.

Researchers at Zhejiang University studied whether brief exposure to Buddhist practice would help.  Chanting "Namo Amitabha" and placing your hands together in front of your chest is all that it takes.  They say that delaying gratification for future rewards is more important for success than intelligence or creativity.

The ancient Greek philosopher Socrates thought that truth was in our souls.  Our immortal soul determines our proclivities.  Emotions differ depending on geography and the environment that a culture lives within.  They are just a reflection of engagements with that world.  For example, in Tahiti anger is viewed as evil and a mental illness, while in America anger can be righteous, as evidenced by the feminist “Me Too” movement.  The Utku Eskimos don’t get angry.  They have to survive in the brutal Arctic cold.  They have replaced the emotion of anger with resignation.  There is nothing they can do about it.  In the small, densely populated country of Japan, people have learned that they have to depend on and get along with one another.  People are entangled, so they can’t stand out.  But in America, the ethos has been one of self-reliance, independence, and autonomy.

Can a machine have emotions?  They stem from engagements in the world.  Computers can infer your feelings.  Can a machine have intentions?  Can it laugh and cry?  What then?

We think with words and sentences.  We construct our thoughts intuitively.  Children do not learn to speak from a book.  They seem to have a natural ability to learn how to talk.  We have to be careful to define our pronouns, to prevent confusion.  We have to be rational to make our experiences, our information, into knowledge.  Thought is required to develop language, but language is how we think.

Our minds require structure.  We have broad plans, goals, and dreams, not just the immediacy of narrow threads.  We remember in categories, in folders.  Those folders contain our threads.

Could technology change how our minds work?  Let’s make your computer’s operating system more like a search engine.  You put everything into a single pot.  There would be no folders, only threads.  But they can tangle like a snagged fishing line.
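A minimal sketch of that idea in Python (the note texts and the search helper are invented for illustration): everything goes into one flat pot, and retrieval becomes a search rather than a walk through folders.

    # One flat "pot" of notes instead of a folder tree -- the contents are hypothetical.
    pot = [
        "grocery list: eggs, bread, coffee",
        "notes on Stephen Hawking and machine intelligence",
        "chemistry homework: periodic table trends",
    ]

    def find(query):
        """Pull a thread: return every entry that mentions the query."""
        return [entry for entry in pot if query.lower() in entry.lower()]

    print(find("chemistry"))  # no decision about where to file anything was ever needed

The convenience is real, but so is the tangle: a vague query pulls on every thread at once.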

Artificial intelligence (AI) and unlimited computer power make this possible.  Novelty favors it.  You want a summary.  You would not have to decide where to put things because there would be only one place.  This would be a system that operates like the six degrees of Kevin Bacon.  The user would only be six handshakes away from a Tuvan throat singer!

Connections scale with network size.  Ultimately everything is connected.  It is the opposite of what we expect: even very large networks require surprisingly few handshakes!
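A rough back-of-the-envelope sketch in Python (the random-mixing assumption and the acquaintance count are illustrative, not measured data): in a well-mixed network the number of handshakes grows only with the logarithm of the population, which is why even billions of people sit just a few steps apart.

    import math

    def handshakes(people, acquaintances_each=150):
        """Approximate degrees of separation under idealized random mixing."""
        return math.log(people) / math.log(acquaintances_each)

    for n in (1_000, 1_000_000, 8_000_000_000):
        print(f"{n:>13,} people -> about {handshakes(n):.1f} handshakes")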

AI could know everything except the future.  Knowing everything is theoretically possible but not practical.  But our oracle collects information along its path.  The tyranny of the majority makes it biased.

Some psychologists and philosophers think that there is no such thing as morality even though there are more than a dozen theories having “truth value”.  We think that normative ethics, character, and virtue are proof of what is right or wrong, but they have exceptions to their rules.  Artificial intelligence derives its truth from humans who pick and choose.  If people lie, misrepresent, and deceive can we create a machine that will not do the same?  Do we want to create a mechanical human being, or solve the problem?

We think of history as a thread.  We could use our story to unpack it, but we might not get what was important to us.  It might be more or less than we wanted.  We would have to think about how we might make our search particular, and we would need to remember how we posed our question in case we wanted to find that again. 

If you did not write it, will you understand it?  Have you internalized it?  We risk becoming accustomed to the computer that will read and write for us.  Already we have created an artificial intelligence that we cannot understand, and it cannot explain itself to us.  Don’t we need machine intelligence instead of artificial humans?

Some people take the stairs instead of the elevator, thinking that exercise is good for them.  The electric scooter has replaced the bicycle used by 1950s youth.  Physical exercise is hard work.  You have to do it to become strong.  How about mental exercise and the drudgery of mental discipline?  When artificial intelligence thinks for us and when research is nothing more than entertainment, our brains will atrophy.   

Our bucket, full of threads, is dynamic.  Things change in the future as they have in the past.  There is a bit of insecurity in this.  It requires thinking about thought.  Remember, we are building a “paperless society” that does not appreciate filing cabinets and personal computer folders.

You don’t need a private bucket for your threads.  The public bucket in the clouds contains more than yours.  It is easier for everyone to use it.  Will we all think the same?  Will we become less human?  We won't need to visit the library anymore.  The digital oracle in the clouds will do it for us.

What is a thread?  Is it a connotative word, a cause and effect, a history, or a function of defined variables?  Is it factual?  Without facts we are adrift in the sea, blown by its winds and moved by its currents.

Some libraries simply file everything according to the purchase date.  This is not by copyright or subject.  It is not the Dewey Decimal System.  You don’t have to understand that.  But large libraries organize things by subject, purpose, and type of media, by ISBN and LCCN.  The library has another pointer: location.  The problem is that they have to decide where to put something.  They have to classify it.

To be a programmer requires something more than just knowing the language you are writing in.  When you put all your code into one bucket, it is spaghetti programming.  It lacks structure and is hard to understand, especially later, when you have forgotten why you did it that way.

Structured programs use a core which calls subroutines.  Arguments are passed to them so that an “orange becomes a fruit”.  The subroutine performs a task such as sorting or counting.  Thus, the programmer organizes his intelligence.
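A minimal illustration in Python of what that paragraph describes (the category table and the function names are invented): a small core that only coordinates, and subroutines that take arguments so that an "orange becomes a fruit."

    # Hypothetical lookup table -- the "folder" an argument lands in.
    CATEGORIES = {"orange": "fruit", "carrot": "vegetable", "granite": "rock"}

    def classify(item):
        """Subroutine: turn a particular thing into its category."""
        return CATEGORIES.get(item, "unknown")

    def count(items):
        """Subroutine: one narrow, reusable task."""
        return len(items)

    def main():
        """The core: it organizes the work; the subroutines perform it."""
        things = ["orange", "granite", "carrot"]
        for thing in sorted(things):   # sorting, counting, classifying stay separate
            print(thing, "->", classify(thing))
        print("total:", count(things))

    main()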

Consciousness, be it the programmer or computer, suffers from the fact that the truth is incomplete.  We fill in its gaps as we learn from culture and personal experience.  It takes time.

Our intelligent consciousness unites the thread and the folder.  We cannot take reality separately from ourselves.

Water dowsing is successful at identifying where to drill a well.  There is an interaction between a forked stick held in balanced equilibrium and its user.  When there is buried water below, the stick points at its location.  The stick knows, but its user supplies the force to make it move.  Its user has personal experiences and learned subconscious reflexes which are partly the reasons for what and who he is.  Will computers be the same way?

In nature, all things are imprecise, inaccurate, and subject to unavoidable tolerances.  Measurements have precision, error, truth, and confidence.  You could be wrong!  Isn’t quantification better than just feelings?  How do you feel about trying to measure personal feelings?
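One small illustration in Python, with invented readings, of what it means to report a measurement with its error and confidence attached rather than as a bare number:

    import statistics

    # Hypothetical repeated readings of the same quantity.
    readings = [9.78, 9.82, 9.80, 9.85, 9.79]

    mean = statistics.mean(readings)
    spread = statistics.stdev(readings)        # precision: how much the readings scatter
    std_error = spread / len(readings) ** 0.5  # confidence in the mean improves with more readings

    print(f"{mean:.2f} +/- {std_error:.2f}")   # a value is honest only with its tolerance attached

The number alone could be wrong; the tolerance says how wrong it is likely to be.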

Much of the world has been built using the method of “dead reckoning”.  Don't we have answers to questions that are hidden from our conscious minds?  There is no reason why an intelligent machine could not operate the dowsing stick.   It would need to replicate human intent and the meaning of the stick’s imbalance.  It would use measured statistical data about the divining rod's imbalance and movement while held in the machine’s grasp.  Would you drill where it said you should?  Why or why not?

Mr. Bean (Rowan Atkinson) is a famous actor and comedian who communicates through body language and facial expression.  The famous trumpet player Miles Davis communicated by his phrasing.  What he did not play was more important than what he did play.  My teacher was a mountain.  He gave me needed hand-holds. 

Do you suppose that AI will be able to communicate using body language?  Written music alone does not convey its feeling.  Will AI be able to identify and use metaphors? 

When we rely on artificial intelligence the distinction between it and us goes away.  Joseph Campbell explained that reason puts mankind in touch with God, not special revelation.  How about the thinking computer?  It will become more like us, but will we become more like it?

 
Miles Davis.

