
June 23, 2022

Over the past twenty years, and with accelerating speed in the past ten, computer algorithms have come to determine what information Americans receive and even to dictate how they should respond to it. There is an ancient method of analyzing information that allows people to see and understand all sides of an issue. It’s practiced in Yeshivas (traditional Jewish schools that focus on Biblical and religious texts) but can be applied anywhere. American cultural and political discourse would improve greatly if it were added to American education.


I recommend the Netflix documentary The Social Dilemma to everyone I meet—friends, relatives, shopkeepers, and even strangers. Although understated in tone, the film is hard-hitting as it exposes the business model that companies such as Facebook, Google, Twitter, and Instagram use to shape our thinking. The documentary’s conclusion is that computer algorithms have learned—on their own—how to reward humans so that they pay ever more attention to their devices. The technology companies then sell that accumulated attention to advertisers in an auction-like process. Each person’s increase in attention is worth only a few cents, but those cents add up, making these giants the richest companies that have ever existed.
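
For readers who want to see what an “auction-like process” means in practice, here is a minimal sketch of a second-price auction, the general form online ad sales are publicly known to take; the advertiser names, bid amounts, and function are invented for illustration, not any platform’s actual code.

```python
# Illustrative only: a toy second-price ad auction. The highest bidder wins
# one user's attention but pays roughly the runner-up's bid (in cents).

def run_ad_auction(bids):
    """bids: dict mapping advertiser name -> cents offered for one impression."""
    if len(bids) < 2:
        raise ValueError("a second-price auction needs at least two bidders")
    ranked = sorted(bids.items(), key=lambda pair: pair[1], reverse=True)
    winner = ranked[0][0]
    price_paid = ranked[1][1]  # the winner pays the second-highest bid
    return winner, price_paid

# A few cents per person, repeated billions of times a day:
winner, cents = run_ad_auction({"AdCo": 4, "BrandX": 3, "ShopY": 2})
print(winner, cents)  # AdCo 3
```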

What the algorithms have learned is to feed back to people the same ideas they already feel comfortable with. Anything contradicting their present belief systems is blocked because it would reduce or end their attention and thus lower their monetary worth, both to those who control the search engines and to the advertisers who buy that “attention.” Truth has no bearing on the information the systems provide to each person. Each person gets exactly what he feels at home with. Ideas have value only if they increase attention, never for their content.
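
To make that feedback loop concrete, here is a deliberately simplified sketch of an engagement-maximizing feed ranker; every field, weight, and function name is a hypothetical stand-in for the behavioral predictions a real system would learn, not anyone’s actual code.

```python
# Illustrative only: a toy feed ranker whose objective is attention, not truth.

def belief_overlap(ideas, beliefs):
    """Hypothetical similarity score between a post's ideas and a user's beliefs."""
    ideas, beliefs = set(ideas), set(beliefs)
    return len(ideas & beliefs) / max(len(ideas), 1)

def rank_feed(items, user_profile):
    """Order candidate posts so the most comfortable ones come first."""
    def engagement_score(item):
        # Content the user already agrees with is predicted to hold attention,
        # so predicted viewing time is weighted by agreement. Nothing in this
        # objective rewards accuracy; contradiction simply scores near zero.
        return item["predicted_attention"] * belief_overlap(
            item["ideas"], user_profile["beliefs"]
        )
    return sorted(items, key=engagement_score, reverse=True)
```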

The person being shaped by these algorithms is ensconced in a bubble. From inside it, he can justifiably ask how anyone could disagree with the “truth” he knows so well and has been taught so systematically. People who hold other views, he reasonably believes, are probably liars or delusional, and are either manipulating others or being manipulated themselves.


In short, each person is given his own truth, one that differs from everyone else’s. These technology companies thereby create a source of unresolvable conflict between every person on the planet and every other. If slight variations in ‘truth’ suddenly appear, it is only because the algorithm is trialing new information to determine whether it increases attention or diminishes it. The algorithms have learned that novelty draws attention, so novelty is used to increase a person’s monetary value by exposing him to information he might find ‘interesting.’
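
The “trialing” described above resembles what machine-learning practitioners call exploration: occasionally serving something unfamiliar to measure whether it holds attention. A hedged sketch, with an assumed ten-percent exploration rate and invented item fields:

```python
import random

EXPLORE_RATE = 0.10  # assumed fraction of impressions spent testing novelty

def pick_next_item(comfortable_items, novel_items):
    """Mostly serve agreeable content; occasionally trial something new."""
    if novel_items and random.random() < EXPLORE_RATE:
        # Exploration: show an unfamiliar item and record the attention it earns.
        return random.choice(novel_items)
    # Exploitation: serve whatever is predicted to hold attention longest.
    return max(comfortable_items, key=lambda item: item["predicted_attention"])
```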

If our best and brightest computer programmers permit meaning to be defined away by irresponsible algorithms, they erase the shared perceptions between people. Those subjected to the algorithms will come to believe that the real world is defined by their limited online world.

Image: Yeshiva teachers in Jerusalem at the Etz Chaim Yeshiva, before 1910. Public domain.

However, constant confirmation of one’s worst fears becomes an inevitable and obvious source of anger, depression, and ennui. The multiple forms of social pathology we are experiencing today are at least partially the result of the chronic application of these algorithms to our lives—and it’s worse, of course, for the vulnerable, such as young teenagers and the socially isolated.

There must be a way back from the edge of our self-created oblivion! Could there be a solution to the problem of almost everyone being considered by everyone else to be a useless eater? Is there a real-world response to this addiction?

I suggest here that if everyone applied the Yeshiva system, we could recoup our losses and counter the effects of artificial intelligence programs that tinker with our well-being. In such an environment, hits of dopamine could be used to uncover meaning instead of degrading it.

The Yeshiva system has been in development since Biblical times, when Jethro, a Midianite priest, advised Moses on how to handle disputes between people as the Jews became a nation. It thrives today wherever Jews care about their heritage.