June 23, 2022
Over the past twenty years, and with accelerating speed in the past ten, computer algorithms have come to determine what information Americans receive and even to dictate how they should respond to it. There is an ancient method of analyzing information that allows people to see and understand all sides of an issue. It is practiced in yeshivas (traditional Jewish schools that focus on Biblical and religious texts) but can be applied anywhere. American cultural and political discourse would improve greatly if it were added to American education.
I recommend a Netflix documentary, The Social Dilemma, to everyone I meet—friends, relatives, shopkeepers, and even strangers. Although understated in tone, the film is hard-hitting as it exposes the business model that companies such as Facebook, Google, Twitter, and Instagram use to shape our thinking. Its conclusion is that computer algorithms have learned, on their own, how to reward humans so as to increase the amount of attention they pay to their devices. The technology companies then sell that accumulated attention to advertisers in an auction-like process. Each person's increase in attention is worth only a few cents, but those cents add up, making these giants the richest companies that have ever existed.
What the algorithms have learned is to feed back to people the same ideas with which they already feel comfortable. Anything contradicting their present belief systems is blocked, because it would reduce or end their attention and thus lower their monetary worth both to those who control the search engines and to the advertisers who buy that "attention." Truth has no bearing on the information the systems provide to each person. Each person gets exactly what he feels at home with. Ideas have value only if they increase attention, never for their content.
The person affected by these algorithms is ensconced in a bubble. From inside it, he can justifiably ask himself, "How can anyone disagree with this 'truth' that I know so well and have been taught so systematically?" People who hold other views, he reasonably concludes, are probably liars or delusional, and are either manipulating others or being manipulated themselves.
In short, each person is given his own truth, one that differs from everyone else's. These technology companies thereby create a source of unresolvable conflict between every person on the planet. If slight variations in "truth" suddenly appear, it is only because the algorithm is testing new information to determine whether it increases attention or diminishes it. The algorithms have learned that novelty draws attention, so novelty is used to increase a person's monetary value by exposing him to information he might find "interesting."
If our best and brightest computer programmers permit meaning to be defined away by irresponsible algorithms, they erase shared perceptions between people. Those subjected to the algorithms will come to believe that the real world is defined by their limited online world.
Image: Yeshiva teachers in Jerusalem at the Etz Chaim Yeshiva, before 1910. Public domain.
However, constant confirmation of one’s worst fears becomes an inevitable and obvious source of anger, depression, and ennui. The multiple forms of social pathology we are experiencing today are at least partially the result of the chronic application of these algorithms to our lives—and it’s worse, of course, for the vulnerable, such as young teenagers and social isolates.
There must be a way back from the edge of our self-created oblivion! Could there be a solution to the problem of almost everyone being considered by everyone else to be a useless eater? Is there a real-world response to this addiction?
I suggest here that if everyone applied the Yeshiva system, we could recoup our losses and counter the effects of artificial intelligence programs that tinker with our well-being. In such an environment, hits of dopamine could be used to uncover meaning instead of degrading it.
The Yeshiva system has been in development since Biblical times, when Jethro, a Midianite priest, advised Moses on how to handle conflicts between people as the Jews became a nation. It thrives today everywhere Jews care about their heritage.
While the contents of traditional Yeshiva learning may be of little or no interest to non-Jews, it is the structure of the Yeshiva that I am touting as a potential savior of humanity from its own limitations. By attending Yeshiva, students become responsible for the contents of their own brains instead of depending upon ‘experts’ who do not have their best interests at heart.
Here's how the Yeshiva system could work in any social or academic situation: Walk into a crowded room (you can bring your smartphone for quick consultations); choose a partner with whom to "learn" for the evening; watch something together (e.g., The Social Dilemma); and then discuss it. This initial process is called learning b'khavrutah—learning with a friend or partner—and can go on as long as both members of the dyad wish to continue. As a practical matter, however, this part of the process would probably be limited to 30 minutes.
Next, a few pairs of learners gather to choose someone from their newly enlarged group and elevate that person to the position of “scholar.” Those not chosen take their positions as students in an academy, sitting around the scholar to see what he has to say. The dynamic at this stage is to engage the “scholar” by questioning him about the topic at hand.
Finally, two people are chosen from the group of all scholars. Separately, without the other being present, each presents his understanding of the issue to the entire body of students and scholars. This mechanism would be the equivalent of attending two different schools that have been addressing the same topic. After their presentations, the TED yeshiva is over. Everyone can now mingle or go to dinner.
This process can be repeated daily, weekly, or monthly with another topic using the same method. Absolute conclusions or courses of action are not expected or even desirable but, if the group is lucky, underlying principles for their opinions may emerge for everyone to hold up to the light to examine.
Talmudic discussions have proceeded over the course of more than 2,000 years. It is a common experience in a Talmud study session to find, on the same page, the opinions of people separated by 400 years, reading as though they were speaking directly to one another.
This process of learning and relearning the same issues with changing scholars and students uncovers the nature of those issues and teaches students how to speak about them in a way that makes sense despite their complexity. For the learner, the process builds emotional resilience to conflict, which prevents discussions from ending prematurely, and it allows reason to be inserted into the discussion: the exact opposite of the emotional decision-making so prized today.
Netflix's The Social Dilemma ends on a pessimistic note, without suggesting what to do about the purposeful warping of our brains. Whatever Yeshiva-style learning can do for us, I assert here with certainty that it is the best antidote currently available for what Facebook et al. are subjecting us to in order to make money.