Cepeda: Is technology unintentionally making us ignorant?

By Esther Cepeda
Published: November 6, 2014, 12:00am

A decade ago, when I was a graduate student in teacher training, I heard a frightening thing from an accomplished and generally excellent professor.

She said that in “the future,” teachers would no longer have to toil with the difficult and repetitive work of instilling boring facts into students’ heads. With the advent of the “everywhere-Internet,” “smart classrooms,” and the spread of smartphones, dry pieces of information such as state capitals, historical dates and mathematical theorems would be relics of the past — just as calculators and spellcheck had obviated memorizing the multiplication tables or learning how to use a dictionary.

The jaws that dropped in shock and outrage belonged to the very few students who understood why calculators and word-processing spellcheck were a pitiable substitute for knowledge.

This memory returned to me as I read Nicholas Carr’s “The Glass Cage: Automation and Us,” a thoughtful and terrifying new book on the benefits and costs of putting our faith in an increasingly automated everything.

Through a deep inquiry into some of the realms in which automation has most impacted our personal safety — our car’s GPS navigation device, airplane autopilot systems, doctors’ use of electronic medical records — Carr illustrates that “mounting evidence of an erosion of skills, a dulling of perceptions and a slowing of reactions should give us all pause.”

For years, I’ve been bemoaning the search engine as a “teacher’s little helper.” It, in fact, serves as students’ favorite and most effective work-avoidance system. Why read a book, go to the library or even speak to an expert when you can just Google it? Yet there is no surer way to get yourself labeled a Luddite than to suggest that search-engine shortcuts may not be the best way to prepare whole generations of students for the knowledge economy.

At a journalism convention in 2011, I had the pleasure of dining with Amit Singhal, a Google Fellow serving as the team leader of search algorithms. He addressed a group of editors on the topic of the then-new Google Authors program, but I was more interested in the recently launched search-suggestion functionality.

I asked Singhal: What will it do to young people who don’t yet have a solid grounding in spelling, math and writing when Google serves them up answers and data even though they can’t input the correct search terms, because the algorithm corrects for their lack of knowledge? Singhal replied that there was no evidence of any related negative effects.

Questions become lazy

This is why my favorite part of “The Glass Cage” is the passage in which Carr notes that Google itself now admits its search algorithms are making us stupid. “Google acknowledges that it has even seen a dumbing-down effect among the general public as it has made its search engine more responsive and solicitous, better able to predict what people are looking for,” Carr writes. He then cites an interview in the London Observer in which Singhal said Google has not enabled people to become better searchers. “Actually, it works the other way,” said Singhal. “The more accurate the machine gets, the lazier the questions become.”

“The Glass Cage” is not an anti-technology book by any means, but it does pose big, difficult questions. For me, the biggest is: How do we keep technological tools that are meant to save the time and effort of recalling facts and knowledge from instilling ignorance in young people who have yet to truly attain a basis of knowledge?

If our time- and effort-saving tools are, to paraphrase philosopher Alfred North Whitehead, best used to put our mental power to higher-level reasoning, analysis and contemplation, how do we harness such tools in a way that will reinforce foundational knowledge, rather than supplant it?

As we transform ourselves into “creatures of the screen,” Carr asks, the existential question is: “Does our essence still lie in what we know, or are we content to be defined by what we want?”

Just look around — our society plainly looks more motivated by wanting than by knowing.