A View With A View

Goodnight Alexa

One morning in late November, I was standing in an empty 11/12s classroom waiting for the students to return from a Buddhist monastery. In the corner of the meeting area were two pieces of bright construction paper covered in sticky notes. One group of notes was labeled “Our Googleable questions about religion,” and the other, “Our juicy questions about religion.” The students’ Googleable questions—such as “What are the biggest religious groups?”—had definite answers, while the juicy questions—including “Why do people believe in God?”—prompted more questioning and were far from easy to resolve, despite the more than 865,000,000 results that appear instantly when the question is typed into a search bar.

Seeing the students’ questions organized by the role that Google could play in their resolution brought me back to the many parent workshops I’ve hosted to help families take the lead in defining their children’s relationship with technology. In these discussions, most questions fall into three categories: 1) those with searchable, research-based answers, such as “How much screen time is appropriate for three-year-olds?”; 2) those that require both research and discussion, such as “At what age should I buy my child a smartphone?”; and 3) those that can at best be probed only with more questions, and that I usually find the most interesting.

An example in the last category that continues to resonate across my personal and professional life came up at a recent meeting for parents of young children. A mother, who had hitherto remained quiet, asked, “My daughter likes to ask Alexa questions; is that okay?” This is a particularly juicy question and, because I don’t know whether it’s okay for her four-year-old to query a virtual home assistant, I began my reply with an anecdote that I’ve often shared among colleagues, but had not yet shared with parents.

A few years ago, I was facilitating a lively discussion with eight- and nine-year-olds about how they use technology at home. One student raised her hand and said, “I’m an only child and my parents are busy, so I like to talk to Alexa.” This matter-of-fact response from a girl who was building a trusting relationship with a computer’s voice so resounded in my head that I forgot my follow-up questions. With all of its implications, it was an admission that deserved to be pondered.

Returning to the parent meeting and the mother’s query, I quickly provided more context for the anecdote, asking participants to think about the kinds of questions they thought were appropriate for a child to ask a computer. “What’s the weather like today?”—fine; “sing me a song”—maybe; “tell me a bedtime story”—no. Bedtime stories, whether read by the child or a caregiver, or played as an audiobook, are selected for the listener. Alexa’s selections are not necessarily appropriate, and they are read in a vacant, quasi-human voice. If a child has a question about the story, there is no one there to contextualize its messages.

We continued probing what relationships with Alexa-like systems might look like in the future, how parents can teach their children which questions are okay to ask a computer, and which answers must come from a trusted adult. While we’ve been typing even our most sensitive questions into Google for almost two decades, there is a real difference between a computer that requires a user to physically engage with it and an omnipresent, voice-activated assistant that passively waits, listening for its name.

Even my four-year-old son, who lives in a TV-free home and has minimal screen time, has learned in the last few weeks that he can take control of my phone by shouting, “Okay, Google.” When that failed on my wife’s phone, he found that asking Siri was the key. He has continued to test this new knowledge, giving verbal commands to laptops, the fridge, and even our 2004 Toyota hatchback. And while most of the devices didn’t respond, pretty soon they all will.

So, the juicy question is: what responsibility do we as parents and educators have to children who will live most of their lives fluidly exchanging thoughts with computer systems whose mechanics, and whose makers’ motivations, remain obscure? Unfortunately, this is not a rhetorical question, but we do have a model for inquiry that can help students critically evaluate the technology in their lives.

In the Bank Street curriculum, balance is maintained among concrete skills, experimentation, and reflections on social justice. Equipped with this model, we know that students must be allowed to explore technology with teachers who can provide context and technical guidance. Students also need the time and space to work with these tools: to develop their own understanding, to challenge peers, and to create physical (or digital) artifacts that demonstrate their learning. But to cultivate real understanding, we need to help students step away from the machines and develop strategies for prioritizing their own values. Whether it is a family discussing when and why their child should have a smartphone, or a teacher prompting a class to weigh the relative merits of a Google search, students must eventually gain the capacity to keep determining for themselves which answers from an algorithm they can trust and accept.


Please feel free to reach out directly to Charlie to talk about technology at Bank Street or at home. He can be found in 410B and at cvergara@bankstreet.edu.

Charles Vergara is the Technology Coordinator at the Bank Street School for Children. While Charlie started in this role in 2014, this was not his first tenure at the school. He graduated from the School for Children in 1995 and says, “As an SFC alumnus, it’s exciting to give back to a school that did so much for me as a child.”

Bronx High School of Science
University of Wisconsin, BA, English
Teachers College, MA, Math Science Technology and Communication
Klingenstein Center at Teachers College, Columbia University, EdM, Independent School Leadership