The Pleasure—and the Shock—of Recognition in the Era of AI

© Howard Gardner 1.15.2025

In early December 2024, when I first spied the name “Megan Kelly” in my inbox, I mistakenly assumed that it was a note or ad from the conservative political commentator. I was ready to delete, but something about the image on my screen stimulated me to read the note, and then to look at the attachment.

I am glad that I did—as it has given me lots to think about, both positive and disturbing. In fact, Megan is a high school teacher in Roxbury, New Jersey. She teaches a course on theories of child development. As an assignment, she asked her students to create a lesson using the video and AI tools that they had available. Megan generously sent along one of the videos for my possible interest.

Anyway: I clicked the link and, with increasing curiosity, pleasure, and a tad of alarm, I watched a seven-minute YouTube video (link here).

As I’m an early-to-bed type, I don’t know the current hosts of late night television. But having watched Jack Paar, Johnny Carson, David Letterman, and Joan Rivers over the years, I instantly recognized the format: the wild applause for the host (in this case Jimmy Fallon), the staring directly into the audience, and the bellowing introduction of the first guest—in this case, one “Howard Gardner.”

What came on next shocked me! I was wholly unprepared to watch the video Howard Gardner give a seven-minute lesson about major thinkers in the very area of knowledge that I had long ploughed—some very famous (Sigmund Freud, Jean Piaget, John Dewey), others moderately so (Erik Erikson, Maria Montessori, Lev Vygotsky), and some known only to those within the field—Albert Bandura, Lawrence Kohlberg,… and me.

The simulacrum looked just like me—indeed it must have been taken from a photo or video of me as I was giving a talk; and the script was completely lip-synced. For anyone who had seen a recent photograph of me, but had not heard my voice, or monitored my customary posture and gesticulation, that WAS me!

[Photo: John Dewey]

The frightening thought that gradually dawned on me: Even after I have died, anyone who had not seen me in person or known me via various “genuine” media presentations COULD have, MIGHT have, indeed WOULD have—assumed that it WAS me.

What to write to the teacher and the kids? I sought to walk a fine line between appreciation—indeed awe—and accuracy. So in a first message, I thanked them for their efforts, and suggested only two edits:

  1. Except for John Dewey, who was both a philosopher and a psychologist, the ten individuals introduced were better described as psychologists, and not philosophers.

  2. As someone who has spent over forty years critiquing the idea of a single intelligence, it did not look good—and it did not reflect my self-image—to hear me boldly and baldly declare “I am very intelligent.”

End of my official communication with Megan Kelly and her students.


On further reflection, I have three quite different reactions to this creation by a group of young adolescents:

  1. AWE
    As someone in my ninth decade, I was born before television was available. Once TV became available and affordable, I spent much of my childhood watching TV, including late night television. It never occurred to me that I could have been a guest on late night television. Nor did it ever occur to me that I would be part of the high school curriculum. And it certainly never occurred to me that a group of students could assemble a video segment that looked, sounded, and seemed like me. Spooky!

  2. ANNOYANCE

    As a scholar, I could not hide my reaction to certain points made in the video. In addition to the pedantic points shared with Megan (and perhaps her students), I felt that some of the brief descriptions did not capture what was most salient about the thinker—I wish that the segment on Sigmund Freud had talked about the power of unconscious thinking—or that the description of B. F. Skinner had indicated the ways in which he sought to educate pigeons, rats, and his own children. But perhaps that’s my inevitable scholarly tic.

    Further, each portrait began with the date and location of birth, and ended with the date of death. While this information is suitable for an encyclopedia or textbook, it seems unnecessary and perhaps misleading to broadcast it to an audience. One could quibble about this point—perhaps it seems important to adolescents—and of course, I know that one of these days, as I am the sole personality in the clip who is still alive, an update will presumably include the date and site of my death.

  3. ANXIETY

    As long as I have reflected on such matters, I have valued truth-telling. Not that I have never lied—or come close to lying—but that I have tried to model truth-telling: to encourage it in my family and in the settings where I have taught, to feature it in my writing, and—for over thirty years—to work on the achievement of Good Work and Good Citizenship—the first of whose components is TRUTH.
    But now in the age of the Influencer, and with ready access to media which purport to present verisimilitude, it is becoming increasingly difficult—and may soon not even be possible—to delineate what is true, what is false, what is not known now but could in principle be known, and what is by its very nature not knowable.

In early 2025, we can still determine that the video was made—in good faith and with considerable skill—by Megan Kelly’s students. But I fear that before long, it will not be possible to make such a determination—or it will be so difficult to make that no one will bother.

The line between “Not knowable in principle” and “Knowable but simply not worth the trouble” looms ever larger. I thank Megan Kelly and her students for raising these issues for me—and I hope that, in the period ahead, they and their peers will devote energy to calming my anxiety and re-establishing the desirability of truth.

ACKNOWLEDGEMENT: For comments on an earlier version of this blog, I thank Shinri Furuzawa, Annie Stachura, and Ellen Winner.
