When I discovered a rather nondescript blurb on Craigslist about needing an immediate replacement for a “technology specialist,” I didn’t know exactly what I’d find. Much to my joy, however, I soon found myself working once a week at a private elementary school, tasked with various tech-related responsibilities, including teaching second- through fifth-grade “tech classes.” The tech classes would be 30-45 minutes each, once a week per grade. And that was it: the entirety of my assignment, explained to me in terms of minutes. No context. No examples from previous tech specialists. No curriculum and no grading. Nothing.

In short: I was as free as a teacher could be.

Sean Michael Morris and Jesse Stommel observe that “So often in our discussions of online education and teaching with technology, we jump to a discussion of how or when to use technology without pausing to think about whether or why,” and in a very real way, that is precisely what I was asked to do. The implicit expectation in my newfound task was that I would teach students how to use technology, not why, or even whether they should.

At first, I found myself relying partially on the Common Sense Media open curriculum, and partially on my own expertise with technological skills (photo editing, for example). I was selective, and I modified the lessons; however, the core message remained the same: be safe, respectful, and responsible online. Parents, board members, and fellow teachers appreciate this message. No stakeholder wants to be cornered between young, impressionable minds and the scarier side of the web, as Adam Savage once was.

From a critical perspective, however, I found that this curriculum only raises the really important questions: Who defines online safety? Why is this the normative model of respect? And isn’t responsibility shared in a common space? — the types of questions Cynthia Selfe has been asking for over two decades. These are not, society says, questions for the elementary classroom. Still, I am uncomfortable mandating concrete rules of engagement. Safety, respectfulness, and responsibility are things to be discovered, not taught.

And so it was time, I decided, to start throwing wrenches into the machine.

Once, with 3rd graders, we watched McGruff “The Crime Dog”-sanctioned Internet safety videos (“Faux Paw’s Adventures in the Internet,” among others) to observe kid-friendly unsafe behavior online. At the same time, however, each student had an iPad logged into Backchannel Chat with an anonymous screen name. I asked them to informally chat about the video, make observations about Faux Paw’s behavior, and comment generally on the experience.

Afterward, I had the students guess which screen name matched which classmate and explain why they thought so. Soon enough, 3rd graders were critically analyzing digital rhetoric. One student was identified by the majority of the class due to a choice phrase that individual spoke regularly (“I like pie”). The others, however, remained largely contested, and it became a lesson in how difficult it is to know who you’re really engaging with online. The students, for the most part, couldn’t even recognize one another. I provided them a safe space (Backchannel Chat) and offered some general guidelines (“talk about the Internet safety video”), but for the most part, I let them discover the nuances of safe digital communication on their own.

I recognize that teaching digital safety, respectfulness, and responsibility is important foundational work for students of a digital world. But offering them as rules is not enough. I found, in complicating the subject, that even the youngest students could quickly connect these ideas. Students as young as second graders explore the vast expanses of Minecraft, a creative, digital space that is frequently beyond the scope of their parents’ closest scrutiny (“it’s just a game,” the parent might say). Sure, it’s a game, but one where over 15 million people come together, as they say, “to create wonderful, imaginative things.” But also to destroy other people’s things and steal their treasures. It is a training ground for digital respect and responsibility.

So, while the Common Sense Media curriculum lacked the critical perspective and consciousness-raising I continuously seek in my teaching practices, I found ways of hacking it, remixing it to better fit my classroom. Premade curricula, as I have said before, are impersonal and age quickly. They don’t account for the local audience. And this bothered me: I didn’t want a generation of students growing up to comply with the status quo of the Internet, whatever that is. Instead, today’s students should be making technology play by their rules.

My intention while teaching elementary students about technology was to introduce them to digital culture. Particular tech skills, such as typing, were already covered under the school’s standard curriculum, and I had no desire or reason to focus my energies there. Instead, I wanted these students to practice thinking about and interacting with a computerized world. Most importantly, I wanted to emphasize creativity and critical engagement with the programs we encountered in class so that when they discovered new technologies, they could examine those as well. I wanted them to experiment, question, and — yes — even grow a little frustrated with the technology.

Rheingold reminds us that “one of the things that makes technology dangerous is the way people forget where tools come from, and what they were designed to do.” I wanted the students to know what was behind the digital curtain. I wasn’t content with teaching on the surface of today’s technology. I wanted the students to dig in deep. So I switched the curriculum. I began teaching code. I stopped making “plans,” in any traditional sense of the word. I prepared, of course, but what I prepared for was different from leading a formal lesson. Besides, now that the students had a grasp of safe, responsible, and respectful digital ethics, why not let them play a little? I soon found myself in critical dialogue about the origins of specific digital tools, and I was having that conversation with elementary students.

Each day, particularly with 2nd grade, I presented my classes with what I called a “challenge” (as opposed to an assignment). I call this a pedagogy of discovery. I would tell them what I wanted (a digital pet that is programmed to do three different tricks, let’s say), but not how to complete the task. I would introduce them to a coding program (Hopscotch for the youngest ones; Tynker and Scratch for the older ones). Then, I would have them play. Most of the time, they failed, at least at first. And here’s where I’ve observed something really interesting: in a moment of failure, a student doesn’t want to know what happened; they want to know why. Why did my alien disappear? Why did it draw this design? Or my personal favorite: Why did my program crash? Thus, the critical question Morris and Stommel ask becomes realized in the moment of failure. The discovery emerges from uncertainty, and learning occurs in taking the risk to try again, this time equipped with a deeper understanding of the program and the problem.
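For readers who haven’t used block-coding environments like Hopscotch or Scratch, a rough text-based analog of the digital-pet challenge might look like the following Python sketch. The `Pet` class and its three tricks are my own illustration of the kind of structure the challenge asks for, not anything the students actually wrote:

```python
# A hypothetical text-based analog of the block-coding "digital pet" challenge.
# In Hopscotch or Scratch, each "trick" would be a stack of blocks triggered
# by a tap or keypress; here, each trick is simply a method on a Pet class.

class Pet:
    def __init__(self, name):
        self.name = name
        self.log = []  # record of tricks performed, in order

    def spin(self):
        self.log.append(f"{self.name} spins in a circle")

    def flip(self):
        self.log.append(f"{self.name} does a backflip")

    def speak(self):
        self.log.append(f"{self.name} says: I like pie")

# The challenge: a pet programmed to do three different tricks.
pet = Pet("Alien")
for trick in (pet.spin, pet.flip, pet.speak):
    trick()
print("\n".join(pet.log))
```

The point of the challenge is not the code itself but the structure it forces students to discover: each trick must be defined before it can be performed, and a misnamed or missing trick produces exactly the kind of failure (“Why did my program crash?”) that prompts the question why.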

This pedagogy of discovery is based in the reaffirmed notion of “interest” as brought forward by Dewey in Democracy and Education. As a proponent of progressive, democratic education, Dewey advocated for a pedagogy that would connect students to their disciplines in a more authentic way. Accordingly, he argues that while some educators may stigmatize designing for student interest as a “soft pedagogy” (today, sometimes condescendingly called “edutainment”), we can, in fact, have a more critical understanding of the term. To involve “interest,” Dewey argues, “is to discover objects and modes of action, which are connected with present powers”; further, it is a force that “connects two things, otherwise distant” (127).

For example, the abstract notion of “the maker” or software engineer who designed the program is of no inherent concern to an elementary student. Neither is the distant code hidden deep below the program’s interface. Yet to leave these details out is to disengage the student’s actions from the very source that gives the machine its power. Why is it, for example, that when you click an icon the interface reacts in such a way? What forces are at play that cause the machine to react to a particular student action in that way? It was my aim to establish these connections between the user and the machine. Because it isn’t until you recognize the connection between two things (yourself and the machine, in this case) that you can become truly interested in it.

So, when given an empty stage, literally codeless and dysfunctional, the students became aware of their present power (in discovering the powerlessness of technology on its own). To code is to create, and without code, no digital object functions. The students are therefore the makers of their own meaningful digital object, deeply interested in the outcome of their actions. To code poorly is to grow repeatedly frustrated. To code well is to control the output of the machine, which is a powerful feeling. In being so explicitly connected to the outcomes of the computer program, I found, the students were authentically interested in discovering the modes of action — the logic and structure of code — to build functional programs.

Block coding programs like Hopscotch and Scratch are well designed to show the connections between otherwise distant objects. By contrast, to begin a student’s technology education with typing inside already expertly coded programs such as Microsoft Word is to make those connections invisible. Why text appears on the screen when you tap the keyboard is never addressed. The class’s concern, then, is a matter of what and how. Here, the computer and the program become black boxes that spit out magically generated text on the screen. I do not wish to dismiss the importance of teaching basic digital literacy skills, which are crucial in today’s digital world. Instead, I am advocating for a technology classroom that also makes space and time to consider the question why.

Further, I encourage more transparent teaching methods: let the students discover the limits of a program, press buttons left unmentioned by the teacher, and make mistakes. Most importantly, when the students make these mistakes (more appropriately called discoveries), we should avoid saying, “you did this wrong,” or “that is not how it is done.” Let them wonder why. Let them know that digital tools are — indeed — tools; they have been made and can be broken. Remind your students that new ones will take their place, and that they could be the ones making them.

Morris and Stommel remind us that “Luddism has roots in a powerful kind of human agency,” and this is what I designed for in my lessons. I am aware that there are many ways of teaching technology. Some, like the rote, hands-covered-by-cardboard typing practice I experienced in secondary school, teach us only to respond to the machine. Others can teach instead that machines respond to us as well; that we are connected. Together, as the Neo-Luddites were encouraged to do, the students and I discovered the “increasingly bizarre and frightening technologies of the Computer Age.” Yet we didn’t take this statement as a forewarning; we took it as an invitation. It’s precisely because technology can do so much good and bad that we must be willing to critique the programs we encounter; as human agents, we can consider alternatives.

Near the end of the school year, a 2nd grade student of mine uploaded an emoji wearing a dress that was supposed to represent me. The feat came shortly after an introduction to open source culture, wherein I described how crowdsourced databases like the one available in Hopscotch come to be. Without any prompting, that same student began a classwide campaign to promote and ultimately upvote the program. In this moment, I knew that she understood the system: to be visible online, especially in a crowdsourced environment, you need a critical mass of dedicated support. She knew that the code didn’t have to be difficult to be effective, but rather that it must be appropriate for her audience — which it was. In short, this student developed, debugged, uploaded, and promoted a unique, creative idea. The student not only navigated the interface but understood the driving factors behind the program and the ethos of the digital community. Sure, this is a silly example, beginning as it does with an emoji in a dress. At the same time, however, it suggests a rather sophisticated understanding of digital culture, an understanding that emerged out of discovery.

My students approached technology with the reflective curiosity advocated for in Morris and Stommel’s article. It’s true that technology reflects the ideology of the makers, and there are, indeed, makers who exploit the system and its users. Still, there are others working very hard to promote more liberatory aims. We live in a society where 6th graders can form companies and program educational games. So don’t be afraid to throw the occasional wrench, or teach others how to throw the same. Just be sure, when you’re passing along the information, that you include the why.