
Breaking up with Facebook: Untethering from the Ideological Freight of Online Surveillance

Published on June 24, 2014 / Written by Estee Beck / Reviewed by Chris Friend and Valerie Robin / Image: “Don't you think you took too many pictures today?” by Tambako The Jaguar; CC BY-NC-ND 2.0

During the summer of 2013, I scratched my seven-year itch. I broke up with Facebook.

I first met the social media space in 2006, when the platform was in many ways an energetic youth bent on embracing all of life’s pleasures. Our relationship, like so many others, was full of vibrancy in the beginning. We shared our hopes and dreams, pictures of silly and significant events, and even shared music to chronicle our strengths and shadows. As our partnership matured, and Facebook began trying on new looks like the newsfeed, different privacy options, and even chat and mail, the once dazzling space dulled. We used to communicate so clearly, uninhibited by outside influence. That quickly changed. Almost overnight, Facebook began acting out, hiding important posts on the newsfeed and only showing what Facebook thought was most relevant or important. I began distancing myself in 2010. I could feel the shift in the site’s behaviors and habits, and I wasn’t sure then what that change meant for me or our relationship. In the final few years, I still occasionally flirted with Facebook, but I kept a distance; and in 2013, I finally left Facebook for good.

The shift away from Facebook occurred for a couple of reasons; these motives helped me through the transition. Edward Snowden’s revelations, reported by journalist Glenn Greenwald, about the National Security Agency’s deep involvement in collecting data on citizens both at home and abroad piqued my interest in public debates on privacy and surveillance in Internet and mobile app technology. The NSA collected Internet and telephone communications data, including data from Facebook, under the PRISM program, a directive born of a post-9/11 culture of assessing and preventing threats to the American public.

Mostly, though, the addition of Facebook’s Graph Search — a super search engine based on phrases — confirmed how much data the social media company held about its users, and demonstrated Facebook’s willingness to share that data with just about anyone who wanted access. This feature became my tipping point for leaving Facebook, and it began my journey of talking with my colleagues and students about online surveillance.

On Facebook, when we engage in any activity, our clicks are recorded, our newsfeed posts archived, and our digital lives monitored. We may sense this happening because we see personalized advertisements alerting us to products we viewed on other sites. However, underneath the interface lie computer algorithms that filter the content we end up seeing on our newsfeed. Eli Pariser, who has an excellent TED Talk and book on what he calls the “filter bubble,” chronicles what we experience in some online spaces. Tracking technologies mine our browsers for data; computer algorithms take the data, run calculations, and output personalized information for the user.

While tracking technologies may seem trivial in light of the benefits (including free access) of the sites that use them, the consequence of filtering content is that information is refracted online and in mobile spaces, which leads to an informational myopia. With Facebook, for example, users no longer see all of the content from their friends — the algorithms filter content based upon proximity between users. This means the more a user comments on or views another person’s posts or page, the more that person’s posts will appear in the user’s newsfeed. As Pariser notes, we each have our own unique filter bubbles, which means our preferences, habits, and identities are categorized by algorithms for targeted advertising and search results. Facebook is not the only company that personalizes, either. Tech giants like Amazon, Google, and Netflix — among many others — use algorithmic surveillance to personalize product offerings, movie recommendations, and even search results.
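The affinity dynamic described above — the more you interact with someone, the more of their posts you see, and the less you see of everyone else — can be sketched in a few lines of code. This is not Facebook’s actual algorithm; it is a deliberately simplified, hypothetical illustration of the filtering principle:

```python
# A toy illustration of affinity-based newsfeed filtering -- NOT Facebook's
# real ranking system, just the dynamic Pariser describes: posts from the
# friends we interact with most crowd out everyone else's.

def rank_feed(posts, interactions):
    """Order posts by the viewer's interaction count with each author.

    posts        -- list of (author, text) tuples
    interactions -- dict mapping author -> number of past clicks/comments
    """
    return sorted(posts, key=lambda p: interactions.get(p[0], 0), reverse=True)

def visible_feed(posts, interactions, limit=2):
    """Show only the top-ranked posts; everything below the cut is filtered out."""
    return rank_feed(posts, interactions)[:limit]

posts = [("Alice", "vacation photos"),
         ("Bob", "local election notice"),
         ("Cara", "concert clip")]
interactions = {"Alice": 40, "Cara": 12, "Bob": 1}

# Bob's post never surfaces, even though the viewer might value it most.
print(visible_feed(posts, interactions))
```

Even in this crude sketch, the myopia is visible: the viewer never learns what Bob posted, and nothing on the screen signals that anything was withheld.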

Ultimately, the discussion of a refracted online experience opens onto a larger conversation about the persuasive abilities of computer algorithms and code. Certainly, it is possible to regard computer technology as a set of tools performing functions. At a deeper level, however, when we [re]act to the personalized content on our screens, we have been influenced by computer algorithms to change our behaviors, beliefs, intentions, or motivations. If algorithms have the ability to filter our content and, through that content, modify our thinking, then we might consider computer algorithms and code persuasive agents in their own right.

If we consider algorithms persuasive agents, then the ways we teach our students (and ourselves) may shift as a result. For example, in the classroom, some of us train our students to work with the tools and content of online, networked spaces. We may compel students to learn how to use tools like Dropbox for document archival and sharing, Google Drive for collaborative exchanges, Facebook for class group discussion, even Tapestry — a mobile app of user-generated click-through stories… and the list goes on. While we find comfort in the affordances of the collaborative web 2.0 to share content, what if we integrated discussions about the persuasive agents that are embedded in these tools? How might we develop a renewed critical digital pedagogy addressing these concerns?

After all, we share these digital tools in the service of preparing students, in the words of researcher Stuart Selber, for the functional (how to use the tool), critical (how to analyze the tool), and rhetorical (how to create with the tool) literate practices necessary for professional and personal communication in the digital age. As the digital seduces us to continue teaching students how to be critical and reflective prosumers of technology (cf. George Ritzer and Nathan Jurgenson, who trace this portmanteau of producer and consumer to the unpaid labor companies extract from online users), we also have to remind ourselves — and our students — that many of these platforms and applications track our movements online and create rich data portraits of who we are from our digital debris. As a result, if we continue to inhabit these online spaces and encourage our students to do so as well, we must also attend to why we should care about online surveillance and persuasive algorithms.

Writing teachers and researchers, for example, have long assumed the need for critical engagement and inquiry in the classroom. For instance, this year marks the 20th anniversary of Cynthia Selfe and Richard Selfe’s “The Politics of the Interface,” an intellectually stimulating examination of the ideological and material power relations embedded in computing technology, which was also the theme of this year’s Computers & Writing conference. Selfe & Selfe’s work helps frame the call educators should heed when addressing online surveillance.

If we, as teachers, ask our students to go online to sites that use tracking technologies, then as educators we bear responsibility for teaching students critically literate practices of analysis, evaluation, explanation, interpretation, and observation in connection with digital surveillance. Why? Quite simply, we are simultaneously shaping and being shaped by the digital technologies and personalized results we see on our screens. Cookies and beacons monitor our clicks and keystrokes for web personalization and tailored recommendations, but we also see only a fraction of the content on the web because of the filter bubble. With personalized advertisements and recommendations, we might ask: what are we not seeing? How are we being categorized? Might this be a form of discriminatory practice?

Each time we engage online, we have to attend to what Cynthia Selfe calls the “ideological freight” embedded in the interface. As teachers, we are poised to engage students with the critical skills necessary to untangle surveillance and tracking technologies as part of instruction. As educators, we’re prepared for these discussions; we already teach students how to evaluate sources online, for example. Why not teach students, as Heidi McKee eloquently notes in her article about net neutrality, how to evaluate privacy policies — and, I add, surveillance — online as well? In many classes, we ask students to assess credentials, bias, publication type, and the ethos of the author and publisher. We can integrate instruction that asks students to assess privacy policies (Stephanie Vie has a forthcoming chapter in Technical Communication and Video Games which explores privacy policies and user agency), cookie use, and personalized advertising. Just by having this information, we can make better choices and decisions about which sites we feel comfortable sharing our data with, or when we want to opt out — even if the decision to do so feels uncomfortable at first.

Because of the tracking mechanisms embedded in the architecture of the Internet and app technologies, educators need a frame for critical awareness of online surveillance that aligns with critical thinking skills. As McKee notes, teachers do not have to reinvent curricula in order to have students engage with the underlying norms of the interface. Instead, educators can integrate these discussions into existing learning outcomes and goals. When showing students a new online technological tool, educators can highlight the privacy policies and the terms of service, and even discuss third-party cookie collection. These critical micro-gestures, incorporated and scaffolded throughout a semester, may also help students learn that evaluation, assessment, and observation occur with each interaction in an online space, and reinforce a habit of engaging critically both in and out of the classroom.

The following are starting points for educating students about online tracking and surveillance that connect with a criticality of analysis, evaluation, explanation, interpretation, and observation; they represent only one path into the discussion.

Analysis: Have students analyze the different types of personalized content they see on Google, Facebook or even Amazon to learn more about the filter bubble and the refracted content they experience online.

Evaluation: Ask students to install and use privacy tracking tools to see what companies track them online and learn ways to block or monitor the trackers if need be. Students may also conduct evaluative research into how companies use data.

Interpretation: Incorporate policy papers and educational materials authored by non-profit organizations to help students understand the larger debates about online surveillance and privacy online.

Explanation: Encourage students to craft arguments about the benefits or constraints of surveillance by having them explain the effects surveillance has upon users.

Observation: Go to marketing and behavioral advertising sites to run scripts that check for browser cookies, to learn which companies observe web habits.
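The observation activity can begin with something as small as a cookie export. The sketch below, offered as a classroom starting point rather than a polished tool, tallies which domains have set cookies in a Netscape-format cookie file (the plain-text format many browser extensions export); the sample data is invented for illustration:

```python
# Count which domains have set cookies in a Netscape-format cookie export.
# The sample below is INVENTED for illustration; in class, students would
# substitute an export of their own browser's cookies.

from collections import Counter

SAMPLE_COOKIES_TXT = """\
# Netscape HTTP Cookie File
.doubleclick.net\tTRUE\t/\tFALSE\t2000000000\tid\tabc123
.facebook.com\tTRUE\t/\tFALSE\t2000000000\tdatr\txyz789
.doubleclick.net\tTRUE\t/\tFALSE\t2000000000\tIDE\tqrs456
example.org\tFALSE\t/\tFALSE\t2000000000\tsession\ttoken
"""

def cookie_domains(cookies_txt):
    """Tally cookies per domain, skipping comment and blank lines."""
    counts = Counter()
    for line in cookies_txt.splitlines():
        if not line or line.startswith("#"):
            continue
        domain = line.split("\t")[0].lstrip(".")  # first field is the domain
        counts[domain] += 1
    return counts

# Which observers appear most often in this (fictional) browsing history?
print(cookie_domains(SAMPLE_COOKIES_TXT).most_common())
```

Seeing an advertising network they never knowingly visited at the top of their own tally tends to make the abstraction of “third-party tracking” concrete for students.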

Surveillance of web habits by so many companies may lead people to opt out of sites like Facebook, just as I did last summer. I realize that leaving sites that rely on an immense amount of tracking technology does not address the core concern: companies collecting vast amounts of prosumer data and possibly selling that data to other organizations. But something has to change. Do we want to live our digital lives being constantly tracked? Do we want our legally tracked digital data sold and possibly used in ways that harm instead of support us? Make no mistake, we are on a precipice: tracking technologies will only increase, especially with Google’s work on the “Internet of Things,” and digital surveillance may become even more pervasive and invasive than it is already. I think educators can play an important role in fostering critical engagement with surveillance technologies, and also in helping others understand the freight certain websites bear.

By developing critical engagement with students, educators may help them (and ourselves) make informed decisions about which sites and trackers to opt into and out of online. We may decide to opt out fully or to block certain tracking technologies. We may even initiate classroom projects where students investigate surveillance and privacy online and/or write to legislators, or even to leaders at tech giants like Google and Facebook, about increasing protections for consumers in spaces that use tracking technologies. It will take a concentrated effort to have our voices heard, and perhaps we can do so in ways that allow us to sustain our relationships with web and mobile companies instead of leaving. We can no longer afford to sit passively by while surveillance technologies harvest our data; there is too much at stake. We can educate our students, who may become future policy leaders, computer programmers, and educators, on the real effects of online surveillance, and hope that both we and they can lead the way to positive change online. With such a concentrated effort, we are in a position to change our relationships — for the better — with companies that use surveillance, mending broken connections by focusing on positive and reciprocal relations for all parties involved. Perhaps one day, Facebook and I will find our common ground and reconnect, sharing our lives once more without the ever-watchful eye of surveillance tracking every movement.


12 Responses
  1. Randy B

    The article listed one downside of surveillance, and that is that you do not see all your friends’ posts because of filtering. Correct me if I am wrong on that; did it list other downsides? Who cares what information is taken about us? If my information about what I like and care about sparks a company to create a product that they know I will like, why is that a bad thing?

    1. Estee Beck

      Hi Randy B, I read your comment with interest, and I want to thank you for taking the time to post. In many ways this article leaves out much scholarship and media coverage about surveillance in digital spaces because of space limitations. To address the second question about who cares, I believe that ultimately sits with each individual and what she or he wants to circulate in digital networks. However, there are several downsides to digital surveillance, and if you haven’t seen the documentary Terms and Conditions May Apply, then I encourage you to watch it. Many of the debates about digital surveillance, both from the media and from the academy, are included in this film. As for the third question: sure, if a company collects information to create a better product that you’d enjoy, is that bad or good? But I also ask you to consider data brokers like Acxiom, which touts that it has over 1,500 unique data points on individuals, revealing rich portraits of habits and activities. The concern that scholars in Surveillance Studies have with the collection of this data arises when it’s paired with financial, medical, educational, and legal information. What will be the effects? Will the pairing be beneficial or harmful?

  2. Tom Readings

    Have you seen Diaspora? A colleague introduced me to it recently and it looks like a really interesting concept; distributed, private social networking.


    I also have reservations about Facebook, but these are partly based on poor usability, frequent interface changes and the seemingly unavoidable drift towards vacuous status updates.

    I wonder if a distributed system like Diaspora, once it made it into the mainstream, might allow people to connect in a greater variety of ways, building up different methods of engagement.

  3. Thanks, Estee, for this very interesting article. I was interested in similarities and differences in our concerns and approaches. I joined FB primarily as an educator (but also have had a bit of fun there) and became increasingly concerned about Facebook’s ever-changing policies and defaults on privacy and its data-sharing. Also startling to me was how little the 18+ students I was teaching thought or cared about privacy or data-sharing issues (this was around 2008–9). With colleagues, I had the opportunity to include some ‘digital literacies’ aspects in a Y1 undergraduate module, ‘Emerging Technologies’ (in a UK university). Because the students were on various programmes in a Business School, we looked at topics and issues from business and personal perspectives, which seemed to help students gain a more rounded view. We never required students to meet us on Facebook, or even to join Facebook, but we found that if we set up small group activities there would always be at least one student in the group on Facebook. For example, asking the group to Google one person’s name with the term Facebook added usually threw up some pictures. Often students who knew the pictures were ‘on Facebook’ were shocked that they were also ‘on Google’. Over the 4 years we ran the module, we were gratified to see the general awareness of privacy issues increase. However, I am not sure how well we got across that technology morphs continually and awareness needs frequent updates ;)
    I just wanted to put in a word for embedding practical literacies and ethics in a module that students see as relevant to their degree subject. I have found (I am now retired) that students tend not to value literacies and ethics when taught in a disconnected fashion.

  4. Kathleen

    As I read it, the other downsides are inferred. I am uncomfortable with a stranger taking my picture on the street, but I walked out of my house, right? My discomfort is based in a lack of knowledge or control of how that picture will be used, archived, or shared. The concerns being raised here are familiar concerns in conversations about privacy v. public safety (if I walk into a bank with a black beanie, an overcoat, and my hands deep in my pockets, should the security guard stop me?), but sped up because of the speed of development of technologies for gathering, analyzing, and distributing that data. Joining these social networks/downloading the programs/visiting the websites is voluntary, as is opting out of them. Incorporating a discussion about the background activities of the networks/programs/websites allows students to make a more fully informed choice of what to participate in.

  5. Did seem a bit odd to focus mainly on the issue of filtering content. According to the hype, the tech is supposed to be radical. Perhaps a closer observation of what happens would reveal that it actually functions to foster a conservative attitude, partly because of the surveillance and the awareness that everything in the new digital public arena is being logged and may be used in evidence against you in the future.

    Perhaps classroom activities taking this as their focus could include a look at Bentham’s Panopticon, and discuss how the internet might function as the digital Panopticon, helping to keep everyone in line.

  6. I teach people, primarily people 50+, how to use social media through introductory-level two-hour workshops. The number one concern of the majority of my students is privacy. Many of my students use a computer infrequently, have little to no online presence, and only want a Facebook account to see photos of their grandchildren. In short, they are the complete opposite of an 18- or 19-year-old who has lived their life online. These are ordinary people, many of whom still believe that if you post a status update on Facebook about your pending vacation, your house is sure to be robbed while you are away. Is this ‘cautious’ generation the ideal? If the worm turns, and everyone decides to reduce their use of the internet, or takes precautions to guard their privacy, will the problem be solved… or is the barn already empty? (The horse has galloped down the hill.) I don’t want to sugar-coat the issue of online privacy, but for the reasons listed above, I don’t believe that terrifying people into never using Facebook is reasonable either.

