Critical Digital Practices is a self-paced course designed to introduce you to some fundamental skills and concepts in computing. If “you” aren’t someone in pursuit of computing skills and concepts but rather someone looking for materials you can use to teach those skills and concepts to others, you can think of Critical Digital Practices as an open access teaching resource. In that case, you should feel free to copy, adapt, and remix the content you find here according to your needs—for example, by incorporating one or more modules or parts of modules into a course you’re offering through your institution’s learning management system—in accordance with the site’s Creative Commons License. You can fork or clone the course on GitHub or just copy, paste, and edit. Just be sure, please, to provide appropriate attribution.
All that said, from here on out, “you” refers to learners, not teachers.
“But lo! men have become the tools of their tools.” —Henry David Thoreau, Walden, 1854
You’re likely familiar with the controversy raging at the present moment over something called “critical race theory.” Although certain politicians have incorrectly slapped the “CRT” label on just about any mention of historical or current racism in American law, politics, and society—whether or not the mentioner has the slightest familiarity with the body of thought to which the label actually refers—these politicians are at least right to regard critical race theory itself as an effort to understand how racism works at the level of systems and institutions: to understand it, that is, not merely as a matter of individual beliefs and attitudes but as something woven deeply into the fabric of American life—more “feature,” you might say, than “bug.”
This orientation towards systems is part of what the “critical” in “critical race theory” is meant to signal. The theory is critical not only in the ordinary sense of passing judgment on the reality it studies, but also in several more important ways not obvious to anyone unfamiliar with the broad history of “critical theory” in modern social thought. Explanatory models, methodologies, and courses—like this one—that name themselves “critical” often do so with an eye towards that history. As a result, they’re likely to exhibit (at least) some combination of the following characteristics:
- An interest (as already noted) in how the subject under investigation—race, gender, sexuality, law, technology, etc.—functions internally as a system and in relation to other systems containing it, contained by it, or intersecting with it.
- Skepticism towards the possibility of conducting this investigation from some location outside the systems themselves, of adopting towards the subject a value-neutral “view from nowhere.”
- An intention that the investigation shall advance particular values: specifically, human freedom and autonomy.
This course on “Critical Digital Practices” is neither a theoretical nor an empirical investigation of technology. It doesn’t delve (much) into the history of digital computing, the economics of Big Tech, or the various ways we might understand a concept like “media.” Its main purpose is just to help you better understand how computers work and give you the skills you need to work better with them.
But just as there’s no “view from nowhere” available when we want to study some aspect of society, so it’s impossible to engage in a “practice from nowhere” when using human-made tools that have real effects on actual people, starting with you, the user of the tool. “Computing” is a deeply social activity; even when you’re alone with your screen, locked in private battle with data you’re trying to wrangle or the HTML code for your website, the “what,” “how,” and “why” of your activity all connect you to other people: to the designers of your computer’s hardware and operating system, to the developers who created the software you’re using, to the owners and admins of servers or platforms you may be relying on, to all those whose collective theoretical insights, technical innovations, and programming efforts, stretching back through generations, perhaps, have laid the conditions for you to accomplish your personal task.
Also, though: to the individuals or companies that may be collecting data about your activity on your computer itself or on the web, to the developers whose code may have intentionally locked you out from leveraging certain of your machine’s capabilities (such as extracting an e-book’s text content from the proprietary wrapper containing it), to the legal system that might make it a crime for you to leverage those capabilities even if you’re able to overcome the technical barriers, to the politicians and organizations who, through laws, regulations, or standards, may have either restricted or enabled your freedom to access knowledge, share your creativity, or associate with the communities you care about.
As some of these examples suggest, whether software code enhances or limits your freedom or autonomy—perhaps enhancing your autonomy by limiting others’ freedom to violate your privacy or discriminate against you because, say, of your race, gender, gender identity, or sexual identity—isn’t determined only by the legal regime governing uses of the code; it may be a function of what’s in the code itself. This is an important part of what Lawrence Lessig meant by his influential formulation “code is law”. It’s also one message of the recent film Coded Bias, which focuses on algorithmic racial discrimination. It’s an insight driving much of the most important discussion right now about artificial intelligence (AI), large language models, the tools built with them (such as ChatGPT), and the potential of these tools, on the one hand, to propel discovery and unlock human creativity and, on the other, to distort reality, disrupt democracy, and confine our thinking to boxes of an elite technocracy’s making. (Full disclosure: Nothing on this site was written by ChatGPT or any other generative AI tool. Another full disclosure: everything here is nevertheless, in some sense, a remix of what others have thought and written previously. Perhaps, after all, everything is a remix?)
This course seeks to earn the adjective “critical” in its title, then, in part by encouraging you to think constantly about the way your digital practices impinge on your and others’ freedom and autonomy. To that end, it leans towards free and open source software solutions, recommends practices and platforms that reduce your and others’ exposure to invasive data tracking, highlights the importance of accessibility, suggests readings that are similarly “critical” in perspective (even if not self-labeled as such), and encourages you to share the course content freely (with attribution) under a Creative Commons license.
It also seeks to earn the adjective by advancing, in its own small way, the cause of liberation. Its premise is that the more you understand about the digital tools you use, the more likely it is that the tools will serve you, rather than the other way around. As Douglas Rushkoff writes in Program or Be Programmed: Ten Commands for a Digital Age (OR Books, 2010):
> Digital technology is programmed. This makes it biased toward those with the capacity to write the code. In a digital age, we must learn how to make the software, or risk becoming the software. It is not too difficult or too late to learn the code behind the things we use—or at least to understand that there is code behind their interfaces. Otherwise, we are at the mercy of those who do the programming, the people paying them, or even the technology itself. (128)
This isn’t a programming course—although it will touch on a few programming basics. But as Rushkoff is suggesting here, simply to know that there is code behind an interface is to loosen somewhat its dominating grip. To understand, even in a very basic way, how that code operates is presumably to loosen it still more. Coding itself aside, this course assumes that even the simple step of interacting with your computer’s operating system through the command line rather than its graphical user interface (GUI) of windows and icons—one of the first steps we’ll take together—will give you a significantly greater sense of control as a digital citizen.
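To give a small, concrete taste of what command-line interaction looks like (Module 2 covers it properly), here is a sketch of a first session. It assumes a standard Unix-style shell such as Bash or Zsh on macOS or Linux; the directory and file names are arbitrary examples, not anything the course requires.

```shell
# Three everyday tasks done by typing rather than pointing and clicking:
mkdir -p practice        # create a directory ("-p" avoids an error if it already exists)
cd practice              # move into that directory
echo "hello" > notes.txt # create a small text file containing the word "hello"
ls                       # list the directory's contents
cat notes.txt            # print the file's contents to the screen
```

Each line is a command naming a program (`mkdir`, `cd`, `echo`, `ls`, `cat`) followed by its arguments; the shell runs them one at a time and shows you the result as plain text.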
Is this course political, then? Well, sure—to the extent that it’s political to advance values such as freedom, autonomy, and equity. Again, the impossibility of identifying significant digital practices that don’t work to advance some set of values is part of what’s asserted in the course title. But keep in mind that there’s no universally accepted definition of freedom, autonomy, equity, or any other value people care about deeply, leaving plenty of room for multiple views. And even where there may be broad agreement on definitions, that agreement doesn’t translate into a unified “left,” “right,” or “centrist” take on any specific law, policy, or regulation designed to protect the values in question. Consider how Section 230 of the Communications Decency Act of 1996 and the applicability of copyright law to the products of generative AI have recently divided people who are united on other political questions.
Rest assured that this course won’t require you to take a particular position on these or any other issues, even as the course takes the position that understanding what’s at stake in such issues will increase your freedom and autonomy.
- Module 1: Meet Your Computer
- Module 2: The Command Line
- Module 3: What is Text?
- Module 4: Internet and Web
- Module 5: Working with Data
- Module 6: Content Management: WordPress and Omeka
- Module 7: A Very Brief Introduction to the Python Programming Language
Spot an error or have an idea for making the course better? Don’t hesitate to open a pull request.
Critical Digital Practices was created by the Center for Digital Learning at SUNY Geneseo. Paul Schacht is the principal author. Some content has been remixed, with gratitude, from repositories openly and generously shared by the CUNY Graduate Center’s Digital Humanities Research Institute. Thanks to Amanda Schmidt for the many wonderful hours of conversation and brainstorming that are reflected in the underlying concept and overall design of the course.