Tag Archives: race

Postcolonial DH/DH Related Panels at #MLA14

Excited to be part of #MLA14, and by all the great topics I’m seeing. I started a collective Google Doc to publicize all MLA14 panels related to the digital humanities and to postcolonial, race, gender, ethnic, queer, and disability studies. Please add to the document here.

Talk on Trading Races at Duke 2013

Slides for Opening Remarks: Representing Race: Silence in the Digital Humanities MLA 13

Here are the slides for my opening remarks to our MLA 13 panel, “Representing Race: Silence in the Digital Humanities,” scheduled for Friday, 10:15 a.m., Gardner, Sheraton. #MLA13 #s239

MLA 2013: Navigating Archival Silence: Creating a Nineteenth Century Postcolonial Archive

Below are the slides for my ten-minute presentation, “Navigating Archival Silence: Creating a Nineteenth Century Postcolonial Archive,” presented at the 2013 Modern Language Association annual meeting. The presentation is part of the panel “Representing Race: Silence in the Digital Humanities.”

Race and the Digital Humanities: An Introduction (NITLE Seminar)

On November 16 I gave a webinar on Race and the Digital Humanities for NITLE. You can find my slides and links to our shared Google Doc and public Zotero library below.

Link to talk recording 

Link to #TransformDH Google Doc: Add yourself and your project/project idea here!

Join our public Zotero library on Race and the Digital Humanities here

Storify of live tweets of the event: click here

Race and the Digital Humanities: An Introduction (NITLE Workshop)

I’m looking forward to giving an OPEN webinar for NITLE (the National Institute for Technology in Liberal Education) on November 16 that will introduce issues regarding race, ethnicity, and the digital humanities. The hashtag for the talk is #racedh. The talk is open to anyone! Please do think about joining us and spreading the word!

Race and the Digital Humanities: An Introduction

Talk Description: What is the role of race in the digital humanities? While prominent scholars such as Alondra Nelson and Lisa Nakamura have problematized the role of race in technology since the late 1990s, the relevance of race studies has only recently begun to be broached within the digital humanities: for example, by Alan Liu, Tara McPherson, Amy Earhart, Natalia Cecire, and the #TransformDH collective. This seminar will give a brief survey of the emerging field of race and the digital humanities, introduce the audience to a variety of digital projects informed by race, and provide links to resources for people interested in working in this field. Topics covered will include the genealogy of these debates, the theoretical assumptions that inform them, and issues to consider while constructing a race and digital humanities project.

Image Credit: Shea Walsh

Developing a Common Language across Race Studies and the Digital Humanities

(This is my proposal for THATCamp Theory 2012.)

I would like to propose a collaborative workshop to develop a common language or vocabulary among scholars of race studies (critical race studies, postcolonial studies), computer scientists, and digital humanists. What are some common terms that we all use but think about in different ways? (Modularity comes up as one.) What assumptions do we share, or not share, about how cultural constructs are replicated in code, and what are the implications?

During the workshop, participants can draw up lists of common terms, explain how we each understand them, and suggest how we can use these terms to inform our digital humanities projects. How is the digital humanities changed or inflected by race studies? Issues of representation—the recovery of works by people of color—are important, but what else would be relevant here? What theories and methodologies can a race scholar use in projects such as topic modeling and other types of text mining, geospatial mapping, and gamification in the classroom? What are some examples of DH projects that can be nuanced with race theory, and how specifically can this be done?

Image Credit

More Hack, Less Yack?: Modularity, Theory and Habitus in the Digital Humanities

One of the most prevalent debates within the Digital Humanities (DH) is the idea that practitioners should just go about doing rather than talking, or should practice “more hack, less yack.” In other words, instead of pontificating and problematizing, DH scholars should be more concerned with making stuff, and making stuff happen. The “more hack, less yack” mantra has been around for a while now, and has brushed up against some challenges: notably in Natalia Cecire’s (@ncecire) argument for a THATCamp Theory to uncover the theoretical leanings of the digital humanities, in Alan Liu’s call to integrate cultural studies into DH approaches, and in the recent #TransformDH collective, set up by Anne Cong-Huyen (@anitaconchita), Moya Bailey (@moyazb) and M. Rivera Monclova (@phdeviate) to bring race/gender/class/disability criticism to the digital humanities.** In many of these debates, the “theoretification” of DH seems to be viewed with suspicion because it disturbs the implicit good nature of much of the DH community. Roger Whitson (@rogerwhitson), for example, mused on whether the digital humanities “really needs to be transformed,” arguing: “It seems to me that the word ‘guerilla’ reappropriates the collaborative good will of the digital humanities, making it safe for traditional academic consumption and inserting it into the scheme Stanley Fish and William Pannapacker (@pannapacker) highlight.”

I’ve been musing on the “more hack, less yack” issue recently, and it seems that Tara McPherson’s (@tmcphers) essay “U.S. Operating Systems at Mid-Century: The Intertwining of Race and UNIX,” in Lisa Nakamura (@lnakamur) and Peter Chow-White’s (@pachowwhite) recent collection Race After the Internet, may offer some important insights into this ideological impasse. In her essay, McPherson argues that in the mid-twentieth century, computerization gave rise to a common structural logic, one that privileged “modular thinking,” “common sense,” and disciplinary hyperspecialization. Because these processes work via the modular form—simple blocks into which a complex system is broken down and analyzed by separate groups—the rationale of the system appears “common-sensical,” thereby obscuring the actual political and social moment from which it emerges.

McPherson sees this modular logic at work both in the development of UNIX and in racial formations in the United States. She argues that it may be a hallmark of the Fordist moment of capitalist production, one that persists in the hyperspecialization of late capitalism and extends to the disciplinary specialization of higher education, such as Area Studies departments. This mode of modular thinking, she argues, is a type of “lenticular logic” that undergirds both the structure of UNIX and the covert racism of color blindness:

“A lenticular logic is a covert racial logic, a logic for the post-Civil Rights era. We might contrast the lenticular postcard to that wildly popular artifact of the industrial era, the stereoscope card. The stereoscope melds two different images into an imagined whole, privileging the whole; the lenticular image partitions and divides, privileging fragmentation. A lenticular logic is a logic of the fragment or the chunk, a way of seeing the world as discrete modules or nodes, a mode that suppresses relation and context. As such, the lenticular also manages and controls complexity.” (25)

Reading McPherson makes me think: to what degree does lenticular logic underlie the DH imperative for “more hack, less yack”? How much does digital humanities work, through the way it is processed and organized by computational models, actually follow the Fordist logic of modularity? In the same way that UNIX engineers appealed to “common sense and notions about simplicity to justify their innovations in code” (28), neglecting how that common sense was itself constituted by their historical specificity as a class of workers in the 1970s, how has this sentiment actually provided the language behind “more hack, less yack”?

In other words, common sense is never simply “common sense.” What counts as “common sense” comes out of a particular socio-historical moment, just as “hacking” derives from a very specific social context. And just as UNIX programmers relied, in McPherson’s argument, on a common-sense modular “lenticular logic” to avoid speaking about the socio-political origins and conditions that allowed their “common sense” to come into being, perhaps the same logic has underwritten our resistance to theory within the digital humanities. Where does our “common sense” in the digital humanities come from? How is it implicated in structures of privilege that remain invisible to us? Why are we so resistant to speaking about it, and how does the language of modularity aid us in this silence?

It appears to me that much of the “more hack, less yack” issue circles around the problem of modularity and common-sensical “form” that McPherson outlines in this essay. I see this in Bethany Nowviskie’s (@nowviskie) recent post, Don’t Circle the Wagons:

“Software development functions in relative silence within the larger conversation of the digital humanities, in a sub-culture suffused — in my experience — not with locker-room towel-snaps and construction-worker catcalls, but with lovely stuff that’s un-voiced: what Bill Turkel and Devon Elliott have called tacit understanding, and with journeyman learning experiences. And that’s no surprise. To my mind, coding itself has more in common with traditional arts-and-crafts practice than with academic discourse. Too often, the things developers know — know and value, because they enact them every day — go entirely unspoken. These include the philosophy and ethos of software craftsmanship and, by extension, the intellectual underpinnings of digital humanities practice. (If you use their tools, that goes for your DH practice, too.)”

Nowviskie’s elaboration of a “tacit understanding” that derives from “journeyman learning experiences” makes me wonder how much these learning experiences dovetail with McPherson’s notion of the modular, lenticular logic that structures UNIX and other mid-century structuralist Fordist systems. This “tacit understanding” creates a common-sense notion of simplicity, but one whose structure and “common-sensical” nature similarly allow for a significant amnesia toward its own socio-political origins and context. Here again, as with the UNIX engineers who appealed to “common sense and notions about simplicity to justify their innovations in code” (McPherson 28) while neglecting how that common sense was constituted by their historical specificity as a class of workers in the 1970s, we can ask how this mode of thought has provided the language for the “more hack, less yack” sentiment.

McPherson’s argument recalls Paul de Man’s Blindness and Insight, where de Man asserted that all critical readings are ultimately predicated upon a “negative movement that animates the critic’s thought, an unstated principle that leads his language away from its asserted stand… as if the very possibility of assertion had been put into question.” De Man argued that we needed to return to engaging with how a certain type of form made certain readings possible. At the same time, he asserted that blindness to that very form was critical to structuring our insights. While de Man’s metaphor is problematically ableist***, it still makes a critical point: we need to interrogate how the logic of form tends to erase the perspective of its own creation. Just as literary theory has given critics insights that hide their own foundations, the logics of computation have given us a certain type of structure, a type of tacit understanding, a sort of visible logic and knowing that simultaneously obscures its own foundational assumptions.

I do not mean to suggest that tacit understanding equates to a certain type of blindness. I do mean to recognize, however, that all forms of shared cultural understanding, whether they come under the umbrella terms “common sense,” “tradition,” or “ritual,” are founded upon an important obscuring of their own particular socio-political specificity, and that to ignore this specificity is troubling. As Pierre Bourdieu observed, all cultural practices exist as habitus, a set of learned dispositions, skills, and ways of acting that appear simply natural but are rooted in specific socio-cultural contexts. My call, then, is for us to interrogate the habitus that makes up the Digital Humanities community.

Let me be clear. I get annoyed by jargon and obfuscation as much as the next person, which is part of why I am so attracted to the digital humanities community. But I do think that we need to invest in the creation of a metalanguage that will allow us to see the ideological foundations that undergird our “common sense.” Sometimes that comes hand in hand with theory. And theory does not always need to be annoyingly grating, especially if it allows us to understand how our implicit systems invisibly privilege and disenfranchise certain groups of people. We need to question the forms through which we come to see things as “common sense,” and to see value in the converse proposition: “less hack, more yack.”

If computation is, as Cathy N. Davidson (@cathyndavidson) and Dan Rowinski have been arguing, the fourth “R” of 21st-century literacy, we very much need to treat it the way we already treat existing human languages: as a mode of knowledge that unwittingly creates cultural valences and systems of privilege and oppression. Frantz Fanon wrote in Black Skin, White Masks: “To speak a language is to take on a world, a civilization.” As Digital Humanists, we have the responsibility to interrogate and to understand what kind of world, and what kind of civilization, our computational languages and forms create for us. Critical Code Studies is an important step in this direction. But we should not stop there; we need to keep examining how computation and the digital humanities are underwritten by theoretical suppositions that remain invisible.

** Alexis Lothian’s article, “Marked Bodies, Transformative Scholarship and the Question of Theory in the Digital Humanities,” Journal of Digital Humanities 1:1, November 4, 2011, gives an excellent history of the #TransformDH group and the call for theory within the digital humanities. Thanks Alexis (@alothian) for pointing me to this!

***Thanks to Natalia Cecire (@ncecire) for reminding me of this.

Image Credit

Edited to Add: Some interesting responses to this post 

An MLA13 Proposal: Representing Race: Silence in the Digital Humanities

Update Dec 30: Unfortunately Moya Bailey will not be attending the MLA; however, she has posted a brief abstract of her remarks, “Digital Alchemy: The Transformative Magic of Women of Color Online,” here

Update Oct 18: This will be session #s239, Friday, 4 January, 10:15–11:30 a.m., Gardner, Sheraton. 

Update May 15: This roundtable has been accepted for presentation at the 2013 Modern Language Association Meeting in Boston.

Respondent: Alondra Nelson (Columbia U)

Organized by: Adeline Koh (Duke University & Richard Stockton College)

Papers:

This roundtable presents new work by younger scholars on the issues of race, ethnicity, and silence within the digital humanities. Despite being widely acknowledged as important structural norms, race and ethnicity continue to be neglected analytical concepts within this growing field. This silence takes various forms: in the calibration of digital humanities tools, projects, and datasets, which fail to provide mechanisms for examining race as an important category of analysis; in how race structures forms of online identity in computer-mediated communication; and in racialized silences within digital archives. In all of these forms, race and ethnicity persist as undertheorized, haunting signifiers within the digital humanities.

While established scholars in sociology and media and communication studies have published extensively on this subject, among them Alondra Nelson (Technicolor: Race, Technology and Everyday Life; Afrofuturism), Lisa Nakamura (Digitizing Race; Race After the Internet), Wendy Chun (Programmed Visions), and Tara McPherson (Race in Cyberspace), the question is only slowly starting to be voiced within the larger umbrella of literary scholarship, through the work of Alan Liu and Amy Earhart (Debates in the Digital Humanities). This is the right moment to bring the debate to the MLA, as the question is beginning to surface in both conference and print literary venues, such as the “Transformative Digital Humanities” collective (#transformdh on Twitter), and in fields that encourage a broadening of the definition of literature, such as Critical Code Studies, which examines computer code as a politically charged discursive practice.

This session is timely, as it directly addresses recent digital humanities debates such as the discussion of archival silences featured in Digital Humanities Now in March 2012. It will address questions such as: Why has the rapidly growing field of the digital humanities been largely silent on the issues of race and ethnicity? How does this silence reinforce unspoken assumptions and doxa within the field? How might a scholar nuance the representation of race in digital humanities projects? What is the role of the scholar of color within this new field? “Representing Race: Silence in the Digital Humanities” will address these questions by focusing on the theoretical implications of silence as an important structuring and limiting presence within the digital humanities.

To promote discussion, each presenter will be limited to a ten-minute electronic demonstration of their project. Professor Alondra Nelson will then pose questions to both the panelists and the audience on race and digital representation within the social sciences and the humanities.

We will begin with short papers from two members of the Transformative Digital Humanities (#TransformDH) collective that focus on how the digital humanities community has been reluctant to address the issue of race and representation. Anne Cong-Huyen’s “Thinking of Race (Gender, Class, Nation) in DH” discusses some of the hesitance and resistance to the #TransformDH group at both the ASA Annual Meeting in 2011 and the MLA12 meeting, and some of the problems that emerge through the omission of race in the academy. Moya Bailey explores another dimension of this in her paper, “All the Digital Humanists Are White, All the Nerds Are Men, but Some of Us Are Brave,” by analyzing new sets of theoretical questions that emerge from an examination of the politics of whiteness, masculinity, and able-bodiedness within the digital humanities. Bailey’s paper examines how issues of access inform project design, and how underrepresented groups are imagined as end users of digital humanities projects.

Hussein Keshani goes further in his paper, “Race and State Patronage of Digital Islamic Studies in the UK,” exploring the implications of the recent increase in state funding for digital infrastructure initiatives within UK Islamic Studies. Examining the long history of British imperialism and racialized representations of the Middle East and South and Central Asia, Keshani argues that UK state patronage of digital Islamic Studies represents not only a silencing of Islam but a new form of racial governance and control.

Adeline Koh explores how a combination of postcolonial theory and new digital interfaces can address these forms of archival control in “Navigating Archival Silence: Creating a Nineteenth Century Postcolonial Archive.” She begins by describing how many nineteenth-century archives have occluded race and empire in their navigational structures, and then discusses how her digital project Digitizing Chinese Englishmen attempts to create a “postcolonial” digital archive by establishing a self-reflexive structure built on crowdsourced annotations and other forms of publicly mediated interaction.

The formal part of the session will end with Maria Velazquez’s “Blog Like You Love: Anti-Racist Projects, Black Feminism and the Virtual,” which uses the ideas of embodiment and the ‘posthuman’ to trace a genealogical connection between black feminist creative projects and the digital humanities. Velazquez argues that the 1990s gave rise to a key moment in which black women’s creative practices and neoliberal understandings of community came together, but that these projects have been largely silenced.

All of these papers begin and end with a discussion of race, representation, and silence within digital humanities discourse, debates, and projects. Panelists will discuss how this silence contributes to the reproduction of norms of social inequity in digital space, and explore how theory can be incorporated into this discussion to further the debate. The exponential growth of the digital humanities, evident in the rapidly increasing number of digital humanities-centered panels at the MLA over the last three years, underscores the urgency of investigating the role of race in this field.

Panelist Biographies

Alondra Nelson (respondent). Alondra Nelson is associate professor of sociology and gender studies at Columbia University. Nelson is one of the foremost figures in the field of race and the digital humanities. Her publications include Afrofuturism, a special issue of Social Text (2002) that is now a classic text on the cultural effects of technology on the African diaspora; Technicolor: Race, Technology and Everyday Life (2001); and, most recently, Body and Soul: The Black Panther Party and the Fight against Medical Discrimination (2011), a seminal new study of race, health care, genetics, and technology. Nelson has also published an essay titled “Roots and Revelation: Genetic Ancestry Testing and the YouTube Generation” in the new collection Race After the Internet, edited by Lisa Nakamura and Peter Chow-White.

Anne Cong-Huyen is a doctoral candidate in English at UC Santa Barbara. She is currently finishing her dissertation on temporariness in the literature and media of the post-1980s global cities of Los Angeles, Dubai, and Ho Chi Minh City. Her work deals heavily with issues of temporary migration and labor, often unequally divided along lines of gender, ethnicity, and nationality. In addition, she has engaged with questions of race, nationality, and materiality as they arise from digital technologies and within the digital humanities, as part of the #transformDH collective, which seeks to insert critical race, gender, queer, disability, and other theories into DH scholarship. She first blogged about Asian American studies and DH in January 2011, and will be writing an expanded version of that initial blog entry for the forthcoming collection Humanities and the Digital, edited by David Theo Goldberg and Patrik Svensson. She has served as a Graduate Research Fellow of the American Cultures & Global Contexts Center, as a HASTAC Scholar (where she co-hosted the first HASTAC forum on Race Diaspora in the Digital), and as a Research Assistant for the Research-Oriented Social Environment (RoSE) of the Transliteracies Project, led by Alan Liu.

Moya Bailey is a scholar of critical race, feminist, and disability studies at Emory University. Her current work focuses on constructs of health and normativity within a US context. She is interested in how race, gender, and sexuality are represented in media and medicine. She is a blogger and digital alchemist for the Crunk Feminist Collective. In a co-authored piece for Ms. Magazine, Bailey proclaims “Black Feminism Lives (online)!” and chronicles the digital discourses of race, gender, and politics as articulated by young black women in cyberspace. An earlier version of this paper, “All the Digital Humanists Are White, All the Nerds Are Men, but Some of Us Are Brave,” is under review at the Journal of Digital Humanities.

Hussein Keshani is an assistant professor in Art History and Visual Culture with the Department of Critical Studies at the University of British Columbia, Okanagan campus. His research focuses on the visual cultures of the Islamic world, with particular emphasis on South Asia between the 12th and 15th centuries and between the 18th and 19th centuries. His current research interests include gender and Islamic visual cultures in North India and digital art history. He has recently published “Towards Digital Islamic Art History” in the Journal of Architectural History (2012) and “Reading Visually: Can Art Historical Reading Approaches Go Digital?” in Scholarly and Research Communication (2012), and is working on a manuscript on gender, art, and space in 18th- and 19th-century North India.

Adeline Koh is assistant professor of postcolonial literature at Richard Stockton College and a visiting faculty fellow at the Duke University Humanities Writ Large program in the 2012-2013 academic year. Her work focuses on the intersections of postcolonial studies, new media, and the digital humanities. She recently published a co-edited volume titled Rethinking Third Cinema (2009), and heads two digital humanities projects: The Stockton Postcolonial Studies Project and Digitizing Chinese Englishmen. She regularly contributes to the ProfHacker column at the Chronicle of Higher Education on the topic of digital publishing. She is currently working on two major projects: a monograph called Cosmopolitan Whiteness, which examines whiteness as a symbolic form of property in postcolonial literature, and Trading Races, an alternate reality/role-playing game designed to teach race consciousness in undergraduate courses, developed at the Duke Greater than Games laboratory.

Maria Velazquez is a doctoral student at the University of Maryland, College Park. Her research interests include constructions of race, class, gender, and sexuality in contemporary media, as well as community-building and technology. She has served on the board of Lifting Voices, a District of Columbia-based nonprofit that helped young people in DC discover the power of creative writing, and is on the editorial board of Femspec, an academic journal exploring feminist speculative fiction. She recently received the Winnemore Dissertation Fellowship from the Maryland Institute for Technology in the Humanities, and has also received a fellowship from the Consortium on Race, Gender, and Ethnicity’s Interdisciplinary Scholars Program. Her dissertation project examines the use of the body as a component in community-building online, paying particular attention to the Bellydancers of Color Association, the anti-racist blogosphere, and Red Light Center, an adults-only virtual world.

Image Credit

Race and Technologies of the Self: Reading Lisa Nakamura and Peter Chow-White’s Introduction to Race After the Internet

This post is part of the HASTAC crowd-sourced book review of Race After the Internet. In this project, organized by HASTAC, each reviewer was assigned a chapter of the book to comment on. This cross-posted entry reviews the introduction to Race After the Internet, written by editors Lisa Nakamura and Peter Chow-White.

A popular Internet meme goes: “On the Internet, no one knows you’re a dog.”  According to this line of logic, the Internet is a liberating space where external forms of identity such as gender, race, age—even species—can be exchanged, played with and performed. No one really knows whether a man, woman or transgendered person lies behind the provocatively dressed female avatar in Second Life, or the age and race of the person behind an onscreen username.

Race After the Internet disabuses us of this commonplace belief. The varied essays in the book demonstrate that, far from being a space where social categories such as race are “transcended,” the Internet has been instrumentalized to categorize, divide, and maintain social boundaries. In her essay “Race and/as Technology,” Wendy Chun argues that if race had really decreased in importance as a social category since the end of the Second World War, we would see a reduction of “racism and raced images.” Yet, as Chun points out, “we have witnessed their proliferation.” (5) Similarly, Alex Galloway argues in “Does the Whatever Speak?” that the rise of digital racial imagery in video games on the Internet should be read as a form of “racial coding,” and that “racial coding has not gone away within recent years, it has only migrated into the realm of the dress rehearsal, the realm of pure simulation, and as simulation it remains absolutely necessary.” (11)

This resurgence of race in the Internet age is masterfully introduced by the volume’s two editors, Lisa Nakamura (@lnakamur) and Peter Chow-White. They begin their forceful introduction with an examination of the careers of two of the most seminal theorists of race—Henry Louis Gates and Paul Gilroy—and argue that both have made a shift from the “deconstructive” to the “digital” in their work. While Gates’ most influential academic work stems from the 1980s and 1990s, such as the watershed volume Race, Writing and Difference and The Signifying Monkey, his newer work is considerably more popular and technological: he now blogs for pbs.org and produces PBS documentaries such as African American Lives (2006 and 2008) and the ongoing Faces of America, in which genetic testing is used to definitively show participants the “truth” of their genetic makeup. The editors see a similar shift in Paul Gilroy’s work from The Black Atlantic to Against Race, where Gilroy turns towards “genomic thinking” in his formulation of race theory (4). The editors indicate that this exemplifies the emergence of a “new form of racial technology” (3) where “digital technology is here pressed into service as an identity construction aid” (3).

Nakamura and Chow-White’s introduction really shines in its useful overview of the history of the field of race and digital media studies. They begin their genealogy of this field with the “first generation” of studies of the Internet, or the text-based Internet cultures of the pre-Web 2.0 period. They locate this in foundational collections such as Alondra Nelson (@alondra) and Thuy Linh Tu’s Technicolor: Race, Technology and Everyday Life (2001), Nelson’s special issue of Social Text, Afrofuturism (2002), and Kolko, Nakamura and Rodman’s Race in Cyberspace (2001) (6).

The next phase of Internet studies is located in the “transmedia shift” of the mid-2000s, when Internet use escalated and media formats and devices increasingly began to converge. The editors document this “transmedia” shift in Anna Everett’s Learning, Race and Ethnicity: Youth and Digital Media (2008), Rishab Aiyer Ghosh’s Code: Collaborative Ownership and the Digital Economy (2005), and Pramod Nayar’s New Media and Cybercultures (2010) (7). Finally, Nakamura and Chow-White identify key monographs that have been instrumental in establishing the field of race and the Internet, including Nakamura’s own Cybertypes: Race, Ethnicity and Identity on the Internet (2002), Anna Everett’s Digital Diaspora (2009), Wendy Chun’s Control and Freedom: Power and Paranoia in the Age of Fiber Optics, and Jessie Daniels’s Cyber Racism: White Supremacy Online and the New Attack on Civil Rights (2009).

While this is a very well-written introduction, the genealogy of race and new media studies it constructs is more North American than international. It would have been helpful to see the authors consider how forms of racialization on the global Internet disrupt or support similar forms in the United States. While Nakamura and Chow-White argue that works they cite, such as Nayar, Ghosh, and Kyra Landzelius’s Native on the Net, “internationalize Internet and race studies in much needed ways” (7), the genealogy could have included specific examples of how this internationalization has affected Internet cultures and race within the United States. Works that discuss race and the Internet in other countries would also have been instructive additions: for example, Christopher L. McGahan’s Racing Cyberculture: Minoritarian Art and Politics on the Internet (2007), which focuses on race in UK-based Internet cultures, or Mark McLelland’s work on ‘race’ on the Japanese Internet.

Ultimately, Nakamura and Chow-White’s introduction provides a critical new foray into thinking about race and technology. In many ways, their introduction recalls what Foucault termed the “technologies of the self,” or the practices by which individuals represent to themselves the ways in which they order, divide, and govern themselves. If, as the Internet meme goes, “no one on the Internet knows you’re a dog,” Nakamura and Chow-White’s introduction shows a different side to the meme: even if no one knows the species of the entity controlling an online avatar, the concept of species still has meaning and resonance on the Internet. In other words, the Internet is not a liberating space, but one which relies on the technologies of the self that come from our “real” social worlds. The Internet has not freed us from race; on the contrary, race has literally become a technology of the self on the Internet.

Image Credit