Engineering Queerness in the Game Development Pipeline
by Eric Freedman

Abstract
This essay provides an overview of the relevant discursive tensions that emerge as game developers negotiate the demands of game engine technologies, and encourages broader critical and creative consideration of the role of game engines in an industry otherwise understood mainly in terms of its published games. For developers, game engines present a necessary mechanical order, but they also facilitate rapid development and cross-platform deployment. This presents a trade-off between order and possibility that organizes the field of play and establishes the player’s relative freedom. This essay examines how the space for possibility, for radical queer sensibility, shrinks through the process of development, and considers the subtle compromises that are made in the selection and use of a game engine. Many indie developers, in the pursuit of efficiency, have no choice but to accept this limit, to tie their intellectual properties to the systematized writing associated with engines, and to see their works operate less queerly, less out of bounds. This analysis considers a range of different engines--the more dominant, costly, high-powered tools and proprietary engines built by AAA studios, and alternative, entry-level tools such as Twine and GameMaker. The latter have been celebrated for their accessibility, but, more significantly for queer game artists, they also seem to facilitate qualities such as empathy, community and communicative openness. This analysis considers the value of seeing engine choice in such binary terms, and the possibility for more generally queering code space by considering the game development pipeline as an open text. This essay proposes that a queer analysis of the labors and technologies undergirding the work in progress might strengthen more generalized discussions of the representational politics of video games, their audiences and their production communities.
Keywords: Code, Engine, Programming, Queer, Software, Technobiography, Video game
Content note: The content of this article includes several representational references in video games, and covers such subjects as sexism, racism, violence, bodily harm and death.
The video game industry is not simply a sum total of its software enterprises and its serialized intellectual properties: it is also an arena of hardware and software development and licensing. Brand names are not simply attached to game franchises, but also to engines that govern the physics-based properties of characters and, by extension, those players who read and engage them. With most engines, success is measured by the naturalness of the relations across engine, asset, controller, character and player--relations that are circumscribed by the historical, technical and material conventions of the game industry. While a game engine may be known in developer and player communities by its distinct mechanics, abilities and features, in most video games the dynamics of play and the constellation of transactional engagements with the game world work to conceal the engine’s core mathematical logic.
This study is designed to encourage broader intellectual and creative consideration of the role of game engines and queer game mechanics in an industry commonly read as a series of fixed assets (playable intellectual properties), and my attention to labor and design process is an attempt to call into question the stability of game engine architectures, and by extension digital platforms. Despite corporate pronouncements of orderly technical progress, most game makers experience development environments and platforms as fundamentally messy affairs--as points of friction that push against individual agency. Through a review of Capcom’s engine development pipeline (from the MT Framework to Panta Rhei to the RE Engine), Konami’s acquisition of parallel engine technologies (the Fox Engine) and those engines that are more commonly used by independent game developers (Unity, Unreal, Twine, GameMaker and RPG Maker), this essay proposes how we might engage with the study of a relatively immaterial or “difficult” object and assesses the value of such engagement for queer game studies. Furthering earlier and ongoing queer interventions with software and hardware, this essay opens up otherwise hermetic field-specific studies of computation.
Game engines are a critical component of the video game development pipeline, a broader industrial arrangement that may help us understand the production of meaning in games by introducing design process as a rich textual field in which to situate traditional textual analysis. Using a pipeline model, we can set the groundwork for intervening in the ideological imperatives of code. Emphasizing the importance of the pipeline as design process to game studies analysis can reveal the relevant discursive tensions that emerge as game developers negotiate the publication cycle, the hardware cycle, and the cultural demands of audiences. More pointedly, the introduction of the pipeline as design process can also reveal the critical value of procedural literacy for queer game designers, players and critics, in a field where queer analysis can inform industrial analysis. Illustrating how game engines interact with assets is easier than demonstrating the interaction between game engines and the broader field of play. The former is revealed in the editor--in the visual scripting tools used by game artists. The latter, however, requires a more nuanced understanding of the relationship between code, programmer, tool, artist, intellectual property and audience (even in the case of open source engines that alter this transactional pipeline, allowing the engine to operate more freely as a content creation platform) to signal what Henry Lowood (2014) refers to as “the messy interplay of intentions, users and the marketplace” (p.196).
This essay expands on the existing literature on game engine technologies to open up a space for independent game engine developers and game designers to consider their relationship to corporate engines (such as Unity and Unreal) that are used in indie game communities by small and large development teams, and to further the interconnections between game theory and game design. This essay proposes that a queer analysis of the work in progress and the technologies that undergird that work in progress might strengthen discussions of the representational politics of video games, their audiences and their production communities. My focus is on game development--a site of cultural production that developers must negotiate, and where subjectivity and language are fundamentally entangled (O’Donnell, 2014). My use of the term “queer” is not limited to considerations of a counter-hegemonic impulse; more significantly, as this essay considers game production at both the center and the periphery, I am privileging an understanding of queerness as one of semantic openness and expressive possibility that can challenge (or productively deconstruct) normative realism. This is not simply a call for more queer game designers or queer programmers, but a call for the continued investigation and expansion of code, and those obfuscations that hinder queering this terrain, some of which are standard industrial practices designed to prevent tampering, to protect intellectual property, and to increase security and stability by blocking modification. Game engines are computational structures, and their underlying algorithms, like all algorithms, have a social dimension.
Queerness Under the Hood: Code, Platform, Engine
Code operates along a continuum, a site of negotiations involving commodity production, organizational life, technoscientific knowledge and industrially-aligned geopolitical territorialization. Code carries power relations, distributes agency, shapes communication, contours personhood and organizes everyday life; yet it readily disappears behind functional surfaces and their more immediate gratifications. Wendy Chun (2004) notes that computers have created a paradoxical commitment to visuality and transparency. We evidence the former quality in the sheer volume of digital images, assets and code objects that are shared across networks. The latter quality, transparency, is the net effect of the combined work of database systems, artificial intelligence, and interface design to produce unobstructed transactional engagements, effortlessly linking inputs and outputs without calling attention to the language of software. The task at hand is to situate a certain form of textual analysis--one that traverses both visual and nonvisual artifacts--within this uneven terrain.
My work here differs from existing explorations in the field of critical code studies that attempt to queer the foundational logic of programming (see, for example, Zach Blas, 2008) because this essay focuses on the adoption of pre-existing tools (game engines) rather than artisanal (code) writing practices; as an engine is built and versioned, the otherwise latent potential of code, found in its modularity, is readily sealed over. Counter to Mark Marino’s (2006) foundational claims for critical code studies, I do not believe that we can seamlessly situate code in relationship to literature, but my argument does not contradict this earlier work that suggests that the language of the programming community and the various ontological properties of code should be used as a basis for its interpretation. Programming languages are situated practices, and their authors are situated practitioners--parts of larger software industries. The trouble with engines is that as secondary processes, as concretizations of code rather than programming environments, they promote a language gap. For most media scholars, engines have led to a certain illiteracy. Even when they are open source, game engines require a more precise understanding of specific programming languages. They take time to decode, to lay bare the particular lines of code that, taken together, produce a specific aesthetic, mechanic or functional interaction.
The field of video game studies has expanded to deeper studies of the coded mechanics of play; but even as it has expanded to consider platform and software (as distinct points of production and analysis), it has not fully attended to the discursive tensions of game development. Software studies consistently links the technical considerations of software to economic, political, cultural, and ideological concerns, and connects software as object, material and medium to software practice. Platform studies shifts our attention to the materiality of media technologies, though it works with many of the same theories and methodologies as software studies and considers the underlying “possibilities” of code. In their inaugural contribution to the MIT Press “Platform Studies” book series, Nick Montfort and Ian Bogost (2009) read across hardware and software, situating the Atari Video Computer System in its cultural contexts. The texts in the series follow this focus on hardware systems in order to reveal the technical, cultural and artistic affordances of each successive software environment. Yet, as Aubrey Anable (2018) notes, by creating a concise history of technology, these successive studies offer little insight into the complex relationships between bodies and technologies; perhaps as an unseen consequence of technical rigor, the formalism of platform studies almost always erases matters of difference. Anable suggests, “Imagining media platforms as sealed ‘black boxes’ reinforces a similar sealing off of subjectivity, agency, race, and sexuality” (p.137). Similarly, in her critical analysis of the history of UNIX, Tara McPherson (2012) draws attention to the difficulty of reconciling (and synthesizing) the history of culture, gender and race, and the history of programming and computing--a rather unique yet consequential inflection of the digital divide. 
McPherson urges media scholars to excavate screen space, to consider the relative complicity of code; to consider code in relation to culture is to avoid performing an overly reductive, formalist read of operating systems, and in the case of this analysis, game engines. The difficulty is tracing the specific influences of code (and unraveling the complex array of distinct algorithms) at the level of visual representation, play and cultural performance, and the specific influences of the culture industry on code (though the broad strokes of engine development, adoption and dominance are fairly obvious).
The birth of engines in the early nineties fostered a split between core functionalities (components such as the rendering engine, the physics engine, scripting, and artificial intelligence) and game-specific content, building on the already extant separation of source code from assets and resources. This split also fostered a division of labor between programmers (creating systems) and artists (filling in the parameters), with the most segmented organizational subdivisions in AAA development. This industrial division also shaped the field of game studies, placing more focused attention on visual analysis, ignoring certain material relations to study narrative, genre, seriality and other literary and cinematic conceits.
Engine development by its very nature suggests a well-defined separation between core software components (for example, rendering, collision detection and audio systems) and the art assets and rules of play, although the line between the game and the engine may shift within any one development studio. The engine is a data-driven architecture, while a game contains hard-coded logic or special-case code to render specific types of assets. The game engine can be best understood as a foundation, though the engine itself can be subdivided between runtime components (the subsystems required when a game is executed) and a tool suite (for digital content creation) that suggest even more deeply bifurcated fields of production and analysis. By knowing the engine (and the unceremonious and decidedly non-visual machinations and industrial labors of software), we may untangle the multiple paradigms of game studies and find the roots of those affordances that are commonly (mis)understood as characteristics of play--when in fact everything matters in the game development pipeline. The proprietary game engine is elusive; beyond the editor environment it is pure code (although some engine editors feature integrated visual scripting tools to facilitate collaboration between artists, designers and programmers). The core technologies of the engine drive embodiment: they are the locus where static and dynamic elements are defined and world chunks are delimited. By knowing the codebase of the engine, we come closer to understanding the relative mutability and stasis associated with software.
A gameplay system--a system of rule-based interactions, entities, objectives and signs of progression--is both more and less than the sum of an engine, hardware, drivers, a machine-specific operating system and peripheral device controls. While a gameplay system is a function of the sum total of these components, it is experienced as a series of states, managed at the level of code, inherited from an abstract base class and handled by a game engine. States are data-driven operations that drive the animation pipeline and allow rapid iteration (Gregory, 2009), call out entity properties and functionalities (transform actions), and more generally situate gameplay, binding together characters and events. The more one examines these data-driven imperatives, the more one notices the common grammatical constructs that govern the game development landscape, a common syntax of finite options that serve to organize labor and provide the necessary communication protocols across any given gameplay system. The structured determinacy of objects is matched by the structured determinacy of their transformation, of possibility, of interaction and outcomes. This is the structuring logic of the game engine, the foundation for a series of intellectual properties that can be redeployed without major modification. The normativizing imperatives of the game development pipeline foster a certain interpretive inertia, an absence of true agency, that operates across every game state and is written at the level of code. The ideological implications of these standards are most evident at the level of representation; for example, in game animations. 
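The state logic described above can be made concrete with a minimal sketch. This is a hypothetical illustration (the class and entity names are invented, not drawn from any shipping engine): concrete states inherit from an abstract base class, and the engine loop drives transitions from entity data rather than hard-coded branching.

```python
from abc import ABC, abstractmethod

class GameState(ABC):
    """Abstract base class; concrete states inherit its interface."""
    @abstractmethod
    def update(self, entity):
        ...

class IdleState(GameState):
    def update(self, entity):
        # Transitions are data-driven: the entity's properties,
        # not special-case code, determine the next state.
        if entity["threat_nearby"]:
            return FleeState()
        return self

class FleeState(GameState):
    def update(self, entity):
        entity["position"] += entity["speed"]
        if not entity["threat_nearby"]:
            return IdleState()
        return self

# A minimal engine loop: the engine owns the current state,
# and each tick the state hands back its successor.
entity = {"position": 0, "speed": 2, "threat_nearby": True}
state = IdleState()
for _ in range(3):
    state = state.update(entity)
```

The point of the sketch is the common grammar it exposes: every entity, however it is skinned, is routed through the same finite vocabulary of states and transitions.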
In their critique of the (artificial) limits of technological necessity, Alison Reed and Amanda Phillips (2013) turn their attention to game studio BioWare’s “tight economy of animations” (p.135) to suggest that the pursuit of computational efficiency is highly racialized; the procedural and economic imperatives of syntactic exactness, of defining and recycling common states, is not simply a matter of software engineering but of radical containment, of conforming otherwise diverse digital bodies to a common vernacular.
While it may be an uneven comparison to consider proprietary AAA engines in relation to the technologies used by indie game developers, in many cases large team and small team developers work with the same toolsets, and even large game studios should be understood as communities of practice (collaborative enterprises that include teams of producers, artists, engineers and designers) that must negotiate the versioning of their respective engines. BioWare’s Mass Effect was produced with the Unreal Engine, a tool used by a broad range of studios, taught in many higher education game programs, and utilized by many independent developers versed in its programming language. This circuit of producers and users of a common software package “constitute a network of parties that share a common interest in its destiny” (Damsgaard & Karlsbjerg, 2010, p.66). They are mutually committed to the program’s refinement and success, and work to co-create its norms. The Unreal Engine Marketplace formalizes (and commodifies) this act of co-creation, as it facilitates the sharing of game-ready content and code, while official Unreal forums provide a space for developers to engage in collaborative problem solving.
Despite this apparent unidirectional push to establish best practices, software represents merely one set of fixed relations for what is otherwise a context-free set of abstractions. Software allows us to consider code as a material practice, as one possible design of organized computation (Mackenzie, 2006). In this analysis, our attention is drawn more precisely to object-oriented programming; governed by specific organizing principles, basic object classes can be used to generate other classes. Through encapsulation or information hiding, code can conceal the internal representation, or state, of an object from the outside. Information hiding is one limit in a game system, aimed at securing the integrity of an object (preventing invalid states) as well as reducing system inter-dependency and complexity. At this limit, there are no open object classes nor are there non-conforming states. Jacob Gaboury (2015) notes that theories of the digital image, of three-dimensional simulation, are historically grounded in a particular understanding of the construction and interaction of objects. This particular ontology requires that as we approach computer-generated graphical images (and digital visual representations more broadly), we “must account for their status materially as both image and object” (p.44). On the subject of materiality, we must consider that computer generated images are linked to particular, evolving program structures, each with its own agenda. To locate meaning and pinpoint ideology requires reading across images and systems of representation. In the case of game engines, developers are seeking distinct solutions to parallel problems (of image production, transformation and interaction), some of which are driven by common languages, though each may have a unique syntax and algorithm. 
While the final image produced by the broad array of available and emerging engines may fit within the same visual regime, “each is materially distinct from the other, the product of a unique set of interests and concerns” (p.56) governed by publishers and developers.
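Encapsulation, as described above, can be sketched minimally. The following is a hypothetical illustration (the `Health` class and its methods are invented for this example): the object's internal representation is hidden, and mutation passes through methods that refuse non-conforming states.

```python
class Health:
    """Information hiding: internal state stays private, and every
    mutation is policed so the object can never enter an invalid state."""
    def __init__(self, maximum):
        self._max = maximum
        self._current = maximum   # hidden internal representation

    def damage(self, amount):
        if amount < 0:
            raise ValueError("damage cannot be negative")
        # Clamp at zero rather than permit a non-conforming state.
        self._current = max(0, self._current - amount)

    @property
    def is_dead(self):
        return self._current == 0

h = Health(100)
h.damage(30)
h.damage(90)   # clamped to zero, never negative
```

This is the limit named above: there are no open object classes here, and no route to a state the class has not already sanctioned.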
The developer’s selection of a game engine serves as the most significant transactional limit. When engines appear in public demos, they stand as conflicted texts, as self-contained and readily analyzed objects. They also serve as a symbol of the development pipeline for unfinished games that lack specific release dates, narratives and forms of interaction, and in many cases engines serve as a forecast for the build of as-yet-undetermined intellectual properties. As such, they stand in a unique critical position, speaking to the distinct binaries of developer and audience, hardware and software, engine and interface--as uniquely mutating, queerly open objects. Indeed, a more nuanced approach to game studies should consider both the relative fluidity of the work in progress as an unrealized body as well as the formative conditions of core game technologies. As distinct points of analysis, consider the following examples from three distinct large-scale developers: Capcom, Techland and Kojima Productions.
Capcom: A Case Study in Proprietary Game Engine Development
Capcom’s proprietary game engines, like those developed by other corporations, provide a common work environment for the company’s global network of developers, programmers, directors and sound designers while simplifying the development process across multiple platforms. The engine provides the syntax to allow the insertion of an infinite variety of assets that may perform the same way. The engine controls the soft and hard physics, determines the relative utility of exploration, and emits the signals of agency. The engine determines the relative impact (and realness) of our (avatar’s) presence in the world we inhabit. The engine sets transactional limits on storytelling, fixing and controlling assets. Software mechanics delimit the field of play.
Capcom’s MT Framework (multi-thread, meta tools and multi-target) is a proprietary multiplatform engine (graphics engine, programming language) that allows the company to distribute its content across the major next-generation consoles, influencing production and reception across the board. The first version of the MT Framework (1.0) was used for the Dead Rising zombie game, released in 2006, with new versions of the engine released from 2007 through 2010. Engine building marks a shift in approach for Capcom, displacing the relative independence of the development team with the formularized game studio unit.
Changes to engine source code yield changes in design architectures (for example, in the materials editor), fostering the need for ongoing conversations between designers, artists and programmers. The goal for Capcom (and other studios) is to continue to realize greater efficiencies in the game development pipeline (marrying code and design) and to mitigate the dependency on the middleware of other software developers (such as Havok, known for its Physics engine). Engine development is part of a comprehensive strategy for economic growth that includes maximizing the utilization of proprietary content within new operational models.
Resident Evil: Revelations 2 (2015) runs on the last version of Capcom’s MT Framework (2.0). The game returns to the roots of the survival horror genre and has significant stealth elements, although it also implements cooperative gameplay across two characters. While one character is equipped with traditional firearms and melee weapons, the other is more vulnerable, limited to using environmental weapons such as crowbars and bricks. These secondary characters can also find hidden items: one is equipped with a flashlight, while the other has special perceptive abilities. Enhanced vision binds these two female protagonists, whose narrative and mechanical functions are primarily altered forms of looking. There are two such pairings in the game (Barry Burton and Natalia Korda; Claire Redfield and Moira Burton), introduced across its chapter-like installments. In both cases, this basic gameplay mechanic produces a purposeful tension between narrative and environmental agency, and while gender politics are writ large across the character pairings (the young girl Natalia and Barry’s daughter Moira embody two fundamentally passive game mechanics--stealth and distraction), any concomitant shifting across character genders does not inherently produce a queer subject. From the perspective of play (of interacting rather than looking), the game holds steadfastly onto its third-person perspective and its acute attention to the physical environment: for the player, body swapping foregrounds the fluidity of dynamic controller mechanics and the generic rewards of mastery.
This is an object lesson in procedural literacy and the liminality of code, of operationalizing the queerness of code by purposefully remapping it. Code is fluid; it lacks any inherent mechanical or anatomical fixity, although these possibilities become increasingly constricted throughout an ever-narrowing production pipeline. Code is perverse in that it submits all assets to the same inscrutable logic; and conversely, code is normative in that it limits what the body can do as part of a fixed circuit of material relations that is the governing logic of successful game production and gameplay. There is an unseen gap between the algorithm of the game and its appearances. The game mechanic must make sense, and it must work without failure. The control schemes linked to the controller and the control schemes linked to the software and hardware-based manipulation of the character’s body are tethered by the engine, a hub that also determines the potential kinesthetic energies ascribed to the player’s body in the production pipeline and released in post-production praxis. Algorithmic manipulation composes and moves the body, and structures its interactions with environments and other objects. Performance and appearance may seem distinct, but in the digital domain they are subjected to the same controls.
Resident Evil: Revelations 2 privileges physical function(ality) over form but does not altogether erase the symbolic function of narrative. The seriality of the franchise accustoms players to engage in cross-gender identification in order to play through in a completist fashion. Many of the RE games challenge players to play through as both male and female characters, either in a story mode that repeats the same narrative premise but equips the player with different skills or weapons, or in a challenge mode that foregrounds environment over narrative. As part of an industrial imperative to create replay value that is often aligned with both game-specific and platform-specific rewards frameworks, the RE games that allow multiple character choices situate those choices as a form of functional mastery commonly structured as sequentially unlocked gameplay. In RE 5, players who complete the game as the lead character Chris Redfield can elect to play through again as Sheva Alomar, who is otherwise controlled by AI in single-player mode. Shifting gender, race and ethnicity from Chris to Sheva (who hails from an unnamed West African country), the latter is sequenced as a secondary subject position, once again locating character choice as a sign of functional mastery. A completist approach to gameplay can also unlock Sheva’s alternate “tribal” outfit (a leopard print bikini, white body paint and heeled sandals)--a costume that marks and exaggerates her status as an exotic other.
Setting aside for the moment a game’s visual and aural matter, the in-game mechanic is perhaps the most readily accessible sign of an underlying engine mechanic. However, the game mechanic is one step removed from the engine’s underlying processes, providing more immediate insight into game design than game programming. Artificial intelligence is another accessible sign of an underlying engine mechanic. For example, evidenced-based decisions that impact game outcomes are made possible through an engine’s underlying artificial intelligence frameworks. While one can situate decision-making as a form of ethical ideological engagement, to some extent most gameplay can be understood as a cycle of interaction that leads from information gathering and analysis to decision-making and interaction. While game AI is associated with mechanics (attached to characters, it is fundamental to producing embodiment), it is a property of the engine. As an algorithm developed by the programmer that can be associated with data management, it nonetheless creates clear signposts of intentionality. Indeed, if we consider AI as a component of a character or a game object more generally, we should do so in relation to the physics, render and transform (position, rotation and scale) components of the same object. To develop a fuller understanding of a game object, we need to take note of how these components are governed by software systems that may couple or uncouple data and behavior. These relations are specified by the engine.
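The component relations described above can be sketched in miniature. This is a hypothetical illustration (the component and system names are invented, not taken from any particular engine): a game object is little more than a bag of components, and a system couples data with behavior by operating only on objects that carry the required components.

```python
from dataclasses import dataclass, field

@dataclass
class Transform:
    # The transform component: position, rotation and scale.
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0)
    scale: tuple = (1.0, 1.0, 1.0)

@dataclass
class AIBrain:
    controlled_by_player: bool = False

@dataclass
class GameObject:
    components: dict = field(default_factory=dict)

def ai_system(objects):
    """A system couples data and behavior: only objects carrying both
    a transform and an AI component are processed here."""
    moved = []
    for obj in objects:
        t = obj.components.get("transform")
        brain = obj.components.get("ai")
        if t and brain and not brain.controlled_by_player:
            x, y, z = t.position
            t.position = (x + 1.0, y, z)   # trivial stand-in behavior
            moved.append(obj)
    return moved

companion = GameObject({"transform": Transform(), "ai": AIBrain()})
player = GameObject({"transform": Transform(),
                     "ai": AIBrain(controlled_by_player=True)})
moved = ai_system([companion, player])
```

The sketch makes visible what the essay argues: whether AI occupies a body is not a property of the body itself but of how the engine's systems couple or uncouple its data.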
We might focus, as Eva Hudlicka (2009) suggests, on the relative generation of affective behavior in game engine design, expanding game AI to include “symbolic representational methods and classification algorithms for emotion recognition” (p.300) that could in fact redefine the parameters of the game object, converging the design of game engine systems with the data-driven processes of virtual assistants to understand the known states of the user through context-dependent game feedback. In the case of Resident Evil: Revelations 2 and many of the titles in the RE franchise, in single-player mode (distinct from cooperative gameplay) character-linked AI is a narrative imperative; each time the player switches bodies, for example, from Barry to Natalia, AI follows, occupying the body that is freed from device control. Yet the player must manage both bodies: in a form of psychological projection, the player must attend to the well-being of each character pair, even when one half of the unit lingers just off screen. At the same time, AI also controls the game’s enemies. While a form of emotional projection binds together the game’s central protagonists, despite the mediating influence of machinic embodiment (the game’s protagonists are all delimited by a codebase, and all are subject to varying degrees of AI control or determinacy throughout the game narrative), the same type of projection does not extend to those bodies that are defined by their appearances and actions as enemies. The contextual operations of AI across these distinct bodies are driven by a number of underlying conditions, embedded in the codebase, that, to varying degrees, prompt player identification, call forth distinct actions and, by nature of their animations and vectors, allow for a distinct set of transformations.
Consistently throughout the RE franchise, we are asked to consider the health of our bodies, and to know just enough about each game’s underlying algorithms (how far we can stress the body, the environment, the interaction) to avoid triggering a death animation. Each character model is rendered with a set of materials, each of which contains a shader and the values for that object’s properties. Gameplay code, in turn, feeds several values into the object shader, which are then processed to create an additive offset to the object’s vertex coordinates. In simpler terms, gameplay code supplies an object’s shader with values that signal the changing state of the object; in the Resident Evil universe, as in most games, the physical imprint of code is not simply found in character builds, but also in the impact of the game environment on the character model. The changes to an object’s vertices reveal that body’s state of duress--the location of a weapon’s impact, the vector of the impact, and the radius of the impact. In the case of Resident Evil: Revelations 2, health status is not registered on the protagonist’s body but translated into a disembodied visual signature; as players take more damage, the screen itself begins to redden. Countering the seemingly singular pursuit of graphical realism fostered by repeated versioning of the MT Framework, this title translates the physical register into a game-specific code, as is common with heads-up displays and the broad array of health meters associated with survival horror games. These signs of damage can also yield to a finite set of more traditional death animations--of pre-rendered vector displacements. Once the playable body meets its damage threshold, the controller is temporarily disabled.
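The flow described above--gameplay code handing an impact's location, vector and radius to a shader stage that converts them into additive vertex offsets--can be sketched as follows. This is a schematic illustration in Python (real shader code runs on the GPU in a shading language, and every name here is invented), meant only to show the shape of the data handoff.

```python
import math

def damage_offsets(vertices, hit_point, hit_vector, radius):
    """Gameplay code supplies the hit location, direction and radius;
    this stand-in 'shader' stage turns them into an additive offset
    for each vertex, registering the body's state of duress."""
    offsets = []
    for v in vertices:
        dist = math.dist(v, hit_point)
        if dist < radius:
            falloff = 1.0 - dist / radius   # stronger near the impact
            offsets.append(tuple(c * falloff for c in hit_vector))
        else:
            offsets.append((0.0, 0.0, 0.0))  # untouched by this hit
    return offsets

# Two vertices: one at the point of impact, one outside the radius.
verts = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
offs = damage_offsets(verts, hit_point=(0.0, 0.0, 0.0),
                      hit_vector=(0.0, -0.5, 0.0), radius=2.0)
```

Even in this toy form, the body's legibility as data is plain: location, vector and radius of impact fully determine how the model deforms.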
Algorithmic knowledge (a facet of procedural literacy) is a fundamental skillset across the RE series; as successive titles have stretched the fundamental mechanics of survival horror to emphasize both melee combat and conventional weaponry, players have been tasked with making critical choices across firearm classifications to consider firepower, reload speed, capacity and “critical hits.” These nuances impact game progression and shift our attention from bodies to weapons as the origin points of ritualized pleasure, making enemy combat more or less challenging, and requiring constant consideration of resource allocation. In every case, the enemy body becomes knowable as a data set: vulnerability is quantified. Perhaps to remind us of the importance of the physical realm, to guide us back to the surface of things, vulnerability is often called out by a visual sign--an obvious weak point--on an enemy body. Visual and narrative pleasure in gameplay can be found in consistently executed rule-based cause and effect--a form of realism aligned with the predictable functioning of algorithms.
Beyond the seamless operation of AI across multiple bodies, the push in engine design is driven by the pursuit of the real in visual representation, while the push in narrative design has led to the proliferation of reductive branching narratives, where choice has an impact on character relationships and story. Certain gameplay choices in Resident Evil: Revelations 2 affect the campaigns of other characters, and as with most of the titles in the franchise, different choices produce different endings. Yet these decision trees are limited displays of affect and agency, all of which are structured by underlying conditional statements. As such, they do not signal a unique turn in engine development. In any game property, narrative design, interaction design and object design are linked to a common data set, and driven by a shared engine.
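The underlying conditional statements that structure such decision trees can be pictured in miniature. This Python sketch is hypothetical--the choice and ending names are invented, not taken from Resident Evil: Revelations 2--but it shows how a finite set of recorded choices resolves deterministically into one of a finite set of endings:

```python
# Invented example of a branching ending resolved by conditional statements;
# the keys and ending names are illustrative, not drawn from any actual title.

def resolve_ending(choices):
    """Map a dict of recorded player choices to one of a closed set of endings."""
    if choices.get("spared_ally") and choices.get("found_antidote"):
        return "good_ending"
    elif choices.get("spared_ally"):
        return "neutral_ending"
    else:
        return "bad_ending"
```

However expansive the branching appears in play, agency here is exhausted by the enumerated conditions; no combination of inputs can produce an ending the conditionals do not already name.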
With the development of Resident Evil 7 (released January 2017), and after abandoning its Panta Rhei engine, Capcom began building the successor to the MT Framework: the RE Engine. The RE Engine premiered at E3 2015 in Kitchen, a tech demo for Sony’s virtual reality headset, Project Morpheus. RE 7 was already well under development at the time of the Kitchen demo; director Koshi Nakanishi has indicated that Kitchen was developed in parallel with RE 7 as a proof of concept for the RE Engine and for VR specifically. In the company’s 2016 year-end report, Tomofumi Ishida, Capcom’s Lead Programmer of the Technology Section, suggests that the development of the RE Engine was fueled by the company’s investment in speeding up the development process while realizing new VR functionalities and building photo-realistic assets to match the specifications of next-generation game consoles. For Capcom, photo-realistic effect is aligned with a broadened approach to localization, extending the act of translation beyond language to include non-textual elements. As RE 7 is set in the United States, Capcom’s promotional literature suggests a sensitivity to “design intentions”--a responsiveness to the culture and climate of location. Significantly, Ishida (2016) acknowledges the common negotiations at work between artists and programmers: “Creating an interesting game should not be hindered by development engine constraints. If an artist expresses a desire to do something, the engine must evolve to make it happen. For this reason, all of us on the engine development team work in constant close contact with the game development team to promote improvements” (p.5).
While Capcom developed a new engine over the course of the RE franchise, the larger plotline--the focus of which is a secret organization experimenting with biological weapons on an unsuspecting populace--remains fairly constant. By foregrounding these narrative tropes, which form the primary pleasure (the central conflict) of gameplay, Capcom naturalizes the operational imperatives of its engine. Though the mechanics and the environments evolve with the potential of each engine, a common dramatic impulse binds the franchise together.
The negotiations at those game studios that are invested in dual proprietary properties (game titles and engines) are about the different workflows of artists and programmers; the imprint of the engine on play is consistently found in the interstices between teams and technologies (between tool suites and runtime components, between assets and processes, and between storytelling and level design), as design tools such as shaders can literally be written into source code. Each iteration of the engine marks a point in an ongoing negotiation between programmers and artists over their relative freedoms, while the build and stabilization of a common engine binds together all of a company’s developers. The consequences of the evolving development environment, in addition to introducing mechanical difference, extend beyond the hidden labors of the development team. The MT Framework, for example, has custom toolsets--resources that have been used across those titles built with the engine. While the engine was primarily used by Capcom from 2006 to 2015, it was also the foundation for more recent remasters of earlier Capcom titles, including Resident Evil Zero (2016) and Resident Evil: Revelations (2017), and for several new titles including Monster Hunter: World (2018). Capcom has highlighted several technical improvements with each successive version of the MT Framework; MT Framework 2.0, developed in 2008, introduced more dynamic environmental effects, and other enhanced graphics properties including deferred lighting, optimizing the handling of in-game data in the service of immersion. Reflecting on enhancements to the engine in version 2.0, programmer Taro Yahagi (2012) notes: “This time we’ve focused even more effort on shaders and implemented even more functions. It’s now possible for us to display various kinds of interactivity in our graphics--characters can be shown totally drenched in water or with their clothes scorched.” Capcom’s engine development suggests an ongoing investment in the visual spectacle of bodies and environments (algorithmic exactness), though not in expanded forms of identification and embodiment (an expressive limit that would require algorithmic uncertainty).
Game Engine Reveals: Software as Visual Spectacle
When developers reveal engines in public previews, such as Techland’s Chrome Engine 4 demo (2009), they present isolated on-rail displays designed to illustrate rendering power (through visual spectacle), rather than the coded depths of each software layer or the mechanics of storytelling. Most render demos posit a relation to a particular game title, even as they have an uncertain status as objects belonging to, but not of a game: for example, the Chrome Engine 4 demo calls out its use in Call of Juarez: Bound in Blood, but the pre-rendered clip displays only living environmental elements.
Engine reveals are notably different from their associated game reveals. Techland’s February 2011 announcement trailer for Dead Island is a promotional film created by the Glasgow-based animation studio Axis Productions. The trailer highlights the demise of a typical nuclear family vacationing on the island and, in dramatic reverse time, slowly reveals their undoing by a zombie contagion. The trailer illustrates neither engine nor assets, and it contains neither cinematics nor gameplay. It is a distinct short-form animation project. As such, it is a phantom limb of critical inquiry, detached from the body of the game and disassociated from the production pipeline. Unlike the game, it stands as a closed text to be read, laden with cultural value simply in terms of its representational politics (of viral heteronormativity disrupted from without and within--as the young daughter literally tears her father apart) and not in terms of more complex forms of embodiment. It thus lies outside the primary vectors of critical game studies and does nothing to assist any procedural analysis of either the game mechanics or the game itself. The narrative direction of the trailer runs counter to the temporal unfolding of the open world game, even though the game’s setting, a tourist resort on a tropical island, evokes a similar tension between the modern and the primitive. The trailer’s cinematic language is far more seductive than the game’s limited first-person perspective.
Dead Island was developed with the fifth iteration of Techland’s proprietary Chrome Engine, which was first used in 2003. The most forceful critiques of the Chrome Engine codebase are narrowly defined forensic investigations that align most readily with singular game objects. Nonetheless, these are significant interventions that demonstrate the importance of reading script against narrative agency, and of positioning code in relation to game world tropes. Such findings are close approximations of the overall power of script. In the case of Dead Island, the rogue “FeministWhorePurna” placeholder text for a subsequently renamed “Gender Wars” skill points to an obviously racialized gender bias in the game narrative and in the game mechanic (Yang, 2017), but also points to a bias in the game development pipeline (where files are named by programmers).
A similarly conflicted object, PT, a playable teaser and promotional vehicle for an unreleased game property developed by Kojima Productions and published by Konami, showcases the mechanics of an engine but has an uncertain relationship to its parent property. Released for the PS4 in August 2014 as a free download on the PlayStation Network, PT served primarily as an interactive teaser for the game Silent Hills, a cancelled installment in the Silent Hill franchise to be directed and designed by Hideo Kojima in collaboration with Guillermo del Toro. The teaser plays as a series of ever-changing hallway loops that the player navigates from a first-person perspective, with each pass marked by environmental changes and cues that, when sequenced correctly, unlock a final cut scene revealing the nature of the game and its role in the franchise. With the cancellation of the game, PT exists as a self-contained intellectual property, a game in itself, with an uncertain status as a videogame, demo, or teaser, as the mechanics of the full property were never defined (although the playable loop, with its puzzle-solving mechanic and its narrative and environmental emphasis on the haunted and the supernatural, is squarely tied to other Silent Hill installments). As a playable environmental loop, PT is a fairly close visual approximation of the spatio-temporal gradients and the power of its engine, although its closing cut scene re-attaches it to an already value-laden serialized intellectual property, fixing its status as something other than a mechanics-driven environmental problem (a puzzle) to be solved.
PT was developed with the Fox Engine, a proprietary cross-platform, cross-generational game engine built by Kojima Productions for use in Konami games, including Metal Gear Solid V. The engine was presented in a demo in 2011 at Konami’s pre-E3 conference. Like most engines, the Fox Engine contains a level editor, animation editor, cut scene editor, FX editor, UI editor, sound editor, motion editor and other toolsets. The engine features physically-based rendering, with support for reference models from real-world cameras to test the effects of linear space lighting and to maximize its photorealistic effect, and both 3D and photo-based scanning to create photorealistic character models. The goal of the Fox Engine is realistic, physically accurate output, realized by studying and measuring the details of the world outside the game while still situating photorealism in context--defined by the material rules of the game. One promotional piece for the engine positions two sets of images of a Kojima Productions conference room side by side--one set of digital photographs and a second set of screenshots created in the engine and rendered in real time--and frames the photo series with the prompt, “Is it REAL, or is it ‘FOX’?”
Figure 1: Promotional materials for the Fox Engine foreground photorealism.
Engine, code and 3D design, bound together in the game production pipeline, introduce operational limits in the service of realism. In their analysis of the Fox Engine, Stephanie Boluk and Patrick Lemieux (2017) suggest that the ongoing push toward (hyper)realism has led to a visual economy in which “all assets are equal and even the most banal subjects require graphical overkill” (p.129). Looking at Fox Engine artifacts and game assets more broadly, Boluk and Lemieux productively extend the concept of realism beyond the formal pursuit of mimesis to the ideological conceits of mastery--over the tools of production, over the engine-driven traces of the natural world (for example, modeling and texturing, lighting and shading), and over the player’s cognition and perception. To extend the concept even further, we need to call out the role of scripting as an essential aspect of all games--defining and classifying object types, connecting object types (for example, game objects and components) and messaging between object types to activate, deactivate and create new instances of an object.
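The scripting work named here--defining object types, connecting objects to components, and messaging between them to activate, deactivate or instantiate--can be rendered as a minimal model. The following Python sketch is illustrative only; it mimics the general shape of component-based engine scripting (such as a game-object/component pattern) without reproducing any engine's actual interface:

```python
# Minimal, invented model of component-based scripting: object types,
# attached components, and messages that toggle or spawn instances.

class Component:
    """A named unit of behavior or data attached to a game object."""
    def __init__(self, name):
        self.name = name

class GameObject:
    """An object type that owns components and responds to messages."""
    def __init__(self, name):
        self.name = name
        self.active = True
        self.components = []

    def add_component(self, component):
        # Connecting object types: objects are defined by their components.
        self.components.append(component)
        return component

    def set_active(self, state):
        # Messaging between object types often reduces to calls like this.
        self.active = state

    @classmethod
    def instantiate(cls, prototype):
        # Create a new instance of an existing object type.
        clone = cls(prototype.name + "_clone")
        clone.components = list(prototype.components)
        return clone
```

Even in this toy form, the pattern shows how scripting pre-classifies what an object can be and do before any image of it appears on screen.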
Ideology operates across the coding and decoding of assets, the core mechanics associated with the field of those assets, and the broader discourses of industrial production (the pursuit of more forceful correspondences between games and everyday life made possible by advancing software, hardware and devices). But our attention to this vast network of informatic code (Galloway, Gaming, 2006) is often fleeting--undone by the more seductive and unruly politics of representation and the more overt imperatives of engagement and interaction. Subsequent demo images of the Fox Engine feature rather unfortunate racialized choices, as pre-built characters are dropped into the Kojima conference room; devoid of context they reveal much about the underlying nature of play as a value-laden (and colonizing) process of game development. They also signal the forceful pursuit of a singular form of (photo)realism, and articulate an ideology (and a corporate philosophy) aligned with technological teleology.
Figure 2: The pursuit of photorealism is not without consequence, as illustrated in a public demo of the Fox Engine.
Visual pleasure and informatic code are woven together in the game production pipeline, for identity is a data type--a mathematical variable--that is literally bound to bodily articulation and played potential. The informatic mode of cybernetic typing is made manifest in everything from motion capture, to transcoding, to statistical analysis, to keying attributes and actions to specific numeric variables, and the degree of its prominence provides an index for the very dominance of informatic organization, which has recolonized the function of identity. We must not look beyond the representation; we must look underneath it to find its coordinates--seeing the mappable body as both a physiognomic system and a mechanical system. There is value in critical game praxis that mobilizes (rather than resolves) this tension between surface and substrate--that interrogates realism by calling out both the formal and structural properties of a game text (Galloway, 2004). This is not a call for a particular form of realism (narrative realism, representational realism or social realism), but rather a call for attending to how realism in all of its (interactive game-based) forms remains connected to game architectures. And while certain game components may be repurposed to more openly expressive ends, engines are driven by rules with limited variables. At the same time, the push toward realism in any of its forms is not linear or unidirectional. Multiple engines are running at any one moment: MT Framework properties run simultaneously with RE Engine properties, and multiple engines may still be at play within any game development company or community. These overlapping architectures are a matter both of game development timelines and of the technological needs and aesthetic choices of developers.
The Uncertain Queerness of Game Engine Logic
There is a broader lesson here in shifting our attention to the deeper architectures of game design while not abandoning the image altogether--a lesson that may be aligned with one of the possible futures of queer media studies. Studying software makes us mindful of the manner in which closed design and deployment systems push human agency toward technocentric certitude, and it situates the study of representation in a richer contextual field. These lessons about connecting algorithms to material outcomes are more important than the theorization and build of any single queer game mechanic. To realize a queer game mechanic necessitates building from the ground up with an acute awareness of our cultural context--using cultural theory to inform engine design, for example. Yet the industrial frameworks of the game industry, where labor is situated within a pipeline model, and the institutional lineages of programming languages make this an inauthentic value proposition. Constructing non-normative space, creating narrative incongruity, and evoking liminality are a few collective queer tactics that have been used to reorient play, but these are punctuations in storytelling that do not pierce the systems layer. Taken together, these gestures and objects have value in that they privilege queer history and experience; but code persists and the singular pursuit of realism is well underway.
Program code, like language in general, naturally oscillates between process and expression. The call for a “QueerOS” formulated by Fiona Barnett et al. (2016) looks to the unrealized promise of self-modifying interfaces; to cut through the “skin” of the interface is to cut through the heteronormative materialities of information, to restore the transformative possibilities of code. The possibility of a queer operating system starts with the assumption that most modern interfaces are, by design, sites at which code has already coalesced--they are prescriptive in form and function, “hygienic” distillations of the material world into discrete fields of information (Barnett et al., 2016). While code may seem fluid and open, as it flows across multiple agents, it also follows a call to order. The need to execute is a non-waivable protocol that cuts across industrial sectors and functions. The call to order emanates from and satisfies multiple bodies, and conjoins machine-like humans (programmers) and human-like machines (engines) in closed-loop information systems. Programmers, as industrial agents, express themselves through their manipulation of layers of representation, including symbols, words, language and notation. A significant complication in this study of bodies and languages has been explored by those opening up the overlooked queer history of computing. Jacob Gaboury’s (2013) exploration of this queer genealogy, and of the technical achievements of notable figures including Alan Turing, reaches beyond biography into code mechanics. Gaboury’s goal is to demonstrate that the structuring logic of computational systems does not fully erase queer subjectivity; any requisite semantic order cannot account for all forms of knowledge.
For Gaboury and other queer scholars, queering the field of computation has less to do with the subtle mechanics of code (or radical disruptions in its operational imperatives), and more to do with the ease of communication and contact, and the production of community (Gaboury, 2013). We should understand “communication and contact” in relation to the productive sharing of limits (of the body, identity and code work), and remain open to questioning whether community can actually be produced through work. In The Inoperative Community, Jean-Luc Nancy (1991) suggests that the work of capital is antithetical to community, for it can only privilege the general characteristics of its products.
New media theorist Friedrich Kittler characterizes the cultural problem and importance of code by suggesting, “Programming languages have eroded the monopoly of ordinary language and grown into a new hierarchy of their own” (1995). Code evades perception; as it reaches from simple operations into complex languages, it also moves through several distinct assemblies, passing from the fluid production framework of programmers to the relatively fixed mechanisms of hardware controls. The programmer invariably steps out of the equation, leaving the program to run on its own; as code evades cognition by any one observer, it also appears to write itself.
Despite my attention to code, I echo the sentiment expressed by Jennifer Malkowski and TreaAndrea Russworm (2017) in their study of game representation: code analysis, taken on its own, tends to silence otherness--to reconstitute margins. My goal is not to dismiss representational analysis, but to understand the complexity of its determinacy: a signpost often inscribed in the codebase, at an overlooked point in the development pipeline. To push representational politics forward, we must understand its many origin points. This is not an either/or proposition, of studying code or image. Malkowski and Russworm note that representation is tethered to software and hardware, but this dependency does not negate the politics of the image (which in the public arena is often of greater immediacy and consequence); rather, they suggest we must situate computational and representational code side by side, and understand their specific discursive (and functional) histories. This analytical model pushes further than the “unit analysis” proposed by scholars such as Ian Bogost (2006), although what binds these approaches together is the common search for limits and the concomitant search for affordances. The critical importance of situating image and code side by side is meant to undo traditional hierarchies of labor, knowledge and value; to excavate and examine code while sacrificing any careful attention to the image is to improperly imply that representation is merely a “surface effect” (Anable, 2018, p.137) of more substantive technological underpinnings.
The space for possibility, for radical queer sensibility, cuts across game layers (image, interface, the architectures of software and hardware) and shrinks through the process of development. While this is true as well in the more obvious signs of game trailers that fix meaning in parallel ways (making even the most open game a knowable, marketable property), the compromises that are made in the selection and use of the game engine are much subtler. The choices made in marketing material that set a series of limits are just as important as the default character rigs and the binary data sets that delimit “Hero_Gender_Male” or “Hero_Gender_Female.”
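The binary limit of such a data set is easy to make concrete. In the following Python sketch, only the rig names cited above come from the essay's example; the surrounding code is invented. A closed enumeration admits exactly two values, and any identity outside them is unrepresentable at the type level:

```python
# Sketch of the binary limit encoded in default character data.
# The two rig names are the essay's example; the code around them is invented.

from enum import Enum

class HeroGender(Enum):
    MALE = "Hero_Gender_Male"
    FEMALE = "Hero_Gender_Female"

def load_rig(value):
    # Enum lookup raises ValueError for any identity outside the enumeration:
    # the limit is enforced by the data type itself, before any image renders.
    return HeroGender(value)
```

The constraint here is not a representational choice made at the surface of the game but a structural one, fixed at the moment the data type is declared.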
Many independent game developers, in the pursuit of efficiency, have no choice but to tie their intellectual properties to the systematized writing associated with one of several proprietary game engines, and to see their works operate less queerly, less out of bounds. There is a broad range of engines and development environments--more dominant, costly, high-powered tools such as Unity and Unreal, and alternative, entry-level tools such as Twine and GameMaker. The latter have been celebrated for their accessibility. Twine is a free, open source browser-based tool that requires only basic HTML. GameMaker is primarily a 2D engine that features a drag and drop system, a proprietary scripting language (GML), and a fairly deductive build process of rooms, objects and pre-built script events; the simplicity of GML writing does not neatly align with the stricter requirements of more complex programming languages. Users of RPG Maker, a development engine for role-playing games that includes a number of pre-built assets, have formed communities of their own to overcome many of the engine’s inherent physiognomic defaults or absences--such as skin tone, hair color and texture--and to support each other in writing customized scripts in JavaScript or Ruby (Game Scripting System).
For queer game artists, the simpler mechanics (and alternative game languages) of Twine and GameMaker seem to facilitate qualities such as empathy, community and communicative openness. But we need to avoid understanding even the value of engine choice in such binary terms, as freedom, expressivity, multivocality and queerness are in every case delimited by software mechanics. Twine communities do not commonly intersect with Unity communities, which creates a set of linguistic (and perhaps class-based) barriers between those who have been taught to code and those who have not. Engine choice is commonly driven by language choice and proficiency; most Unity developers write code in C# and most Unreal developers write in C++ (or marry C++ with Blueprint, the engine’s visual scripting system). The affordances (and relative complexities) of programming languages parallel the affordances of their engines.
Alison Harvey (2014) has noted that accessible open-source tools such as Twine have queered the landscape of gameplay and creation by promoting game-making on the periphery--expanding and diversifying communities of practice and stimulating alternative economies of production and distribution. Yet all development tools work by trafficking in signs in pre-determined languages. As a whole, these practices reveal the evolving contours of technobiography: they speak to the degree to which the body is a network, experience can be quantified, and technology can play a role in both the expression and construction of self.
The adoption of expanded forms of AI by the game development community suggests one of several possible futures for a more expressive game mechanic. The contextual expertise afforded by AI in localized 3D environments has led to a number of limited experiments with real-time player-built 3D assets that trade on the autobiographical trace. Watson and Waffles (2016), created during an IBM-sponsored hackathon, uses Watson (a suite of AI services, tools and applications that are part of a growing IBM machine-learning ecosystem) to introduce a VR drawing mechanic into a pre-built Unity 3D environment (Ginn, 2018). Linking AI to the functional parameters of engine-based mechanics, the game transforms hand-drawn 2D forms into 3D objects (for example, a key, a ladder and a light bulb) that shape the field of play and form a fundamental part of puzzle-solving and progression. The 2018 launch of the IBM Watson SDK (software development kit) on the Unity Asset Store has fueled such early experiments by formally opening up the Unity development pipeline to integrated Watson cloud services.
Semantic openness is always relative. Game engine design, and the chronology of use and version history, limits and conceals what are otherwise mutable relations--that is the very nature of an engine. The value of the engine lies in efficiency (of labor) and re-use. As we read the engine to determine the intention behind its coded frameworks, we discover first and foremost that the need for radical clarity is a function of efficiency, reproducibility and operability. Computational space is structured to serve the organizing principles of the labor force--the seamless ability for programmers and designers to occupy their assigned positions in the production pipeline. In their discussion of code aesthetics, Michael Mateas and Nick Montfort (2005) suggest that “good” code facilitates these lived industrial relations as it “simultaneously specifies a mechanical process and talks about this mechanical process to a human reader” (p.153).
The experiences of movement, of environment, and of subject-object relations are all bound by the engine. In gameplay, there is no transformative enactment, for there is no fundamental unmasking of the environmental mesh, no fundamental reworking of the landscape, no undoing of static and dynamic elements. There is the experience of unfolding, which itself is governed by a memory allocation subsystem--yet another mechanical layer and functional imperative of the engine. In the broadest terms, there is no fundamental understanding of the conditions that make gameplay and the game world possible. And even the historical conditions of the engine are concealed by versioning, as swappable assets become the only knowable and comfortable signs of difference. Therein lies the bind: technological revealing circumscribes and cancels historicity and pulls us away from the socioeconomic conditions that drive the game industry pipeline. To be truly transformative, media studies must consider games as a process and a pipeline, or risk effacing the labor capital of a significant body of individuals in an industry studied primarily through its intellectual properties, its objects and its representations. The focus on labor is a logical extension of what Gerald Stephen Jackson (2017) refers to as the problem of complexity in software engineering, which erases the collaborative creative activity of individuals within the game development pipeline--a process that substitutes mastery for agency and has no tolerance for disruption. If game development is grounded in engine choice, then computation is performed only in relation to an engine; this is relational rather than open “computational performativity” (Jackson, 2017).
Gesturing toward collaboration, co-creation and community building, larger game companies have opened up their more robust development tools to independent artists. Most software developers have kept their intellectual properties open through a proprietary SDK while restricting the ability to publish and sell without a version license, and most have created pathways for sharing independently-authored assets. Most of the major game technology companies participate in and openly advocate for some version of a sharing economy. Game engine development follows one of three business models: free open source (the engine is distributed with source code), freeware (the engine is distributed freely but without source code), and commercial (the engine is proprietary and distributed through a royalty model or commercial contract). The Unreal Engine Marketplace launched in March 2014 with the release of the Unreal Engine 4. The Marketplace is an e-commerce platform that connects content creators with developers; the site facilitates micro-transactions of game-ready content and code (including 2D assets, characters, animations, special effects, materials and textures, environments and blueprints). The Marketplace (launched by developer Epic Games) is indicative of an emerging ecosystem approach that produces communicative openness, sharing, and collaboration.
There is significant value for queer game studies in advancing computational literacy and in considering the relative authenticity of the transactional calls to community found in a marketplace of object-oriented ideas and solutions; resource sharing is more often focused on collaborative problem solving (on pushing past a procedural limit) than it is on ideological decoding. This is not surprising, as code is a text most commonly understood in relation to the causal chains in machines (Tzankova and Filimowicz, 2017). Code has to work; but code is also a form of creative expression. Code is embedded in structuring transactions that foster deep cultural dependencies between organic and inorganic actors. While many queer game designers are indeed building and not simply occupying game space, and are writing code as a new form of technobiographic practice, engines are built with limits, and every development choice has its consequences.
The ongoing industrial migrations of game engines (across entertainment forms, 3D imaging, prototyping and manufacturing, scientific visualization, architectural rendering, cross-platform mixed reality, and the military) suggest their broad power for organizing the cultural field. In a multi-million-dollar deal with game developer Crytek, Amazon licensed the CryEngine in 2015 as a codebase for its own proprietary Lumberyard engine, with the goal of expanding the Amazon Web Services ecosystem by consolidating a suite of products and services for video game developers (tools for building, hosting and livestreaming). As game space and lived space continue to converge, bound together by new industrial arrangements in extended reality, the major 3D authoring tools and their respective parent companies will continue to have a significant impact on contemporary visual culture. Code is the new horizon for the culture at large. Algorithms may be statements of machinic discourse (Goffey, 2008), but their effects are broad and real.
If we understand code as the set of instructions that undergird software systems, to queer code is to understand it not simply as a functional constraint but also as a method for distributing norms. To queer code is to perceive those flashpoints where technology regulates social experience--where algorithms determine material outcomes. We need to continue to queer code space, but we also need to acknowledge the subjective and varied labors within the game development pipeline--where developers and publishers, artists and programmers have to arrive at a common language. Working within a localized industrial context, this language is a codebase necessarily born of a particular moment within a protracted build cycle. To queer code space, queer game studies must consider the engine as a foundational element that delimits an otherwise mutable process. If we understand the direct mechanisms for connecting code and control, we can properly assess the limits of control and lay a more appropriate groundwork for realizing free play in networked space.
References
Anable, A. (2018). “Platform Studies.” Feminist Media Histories 4, no. 2, 135-140.
Barnett, F., Blas, Z., Cárdenas, M., Gaboury, J., Johnson, J.M. & Rhee, M. (2016). “QueerOS: A User’s Manual.” In M.K. Gold & L.F. Klein (Eds.), Debates in the Digital Humanities 2016 (pp. 50-59). Minneapolis: University of Minnesota Press.
Blas, Z. (2008). “transCoder: Software Development Kit.” Queer Technologies 2007-2012. Retrieved July 23, 2018. http://www.zachblas.info/works/queer-technologies/.
Bogost, I. (2006). Unit Operations: An Approach to Videogame Criticism. Cambridge, MA: MIT Press.
Boluk, S. & Lemieux, P. (2017). Metagaming: Playing, Competing, Spectating, Cheating, Trading, Making, and Breaking Videogames. Minneapolis: University of Minnesota Press.
Chun, W.H.K. (2004). “On Software, or the Persistence of Visual Knowledge.” Grey Room 18 (Winter 2004), 26-51.
Damsgaard, J. & Karlsbjerg, J. (2010). “Seven Principles for Selecting Software Packages.” Communications of the ACM 53, no. 8, 63-71.
Engel, M. (2017). “Perverting Play: Theorizing a Queer Game Mechanic.” Television & New Media 18, no. 4, 351-60.
Gaboury, J. (2015). “Hidden Surface Problems: On the Digital Image as Material Object.” Journal of Visual Culture 14, no. 1, 40-60.
Gaboury, J. (2013). “A Queer History of Computing.” Rhizome, February 19, 2013. Retrieved June 10, 2018. http://rhizome.org/editorial/2013/feb/19/queer-computing-1/.
Galloway, A.R. (2006). Gaming: Essays on Algorithmic Culture. Minneapolis: University of Minnesota Press.
Galloway, A.R. (2006). “Language Wants to be Overlooked: On Software and Ideology.” Journal of Visual Culture 5, no. 3, 315-331.
Galloway, A.R. (2004). “Social Realism in Gaming.” Game Studies 4, no. 1 (November 2004). Retrieved June 10, 2018. http://www.gamestudies.org/0401/galloway/.
Ginn, C. (2018). “IBM & Unity Partner to Bring the Power of AI to Developers.” IBM DeveloperWorks Blog, May 3, 2018. Retrieved August 31, 2018. https://developer.ibm.com/dwblog/2018/ibm-watson-unity-sdk-developers-ai-ar-vr-speech-gaming/.
Goffey, A. (2008). “Algorithm.” In M. Fuller (Ed.), Software Studies: A Lexicon (pp. 15-20). Cambridge, MA: MIT Press.
Gregory, J. (2009). Game Engine Architecture. Natick, MA: A.K. Peters, Ltd.
Grizioti, M. & Kynigos, C. (2018). “Game Modding for Computational Thinking: An Integrated Design Approach.” IDC ’18 -- Proceedings of the 17th ACM Conference on Interaction Design and Children (June 2018), 687-692.
Harvey, A. (2014). “Twine’s Revolution: Democratization, Depoliticization, and the Queering of Game Design.” GAME: The Italian Journal of Game Studies 3, 95-107.
Hudlicka, E. (2009). “Affective Game Engines: Motivation and Requirements.” Proceedings of the 4th International Conference on Foundations of Digital Games (April 2009), 299-306.
Ishida, T. (2016). “RE Engine: An Engine Enabling the Artist to Fulfill Their Every Wish.” Capcom Integrated Report 2016 (The Latest Development Report), 5.
Jackson, G.S. (2017). “Transcoding Sexuality: Computational Performativity and Queer Code Practices.” QED: A Journal in GLBTQ Worldmaking 4, no. 2, 1-25.
Kittler, F. (1995). “There is No Software.” CTHEORY, no. 32 (October 18, 1995). Retrieved December 30, 2017. http://www.ctheory.net/articles.aspx?id=74.
Lessig, L. (1999). Code and Other Laws of Cyberspace. New York: Basic Books.
Lowood, H. (2014). “Game Engines and Game History.” Kinephanos, History of Games International Conference Proceedings (January 2014), 179-198.
Mackenzie, A. (2006). Cutting Code: Software and Sociality. New York: Peter Lang.
Malkowski, J. & Russworm T.M. (2017). “Introduction: Identity, Representation, and Video Game Studies Beyond the Politics of the Image.” In J. Malkowski & T.M. Russworm (Eds.), Gaming Representation: Race, Gender, and Sexuality in Video Games (pp. 1-16). Bloomington: Indiana University Press.
Manovich, L. (2013). Software Takes Command. New York: Bloomsbury.
Marino, M. (2006). “Critical Code Studies.” Electronic Book Review, December 4, 2006. Retrieved June 10, 2018. http://www.electronicbookreview.com/thread/electropoetics/codology.
Mateas, M. & Montfort, N. (2005). “A Box, Darkly: Obfuscation, Weird Languages, and Code Aesthetics.” Proceedings of the 6th Digital Arts and Culture Conference (December 2005), 144-153.
McPherson, T. (2012). “U.S. Operating Systems at Mid-Century: The Intertwining of Race and UNIX.” In L. Nakamura & P.A. Chow-White (Eds.), Race After the Internet (pp. 21-35). New York: Routledge.
Montfort, N. & Bogost, I. (2009). Racing the Beam: The Atari Video Computer System. Cambridge, MA: MIT Press.
Nancy, J-L. (1991). The Inoperative Community (P. Connor, Ed., P. Connor et al., Trans.). Minneapolis: University of Minnesota Press.
O’Donnell, C. (2014). Developer’s Dilemma: The Secret World of Videogame Creators. Cambridge, MA: MIT Press.
Reed, A. & Phillips, A. (2013). “Additive Race: Colorblind Discourses of Realism in Performance Capture Technologies.” Digital Creativity 24, no. 2, 130-144.
Tzankova, V. & Filimowicz, M. (2017). “Introduction: Pedagogies at the Intersection of Disciplines.” In V. Tzankova & M. Filimowicz (Eds.), Teaching Computational Creativity (pp. 1-17). New York: Cambridge University Press.
Yahagi, T. (2012). “How Dragon’s Dogma Changed the MT Framework.” GregaMan Blog, April 5, 2012. Retrieved August 31, 2018. http://www.capcom-unity.com/gregaman/blog/2012/04/05/how_dragons_dogma_changed_the_mt_framework.
Yang, R. (2017). “On ‘FeministWhorePurna’ and the Ludo-material Politics of Gendered Damage Power-ups in Open-World RPG Video Games.” In B. Ruberg & A. Shaw (Eds.), Queer Game Studies (pp. 97-108). Minneapolis: University of Minnesota Press.
Ziarek, K. (2001). The Historicity of Experience: Modernity, the Avant-Garde and the Event. Evanston: Northwestern University Press.
Zimmerman, E. (2009). “Gaming Literacy: Game Design as a Model for Literacy in the Twenty-First Century.” In B. Perron & M.J.P. Wolf (Eds.), The Video Game Theory Reader 2 (pp. 23-31). New York: Routledge.
Ludography
Call of Juarez: Bound in Blood (Techland, 2009)
Dead Island (Techland, 2011)
Dead Rising (Capcom, 2006)
Kitchen (Capcom, 2015)
Mass Effect (BioWare, 2007)
Metal Gear Solid V (Kojima Productions, 2015)
Monster Hunter: World (Capcom, 2018)
P.T. (Kojima Productions, 2014)
Resident Evil 5 (Capcom, 2009)
Resident Evil 7 (Capcom, 2017)
Resident Evil Zero (Capcom, 2002/2016)
Resident Evil: Revelations (Capcom, 2012/2017)
Resident Evil: Revelations 2 (Capcom, 2015)
Silent Hills (Kojima Productions, not released)