*NB: This article originally appeared in Model View Culture on January 20, 2015. I am reproducing it on my personal blog for posterity.*
What does it mean to design computers for everybody?
Hardware is too often ignored when it comes to improving inclusivity.
Advances in computer hardware are marked by revolution – the personal computing revolution, the multiprocessor revolution, the mobile computing revolution. A new class of hardware design revolutions are approaching rapidly: the widely-predicted end of Moore’s Law calls for drastic innovation to continue making faster processors, and the reduced cost of commodity hardware from companies such as Arduino, Raspberry Pi, and Adafruit reduces barriers for hobbyists to experiment with full-featured microprocessors.
Despite these radical changes, most existing efforts to improve access to computer science education leave computer hardware out. Coding camps and online resources proudly proclaim that “anybody can learn to code,” but do little to explain how code actually runs on a computer. Even programmers seeking to learn for themselves how code executes are told to ‘RTFM’ and questioned on their intent. Meanwhile, existing efforts to create a more inclusive hardware community fall short. The Open Source Hardware Association, despite a board of directors that is 50% female, has only 6% non-male membership. Make Magazine, a popular resource for DIY hardware development, has featured only six women in nine years of publication, and no people of color. While both the conversation about diversity in technology and the accessibility of commodity hardware are becoming more mainstream, the two remain lamentably disconnected.
A rigid focus on programming when promoting diversity in tech ignores the growing need for hardware-conscious software design. Writing software for different platforms, from the iPhone to the Internet, requires fundamental knowledge of the platform you’re developing for: mobile developers must balance fast code against battery life, and scaling up Internet companies demands an understanding of server technologies and data center infrastructure. As computing focuses more on new technologies and hardware platforms, an emphasis on only the ‘fun, approachable’ parts of computer science becomes problematic. Rather than ushering minorities into computing while hiding its negative stereotypes, we should reimagine how we introduce the scariest, most ‘computer’-y parts of computing, like assembly code and processor design, and change the common perception of hardware. As hardware becomes a more dominant force in driving technology, leaving women and minorities out of hardware design leaves them out of shaping the future of computing.
Video games as a window into hardware design
In my first computer hardware design course, our professor inspired us to love hardware design through vintage video games. As we built up our knowledge from 1s and 0s to bigger logic circuits, he would show us those same circuits in the original logic boards of arcade games like Pac-Man and Space Invaders. We learned that the engrossing, visual experience of playing arcade games had deep roots in fundamental hardware design concepts. This contextualization of hardware design within a fun, familiar environment cemented our interest in digging around the inner workings of a computer.
The notion of using exciting applications to motivate hardware design isn’t restricted to the classroom. In the late 90s, a small company named NVIDIA designed the graphics processing unit (GPU) to render computation-heavy video game graphics more quickly. Today, the company leads processor-design heavyweights Intel and AMD in market share for graphics chips, a success derived from tying its hardware advances to the demand for 3D video game graphics. Video games are a $100 billion industry, and computer architects assert that the rise of the GPU is a direct result of gaming’s economic power.
The GPU’s design for parallel graphics processing was so much more powerful than standard computer processors that scientists and researchers began reshaping other large computations to fit the GPU’s model. Programs ranging from genome sequencing to statistical analysis capitalized on the efficiency of GPUs to complete data-heavy tasks quickly, and thus the idea of the general-purpose GPU (GPGPU) took form. GPUs have become so standard in non-graphics applications that NVIDIA and other companies now produce more flexible, programmable GPUs for GPGPU use in high-performance computing and supercomputing. What was once a niche accelerator for video games is now tasked with running chemistry simulations and particle accelerator experiments, and is found in virtually every smartphone, laptop, and desktop computer.
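To make the “fit the GPU’s model” idea concrete, here is a minimal sketch of the GPGPU programming style: one small “kernel” function is applied uniformly across many data elements. This is an illustrative example of my own (the SAXPY operation and all function names are assumptions, not from the article); on a real GPU, each element would be computed by its own hardware thread via CUDA or OpenCL, rather than a Python loop.

```python
def saxpy_kernel(i, a, x, y, out):
    """One simulated GPU 'thread': computes a single element of a*x + y."""
    out[i] = a * x[i] + y[i]

def launch(kernel, n, *args):
    """Simulate launching n parallel threads of the kernel.

    On a GPU these iterations would run concurrently; the key property
    that makes a computation 'fit the GPU's model' is that no iteration
    depends on any other.
    """
    for i in range(n):
        kernel(i, *args)

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0] * 3
launch(saxpy_kernel, 3, 2.0, x, y, out)
print(out)  # [12.0, 24.0, 36.0]
```

Genome sequencing and statistical analysis workloads were “reshaped” for GPUs in exactly this sense: rewritten as many independent per-element kernels instead of one long sequential loop.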
The rise of the GPU enables snappy animations on our iPhones and smooth movie playback on our laptops, and it has also had an impact on scientific communities beyond computer science. But video games should not be the sole motivating example driving all novel computer architectures, nor the only vehicle for introducing computer hardware to new learners. Although women make up almost half of gamers, they comprise only 12% of video game developers. Sexism and misogyny are rampant in video games, fostering an environment of harassment and abuse for female game developers and journalists. When GPU companies work so closely with video game companies, and promote video games first on their websites, they only exacerbate the association of hardware with communities that are unfriendly to women.
On a broader scale, the evolution of computers designed for the needs of a few into computers for all introduces biases against minorities. Psychology researchers have found that the mere presence of video games in a computer science classroom discourages minorities from expressing interest in computing. In recent months, littleBits CEO Ayah Bdeir argued that the design of conventional electronics recalls an alienating history, and MIT Media Lab professor Leah Buechley lamented that maker communities cater only to rich, white men. As we move forward, we must consider which applications and communities we include and exclude when designing commodity hardware.
What we talk about when we talk about hardware revolution
In an environment where a successful startup can shape the design of computers 15 years later, we cannot afford to ignore any chance of opening access to the future of hardware design. At best, ignoring this opportunity means continuing a history of computing tied to oppressive spaces. At worst, disregarding the role of inclusion in the internet of things will lead to a society that is interconnected, but only for a privileged few. A future of computers for everyone requires profitable application spaces to drive production, new flagship features to get people organically interested in hardware, and hardware designs with lower barriers to entry.
A computer with a text editor, a browser, and an Internet connection is all it takes to write code and explore new software. Every addition to a circuit-design starter pack, by contrast, implies the financial means to acquire more gates and LEDs, access to programmable hardware devices, and a pre-existing ability to navigate lamentably under-supported software toolchains. Computer software has lower overall costs, and its educational resources are more freely available. We need to reduce the cost of flexible, powerful hardware to increase access for those who want it.
We also need the hardware industry to acknowledge a diversity of interests and influences across the full stack of computing, and to produce chips that inspire new classes of devices. Most importantly, we need to recognize that the applications that drive innovation also drive interest in hardware design, and ensure that we aren’t promoting interest only within the same homogeneous groups that have always been there.
Luckily, much of the foundation for change is already in place. $20 Raspberry Pis and Arduino boards help bridge the gap between programmable chips and standard laptops, beginning the work of bringing affordable, programmable computers to artists, musicians, and students. Crowdfunding allows Bitcoin enthusiasts to finance bespoke hardware designs tailored specifically to their mining needs. Risk-averse venture capitalists are becoming more amenable to funding hardware companies that create novel designs, with more and more hardware startups and accelerators taking form. Computer processors are getting smaller and smaller, giving credibility to the dream of an “internet of things,” in which objects and items can all independently communicate and interact. While this vision of continuous surveillance is not without problems, the internet of things is quickly approaching and projected to dominate all other computing markets.
“internet of things” as an opportunity for change
While some are skeptical of the sweeping claims Internet-of-Things enthusiasts make (all of our objects will have embedded computers that talk to each other! our roads will be connected to our houses and our phones!), smaller-scale efforts like fitness tracking and home automation have seen sizable success. What’s compelling about the movement toward an internet of things is that it requires design and computing innovation in every space, for all things.
At its most basic, the “Internet of Things” is a call to drastically re-imagine how technology is designed and deployed across our world, from chip-level hardware design to user interfaces. Many of our assumptions about how computers are designed, what they are deployed to do, and how humans interact with them will go out the window, and with those assumptions should go the deeply ingrained currents of sexism, racism, and homophobia littering computer history and design. Without careful, calculated efforts to improve the status quo, though, tech may stay trapped in its biased ways. Voice recognition systems in cars still literally ignore women’s voices, even though calls for improvement have been sounding since the early 90s. The Chief Technologist of the FTC continues to be served Google ads suggesting she has been arrested, because of her black-sounding name. “Women must be part of the design teams who are reshaping the world, if the reshaped world is to fit women as well as men,” write CMU researchers Jane Margolis and Allan Fisher in Unlocking the Clubhouse. As we reshape our interactions with computers once again, it would be naive to repeat the tech industry’s past mistakes.
Opening up IoT to women and minorities offers a number of financial and social incentives. Economically, serving a more diverse audience can ease the overhead of producing new computing hardware: chip fabrication is expensive, with a new fabrication plant costing $3–4 billion to construct. Conveniently, women control $20 trillion in annual consumer spending, and are woefully underserved in most electronics markets. Additionally, education initiatives are easier to improve when hardware is targeted at specific applications. Rather than striving to reach many children with the same dumbed-down hardware kits, new applications in the embedded space give children real-world inspiration to learn computing not only through video games or toy puzzles, but through any of their favorite objects or environments with embedded computers, from cars to cookware to sports. Including a variety of perspectives in the hardware design process opens up opportunities to reach broader communities.
It’s a well-known aphorism that “hardware is hard.” Improving diversity in technology is even harder. By unifying these two goals under the common theme of pervasive computing technologies, we can create new paths for inspiring diverse participation and building better technologies. This is a clear opportunity to improve upon all the hardware revolutions that have come before, by taking stock of the rich and diverse world we live in and including more voices in the hardware design of an interconnected future.