Social Problems

The Internet is a Built Environment

In Social Problems, Joel Best provides a constructivist framework for understanding how a wide array of issues – distracted driving, Pluto’s planetary status, abortion – are made the focus of societal concern and action. His examples clearly demonstrate that technology, culture, and other facets of society may change which problems are constructed, but that the tactics and strategies used to construct them have remained very similar over time. However, embedded in Best’s framework is an assumption that prevents us from applying it directly to the internet: he repeatedly, and sometimes explicitly, assumes infrastructures are neutral in design and use. The internet, originally called ARPANET, was constructed for the US Department of Defense during the Cold War “to meet the needs of military command and control […] and improve military tactical and management decision making.” If we are to understand how the internet can be and is used in the construction of social problems, we cannot assume that platforms built and institutions operating upon a foundation of surveillance and control treat information, or people, neutrally.

One of the most explicit examples of this occurs when, in a discussion about the media, Best asserts that

claims of all sorts can be posted and made accessible to anyone who knows how to use a search engine; in effect, the Internet has an unlimited carrying capacity. Because there is little in the way of filtering, the Internet offers a forum for even the most controversial claims […]. (p. 142)

While it’s true that the internet essentially removes the carrying-capacity limits inherent to physical media (e.g., how many songs can fit on a cassette tape, how many words can legibly fit on a newspaper page), it’s demonstrably false that the internet, or the companies and services that rely on it, are impartial about what information they provide, and to whom. Google has “personalized” its search results since 2005, and in recent years has demonstrated a willingness to compromise on governmental censorship in order to re-establish a foothold in the Chinese marketplace. Dr. Safiya Umoja Noble’s forthcoming book, Algorithms of Oppression, explicitly links search algorithms to the replication and reinforcement of sexism and racism. Social media companies rely on filtering as a core part of their service, updating and tweaking their algorithms as their business needs change. Sometimes a company will provide details publicly, marketing the change as a feature rather than a bug, as when Twitter moved from a chronological timeline to an algorithmic one in 2016. Other times, the fact and details of the filtering are kept quiet, leaving media companies and individual “influencers” alike to track the changes for themselves.
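To make the contrast concrete, here is a minimal sketch in Python of the difference between a chronological timeline and an engagement-ranked one. It is purely illustrative, not any platform’s actual code; the Post fields, scoring weights, and cutoff are assumptions invented for this example.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    posted_at: datetime
    likes: int
    reshares: int

def chronological_feed(posts):
    # Every post, newest first: the "unlimited carrying capacity" Best describes.
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

def ranked_feed(posts, limit=10):
    # Score posts by engagement and truncate the list. The weights and cutoff
    # here are arbitrary stand-ins for the opaque ranking real platforms use.
    def score(p):
        return 2 * p.reshares + p.likes
    return sorted(posts, key=score, reverse=True)[:limit]
```

Even in this toy version, the ranked feed quietly decides which claims a user ever sees, which is precisely the filtering the passage above assumes away.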

New competitors are beginning to address these concerns: Mastodon is a decentralized, free and open-source Twitter competitor with a chronological timeline, and DuckDuckGo reports that its search engine “does not track personal information.” However, filtering is not just an issue at the level of an individual service or website. Just a few days ago, the FCC notified the US Senate of its intention to repeal net neutrality regulations. If the repeal is enacted, internet service providers (ISPs) like Comcast and Verizon will be allowed to slow down access to, paywall, or render inaccessible whichever websites they choose. In an environment with a robust and competitive marketplace, consumers could vote with their dollars and support companies that do not throttle, paywall, or block content. With dwindling competition among ISPs, especially at higher connection speeds, households in much of the United States have little or no true choice in who provides them with digital access to entertainment, information, retailers, civic services, etc.

Despite our description of materials as born-digital, people as digital natives, and hardware as next generation – a strange and uncomfortably reproductive vocabulary – the internet is part of the built environment. We craft both the physical (underground cables, servers, wifi access points, etc.) and the virtual (software, social media platforms, algorithms, etc.) structures and processes that comprise the internet. Sometimes our design decisions come from intention and sometimes from ignorance; sometimes we understand the outcome and sometimes we don’t. Most dangerous are the (please forgive me) unknown unknowns, the impacts we don’t anticipate and don’t even realize we need to anticipate.

Earlier this year, the fitness-tracking company Strava released a data visualization that revealed the secret locations of American, French, and Italian military bases around the world and exposed data that could be used to track an individual user’s movements. The release wasn’t an accident; Strava has been collecting these data for years and spent a significant amount of time and energy re-engineering the map’s entire codebase to handle the scale of data the visualization required. The company simply failed to recognize, investigate, or perhaps care about the danger that the isolated pockets of activity on the map represented. Instead of interrogating Strava’s actions, many media outlets placed the blame on individual soldiers.
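To illustrate how that kind of aggregation can expose locations, here is a simplified sketch in Python. It is not Strava’s actual pipeline; the grid size, activity threshold, and function names are assumptions made for this example.

```python
from collections import Counter

def aggregate_heatmap(pings, cell_size=0.01):
    # Bucket (lat, lon) pings into grid cells and count activity per cell.
    cells = Counter()
    for lat, lon in pings:
        cells[(round(lat / cell_size), round(lon / cell_size))] += 1
    return cells

def isolated_hotspots(cells, min_activity=50):
    # Flag hot cells whose neighbors are all quiet: a bright patch of exercise
    # in the middle of an otherwise dark region is itself a disclosure.
    hot = {c for c, n in cells.items() if n >= min_activity}
    def neighbors(cell):
        x, y = cell
        return {(x + dx, y + dy)
                for dx in (-1, 0, 1) for dy in (-1, 0, 1)} - {cell}
    return [c for c in hot if not (neighbors(c) & hot)]
```

No single GPS ping names a base, but once pings are bucketed and counted, an isolated cluster of activity in an otherwise empty region points straight at a perimeter patrol or a single user’s running loop.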

Certainly, the irony of systems built for surreptitious military surveillance being used to surreptitiously surveil the military is delicious, and the military perhaps needs to have a conversation with individual servicemembers about protecting their data. Ultimately, however, Strava is solely responsible for having aggregated, visualized, and released these data publicly. Whether the harm was intentional or inadvertent, Strava was able to do this (and for what, marketing?) because surveillance and data analysis are precisely why the internet was built and the purposes for which we continue to refine it. This is the logical consequence of the choices we have made: building a worldwide network, placing endpoints on as many desks and in as many pockets as possible, and failing to put in place adequate physical, virtual, legal, or other societal barriers that would protect us.

What we need is a fundamental shift in how we think about the internet, and about digital spaces more generally. They are part of the built environment, and so any framework that treats them as natural will have shortcomings. We need to be even more critical of the digital built environment, however, because our unknown unknowns are still so vast. Many government and public agencies use software that relies on formulas (referred to as black box algorithms) that even their programmers don’t understand. We cannot start from a position that these spaces and structures are neutral, agnostic, or equitable. We cannot assume that harm happens accidentally or because of a mistake, or that good intentions can mitigate consequences. We must recognize the system is not broken; the system is working exactly as designed.
