With Prospect of Muslim Registry, Programmers Consider a Code of Ethics

Can an innovative, rapidly expanding profession abide by a single set of rules?

Illustration by mastaka

Bill Sourour was twenty-one years old when he coded a site that marketed a drug to teenage girls—a project he still regrets.

In 2000, he landed his first desk job at an interactive marketing firm in Toronto that developed websites for pharmaceutical companies. One site, a project for an unnamed client, featured a quiz that used a list of symptoms to suggest therapies meant for women. The results almost always led to the client’s drug (which Sourour declined to name, since the company that distributed it is currently involved in a lawsuit). The site that Sourour built wasn’t illegal—it complied with Health Canada regulations on pharmaceutical advertising—but at the time, he felt it misled users into thinking it was an unbiased source of information.
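The mechanics of such a quiz are simple: a scoring table can be weighted so that almost any combination of answers points to one product. The Python sketch below is purely hypothetical; the symptom names, therapy labels, and weights are invented for illustration, not taken from Sourour's project.

```python
# A hypothetical illustration of a skewed symptom quiz. None of these
# names or weights come from Sourour's project; they are invented.

THERAPIES = ["client_drug", "generic_treatment", "see_a_doctor"]

# The client's drug earns points for nearly every answer, so it wins
# almost regardless of what the user selects.
ANSWER_WEIGHTS = {
    "acne":        {"client_drug": 2, "generic_treatment": 1, "see_a_doctor": 0},
    "mood_swings": {"client_drug": 2, "generic_treatment": 0, "see_a_doctor": 1},
    "cramps":      {"client_drug": 2, "generic_treatment": 1, "see_a_doctor": 0},
    "none":        {"client_drug": 1, "generic_treatment": 0, "see_a_doctor": 0},
}

def recommend(selected_symptoms):
    """Tally weighted scores and return the highest-scoring therapy."""
    totals = dict.fromkeys(THERAPIES, 0)
    for symptom in selected_symptoms:
        for therapy, weight in ANSWER_WEIGHTS.get(symptom, {}).items():
            totals[therapy] += weight
    return max(totals, key=totals.get)

print(recommend(["acne", "cramps"]))  # -> client_drug
print(recommend(["none"]))            # -> client_drug, even with no symptoms
```

To the user, the quiz feels like a personalized assessment; to anyone who can read the weights, the result was never in doubt.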

Yet the young developer followed his instructions: he made the questions multiple choice and crafted an aesthetic for the site that was clearly aimed at girls. The client was so satisfied with the final product that he took Sourour out for a steak dinner at the Keg Mansion. “Everyone understood that this is how these pharmaceutical companies can essentially advertise to consumers,” Sourour says. “We were all sort of going along.” Later, he read about the drug’s major side effects and risks—depression and suicide—and about a girl who took the drug and killed herself.

Coders will inevitably be asked to complete questionable assignments throughout their careers. In public, responsibility lies with the executives who green-light projects and who might choose to ignore problems or deceive users. Blame rarely falls on the developer, but coders are not just cogs in the product development process—they’re also crucial to its success. Increasingly, we turn to our screens for information and rely on programmable technology for daily life—security systems, medical devices, even our car brakes. Without a developer willing to put in the time, most of these products wouldn’t exist; and if coders simply follow instructions with a blind eye to the end result, the ethical consequences can be severe. In 2014, Toyota admitted it knew about fatal manufacturing faults that caused cars to suddenly accelerate. A year later, when Volkswagen was caught cheating on diesel emissions tests, the company’s US president blamed “a couple of software engineers” for equipping cars with software that would defeat emissions controls.

In one of the most jarring examples of an ethical breach in coding, engineering errors and quality-assurance oversights in a computerized radiation-therapy machine named Therac-25 caused six patients, including one in Hamilton, to receive massive radiation overdoses in the 1980s; some died, and others suffered serious complications. “Technical problems are intertwined with ethical nuances, and ignoring either can lead to disaster,” researchers Keith Miller and Don Gotterbarn wrote in a 2009 article for Computer. The intent to do wrong doesn’t matter, according to the two professors, since engineers have an obligation to act in good faith.

Now that American president Donald Trump has started to act on his election promises—such as signing an executive order (later blocked in federal court) barring entry to citizens of several Muslim-majority countries—the need for a professional code of ethics has returned to the spotlight. Weeks after the election last November, the first-ever Bay Area Tech Solidarity meeting brought designers, programmers, and other tech workers together to discuss the implications of data and programming under a Trump administration. Together, about fifty of them drafted a pledge for workers who refuse to build databases that would “target individuals based on race, religion, or national origin”—such as Trump’s promised Muslim registry. (The letter cites the use of Hollerith punch cards, which were produced by IBM’s German subsidiary under the Nazis. The resulting data catalogue was used to classify German society, and it eventually included information that helped carry out the Holocaust.) About 2,800 tech workers have since signed on, and over 135 tech company leaders have signed a similar pledge.

Other companies are publicly distancing themselves from a prospective registry: Twitter has promised to take action against developers mining tweets for surveillance purposes; Facebook, Apple, Google, and Microsoft have said they would not cooperate. These companies are already collecting information about their users’ ethnicities, religious beliefs, and races, which can be sold by data brokers for commercial purposes, but whether or not that information is used to build a Muslim registry depends on each company’s ethics—and any regulations that might force them to comply.

Having a government registry is not a far-fetched concept. A system for tracking visitors from Muslim-majority countries was put in place in the US a year after 9/11, requiring travellers from Iran, Iraq, Syria, and other countries to check in with authorities when entering and leaving the country. More than 13,000 people were placed in deportation proceedings, and others were barred from applying for permanent residence or citizenship. The developers who built it had national security in mind, but the system was ineffective: it resulted in zero terrorism-related charges, was effectively defunct by 2011, and was dismantled at the end of 2016. Without coders willing to build such a data-driven program again, it would be hard to enforce any kind of “Muslim registry” in 2017.

The answer to the problem of coding ethics may be a framework that would govern professional developers and hold them accountable—similar to the regulatory bodies that oversee doctors and lawyers in Canada. That framework doesn’t exist yet, but in 1997, the Association for Computing Machinery, which counts close to 100,000 international members, published a programmer’s code of ethics. It instructs developers to accept responsibility for the material they help produce and to act in the interest of the public good.

In theory, these principles apply to all developers, no matter their discipline. In reality, it’s impossible to ensure that ethics are universally understood and maintained; by its very nature, coding is largely unregulated. Developers work in different languages for a variety of organizations, whether public or private, legal or illegal. The novelty of software and web development—its openness and ubiquity, its impartiality and anonymity—is exactly what makes it difficult to supervise. With the number of coders growing rapidly every year, many of them self-taught or trained at community boot camps, worldwide ethical standards are becoming even harder to enforce.

The risk is that if coders don’t regulate themselves, governments will intervene. Top-down regulations could limit who is considered a coder, or what tools can be used to build software. This could kill innovation in a field that thrives on developing new techniques. It’s also important to keep a separation between government and limitless data: federal departments already have access to troves of personal information that can be used against citizens. Think, for example, of the records gathered by Canada’s national police force, such as mental-health incidents and past charges that never led to convictions. Without developers to build a database that is functional and easy to use, that information is mostly harmless. When compiled and shared by authorities in ways that violate people’s trust, it can be used to further marginalize sections of society.
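To see why the “functional and easy to use” part matters, consider how little code it takes to merge scattered records into a single, searchable profile. The sketch below is hypothetical; the record fields, identifiers, and names are invented for illustration.

```python
# Hypothetical illustration of why compiled data is more dangerous than
# scattered records: once the sources are joined on a shared key,
# singling people out takes a single filter. All records are invented.

charges = {
    "A1042": {"name": "J. Doe", "charge_no_conviction": True},
    "B2077": {"name": "R. Roe", "charge_no_conviction": False},
}
health_flags = {
    "A1042": {"mental_health_contact": True},
    "B2077": {"mental_health_contact": False},
}

# The "database" step: merge the two sources into one profile per person.
profiles = {
    pid: {**charges[pid], **health_flags.get(pid, {})}
    for pid in charges
}

# Once merged, profiling is a one-line query.
flagged = [p["name"] for p in profiles.values()
           if p["charge_no_conviction"] and p["mental_health_contact"]]
print(flagged)  # -> ['J. Doe']
```

Scattered across separate filing systems, those facts are hard to act on; merged behind one query, they become a ready-made tool for profiling.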

After he published an article about the code he regrets, Sourour contacted a coding boot camp in New York and emphasized the importance of ethics training. He regularly writes about ethics in his newsletter for developers and, since leaving the marketing company in his early twenties, has gone on to work for government organizations developing medical-practice tools. Now based in Ottawa, Sourour is selective about the code he writes and treats his experience with pharmaceutical companies as a lesson for newcomers. “[A coder] is always going to be asked to do shady stuff,” he says. That doesn’t mean they should take the job.

Michelle Pucci
Michelle Pucci is a producer at CBC North in Iqaluit and a former Chawkers editorial fellow at The Walrus.