Esprii Chapman – Notes on IMPOVERISHED ALGORITHMS: MISGUIDED GOVERNMENTS, FLAWED TECHNOLOGIES, AND SOCIAL CONTROL. – 11/26/20 (Article 8 of 10)

This “collect it all” mindset, which endorses the gathering and retention of as much data as possible, serves two functions. First, it reifies data analytics as a tool of state control. Second, it creates the massive data sets necessary for predictive analytics. Additionally, governments’ propensity to collect, store, and search ever-increasing amounts of information fosters a public-private symbiotic relationship that reinforces the market for algorithmic social control systems. The problem is that all large data sets are “dirty,” filled with errors and mistakes. (388)

Error rates in government systems are often exacerbated when governments combine data from data brokers with already problematic government data. Large data sets are also vulnerable to generating their own errors in the form of false or spurious statistical relationships, because the risk of an algorithm surfacing a statistically significant but contextually meaningless connection between variables increases as the size of data sets increases. Data error leads to faulty predictions and potentially dangerous, incorrect decisions. In 2009, Justice Ginsburg warned that “inaccuracies in expansive, interconnected collections of electronic information raise grave concerns for individual liberty.” That warning was not heeded, and as more governmental decisions become automated, the situation has only worsened. (389)
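The statistical point here is the multiple-comparisons problem: test enough variables and some will look “significant” by chance alone. Below is a minimal sketch of my own (not from the article) using NumPy and SciPy; every variable is pure noise, and the row and column counts are arbitrary illustrative choices.

```python
# Minimal sketch: why wide data sets surface spurious "significant"
# correlations purely by chance. All data below is random noise.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_rows = 500        # individuals in the hypothetical data set
n_features = 2000   # unrelated attributes collected about each one

outcome = rng.normal(size=n_rows)                  # target: pure noise
features = rng.normal(size=(n_rows, n_features))   # also pure noise

# Test every feature against the outcome at the conventional 0.05 level.
false_hits = sum(
    pearsonr(features[:, j], outcome)[1] < 0.05
    for j in range(n_features)
)
print(f"{false_hits} of {n_features} unrelated features look 'significant'")
# Roughly 5% (about 100) pass the test despite zero real relationships,
# and the count of such false hits grows with the number of variables tested.
```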

Data errors contained in the large data sets used by governments are almost impossible to challenge or correct, not to mention that the existence of the databases themselves is often kept secret. (391)

Because problematic algorithms rely on flawed and biased data sets, governmental decision-making is bound to repeat and reinforce the discrimination and bias already embedded in past conduct used to control marginalized populations. (393)
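The reinforcement dynamic the article describes can be shown with a toy feedback loop. The sketch below is my own (the districts, rates, and patrol rule are invented assumptions, not the article’s data): a system trained on a skewed arrest history keeps directing enforcement toward the historically over-policed group, even though the underlying rates are identical, so the skew never corrects itself.

```python
# Minimal sketch: a predictive system trained on biased historical data
# reinforcing that bias through a feedback loop. All numbers are invented.
import random

random.seed(0)

# Two districts with identical true incident rates, but a skewed arrest
# history produced by past over-policing of district A.
true_rate = {"A": 0.10, "B": 0.10}
arrests = {"A": 200, "B": 50}

for year in range(1, 6):
    total = sum(arrests.values())
    # The "predictive" step: allocate 100 patrols in proportion to past arrests.
    patrols = {d: round(100 * arrests[d] / total) for d in arrests}
    # Each patrol records an incident with the same probability in both
    # districts, so any difference in new arrests comes entirely from
    # where the patrols were sent.
    for d, n in patrols.items():
        arrests[d] += sum(random.random() < true_rate[d] for _ in range(n))
    print(f"year {year}: patrols={patrols}, cumulative arrests={arrests}")
# Patrols stay concentrated in district A year after year: the biased
# history reproduces itself even though the true rates are equal.
```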

Using private vendors to build algorithmic social control technologies is problematic for a variety of reasons, not least because it increases opacity and undermines accountability. (406)

  • Examples:
    • Facial recognition software was donated to an Arizona school system in the hope of developing a market for the technology in schools. (406)
    • Amazon designed a facial recognition system that it is currently marketing to various law enforcement agencies. (406)

A recent study directed Freedom of Information Act (FOIA) requests to a variety of state and local agencies to test how responsive government agencies would be to requests for information about predictive analytics. The study found that aggressive trade secret claims and NDAs, while not the only obstacle to transparency, were a major hurdle in obtaining information about the predictive systems. This lack of transparency about the algorithms creates a lack of accountability that implicates all the issues of bias and unfairness inherent in big data analytics. (407)

Valentine, Sarah. “IMPOVERISHED ALGORITHMS: MISGUIDED GOVERNMENTS, FLAWED TECHNOLOGIES, AND SOCIAL CONTROL.” Fordham Urban Law Journal, vol. 46, no. 2, January 2019, pp. 364–427.
