Individualism and Chess

My thoughts recently have been rather scattered, but I think their underlying theme is individualism, and how technology increasingly leads us to treat people as part of a collective. I’ve been bouncing between thoughts on government surveillance, obscurity, the culture wars, and chess, and more generally on the de-emphasis of the individual that is necessary to analyze the aggregate. While I’m completely in favor of large-scale analytics, we also need to recognize that individuals wish to be treated as individuals, and I think this de-individualization is at the heart of many modern problems.

A Queen’s Gambit?

I’ve mentioned before that I think the solution most people intuitively want for the warrantless government surveillance question is to require the particularization of searches that don’t qualify as 4th Amendment “searches”. This would require the government to have some level of suspicion before collecting your records, rather than being allowed to vacuum up all of the data that is currently unprotected. This solution is ideologically simple, but it goes against a long history of Supreme Court jurisprudence and arguably could have negative consequences. The rationale behind the current system is that we don’t want courts performing minute, individualized analyses of police officers’ motivations. The word “search” could be applied to an enormous range of traditional police activities, including extremely trivial ones, and allowing each to serve as potential grounds for a 4th Amendment challenge would make prosecution even more litigious and pedantic than it already is. Instead, we created a bright-line, constitutional “search” standard, and police have free rein to do anything below that standard (activities I will refer to as “non-searches”).

The point that this bright-line rule potentially misses, however, is that police officers have an inherent filter performing this individualizing analysis regardless. A police officer has limited time and resources, so even non-searches are conducted with some degree of individualization. They see something suspicious, or hear a noise, or simply recognize a pattern of crimes in a particular area. Whatever the reason, the point is that police officers are acting on individualizing criteria that the Supreme Court has decided we don’t want courts to reexamine. This inherent filter, coupled with the practical impossibility of searching everything, created an expectation among ordinary citizens that, although they could be subjected to these non-searches, it would only realistically happen when a police officer had some reason to suspect them of illegal activity. We expect to be treated as individuals.

The difference now is that police computers don’t share police officers’ limited time and resources, removing the need for this inherent filter. Programming computers to disregard the obviously irrelevant is surprisingly difficult, and current processor speeds make it much easier to simply look at everything: a so-called “brute force” approach. Brute force is common in computing, and while inefficient, it is extremely easy to program and tends to be quite effective.
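To make the contrast concrete, here is a minimal sketch in Python, with invented record data and a hypothetical flagged_by_tip field standing in for an officer’s individualized suspicion. The brute-force version scans every record it can reach; the filtered version refuses to examine anything that hasn’t first been individualized.

    # Invented example data; "flagged_by_tip" is a hypothetical stand-in
    # for some individualized reason to suspect this particular person.
    records = [
        {"name": "A", "flagged_by_tip": False},
        {"name": "B", "flagged_by_tip": True},
    ]

    def looks_suspicious(record):
        # Placeholder for whatever analysis the system runs; the point
        # is not what it checks but when it gets applied.
        return record["name"] == "B"

    def brute_force_scan(records):
        """Look at everything: trivial to write, infeasible for a human."""
        return [r for r in records if looks_suspicious(r)]

    def filtered_scan(records):
        """Only examine records that first cleared an individualized
        filter, emulating the officer's inherent filter in code."""
        return [r for r in records
                if r["flagged_by_tip"] and looks_suspicious(r)]

Note how the brute-force version is actually the shorter, lazier one to write; the individualizing filter is the part that takes extra effort.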

Check

Which brings me to chess. Computer chess programs have been around since the 1950s (excepting “the Turk”), and it soon became an ongoing challenge to write a program that could reliably defeat the world’s chess Grandmasters. Despite booming processor speeds, it took a surprisingly long time, and this difficulty has been attributed to our human ability to subconsciously filter out the obviously bad moves. Truly skilled chess players’ brains don’t even consider bad moves, which preserves their focus for the potentially good ones. Computers, on the other hand, will analyze every possible move, which can be an extremely time-consuming process. (The program analyzes all of the current options, then each countermove to each option, then each countermove to those countermoves, and so on, ad infinitum.) While this can be done fairly easily today thanks to modern processor speeds, in the past programmers had to emulate this human filter in their algorithms. Scanning everything proved too processor-intensive, so they needed to cut corners, and they did so by removing individual moves that weren’t worth considering. Nowadays, technology allows us to be lazy, and this laziness makes it easier to analyze the aggregate than to impose criteria that individualize.
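The scale of the problem is easy to see: a chess position averages roughly 35 legal moves, so looking just six half-moves ahead means on the order of 35^6, nearly two billion positions. Below is a minimal sketch of the filtering idea, assuming a hypothetical Game interface (legal_moves, apply, evaluate, is_over) rather than any real chess library. The first function is pure brute force; the second adds a crude “human filter” that searches only the few most promising candidates at each step.

    # Hypothetical Game interface (not a real library): legal_moves(),
    # apply(move) -> new Game, is_over(), and evaluate() -> a score
    # from the perspective of the side to move.

    def negamax(game, depth):
        """Brute force: recurse into every legal move."""
        if depth == 0 or game.is_over():
            return game.evaluate()
        return max(-negamax(game.apply(m), depth - 1)
                   for m in game.legal_moves())

    def filtered_negamax(game, depth, keep=5):
        """Forward pruning: a cheap heuristic ranks the moves and only
        the top `keep` are searched; the rest are discarded as
        'obviously bad', as a strong human player would."""
        if depth == 0 or game.is_over():
            return game.evaluate()
        # Lower opponent score after the move = better move for us.
        ranked = sorted(game.legal_moves(),
                        key=lambda m: game.apply(m).evaluate())
        return max(-filtered_negamax(game.apply(m), depth - 1, keep)
                   for m in ranked[:keep])

Real engines rely on sounder techniques (alpha-beta pruning, for instance, proves certain branches irrelevant rather than guessing), but the intuition is the same: the filter is what makes deep search affordable.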

Privacy En Prise?

While I’ve expressed previously that I don’t think this brute force approach to police non-searches violates the 4th Amendment, it may erode a right I suspect lies in the penumbra of the Constitution: the right to obscurity. Now, the right to obscurity isn’t a real Constitutional right; I made it up for this post. But it reflects an unspoken assumption that ordinary citizens have a right to go unnoticed and unscrutinized unless there is some reason for the scrutiny. It is akin to the right to privacy (also in “the penumbra”), which is the foundation for the rights to abortion and to private intimacy. Both rely on a fundamental right against excessive government involvement in our private lives. The right to obscurity would say that we have a right not to be subjected to governmental scrutiny without a reason. That reason might be as weak as a police officer’s hunch, but at least it’s a reason. Even police non-searches would require the officer’s human filter to perform some level of individualization before subjecting an individual to scrutiny.

As I articulate it here, this right wouldn’t protect very much. It wouldn’t give you a right to challenge an individual police officer’s decision to knock on your door or approach you on the street, because those acts passed the officer’s inherent filter: they individualized you. The officer may have done so for discriminatory, biased, irrational, or otherwise unsound reasons, but they did it for reasons specific to you. That would be enough to overcome the right to obscurity. It would really only protect against ubiquitous computerized scrutiny and massive data collection done simply for the sake of collection. It wouldn’t stop targeted computerized scrutiny, but it would require that the target be individualized. The 4th Amendment may still require higher standards, but this would set a minimum. The goal would be to recreate the practical protections that computers have been rapidly eradicating, and it would do so by requiring human decision-making to be the impetus for government involvement in our lives.

Checkmate?

Although a right to obscurity would ostensibly provide a solution to the warrantless wiretapping concerns many have, I am unconvinced it is really the best solution. To begin, the majority of the data collection at issue technically isn’t direct government surveillance, at least not in the traditional manner. In the case of the telephone metadata collection, the FISA court issues orders to telecommunications providers to disclose metadata they were already collecting. The information at issue is data the providers necessarily collect to manage their businesses: whom you call, when you called, and for how long are required to facilitate the communication, and so must be provided to the telecom company. Any right to obscurity would be rendered somewhat meaningless here, both because the collection wasn’t done by the government and because we tacitly agreed to disclose this information. The alternative would be like asking a courier to deliver a letter, but refusing, for privacy reasons, to tell them where to take it.

We may instead construe the right to obscurity more broadly, such that it prevents the government from requesting our information from third parties without individualization, but that raises more problems. Do we have an absolute right to demand individualization? Or does it weaken as we share information with more and more companies? Can we share information with everyone except the government and still demand that government pretend not to see it? And what if the information is acquired by the government through otherwise legitimate means? Are they required to have the data, but not think about it?

The current solution proposed by the government is something of a middle ground, and it actually recognizes this desire for individualization. The data will still be collected for government use, but it will be held by a third party, and government agencies may request searches only upon meeting some minimum criteria. While this wouldn’t allow for individualization at the collection stage, it would add safeguards for ordinary citizens that prevent excessive government involvement in their lives. There would still be issues of oversight, and the method of searching would need to be appropriately minimized, but in general this solution would provide a workable framework for limiting direct surveillance of individuals and thereby help preserve the right to obscurity.

Endgame

But perhaps more fundamentally, I actually think placing greater reliance on computers will help protect civil rights. For one thing, our traditional notions of privacy do not really apply to computers. We don’t think of privacy as keeping information from computers; privacy is in relation to other people. If we craft systems that operate through entirely automated means, this actually enhances traditional privacy, as it removes the need for other humans to view and assess our private information. While computers don’t have our filter, an entirely computerized system is its own form of filter. And while I’m sure some will proclaim such a development as signaling a Terminator-esque dystopian future, I, for one, will welcome our new robot overlords.

Furthermore, computers do not have inherent biases. I briefly alluded to this problem earlier: humans come with a host of preconceived biases, prejudices, and discriminatory tendencies that are difficult to explicitly identify. Whether a police officer’s stop was made for legitimate reasons or prejudicial ones is nearly impossible to determine, and this allows such practices to continue. Computers, by contrast, do not have this problem. If a computer algorithm is unjustly discriminatory, this will be apparent simply by reading it, and it can be easily remedied. Transparency in police algorithms would open them up to scrutiny by independent advocacy groups that could identify and remove problematic code, helping ensure that the actions of the computer conform to whatever societal standards we choose to impose.
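As an illustration of what that auditability might look like, here is a deliberately simple, entirely hypothetical “stop priority” scoring rule; the feature names and weights are invented for this example and reflect no real system. The point is that every criterion the system uses has to appear explicitly in the code, where an outside reviewer can see it.

    # Hypothetical weighted criteria; nothing here reflects a real system.
    WEIGHTS = {
        "matches_reported_description": 5.0,
        "near_recent_crime_scene": 2.0,
        "prior_related_convictions": 1.5,
        # An unjustly discriminatory criterion (say, "race": 3.0) would
        # have to be written down here too, where any auditor reading
        # the code would immediately see it and could demand its removal.
    }

    def stop_priority(subject):
        """Sum the weights of every criterion this subject satisfies."""
        return sum(weight for feature, weight in WEIGHTS.items()
                   if subject.get(feature, False))

Unlike an officer’s unspoken hunch, the criteria here cannot be applied without first being declared.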

This is not to say that computers don’t have their downsides. Since computers follow specified algorithms, they are inherently predictable, and that predictability can be exploited. During the mid-2000s, when chess programs were nearly unbeatable, top players adopted anti-computer strategies designed to work against the programs’ algorithms and exploit their shortcomings. While these ultimately still fell to superior processing power, they illustrate the potential for criminals who understand government surveillance systems to circumvent entirely computerized ones.

This relates more generally to problems with transparency. Too much transparency can sometimes undermine the purpose of the system, as is the case with law enforcement. But inadequate transparency could lead to arbitrary or secret determinations being made about our lives, without any direct human oversight. Much like the debate over drones, I think many people feel that entirely automated processes lack a “human element” that provides moral oversight. People like Edward Snowden and Chelsea Manning, regardless of whether you agree with them, were only possible because they were part of that system. If such systems were entirely automated, these secret initiatives would lack any form of whistleblower oversight. I’m not convinced this is a substantial problem, as there will always be some human involvement, but it certainly should be on our radar.

And as a final point, automated searches raise interesting questions about what criteria we can or should utilize in our automated systems. Once we acknowledge that we should individualize, what characteristics of the individual should we look at? Suppose we create an airport terrorist surveillance system: are we allowed to consider the race, nationality, religious beliefs, or First Amendment protected speech of the individual? Does our answer depend upon how this impacts the system’s efficacy? Should we focus purely on results, so that whatever gives the highest success rate is favored? Or are the criteria used more important than the ends they achieve? Entirely automated systems can certainly be in tension with the individualism I started this post with, but they also offer one of the strongest ways to ensure it.

I originally wanted to go on to discuss how I think our society increasingly fails to treat individuals as individuals, especially in online spheres, but my posts already run too long, so I’ll leave that discussion for another time. For now I’ll just point out that our interactions online are often with minimalist avatars, which allow us to remove the individual and project whatever ideology conforms to our biases. To borrow a phrase from John Green, our unwillingness to imagine each other complexly leads to discourse where opposing sides argue against straw men they construct atop their opponents’ de-individualized avatars. We don’t like to be lumped into the aggregate; we want to be treated as individuals.

-Scott
