A constitutional black hole

[dropcap]I[/dropcap]n the four or so decades since the term ‘Internet’ broke into the lexicon, technology has become irrevocably embedded in our everyday lives. Language scrambles to keep pace with the rapidly developing tech industry.

In some cases, language fails to express concepts emerging from the technological world. There is perhaps no greater example of this than ‘the cloud.’

The cloud refers to remote servers. Rather than storing your data locally on your hard drive, you can store data remotely on servers owned by someone else. When you use Google Drive, Dropbox, or iCloud, you are using cloud servers to store data elsewhere, saving space on your computer so it runs efficiently.

It seems like a good deal, especially since, given the amount of space offered on free accounts, many people will never need to pay for cloud space.

Yet the concept of storing our data in a ‘cloud’ is problematic; it makes it seem as though our data is somehow floating in space. It is rare that we are asked to picture the physical structure of the Internet — a material system designed to move binary information, or bits, from one place to another. In reality, ‘the cloud’ is a tangible network of computers, connected by bundles of fibre optic cables that run across land and sea, and by radio towers receiving and sending wireless signals. Its near-seamless operation leaves our data out of sight and out of mind.

Similarly, conceptualizing the scope and physicality of mass surveillance is challenging, as is acknowledging its consequences. But in our post-Snowden world, questioning the rhetorical power of the term ‘the cloud’ is important. Outsourcing our data is convenient, but it all too often exposes our information to security risks, and, more jarringly, to spying government agencies like the National Security Agency (NSA).

As Canadians who often choose to store our data in the US, we should be concerned by the no-holds-barred approach of American intelligence and security agencies, and the near unrestricted access they are granted through cooperation with private corporations.

Increasingly, Canadian universities are outsourcing their email services to these companies in the name of convenience and affordability — indeed, the services offered by Google and Microsoft far exceed those offered in-house by public institutions in quality, storage space, and features. Plus, they’re often free, while costly homegrown e-communications operations put a strain on IT departments and budgets. Nevertheless, there is a truth to be faced: outsourcing online communications comes with a distinct set of risks to the privacy of thousands of students, staff, and faculty across the country.

U of T outsources student email

In 2011, when U of T opted to outsource its student email services, both Microsoft and Google were courting Canadian universities with offers of limited-term free e-communications services.

Universities across the country were signing contracts with the two tech giants, and it was quickly becoming the norm to work with one of the two companies to get free student email and all the perks that come with it — including plenty of cloud-based storage.

The university took some measures to get students’ input on the change, including striking a consultation committee made up of students, staff, and faculty. The university’s chief information officer, Robert Cook, issued a response to the findings in which he stated that if money were no object, the most “desirable route to follow” would be a U of T-managed system.

“We have not pursued a detailed quotation, but it would obviously cost millions of dollars for hardware, software, system porting and ongoing development staff,” he wrote.

In a follow-up report issued in May of 2010, Cook specified that the estimated annual operating cost of maintaining in-house UTORmail services for the 159 distributed email systems offered by different divisions at the university was roughly $1.44 million.

According to the students surveyed, the most highly desired email feature was a large storage quota, followed by an integrated calendar service and online file storage. For faculty, who continue to use UTORmail, storage remains a major concern. Accounts typically have between 50 and 150 MB, as opposed to the 50 GB to 1 TB typically available on outsourced systems. With so little space, users regularly have to empty their accounts in order to send and receive new messages, particularly those containing large files.

Citing the ever-growing maintenance costs for UTORmail, the will of students, and preliminary research into outsourcing options, Cook recommended outsourcing email services to Live@EDU with Microsoft Canada. The switch took place in September of 2011, with new students given Microsoft email accounts, and current students given the option to opt in or to keep their existing accounts. Soon, faculty will face the same decision.

“People weren’t raising hard questions”

Professor Andrew Clement of the Faculty of Information first heard in the fall of 2013 that the university was considering moving staff and faculty emails to Microsoft services as well.

“It was mainly treated as an administrative change,” he recalls.

Several years prior, Clement had begun a research project on the movement of data between the United States and Canada, the privacy implications of that movement, and the risk of interception by the NSA. Knowing that Microsoft servers were housed in both the US and Canada, and that Microsoft was the first company to join the NSA’s PRISM program — which grants the agency direct access to its data — Clement attended a town hall meeting on the potential switch to express his concerns.

Heidi Bohaker, a professor in the Department of History, also attended the meeting. Upon hearing that the new email service would be cloud-based, Bohaker recalls, “that raised a lot of interesting questions immediately to my mind, in terms of where the data is going to be stored.”

Bohaker then contacted Lisa Austin, a professor in the Faculty of Law, who also had concerns about the potential changes and connected Bohaker to Clement, who was organizing a ‘teach-in’ event in November to discuss the implications of outsourcing e-communications for faculty and staff.

The university had gone ahead with producing a comprehensive Information Risk and Risk Management Report on the proposal. “UTORmail, the University’s legacy institutional email service, is near end-of-life and requires significant investment to bring [it] up to current industry standards,” the report reads. It goes on to describe the success of the migration of student accounts to Microsoft services in 2011, with the stated objective of transitioning faculty and staff e-communications.

Clement contends that the university was “averse” to addressing the surveillance risks associated with migrating to cloud-based services. “People weren’t raising hard questions,” he recalls. “I think they… were very weak, and we shone a light on them.”

Clement also drew attention to the role of Microsoft employees in putting together the Privacy Impact Assessment cited in the report.

“[The university] got substantial help from Microsoft which is, in my view… quite inappropriate, for them to play such a strong role… so [Microsoft] actually wouldn’t want to draw attention to their participation in the PRISM program, or generally the surveillance risks,” he says.

Clement’s efforts to shed light on the privacy risks paid off — the university’s plans to migrate faculty and staff emails petered out, and UTORmail continues to provide centralized services for @utoronto.ca email addresses.

Clement, Bohaker, and Austin united to produce the Seeing Through the Cloud report, released in 2015, which detailed outsourcing efforts across Canada, questioned the logic of Privacy Impact Assessments employed by universities in contracting Microsoft and Google, and highlighted the risks of outsourcing email services.

Of particular concern to Clement was the direct access that the NSA would have to Microsoft servers.

“If Microsoft was going to host the U of T email service in the US, then that’s all the communication — they [the NSA] don’t have to intercept it on the fly,” he describes. “Getting direct access to the server of Microsoft and others means they can go in at will and look back at previous emails… [T]his is a great deal more of a risk… and something that I thought needed to be debated on campus.”

[pullquote]“Getting direct access to the server of Microsoft and others means they can go in at will and look back at previous emails… [T]his is a great deal more of a risk… and something that I thought needed to be debated on campus”[/pullquote]

The university struck a Faculty and Staff eCommunications Advisory Committee to submit a recommendation on email services. The committee recommended that the university negotiate a contract with Microsoft to extend UTmail+ to staff and faculty, beginning with email and calendar services, although a small number of committee members did not endorse the recommendation and submitted a dissenting report. The committee also recommended that the university call for legislation to combat mass surveillance, develop an in-house encryption service, continue to offer UTORmail for divisions that opt out of UTmail+, and offer locally hosted file sharing.

Marden Paul, director of Planning, Governance, Assessment & Communications in the office of the Chief Information Officer, says that the university is still considering a number of factors in its decision-making process. “We’ve heard from many faculty and staff members that the current communications and collaboration technologies don’t meet their needs, and for some time, the university has been working to assess the requirements for better tools,” he said in a statement to The Varsity. “The process has included in-depth assessments of privacy and security, analysis of risks and how to mitigate them, consultations with faculty and staff, review of reports from the IPC and other interested parties, and the overall benefits to be attained by [the] university community from better tools.”

A national trend towards outsourcing

The university’s decision to outsource student email services came as part of a national trend towards outsourcing online communications in Canadian post-secondary institutions, beginning in 2006 with Lakehead University’s transition to Google Apps for Education.

The reasons cited by Canadian universities for the decision to outsource tended to centre on three main points: cost savings; students’ expectations of a faster, better service; and the declining quality of aging in-house systems.

U of T was uniquely transparent during the process of outsourcing student email. The Information Risk/Risk Management Document remains publicly available for download on the Information + Services website. No other university can claim a publicly accessible Privacy Impact Assessment (PIA).

Despite this commendable transparency, the consultation process, like those at other universities, was insufficient. At several post-secondary institutions, students were asked to compare an “aging” in-house system with a state-of-the-art outsourced one; the option of an improved in-house system was not explored in depth and was consistently presumed to be prohibitively expensive, despite the absence of meaningful research.

Additionally, the university’s PIAs did not reflect the potential privacy risk of storing data in the United States (invoking the ‘similar risk’ argument, under which data stored in Canada and the US is presumed to be equally secure) and claimed that the level of privacy that should be expected was that of a postcard — that those using the service should accept the possibility that their correspondence could be read, and behave accordingly. This sentiment stands in opposition to the Supreme Court of Canada’s statements that Canadians have the right to an expectation of privacy when it comes to their email.

John P. Dirks, a history professor and research assistant on the Seeing Through the Cloud report, recalls, “both in the PIAs and even in the contracts, but especially in the PIAs… there was a readiness to dismiss broader privacy concerns based on an assumption that while data is always shared between [Canada and the US]… Canada had a long-term intelligence sharing and information sharing protocol with the Americans… [The universities] were really dismissing it as a non-issue.”

“A constitutional black hole”

“From the legal policy standpoint, you can really see courts and legislatures struggling to keep up with the changes as they occur,” says Daniel Carens-Nedelsky, a graduate student of law who worked with Austin to examine the privacy regulations surrounding extra-national outsourcing in both Canada and the United States.

The university’s decision to outsource e-communications services was based in part on the notion that the risk of state surveillance of, and state access to, e-communications is similar regardless of whether data is stored in Canada or the US. The Seeing Through the Cloud report claims that this argument is based on “faulty assumptions, factual errors, and a surprisingly limited expectation for privacy in eCommunications.”

Carens-Nedelsky explains that the similar risk argument emerged from a 2005 Freedom of Information and Protection of Privacy Act (FIPPA) review, when a group of IBC customers filed a complaint over concerns about the USA PATRIOT Act, because the company was contracting services in the US.

According to Carens-Nedelsky, the privacy commissioner found that the complainants’ concerns were legitimate, but that the Canadian government had a similar ability to access their information — hence the similar risk argument. The commissioner ruled that the focus should be on each individual contract, not on the country in which the data is stored.

The 2005 FIPPA ruling has since been invoked in a wide variety of cases. The argument is simple: it doesn’t matter where you store your information; the risk is the same. Yet, according to Austin and Carens-Nedelsky, this assumption is patently false.

“[Similar risk] is thoroughly incorrect but makes some sense coming from the FIPPA context,” explains Carens-Nedelsky. “What they’re used to dealing with is looking at [companies’] internal policies and contracts, [and ensuring that those are] properly protective of privacy. Are they abusing customers’ private information? Are they selling it to third parties? That’s what it’s designed to catch. It’s not designed to catch this comparative constitutional law question.”

The Canadian Charter of Rights and Freedoms limits the government’s ability to access individuals’ private information. Yet the question remains: how best to protect privacy across jurisdictions?

More troubling is that neither country’s constitution protects the communications data of the other country’s citizens: a Canadian resident whose data is stored in the US will not necessarily be granted protection from third-party access. According to the Seeing Through the Cloud report, such data falls into “a constitutional black hole, where the constitutional protections of neither country apply.”

“We have this 2007 ruling that says when you’re not in Canada, you don’t have the constitution applied to you,” says Carens-Nedelsky. “You’re bound by whatever legal standards there are where you are. In the US, there was a corresponding ruling that said US constitutional privacy protection [doesn’t] apply to non-American citizens, or people without substantial connection to the US.”

According to Carens-Nedelsky, this black hole presents a major problem. “[The data] has no constitutional protection… when something has no constitutional protection, it doesn’t immediately mean the government can access it, but it means the government can write whatever legislation it wants and there will be no ability for judges to say that’s unconstitutional legislation,” he explains.

[pullquote]“[The data] has no constitutional protection… when something has no constitutional protection, it doesn’t immediately mean the government can access it, but it means the government can write whatever legislation it wants”[/pullquote]

So what does this all mean for student and faculty privacy? “For academics in particular, there are some specific concerns around academic freedom,” says Carens-Nedelsky, “both [in terms of the] ability to criticize the US government and… [there is concern for] exchange students from countries who are writing very private confidential reports that may be critical of their own regime. [They] are writing in Canada on the presumption that this will be well protected… the risk that this could get out is material and worrying.” He adds, “Academics routinely deal with highly confidential information [and] confidential sources… much [of this] research works on the assumption that this will be held confidential. If you’re storing this information on US servers, that is not actually a statement you can make with any confidence.”

The physical Internet

Underlying the legal uncertainty surrounding data movement between the US and Canada is the concern that the US government is able to collect a vast amount of information from citizens of other countries. Email communications are easily intercepted by American institutions because of the flow of Internet data: a great deal of international traffic passes through routers housed in the United States.

“We live in a post-Snowden world,” says Dawn Walker, a student at the Faculty of Information and a research assistant on the Seeing Through the Cloud report. “For most people, that means absolutely nothing, but for other people, it’s totally changed how… they’re navigating the world and their relationship with their government and their relationship with the American government.”

All communication on the Internet is based on packet switching. Every piece of communication transmitted over the Internet — for example, an email — is broken down into a series of small packets upon sending. Each packet contains a fragment of the message’s content as well as several pieces of metadata, including a header with the message’s source and destination IP addresses.

Each packet then moves through a series of routers, each of which reads the header to see where the packet is going and passes it along to the next router on the way. Once the packets arrive at their destination, they are reassembled into the original message.
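To make the mechanics concrete, here is a toy sketch of packet switching in Python. It is an illustration only — real packetization is handled by the TCP/IP stack, and the field names, packet size, and addresses below are invented for the example — but it shows how every fragment of a message carries routing metadata that any router along the path can read.

```python
# A toy illustration of packet switching. Real packetization is handled
# by the TCP/IP stack; the field names, the 32-byte packet size, and the
# IP addresses below are invented for this example.

PACKET_SIZE = 32  # bytes of message content per packet (arbitrary choice)

def to_packets(message: str, src: str, dst: str) -> list[dict]:
    """Split a message into packets, each carrying a small metadata header."""
    data = message.encode("utf-8")
    chunks = [data[i:i + PACKET_SIZE] for i in range(0, len(data), PACKET_SIZE)]
    return [
        {
            "src": src,          # source address: metadata visible to every router
            "dst": dst,          # destination address: likewise visible
            "seq": i,            # sequence number, so the message can be reassembled
            "total": len(chunks),
            "payload": chunk,    # one fragment of the actual message
        }
        for i, chunk in enumerate(chunks)
    ]

def reassemble(packets: list[dict]) -> str:
    """Sort packets by sequence number and recover the original message."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    return b"".join(p["payload"] for p in ordered).decode("utf-8")

# Example run, with made-up documentation-range addresses.
packets = to_packets("Meet me at the library at noon.", src="203.0.113.7", dst="198.51.100.9")
print(len(packets), "packets, each labelled", packets[0]["src"], "->", packets[0]["dst"])
print(reassemble(packets))
```

Notice that even if the payload were encrypted, the source and destination fields — the metadata — would remain readable at every hop along the way.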

Boomerang routing occurs when packets begin and end their route in the same country but pass through another one on the way. This often convoluted path is rarely the most efficient.

“[Boomerang routing] isn’t the fastest way to route data,” explains Clement. “It’s kind of a myth that the Internet routes to optimize the speed of transmission. More important than that is the arrangement that the various carriers make between each other as to who they hand off traffic to.”

Large carriers have vast numbers of routers inside large, unmarked buildings in major urban centres, linked to one another through fibre optic cables that can transmit tens of billions of bits per second. The decisions these companies make about whether, and how, they allow other carriers to connect to their networks influence the paths that information takes as it is sent across the Internet.

Physically, the infrastructure of the Internet is both stunningly massive and virtually invisible, contained in bundles of wires and unremarkable buildings packed to the brim with computers, and packaged to consumers simply as ‘the cloud.’

Through the PRISM program, the NSA has access to stored Microsoft data, as well as stored data from Google, Facebook, Twitter, and Apple — this was the crux of the argument against outsourcing e-communications at U of T to Microsoft. But the NSA can also access data in motion through its programs that intercept communications while in transit.

In order for the NSA to access data in transit, it engages with carriers who can create splitter sites, where a copy of all data in transit is diverted through new fibre infrastructure built to support surveillance. Clement’s IXmaps project serves both to illustrate Internet traffic and boomerang routing, by showing the routes of various messages, and to identify NSA “listening posts,” showing where users’ data is most likely being intercepted.
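IXmaps builds on traceroute data contributed by users, and the raw information is something anyone can observe. The sketch below — which assumes a Unix-like system with the standard `traceroute` tool installed, and uses a placeholder destination — simply runs a trace and prints each router hop on the way. Mapping those hop addresses onto physical locations, as IXmaps does, is what reveals boomerang routes.

```python
# A minimal sketch of observing a route from your own machine, in the
# spirit of (though far simpler than) IXmaps. Assumes a Unix-like system
# with the standard `traceroute` tool installed; the destination host is
# just an example.

import subprocess

def trace(host: str) -> list[str]:
    """Run traceroute and return one line per router hop along the path."""
    result = subprocess.run(
        ["traceroute", "-n", host],  # -n: print numeric IPs, skip DNS lookups
        capture_output=True, text=True, timeout=120,
    )
    # The first line is a header; each remaining line is one hop.
    return result.stdout.splitlines()[1:]

# Every hop printed here is a router that read the probe's header and
# passed it along — and a point where traffic could, in principle, be copied.
for hop in trace("example.com"):
    print(hop)
```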

In the pursuit of countering terrorist activity, former NSA director General Keith B. Alexander adopted a “collect it all, tag it, store it” strategy, according to a former US intelligence official quoted in The Washington Post in 2013, who added, “Rather than look for a single needle in the haystack, his approach was, ‘Let’s collect the whole haystack.’”

Walker points out that upstream data capture by the NSA adds another layer to concerns over email outsourcing. “If your thing is on the Internet, it’s captured,” she says. “So is that the point of where you want to focus your attention here? Or do you want to focus on more technically secure email systems?”

Policy solutions

So what options are available to organizations? According to the Seeing Through the Cloud report, at least, U of T should keep its faculty email services in-house for the time being, while a conversation about the way forward takes place on campus and alternatives are seriously considered.

“What I would really like to see happen here at U of T is a really vigorous discussion about that,” says Bohaker. “With everything on the table, including… the value of our metadata.”

Aside from holding off on transitioning faculty e-communications, there’s also the possibility of letting students opt out of the outsourced services.

In order to address privacy concerns related to outsourcing, the report suggests institutions like U of T could regularly update PIAs, take measures to keep data local, and make risk assessment documentation public to ensure transparency.

In contrast to university reports on outsourcing, Clement contends that in-house email services may be a viable option. British Columbia, for example, where the outsourcing of public-sector data is provincially prohibited, has developed cloud services that allow public institutions to store data in-province.

“It depends on the priorities of the university,” Clement explains. “If [U of T] decided that this actually was a priority, they could do it — they could bring back email services… They could develop as has happened in British Columbia… It’s not perfect, but there’s certainly options. It could be wound back if there was a will.”

Another option is to continue to use vendor services while revisiting contracts, building policies that define stricter privacy requirements, creating local infrastructure, and requiring vendors to keep data local. At the provincial level, as in British Columbia, politicians can craft policies that keep data local to better protect data security and privacy.

“There’s multiple challenges to privacy, to our civil liberties, to our public institutions when our data is not protected,” says Clement. “Universities do have a responsibility. If your university is not fulfilling that responsibility, then that I think becomes a concern.”

Looking forward

Like the term ‘the cloud,’ the term ‘email’ is misleading when you consider the way the Internet operates. We think of the postal service as relatively secure — we know it is illegal to open another person’s physical mail, and while it may be intercepted on the way to its destination, the risk is higher that it will simply get lost.

Often, we view our email with a similar expectation of privacy; but while the idea that email is ‘like a postcard’ fails to hold water from a policy standpoint, it may be a functional reality when it comes to the security of our data, and especially our metadata.

Bohaker notes that many people seem not to care about their data being exposed to intelligence agencies.

“[O]ne of the responses I’ve had [to the Seeing Through the Cloud report] is people say, ‘well, I don’t do anything… that is problematic behaviour… [If] people want to look at my cat videos and my knitting blog, what’s the big deal? … I find [it] quite disconcerting because I don’t think people quite understand how privacy laws are necessary to protect us all.”

You may not care about being surveilled by the US government or other countries’ intelligence agencies because you have nothing to hide. But from a security perspective, when your data is stored on cloud servers, there is a risk it could be exposed to others — including your peers or colleagues.

“Some [students] definitely are… at greater risk than others,” Clement says.

Privilege plays a major role in who can afford to be less concerned about mass surveillance. Students from countries governed by authoritarian regimes, for example, have more cause to be concerned about Microsoft servers abroad. Students involved in activism have more cause to be concerned about their organizing being surveilled. And of course, faculty conducting research that governing bodies consider controversial would have cause to be concerned about freedom of thought if their e-communications data were stored remotely.

“[E]ven if at the present time you don’t… particularly care about where your data goes and who’s looking at it, there might be a time when you do, and also, we’re not just individuals who have concerns about ourselves personally, but we’re concerned [about] our friends, our family, our neighbours, the people we work with, and they may be at risk,” says Clement, adding, “If we don’t stand up for some basic rights, then we’re harming people we care about.”

[pullquote]“Even if at the present time you don’t… particularly care about where your data goes and who’s looking at it, there might be a time when you do”[/pullquote]

Outsourcing email services may seem like a step forward for U of T — but is it really the future we want? The university’s agreement with Microsoft allows the company to move your data around the globe, excepting a few embargoed nations, and to have it on hand for the NSA. The physical infrastructure of the Internet puts our data in constant flux, with our metadata readily on hand for the peering eyes of government agencies and more.

While the problem, like the notion of the cloud, may seem intangible, it is an infrastructural issue rooted in the way the technology that runs the Internet is built, and in the struggle of law and policy to catch up to its rapid development. The idea of the cloud obscures the reality that there is no cloud at all — only other computers that, together, can store a lot of information in the same physical space. These are maintained by large corporations that can access your data with relative ease — and they can help other people access it, too.

