What Can Digital Out of Home Advertising Learn from the Sidewalk Labs Controversy? Part 1


This is a two-part article that explores the role of data privacy and citizen consultation in the use of technology in public space. Part 1 explores the controversies surrounding Sidewalk Labs, and Part 2 delves into the lessons for Digital Out of Home Advertising.

Sidewalk Labs recently announced a set of icons as an initial prototype of a visual language for signage in the public realm, alerting people to the presence of digital technology that may be collecting data about them. The black hexagons express the purpose of the technology; the blue and yellow hexagons show how identifiable information is used; and the white hexagons display the entity responsible for the technology. A further white hexagon with a QR code and URL enables people to learn more.
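Read literally, the icons amount to a small metadata schema for each installation. The sketch below is purely illustrative, not Sidewalk Labs' actual data model; all field names and values are invented to mirror the hexagon categories described above.

```python
# Illustrative only: a hypothetical data model mirroring the hexagon categories.
from dataclasses import dataclass

@dataclass
class TechnologySign:
    purpose: str             # black hexagon: what the technology is for
    data_handling: str       # blue/yellow hexagons: how identifiable info is used
    accountable_entity: str  # white hexagon: who is responsible for the technology
    learn_more_url: str      # white hexagon with QR code: where to learn more

# Example sign for a hypothetical pedestrian-counting camera
sign = TechnologySign(
    purpose="Pedestrian counting",
    data_handling="De-identified on device; no identifiable data stored",
    accountable_entity="Example District Authority",
    learn_more_url="https://example.org/signage/pedestrian-counting",
)
print(sign)
```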

Sidewalk Labs has been hard at work on Quayside, a futuristic test bed located just southeast of downtown Toronto that it bills as the world’s first neighbourhood built from the Internet up. Sidewalk Labs is owned by Google’s parent company, Alphabet.

Sidewalk Labs offers technological solutions to neighbourhood challenges such as sustainability

In a blog post, Jacqueline Lu, Associate Director, Public Realm at Sidewalk Labs, detailed the data privacy problems raised by technology in public spaces:

“There’s little transparency about what data these technologies are collecting, by whom, and for what purposes. Signage that does appear in the public realm often contains either small snippets, which give no indication of how to follow up or ask more questions, or multiple paragraphs of dense text.”

She went on to explain the choice of the hexagons, which drew on user research with 100 participants from several cities:

“From our user research, we knew that there were some core concepts that people wanted to know while they were in the public realm: specifically, the purpose of digital technology as well as its accountable entity. People also wanted to have an easy way to follow-up and learn more and know if the technology could ‘see’ or identify them.”

The biggest problem arises when seemingly disparate pools of technically anonymized data can be combined to identify people with a high degree of accuracy. A recent study by MIT researchers on the growing practice of compiling massive, anonymized datasets about people’s movement patterns revealed how this can happen. It was the first-ever analysis of so-called user “matchability” in two large-scale datasets from Singapore, one from a mobile network operator and one from a local transportation system. A statistical model tracked the location stamps of users in both datasets and produced a probability that data points in the two sets came from the same person.

In experiments, the researchers found the model could match around 17 percent of individuals in one week’s worth of data and more than 55 percent of individuals after one month of collected data.
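To make the idea concrete, here is a minimal illustrative sketch, not the MIT researchers' actual model, of how two pseudonymized location-stamp datasets can be linked simply by scoring the overlap between users' place-and-time records. All identifiers and records below are invented.

```python
# Toy illustration of cross-dataset "matchability": score every pair of
# pseudonymous users (one from each dataset) by how much their
# (place, hour) stamps overlap, and treat high-scoring pairs as likely matches.
from collections import defaultdict
from itertools import product

# Hypothetical records: (pseudonym, place, hour_of_week)
telecom_stamps = [
    ("T1", "stationA", 8), ("T1", "stationB", 9), ("T1", "stationA", 18),
    ("T2", "stationC", 8), ("T2", "stationC", 17),
]
transit_stamps = [
    ("X9", "stationA", 8), ("X9", "stationB", 9), ("X9", "stationA", 18),
    ("X7", "stationC", 8),
]

def stamp_sets(records):
    """Group each pseudonymous user's (place, hour) stamps into a set."""
    sets = defaultdict(set)
    for user, place, hour in records:
        sets[user].add((place, hour))
    return sets

def match_scores(a, b):
    """Score every cross-dataset pair by Jaccard overlap of their stamps,
    a crude stand-in for the study's matching probability."""
    return {
        (ua, ub): len(sa & sb) / len(sa | sb)
        for (ua, sa), (ub, sb) in product(a.items(), b.items())
    }

telecom = stamp_sets(telecom_stamps)
transit = stamp_sets(transit_stamps)
for pair, score in sorted(match_scores(telecom, transit).items(),
                          key=lambda kv: -kv[1]):
    print(pair, round(score, 2))
```

Even this crude overlap score links the invented pseudonyms “T1” and “X9”, and, as the study found, longer observation windows accumulate more distinctive stamps per user, making such links far more reliable.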

“In publishing the results — and, in particular, the consequences of de-anonymizing data — we felt a bit like ‘white hat’ or ‘ethical’ hackers,” co-author Carlo Ratti, a professor of the practice in MIT’s Department of Urban Studies and Planning and director of MIT’s Senseable City Lab, told MIT News. “We felt that it was important to warn people about these new possibilities [of data merging] and [to consider] how we might regulate it.”

Just in Time or Too Little Too Late?

The same month, the Canadian Civil Liberties Association (CCLA) announced that it is suing three levels of government and Waterfront Toronto, a publicly funded organization, over the planned smart neighbourhood in Quayside.

At a press conference, Michael Bryant, CCLA executive director and general counsel, said that Waterfront Toronto and the three levels of government had “sold out” citizens’ constitutional rights to freedom from surveillance “to the global surveillance mammoth of behavioural data collection, Google”. He said:

“The Google-Waterfront Toronto deal is invalid and needs to be reset. These agreements are contrary to administrative and constitutional law, and set a terrible precedent for the rest of this country. Unlawful surveillance is wrong whether done by data profiteers or the state. We all deserve better from our federal, provincial and municipal governments.”

Data Privacy Has Plagued Sidewalk Labs Since Its Inception

Issues of citizen data, surveillance and privacy have been a constant problem for Sidewalk Labs. In October last year, Dr Ann Cavoukian, a privacy expert and consultant for Sidewalk Labs, resigned from her role: after initially being told that the data collected would be wiped and unidentifiable, she learned during a meeting that third parties could access identifiable information gathered in the district. Saadia Muzaffar, founder of TechGirls Canada, also stepped down from Sidewalk Labs’ digital strategy advisory panel, saying that the company was not adequately addressing privacy issues she and others had raised.

Knowledge is Not Always Power

While Sidewalk Labs’ hexagons have the potential to raise awareness of data collected in public spaces, they do not enable citizens to opt in or out of data collection in any real detail, nor do they offer an alternative to those opposed to having their data collected – are such people effectively barred from entering the neighbourhood? They also offer no insight into whether data can be sold to third parties and who benefits financially – will funds raised from any sales be funnelled back into the community or used to benefit citizens? Only time will tell, but everything suggests the process is not going to be easy.

Read Part 2 of this article to see how the Digital Out of Home advertising sector can learn from the challenges of Sidewalk Labs.
