Do You Control Your Data?

Have you noticed how an advertisement for shoes pops up on a website you visit right after you looked at the same pair on an e-commerce platform, or how you get advertisements for coaching classes when you are researching colleges for higher studies? These are examples of the way in which our online behavior is tracked and then used to target us with advertising we could potentially find useful. The web as we know it today relies on the commodification of personal information. This ad-based model generates revenue from users’ online behavior. Most websites use cookies to ‘personalize’ the online experience and deliver more targeted advertising to individuals based on their online activity. The line on Facebook’s home page, “It’s free and always will be”, needs to be examined more carefully in this context.
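To make the tracking concrete, here is a minimal, purely illustrative sketch of how a third-party ad script might set and report a tracking cookie in the browser. The cookie name, fields and the collection endpoint are invented for illustration and do not describe any particular ad network.

```typescript
// Hypothetical sketch of third-party cookie tracking (illustrative names only).

// Reuse an existing identifier, or mint a long-lived one on first visit.
function getOrCreateTrackingId(): string {
  const match = document.cookie.match(/(?:^|; )ad_uid=([^;]+)/);
  if (match) return match[1];

  const id = crypto.randomUUID(); // random pseudonymous ID (secure contexts)
  const oneYear = 60 * 60 * 24 * 365;
  // SameSite=None; Secure lets the cookie travel in cross-site contexts,
  // which is how the same ad network can recognise the visitor on the
  // shoe shop and again on the news site.
  document.cookie = `ad_uid=${id}; Max-Age=${oneYear}; SameSite=None; Secure`;
  return id;
}

// On every page that embeds the ad script, report what was viewed.
function reportPageView(): void {
  const payload = {
    uid: getOrCreateTrackingId(),
    url: location.href,          // e.g. the product page for the shoes
    referrer: document.referrer,
    ts: Date.now(),
  };
  // 'ads.example-network.com' is a made-up endpoint for illustration.
  navigator.sendBeacon("https://ads.example-network.com/collect",
                       JSON.stringify(payload));
}

reportPageView();
```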

Social networks rely on user-generated content, produced by users who are both consumers and producers of content. The digital labor of these ‘prosumers’ generates data, which is collated and analyzed for patterns through the use of algorithms. For example, Facebook’s algorithm determines what you see on your news feed by carefully combing through your interactions with your friends, the kind of information you look for, your indicated interests, and the pages you follow. Google, too, has professed ignorance about the minute workings of its search algorithms, despite the fact that it is, in effect, the largest index of human knowledge.
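Facebook does not publish its ranking algorithm, but the general idea of scoring posts by a viewer’s past interactions, interests and recency can be shown with a toy heuristic. Everything below — the fields, the weights, the formula — is an invented sketch, not the platform’s actual method.

```typescript
// Toy feed-ranking heuristic (not Facebook's real algorithm).
// Scores each candidate post by affinity with its author, interest match,
// and freshness, then sorts the feed by that score.

interface Post {
  authorId: string;
  topics: string[];
  postedAt: number;        // Unix timestamp, ms
}

interface ViewerProfile {
  interactionCounts: Map<string, number>;  // authorId -> past likes/comments
  interests: Set<string>;                  // pages followed, stated interests
}

function scorePost(post: Post, viewer: ViewerProfile, now: number): number {
  const affinity = viewer.interactionCounts.get(post.authorId) ?? 0;
  const interestMatches = post.topics.filter(t => viewer.interests.has(t)).length;
  const ageHours = (now - post.postedAt) / 3_600_000;
  const recency = 1 / (1 + ageHours);      // newer posts score higher

  // Weights are arbitrary here; real systems learn them from engagement data.
  return 2.0 * affinity + 1.5 * interestMatches + 3.0 * recency;
}

function rankFeed(posts: Post[], viewer: ViewerProfile): Post[] {
  const now = Date.now();
  return [...posts].sort((a, b) => scorePost(b, viewer, now) - scorePost(a, viewer, now));
}
```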

Data mining of social networks provides information on millions of users and their online activities, on what appeals to them and what can garner the most ‘likes’. This data, collected in real time, can be used to build user profiles and market products to users in more targeted and detailed ways. Thus, sponsored pages and tweets also find their way into users’ social media feeds, depending on their interests.

With masses of data being produced every day, and newer, more sophisticated ways of analyzing and aggregating it, data privacy has become a key concern. While more effective data protection is necessary, there must also be more transparency about how data is collected and used. For example, the Sydney Health and Society Group’s analysis of big data and healthcare looks at online platforms like Smart Patients, a community of patients. These companies encourage patients to write about their experiences of illness. This information is not just useful for other patients; it also has commercial value, because the data (which is controlled by the platforms) can be harvested and sold to clients.

TMI: How Oversharing Compromises Your Privacy

Where social media has allowed people to assert more visibility and exercise more expression, an increasing culture of oversharing has also made privacy a blind spot for most online users. Data is often unthinkingly bartered for free services on the Internet. An Intel Security survey found that 63 per cent of millennials would willingly sell their personal data for money, which goes to show how willing people are to trade their information for services online. Did you also know that Facebook’s app can record audio and video from your phone at any time without your consent? It is important to know what we are signing up for on the Internet, and to remain fully informed of how our data is being used. Without informed consent or knowledge of terms of use, we are effectively giving big platforms the power to access, monitor and monetize our online behavior. To be better informed, you can visit Terms of Service; Didn’t Read, a user rights initiative that breaks down the terms of service of various online platforms, letting people know just what these companies do with their data.

The Cost of Cookies


A performance artist in New York, Risa Puno, conducted a social experiment in which she distributed cookies to people in exchange for personal information. The experiment, titled ‘Please Enable Cookies’, was meant to illustrate how easy it is to get people to divulge personal data in exchange for something ‘free’. Puno refused to tell them what she would do with the information and referred them instead to her ‘terms of service’, which was in fine print and gave her the right to display and share the data. Almost 400 people willingly shared personal information, including social security digits, addresses and zip codes.


Opting Out of Cookies


While regulation around data ownership and rights is still emerging, some initiatives flowing out of a larger privacy policy movement have been made. The European Union has passed a directive that allows individuals to refuse the use of cookies, due to concerns over digital tracking. Many websites now notify users of the use of cookies, which users may accept or decline. While cookies may be used to enhance the web experience, since they remember users’ personal preferences and login information, they can also be used to construct digital profiles of users.
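How a site might honour that accept-or-decline choice can be sketched as a simple consent gate: non-essential trackers only load after explicit consent, and declining expires them. The cookie names, tracker URL and helper functions below are hypothetical, shown only to illustrate the mechanism.

```typescript
// Hypothetical consent gate for non-essential cookies (illustrative names only).

const CONSENT_COOKIE = "cookie_consent";

function hasConsent(): boolean {
  return document.cookie.includes(`${CONSENT_COOKIE}=accepted`);
}

function recordChoice(accepted: boolean): void {
  const value = accepted ? "accepted" : "rejected";
  document.cookie = `${CONSENT_COOKIE}=${value}; Max-Age=${60 * 60 * 24 * 180}; Path=/`;
  if (!accepted) {
    // Best-effort cleanup: expire any non-essential cookies already set.
    for (const name of ["ad_uid", "analytics_id"]) {   // example cookie names
      document.cookie = `${name}=; Max-Age=0; Path=/`;
    }
  }
}

// Only inject the (made-up) third-party tracker when consent exists.
function loadTrackersIfAllowed(): void {
  if (!hasConsent()) return;
  const s = document.createElement("script");
  s.src = "https://ads.example-network.com/track.js";  // illustrative URL
  document.head.appendChild(s);
}

loadTrackersIfAllowed();
```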

The Right to be Forgotten: Erasing the Digital Footprint

The permanent nature of the Internet allows information to live on in infamy, leaving a digital trail that can follow an individual for the remainder of their life. Oftentimes, individuals have data shared about them without their consent or knowledge (as in the case of children whose digital footprints are created by parents or family members). The right to be forgotten has emerged as a response to this increasing trend of hyper online visibility. It attempts to negotiate how individuals can exercise control, and have a voice in how, and to what extent, information about their past lives (in cases where such information might have an impact on their current lives) should be easily available on the web.

The debate around the “Right to be Forgotten” has its origins in the EU data protection drive and crystallized out of discussions emerging from a court case in Spain, which Google later appealed at the European Court of Justice (ECJ). Existing provisions in the EU Data Protection Directive, and Article 8 of the European Convention on Human Rights, which enshrines the right to respect for private and family life, have been extended to make the case for a “right to be forgotten.”

The right to be forgotten, as defined by the EU, asserts that “Individuals have the right – under certain conditions – to ask search engines to remove links with personal information about them.” This allows individuals to request that their data be deleted when it is no longer required or necessary, or when it is incorrect or irrelevant. Currently this is decided on a case-by-case basis. An example might be an individual who has withdrawn from social networking sites and would like their data to be removed from those systems. The right to be forgotten does not actually delete information from the web. It merely allows links to existing information (such as on media and content-aggregating websites) to be made unavailable on search engines such as Google, Bing and Yahoo, so that the information is not easily searchable and accessible. The right to be forgotten also extends to children and minors, giving them provisions to exercise control over data they may have shared without fully realizing the consequences.

The right to be forgotten, as currently practiced in the European Union, has selective application in cases where the information is inaccurate, insufficient or irrelevant. It is not absolute: it must be balanced against other fundamental rights, such as the freedom of expression and of information, with which it may come into conflict. Privacy safeguards with regard to personal data are constantly being negotiated and threatened by the overreach of corporate and state surveillance. With this in mind, the right to be forgotten is an important move towards allowing users to exercise greater voice and control over their personal data and the manner in which it is used by giant tech corporations and states. However, opponents of the right have raised pertinent and valid questions about its application, its potential misuse and the possible threats it poses to freedom of expression and the free and open nature of the Internet.

Some critics, like cyberlaw professor Jonathan Zittrain, argue that the current data laws of Europe, and their application towards a right to be forgotten, may have serious implications for censorship and free speech on the web. The directive can be used to carve out holes in search engines, rendering important or relevant information inaccessible, or to curb dissent. It could also set a dangerous precedent that makes censorship easier to carry out on spurious grounds. By allowing users to reach in and remove information they wish to conceal, the veracity of the Internet risks becoming determined in purely individualistic terms.

Currently there are no right to be forgotten provisions in India. A ‘right to be forgotten’ claim can be made using the broader Information Technology Act and rules enacted by non-legislative bodies pursuant to various sections of the Act, such as Rule 3(2) of the Intermediary Guidelines Rules, 2011, under Section 79 of the IT Act. In one instance, an individual used the Google ruling as a basis to approach the news website Medianama to take down links pertaining to a story. Unwilling to set a dangerous precedent that would enable future specious requests without legal backing, the website declined the request, while choosing not to disclose the identity of the person making it.