Dark Data: Retaining identity in the smart world

To what extent can privacy laws, codes or terms of use be developed to protect our online persona from misuse by social media platforms?

By Olivia Higgins

Privacy is one of the defining issues of our time. Both Facebook's data collection and China's social credit system use information about us to define our online persona. Traditional notions of privacy as a personal right, allowing us to control what others know or say about us, have not been legally enforceable in Australia, whereas the use of data held by governments and large companies in Australia is, to some extent, regulated under the Privacy Act.

In Jathan Sadowski's book Too Smart (2019), he describes our data-driven technologies as being "too smart" for our own good and "taking over the world." He writes, "Nothing is safe from smartification… Who benefits from smart technology?" (Sadowski, 2019). The prominent issue this raises is how privacy and personal identity can be protected once accessibility and connectivity are granted. Privacy laws, codes and terms of use are only as effective as the speed with which they can be revised and implemented. Given the fast-changing, unpredictable nature of tech conglomerates such as Facebook and Google, frameworks may only be highly beneficial once there is a strategised, more stringent way to manage, regulate and possibly deter potential threats to data privacy.

This article will examine the extent to which Australian privacy laws should be amended, improved and used in conjunction with other laws and changes in technology, particularly given the rise of smart city structures and changing attitudes towards personal privacy. Overall, the paper examines the potential threats rising technology poses to our identities and how laws can be developed to better protect them.

Overview of Current Law Climate

There are key issues surrounding the Privacy Act alone and its pending updates. Triggers such as the 2018 Cambridge Analytica scandal were arguably the impetus for a host of other problems that cross over into the political and economic spheres. Revisions to these existing solutions are arguably only as effective as the speed with which they are enacted.

Current privacy laws need to be constructed around existing trends and malfunctions, especially those occurring within technological conglomerates, such as Google's tracking of personal locations even after location settings were switched off (ACCC, 2019). Contractual obligations and business demands may also affect the way Australian government bodies and policymakers approach upcoming issues. Future macro trends, such as the rise of smart cities or the smart citizen cyborg, as seen in the Chinese social credit system and in facial recognition technologies being implemented locally, are best tackled with a mix of laws and regulations that target the core power points controlling the data.

Overall, as these business models change over time, regulators need to allow space for adaptability and ongoing monitoring of the industry. Ultimately, the focus needs to "shift to the media ecology, the new points of control, and assessment of whether the user experience is inconsistent with the public interest" in terms of data protection (Mansell, 2014). The online persona can be classified as non-existent or fading, as it has been reduced to data points of tastes and preferences that feed the algorithms of a business organism. The purest form of the online persona can therefore be defined as the facts and private information that require protection from the ecosystem of data monetisation.

Privacy Protection Laws

Defining Privacy

The definition of privacy itself may be difficult for the public to grasp, as what constitutes privacy, particularly personal privacy, cannot be separated from the dependency most individuals have on these technologies and online networks. In Section III of Facebook's Privacy Policy, the protections do not yet extend to "third-party service providers that assist in providing the Service or part of the Service" (Facebook, 2019). Even as definitions of the term are updated to cover different types of privacy, blind spots may remain in solutions intended to enforce privacy protection laws, possibly limiting their power. For the purpose of this essay, we will refer to the Australian inquiry on privacy, Section 1.31 of the ALRC Report 108, for definitions of information privacy, territorial privacy and privacy of communications. The VLRC's Workplace Privacy: Issues Paper states that "privacy can be expressed as a right, and that this right to privacy can then form the basis for determining what are legitimate interests in privacy."

Overall, as highlighted repeatedly in the report, in the "establishment of rules governing the collection and handling of personal data the issues to be covered in this Inquiry do not fall neatly into one concept." This is further complicated as cities and citizens, as mentioned above, transition into the global shift of Industry 4.0, where privacy needs to be consistently redefined and rectified.

Australian Privacy Act and Rights

In Section 1.58 of the ALRC Report 108, a primary concern highlighted by the Community Services Ministers' Advisory Council was the "practical importance of the recognition of competing interests." More importantly, the Privacy Act recognises that although privacy is an important individual right, it arguably does not stand alone: other rights, extending to health, safety and freedom, co-exist with it. For example, face detection technologies used by governments may breach this recognised right, yet under competing interests their use is argued to serve collective security, even when the technology is deployed in a way that does not alert the individual. As the report states, the "exercise of rights on behalf of one person can have negative consequences for another person," creating competing interests that highlight not only how complex the privacy issue is, but also how systemic it is.

A further possible complication to the successful, sustained implementation of this Act is that the research and development (R&D) activities of the tech conglomerates are rarely exposed or known. The uncertain status of emerging technologies such as face detection will only make it harder for this Act alone to effectively uphold data protection.

Existing Legal Framework

The ALRC Report 108 on Australian Privacy Law and Practice is the "culmination of a 28-month inquiry" into the Privacy Act 1988 (Cth), with the primary aim of providing a more effective framework for the protection of privacy in Australia. However, it is worth examining Australia's changes in relation to other key policymakers in the global sphere, such as the European Union (EU), which seeks to "create a common framework for citizens of all countries to use to manage their data" (Ismail, 2018). A recurring issue is that privacy laws will only become harder to regulate once passed, given the current climate. There may be stringent laws in place, yet much information remains unknown and there is no tangible consensus on who has the right to obtain, hold or monetise information about others (Transparency Reporting Index, 2019).

According to Kate Crawford at Microsoft, existing legal frameworks may be limited in power by the "fascist" qualities of AI, surveillance and predictive policing systems. She notes that the use of facial features, as in the China case study, is an attempt to "justify the unjustifiable." The existing Australian framework on data protection is a stepping stone towards more constructive by-laws that may hold more tangible power in pre-emptively deterring, rather than simply responding to, future blow-ups like the Cambridge Analytica scandal.

Australia and the European Union (EU)

To further assess how the online persona should be protected, it is essential to examine both the Australian and EU case studies for cross-comparison. After the EU tightened privacy laws with the General Data Protection Regulation (GDPR), a global ripple effect of changes followed. This is a key indicator of the rising action being taken across the globe to retain power over the tech conglomerates. The adoption of GDPR values into the Australian data privacy regulation framework shows how different countries are improving the ability of individual data subjects to control how their personal information is processed (AccessNow, 2018).

Since mid-2018 there have been drastic changes to the data privacy sphere, particularly in the U.S. and in Australia itself, which may adjust these frameworks to best suit contractual obligations or satisfy business requirements. Overall, as seen in the GDPR-Australia case study, the increasing similarity of regulations is arguably beneficial, as standardisation brings us closer to a common, global framework for citizens of all countries to manage their data.

From an economic perspective, attracting investment into Australia under a more rigid framework means that companies are made more aware of major regional data privacy legislation, producing longer-term benefits by deterring future privacy problems. Just as the GDPR prompted European companies to take data restrictions more seriously, companies and individuals who aim to protect the online persona should become more aware of the Australian data environment and recognise that if they do business or employ someone in Australia, the regulations will apply to them (Ismail, 2018).

Urgent Issues

A primary issue lies in the rate at which laws are passed and updated to suit the fast-paced climate. Arguably, once an assessment is completed, reforms should take effect more immediately in order to prove effective, as seen in the data breach notification requirements of the Australian Notifiable Data Breach Scheme (Ismail, 2018). As the Office of the Australian Information Commissioner (OAIC) states, they "expect that once an entity becomes aware of an eligible data breach, it will provide a statement to the Commissioner promptly, unless there are circumstances that reasonably hinder the entity's ability to do so" (OAIC, 2019). Whilst keeping to the more restrictive GDPR guidelines for notifying the relevant data protection authority may be regarded as a way to assert authority, it carries substantial administrative costs and business downsides. A more sustainable solution would be to involve "middlemen" such as think tanks or advocacy groups to monitor and bridge the gap between government and business.

The aforementioned issue is a recurring one. Although it is a start that Australian frameworks are constantly being adapted towards a more global data protection framework, it is only the beginning. More actors, such as research bodies, need to be involved to tackle key power points, as Mansell (2014) highlighted; since the global climate is changing rapidly with the rise of these phenomena, the rate of action needs to start matching the rate of change.

The Rise of the Smart City

Algorithmic governance is a rising trend across smart city projects in China, Toronto and Singapore. This trend brings its own set of unique problems, which require specific, strategic solutions that target them at the core. These key case studies, coupled with existing solutions, can be considered a double-edged sword. Datafication and stratification efforts in both cityscapes and suburban areas by government bodies are a temporary measure that may more strategically regulate and control potential problems such as the misuse of individual data, but they do not solve the problem of maintaining a safe online persona. Moreover, it would be unfair to discount the potential for misuse of data by governments themselves. To assess the role of governments in data protection strategies, it is worth examining their ideological structures, their collaborators and the relationships they have with the surrounding technological conglomerates.

Privacy and the Cyborg Citizen

Technological proliferation is occurring at an incredible rate. The term "cyborg citizen" was coined to represent human interests in hybrid systems that combine human, computer and robot interaction in a comprehensive system. The ecosystem on which this term is predicated is one that is consistently challenged: the online and offline self combined. Such an ecosystem, however, is only sustained by the data it collects. As individuals are seamlessly connected to social platforms and ride-hailing apps as part of everyday life, the cyborg citizen analogy alludes to how over-reliance on these data conglomerates may render social media users powerless (Sadowski, 2019). Against this "powerlessness", the only remaining point of control is to limit the power itself, and doing so requires tackling it with combined international powers and laws.

Implications and Analysis

Retaining Identity in the New Climate

A formal assessment of these existing strategies makes clear the need for a more resounding solution that allows users to navigate this space safely. There is a clear lack of precise formats; existing proposals are regurgitated in different ways that only make the solutions more complex. No clear, sustainable solutions exist at present. The closest would be a more collaborative approach that benefits all stakeholders, the several publics and the surrounding tech companies.

There is arguably still a lack of laws to protect and govern the current climate. The co-shaping of urban identities to fit strategies for behaviour is one way to maintain the adaptability of laws. Yet the rate at which laws are implemented and changed according to society's standards or existing problems may not always be the right fit. There are currently no rules governing these societal standards, leaving the law seemingly behind the situation on the ground. Where there are rules, there are restrictions; yet tangible restrictions may not be sustainable if they are not maintained or regulated by a middleman.

Reshaping Existing Laws

The rate of change or implementation might not necessarily match the rate of effect. Drawing parallels from the European case study of the General Data Protection Regulation (GDPR) and associated advocacy groups such as AccessNow, there may be a ripple effect of tangible action implemented in Australia, such as the Notifiable Data Breach Scheme. However, there is neither a clear-cut solution nor a sustainable one, leaving policy in an uncertain state that neither resolves the issue nor offers lasting reassurance.

Although there are remarkable efforts to help mitigate the situation, an undeniable question remains: how are these new legal frameworks designed to ease such complex problems? How are these laws constructed, and with which stakeholders in mind? A primary argument targeting the root of the problem holds that these laws serve more to appease tech conglomerates than to deter privacy issues that might happen again. More importantly, there needs to be a more rigid assessment of the construction and implementation of laws that target power points rather than just the overarching issue, as one-off fines do.

Trends and Predictions

A case from 29 October 2019 illustrates a growing trend of reporting tech conglomerates for breaches of conduct. The Australian Competition and Consumer Commission (ACCC) took action against Google for using the location of users without consent, even when location settings were switched off. Google collected, kept and used "highly sensitive and valuable personal information about consumers' location without them making an informed choice" (ACCC, 2019). The calling out of such acts by commissions and think tanks is a good indicator that the laws in place are being maintained and enforced firmly.

Diagram 1. AccessNow Transparency Reporting Index 2019

An opposing trend appears in the Transparency Reporting Index 2019: transparency reporting has stagnated, as the rate at which the number of companies publishing reports grows has declined persistently since 2013 (Diagram 1). Transparency reports are a practice pushed forward by advocacy groups such as AccessNow, especially after Google released its first report in 2010. These full disclosures are aimed at maintaining trust between all actors involved in data protection.

Discussion

Beyond the global ripple effect, the most sustainable and effective way to combat abuse of individual online personae is to eventually reach a global, sustained framework that targets the social network at its core. Crawford (2017) argues that we first need to make current AI systems more transparent and accountable, as "the ocean of data is so big; we have to map their complex subterranean and unintended effects." Foundations like AI Now, a research community focused on the social impacts of artificial intelligence and on monitoring abuse, are an example of tackling current algorithmic dysfunctions with tangible action. More importantly, current discussions concern how the legal framework needs to change to ensure that the algorithmic behaviour driving the formation of the online persona in the first place is as ethical as possible and free from unseen biases.

Conclusion

Overall, tackling the Australian problem makes sense from a general viewpoint and could be a good starting point. No single rapid change will, by itself, ensure a more effective framework. What matters is that active efforts are underway to constantly adapt, adopt and revise the laws protecting the online persona.

References

Sadowski, Jathan. (2019). Too Smart: How Digital Capitalism is Extracting Data, Controlling Our Lives, and Taking Over the World. MIT Press. ISBN: 9780262538589.

Grothaus, Michael. (2018). How our data got hacked, scandalized, and abused in 2018. Published in Fast Company. Retrieved from: https://www.fastcompany.com/90272858/how-our-data-got-hacked-scandalized-and-abused-in-2018.

Australian Privacy Law and Practice Report. (2019). Retrieved from: https://www.alrc.gov.au/wp-content/uploads/2019/08/108_vol1.pdf

Australian Competition & Consumer Commission. (2019). Google allegedly misled consumers on collection and use of location data. Retrieved from: https://www.accc.gov.au/media-release/google-allegedly-misled-consumers-on-collection-and-use-of-location-data

Ismail, Nick. (2018). GDPR vs Australian data privacy regulations: 5 key differences. Published in Information Age. Retrieved from: https://www.information-age.com/gdpr-aus-data-privacy-regulations-123471003/

Workplace Privacy Issues Paper. (2003). Victorian Law Reform Commission. Retrieved from: https://www.lawreform.vic.gov.au/sites/default/files/IssuesPaperfinal.pdf

Poell, Thomas & Van Dijck, José. (2016). Constructing Public Space: Global Perspectives on Social Media and Popular Contestation. International Journal of Communication. University of Amsterdam, The Netherlands. Retrieved from: https://ijoc.org/index.php/ijoc/article/viewFile/4984/1535

Kruse, Lisa M., Norris, Dawn R. & Flinchum, Jonathan R. (2018). Social Media as a Public Sphere? Politics on Social Media. The Sociological Quarterly, Vol. 59, pp. 62-84. https://doi.org/10.1080/00380253.2017.1383143

Rossi, Ben. (2017). GDPR compliance – the real implications for businesses. Published in Information Age. Retrieved from: https://www.information-age.com/gdpr-compliance-real-implications-businesses-123466772/

Transparency Reporting Index. (2019). AccessNow. Retrieved from:  https://www.accessnow.org/transparency-reporting-index/