The Veillances Part Two: Dataveillance


Introduction

Welcome to part two in our four-part series on ‘the veillances’: what they are, how they can impact your privacy, and what you can do to reduce your exposure to them.

This article is all about Dataveillance, which may be a word you’ve never heard before but is something you are no doubt experiencing.  When people say “Google / Facebook is listening / watching” they are usually referring to situations that have been enabled by Dataveillance.  A popular example is browsing a product on a website and later being served ads for that particular product or service on social media.

Dataveillance

What is it?

Roger Clarke first defined the term in the mid-1980s as “the systematic monitoring of people’s actions or communications through the application of Information Technology.” (1) With the explosion of data-generating technologies since then, it has been refined to “the practice of monitoring and collecting online data as well as metadata” (2).

For a great accompanying read on how much data you can create with innocuous online searches and payments, see our article Challenging the way you imagine Data & Privacy.  This article intentionally follows the first part of this series, on Surveillance, as Dataveillance is the next step forward in the history of the veillances.  Surveillance, by definition, doesn’t consider the data, the codes, the 1s and 0s being transmitted and stored by your online activities and use of digital tools. 

How it works

Dataveillance refers to the collection and use of data resulting from one’s use of social networks, emails, credit card transactions, GPS coordinates and anything else that leaves a digital footprint of its user’s activity (2).  Dataveillance collection is automated, efficient, fast, and focused on aggregated data and information. 

The benefits to society from dataveillance tend to be focused on economic efficiencies, for example:

  • Collecting and verifying data, such as your bank being able to flag potentially fraudulent transactions made on your credit card
  • Assessing security threats and predicting potential terrorist or criminal acts through Predictive Policing
  • Allowing companies to better understand their customers by tracking their online activity, testing which advertisements work, rewarding loyal customers more quickly, and so forth
  • Enabling computers to collect, store, and analyse our personal data so that processes, decisions and calculations can be streamlined and completed much faster (virtually instantaneously) than human-level data administration allows. 

The examples above paint a picture of data speaking for or representing us within public and private institutions, which is at the heart of Dataveillance.  When you apply for a loan with your bank to purchase a car or a house, you rarely have to go into a branch and meet with a lending specialist for an interview-style appointment like you used to.  Your data, collected through dataveillance, is effectively being “interviewed” on your behalf.  On the other side of the table, the bank’s lending specialist is not making any calculations or assertions based on a meeting with you.  The decision to approve or reject your loan application is made by a computer, or more specifically a programmed algorithm based on the bank’s appetite for risk, using a pool of aggregate consumer data.

Impacts to privacy

The data collected and analysed can be a disservice to you.  For example, if you order excessive food delivery or deposit some money into an online betting account for the occasional punt, this can become a red flag in the loan application example from earlier. 

Again, the data is speaking on your behalf. 

This is because the data is aggregated by these various institutions and you are ‘socially sorted’ against everyone else on the database.  This sorting is done by a computer and thus does not account for the nuances and legitimate reasons why the data was created in the first place.  Returning to our example: the excessive spend on food delivery may be because you broke your leg and weren’t mobile, or because your car was stolen (hence the loan application for a new one), and the deposit into the online betting account may have been a one-off for the Super Bowl.  Computers are indifferent to the reasons why certain data is created and therefore do a great job profiling your data but often a poor job profiling you, the person behind the data. 
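To make this concrete, below is a minimal, purely hypothetical sketch of the kind of blunt, rules-based screen a lender’s system might run over aggregated transaction data.  The categories, thresholds and field names are our own assumptions for illustration only and do not represent any real bank’s model.

```python
# Purely hypothetical sketch of an automated loan screen driven by
# aggregated transaction data. Categories, thresholds and field names
# are invented for illustration; no real lender's model is implied.

def screen_application(profile):
    """Return a decision based only on the applicant's data profile."""
    flags = []

    # The algorithm never meets the applicant; it only sees categories.
    if profile["monthly_food_delivery_spend"] > 400:
        flags.append("high discretionary spend")
    if profile["betting_deposits_last_90_days"] > 0:
        flags.append("gambling activity")
    if profile["debt_to_income_ratio"] > 0.45:
        flags.append("high debt-to-income ratio")

    # A blunt rule: two or more flags and the application is knocked back,
    # regardless of the legitimate reasons behind any individual data point.
    decision = "approve" if len(flags) < 2 else "refer / decline"
    return decision, flags

profile = {
    "monthly_food_delivery_spend": 520,    # e.g. immobile after a broken leg
    "betting_deposits_last_90_days": 50,   # e.g. a one-off Super Bowl punt
    "debt_to_income_ratio": 0.30,
}
print(screen_application(profile))  # -> ('refer / decline', [...])
```

The point of the sketch is not the specific rules but that the profile, not the person, is what gets interviewed.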

If the data attached to us is not an accurate representation of us it can be a disservice, sure, but what about our future selves?  Data collected today for an innocent reason could be repurposed and used against us in the future.  The best example is the dataveillance systems used in China and the ruling party’s Social Credit Score program.  It would be unwise to assume that the institutions collecting our personal data now will never use or sell it in the future to advance agendas which fundamentally impinge on the freedoms and opportunities we enjoy today. 

The impact to privacy comes from the sheer volume of collection points now available to service dataveillance.  This is due to our adoption of newly created digital services which create data (often replacing ones that didn’t) and the expansion and ongoing development of existing digital services to create even more data.  As more elements of our lives become digitised, more data is created, and it logically follows that more privacy is lost.

The bank loan example used above would have been impossible only a few decades ago, before online purchasing was available and when cash was king.  Cash spending doesn’t create data, as the purchase cannot be traced to you (unless you marry the purchase up to a rewards program, warranty, online receipt, or similar digital association), and this is one reason cash is seemingly on the way out.  Many governments are putting limits on cash purchases and didn’t hesitate to label cash ‘dirty’ during the COVID-19 pandemic.  Stakeholders in the dataveillance economy, the business community, followed suit, with some rejecting cash payments, though still legal tender, in favour of card payments only.  Other impacts to you regarding dataveillance include:

  • Lack of transparency regarding how companies may use or sell your data; many do not disclose what data is collected and who it is shared with (e.g. intelligence agencies, marketing agencies and contractors)
  • As alluded to earlier, businesses can use data to track your online activity, strongly undermining the concept of a private browsing experience.  Though legislation is catching up (such as the GDPR), many websites are still unable to list the cookies they use or unwilling to explain why they use them. 
  • Companies can sell aggregated data, and those transactions are not made known to the users who generated that data.  This means you don’t know to whom, for how much, or how far your data has gone. 

Before we get to how you can reduce your exposure to dataveillance systems, this part of the article is a good place to briefly highlight some emerging trends which enable, drive or otherwise aid dataveillance systems. 

Surveillance Capitalism

Surveillance Capitalism, as described by the work of Shoshana Zuboff, can be defined as “an economic system which commodifies personal data with the goal of making a profit from the ability to target consumers more precisely” (3).  Based on the definitions we are introducing and using in this four-part series, we’ve distinguished between Surveillance (in part one) and Dataveillance (earlier in this piece), so perhaps a more apt term would be ‘Dataveillance Capitalism’; however, for the purposes of this section we will stick to the seminal title ‘Surveillance Capitalism’. 

As we discussed in our article Challenging the way you imagine Data & Privacy, there is value in your data, a real value, a dollar value.  This makes it of interest to small start-ups looking to fill niche market gaps, to large multinational organisations, right through to bad actors.  This dollar value of your data is the reason so many platforms, apps, games and online services are free to join, play, and otherwise use.  You don’t pay for social media services or the latest trending game such as Pokémon Go, which took the world by storm only a few years ago.  The development, ongoing maintenance, and profit of these is typically paid for by the data your electronic device shares with the developer when you download the app and accept the terms of use.  That is why many ask for access to bizarre parts of your device such as your contacts, image gallery, storage and so forth (as Pokémon Go did).  This data is then analysed and commodified by companies to improve the effectiveness of targeted advertisements, which they sell on to interested businesses. 

A beautiful way Shoshana Zuboff has put this new economic system into perspective is by using a well-known business success story: “surveillance capitalism was pioneered by Google and later Facebook, in much the same way that mass-production and management capitalism were pioneered at Ford and General Motors a century earlier” (4).  Most people are aware this is happening but are less aware of the magnitude, influence and money this industry has.  The Cambridge Analytica scandal is probably the most prominent recent example of Surveillance Capitalism not just at play but showing its strength and influence.

We wanted to briefly discuss Surveillance Capitalism to highlight that there is a powerful and growing business model driving Dataveillance.  There is a growing hunger among tech behemoths for more data, not just in volume but in quality, accuracy, and recency (ideally real time).  Where will all this lead when they have ‘better’ data in five years than they do today?  There are fears regarding behavioural intervention and conditioning, using rewards and punishments to push users towards profitable outcomes.  The ability to change people’s behaviour at scale arguably erodes concepts of freedom, free thinking, autonomy, and agency.  Having higher powers know everything about you, predict your behaviour, and potentially alter or condition it is a scary thought indeed – especially when you don’t have reciprocal knowledge of them. 

That’s it for our brief introduction to Surveillance Capitalism.  For more information about the theory of Surveillance Capitalism, the business models it creates and its power dynamics, or to just dive deeper into it, we’ve included some great links in the Further Reading section at the bottom of this article. 

Quantified Self

The brief introduction to Surveillance Capitalism above highlighted the growing business models which enable Dataveillance.  This brief introduction to the concept of the Quantified Self is designed to highlight our growing willingness to create personal, sometimes very personal, data to share with tech giants, feeding the Surveillance Capitalism economy and ultimately enabling more Dataveillance.

Quantified Self refers to a cultural trend of self-tracking, often through wearable products or software-based apps, for the feedback they provide on metrics of interest or concern.  The key concepts of the Quantified Self are ‘self-knowledge through self-tracking’ and ‘if you can measure it, you can change it’.  Popular examples include wearables such as Fitbit devices and the Apple Watch, and apps such as MyFitnessPal and Strava. 

People involved in the Quantified Self movement want to take more control of their health and modify their behaviours and habits to optimise their quality and length of life.  They want to know how many steps they take in a day, what the quality of their sleep is, or how many calories they’re consuming and burning.  All of this data is tracked and monitored by the products and apps described above, with users in most cases needing to accept some data sharing or selling conditions. 
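To illustrate just how much structured, personal data a single night’s sleep can generate, here is a minimal, hypothetical sketch of the kind of record a self-tracking app might upload.  The field names, values and sharing note are our own assumptions, not any vendor’s actual format or API.

```python
# Hypothetical illustration of the data a single night's sleep can generate
# once captured by a self-tracking app. Field names and values are invented;
# no real vendor's data format or API is implied.
import json

sleep_record = {
    "user_id": "u-102938",                       # links the data back to you
    "date": "2021-06-12",
    "device": "wrist-wearable-v2",
    "location": {"lat": -33.87, "lon": 151.21},  # location is often captured too
    "metrics": {
        "time_asleep_minutes": 412,
        "times_rolled_over": 17,
        "active_sleep_minutes": 96,
        "passive_sleep_minutes": 316,
        "resting_heart_rate_bpm": 54,
    },
}

# Under many terms of use, a payload like this may be shared with or sold to
# "trusted partners" once you tap Accept.
payload = json.dumps(sleep_record)
print(payload)
```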

A detailed analysis regarding the privacy risks of participating in the Quantified Self movement is outside of the scope of this piece and also difficult to do for two core reasons:

  • Differences in legal jurisdictions: some countries have a legal framework protecting this type of data from being disclosed (read: sold) while others don’t.  These are the same legal frameworks that underpin patient-doctor confidentiality, for example.  Where such laws do exist, there are also variances in what is permitted to be done with the data depending on whether it is aggregated (or not), voluntarily given up by the user (or not), or whether unambiguous consent was granted (or not).
  • Each manufacturer or software developer takes a different approach to how they treat the data, set out in their Privacy Policy.  Where legally able to, some will sell aggregated data, some will only share it with trusted partners, and some promise not to, though that promise may be voided if the company is bought out by or merges with another. 

We covered a lot of the more obvious risks in our article about Interactive Insurance Policies, which themselves rely on consumers being on board with the Quantified Self movement.  Have a read of that for some more commentary about the privacy concerns in this space. 

If you need to know how many times you rolled over last night, or how much of your sleep was ‘active’ and how much was ‘passive’, for legitimate health-related reasons – of course you should participate.  However, if you’re not using the data for any tangible benefit, be aware that you are creating more data about yourself.  The more data you create, the more data that can be used to build a picture of you, erode your privacy, and service the Dataveillance systems that are part of our Surveillance Capitalism economy. 

It’s worth pointing out that our focus is privacy, of course, and so our concerns here are formulated from that space alone.  We understand that self-tracking has a lot of merit when it comes to improving physical health, mental health, sleep, exercise performance and other such goals.  Our focus on privacy is not intended to take away from those advantages or to imply privacy concerns are more important, as some people can get life-changing benefits from self-tracking.  At the end of the day, it’s for each individual user to decide upon their priorities when it comes to self-tracking and where privacy sits on that list, not us. 

How to reduce the amount of data you create

This is a more suitable heading for this section, as reducing your exposure to dataveillance is difficult, bordering on impossible, once the data has already been created.  Yes, there are opportunities to contact companies and have them delete the data they have stored on you, and this also tends to be an option when closing and deleting accounts.  However, most people don’t do this and simply stop using accounts or delete an app rather than going through the process of formally closing it down first.  The trouble here is you don’t know how far that data has already gone, or how many times it has been used or sold on.  On request, a company will delete the data it holds about you at the time you make that request, but it is under no obligation to chase up a third party or data broker it has previously sold the data to and compel them to do the same. 

Shoshana Zuboff puts it like this:

“demanding privacy from surveillance capitalists or lobbying for an end to commercial surveillance on the internet is like asking Henry Ford to make each Model T by hand.  It’s like asking a giraffe to shorten its neck or a cow to give up chewing.  Such demands are existential threats that violate the basic mechanisms of the entity’s survival.” (4)

Therefore, it’s best to take a ‘prevention is better than cure’ approach, and the best way to limit your exposure to dataveillance is of course to generate as little data as possible.  The two best ways to do this are:

  1. Reduce the number of apps and online accounts you have – basically anything with a username and password that links back to your identity or your mobile device.  If you’re part of a rewards or loyalty program that you don’t use enough to get any rewards from, close it down.  The same goes for social media and gaming apps – audit your phone and see what you don’t use anymore, because it could still be collecting data about you.  Ensure you go through the process of closing accounts down too; don’t just delete the app.  Many mobile phones and laptops come preloaded with unused apps which may collect data about you even if you’ve never opened them.  Keep only what you need or want.
  2. Reconsider electronic payments – we have been conditioned to move away from cash and pay for everything electronically under the guise of convenience.  Yes, it may seem like a pain in the rear end, but it’s the way things always were; we’ve just forgotten.  At the very least, make the change to pay for sensitive purchases in cash, such as medication or adult store purchases.  Remember, credit card companies will never stand up to the government, and the government can subpoena those records anyway.  See our How To Guide about shopping privately.

Other considerations are listed below, but they focus on data security and privacy and on reducing the links data can make back to you.  They will help regarding your use of Quantified Self products and apps if you want to take precautions there.  However, these considerations won’t shield you from certain dataveillance systems, such as the bank’s profile of you when you apply for a loan, as per our example earlier.

  • Utilise antimalware, ad-blocking, and other systems designed to block data collection or to secure your device against malicious viruses, bugs, keystroke loggers and the like.  See our article Malware 101 for more on this. 
  • Use a reputable, no-logs VPN
  • Ensure your browser/s privacy settings are configured the way you want them
  • Regularly delete your cookies, browsing history and associated search data from your browser/s
  • Where available, use apps in ‘privacy’, ‘incognito’, ‘dark’ or similarly termed modes, and only use apps with a Privacy Policy
  • When signing up to services, only include data that is marked as necessary, or give data that is not accurate, such as a different date of birth.  Avoid giving excessive data unnecessarily. 
  • Avoid sharing location data on social media, including via social share functions within a self-tracking app that connect directly (i.e. signing into your social media through the self-tracking app)
  • The usual tips: ensure you update your device and apps regularly, turn Bluetooth off when not required, use strong passwords, utilise device encryption.

Pre-Conclusion: Pros & Cons / Risks

To lead into our conclusion, we’ve created this table highlighting the good, the bad, and the potentially bad elements of Dataveillance systems. 

Conclusion

A message we want to state explicitly, and the reason we created a pros/cons list, is that dataveillance systems are not inherently bad in every context.  They are a necessary by-product of the interconnected digital world the public so badly craves (especially the Quantified Selfers).  Indeed, one of the latest trends or buzz phrases pushed by every reputable business journal instructs business leaders to ‘create personalised, collaborative, bespoke customer experiences / customer journeys‘ or similar notions.  The general public wants personalisation, and that can only happen if public and private institutions have data with which to create that personalised interaction.  

Rather than making a judgement on dataveillance, the neutrality of this article reflects our focus on you, the reader, making a decision based on your perceived risks.

Our brief word on Surveillance Capitalism highlights that there are bigger, more powerful forces at play driving Dataveillance capability.  The Quantified Self movement highlights how we’re becoming more open to tracking and to generating increasingly sensitive and private data to share.

‘Google / Facebook is listening‘ is the popular saying we’ve all heard when an ad pops up on our social media feed after we browsed the very same family of products hours earlier.  However, this capability is continually developing thanks to what we’ve written about today.  The ads used to be for just sports apparel stores; now the ads know I’m a runner, so they serve running shoes; soon they will serve trail shoes for the terrain I tend to run on and gear based on the climate I live in, and re-serve those ads based on my weekly mileage and the average distance a pair of trail running shoes lasts.  The advertising will get more detailed and have increasing capability to influence behaviour. 
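As a rough illustration of how simple that kind of prediction can be, here is a back-of-the-envelope sketch.  The shoe lifespan, mileage and purchase figures are our own assumptions for illustration, not advertising-industry data.

```python
# Back-of-the-envelope sketch of predictive ad re-serving. The lifespan,
# mileage and purchase figures are illustrative assumptions only.
shoe_lifespan_km = 700     # assumed average life of a pair of trail shoes
weekly_mileage_km = 35     # inferred from the user's tracked runs
weeks_since_purchase = 12  # when the last pair was bought (also tracked)

km_on_current_pair = weekly_mileage_km * weeks_since_purchase
weeks_until_worn_out = (shoe_lifespan_km - km_on_current_pair) / weekly_mileage_km

print(f"Start re-serving trail shoe ads in ~{weeks_until_worn_out:.0f} weeks")
# -> Start re-serving trail shoe ads in ~8 weeks
```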

We hope this article has highlighted how dataveillance can be used both for you and against you, hopefully leading to certain, and likely modest, changes in your behaviour.  Perhaps the next time you visit an adult shop, for example, you will have researched the location anonymously, left your mobile phone at home or in the car, and paid for your private product in cash.  In doing so you know you’ve done a lot to ensure that data revealing your sexual interests hasn’t been created.

Governments need to ensure that regulation holding the business community accountable for data breaches and misuse of consumer data is contemporary and based on modern data collection methods.  However, governments have one thing on their side the average bad actor doesn’t – the law.  Even if you imagine a situation where all data in existence is 100% safe and secure from any type of breach (bad actors, employees, competitors, etc.), it will still rarely be safe from access by government institutions.  As alluded to in this article, we don’t know what the future holds, what political system your country will find itself in, or how your data could be used against you in the future.  It’s not absurd to imagine the Social Credit Score used in China breeding a less offensive but equally effective version in western societies sometime in the future.  It is therefore important to take the ‘prevention is better than cure’ approach and simply minimise how much data about you can be created, and hedge your bets that way.  Dataveillance isn’t going away and can’t easily be fought; our biggest means of action is controlling how much data we enable to be created by and about us. 

Stay tuned for part three in this series on Sousveillance.

References

(1) http://www.rogerclarke.com/DV/#SurvD

(2) https://en.wikipedia.org/wiki/Dataveillance

(3) https://journals.sagepub.com/doi/10.1177/1095796018819461

(4) https://www.faz.net/aktuell/feuilleton/debatten/the-digital-debate/shoshana-zuboff-secrets-of-surveillance-capitalism-14103616.html?printPagedArticle=true

Further Reading

The Age of Surveillance Capitalism is the book by Shoshana Zuboff for more on Surveillance Capitalism.

Wikipedia page on Surveillance Capitalism: https://en.wikipedia.org/wiki/Surveillance_capitalism

Online article by Shoshana Zuboff titled Surveillance Capitalism and the Challenge of Collective Action: https://journals.sagepub.com/doi/10.1177/1095796018819461

This article is written in line with our Terms & Conditions and Disclaimer. As such all content is of a general nature only and is not intended as legal, financial, social or professional advice of any sort. Actions, decisions, investments or changes to device settings or personal behaviour as a result of this content is at the users own risk. Privacy Rightfully makes no guarantees of the accuracy, results or outcomes of the content and does not represent the content to be a full and complete solution to any issue discussed. Privacy Rightfully will not be held liable for any actions taken by a user/s as a result of this content. Please consider your own circumstances, conduct further research, assess all risks and engage professional advice where possible.
