Privacy and Personal Data
Every time you go online you leave a trail. Companies collect it, analyse it, and make decisions about you from it. This lesson covers what that data is, how it is gathered, and what the law says about who can use it and why.
You search online for a new pair of trainers. You don't buy them. Over the next two weeks, the same trainers appear in adverts on Instagram, YouTube, a news website and a weather app. You never signed up to any of those services. How do they all know?
Privacy and data questions often use scenario contexts. You need to identify what type of data is involved, how it was collected (active vs passive footprint), and which GDPR principle may have been violated. These are the three layers examiners test most frequently.
Digital footprints
A digital footprint is the trail of data you leave whenever you use digital technology. Every search, every click, every post and every purchase adds to it. There are two types: an active footprint is data you share deliberately (posts, comments, photos, form submissions), while a passive footprint is data collected without your direct involvement (browsing history, location data, IP addresses, tracking cookies).
Cookies and tracking technologies
A cookie is a small text file stored on your device by a website. Cookies were invented to solve a genuine problem: HTTP is stateless, meaning each page request has no memory of previous ones. Cookies give websites a way to "remember" you.
But cookies can also be used for tracking, and this is where ethical concerns arise.
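The cookie round-trip described above can be sketched with Python's standard-library http.cookies module. This is an illustrative sketch, not real server code: the cookie name `session_id` and its value are invented for the example.

```python
# Minimal sketch of a cookie round-trip, using Python's standard
# http.cookies module. The session_id name/value are illustrative.
from http.cookies import SimpleCookie

# 1. HTTP is stateless, so the server "remembers" a visitor by
#    attaching a Set-Cookie header to its response.
server_cookie = SimpleCookie()
server_cookie["session_id"] = "abc123"         # hypothetical session ID
server_cookie["session_id"]["max-age"] = 3600  # expire after one hour
print("Set-Cookie:", server_cookie["session_id"].OutputString())

# 2. On every later request, the browser sends the value back in a
#    Cookie header; the server parses it to recognise the visitor.
incoming = SimpleCookie()
incoming.load("session_id=abc123")
print("Recognised session:", incoming["session_id"].value)
```

The same mechanism that lets a shop remember your basket also lets an advertising network set a third-party cookie and recognise you across every site that embeds its adverts, which is the tracking concern raised above.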
GDPR - key principles and rights
The General Data Protection Regulation (GDPR), implemented into UK law by the Data Protection Act 2018, sets out how organisations must handle personal data. There are six core principles: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; and integrity and confidentiality (security).
Under GDPR, individuals have the right to access their data (via a Subject Access Request), the right to erasure (the "right to be forgotten"), the right to data portability, and the right to object to processing.
Case studies - real and fictional
Cambridge Analytica harvested the personal data of approximately 87 million Facebook users without their explicit consent. A researcher created a personality quiz app that was permitted to collect data about users who installed it. Crucially, it also collected data about all of those users' friends - people who had never used the app and had not given consent.
This data was then used to build detailed psychological profiles of voters, which were sold to political campaigns. Cambridge Analytica reportedly used these profiles to target voters with personalised political advertising in the 2016 US Presidential election and the Brexit referendum.
Facebook was fined $5 billion by the US Federal Trade Commission. Cambridge Analytica itself went into administration in 2018. The case triggered widespread debate about data ownership, consent and the power of social media companies.
In 2012, the US retailer Target used purchasing data to build a predictive model that could identify whether a customer was pregnant - and predict their due date - from their shopping habits alone. The model looked for patterns like buying unscented lotion, vitamin supplements, cotton wool and certain foods in unusual combinations.
Target then sent personalised maternity advertising to customers the algorithm identified as pregnant. In one widely reported case, a father received baby product coupons addressed to his teenage daughter and complained to the store - only to later discover she was indeed pregnant. Target had known before he did.
No personal data was shared externally, and no law was broken. But the case raised profound questions about the ethical limits of data analytics, informed consent and whether companies should be able to make sensitive inferences from seemingly innocuous data.
In May 2023, Ireland's Data Protection Commission fined Meta (the parent company of Facebook, Instagram and WhatsApp) €1.2 billion - the largest GDPR fine ever issued. The case centred on Meta's practice of transferring the personal data of European Union users to servers in the United States, where EU data protection standards do not apply by default.
The European Court of Justice had previously struck down the Privacy Shield agreement, which had allowed transatlantic data transfers. Meta continued transferring data using an alternative mechanism called Standard Contractual Clauses, but the DPC found this was insufficient to protect EU users' rights given the extent of US government surveillance powers. Meta was ordered to stop these transfers within five months and to delete or return the data it had already sent.
Meta appealed against parts of the ruling and argued that a new data transfer agreement - the EU-US Data Privacy Framework - rendered the order unnecessary. The case highlighted the ongoing tension between global platform architectures and regional data protection regimes, and raised fundamental questions about whether a single company's infrastructure can simultaneously comply with the laws of every country it operates in.
ShopEasy is a fictional UK supermarket chain that offers loyalty cards to customers. When customers sign up, they provide their name, age, email address and home postcode. Every purchase is linked to their card and stored in ShopEasy's database.
ShopEasy sells anonymised purchasing data to a health insurance company, which uses it to identify customers who buy large amounts of alcohol, cigarettes and processed food. The insurance company uses this data when calculating premiums. ShopEasy's terms and conditions state that data "may be shared with trusted partners for commercial purposes" - but customers do not realise this includes insurance companies.
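One weakness in the ShopEasy scenario is the word "anonymised": removing names does not make data anonymous if quasi-identifiers like age and postcode remain. The toy sketch below (all data invented for illustration) shows how an insurer holding its own customer list could re-attach a name to an "anonymised" purchase record by linking on those fields.

```python
# Illustrative sketch with invented data: re-identifying "anonymised"
# records by linking quasi-identifiers (age + postcode) against a
# second dataset the recipient already holds.

# ShopEasy-style records: names removed, but age and postcode kept.
anonymised_purchases = [
    {"age": 34, "postcode": "LS6 2AB", "basket": ["wine", "cigarettes"]},
    {"age": 67, "postcode": "LS6 2AB", "basket": ["bread", "milk"]},
]

# Hypothetical data the insurer already has (e.g. policy applications).
policy_holders = [
    {"name": "A. Example", "age": 34, "postcode": "LS6 2AB"},
]

def link(records, people):
    """Re-attach names wherever (age, postcode) is a unique match."""
    matches = []
    for r in records:
        hits = [p for p in people
                if (p["age"], p["postcode"]) == (r["age"], r["postcode"])]
        if len(hits) == 1:  # unique quasi-identifier => re-identified
            matches.append((hits[0]["name"], r["basket"]))
    return matches

print(link(anonymised_purchases, policy_holders))
# The 34-year-old's basket is re-linked to a named person.
```

This is why data protection guidance treats "pseudonymised" data (names removed, identifiers retained) as still being personal data under GDPR.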
Cookie type classifier
Drag each item to the correct cookie category.
GDPR requires "freely given, specific, informed and unambiguous" consent. Most cookie consent banners are designed to make accepting all cookies the easiest option. Does this constitute genuine consent under GDPR?
Lesson 1 Worksheets
Three differentiated worksheets - recall, application and exam technique.