Cambridge Analytica gained access to the personal data of millions of Facebook users through a seemingly harmless personality quiz. Although only around 250,000 users consented to the quiz, the app harvested data from their entire friend networks, affecting over 75 million users (Kanakia, Shenoy and Shah, 2019). This data was later used to build detailed psychographic profiles, analysing users' personality traits, interests, and behavioural patterns. These profiles were then deployed in targeted political advertising during both the 2016 U.S. presidential election and the Brexit campaign, raising widespread concerns about digital privacy and the manipulation of democratic processes (Méndez Egea, 2021).
A similar incident occurred with the Flo Health app, a period and fertility tracker used by over 100 million people. Flo promised to keep sensitive health data private. Yet a 2021 investigation by the U.S. Federal Trade Commission (FTC) found that the app had been sharing user data with third-party companies such as Facebook and Google (FTC, 2021). This included menstrual cycles, pregnancy intentions, and wellness data collected through daily survey-style inputs (Tovino, 2019). These disclosures contradicted Flo's privacy policy, and users were not adequately informed of them.
Both cases involve the deceptive use of surveys to gather personal information under the guise of offering value or insights. In doing so, they violated ethical standards around informed consent, autonomy, and transparency. The Cambridge Analytica case raised questions about political manipulation, while the Flo Health case provoked concern about the commodification of women’s health data.
Legally, both incidents exposed weaknesses in digital regulation. The Cambridge Analytica scandal helped accelerate GDPR enforcement, while Flo's FTC settlement mandated stronger privacy practices and independent third-party auditing. Socially, both cases contributed to growing mistrust of digital platforms, particularly those that handle sensitive personal data.
Professionally, these actions conflict with the ACM (2018) Code of Ethics (sections 1.6 and 1.7) and the BCS (2022) Code of Conduct, which emphasise respecting user privacy, maintaining confidentiality, and acting with integrity. The computing professionals involved had a responsibility to protect users and to ensure that data was not misused, regardless of business or political pressure.
References
Association for Computing Machinery (2018) ACM Code of Ethics and Professional Conduct. Available at: https://www.acm.org/code-of-ethics.
British Computer Society (2022) BCS Code of Conduct. Available at: https://www.bcs.org/membership-and-registrations/become-a-member/bcs-code-of-conduct.
Federal Trade Commission (FTC) (2021) Developer of Popular Women’s Fertility-Tracking App Settles FTC Allegations that It Misled Consumers About the Disclosure of their Health Data, Federal Trade Commission. Available at: https://www.ftc.gov/news-events/news/press-releases/2021/01/developer-popular-womens-fertility-tracking-app-settles-ftc-allegations-it-misled-consumers-about.
Kanakia, H., Shenoy, G. and Shah, J. (2019) ‘Cambridge Analytica–a case study’, Indian Journal of Science and Technology, 12(29), pp. 1–5. Available at: https://sciresol.s3.us-east-2.amazonaws.com/IJST/Articles/2019/Issue-29/Article8.pdf.
Méndez Egea, B. (2021) ‘Data’s use and abuse in political communication. Cambridge Analytica and the Brexit campaign, a case study’. Available at: https://repositorio.comillas.edu/xmlui/handle/11531/46838.
Tovino, S.A. (2019) ‘Going rogue: Mobile research applications and the right to privacy’, Notre Dame Law Review, 95, p. 155. Available at: https://scholars.law.unlv.edu/cgi/viewcontent.cgi?article=2307&context=facpub.