TopMyGrade


1.6.1 Ethical issues: AI bias, censorship, surveillance, the digital divide; weighing benefits and harms

Notes

Ethical, legal, cultural and environmental impacts of computing

OCR J277 Paper 1 (Section 1.6) tests students on the wider impacts of technology. Questions often ask students to "discuss", "evaluate" or "give arguments for and against" — so you need balanced answers that acknowledge both benefits and drawbacks.

Ethical issues

Ethics concerns what is morally right or wrong. Computing raises many ethical questions:

AI bias

  • AI systems learn from training data. If the data contains human biases (racial, gender, socioeconomic), the AI may reproduce and amplify those biases.
  • Examples: facial recognition systems less accurate on darker skin tones; loan approval algorithms that disadvantage certain demographics; hiring algorithms filtering out certain groups.
  • Benefit: AI can be faster and more consistent than humans. Harm: encoded bias can cause discrimination at scale without accountability.
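The "learns bias from data" point can be seen in a toy Python sketch. This is purely illustrative (the groups, numbers and loan scenario are made up, not from the OCR specification): a very simple model trained on biased historical approval decisions ends up reproducing that bias in its own predictions.

```python
# Toy example: a "loan approval" model trained on biased historical data.
# Historically, group A was approved far more often than group B, even
# though the groups are otherwise identical here. All data is invented.

historical_data = (
    [("A", True)] * 80 + [("A", False)] * 20 +   # group A: 80% approved
    [("B", True)] * 30 + [("B", False)] * 70     # group B: 30% approved
)

def train(data):
    """'Learn' an approval rate for each group from past decisions."""
    rates = {}
    for group in {g for g, _ in data}:
        decisions = [approved for g, approved in data if g == group]
        rates[group] = sum(decisions) / len(decisions)
    return rates

def predict(rates, group):
    """Approve an applicant if their group's learned rate is over 50%."""
    return rates[group] > 0.5

model = train(historical_data)
print(predict(model, "A"))  # True  - group A applicants are approved
print(predict(model, "B"))  # False - group B applicants are rejected
```

The model never sees anything about an individual applicant, only their group, yet it confidently discriminates, because the historical decisions it learned from already did. This is the sense in which AI "reproduces and amplifies" bias at scale.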

Censorship vs freedom of information

  • Governments and platforms can restrict access to information online (e.g. blocking websites, filtering content).
  • Argument for censorship: protects children from harmful content; prevents radicalisation; stops misinformation.
  • Argument against: restricts freedom of expression; can be used to silence political opposition; undermines democracy.

Surveillance

  • Technology enables mass surveillance: CCTV with facial recognition, tracking phones, monitoring internet traffic.
  • Benefit: catches criminals; reduces terrorism; improves public safety.
  • Harm: invasion of privacy; data can be misused by governments or companies; chilling effect on free speech.

The digital divide

  • Not everyone has equal access to technology (internet, devices, digital skills).
  • Causes: income inequality, geographic location, age, disability, lack of digital literacy.
  • Impact: those without access miss out on education, job opportunities, government services and healthcare information.
  • Closing the divide: affordable devices, subsidised broadband, digital literacy training.

Legal issues

Key UK legislation

  • Data Protection Act 2018 (DPA) / GDPR — 2018 — Governs how organisations collect, store and process personal data. Principles include data minimisation, purpose limitation, accuracy and security.
  • Computer Misuse Act 1990 (CMA) — 1990 — Criminalises unauthorised access to computer systems (hacking); creating or distributing malware; modifying data without permission.
  • Copyright, Designs and Patents Act 1988 (CDPA) — 1988 — Protects creators' rights over software, music, images and text. Unauthorised copying is illegal.

Software licences

  • Proprietary (closed source): the source code is kept private; users pay for a licence and cannot modify or redistribute the software.
  • Open source: source code publicly available; can be used, modified and redistributed freely (subject to licence terms, e.g. GPL, MIT).

Cultural and environmental impacts

Cultural impacts

  • Globalisation: technology connects people worldwide; cultures share ideas but local cultures may be eroded.
  • Social media: changes how people communicate, consume news and form communities.
  • Changing work patterns: remote working; gig economy (Uber, Deliveroo); automation displacing jobs.

Environmental impacts

  • Data centres: use enormous amounts of electricity for servers and cooling (~1–2% of global electricity).
  • E-waste: discarded electronics contain toxic materials (lead, mercury, cadmium). Improper disposal pollutes soil and water.
  • Manufacturing: producing chips and devices uses rare earth metals and large amounts of water and energy.
  • Circular economy: refurbishing, reusing and recycling devices; manufacturers designing for repairability (Right to Repair legislation).

Common OCR exam mistakes

  1. Giving one-sided answers to "discuss" questions — OCR mark schemes award marks for balanced arguments.
  2. Confusing the Data Protection Act with the Computer Misuse Act — DPA is about personal data; CMA is about unauthorised access/hacking.
  3. Forgetting that the digital divide is an ethical issue — it concerns inequality of access, not just a technological fact.
  4. Saying AI bias "can be fixed by using more data" — more data may help but can also introduce more bias if data is still unrepresentative.

AI-generated · claude-opus-4-7 · v3-ocr-computer-science

Practice questions

Try each before peeking at the worked solutions.

  1. Question 1 — 2 marks

    Computer Misuse Act

    Give two actions that are made illegal by the Computer Misuse Act 1990. [2 marks]



  2. Question 2 — 6 marks

    AI bias — balanced discussion

    Discuss the ethical implications of using AI systems to make decisions about loan applications. [6 marks]



  3. Question 3 — 3 marks

    Digital divide

    Explain what is meant by the digital divide and describe one step that could be taken to reduce it. [3 marks]


