Ethical, legal, cultural and environmental impacts of computing
OCR J277 Paper 1 (Section 1.6) tests students on the wider impacts of technology. Questions often ask students to "discuss", "evaluate" or "give arguments for and against" — so you need balanced answers that acknowledge both benefits and drawbacks.
Ethical issues
Ethics concerns what is morally right or wrong. Computing raises many ethical questions:
AI bias
- AI systems learn from training data. If the data contains human biases (racial, gender, socioeconomic), the AI may reproduce and amplify those biases.
- Examples: facial recognition systems less accurate on darker skin tones; loan approval algorithms that disadvantage certain demographics; hiring algorithms filtering out certain groups.
- Benefit: AI can be faster and more consistent than humans. Harm: encoded bias can cause discrimination at scale without accountability.
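The mechanism behind the first bullet can be shown with a minimal sketch (the data and groups here are entirely hypothetical): a naive "hiring model" that simply learns the most common past decision for each group will faithfully reproduce any bias present in its training data.

```python
# Toy illustration with made-up data: a naive model trained on biased
# historical hiring decisions reproduces that bias in its predictions.
from collections import Counter

# Hypothetical history: equally qualified candidates, but group B was
# hired far less often in the past (bias encoded in the data).
history = ([("A", "hire")] * 8 + [("A", "reject")] * 2
           + [("B", "hire")] * 3 + [("B", "reject")] * 7)

def train(data):
    """Learn the most common past decision for each group."""
    by_group = {}
    for group, decision in data:
        by_group.setdefault(group, Counter())[decision] += 1
    return {g: c.most_common(1)[0][0] for g, c in by_group.items()}

model = train(history)
print(model)  # {'A': 'hire', 'B': 'reject'}
```

Note that nothing in the code is "prejudiced": the discrimination comes entirely from the unrepresentative training data, which is why collecting more of the same data does not automatically fix the problem.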
Censorship vs freedom of information
- Governments and platforms can restrict access to information online (e.g. blocking websites, filtering content).
- Argument for censorship: protects children from harmful content; prevents radicalisation; stops misinformation.
- Argument against: restricts freedom of expression; can be used to silence political opposition; undermines democracy.
Surveillance
- Technology enables mass surveillance: CCTV with facial recognition, tracking phones, monitoring internet traffic.
- Benefit: catches criminals; reduces terrorism; improves public safety.
- Harm: invasion of privacy; data can be misused by governments or companies; chilling effect on free speech.
The digital divide
- Not everyone has equal access to technology (internet, devices, digital skills).
- Causes: income inequality, geographic location, age, disability, lack of digital literacy.
- Impact: those without access miss out on education, job opportunities, government services and healthcare information.
- Closing the divide: affordable devices, subsidised broadband, digital literacy training.
Legal issues
Key UK legislation
| Law | Year | What it covers |
|---|---|---|
| Data Protection Act (DPA 2018) / GDPR | 2018 | How organisations collect, store and process personal data. Six data protection principles, including: lawfulness/fairness/transparency, purpose limitation, data minimisation, accuracy, storage limitation, security. |
| Computer Misuse Act (CMA 1990) | 1990 | Criminalises unauthorised access to computer systems (hacking); creating or distributing malware; modifying data without permission. |
| Copyright, Designs and Patents Act (CDPA 1988) | 1988 | Protects creators' rights over software, music, images, text. Unauthorised copying is illegal. |
Software licences
- Proprietary (closed source): source code is kept private; users usually pay for a licence; they cannot view, modify or redistribute the code.
- Open source: source code publicly available; can be used, modified and redistributed freely (subject to licence terms, e.g. GPL, MIT).
Cultural and environmental impacts
Cultural impacts
- Globalisation: technology connects people worldwide; cultures share ideas but local cultures may be eroded.
- Social media: changes how people communicate, consume news and form communities.
- Changing work patterns: remote working; gig economy (Uber, Deliveroo); automation displacing jobs.
Environmental impacts
- Data centres: use enormous amounts of electricity for servers and cooling (~1–2% of global electricity).
- E-waste: discarded electronics contain toxic materials (lead, mercury, cadmium). Improper disposal pollutes soil and water.
- Manufacturing: producing chips and devices uses rare earth metals and large amounts of water and energy.
- Circular economy: refurbishing, reusing and recycling devices; manufacturers designing for repairability (Right to Repair legislation).
Common OCR exam mistakes
- Giving one-sided answers to "discuss" questions — OCR mark schemes award marks for balanced arguments.
- Confusing the Data Protection Act with the Computer Misuse Act — DPA is about personal data; CMA is about unauthorised access/hacking.
- Forgetting the digital divide is an ethical issue — it is about inequality of access and opportunity, not just a technological fact.
- Saying AI bias "can be fixed by using more data" — more data may help but can also introduce more bias if data is still unrepresentative.