Units of information
OCR J277 Paper 1 uses the base-1000 (decimal) convention set out in the spec: kilobyte = 1,000 bytes, megabyte = 1,000 KB, and so on. (Some textbooks still use base-1024; OCR mark schemes accept either if the working is clear, but the spec answer is base-1000.)
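To see how much the convention matters, here is a quick Python sketch (illustrative only, not from the spec) comparing the two interpretations of 2 GB:

```python
# The same nominal "2 GB" under the two conventions.
size_decimal = 2 * 1000 ** 3  # base-1000, the J277 spec convention
size_binary = 2 * 1024 ** 3   # base-1024, the older textbook convention

print(size_decimal)  # 2000000000
print(size_binary)   # 2147483648 (about 7% larger)
```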
The hierarchy
| Unit | Symbol | Size |
|---|---|---|
| Bit | b | 1 binary digit (0 or 1) |
| Nibble | – | 4 bits |
| Byte | B | 8 bits |
| Kilobyte | KB | 1,000 bytes |
| Megabyte | MB | 1,000 KB = 1,000,000 bytes |
| Gigabyte | GB | 1,000 MB |
| Terabyte | TB | 1,000 GB |
| Petabyte | PB | 1,000 TB |
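One way to hold the hierarchy in your head is as a table of byte multipliers. A minimal Python sketch (the names are my own, not from the spec):

```python
# Size of each unit in bytes, base-1000 convention.
UNITS = {
    "B": 1,
    "KB": 1_000,
    "MB": 1_000_000,
    "GB": 1_000_000_000,
    "TB": 1_000_000_000_000,
    "PB": 1_000_000_000_000_000,
}

def to_bytes(value, unit):
    """Convert a quantity in the given unit to bytes."""
    return value * UNITS[unit]

print(to_bytes(5, "MB"))  # 5000000
```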
Converting between units
Each unit is 1,000 times the one below it (and a byte is 8 bits). To convert to a larger unit, divide; to convert to a smaller unit, multiply.
- 16 bits ÷ 8 = 2 bytes.
- 5,000 KB ÷ 1,000 = 5 MB.
- 2 GB × 1,000 = 2,000 MB = 2,000,000 KB = 2,000,000,000 bytes.
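The same three conversions written out as Python arithmetic (variable names are illustrative):

```python
bits = 16
num_bytes = bits // 8          # 16 bits -> 2 bytes
print(num_bytes)               # 2

kilobytes = 5_000
megabytes = kilobytes / 1_000  # 5,000 KB -> 5 MB
print(megabytes)               # 5.0

gigabytes = 2
megabytes = gigabytes * 1_000        # 2 GB -> 2,000 MB
total_bytes = megabytes * 1_000_000  # 2,000 MB -> 2,000,000,000 bytes
print(total_bytes)             # 2000000000
```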
A common OCR question pattern is "calculate the file size", e.g. an image of 600 × 400 pixels at a colour depth of 24 bits per pixel:
- pixels = 600 × 400 = 240,000
- bits = 240,000 × 24 = 5,760,000 bits
- bytes = 5,760,000 ÷ 8 = 720,000 bytes = 720 KB.
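The same calculation as a small Python function (the function name is my own; it estimates the raw pixel data only, ignoring metadata and compression):

```python
def image_file_size_bytes(width, height, colour_depth_bits):
    """Estimate image size: pixels x bits per pixel, converted to bytes."""
    pixels = width * height
    bits = pixels * colour_depth_bits
    return bits // 8

size = image_file_size_bytes(600, 400, 24)
print(size)          # 720000 bytes
print(size / 1_000)  # 720.0 KB (base-1000)
```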
Why bits and bytes?
A computer stores everything as binary. The smallest unit is the bit: a high or low voltage representing 1 or 0. The byte is the smallest addressable unit in most architectures, so RAM is addressed one byte at a time.
Common OCR exam mistakes
- Confusing b (bit) and B (byte). Network speeds quote bits/sec; file sizes quote bytes (see the sketch after this list).
- Using ÷ 1,024 when the question expects ÷ 1,000.
- Forgetting to divide by 8 when converting bits → bytes.
- Quoting "8 bits = 1 nibble" — a nibble is 4 bits.