Binary vs Octal vs Hex

When to use which number system — a practical comparison for developers and students.

Published: May 4, 2026 · 7 min read

The Three Systems

Binary (Base 2) uses only 0 and 1. It is the native language of computers — every transistor is either on or off. Use binary when working with individual bits, reasoning about boolean logic, or studying how hardware processes data.

Octal (Base 8) uses digits 0–7. Each octal digit represents exactly 3 binary bits. Octal was historically popular in early computing (PDP-8, Unix file permissions). Today it's mainly used for Unix chmod values like 755 or 644.

Hexadecimal (Base 16) uses 0–9 and A–F. Each hex digit represents exactly 4 binary bits (a nibble). Hex is the most widely used non-decimal system in modern computing — memory addresses, color codes, MAC addresses, and byte values.
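All three systems describe the same integers, just in different notation. As a quick sketch, Python (used here purely for illustration) has a literal prefix for each base:

```python
# The same value 255 written as a literal in each base.
b = 0b11111111  # binary literal (base 2)
o = 0o377       # octal literal (base 8)
h = 0xFF        # hexadecimal literal (base 16)

# All three names hold the identical integer object value.
print(b, o, h)  # 255 255 255
```

The prefix only changes how the source code is written; once parsed, the value is just an ordinary integer.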

Quick Comparison

The number 255 in each system:

• Decimal: 255 (3 digits)
• Binary: 11111111 (8 digits)
• Octal: 377 (3 digits)
• Hex: FF (2 digits)
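You can reproduce the comparison above with Python's built-in conversion functions — a minimal sketch:

```python
n = 255

# Built-ins return prefixed strings:
print(bin(n))  # 0b11111111
print(oct(n))  # 0o377
print(hex(n))  # 0xff

# format() gives the bare digit strings, handy for counting digits:
print(format(n, "b"), format(n, "o"), format(n, "x"))  # 11111111 377 ff
```

Note how the digit count shrinks as the base grows: 8 binary digits, 3 octal, 2 hex.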

When to Use Binary

• Understanding CPU operations and ALU logic
• Bitwise operations (AND, OR, XOR, NOT)
• Network subnet masks
• Learning computer science fundamentals
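The subnet-mask case is a good illustration of why binary thinking matters. A sketch (the address and mask values are just examples):

```python
# 192.168.1.42 written bit by bit, one octet per underscore group:
ip   = 0b11000000_10101000_00000001_00101010
mask = 0xFFFFFF00  # a /24 mask, i.e. 255.255.255.0

network = ip & mask                 # AND keeps the bits the mask sets
host    = ip & ~mask & 0xFFFFFFFF   # the remaining bits are the host part

print(format(network, "032b"))  # 11000000101010000000000100000000
print(host)                     # 42
```

Seeing the mask as a run of 24 one-bits followed by 8 zero-bits makes the AND operation obvious in a way the dotted-decimal form hides.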

When to Use Octal

• Unix/Linux file permissions (chmod 755)
• Legacy system compatibility
• Some assembly language contexts
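The chmod case works because each octal digit is exactly one rwx triple (r=4, w=2, x=1). A sketch using the standard-library stat module:

```python
import stat

mode = 0o755  # owner=7 (rwx), group=5 (r-x), other=5 (r-x)

# Shift and mask to pull out each three-bit permission group:
owner = (mode >> 6) & 0o7
group = (mode >> 3) & 0o7
other = mode & 0o7
print(owner, group, other)  # 7 5 5

# stat.filemode renders the familiar ls-style string;
# S_IFREG marks the entry as a regular file.
print(stat.filemode(stat.S_IFREG | mode))  # -rwxr-xr-x
```

This one-digit-per-triple alignment is exactly why octal survived in Unix permissions long after it faded elsewhere.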

When to Use Hex

• Memory addresses and pointers
• Web color codes (#FF5733)
• Byte-level data inspection
• MAC addresses, IPv6, UUIDs
• Assembly language and machine code
• Cryptographic hashes (SHA-256, MD5)
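Two of these uses — color codes and byte-level inspection — fit in a few lines, since one byte is always exactly two hex digits. A sketch:

```python
# Split the web color #FF5733 into its red, green, and blue bytes.
color = 0xFF5733
r = (color >> 16) & 0xFF
g = (color >> 8) & 0xFF
b = color & 0xFF
print(r, g, b)  # 255 87 51

# bytes.hex() is a one-call hex dump: two hex digits per byte.
print(b"Hi!".hex())  # 486921
```

The same shift-and-mask pattern used for colors applies to any packed binary format.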

The Relationship

Since 8 = 2³ and 16 = 2⁴, both octal and hex are shortcuts for binary. One hex digit = 4 bits. One octal digit = 3 bits. This is why conversion between these bases is trivial — just group the binary digits.
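The grouping trick above can be sketched directly: convert a bit string by reading it in chunks, with no arithmetic on the whole number.

```python
bits = "101111010110"  # 12 bits, so it splits evenly both ways

# Hex: read 4 bits at a time (1011 1101 0110 -> b d 6)
hex_digits = "".join(
    format(int(bits[i:i + 4], 2), "x") for i in range(0, len(bits), 4)
)
print(hex_digits)  # bd6

# Octal: read 3 bits at a time (101 111 010 110 -> 5 7 2 6)
oct_digits = "".join(
    format(int(bits[i:i + 3], 2), "o") for i in range(0, len(bits), 3)
)
print(oct_digits)  # 5726
```

Each chunk converts independently, which is exactly what makes these conversions trivial to do by hand. (If the bit string's length isn't a multiple of the group size, pad it with leading zeros first.)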

Try converting between all bases with our conversion tools.