Code

/kōd/

noun — "a system of symbols or rules used to represent information."

Code is a structured system for representing, communicating, or storing information using a defined set of symbols, rules, or conventions. In computing, cryptography, and digital communication, code refers to any method by which data or instructions are expressed in a form that can be transmitted, processed, or interpreted according to a predefined scheme. It emphasizes the *representation* of meaning rather than the meaning itself.

Technically, a code maps a source of information, such as letters, numbers, commands, or logical operations, into a symbolic representation. This mapping can serve multiple purposes:

  • Compression — reducing the size of information for efficient storage or transmission (e.g., Huffman coding).
  • Error detection and correction — enabling detection or recovery from errors during transmission (e.g., parity bits, Reed-Solomon codes; a minimal parity sketch follows this list).
  • Encryption or obfuscation — hiding information from unauthorized readers (overlapping with ciphers).
  • Machine instructions — representing commands that a processor executes in digital electronics and computing.
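
For instance, the error-detection purpose above can be illustrated with a single parity bit: one extra bit is appended so the total number of 1s is even, letting a receiver detect (but not correct) any single flipped bit. A minimal sketch in Python, with illustrative values:

# even-parity code: append one bit so the total number of 1s in the codeword is even
def encode_with_parity(bits):
    return bits + [sum(bits) % 2]

def parity_ok(codeword):
    # True when no single-bit error is detected (total number of 1s is even)
    return sum(codeword) % 2 == 0

codeword = encode_with_parity([1, 0, 1, 1])        # -> [1, 0, 1, 1, 1]
corrupted = codeword.copy()
corrupted[2] ^= 1                                  # flip one bit "in transit"
print(parity_ok(codeword), parity_ok(corrupted))   # True False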

 

In software development, code often refers to human-readable instructions written in programming languages such as Python, C++, or Java. These instructions are ultimately compiled or interpreted into machine-readable formats so that a processor can perform the desired operations. In cryptography, a code translates entire words or phrases into alternative symbols according to a predefined dictionary, distinguishing it from a cipher, which typically operates on individual letters or bits.


# conceptual example: simple binary code representation
# mapping letters to 8-bit binary codes (ASCII values written as bit strings)
A = "01000001"
B = "01000010"
C = "01000011"
# encoding the word "CAB"
word = ["C", "A", "B"]
binary_encoded = [C, A, B]   # ['01000011', '01000001', '01000010']

Code also encompasses standards such as ASCII, Unicode, and Morse code, which provide a systematic mapping between symbols and data representations. These codes allow devices, humans, and software to interpret information consistently across different systems and platforms. In this sense, code is both the language and the grammar of digital and symbolic communication.

Conceptually, a code is a bridge between abstract meaning and practical implementation. It defines how ideas, messages, or instructions are represented so they can be transmitted, stored, or executed. In digital systems, proper coding ensures accuracy, interoperability, efficiency, and security, making it a cornerstone of modern computing, networking, and cryptography.

See Cipher, Encryption, Levenshtein Distance, Caesar Cipher, Ottendorf Cipher, Affine Cipher, ASCII, Unicode.

Universal Asynchronous Receiver/Transmitter

/ˈjuːɑːrt/

noun — "asynchronous serial link for device communication."

UART, short for Universal Asynchronous Receiver/Transmitter, is a hardware communication module used to send and receive serial data asynchronously between a processor and peripheral devices. It converts parallel data from a CPU or microcontroller into a sequential stream of bits for transmission, and conversely reconstructs incoming serial data into parallel form for the processor. UARTs are fundamental in embedded systems, serial consoles, and point-to-point communication over short distances.

Technically, a UART handles the framing and serialization side of an asynchronous serial link: it manages start and stop bits, parity generation and checking, and transmit/receive buffering, while the electrical signaling levels are typically provided by external line drivers. Each transmitted byte is encapsulated with:

  • 1 start bit signaling the beginning of transmission
  • 5–8 data bits carrying the payload
  • Optional parity bit for error detection
  • 1–2 stop bits indicating the end of the byte

The transmitting and receiving devices must agree on the **baud rate**—the number of bits transmitted per second—to correctly interpret the timing of each bit.

 


# conceptual UART transmit: framing one byte by hand (illustrative Python, not a hardware driver)
TX_byte = 0xA5
data_bits = [(TX_byte >> i) & 1 for i in range(8)]           # 8 data bits, least significant bit first
frame = [0] + data_bits + [sum(data_bits) % 2] + [1]         # start | 8 data bits | even parity | 1 stop bit
# receiver reconstructs the byte from the serial stream
RX_byte = sum(bit << i for i, bit in enumerate(frame[1:9]))  # strip framing, reassemble the payload (0xA5)

In embedded workflows, UART provides a simple, low-overhead channel for debugging, logging, device configuration, and peripheral control. It is widely supported across microcontrollers, CPUs, and FPGA boards. While UART is limited to short-distance, point-to-point links, it is highly reliable, does not require a shared clock, and allows flexible framing and error detection.

Conceptually, UART is like a mail courier who packages letters (bytes) with a clear start and end envelope and ensures both sender and receiver agree on the delivery speed and format. Each byte is sent sequentially, and timing mismatches or framing errors can be detected at the receiver; a parity bit can flag a single-bit error, though it cannot correct it.

See SPI, I²C, GPIO, Microcontroller, Embedded Systems.

Message Queuing Telemetry Transport

/ˌɛm.kjuːˌtiːˈtiː/

noun — "lightweight messaging protocol for IoT devices."

MQTT, short for Message Queuing Telemetry Transport, is a lightweight, publish-subscribe messaging protocol optimized for constrained devices and low-bandwidth, high-latency, or unreliable networks. It enables efficient, asynchronous communication between clients and brokers, making it widely used in Internet of Things (IoT) applications.

Technically, MQTT operates over TCP/IP and defines three Quality of Service (QoS) levels: QoS 0 (at most once), QoS 1 (at least once), and QoS 2 (exactly once). Messages are published to topics, and subscribers receive messages from the topics they are interested in. The protocol uses a compact fixed header (as small as 2 bytes), minimizing overhead and allowing reliable messaging even on limited hardware.

An MQTT system consists of clients (publishers and subscribers) and a broker. Publishers send messages to topics on the broker, which manages delivery to subscribers. The broker can retain messages for new subscribers, support persistent sessions, and handle thousands of concurrent connections efficiently.

In workflow terms, a temperature sensor in a smart home might publish readings to a topic named home/temperature. Multiple subscribers, such as a monitoring dashboard, an alerting system, or a logging service, receive the readings independently and in near real-time. Publishers and subscribers are decoupled, allowing each component to scale or fail without impacting the others.
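
As a hedged illustration of this workflow, the sketch below uses the Python paho-mqtt client (1.x-style API); the broker address, topic, and payload are placeholders rather than a prescribed setup:

# minimal publish/subscribe sketch with paho-mqtt (1.x-style API); values are placeholders
import time
import paho.mqtt.client as mqtt
import paho.mqtt.publish as publish

def on_message(client, userdata, message):
    # called by the client loop for every message on a subscribed topic
    print(message.topic, message.payload.decode())

subscriber = mqtt.Client()
subscriber.on_message = on_message
subscriber.connect("broker.example.com", 1883)
subscriber.subscribe("home/temperature", qos=1)    # QoS 1: at-least-once delivery
subscriber.loop_start()                            # handle network traffic in a background thread

# publisher: e.g., a temperature sensor pushing a reading to the topic
publish.single("home/temperature", payload="21.5", qos=1, hostname="broker.example.com")

time.sleep(1)                                      # allow the broker to route the message
subscriber.loop_stop()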

Conceptually, MQTT is a minimalist switchboard for device-to-device messaging, designed to keep communication reliable, low-overhead, and asynchronous across large networks of sensors and actuators.

See Pub/Sub, Streaming, Kafka.

Pub/Sub

/pʌb ˈsʌb/

noun — "asynchronous messaging model for decoupled communication."

Pub/Sub (short for Publish/Subscribe) is a messaging pattern in which senders (publishers) do not send messages directly to specific receivers (subscribers), but instead categorize messages into channels or topics. Subscribers express interest in one or more topics and receive only messages that match those topics. This decouples the sender and receiver, enabling scalable, asynchronous communication across distributed systems.

Technically, a Pub/Sub system consists of three core components: the publisher, the subscriber, and the message broker. Publishers push messages to the broker, which handles routing based on topic subscriptions. Subscribers register interest in topics, and the broker ensures delivery, either immediately or via persistent queues. This mechanism supports both transient and durable subscriptions depending on system requirements.

Common implementations include cloud messaging services like Google Cloud Pub/Sub, Kafka (when used in pub/sub mode), and MQTT brokers. Messages may carry structured payloads in formats such as JSON, Protocol Buffers, or Avro. Delivery semantics vary by system: at-most-once, at-least-once, or exactly-once guarantees, affecting reliability and idempotency considerations.

In workflow terms, a sensor network might publish telemetry data to a topic like temperature-readings. Multiple processing services subscribe to this topic: one logs the readings, another triggers alerts if thresholds are exceeded, and another aggregates metrics. None of these services need direct knowledge of each other, and the publisher does not need to know the number of subscribers.
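
The routing logic can be sketched in a few lines. The toy in-process broker below is illustrative only (real deployments rely on a dedicated broker such as Kafka, Google Cloud Pub/Sub, or an MQTT server), but it shows how publishers and subscribers stay decoupled through topics:

# minimal in-process Pub/Sub sketch (illustrative only)
from collections import defaultdict

class Broker:
    def __init__(self):
        self._subscribers = defaultdict(list)      # topic -> list of handler callables

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        # the publisher never addresses subscribers directly; the broker routes by topic
        for handler in self._subscribers[topic]:
            handler(message)

broker = Broker()
broker.subscribe("temperature-readings", lambda msg: print("logger:", msg))
broker.subscribe("temperature-readings", lambda msg: print("alerts:", msg))
broker.publish("temperature-readings", "21.5 C")   # both subscribers receive the reading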

Pub/Sub is often paired with streaming systems for real-time event handling. Publishers continuously generate events, brokers buffer and route them, and subscribers process events asynchronously. This allows scaling out consumers independently, supporting high-throughput, low-latency architectures common in microservices and IoT ecosystems.

Conceptually, Pub/Sub acts as a message switchboard: publishers deposit messages at labeled slots (topics), and subscribers pull from slots they care about. This abstraction simplifies complex, distributed communication patterns, promotes loose coupling, and allows systems to evolve independently without breaking contracts between producers and consumers.

See Streaming, MQTT, Kafka.

Communication

/kəˌmjuːnɪˈkeɪʃən/

noun — "Exchange of information between entities."

Communication in computing refers to the transfer of data or signals between systems, devices, or components to achieve coordinated operation or information sharing. It encompasses both hardware and software mechanisms, protocols, and interfaces that enable reliable, timely, and accurate data exchange. Effective communication is essential for networking, distributed systems, and embedded control applications.

Key characteristics of Communication include:

  • Medium: can be wired (e.g., Ethernet, USB) or wireless (e.g., Wi-Fi, radio, Bluetooth).
  • Protocol: defines rules for data formatting, synchronization, error detection, and recovery.
  • Directionality: simplex, half-duplex, or full-duplex communication.
  • Reliability: mechanisms like ECC or acknowledgments ensure data integrity.
  • Speed and latency: bandwidth and propagation delay affect performance of communication channels.

Workflow example: Simple message exchange over TCP/IP:

import socket

client_socket = socket.create_connection(("server_address", 8080))   # host and port are placeholders
client_socket.sendall(b"Hello, Server!")
response = client_socket.recv(1024)
print(response.decode())
client_socket.close()

Here, the client and server exchange data over a network using a communication protocol that guarantees delivery and order.

Conceptually, Communication is like passing a note in class: the sender encodes a message, the medium carries it, and the receiver decodes and interprets it, ideally without errors or delays.

See Radio, Error-Correcting Code, Protocol, Network, Data Transmission.

Radio

/ˈreɪdioʊ/

noun — "Information carried on invisible waves."

Radio is the technology and physical phenomenon by which information is transmitted through space using electromagnetic waves in the radio-frequency portion of the Electromagnetic Spectrum. It enables communication without physical conductors by encoding information onto oscillating electric and magnetic fields that propagate at the speed of light. These waves can travel through air, vacuum, and some solid materials, making radio foundational to wireless communication.

At its core, radio works by generating a carrier wave at a specific frequency and modifying that wave to represent information. This modification process is called Modulation. The modulated signal is converted into electromagnetic radiation by an Antenna, which couples electrical energy into free space. On the receiving side, another antenna captures a small portion of that energy, converting it back into an electrical signal that can be amplified, demodulated, and interpreted.
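
As a rough numerical sketch of this process, the snippet below performs simple amplitude modulation, assuming the numpy library; the frequencies and modulation depth are illustrative, not tied to any particular radio service:

# amplitude modulation: the information signal varies the amplitude of a higher-frequency carrier
import numpy as np

fs = 1_000_000                                 # sample rate, 1 MHz
t = np.arange(0, 0.01, 1 / fs)                 # 10 ms of time samples
message = np.sin(2 * np.pi * 1_000 * t)        # 1 kHz baseband "information" signal
carrier = np.sin(2 * np.pi * 100_000 * t)      # 100 kHz carrier wave
modulated = (1 + 0.5 * message) * carrier      # modulation depth 0.5
# 'modulated' is what an antenna would radiate; a receiver demodulates it to recover 'message'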

Radio systems are defined by several technical characteristics. Frequency determines how fast the electromagnetic field oscillates and influences range, bandwidth, and penetration through obstacles. Bandwidth determines how much information can be carried per unit time. Power affects range but is constrained by regulation and interference concerns. Noise, both natural and man-made, introduces uncertainty that limits reliability. These constraints are not arbitrary; they are governed by the mathematics of Information Theory, which formalizes how much information can be transmitted over a noisy channel.

A critical theoretical boundary in radio communication is the Shannon Limit. It defines the maximum achievable data rate for a given bandwidth and signal-to-noise ratio, assuming optimal encoding and decoding. No matter how advanced the hardware becomes, no radio system can exceed this limit without changing the physical parameters of the channel. Modern digital radio techniques are designed to approach this boundary as closely as possible.

In practical workflows, radio underlies a vast range of systems. In broadcast radio, audio signals are modulated onto carrier waves and transmitted from high-power towers to many passive receivers. In mobile communications, devices dynamically adjust frequency, power, and modulation to maintain reliable links while moving through changing environments. In satellite systems, radio waves traverse long distances through space, requiring precise timing, encoding, and error correction to compensate for delay and noise.

Radio communication can be analog or digital. Analog radio varies the carrier continuously, directly reflecting the source signal. Digital radio encodes information as discrete symbols, enabling robust error detection and correction. Digital techniques allow multiple users to share spectrum efficiently and make better use of limited bandwidth, which is why modern wireless systems overwhelmingly rely on digital radio.

The behavior of radio waves is shaped by physics. Lower frequencies tend to travel farther and diffract around obstacles, while higher frequencies support greater data rates but are more easily blocked or absorbed. Reflection, diffraction, and scattering cause multipath effects, where multiple delayed copies of a signal arrive at the receiver. Radio system design accounts for these effects using signal processing and adaptive techniques.

Conceptually, radio is like tossing structured ripples into a vast, invisible ocean. The ripples spread outward, weakened by distance and disturbed by interference, yet with the right encoding and listening strategy, meaning can still be recovered from the motion of the waves.

See Electromagnetic Spectrum, Modulation, Antenna, Information Theory, Shannon Limit.

Information Theory

/ˌɪnfərˈmeɪʃən ˈθiəri/

noun — "Mathematics of encoding, transmitting, and measuring information."

Information Theory is the formal mathematical framework developed to quantify information, analyze communication systems, and determine limits of data transmission and compression. Introduced by Claude Shannon, it underpins modern digital communications, coding theory, cryptography, and data compression. At its core, Information Theory defines how much uncertainty exists in a message, how efficiently information can be transmitted over a noisy channel, and how error-correcting codes can approach the theoretical limits.

Key characteristics of Information Theory include:

  • Entropy: a measure of the average information content or uncertainty of a random variable.
  • Mutual information: quantifies the amount of information shared between two variables.
  • Channel capacity: the maximum rate at which data can be reliably transmitted over a communication channel, as formalized by the Shannon Limit.
  • Error correction: forms the theoretical basis for LDPC, Turbo Codes, and other forward error correction (FEC) schemes.
  • Data compression: defines limits for lossless and lossy compression, guiding algorithms such as Huffman coding or arithmetic coding.

Workflow example: In a digital communication system, Information Theory is applied to calculate the entropy of a source signal, design an efficient code to transmit the data, and select error-correcting schemes that maximize throughput while maintaining reliability. Engineers analyze the signal-to-noise ratio (SNR) and bandwidth to approach the Shannon Limit while minimizing errors.

# Example: calculate the entropy of a discrete source (Python)
import math
probabilities = [0.5, 0.25, 0.25]
entropy = -sum(p * math.log2(p) for p in probabilities)
print("Entropy: " + str(entropy) + " bits")
# Output: Entropy: 1.5 bits

Conceptually, Information Theory is like designing a postal system: it determines how many distinct messages can be reliably sent over a limited channel, how to package them efficiently, and how to ensure they arrive intact even in the presence of noise or interference.

See Shannon Limit, LDPC, Turbo Codes, FEC.

Shannon Limit

/ˈʃænən ˈlɪmɪt/

noun — "Maximum reliable information rate of a channel."

Shannon Limit, named after Claude Shannon, is the theoretical maximum rate at which information can be transmitted over a communication channel with a specified bandwidth and noise level, while achieving error-free transmission. Formally defined in information theory, it sets the upper bound for channel capacity (C) given the signal-to-noise ratio (SNR) and bandwidth (B) using the Shannon-Hartley theorem: C = B * log2(1 + SNR).

Key characteristics of the Shannon Limit include:

  • Channel capacity: represents the absolute maximum data rate achievable under ideal encoding without error.
  • Dependence on noise: higher noise reduces the capacity, requiring more sophisticated error-correcting codes to approach the limit.
  • Fundamental bound: no coding or modulation scheme can exceed the Shannon Limit, making it a benchmark for communication system design.
  • Practical significance: real-world systems aim to approach the Shannon Limit using advanced techniques like LDPC or Turbo Codes to maximize efficiency.

Workflow example: In modern fiber-optic networks, engineers measure the channel’s SNR and bandwidth, then select modulation formats and forward error correction schemes to operate as close as possible to the Shannon Limit. This ensures maximum throughput without exceeding physical constraints.

# Example: Shannon-Hartley capacity calculation (Python)
import math
bandwidth = 1e6                                   # 1 MHz
snr = 10                                          # linear signal-to-noise ratio (not dB)
capacity = bandwidth * math.log2(1 + snr)
print("Max channel capacity: " + str(capacity) + " bits per second")
# for 1 MHz bandwidth and an SNR of 10, capacity is approximately 3.46 Mbit/s

Conceptually, the Shannon Limit is like a pipe carrying water: no matter how clever the plumbing, the flow cannot exceed the pipe’s physical capacity. Engineers design systems to maximize flow safely, approaching the limit without causing overflow (errors).

See LDPC, Turbo Codes, Information Theory, Signal-to-Noise Ratio.

Very-high-bit-rate Digital Subscriber Line 2

/ˈviː.diː.ɛs.ɛl.tuː/

noun — "squeezing fiber-class speed out of copper."

VDSL2, short for Very-high-bit-rate Digital Subscriber Line 2, is an enhanced broadband access technology that delivers high-speed data over existing copper telephone lines. It improves upon VDSL by supporting higher data rates, wider frequency bands, and better performance over short loop lengths, making it a key technology for last-mile broadband.

Technically, VDSL2 uses discrete multitone (DMT) modulation across multiple frequency profiles, allowing downstream speeds that can exceed 100 Mbps under ideal conditions. It is typically deployed from a DSLAM (digital subscriber line access multiplexer) or a street-level distribution point unit (DPU), where the copper run to the customer is short enough to preserve signal quality. Features such as vectoring further reduce crosstalk between lines, increasing stability and throughput.

Key characteristics of VDSL2 include:

  • High data rates: significantly faster than ADSL and early VDSL.
  • Short-loop optimization: best performance when fiber is close to the user.
  • Advanced modulation: relies on DMT and multiple profiles.
  • Vectoring support: minimizes interference between copper pairs.
  • Upgrade path: bridges legacy copper and newer technologies like G.fast.

In practical deployments, VDSL2 is commonly used in fiber-to-the-cabinet (FTTC) architectures, where fiber reaches a neighborhood cabinet and copper completes the final connection to homes or offices. This approach delivers high speeds without the cost of full fiber installation.

Conceptually, VDSL2 is like putting a high-performance engine into an old road: the road stays the same, but the ride gets much faster.

Intuition anchor: VDSL2 extracts maximum broadband performance from existing copper lines.

See ADSL, SDSL, Bandwidth.

Virtual Local Area Network

/viː.læn/

noun — "the invisible walls that organize a network."

VLAN, short for Virtual Local Area Network, is a network configuration that segments a physical LAN into multiple logical networks, allowing devices to be grouped together based on function, department, or security requirements rather than physical location. VLANs improve traffic management, enhance security, and reduce broadcast domains within enterprise networks.

Technically, VLANs use tagging protocols such as IEEE 802.1Q to mark Ethernet frames, enabling switches to identify and segregate traffic. Switches and routers enforce VLAN boundaries, apply quality-of-service (QoS) policies, and support inter-VLAN routing to allow controlled communication between segments.
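
As a hedged, conceptual illustration of 802.1Q tagging, the Python sketch below inserts a 4-byte VLAN tag into an untagged Ethernet frame; the frame contents and VLAN ID are placeholders:

# conceptual sketch: inserting an IEEE 802.1Q tag into an Ethernet frame
import struct

def tag_frame(frame, vlan_id, priority=0):
    # the 4-byte tag sits between the source MAC address and the EtherType field
    tpid = 0x8100                                  # Tag Protocol Identifier for 802.1Q
    tci = (priority << 13) | (vlan_id & 0x0FFF)    # PCP (3 bits) | DEI (1 bit, 0 here) | VLAN ID (12 bits)
    tag = struct.pack("!HH", tpid, tci)
    return frame[:12] + tag + frame[12:]           # first 12 bytes = destination MAC (6) + source MAC (6)

untagged = bytes(12) + b"\x08\x00" + b"payload"    # placeholder MACs + IPv4 EtherType + payload
tagged = tag_frame(untagged, vlan_id=10)           # frame now carries VLAN 10 in its 802.1Q header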

Key characteristics of VLANs include:

  • Segmentation: separates network traffic into logical groups.
  • Traffic control: improves performance and reduces congestion.
  • Security: limits access to sensitive resources.
  • Scalability: easy to reconfigure without changing physical cabling.
  • Inter-VLAN communication: controlled via routers or Layer 3 switches.

In practical workflows, network engineers configure VLANs on switches to isolate departments, separate guest Wi-Fi traffic, or prioritize critical applications, ensuring efficient and secure network operation.

Conceptually, a VLAN is like having separate rooms in an open office: everyone shares the same building but works in isolated, well-defined spaces.

Intuition anchor: VLAN organizes networks logically, giving control and security without extra hardware.

See Switch, QoS, LAN, Router, IP.