/θrɛd/

noun — "smallest unit of execution within a process."

A thread is the basic unit of execution within a process: a single sequential flow of control that shares the process’s resources, such as memory, file descriptors, and global variables, while maintaining its own execution state, including program counter, registers, and stack. Threads allow a process to perform multiple operations concurrently within the same address space, enabling efficient use of CPU cores and responsiveness in multitasking applications.

Technically, a thread operates within the process’s context but maintains an independent call stack for its local variables and function calls. Modern operating systems provide kernel-level threads, user-level threads, or a hybrid model, each with different trade-offs in scheduling, performance, and context-switching overhead. Kernel threads are managed directly by the OS scheduler, allowing true parallel execution on multi-core systems. User threads, managed by a runtime library, offer lightweight context switching, but because the kernel schedules only the underlying process, a blocking call in one user thread can stall all of them.

Threads share the process’s heap and global data, which enables fast communication and data sharing. However, this shared access requires synchronization mechanisms, such as mutexes, semaphores, or condition variables, to prevent race conditions, deadlocks, or inconsistent data states. Proper synchronization ensures that multiple threads can cooperate safely without corrupting shared resources.
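The need for synchronization can be seen with a shared counter: `counter += 1` is a read-modify-write, so two threads running it concurrently can lose updates. A minimal sketch using threading.Lock (the four-thread, 100 000-iteration setup is illustrative):

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        # acquire the lock so the read-modify-write is atomic;
        # without it, concurrent increments can be lost
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 — without the lock the total could come up short
```

The `with lock:` block is equivalent to calling `lock.acquire()` before the update and `lock.release()` after, but releases the lock even if the block raises.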

From an operational perspective, threads enhance performance and responsiveness. For example, a web server may create separate threads to handle individual client requests, allowing simultaneous processing without the overhead of creating separate processes. In GUI applications, threads can separate user interface updates from background computations to maintain responsiveness.

Example in Python using threading:

import threading

def worker():
    print("Thread is running")

# create and start a new thread
t = threading.Thread(target=worker)
t.start()
t.join()  # block until worker() returns

Thread lifecycles typically include creation, ready state, running, waiting (blocked), and termination. Thread scheduling may be preemptive or cooperative, with priorities influencing execution order. In multi-core environments, multiple threads from the same process may execute simultaneously, maximizing throughput.
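These lifecycle states can be observed from Python: before start() the thread exists but is not alive, while blocked on an event it is alive, and after join() it has terminated. A small sketch using threading.Event (the 0.1 s sleep is only to let the thread reach its blocked state):

```python
import threading
import time

event = threading.Event()

def worker():
    event.wait()  # waiting (blocked) until the event is set

t = threading.Thread(target=worker)
print(t.is_alive())  # False — created, not yet started
t.start()
time.sleep(0.1)      # give the thread time to block on the event
print(t.is_alive())  # True — alive, blocked in event.wait()
event.set()          # unblock the worker
t.join()             # wait for termination
print(t.is_alive())  # False — terminated
```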

Conceptually, a thread is like a single worker within a larger team (the process). Each worker executes tasks independently while sharing common tools and resources, coordinated by the manager (the operating system) to prevent conflicts and optimize efficiency.

See Process, Scheduler, Mutex, Deadlock.