/pəˈræm.ɪ.tər/

noun — “the rules of engagement your functions follow when doing their job.”

A parameter is a variable used to pass information into functions, methods, or procedures in programming. It defines the input a function expects and enables dynamic behavior based on the data provided. For example, in Python you might define a function as `def greet(name):`, where `name` is a parameter the function uses to customize its output.
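A minimal sketch of that idea in Python (the greeting string is our own illustrative choice):

```python
def greet(name):
    # `name` is the parameter: a named placeholder for whatever
    # value the caller passes in.
    return f"Hello, {name}!"

print(greet("Alice"))  # prints "Hello, Alice!"
```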

Parameters can be required, optional, or carry default values, giving flexibility in how functions are called. They differ from arguments, which are the actual values passed to those parameters when a function is executed. For example, in the call `greet("Alice")`, `"Alice"` is the argument bound to the parameter `name`.
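A short sketch of those parameter kinds, using a hypothetical `brew` function:

```python
def brew(drink, size="medium", sugar=0):
    # `drink` is required; `size` and `sugar` have defaults,
    # so callers may omit them.
    return f"{size} {drink} with {sugar} sugar(s)"

print(brew("tea"))                       # only the required argument
print(brew("coffee", "large", sugar=2))  # all parameters supplied
```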

Parameters are crucial for modular, reusable code. They allow functions to operate on different inputs without rewriting the function logic. Many programming languages, including C, C++, Java, Python, and JavaScript, rely heavily on parameters to structure programs efficiently.

Parameter works closely with Arg in functions, scripts, and command-line operations. In CLI contexts, named options or flags often act as parameters for a command, while the values supplied for them are the arguments. For example, `tar -czvf archive.tar.gz folder/` combines flags (`-czvf`) with inputs (`archive.tar.gz`, `folder/`) that together shape the behavior of the `tar` command.
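One way to see the CLI split between parameters and arguments is with Python's `argparse` module. This is a toy sketch, not the real `tar` interface; the flag names are illustrative only:

```python
import argparse

# The parser declares the parameters a command accepts.
parser = argparse.ArgumentParser(prog="archive")
parser.add_argument("folder")                                 # positional parameter
parser.add_argument("-o", "--output", default="out.tar.gz")   # named option with a default
parser.add_argument("-v", "--verbose", action="store_true")   # boolean flag

# The list below stands in for what a user types on the command line:
# those strings are the arguments filling the declared parameters.
args = parser.parse_args(["folder/", "-o", "archive.tar.gz", "-v"])
print(args.folder, args.output, args.verbose)
```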

Conceptually, a parameter is like a vending machine's selection slot: the machine defines what kind of choice it expects (the parameter), and the drink you actually pick is what fills that slot (the argument).

Parameter is like giving your function a tiny note: “Do this this way, not that way, unless told otherwise.”

See Arg, Function, Command Line Interface, Shell Scripting, Automation.