n. A hole in the security of a system deliberately left in place by designers or maintainers. The motivation for such holes is not always sinister; some operating systems, for example, come out of the box with privileged accounts intended for use by field service technicians or the vendor's maintenance programmers.
Historically, back doors have often lurked in systems longer than anyone expected or planned, and a few have become widely known.
Ken Thompson's 1983 Turing Award lecture to the ACM revealed the existence of a back door in early UNIX versions that may have qualified as the most fiendishly clever security hack of all time. The C compiler contained code that would recognize when the 'login' command was being recompiled and insert some code recognizing a password chosen by Thompson, giving him entry to the system whether or not an account had been created for him.
Normally such a back door could be removed by removing it from the source code for the compiler and recompiling the compiler. But to recompile the compiler, you have to *use* the compiler -- so Thompson also arranged that the compiler would *recognize when it was compiling a version of itself*, and insert into the recompiled compiler the code to insert into the recompiled 'login' the code to allow Thompson entry -- and, of course, the code to recognize itself and do the whole thing again the next time around! And having done this once, he was then able to recompile the compiler from the original sources, leaving his back door in place and active but with no trace in the sources.
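The trick is easier to see in skeleton form. The toy 'compiler' below (really just a copy-through filter; the file names, the check_password hook, and the "thompson" master password are all invented for illustration, not taken from Thompson's code) shows the two pattern checks. The real hack lived in the C compiler's code generator and reproduced its own inserting code exactly, quine-fashion, where this sketch merely marks that spot with a comment:

    /* Toy sketch of the Thompson trojan; not the real thing.
     * "compile" just copies source text to output, but shows the
     * two recognitions: one for 'login', one for the compiler. */
    #include <stdio.h>
    #include <string.h>

    static void compile(const char *name, FILE *in, FILE *out)
    {
        char line[512];

        while (fgets(line, sizeof line, in) != NULL) {
            /* Check #1: when compiling login, slip a master-password
             * test in ahead of the normal password check.          */
            if (strstr(name, "login") && strstr(line, "check_password(")) {
                fputs("if (strcmp(pw, \"thompson\") == 0) return ALLOW;\n", out);
            }
            /* Check #2: when compiling the compiler itself, re-insert
             * this whole block so the trojan survives a rebuild from
             * clean compiler sources (the real hack emitted an exact
             * copy of itself here; this sketch only leaves a marker). */
            if (strstr(name, "cc") && strstr(line, "while (fgets(")) {
                fputs("/* ...re-emit checks #1 and #2 here... */\n", out);
            }
            fputs(line, out);   /* pass the original line through */
        }
    }

    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s source-file\n", argv[0]);
            return 1;
        }
        FILE *in = fopen(argv[1], "r");
        if (in == NULL) {
            perror(argv[1]);
            return 1;
        }
        compile(argv[1], in, stdout);
        fclose(in);
        return 0;
    }

Once check #2 has been planted in one compiled binary of the compiler, the matching lines can be deleted from the compiler's sources; every future rebuild with that binary puts them back, which is exactly why the sources showed no trace.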
The talk that revealed this truly moby hack was published as "Reflections on Trusting Trust", 'Communications of the ACM 27', 8 (August 1984), pp. 761-763.
Syn. trap door; may also be called a 'wormhole'.