[orig. from MIT MacLISP]
n. 1. [techspeak] A multiple-precision computer representation for very large integers. More generally, any very large number. "Have you ever looked at the United States Budget? There's bignums for you!"
2. [Stanford] In backgammon, large numbers on the dice are called 'bignums', especially a roll of double fives or double sixes (compare moby, sense 4).
See also El Camino Bignum.
Sense 1 may require some explanation. Most computer languages provide a kind of data called 'integer', but such computer integers are usually very limited in size; usually they must be smaller than 2^31 (2,147,483,648) or (on a losing bitty box) 2^15 (32,768). If you want to work with numbers larger than that, you have to use floating-point numbers, which are usually accurate to only six or seven decimal places. Computer languages that provide bignums can perform exact calculations on very large numbers, such as 1000! (the factorial of 1000, which is 1000 times 999 times 998 times ... times 2 times 1). For example, this value for 1000! was computed by the MacLISP system using bignums:
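The MacLISP listing itself is not reproduced here, but the same kind of exact computation can be sketched in modern Python, whose built-in integers are bignums (the `factorial` function below is an illustrative hand-rolled version, not a library call):

```python
def factorial(n):
    # Exact product 1 * 2 * ... * n. Python ints are arbitrary-precision
    # (bignums), so the result never overflows or loses digits.
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

# 21! already exceeds 2^63, far past any fixed-size machine integer,
# yet the result is exact:
print(factorial(21))                # 51090942171709440000

# 1000! is a 2568-digit number, still computed exactly:
print(len(str(factorial(1000))))    # 2568
```

A language with only fixed-size integers would silently wrap or overflow at 13! on a 32-bit machine; with bignums the only limit is memory.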