<computer programming> /big'nuhm/ (Originally from MIT MacLISP) A multiple-precision computer representation for very large integers.

Most computer languages provide an "integer" data type, but such machine integers are limited in size: typically they must be smaller than 2^31 (2,147,483,648) or, on a bitty box, 2^15 (32,768). To work with larger numbers you would otherwise have to use floating-point numbers, which are usually accurate to only six or seven significant digits. Computer languages that provide bignums can perform exact calculations on very large numbers, such as 1000! (the factorial of 1000, which is 1000 times 999 times 998 times ... times 2 times 1).
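For instance, in a language with built-in bignum support such as Python (used here purely as an illustration; the term itself comes from MacLISP), 1000! can be computed exactly rather than approximately:

   import math

   # Python's int is a bignum type, so the full result (2,568 decimal
   # digits) is computed exactly, with no overflow or rounding.
   print(math.factorial(1000))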

(01 Feb 1996)
