Hello,

In preparation for a talk I'm going to give, I wanted to estimate how good 128-bit MD5 hashes are: how many hashes must be taken before the probability of a collision becomes non-negligible? (I'm assuming every hash value is equally likely.)

The brute-force approach is simply to compute the non-collision probability for k distinct hashes, and keep going until it drops noticeably below 1. This probability is (writing N for 2^128):

    N * (N - 1) * (N - 2) * ... * (N - k + 1)
    -----------------------------------------
                      N^k

I tried computing this with the bignum library that comes with OCaml, and it slows to a crawl very quickly (around k ~ 1000).

So I tried to be more subtle and approximate the result (using Stirling's approximation of factorials), but OCaml's floats are not precise enough to yield anything significant. (I'm trying to compute the log of the approximation of N! / (N^k * (N - k)!), which is N (ln N) - N - (k (ln N) + (N - k)(ln (N - k)) - (N - k)).)

Is there a library with better floating-point precision than OCaml's?

Thanks,
Alan
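
For concreteness, here is a minimal sketch of the log-space computation using only the standard library. It is an illustrative workaround, not a definitive answer: the idea (my assumption, not something established above) is that summing log1p (-. i /. n) term by term sidesteps the precision loss, because computing 1.0 -. i /. n directly would round to exactly 1.0 when i/N is around 1e-36:

```ocaml
(* N = 2^128 as a float; this is exact, since it's a power of two. *)
let n = 2.0 ** 128.0

(* ln P(no collision among k hashes) = sum over i = 1 .. k-1 of ln(1 - i/N).
   log1p evaluates ln(1 + x) accurately for tiny x, which an explicit
   1.0 -. x subtraction would lose entirely. *)
let log_p_no_collision k =
  let s = ref 0.0 in
  for i = 1 to k - 1 do
    s := !s +. log1p (-. float_of_int i /. n)
  done;
  !s

let () =
  (* For k = 1000 this comes out around -1.5e-33, i.e. the non-collision
     probability is still astronomically close to 1. *)
  Printf.printf "ln P(no collision, k=1000) = %g\n" (log_p_no_collision 1000)
```

The loop is O(k), so it only postpones the scaling problem; but it answers the precision question for moderate k without any external library.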
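
As a cross-check on the order of magnitude, the standard birthday approximation (my addition, not part of the question above) says P(collision) is roughly 1 - exp(-k*(k-1)/(2N)), which reaches 1/2 near k = sqrt(2 N ln 2):

```ocaml
(* N = 2^128 as a float. *)
let n = 2.0 ** 128.0

(* Birthday-bound estimate (an approximation I'm assuming, not derived in
   the post): the k at which a collision becomes as likely as not. *)
let k_half = sqrt (2.0 *. n *. log 2.0)

let () =
  (* Roughly 2.2e19 hashes, i.e. on the order of 2^64. *)
  Printf.printf "k at P(collision) = 1/2: %g\n" k_half
```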