> key sizes we'd have thought totally reasonable in 1995 are totally inadequate today.

Not really. The last big jump in complexity was in 1990, when the ninth Fermat number, 2^512 + 1, became the first 512-bit modulus factored by NFS [1]. What has progressed the most, by far, is the available computational power.
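
For scale, that jump was in the exponent itself. In the usual L-notation (a textbook sketch, with the o(1) terms suppressed), the quadratic sieve and NFS run in:

    L_N[s, c] = \exp\big((c + o(1))\,(\ln N)^s (\ln\ln N)^{1-s}\big)
    \text{QS: } L_N[1/2,\,1] \qquad \text{NFS: } L_N[1/3,\,(64/9)^{1/3}] \approx L_N[1/3,\,1.923]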

[1] http://www.ams.org/journals/mcom/1993-61-203/S0025-5718-1993...



The last big theoretical jump in complexity was NFS, but is your argument that implementations of NFS haven't improved much, apart from the raw computational power of the same sort available to general-purpose computers (i.e., non-vectorized instruction sets)?

You know much more than I do about this stuff, but I think I'm prepared to challenge you on this. :)


NFS implementations have certainly improved, and certain sub-algorithms as well, e.g. lattice sieving and polynomial selection. But these things don't significantly tilt the exponent you expect in a factorization. As in, you expected around 2^80 "operations" for RSA-1024 back in 1995, and that's still roughly what you expect today.

The practical NFS improvements since then might bring that down to 2^78 or even 2^72 (and that's a lot of money saved), but when you're thinking about which key size to use, it matters far less than the fact that 2^80 computation is reasonably practical today, whereas it was unthinkable in 1995.
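
To put numbers on that, here's a back-of-the-envelope evaluation of the NFS heuristic L_N[1/3, (64/9)^{1/3}] in Python. It's a sketch only: it drops the o(1) term and any calibration against completed factorizations, so the absolute exponents run a few bits high.

    # Heuristic NFS cost, L_N[1/3, (64/9)^(1/3)], with the o(1) term dropped.
    # A rough sketch; real estimates are calibrated against completed
    # factorizations (e.g. RSA-512, RSA-768) rather than read off the formula.
    import math

    def nfs_log2_cost(bits):
        ln_n = bits * math.log(2)            # ln N for a `bits`-bit modulus N
        c = (64 / 9) ** (1 / 3)              # NFS constant, ~1.923
        ln_cost = c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
        return ln_cost / math.log(2)         # express the cost as a power of 2

    for bits in (512, 768, 1024, 2048):
        print(bits, round(nfs_log2_cost(bits)))

The raw formula gives roughly 2^64, 2^87, and 2^117 for 512-, 1024-, and 2048-bit moduli; the commonly quoted ~2^80 for RSA-1024 is what you get once the formula is calibrated against real factorizations.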



