
Is that really the case, though, when the differences in computation would be measured in microseconds but the network noise would be on the order of milliseconds?



I don’t know about that... in the paper, the client and server are on the same network. It would be very interesting to repeat this study using faster processors (which would make the signal smaller) and over the public internet (which would make the noise bigger).


This is why constant time functions are used in cryptographic implementations, even over the network.
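As a minimal sketch of what "constant time" means here: the comparison below touches every byte and accumulates mismatches with XOR/OR, so its running time doesn't depend on where the first differing byte is. (The function name is illustrative; in real Python code you'd reach for the stdlib's `hmac.compare_digest`, shown at the end.)

```python
import hmac

def constant_time_equal(a: bytes, b: bytes) -> bool:
    # Length mismatch is rejected up front; for equal lengths, every
    # byte pair is examined regardless of where mismatches occur, so
    # runtime leaks nothing about the position of the first difference.
    if len(a) != len(b):
        return False
    result = 0
    for x, y in zip(a, b):
        result |= x ^ y  # stays 0 only if every byte pair matches
    return result == 0

# Python's stdlib provides a vetted version of the same idea:
assert constant_time_equal(b"secret", b"secret")
assert not constant_time_equal(b"secret", b"secreT")
assert hmac.compare_digest(b"secret", b"secret")
```

Note that in CPython the loop above isn't truly constant-time at the machine level (interpreter overhead varies); `hmac.compare_digest` delegates to C for exactly that reason.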

These are called timing attacks, and they're less common now because professional cryptographers know how to deal with them. But this is very much a textbook example of one.



