Hi Folks,
I've just been reading a shocking story on R-devel (the developers' mailing list for the R statistical programming language). Quote:
"Over on R-help, the old problem of floating point precision has come up again (see my example below, where calling RSiteSearch can change the results of the var() function).
The problem here is that on Windows many DLLs set the precision of the fpu to 53 bit mantissas, whereas R normally uses 64 bit mantissas. (Some Microsoft docs refer to these as 64 bit and 80 bit precision respectively, because they count the sign and exponent bits too).
When R calls out to the system, if one of these DLLs gets control, it may change the precision and not change it back. This can happen for example in calls to display a file dialog or anything else where a DLL can set a hook; it's very hard to predict."
In other words (if I understand aright), a library or other piece of code can change the way the CPU performs floating-point arithmetic, with the result that other code running in the same process will subsequently operate in a different FP environment and get different answers.
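On the hardware side, if I understand it correctly, the state in question is the precision-control field of the x87 control word, which is per-thread and writable by any code running in the process. Below is a minimal sketch of how one might observe this on Linux (my assumptions: x86 or x86-64 with glibc's <fpu_control.h>; I use long double because that arithmetic still goes through the x87 even where plain doubles use SSE):

#include <stdio.h>
#include <fpu_control.h>   /* glibc: _FPU_GETCW, _FPU_SETCW, _FPU_DOUBLE, _FPU_EXTENDED */

static long double one_third(void)
{
    /* volatile forces a run-time x87 divide, whose result is rounded
       to the mantissa width selected by the precision-control bits */
    volatile long double a = 1.0L, b = 3.0L;
    return a / b;
}

static void set_precision(fpu_control_t pc)
{
    fpu_control_t cw;
    _FPU_GETCW(cw);
    cw = (cw & ~0x300) | pc;   /* 0x300 masks the precision-control field */
    _FPU_SETCW(cw);
}

int main(void)
{
    printf("default:          %.21Lg\n", one_third());

    set_precision(_FPU_DOUBLE);     /* 53-bit mantissas, as the Windows DLLs do */
    printf("53-bit mantissas: %.21Lg\n", one_third());

    set_precision(_FPU_EXTENDED);   /* back to 64-bit mantissas */
    printf("64-bit mantissas: %.21Lg\n", one_third());

    return 0;
}

If my understanding is right, the middle line should differ from the other two somewhere around the 17th significant digit, which is exactly the kind of silent change the R folk are describing.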
I'd like to think that Linux is immune to this kind of thing, but I can't see any a priori reason to rule out the possibility.
What are the informed views on this question?
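And, in case it helps the discussion along, the defensive pattern that suggests itself (again only a sketch, assuming glibc's <fpu_control.h>; call_foreign_code() is just a hypothetical stand-in for any library or plugin call one does not control) is to snapshot the control word before such a call and restore it afterwards:

#include <fpu_control.h>

extern void call_foreign_code(void);   /* hypothetical untrusted call */

void guarded_call(void)
{
    fpu_control_t saved;

    _FPU_GETCW(saved);        /* remember the caller's precision/rounding settings */
    call_foreign_code();
    _FPU_SETCW(saved);        /* undo anything the callee may have changed */
}

Whether anything in a stock Linux userland actually does flip those bits behind one's back is, of course, the question I'm asking.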
Best wishes to all, Ted.
--------------------------------------------------------------------
E-Mail: (Ted Harding) Ted.Harding@nessie.mcc.ac.uk
Fax-to-email: +44 (0)870 094 0861
Date: 18-Feb-06       Time: 16:12:56
------------------------------ XFMail ------------------------------