Problem with using a hash or md5sum or similar is that the storage format has to be the same on all implementations...
How is that a problem?
Your program outputs a million decimal digits.
You direct them to a file.
You take the hash of the file.
Or you just pipe the program's output into the hash program
Code: Select all
$ ./fibo | md5sum
Please don't tell me you are using a machine that does not use ASCII and that would cause the hash to be wrong.
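For what it's worth, you don't even need the shell pipe — the same check can be done in a few lines of Python with the standard library. A minimal sketch (the `fib` helper is just for illustration; the point is that the decimal digit string is hashed as plain ASCII, so any implementation that prints the same digits gets the same hash):

```python
import hashlib

def fib(n):
    # iterative Fibonacci; Python ints grow without bound
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

digits = str(fib(100))  # decimal digit string, encoded as ASCII
print(hashlib.md5(digits.encode("ascii")).hexdigest())
```

As long as everyone hashes the same sequence of ASCII digits (no trailing newline differences, no locale separators), the digests match across languages and machines.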
You are doing a good job at making me dislike Python, JS, etc even more. How is a huge number like this represented in a machine native calculable way? I am not aware of any 3.5 million bit CPU register that can do direct calculation.
True. The machine does not have the hardware to do arithmetic directly on billion digit numbers.
That does not mean to say it cannot be done in software and that software can't be built into programming languages.
I fail to see how that would cause you to dislike Python or JS or anything else. Don't forget that programming languages often include many things that the machine has no concept of:
Would you dislike a language that provided 16 and 32 bit operations on an 8 bit machine?
Would you dislike a language that provided floating point maths on an integer-only machine?
Would you hate a language with functions, subroutines, procedures on a machine that had no JMP, RET instructions or a stack?
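To make the point concrete: arbitrary-precision arithmetic needs no giant register, just software that works on the number a machine word at a time. A sketch in Python (function name `fib_fast_doubling` is my own; it uses the standard fast-doubling identities F(2k) = F(k)·(2F(k+1) − F(k)) and F(2k+1) = F(k)² + F(k+1)²):

```python
def fib_fast_doubling(n):
    """Return (F(n), F(n+1)) via fast doubling.
    All arithmetic is on arbitrary-precision Python ints —
    the CPU only ever sees word-sized chunks of the number."""
    if n == 0:
        return (0, 1)
    a, b = fib_fast_doubling(n // 2)  # a = F(k), b = F(k+1)
    c = a * (2 * b - a)               # F(2k)
    d = a * a + b * b                 # F(2k+1)
    if n % 2 == 0:
        return (c, d)
    return (d, c + d)

# F(1,000,000) is a 208,988-digit number; computed entirely in software
print(len(str(fib_fast_doubling(1_000_000)[0])))
```

No million-bit register anywhere — just ordinary multiplications on word-sized limbs under the hood, the same way the 8-bit BASICs did 16-bit maths.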
Heck, BASIC is terrible. It has FOR loops. No machine has such loops built in.
And strings. WTF?