I'm aware of _FILE_OFFSET_BITS. However, the default integer type is still 32 bits on a 32-bit platform. With care, code can be written that works on both 32-bit and 64-bit architectures, but as the majority of software development is done on 64-bit Intel these days, it is possible to find code that used to work on 32-bit and no longer does. The same goes for little versus big endian, and for ASCII versus EBCDIC versus UTF8 encoding: updates to code that was carefully designed to work on both often leave it working on only one.

jahboater wrote:
I believe if you set _FILE_OFFSET_BITS to 64, all file handling will be 64 bits wide even on 32 bit platforms; it is the default on 64 bits.
http://www.gnu.org/software/libc/manual ... acros.html
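To make that concrete, here is a minimal sketch, assuming glibc on Linux: defining _FILE_OFFSET_BITS to 64 before any header makes off_t 64 bits wide even in a 32-bit build, and using fseeko/ftello together with the fixed-width printf macros keeps the same source correct on both 32-bit and 64-bit targets. The file name big.dat is just a placeholder.

/* Request the 64-bit file interfaces; must come before any system header. */
#define _FILE_OFFSET_BITS 64
/* Make sure fseeko/ftello stay declared even under a strict -std= setting. */
#define _POSIX_C_SOURCE 200809L

#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    FILE *f = fopen("big.dat", "rb");   /* placeholder file name */
    if (!f)
        return 1;

    /* off_t is now 64 bits wide even on 32-bit ARM or x86, so an offset
       past 2 GiB is representable.  Use fseeko/ftello rather than
       fseek/ftell, which still traffic in long. */
    off_t five_gib = (off_t)5 * 1024 * 1024 * 1024;
    if (fseeko(f, five_gib, SEEK_SET) != 0) {
        fclose(f);
        return 1;
    }

    /* Print through a fixed-width type so the format string is the same
       on 32-bit and 64-bit builds. */
    printf("current offset: %" PRId64 "\n", (int64_t)ftello(f));

    fclose(f);
    return 0;
}

Built with a plain gcc -Wall, the same source should behave the same whether the compiler targets 32-bit ARM or aarch64.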
In a quick and unscientific comparison, a program built for aarch64 produced an executable around 10% larger, and ran around 10% faster, than the same program built for 32-bit ARM.
Some Linux configurations assume by default that the terminal used for the command line understands UTF8. The result is that the gcc compiler emits UTF8-encoded diagnostics, which become unreadable on older terminals. There is an advantage in keeping up with fashion; right now the fashion is 64-bit, little endian and UTF8.
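The assumption comes from the locale settings, not the terminal itself. As a rough sketch of the check that locale-aware tools make at startup (an illustration of the mechanism, not gcc's actual source), a program can ask the C library which character encoding the environment claims:

#define _POSIX_C_SOURCE 200809L

#include <langinfo.h>
#include <locale.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Adopt whatever locale the environment requests (LANG, LC_ALL, ...). */
    setlocale(LC_ALL, "");

    /* Ask which character encoding that locale implies. */
    const char *codeset = nl_langinfo(CODESET);
    printf("locale codeset: %s\n", codeset);

    if (strcmp(codeset, "UTF-8") == 0)
        printf("UTF8 assumed: curly quotes and similar are fair game\n");
    else
        printf("non-UTF8 locale: plain ASCII output is the safe choice\n");

    return 0;
}

Running it under LC_ALL=C and then under LC_ALL=en_US.UTF-8 shows the two answers, and the same environment switch is the usual way to coax plain ASCII diagnostics out of gcc on an older terminal.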