jahboater wrote: ↑
Fri Oct 04, 2019 5:18 pm
Honeywell mainframe by any chance? I used B on one of those (yes, 36-bit "words" only).
C promotes smaller types to "int" before arithmetic. Int is whatever size is naturally best for the machine. I presume the original development PDP had type int as 16 bits; nowadays it's normally 32 bits.
No, it was a DEC PDP10 (I understand it was popular in education). We used the usual Algol 60, Fortran, Cobol and Pascal (replacing Algol). I also used MACRO10, its assembler, to create my first compiler.
No sign of C, however (not even on the PDP11, which I also used). I didn't meet it until many years later, after at least a decade of using my own systems language 'in-house'. (And I can't have been impressed, as I carried on developing my own languages. Actually I have pretty much always done so, and still am.)
What I did hear later about C on the PDP10 was that some implementations used 9-bit bytes (4x9 bits per word). Wouldn't that be great now? (PDP10s were not byte-addressable, but some had byte packing and unpacking instructions, allowing 'byte' sizes of 1 to 36 bits. For packed text, we normally used 6x6 bits (ASCII subset) or 5x7 bits (ASCII) in each word.)
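Something like this rough C sketch shows the idea of 5x7 packing, using a 64-bit integer to stand in for the 36-bit word since modern hardware has nothing native (the function name and layout details are mine, just an illustration):

#include <stdint.h>
#include <stdio.h>

/* Pack five 7-bit ASCII characters into one 36-bit "word",
   held here in a uint64_t. The usual PDP10 layout put the
   characters in the high 35 bits and left the low bit spare. */
uint64_t pack5x7(const char s[5])
{
    uint64_t w = 0;
    for (int i = 0; i < 5; i++)
        w = (w << 7) | (uint64_t)(s[i] & 0x7F);  /* append next 7-bit 'byte' */
    return w << 1;                               /* low bit left unused */
}

int main(void)
{
    /* 36 bits print as 12 octal digits */
    printf("%012llo\n", (unsigned long long)pack5x7("HELLO"));
    return 0;
}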
And, yes, a 16-bit machine will likely have had a 16-bit 'int' type, as would an 8-bit one, actually. But while my 'int' has progressed from 16 to 32 to 64, most C implementations seem to have capped it at 32 bits.
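For what it's worth, the promotion mentioned in the quote is easy to demonstrate on any current compiler, along with whatever size 'int' ended up as (just a quick illustration, nothing more):

#include <stdio.h>

int main(void)
{
    unsigned char a = 200, b = 100;

    /* Both operands are promoted to int before the addition,
       so this prints 300 rather than 44 (300 mod 256). */
    printf("a + b = %d\n", a + b);

    /* On most current compilers this prints 4 (32-bit int),
       even on 64-bit machines. */
    printf("sizeof(int) = %zu\n", sizeof(int));
    return 0;
}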