sal55
Posts: 63
Joined: Sat Sep 21, 2019 7:15 pm

Re: The Rust debate.

Thu Oct 03, 2019 5:02 pm

jahboater wrote:
Thu Oct 03, 2019 3:27 pm
plugwash wrote:
Thu Oct 03, 2019 3:06 pm
jahboater wrote:
Thu Oct 03, 2019 2:52 pm
I thought all unsigned arithmetic was defined.
The problem is that if int is larger than 16 bits, then the code snippet isn't unsigned arithmetic; it's conversion to int, followed by signed arithmetic, followed by conversion back to uint16_t.
OK, if the original operand fits in a signed int then that's what it is promoted to.
Even though it would also fit in an unsigned int which would be more sensible!
Rust has its own odd behaviours:

Code: Select all

    let a:u8=255;
    let b:u8=1;
Here, a+b and a+1 both give a result of 0, even if assigning the result to a wider variable. In C, everything is promoted to 'int' (typically 32 bits), so the result is 256 if you use it as u16 or wider, or 0 if you store it back into a u8. But then you get the same problem adding 0xFFFFFFFF and 1, if you want a 64-bit result of 0x100000000.

(This is in Rust release mode; in debug mode it will fail with an overflow error.)

It seems to me that this makes Rust lower level than C, as well as much stricter, as every expression will be of exactly 8, 16, 32 or 64 bits, and both operands of a binary op will always be the same width.

The type of a literal constant like 1 seems to be adjusted to that of the other operand, so it can be u8, u16 or u32, unless both operands are constants, when the wider type is used. However, expressions such as 2000000000+2000000000 or 1<<62 overflow; you have to use 1u64<<62, as an unconstrained literal seems to default to 32 bits.

So still a little messy. (My own languages are 64-bit, since I thought all hardware was by now; that kicks those cans far enough down the road that all these examples give the expected results without needing to do anything special.)

Getting back to C: if you were to draw up a chart of the 64 combinations of i8/i16/i32/i64 and u8/u16/u32/u64, showing whether each binary operation is done as signed or unsigned, the results would not have the regular pattern you might expect, partly due to the discontinuity between 32-bit and 64-bit types. I think Rust solves this at least, by not allowing such mixed arithmetic!
It won't be undefined behavior for very long. Two's complement is now deemed to be universal.
Too many high-end C compilers rely on undefined behaviour for them to be able to do their optimisations.

John_Spikowski
Posts: 1614
Joined: Wed Apr 03, 2019 5:53 pm
Location: Anacortes, WA USA

Re: The Rust debate.

Thu Oct 03, 2019 5:13 pm

For me, the two biggest reasons I use C are portability and the extensive code/library base that comes standard with Linux's essential development tools.

paddyg
Posts: 2588
Joined: Sat Jan 28, 2012 11:57 am
Location: UK

Re: The Rust debate.

Thu Oct 03, 2019 5:17 pm

@sal55, Rust might have a slight inconsistency there but

Code: Select all

let a:u64 = 1 << 62;
works OK on my laptop
also https://groups.google.com/forum/?hl=en-GB&fromgroups=#!forum/pi3d

jahboater
Posts: 6286
Joined: Wed Feb 04, 2015 6:38 pm
Location: Wonderful West Dorset

Re: The Rust debate.

Thu Oct 03, 2019 5:19 pm

sal55 wrote:
Thu Oct 03, 2019 5:02 pm
Too many high-end C compilers rely on undefined behaviour for them to be able to do their optimisations.
What happens is this:
Undefined behavior cannot happen in a correct C program.
Therefore, if a compiler can detect the UB, it's free to delete the code.
This is commonly seen with badly designed a priori checks for signed overflow.

if( a + b < a ) printf("overflow!");

The compiler will simply delete the whole thing, since a + b can never be less than a in a correct program.

John_Spikowski
Posts: 1614
Joined: Wed Apr 03, 2019 5:53 pm
Location: Anacortes, WA USA

Re: The Rust debate.

Thu Oct 03, 2019 5:33 pm

Sure it can. Take A and B as negative numbers, then add them, then compare.

Heater
Posts: 16844
Joined: Tue Jul 17, 2012 3:02 pm

Re: The Rust debate.

Thu Oct 03, 2019 5:45 pm

sal55,
Rust has its own odd behaviours:
Maybe. But not the one you are incorrectly pointing out below.
Here, a+b and a+1 both give a result of 0, even if assigning the result to a wider variable.
No, they don't. And you can't assign the result to a wider variable:

Code: Select all

    let a: u8 = 255;
    let b: u8 = 1;
    let c: u32 = a + b;
    print!("{}", c);
Results in:

Code: Select all

$ cargo run --release
   Compiling test v0.1.0 (/home/pi/test)
error[E0308]: mismatched types
   --> src/main.rs:102:18
    |
102 |     let c: u32 = a + b;
    |                  ^^^^^
    |                  |
    |                  expected u32, found u8
    |                  help: you can convert an `u8` to `u32`: `(a + b).into()`

error: aborting due to previous error
But you can do this:

Code: Select all

    let a: u8 = 255;
    let b: u8 = 1;
    let c: u32 = (a + b).into();        // Gives 0    
    println!("{}", c);
    let c: u32 = a as u32 + b as u32;   // Gives 256    
    println!("{}", c);
You just have to say what you want.
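And if you actually want the wrap-around, or want the overflow reported, the integer types spell that out too. A quick sketch (by no means exhaustive):

Code: Select all

    let a: u8 = 255;
    let b: u8 = 1;
    println!("{}", a.wrapping_add(b));            // 0, wraps modulo 256 in every build mode
    println!("{:?}", a.checked_add(b));           // None, the overflow is reported
    println!("{}", u16::from(a) + u16::from(b));  // 256, widen first, then add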
(This is in Rust release mode; in debug mode it will fail with an overflow error.)
You can have overflow checking in any build mode. Configure it in Cargo.toml or via the rustc command-line options:

Code: Select all

[profile.release]
overflow-checks = true   # enable the checks even in release builds
So still a little messy.
Not messy at all. As you see.
Too many high-end C compilers rely on undefined behaviour for them to be able to do their optimisations.
I don't believe so. One is expected not to use UB.

But if one is using UB, the optimizer is of course at liberty to do what it likes: deleting all your code, returning random numbers, crashing your program, or whatever.

If you are lucky, the compiler will warn you of UB, if it can detect it.
Memory in C++ is a leaky abstraction .

sal55
Posts: 63
Joined: Sat Sep 21, 2019 7:15 pm

Re: The Rust debate.

Thu Oct 03, 2019 6:07 pm

paddyg wrote:
Thu Oct 03, 2019 5:17 pm
@sal55, Rust might have a slight inconsistency there but

Code: Select all

let a:u64 = 1 << 62;
works OK on my laptop
I used:

Code: Select all

println!("{}"), 1<<62);
for my examples, to avoid influencing the result.
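Presumably with no type on the left to steer it, a bare literal defaults to a 32-bit int and the shift is rejected at compile time, whereas a suffix or a typed destination makes it work. Something like:

Code: Select all

    // println!("{}", 1 << 62);   // bare literal defaults to i32: rejected at compile time
    println!("{}", 1u64 << 62);   // fine: prints 4611686018427387904
    let a: u64 = 1 << 62;         // also fine: the u64 context sets the literal's width
    println!("{}", a);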

sal55
Posts: 63
Joined: Sat Sep 21, 2019 7:15 pm

Re: The Rust debate.

Thu Oct 03, 2019 6:27 pm

Heater wrote:
Thu Oct 03, 2019 5:45 pm
Here, a+b and a+1 both give a result of 0, even if assigning the result to a wider variable.
No, they don't.

Code: Select all

fn main() {
    let a=255 as u8;
    let b=1 as u8;
    let c=255 as u16;
    let d=1 as u16;

    println!("{}",a+b);
    println!("{}",c+d);
}
This gives results of 0 and 256, even though both calculations are 255+1.
But you can do this:

Code: Select all

    let a: u8 = 255;
    let b: u8 = 1;
    let c: u32 = (a + b).into();        // Gives 0    
    println!("{}", c);
    let c: u32 = a as u32 + b as u32;   // Gives 256    
    println!("{}", c);
You just have to say what you want.

Not messy at all. As you see.
Well, it is a bit messy and fiddly. It used to be just in assembly that you had to specify one of ADD AL,BL, ADD AX,BX, ADD EAX,EBX, or ADD RAX,RBX; you don't expect that in an HLL!

You don't want to worry about intermediate overflows until a result has to fit into a specific width (or unless you are overflowing the machine word size anyway).

ETA: just noticed a couple of anti-patterns in your example, first, the re-use of the 'c' identifier (apparently the same identifier can be re-used in the same scope for different purposes and even with different types).

Second, this one:

Code: Select all

   let c: T = a as T + b as T;
Notice the type T occurs 3 times. If the first T is changed here, the others must be changed too, with possible changes of behaviour. (I'm interested in language design; I notice such things. Not that I will ever use Rust as it seems like trying to code with both hands tied behind your back, even if compilation was blazing fast.)

paddyg
Posts: 2588
Joined: Sat Jan 28, 2012 11:57 am
Location: UK

Re: The Rust debate.

Thu Oct 03, 2019 7:39 pm

sal55 there are things about rust that are annoying and hard to get used to, but the definition and casting of integers is not one of them. Basically you get exactly what you ask for explicitly without any compiler deciding what it thinks you want. You could also do

Code: Select all

let c = a as u32 + b as u32;
or 
let c = (a + b) as u32;
but you can't do things like
let c = a as u32 + b;
let c:u64 = a as u32 + b;
also https://groups.google.com/forum/?hl=en-GB&fromgroups=#!forum/pi3d

sal55
Posts: 63
Joined: Sat Sep 21, 2019 7:15 pm

Re: The Rust debate.

Thu Oct 03, 2019 8:36 pm

paddyg wrote:
Thu Oct 03, 2019 7:39 pm
sal55 there are things about rust that are annoying and hard to get used to, but the definition and casting of integers is not one of them. Basically you get exactly what you ask for explicitly without any compiler deciding what it thinks you want. You could also do

Code: Select all

let c = a as u32 + b as u32;
or 
let c = (a + b) as u32;
but you can't do things like
let c = a as u32 + b;
let c:u64 = a as u32 + b;
Well, it's a rather micro-managed approach. I think I would agree with some of it when it comes to mixing signed and unsigned values, as it can be difficult to know what is intended. But even 2-3 can give an illogical result with purely unsigned types; it will either overflow or wrap, and wrap to different results depending on the widths involved.

IMO there is too much preoccupation with these small types. Normally individual variables, parameters etc won't be 8 or 16 bits; they would be a machine word: 32 or more commonly now 64 bits. So intermediate calculations will be done at that width (which is how C works; ahead of its time in some ways, except that its int is stuck at 32 bits). You'd only use narrow types for storage purposes: inside structs and arrays to save memory and bandwidth, or to match the layout of some struct for interfacing.
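To put it in Rust terms (just a rough sketch of what I mean), you can keep the narrow types in the structs and arrays and widen as soon as you compute, although you have to spell the widening out yourself:

Code: Select all

// Hypothetical 16-bit samples stored as two u8 fields, to match some external layout.
struct Sample { lo: u8, hi: u8 }

fn total(samples: &[Sample]) -> u32 {
    samples.iter()
           .map(|s| u32::from(s.lo) + (u32::from(s.hi) << 8))  // widen, then do the arithmetic
           .sum()
}

fn main() {
    let samples = [Sample { lo: 255, hi: 1 }, Sample { lo: 1, hi: 0 }];
    println!("{}", total(&samples));   // 512
}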

So I'm not sure I agree with how Rust does things.
@sal55 I noticed Nim on your comparison sheet - have you done anything serious with that?
No, I was only looking at it for good ideas to steal. But there is a LOT in there; far too much I think. It looks like they wanted to cram in everything they could possibly think of.

paddyg
Posts: 2588
Joined: Sat Jan 28, 2012 11:57 am
Location: UK

Re: The Rust debate.

Thu Oct 03, 2019 9:59 pm

On occasion it does seem a bit micro-managed, as you say. But often it's a matter of choosing (or later revising) the original types so that casting isn't needed everywhere. u8 is essentially the byte type, so byte arrays crop up a lot (along with all the string and Unicode char stuff, the ugliness of which was commented on earlier in this thread). u16 was chosen as the element array buffer unit in OpenGL ES 2; presumably when the spec was introduced, phone GPUs couldn't cope with very many vertices and didn't have much memory. There are probably other things like that which, ten years down the line, sit in the pockets of billions of people and so have to be catered for by programming languages.
also https://groups.google.com/forum/?hl=en-GB&fromgroups=#!forum/pi3d

Heater
Posts: 16844
Joined: Tue Jul 17, 2012 3:02 pm

Re: The Rust debate.

Thu Oct 03, 2019 10:23 pm

sal55,
This gives results of 0 and 256, even though both calculations are 255+1.
Seems correct and expected to me.

In the first case you are dealing with 8 bit numbers and 256 is not even a possible result. 0 is the only possible result provided you have overflow checks off. Or would you prefer just any random number for UB?

In the second case you are dealing with 16 bit numbers and 256 is presumably what you expect.
Well, it is a bit messy and fiddly. It used to be just in assembly that you had to specify one of ADD AL,BL, ADD AX,BX, ADD EAX,EBX, or ADD RAX,RBX; you don't expect that in an HLL!
I have to disagree. If you are going to ignore the type specifications, as you seem to be suggesting, then there is no point in having those type specifications; do what JavaScript does and just have numbers.

As it happens there are very good reasons for having such types, even in a high-level language.
ETA: just noticed a couple of anti-patterns in your example, first, the re-use of the 'c' identifier (apparently the same identifier can be re-used in the same scope for different purposes and even with different types).
Now there you have a point. When I noticed that "shadowing" of variables was possible in Rust I was very surprised. I even asked about it on the Rust language user forum.

The upshot is that such shadowing is not a problem in Rust, thanks to the strict type mechanisms, and in fact it has advantages of its own.

I don't much like the term "pattern" and positively hate "anti-pattern". As some famous software guru once said, "All software patterns are just rules to follow to make sure you don't get tripped up by a deficiency of your programming language." The opposite of "pattern" is "chaos". Let's just say that variable name shadowing is a different pattern, one that works in Rust because of its other features.
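The usual illustration is the parse-and-shadow idiom; a trivial sketch:

Code: Select all

fn main() {
    let input = "42";                        // a &str
    let input: u32 = input.parse().unwrap(); // same name, different type, old binding gone
    let input = input + 1;                   // shadowed again
    println!("{}", input);                   // 43
}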
Notice the type T occurs 3 times. If the first T is changed here, the others must be changed too, with possible changes of behaviour.
Which is what we would expect I hope.
I'm interested in language design; I notice such things.
I'm no language guru but also find language design and the reasoning behind language feature choice interesting.
Not that I will ever use Rust as it seems like trying to code with both hands tied behind your back.
That is exactly what it is. Always there to stop you writing bad code. Although it is giving you power rather than taking it away. The power to create memory and thread safe code. The power to go in and refactor/modify/extend large programs and be sure you have not accidentally broken anything in that respect.
IMO there is too much preoccupation with these small types. Normally individual variables, parameters etc won't be 8 or 16 bits; they would be a machine word: 32 or more commonly now 64 bits.
No.

In my world having to deal with bits, bytes and whatever is an everyday occurrence. One of my first Rust programs was a decoder of messages in a proprietary format. It involves pulling bit fields out of messages and assembling them into meaningful numbers and structures. Rust's small types and strict checking were very useful there.
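Roughly the kind of thing I mean, as a toy sketch (not the real format, of course):

Code: Select all

// Pull a small bit field out of a 16-bit word: `width` bits starting at bit `shift`.
fn field(word: u16, shift: u32, width: u32) -> u16 {
    (word >> shift) & ((1u16 << width) - 1)
}

fn main() {
    let msg: u16 = 0b0000_0101_0011_0000;
    println!("{}", field(msg, 4, 8));   // 83, the byte sitting at bits 4..11
}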

These small types are also essential for anyone dealing with hardware.

Even without any of that, many people want to minimize storage space and/or bandwidth by using smaller types.

Rust is a systems programming language, hence those small number types are essential.

For those who don't need that there is always Python or Javascript.
Memory in C++ is a leaky abstraction .

sal55
Posts: 63
Joined: Sat Sep 21, 2019 7:15 pm

Re: The Rust debate.

Fri Oct 04, 2019 12:29 am

Heater wrote:
Thu Oct 03, 2019 10:23 pm
IMO there is too much preoccupation with these small types. Normally individual variables, parameters etc won't be 8 or 16 bits; they would be a machine word: 32 or more commonly now 64 bits.
No.

In my world having to deal with bits, bytes and whatever is an everyday occurrence. One of my first Rust programs was a decoder of messages in a proprietary format. It involves pulling bit fields out of messages and assembling them into meaningful numbers and structures. Rust's small types and strict checking were very useful there.

These small types are also essential for anyone dealing with hardware.

Even without any of that, many people want to minimize storage space and/or bandwidth by using smaller types.

Rust is a systems programming language, hence those small number types are essential.
I'm not sure you get what I'm saying. I'm not doing away with narrow types, just saying use them for memory storage. You don't also need that narrow processing inside the CPU registers or ALU.

For example, my own dynamic language, where execution normally deals with boxed variant values and integer calculations are done at a width of at least 64 bits, also has packed arrays. Not only of 8, 16, 32 and 64 bits, but also of 1, 2 and 4 bits.

If I take a 2-bit array where A[1]=2 and A[2]=2 (values can only be 0, 1, 2, 3), then the following:

Code: Select all

println A[1] + A[2]
will print 4. But following the same logic as Rust uses, it should display 0 (4 modulo 4), which would be crazy. Who cares what narrow packed location those numbers came from; 2+2 is 4. If I was going to store the result back into memory at A[3], then I'd need to be aware the value would be truncated (and if this was anything like Rust, it might report an error).

Otherwise, for any other purpose (displaying, assigning to any variable, passing to a function, performing further calculations), its provenance is irrelevant. This is for a somewhat higher level language, but even C uses this approach! (There it uses 'int' as the default arithmetic width, usually 32 bits.)

Similarly, if you took the binary value 1011, and added up the 4 individual bits, you would get 3, not 1, because the result doesn't need to be 1 bit wide too!
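(Rust's own standard library agrees on that one, as it happens: counting the set bits of a u8 hands you back a u32.)

Code: Select all

    println!("{}", 0b1011u8.count_ones());   // 3, and the result is a u32, not a 1-bit value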

Actually, I'm not even sure the ARM architecture directly supports arithmetic operations on values of arbitrary width (I think operands can be loaded or stored at any width).
Last edited by sal55 on Fri Oct 04, 2019 12:41 am, edited 2 times in total.

Heater
Posts: 16844
Joined: Tue Jul 17, 2012 3:02 pm

Re: The Rust debate.

Fri Oct 04, 2019 12:32 am

jcyr,
The nanny language... I'd rather take responsibility for my bad code
I'm not sure how much you are joking there but that is a position expressed by many programmers. As seen in this thread and elsewhere. I get the feeling it especially comes from "old hands" who have spent years mastering their craft and are firmly convinced of their superior skills.

Trust me, I have much the same feeling.

I find it a bit paradoxical though. One might expect that those with such experience would be acutely aware of their fallibility and propensity for error. That they would be very welcoming of all the help they can get ensuring things are correct.

But really, do you really take responsibility for your bad code? If you create a buffer overrun in code that eventually causes grief and expense for your customers, or results in a security breach in sensitive systems, does that come out of your salary? Is there an investigation? Do you get fired? Contrast that with doctors making mistakes, say. I bet you don't.

At the end of the day it does not matter what bad code you create for yourself. It's the end users who suffer. We should be ashamed of ourselves for shipping such shoddy goods.
Memory in C++ is a leaky abstraction .

sal55
Posts: 63
Joined: Sat Sep 21, 2019 7:15 pm

Re: The Rust debate.

Fri Oct 04, 2019 12:41 am

Heater wrote:
Fri Oct 04, 2019 12:32 am
jcyr,
The nanny language... I'd rather take responsibility for my bad code
I'm not sure how much you are joking there but that is a position expressed by many programmers. As seen in this thread and elsewhere. I get the feeling it especially comes from "old hands" who have spent years mastering their craft and are firmly convinced of their superior skills.

Trust me, I have much the same feeling.

I find it a bit paradoxical though. One might expect that those with such experience would be acutely aware of their fallibility and propensity for error. That they would be very welcoming of all the help they can get ensuring things are correct.

But really, do you really take responsibility for your bad code? If you create a buffer overrun in code that eventually causes grief and expense for your customers, or results in a security breach in sensitive systems, does that come out of your salary? Is there an investigation? Do you get fired? Contrast that with doctors making mistakes, say. I bet you don't.

At the end of the day it does not matter what bad code you create for yourself. It's the end users who suffer. We should be ashamed of ourselves for shipping such shoddy goods.
I think everyone would like important, mission-critical software to be written in languages like Ada and Rust. But they want it written by other people!

Lots of other software isn't so critical. There will be bugs and logic errors, but those aren't going to be fixed by wasting time baby-sitting every little arithmetic operation. Having the overflow traps in place ought to be enough, if something is not meant to overflow.

jahboater
Posts: 6286
Joined: Wed Feb 04, 2015 6:38 pm
Location: Wonderful West Dorset

Re: The Rust debate.

Fri Oct 04, 2019 1:50 am

sal55 wrote:
Fri Oct 04, 2019 12:29 am
Actually, I'm not even sure the ARM architecture directly supports arithmetic operations on values of arbitrary width (I think operands can be loaded or stored at any width).
Yes.
ARM can only do arithmetic on full 32-bit registers (or, in 64-bit mode, on 64-bit registers).
You can load and store 8, 16, 32, or 64 bits.

I think it's CPUs like ARM that caused the C designers to adopt this "promote all small types to int" rule before doing arithmetic. C arithmetic and ARM CPUs are very well matched.

Intel x86 can properly do arithmetic on 8, 16, 32, or 64 bit numbers, including long multiply and division, and it correctly sets all the flags, even the overflow flag.

jahboater
Posts: 6286
Joined: Wed Feb 04, 2015 6:38 pm
Location: Wonderful West Dorset

Re: The Rust debate.

Fri Oct 04, 2019 2:02 am

Heater wrote:
Fri Oct 04, 2019 12:32 am
I find it a bit paradoxical though. One might expect that those with such experience would be acutely aware of their fallibility and propensity for error. That they would be very welcoming of all the help they can get ensuring things are correct.
Exactly.
That's why we use tools like valgrind, or the sanitizers, or -ftrapv, or assert(), all the time, to check our code at run time, and any kind of static checking available at compile time - even if it's just maximum warning levels.

I suspect that valgrind does a better job than Rust at run-time error detection (yes, I know it's slow).

Having said that, I cannot remember the last time I had a buffer overrun, or a signed integer overflow for that matter; probably a decade or more ago.

Of course testing with valgrind doesn't cover all possible cases, and customers always do unexpected things with your program. For them, code in Rust and specify the minimum hardware requirements to be powerful enough to cope with the hideously bloated code that Rust seems to produce :)
Last edited by jahboater on Fri Oct 04, 2019 2:22 am, edited 1 time in total.

Heater
Posts: 16844
Joined: Tue Jul 17, 2012 3:02 pm

Re: The Rust debate.

Fri Oct 04, 2019 5:14 am

jcyr,
...but bugs will always creep in anyway....
No doubt true.
Firefox on Windows sometimes fails to shut down properly despite being written in Rust.
Firefox is not written in Rust.

Very recently, small parts of Firefox have been written in Rust (the Stylo CSS engine, for example), mostly as a means to make use of the parallel processing we can do on our multi-core machines today.
No language will ever detect logic errors (as in, the chosen algorithm is just plain wrong).
Very true. But the major motivation for the use of languages like Rust is to tackle the roughly 60% of security vulnerabilities that are attributable to memory-handling errors, which would not have been possible if a memory safe language had been used.

Note: Approximate percentage figure taken from Microsoft and others: See:

We need a safer systems programming language: https://msrc-blog.microsoft.com/2019/07 ... -language/
Writing Linux Kernel Modules in Safe Rust: https://www.youtube.com/watch?v=RyY01fRyGhM
Memory in C++ is a leaky abstraction .

John_Spikowski
Posts: 1614
Joined: Wed Apr 03, 2019 5:53 pm
Location: Anacortes, WA USA

Re: The Rust debate.

Fri Oct 04, 2019 6:05 am

Wow!

That has to be a huge boost for Rust when Microsoft recommends it as the best memory safe kernel level programming language.

PeterO
Posts: 5968
Joined: Sun Jul 22, 2012 4:14 pm

Re: The Rust debate.

Fri Oct 04, 2019 7:25 am

Heater wrote:
Fri Oct 04, 2019 12:32 am
I find it a bit paradoxical though. One might expect that those with such experience would be acutely aware of their fallibility and propensity for error. That they would be very welcoming of all the help they can get ensuring things are correct.
I think it might be a matter of "presentation" and other non-language-detail issues.

The designers of such tools and languages often make brash claims like "No more bugs", which we all know is impossible because not all bugs can be detected by the compiler and runtime; sometimes the algorithm is wrong, or is incorrectly implemented in perfectly valid code.

They claim to solve classes of problems that "old hands" may never have encountered, or for which they already have solutions.

They are often young "johnny-come-latelies" who can't possibly have enough experience to understand the real problems ;)

The new languages can take a "big bang" and "not invented here" approach rather than making incremental improvements to an existing language or retaining well known elements of syntax from older languages.

As with all these sorts of things, the good ones will flourish and the others will wilt and die away.

Right, I'm off to write some Algol-60 :lol:

PeterO
Discoverer of the PI2 XENON DEATH FLASH!
Interests: C,Python,PIC,Electronics,Ham Radio (G0DZB),1960s British Computers.
"The primary requirement (as we've always seen in your examples) is that the code is readable. " Dougie Lawson

Heater
Posts: 16844
Joined: Tue Jul 17, 2012 3:02 pm

Re: The Rust debate.

Fri Oct 04, 2019 8:11 am

PeterO,

Those are good points.

There has always been a lot of snake oil in the software industry, overrun by marketing as it often is. Hardly surprising: huge fortunes have been made in the industry. One should always be aware of this.

To be sure, Rust does not come from snake oil salesmen. Like the Linux kernel, it was a hobby project of Graydon Hoare for many years before Mozilla and others got interested.
The new languages can take a "big bang" and "not invented here" approach rather than making incremental improvements to an existing language or retaining well known elements of syntax from older languages.
That is an interesting point.

Rust of course makes use of well known elements of syntax and semantics from much older languages, notably C (1972) and Haskell (1990).

Meanwhile, the poster child of "making incremental improvements to an existing language or retaining well known elements of syntax from older languages" is C++, which has notoriously become a complete mess of a language and totally failed to address the memory and thread safety concerns that Rust has targeted.

Fear not, Rust is enough like C to keep C programmers happy, and importantly I can see it has ALGOL in its genes :)

Speaking of which, I read that there is a lot of interest in Rust coming from JavaScript and Python programmers. That's a bit surprising; one would not normally expect that demographic to be keen on getting into a systems programming language. It seems the fact that they cannot accidentally and silently create memory unsafe programs is very attractive for them.
Memory in C++ is a leaky abstraction .

hippy
Posts: 8543
Joined: Fri Sep 09, 2011 10:34 pm
Location: UK

Re: The Rust debate.

Fri Oct 04, 2019 11:53 am

Heater wrote:
Fri Oct 04, 2019 8:11 am
Speaking of which, I read that there is a lot of interest in Rust coming from JavaScript and Python programmers. That's a bit surprising; one would not normally expect that demographic to be keen on getting into a systems programming language. It seems the fact that they cannot accidentally and silently create memory unsafe programs is very attractive for them.
At least for Python the interest appears to be in creating native extension modules in Rust rather than C, for the same reasons I guess anyone would suggest using Rust instead of C.

PyO3 (https://github.com/PyO3/pyo3) allows Rust to be used to create native extension modules for Python as well as allowing Rust to call Python programs.
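Going by the PyO3 README, a minimal extension module is roughly this shape (the module and function names here are made-up examples, and the exact API differs a little between PyO3 versions):

Code: Select all

use pyo3::prelude::*;

/// A trivial function exposed to Python.
#[pyfunction]
fn sum_as_string(a: usize, b: usize) -> PyResult<String> {
    Ok((a + b).to_string())
}

/// Importable from Python as `my_rust_ext`.
#[pymodule]
fn my_rust_ext(_py: Python, m: &PyModule) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(sum_as_string, m)?)?;
    Ok(())
}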

I have my own utility to generate C interfaces for Python native extensions from simple one-line definitions, where one then just has to add the actual C code; that could be extended to generate Rust interfaces as well. Similarly for SWIG and other tools that do this.

My utility also creates native C extensions for Script Basic, so extending it that way would also allow native Rust extensions for Script Basic to be created.

I have added both to my ever-growing 'list of things to do'.

sal55
Posts: 63
Joined: Sat Sep 21, 2019 7:15 pm

Re: The Rust debate.

Fri Oct 04, 2019 12:13 pm

jahboater wrote:
Fri Oct 04, 2019 1:50 am
sal55 wrote:
Fri Oct 04, 2019 12:29 am
Actually, I'm not even sure the ARM architecture directly supports arithmetic operations on values of arbitrary width (I think operands can be loaded or stored at any width).
Yes.
ARM can only do arithmetic on full 32-bit registers (or, in 64-bit mode, on 64-bit registers).
You can load and store 8, 16, 32, or 64 bits.

I think it's CPUs like ARM that caused the C designers to adopt this "promote all small types to int" rule before doing arithmetic. C arithmetic and ARM CPUs are very well matched.
Yes, half a century ago (when C was devised) there would have been a greater variety of architectures, with byte-addressable ones not as dominant as now. But also, C was a derivative of B, which in turn came from BCPL, which as I understand it had only one type: a 'word'. So perhaps no one really knows why C works the way it does...
Intel x86 can properly do arithmetic on 8, 16, 32, or 64 bit numbers, including long multiply and division, and it correctly sets all the flags, even the overflow flag.
My first exposure to programming was at college in the mid-70s on a machine with a 36-bit word size (and using Algol 60 which has been mentioned; IMO it is nothing like Rust...). However, when my 'real' work started it was on 8-bit processors. There, using 8-bit adds for 8-bit types, and 16-bit adds for 16-bit, made sense, as widening everything to 16-bit was too inefficient. (It still is, so I understand some C compilers for such devices bend the rules.)

However, I also used the approach of taking the dominant type in a mixed-type operation, so u8+u16 would be done as u16. Both methods (u8+u8 done as u8, or both operands widened to u32 or u64) have their advantages, but I decided recently to use the latter.

First, because it gave consistent results for a+b (both u8, containing 255 and 1) and a+1, with both yielding 256. (1 is a 64-bit constant, so the dominant-type rule means a is widened.)

Second, not all processors are like x86, so trying to emulate that behaviour (u8+u8 yielding u8) could be costly (e.g. needing to truncate the result).

PeterO
Posts: 5968
Joined: Sun Jul 22, 2012 4:14 pm

Re: The Rust debate.

Fri Oct 04, 2019 12:22 pm

jcyr wrote:
Fri Oct 04, 2019 11:53 am
PeterO wrote:
Fri Oct 04, 2019 7:25 am
Right, I'm off to write some Algol-60 :lol:
Perhaps you've worked with a Burroughs B6700, a stack and descriptor based machine optimized to run Algol?
Some light reading/watching for you :-)
https://www.youtube.com/watch?v=Wa7KVU_e8U8
https://www.tnmoc.org/notes-from-the-mu ... -the-diode
http://www.billp.org/ccs/ElliottAlgol/
PeterO
Discoverer of the PI2 XENON DEATH FLASH!
Interests: C,Python,PIC,Electronics,Ham Radio (G0DZB),1960s British Computers.
"The primary requirement (as we've always seen in your examples) is that the code is readable. " Dougie Lawson

Heater
Posts: 16844
Joined: Tue Jul 17, 2012 3:02 pm

Re: The Rust debate.

Fri Oct 04, 2019 2:43 pm

sal55,
My first exposure to programming was at college in the mid-70s on a machine with a 36-bit word size (and using Algol 60 which has been mentioned; IMO it is nothing like Rust...).
My first exposure to a high level language, after some BASIC, was ALGOL in the mid-70s. I didn't have much time to get into it as I was supposed to be studying something else at the time. But when I look at Rust I can't help seeing ALGOL. I see: nested functions, recursive functions, structures, unions, arrays, structured conditionals and loops, case statements, sum types, pattern matching, everything-is-an-expression, modules... and importantly an emphasis on memory safety. Maybe it's my rose-tinted glasses or X-ray specs, but that was all in ALGOL.

Admittedly the surface syntax is very different, leaning towards C style, but the ALGOL genes are definitely there.
However, when my 'real' work started it was on 8-bit processors. There, using 8-bit adds for 8-bit types, and 16-bit adds for 16-bit, made sense, as widening everything to 16-bit was too inefficient. (It still is, so I understand some C compilers for such devices bend the rules.)
We have trodden a similar path...

We should not get confused between what the programming language defines and what the hardware can or cannot do. Perhaps the machine can do arithmetic on 8 bits or 16, or perhaps it can only do everything as 32 bits; whatever. If the language has 8 and 16 bit types and defines how operations on them work, then how the generated instructions get that done is neither here nor there. Modulo performance differences, of course.
Memory in C++ is a leaky abstraction .
