I think we are largely in agreement there. Perhaps with a differing view as to what to do about the problems, or even accepting that they are problems...
Not C++ !!!!!!! C only.
Amen brother. As I have often said, I don't believe there is even one human that knows and understands all the details of the C++ spec, let alone how all those parts work together. I have seen even Stroustrup falter at understanding why a ten line function did not work as one might expect.
No, it's supposed to do what the standard says; if not, it's a bug.
But therein lies the problem. The standard does not say. In fact, if you take the standard at face value, everything is undefined behavior!
For example, take a simple function signature like so:
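The signature itself appears to have been lost in posting; judging from the questions below, it was something along these lines (the function name here is invented for illustration):

```c
/* Hypothetical reconstruction of the signature under discussion --
   a function taking a plain char pointer, nothing more specified. */
void process(char *p);
```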
Now tell me:
1) Is that parameter a pointer to a single character (which is what it looks like, to be honest)?
2) Is that character a signed or unsigned value?
3) Or is it a pointer to a bunch of consecutive characters?
4) Is that bunch of consecutive characters a C string with a null termination?
5) Or is it just an array of bytes?
6) Or is it a null pointer?
7) Or is it some random pointer, perhaps from somewhere uninitialized?
8) Is this function going to modify that bunch of characters?
9) Is that bunch of characters somewhere on the stack or on the heap?
10) If the latter, is this function supposed to free() that bunch of characters or leave it to the caller?
11) What happens if this function does a realloc on the memory pointed at by that pointer? How many other parts of the code have that old pointer value and will get broken?
12) If it's an array of characters how long is it?
13) Don't get me started on the return value.
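Question 11 deserves a concrete illustration, since it bites even experienced C programmers. A minimal sketch (the function name is made up):

```c
#include <stdlib.h>

/* Hypothetical callee that grows the buffer it was handed.
   realloc may move the allocation, silently invalidating every
   other copy of the old pointer value held elsewhere in the code. */
char *grow(char *buf, size_t new_len) {
    return realloc(buf, new_len);
}
```

After a call like `p = grow(p, big);`, any other variable still holding the old address is dangling; per the C standard, even comparing the stale pointer against the new one is formally undefined behavior.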
In one line of code we have a ton of undefined stuff. Yes, I know, using modern-day types like uint8_t helps, and yes, use of const helps, but basically you need to be checking the documentation at every step here to be sure what is going on.
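To be fair to those mitigations, a sketch of how much of the ambiguity careful types can remove (names invented for illustration):

```c
#include <stddef.h>
#include <stdint.h>

/* const answers "will you modify it?" (no); uint8_t answers
   "signed or unsigned?" (unsigned raw bytes, not a C string);
   an explicit len answers "how long is it?". */
size_t count_zeros(const uint8_t *buf, size_t len) {
    size_t n = 0;
    for (size_t i = 0; i < len; i++)
        if (buf[i] == 0)
            n++;
    return n;
}
```

Even so, C's type system still cannot express ownership, null-ness, or who frees the buffer, so the documentation check remains unavoidable.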
The thing is, you talk of "myriads" of bugs in the Linux kernel, and yet I have been using Linux for decades and never see them.
I have also been using Linux since 1996 and I am constantly amazed that I don't see them either. That does not mean they are not there. There have been plenty of security issues raised on the kernel. There are a lot of people working to preemptively find all these undefined behaviors before the bad guys do:
Making C Less Dangerous in the Linux kernel: https://www.youtube.com/watch?v=FY9SbqTO5GQ&t=2361s
Does making the kernel harder make making the kernel harder?: https://www.youtube.com/watch?v=Gtjy7pWjW9M
It's much the same situation in any large code base. Even the small ones I have worked on.
I'm not against progress, and I may take another look at Rust one day. But coding is always hard to get right, without additionally having to worry about the growing pains of a new language.
I'm with you there. It's not a trivial matter to move from an old thing that you know all the problems with to a new thing that may have a bunch of other problems you don't know about.
With "for( int i = 0; i < 100; i += 2 )", what's the point in checking the += 2 addition for overflow?
Quite so. Sometimes the possibility of overflow can be deduced from the source code at compile time.
Languages like Ada and Rust try to constrain the syntax and semantics to make it even more possible, such that actual run time checks are not needed.
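A C sketch of the loop in question: since i starts at 0 and the loop exits as soon as i reaches 100, i never exceeds 100, far below INT_MAX, so the absence of overflow in the += 2 can be proven from the source alone, with no run-time check needed.

```c
/* i takes the values 0, 2, ..., 98; after the final increment it is
   exactly 100 and the loop exits. Overflow is provably impossible. */
int sum_evens_below_100(void) {
    int sum = 0;
    for (int i = 0; i < 100; i += 2)
        sum += i;
    return sum;  /* 0 + 2 + ... + 98 */
}
```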
Memory in C++ is a leaky abstraction.