I pretty much agree with all that. With a few niggles of course...
While it is important for a language to be expressive to the CPU, other aspects such as readability by the programmer and productivity are also important.
That is what I said here a page or so back.
...the path to digital liberation is unlikely a second curricular reform that replaces Scratch and Python with toggling in machine code using switches and blinking lights,
Actually, an eye-opening part of my early programming experience was almost exactly that. We had the processor but no software, not even an assembler, just the chips. A Motorola 6809, as it happened. The first problem was to build a computer out of that, adding RAM, ROM, a serial port, etc. on a wire-wrap board. Then the programming... nothing for it but to write the assembler instructions down on paper, assemble them manually into hexadecimal, then load that into the EPROM programmer via paper tape. Except that first we wrote the program we wanted in a pseudo-code that looked like ALGOL 68, then compiled that manually, on paper, into assembler.
Before long we had a debug monitor up and running on that board, including a program loader from cassette tape, commands to display/set memory and registers, run the code, single step, etc.
I think everyone serious about programming should have that experience!
The programmer who uses them is unable to create new algorithms of any significant complexity and must instead rely on built-in features and subroutine libraries.
Kind of true, except that many of the algorithms that speed things up were devised by people who don't even program. They are called mathematicians. For example, the Fast Fourier Transform was first discovered by Gauss. Famously, Tukey had the FFT idea again in the 1960s, but it was down to Cooley to turn that algorithm into a computer program. At that point it would not have mattered whether Cooley wrote the FFT in assembler or in slow interpreted BASIC; it would still have been faster than computing the transform directly, the way people did before.
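To make the point concrete, here is a minimal sketch of the gap between the direct transform and the FFT. The function names and the test signal are my own; this assumes a power-of-two input length, which the classic radix-2 Cooley-Tukey recursion requires. The direct version does O(n^2) work, the recursive version O(n log n), yet both compute the same thing:

```python
import cmath

def naive_dft(x):
    # Direct transform: O(n^2), one full sum per output bin.
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def fft(x):
    # Recursive radix-2 Cooley-Tukey: O(n log n).
    # Split into even/odd samples, transform each half, recombine.
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    twiddle = [cmath.exp(-2j * cmath.pi * k / n) for k in range(n // 2)]
    return ([even[k] + twiddle[k] * odd[k] for k in range(n // 2)] +
            [even[k] - twiddle[k] * odd[k] for k in range(n // 2)])

signal = [1.0, 2.0, 0.0, -1.0, 1.5, 0.5, -0.5, 2.5]
a = naive_dft(signal)
b = fft(signal)
# Both methods agree to within floating-point rounding.
assert all(abs(p - q) < 1e-9 for p, q in zip(a, b))
```

Written in either assembler or interpreted BASIC, the same split-and-recombine structure would carry the speedup; the win is in the mathematics, not the implementation language.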
No matter what you know about programming, or how brilliant you are at it, if you don't have the maths chops you ain't going to find an algorithm like that.
In my opinion, in order for the second age of personal computing to be successful, the effort spent learning to program must liberate a person from digital servitude. This means the language chosen must be expressive enough.
Lack of expressivity may be a reason to avoid BASIC.
Dijkstra (and I) agree with you.