Nice essay ejolson.

With the creation of the interactive programming environment called BASIC by Kemény, Kurtz... For a moment the world was liberated from the servitude imposed by the owners of intellectual property. ... Then it ended.

Well said.

I too was introduced to computing as a young teenager with BASIC, except it was via a teletype connected to a faraway mainframe in 1972. Later I used ALGOL. By the time 8-bit computers arrived in the late '70s and early '80s I was not using BASIC or buying home computers like the C64 etc. No, we programmed in assembler, often outlining our solution first in a pseudocode that looked like ALGOL.

In large part, though, you are right: BASIC and cheap 8-bitters kick-started a generation of programmers and computer designers.

The next generation of PCs which entered the home were no longer personal nor programmable...Then came the Raspberry Pi.

This was very depressing to watch as it happened. With the IBM PC, the diversity of machines we had dried up. New computer owners were "users", not programmers. The PC was no longer a computer but a commodity gadget that you use, like a TV or a food mixer. I have referred to this as the dark ages of personal computing.

The Raspberry Pi was, as you say, conceived exactly to try and lighten up the dark times: to enable kids and others to experience the joys (and frustrations) of programming as we did back in the Golden Age.

Careful forensic analysis of the 8-bit computers and other relics from the golden age indicates the systems were programmed using a language called BASIC.

What, you mean "forensic analysis" like: turn them on and observe the BASIC prompt they immediately show? :)

I think BASIC was a natural for early personal computers. Its immediate interactivity was a big help, of course. But let's not forget that there were no high-level language compilers that would run in the limited resources available. The only one I knew and used was Intel's PL/M, but that required maxed-out memory and disc drives. Besides, Intel was not about to let anyone use it without paying a couple of thousand dollars. On such small machines there is not much you can do language-wise. FORTH was clearly possible, but that was not about to light any fires.

...Moreover, early prototypes could not run Linux but instead consisted of only a GPU running micro Python.

I presume you mean Raspberry Pi prototypes there. In which case, I think not. I have read that the original Pi idea was built from AVRs and had no GPU. MicroPython did not exist then, but I believe the idea was to use Python somehow. Sorry, I can't find a link to that.

The decision to promulgate the use of a slow interpreter intended for education rather than C led to the natural question, then why avoid BASIC? Why avoid BASIC is the focus of the current thread.

Ah, now we come to the crunch: Why avoid BASIC?

Hundreds of reasons have been given in this thread already, from lack of a standard to performance and so on. Recently we have been comparing language "expressiveness", whatever that means exactly.

If we look at the expression of the fast Fibonacci algorithm, which we have been using as our vehicle for comparison, in modern BASIC it looks like this:

```
dim shared as bignum t1,t2,t3

sub fibowork(n as integer,a as bignum,b as bignum)
    if n=0 then
        a.n=0:b.n=1:b.d(1)=1
        return
    end if
    fibowork(n\2,a,b)
    if n mod 2=0 then
        rem [a,b]=[a*(2*b-a),b*(2*b-a)-(-1)^k]
        bigadd(t1,b,b):bigsub(t2,t1,a)
        bigmul(t1,a,t2):bigmul(t3,b,t2)
        if n mod 4=0 then bigdec(t3) else biginc(t3)
        a=t1:b=t3
    else
        rem [a,b]=[a*(2*a+b)+(-1)^k,b*(2*a+b)]
        bigadd(t1,a,a):bigadd(t2,t1,b)
        bigmul(t1,b,t2):bigmul(t3,a,t2)
        if n mod 4=1 then biginc(t3) else bigdec(t3)
        a=t3:b=t1
    end if
    return
end sub

sub fibo(n as integer,b as bignum)
    if n<2 then
        b.n=1:b.d(1)=n
        return
    end if
    static as bignum a
    fibowork((n-1)\2,a,b)
    if n mod 2=0 then
        rem b=b*(2*a+b)
        bigadd(t1,a,a):bigadd(t2,t1,b):bigmul(t1,b,t2)
        b=t1
    else
        rem b=b*(2*b-a)-(-1)^k
        bigadd(t1,b,b):bigsub(t2,t1,a):bigmul(t3,b,t2)
        if n mod 4=1 then bigdec(t3) else biginc(t3)
        b=t3
    end if
    return
end sub
```

We can compare with the expression of the same algorithm in popular Python:

```
fibs = {0: 0, 1: 1}  # memo table with base cases (needed for the code to run)

def fibo(n):
    if n in fibs:
        return fibs[n]
    k = (n + 1) // 2
    fk = fibo(k)
    fk1 = fibo(k - 1)
    if n & 1:
        result = fk ** 2 + fk1 ** 2
    else:
        result = (2 * fk1 + fk) * fk
    fibs[n] = result
    return result
```

Similarly clear and succinct expressions of the algorithm are here in this thread in JavaScript, Haskell and Scala.

Personally I think your wonderful rendition of big fibo in BASIC clearly demonstrates why BASIC is really poor in the expressiveness department. No contest.
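For what it's worth, the fast-doubling identities that both programs rely on (visible in the rem comments of the BASIC version) can be written in a handful of self-contained lines. This is only an illustrative sketch of the same algorithm in Python, not code from the thread; the name fib_pair is my own:

```python
def fib_pair(n):
    """Return (F(n), F(n+1)) using the fast-doubling identities."""
    if n == 0:
        return (0, 1)
    a, b = fib_pair(n // 2)   # a = F(k), b = F(k+1), where k = n // 2
    c = a * (2 * b - a)       # F(2k)   = F(k) * (2*F(k+1) - F(k))
    d = a * a + b * b         # F(2k+1) = F(k)^2 + F(k+1)^2
    if n % 2 == 0:
        return (c, d)
    return (d, c + d)

print(fib_pair(10)[0])  # → 55
```

Thanks to Python's arbitrary-precision integers, the big-number arithmetic comes for free here, which is exactly the point being made about expressiveness.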

It is an easily established fact that a programming language isn't much good unless it can be used to compute million-digit Fibonacci numbers. Therefore, a contest was devised by which to compare BASIC to other programming languages: Code the Karatsuba algorithm for big number arithmetic and use the doubling formula to compute the 4784969th Fibonacci number.

No. The "contest", as I originally stated the challenge, was not about the Karatsuba algorithm or any big-number maths operators. Karatsuba is not even mentioned. The challenge was simply to calculate the first Fibonacci number with a million digits, in the language of your choice.

DavidS then suggested we not use any non-standard libraries, as that is some kind of cheating. I agreed, because that prevents people from simply calling GMP's fibo from C or whatever. That is where the need for DIY maths functions came from, hence Karatsuba. They are not part of the problem statement.
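For anyone who has not met it, Karatsuba multiplication splits each operand in two and gets by with three half-size multiplications instead of four. A minimal sketch in Python, splitting on bits (illustrative only and my own naming; note that CPython's built-in int multiplication already uses Karatsuba internally, so this is purely to show the idea):

```python
def karatsuba(x, y):
    """Multiply non-negative integers x and y recursively."""
    if x < 10 or y < 10:
        return x * y  # small enough: multiply directly
    m = max(x.bit_length(), y.bit_length()) // 2
    # Split each number: x = high_x * 2^m + low_x, likewise for y.
    high_x, low_x = x >> m, x & ((1 << m) - 1)
    high_y, low_y = y >> m, y & ((1 << m) - 1)
    z0 = karatsuba(low_x, low_y)
    z2 = karatsuba(high_x, high_y)
    # The Karatsuba trick: the middle term from one extra multiplication.
    z1 = karatsuba(low_x + high_x, low_y + high_y) - z0 - z2
    return (z2 << (2 * m)) + (z1 << m) + z0
```

The DIY solutions in the thread do the same thing, only on hand-rolled big-number representations rather than native integers.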

I have to be honest here: I was being a bit wicked when I posted that challenge. DavidS had for years been suggesting that BASIC and ARM assembler are very simple and efficient for the programmer, a position I could not agree with. I knew that this problem, which takes ten minutes for a Python programmer, would take a lot more work in BASIC or ARM assembler. Or ten minutes for those that use Scala, Haskell or JavaScript. I was also convinced that the final solution would perform better in those languages.

So far my supposition has been confirmed. Except, ejolson, you rather rocked my boat by producing not one but two versions, in very different BASIC styles, in what seemed like a very short time. I'm impressed.

After 38 pages of forum discussion, complete programs were written in various dialects of BASIC, C++, JavaScript and C among other languages and attempts. Having written some of these programs I wonder, why did the golden age of personal computing end? What role did BASIC play in its demise? What language is the right choice to begin a reformed age of personal computing? What needs to be done differently so the reformed age doesn't suddenly end?

Those are deep questions I will have to think on some more...

Memory in C++ is a leaky abstraction.