I'm a senior docent at the Computer History Museum in Silicon Valley, and I give a special tour to computing students and faculty (both hardware and software engineering). I walk them through the exhibits to show examples where fault lines were artificially introduced into the timeline, setting computing progress back years through the uncoordinated and thoughtless actions of a few. One such fault line has been the relentless push for hardware performance without a concomitant extension of software capabilities to take full advantage of the hardware advances. One result was the marketing-driven "Megahertz wars" of the early-to-mid personal computing era, when CPU clock speed was given far too much attention relative to other system attributes. The tech media only made things worse by essentially parroting press releases with little analysis of what was actually being claimed. Worse yet, manufacturers tuned their products for simplistic hardware benchmarks that didn't reflect typical real-world usage of systems.
A lot of noise has been made by the Nerdocracy about the deficiencies of the Pi's ARM CPU, its RAM limitations, Ethernet speed, USB port count and speed, lack of SATA, etc. As I've stated elsewhere in other threads, the most interesting part of the Pi isn't the ARM CPU, it's the VideoCore IV GPU, and the Quake 3, Pi3D, and Wayland demos show what is possible when the GPU is exercised to its full extent. There's a lot of capability in the GPU that should be exploited, especially by students and educators, so that they can much better understand where computing has been heading over the years.
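To make that concrete, here's a minimal sketch of the sort of first experiment a student could run with Pi3D (the Python library behind some of those demos). It follows the standard pi3d demo pattern of display, shader, shape, and frame loop; the particular shader name and shape are just illustrative choices, and exact API details may vary between pi3d versions:

import pi3d

# Open a window; on the Pi this renders through OpenGL ES on the VideoCore GPU.
DISPLAY = pi3d.Display.create(w=640, h=480)
shader = pi3d.Shader("mat_light")   # a simple lit-material shader shipped with pi3d
cube = pi3d.Cuboid()                # a basic box to spin
cube.set_shader(shader)
keys = pi3d.Keyboard()

while DISPLAY.loop_running():       # pi3d drives the per-frame loop
    cube.rotateIncY(0.7)            # nudge the rotation a little each frame
    cube.rotateIncX(0.3)
    cube.draw()                     # the transform and rasterization work lands on the GPU
    if keys.read() == 27:           # Esc exits cleanly
        keys.close()
        DISPLAY.destroy()
        break

Even a toy like this hands the per-frame drawing work to the GPU rather than the ARM core, which is precisely the part of the Pi that keeps getting overlooked.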
The purpose of the Pi is not to satiate the Nerdocracy's incessant demands for more, more, more of pretty much everything. That's what the commercial market suppliers are there for, and if you don't like their prices, well, start your own company and do something about it. The Foundation has demonstrated just how low you can go when profit is removed from the equation, but it takes enormous amounts of hard work by many volunteers doing everything from laying out printed circuit boards, to optimizing important software, to documenting everything well enough that students can learn from the past century or so of computing development without repeating the mistakes of others. I also give a tour of the exhibits that highlights those repeated mistakes - it's kind of the raison d'être for the museum.
Sometimes you have to pull over, stop, and walk among the flowers along the Information Superslab in order to gain a better appreciation for the technology surrounding us today. A lack of perspective results in a very expensive form of gluttony where literally tons of computing power sit idle, or are discarded before their time. If you haven't noticed, gluttony is, ironically, not a growth industry over the long haul. Too much of anything can be a bad thing, and all things should be taken in moderation, including demands for bigger/faster hardware before you've even scratched the surface of what's already available.
The best things in life aren't things ... but, a Pi comes pretty darned close!
"Education is not the filling of a pail, but the lighting of a fire." -- W.B. Yeats
In theory, theory & practice are the same - in practice, they aren't!!!