The term is "dogfooding". And why not? Use Pis to bring down the cost of building Pis?
Wasn't that the entire point of the article? That Raspberry Pis make the whole monitoring process cheaper?
Burngate wrote: Seems slightly incestuous - or maybe keeping the business within the family

Or it's the right tool for the right job.
"When we built the first Raspberry Pi, I didn't want to put input-output pins on it, because I thought kids would be interested in using them to write programs. Of course, what children actually love doing with Raspberry Pi is interacting with the real world."

I think a lot of Pi users are grateful I/O pins were included, even if it was a case of "they're there; we may as well break them out for use".
hippy wrote: ↑Wed Nov 06, 2019 3:11 pm
I think a lot of Pi users are grateful I/O pins were included, even if it was a case of "they're there; we may as well break them out for use".
On which: with respect to the Pis bolted onto kit in Sony's factory, they have power, HDMI and network cables, but what are they monitoring or connected to? It looks like they may be using the DSI ports, or is that a ribbon cable out of the back?
Or, given the HDMI connections, are they used as networked information displays?
"The industrial equipment they use tends to have data ports on the back, usually Ethernet ports or serial ports, which spew out data about how the machines are performing – but almost always, historically, nobody's been listening."

So my guess is the Ethernet.
Not sure it would have been a disaster.
Good point; it could be Ethernet in, pre-process the data, Wi-Fi out.
dickon wrote: ↑Wed Nov 06, 2019 5:05 pm
As would the rest of the SBC space. It's remarkable just how many of them come with a 40-pin header with GPIO on it. All mutually incompatible, of course, and nothing else has anywhere near the userbase or support of the Pi, but still. GPIO is a thing that's out there, mostly because of the Pi.

I wouldn't say it's "because" of the Pi. Popular SBCs prior to the Pi had GPIO pins, as did microcontrollers. What has happened is that boards attempting to compete with the Pi claim to have a Pi-compatible GPIO pin block. Whether or not that is actually true, I couldn't say.
It's been an immensely influential product.
It's not actually that difficult.
W. H. Heydt wrote: ↑Wed Nov 06, 2019 5:12 pm
For me, had the GPIO pins been left off, it would have changed very little.

Same here, and probably the same for everyone with a Pi who isn't using the GPIO.
W. H. Heydt wrote: ↑Wed Nov 06, 2019 9:58 pm
From my reading, the "go to" SBC in the DIY/maker/amateur market at the time the Pi was launched was the BeagleBone. They sold for $90 each. The Pi pretty much took over their market quickly.

I was a student at the time, playing around with Gentoo and Linux From Scratch. I was trying to justify spending money on a BeagleBoard, but they were all well outside my budget. Then the Pi was announced; one thing led to another, and here we are.
Probably the latter, but yes, it was a very interesting interview. FWLIW, the commercial uses of the Pi that I've seen all involve the GPIO header, and I'm very, very glad it's there. I'm also using a Pi's GPIO header because it's there and I can: the Dallas 1-Wire stuff lets me monitor the temperature in my living room, which in turn feeds into the central heating system. If the Pi didn't have it, I'd have to fudge it.
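For anyone wanting to try the same, here's a minimal sketch of reading a DS18B20 1-Wire sensor through the Linux kernel's w1-therm sysfs interface. It assumes the w1-gpio and w1-therm modules (or the Pi's dtoverlay=w1-gpio) are enabled; the sysfs path and the helper names are my own, not from the post:

```python
import glob

def parse_w1_slave(text):
    """Parse the two-line report the w1-therm driver exposes in w1_slave."""
    lines = text.splitlines()
    # The first line ends in "YES" when the on-wire CRC check passed
    if not lines[0].strip().endswith("YES"):
        raise IOError("CRC check failed; retry the read")
    # The second line ends in "t=<temperature in millidegrees Celsius>"
    return int(lines[1].rsplit("t=", 1)[1]) / 1000.0

def read_temp_c(device_glob="/sys/bus/w1/devices/28-*/w1_slave"):
    """Read the first DS18B20 found (family code 28) via sysfs."""
    path = glob.glob(device_glob)[0]
    with open(path) as f:
        return parse_w1_slave(f.read())
```

On a Pi this needs no extra libraries at all, which is part of the appeal: the kernel does the 1-Wire timing and you just parse a text file.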
"I think the era of free returns in processor speeds is drawing to a close, because we're running out of atoms. The smallest structures on silicon chips are now spaced around 7 nm apart, which is about 70 atoms, and at those distances both the physics and the economics of the system start to go awry. Our knowledge of the behaviour of semiconductors is based on a statistical model of each thousand silicon atoms having, on average, this many dopant atoms embedded within them. But of course, once you're making silicon structures 70 atoms apart, it's no longer a statistical process, so your assumptions start to break down on the physics side. At the same time, on the economic side, it's becoming ruinously expensive to build faster chips."

3D designs such as 3DSoC will deliver vast performance increases (in the interim, 2.5D/3D stacking of DRAM near the CPU, or 3D TSV SRAM). They won't be "free returns", but we'll see computers with at least 10-100 times more single- or multi-threaded performance. That means even more people will use smaller computers like SBCs or dockable smartphones without needing a beefy desktop. Or we can write even more bloated and inefficient code.
[...] But the trends that enabled that consensus are coming to an end, and that means we’re beginning to see a new focus on efficiency in software engineering. I’m excited by this because I’m still a software engineer at heart, and until recently it’s been very hard to argue for writing more efficient code because the doubling in computer power meant it wasn’t necessary. You just waited two years, and your code ran twice as fast.