extra RAM


by Jared.G » Wed Aug 15, 2012 8:59 pm
I was wondering if it would be possible to double the amount of RAM on the next iteration (Model C) of the Raspberry Pi. It would be very simple and would not raise the cost much: all it would take is to add a separator and solder a second RAM chip directly above the current one. I don't mean a second board, just a slightly higher stack of RAM. This would be shorter than some of the components already on the board and would not require much modification to the board.
-Jared
Posts: 2
Joined: Wed Aug 15, 2012 8:52 pm
by mahjongg » Wed Aug 15, 2012 9:49 pm
What you are proposing cannot work (PoP RAMs are not stackable).
A single PoP (Package on Package) RAM with double the capacity would be possible, but at the moment it costs several times the price of the Pi. Also, I do not predict a "Model C" in the next several years.
Forum Moderator
Posts: 4506
Joined: Sun Mar 11, 2012 12:19 am
by Jared.G » Thu Aug 16, 2012 2:08 am
mahjongg wrote:What you are proposing cannot work (PoP RAMs are not stackable).
A single PoP (Package on Package) RAM with double the capacity would be possible, but at the moment it costs several times the price of the Pi. Also, I do not predict a "Model C" in the next several years.

Well, you are right, I never thought about that. Is there any way that I could hook up some extra RAM through the GPIO port?
Posts: 2
Joined: Wed Aug 15, 2012 8:52 pm
by jackokring » Thu Aug 16, 2012 2:17 am
Not 400MHz wide SDRAM at full speed, no.
Pi=B256R0USB CL4SD8GB Raspbian Stock. https://sites.google.com/site/rubikcompression/strictly-long
Posts: 754
Joined: Tue Jul 31, 2012 8:27 am
Location: London, UK
by jamesh » Thu Aug 16, 2012 2:43 am
RAM prices are coming down, and I can foresee a time when it will actually be cheaper to go for a 512MB rather than a 256MB PoP, at which point the sensible choice would be to go to 512MB. When/if that may happen, I don't know.
Raspberry Pi Engineer
Posts: 10601
Joined: Sat Jul 30, 2011 7:41 pm
by castleromeo » Fri Aug 31, 2012 12:49 am
DDR3 RAM uses very little space. You can probably squeeze 1GB of it onto future versions.
Posts: 6
Joined: Wed Aug 29, 2012 12:58 am
by W. H. Heydt » Fri Aug 31, 2012 1:14 am
castleromeo wrote:DDR3 RAM uses very little space. You can probably squeeze 1GB of it onto future versions.


The BCM2835 does not have, I am given to understand, enough address lines to handle 1GB of RAM, so a move to 1GB would require a new processor, which would require a new board design...and it took six years to go from conception to a working product with *this* board.

So the choices are:
1) cap it out at 512MB (distinctly possible within 2 to 3 years);
2) redesign the PCB and go through the whole design and development cycle *again* (NOT going to happen any time soon); or
3) Broadcom decides to produce a new processor that is "pin compatible" with the current one and has more address lines (not on the horizon).

If option 2 or option 3 *does* happen (and I think it'll be 5 or 6 years from now before either would...if ever), then one might as well go whole hog and look for dual- or quad-core, much faster clock (1.5-2GHz) and capable of handling *at* *least* 2GB of RAM (and perhaps as much as 4GB...just for future growth with minimal additional changes...not to be initially used).
Posts: 1372
Joined: Fri Mar 09, 2012 7:36 pm
Location: Vallejo, CA (US)
by jamesh » Fri Aug 31, 2012 7:57 am
W. H. Heydt wrote:
castleromeo wrote:DDR3 RAM uses very little space. You can probably squeeze 1GB of it onto future versions.


The BCM2835 does not have, I am given to understand, enough address lines to handle 1GB of RAM, so a move to 1GB would require a new processor, which would require a new board design...and it took six years to go from conception to a working product with *this* board.

So the choices are:
1) cap it out at 512MB (distinctly possible within 2 to 3 years);
2) redesign the PCB and go through the whole design and development cycle *again* (NOT going to happen any time soon); or
3) Broadcom decides to produce a new processor that is "pin compatible" with the current one and has more address lines (not on the horizon).

If option 2 or option 3 *does* happen (and I think it'll be 5 or 6 years from now before either would...if ever), then one might as well go whole hog and look for dual- or quad-core, much faster clock (1.5-2GHz) and capable of handling *at* *least* 2GB of RAM (and perhaps as much as 4GB...just for future growth with minimal additional changes...not to be initially used).


Although it was 6 years from the idea to execution, most of that 6 years was spent waiting for an appropriate SoC to appear, not actually designing the board. So any future device would be much quicker to appear, simply because appropriate devices are here and available. It would still require a board redesign, as there will never be a pin-compatible device to the 2835. So 512MB, I believe, is the limit for the current generation of board. Even that would be a very effective upgrade.
As to future devices with dual or quad cores, 1.5GHz and 1GB+, Broadcom does have appropriate devices, but there are no plans at present to use them. The Foundation has enough on its hands just with the current device.
Raspberry Pi Engineer
Posts: 10601
Joined: Sat Jul 30, 2011 7:41 pm
by W. H. Heydt » Fri Aug 31, 2012 7:22 pm
jamesh wrote:Although it was 6 years from the idea to execution, most of that 6 years was spent waiting for an appropriate SoC to appear, not actually designing the board. So any future device would be much quicker to appear, simply because appropriate devices are here and available. It would still require a board redesign, as there will never be a pin-compatible device to the 2835. So 512MB, I believe, is the limit for the current generation of board. Even that would be a very effective upgrade.
As to future devices with dual or quad cores, 1.5GHz and 1GB+, Broadcom does have appropriate devices, but there are no plans at present to use them. The Foundation has enough on its hands just with the current device.


Yes...I pretty much assume that a new board design wouldn't have that "front-loading" issue. It would probably get a lot more attention from the factories for prototyping and scheduling production ramp-ups as well, in light of the market response to the Pi.

As for no compatible device...never is a long time. If the Pis ramp up in the direction of 2 million units per year--as seems likely--that *might* be enough to get Broadcom's attention. Given the general industry trend in die shrinks and such (just what *is* the process size for the 2835, anyway?), I wouldn't completely rule out higher-clocked and/or more capable SoCs with the same connections...that is, an enhanced plug-replacement device.

I agree that *significant* upgrades (multicore, 1.5+GHz clock, complete board redesign) are more than the Foundation can deal with NOW...but in, say, 3 years..? Who's to say?

While the Pi appears to be just about ideal (going to 512MB would--as you note--be a big improvement and feed into the present targets) for grade school to, possibly, high school, an "enhanced" variant that is similar, though not identical, could very well be targeted at junior college to university students.
Posts: 1372
Joined: Fri Mar 09, 2012 7:36 pm
Location: Vallejo, CA (US)
by Jim JKla » Fri Aug 31, 2012 7:34 pm
There is a lateral-thinking way to make more memory available, and that's to build a Raspberry Pi farm, with every block of memory having its own processor. :D
Noob is not derogatory; the noob is just the lower end of the noob-geek spectrum. Being a noob is just your first step towards being an uber-geek ;)

If you find a solution, please post it in the wiki; the forum dies too quick
Posts: 1935
Joined: Sun Jan 29, 2012 11:15 pm
Location: Newcastle upon Tyne UK
by bredman » Sat Sep 01, 2012 7:05 am
You need to remember the goal of the foundation. This goal is to teach children how to program. The Raspberry Pi is just a tool to help with the final goal.

The Foundation does not aim to produce the best-performing piece of hardware possible. The Raspberry Pi is currently sufficient to meet its intended purpose and will probably not be improved until a valid (and relevant) requirement exists.

This may not help those who want to use their Raspberry Pi for non-educational purposes, but there is good news. The Raspberry Pi has spawned a new interest in low-cost bare-board computers and I expect the costs to drop dramatically. The Raspberry Pi proved that there is a large hacker community willing to pay for a sub-$100 computer and a lot of manufacturers will try to satisfy this market.
Posts: 1413
Joined: Tue Jan 17, 2012 2:38 pm
by Jim JKla » Sat Sep 01, 2012 7:25 am
Fixed, reduced memory forces the move away from bloatware. The programs produced to run on the low end of that first wave of home computing proved that a lot could be done in a small package. I remember there were programs that ran on the Commodore 64 and the Spectrum 48 that were closed to the BBC because it used memory resources to control its pre-installed peripheral interfaces.

The fact that these home computers were all doing useful projects in the sub-1MB memory region just shows that big memory is not the only answer. ;)
Noob is not derogatory; the noob is just the lower end of the noob-geek spectrum. Being a noob is just your first step towards being an uber-geek ;)

If you find a solution, please post it in the wiki; the forum dies too quick
Posts: 1935
Joined: Sun Jan 29, 2012 11:15 pm
Location: Newcastle upon Tyne UK
by prodata » Sat Sep 01, 2012 8:05 am
bredman wrote:You need to remember the goal of the foundation. This goal is to teach children how to program. ...The Raspberry Pi is currently sufficient to meet its intended purpose and will probably not be improved until a valid (and relevant) requirement exists.


I keep reading this but am never entirely convinced by the logic. If the Foundation thought that there was a potential upgrade to the Pi which was both very straightforward to implement (i.e. no major redesign involved) and backwards compatible, and at some point development resource became available to do so, then I don't see why producing a Pi 'C', perhaps at a somewhat higher price point, would do anything other than further the goals of the Foundation.

Personally I'd like to see a Green Pi, e.g. with SWM regulators used (and any other power-saving options fully implemented), and would pay e.g. $45-49 for a unit. But whatever the spec of the Pi C, if there was a market potential for 5-figure sales and the prospect of, say, a $5 margin per unit rather than $1 (or whatever the current figure might be), then I don't see why this wouldn't further the aims of the Foundation. Is there something about the Foundation charter that says it has to use a one-trick pony and nothing more? (Strictly speaking a two-trick pony, I guess, if both models 'A' and 'B' are considered.)
Posts: 113
Joined: Tue Jan 24, 2012 5:53 pm
by itimpi » Sat Sep 01, 2012 11:42 am
The thing that will stop the Foundation from looking at changes in the near term is that they are a very small outfit and are already stretched to deliver the initial version.
Posts: 1027
Joined: Sun Sep 25, 2011 11:44 am
Location: Potters Bar, United Kingdom
by prodata » Sat Sep 01, 2012 12:41 pm
itimpi wrote:The thing that will stop the Foundation from looking at changes in the near term is that they are a very small outfit and are already stretched to deliver the initial version.


Of course. But perhaps by next year the immediate launch campaign will have been seen off, and the Foundation will then be reviewing progress and deciding on the next move; if the Pi project continues to build momentum - and hopefully funding too - then there may be at least a little more resource available than hitherto.
Posts: 113
Joined: Tue Jan 24, 2012 5:53 pm
by Joe Schmoe » Sat Sep 01, 2012 2:06 pm
prodata wrote:I keep reading this but am never entirely convinced by the logic. If the Foundation thought that there was a potential upgrade to the Pi which was both very straightforward to implement (i.e. no major redesign involved) and backwards compatible, and at some point development resource became available to do so, then I don't see why producing a Pi 'C', perhaps at a somewhat higher price point, would do anything other than further the goals of the Foundation.


I have 3 comments, none of which should be taken as an attack. Rather, they are in the "There are more things in the world than are dreamt of in your philosophy" category.

1) You are reflecting the standard "consumer/industrial/maximize-profit" view - which is not the only possible one. The members of the Foundation and many of the posters on this board have made it clear that they do not subscribe to this viewpoint. As you can probably tell, neither do I (which explains why today it is Bill Gates who heads up the largest software company in the world [*] and is a multi-billionaire, and not me).

[*] Modulo quibbles about the fact that he is now semi-retired and not really running the show anymore (that a**hole Ballmer is)

2) It is likely that, sometime in the next year or two, when prices come down enough to make it a good do, the Pi will switch to using the 512MB PoP memory. That's the only hardware change that seems likely.

3) As others have noted, there are other boards. In particular, if you want more stuff, by all means check out the HackBerry board. At $65 ($83 delivered), I think it is a much better do than the Pi. That price includes a known-good power supply; that alone is worth the extra bucks (to not have to dick around with all the power supply issues).
Never answer the question you are asked. Rather, answer the question you wish you had been asked.

- Robert S. McNamara - quoted in "Fog of War" -
Posts: 2269
Joined: Sun Jan 15, 2012 1:11 pm
by prodata » Sat Sep 01, 2012 6:07 pm
Joe Schmoe wrote:I have 3 comments, none of which should be taken as an attack.

And not taken as such. But there is a broader way that you could be thinking about this. Let's say that the Foundation have a goal, which is - in a nutshell - to encourage programming in schools. Then if I were in their shoes (which patently I'm not :) ), any idea that enabled me to make progress towards that goal might be welcome. Just because the project started off totally focused around the Pi A/B doesn't mean that it has to end there. If there are other potential means (of any sort - technical/financial/political etc. - but, as one example, a Pi derivative that enabled a higher margin to feed back into the project or had some other tangible benefits) then why not consider using them, if resources allow? I'm not sure what this has to do with philosophies - it's just about achieving your end goal as effectively as possible.
Posts: 113
Joined: Tue Jan 24, 2012 5:53 pm
by W. H. Heydt » Sat Sep 01, 2012 8:40 pm
Jim JKla wrote:Fixed, reduced memory forces the move away from bloatware. The programs produced to run on the low end of that first wave of home computing proved that a lot could be done in a small package. I remember there were programs that ran on the Commodore 64 and the Spectrum 48 that were closed to the BBC because it used memory resources to control its pre-installed peripheral interfaces.

The fact that these home computers were all doing useful projects in the sub-1MB memory region just shows that big memory is not the only answer. ;)


I would be in a lot more agreement with the arguments relating to memory sizes like 48K or 64K...had my first programming job not been using an IBM System/360 Model 30 (a small mainframe) with 32K of memory....and it was the sole machine for a multi-hundred-million dollar per year business.

When you find a way to squeeze Linux into 6K to 8K bytes (the size of the system nucleus that the 360/30 had), then we'll talk about small memory models.

In the meantime...don't look at the memory of the Pi as large because it's 256MB. Look at how much of that memory is left over after the graphics memory and operating system eat up what they need. Doubling physical memory to 512MB will far more than double the *usable* memory and allow more programs to be resident in memory concurrently, greatly facilitating programming and learning about computing in general.

I note that a Pi I just checked (my "alarm clock"), which is not running the GUI desktop nor any user programs at the moment, has 12MB of free space. Even if the graphics allocation were doubled with a move to 512MB (from 64MB to 128MB), that should give a free space closer to 200MB...a 16-fold increase. (This is also why I am not on the bandwagon for 1GB of memory until it is deemed appropriate to do a complete board redesign, and then only to *permit* the use of at least that much memory, whether it starts out with it or not.)
Posts: 1372
Joined: Fri Mar 09, 2012 7:36 pm
Location: Vallejo, CA (US)
by jojopi » Sun Sep 02, 2012 12:08 am
W. H. Heydt wrote:I note that a Pi I just checked (my "alarm clock"), which is not running the GUI desktop nor any user programs at the moment, has 12MB of free space. Even if the graphics allocation were doubled with a move to 512MB (from 64MB to 128MB), that should give a free space closer to 200MB...a 16-fold increase.
I suspect you are misreading your non-GUI system here. Mine says:
Code:
pi@tau ~ $ free
             total       used       free     shared    buffers     cached
Mem:        188112     177884      10228          0      37048     109496
-/+ buffers/cache:      31340     156772
Swap:       102396       2612      99784
There is only 10MB completely free, but only 31MB is actually allocated to user processes. The bulk of the memory is used by the kernel to buffer and cache disk accesses. In fact, the kernel aims to keep free (that is, wasted) memory down close to 8MB. If there were 448MB total instead of 192MB, the buffers and cache would be even bigger, and the totally free amount about the same. The "-/+ buffers/cache:" line is the one that is usually best to look at.
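If you want to do the same arithmetic yourself, here is a minimal sketch (plain Python, nothing Pi-specific, assuming a Linux /proc/meminfo that reports MemTotal, MemFree, Buffers and Cached in kB) that reproduces the "-/+ buffers/cache" figure:
Code:
# Rough sketch: reproduce the "-/+ buffers/cache" figure that free prints,
# by reading /proc/meminfo directly.

def meminfo_kb():
    """Parse /proc/meminfo into a dict of {field: kB}."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.split()[0])  # first token is the kB figure
    return info

m = meminfo_kb()
used = m["MemTotal"] - m["MemFree"]
# Buffers and cache are reclaimable, so subtract them to see what
# user processes are really holding on to.
really_used = used - m["Buffers"] - m["Cached"]
really_free = m["MemTotal"] - really_used

print("total:                 %d kB" % m["MemTotal"])
print("used by processes:     %d kB" % really_used)
print("available to programs: %d kB" % really_free)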

I sympathise with complaints about modern GUI systems, however. On my main desktop system I have to restart Firefox from time to time because it reaches 6GB and starts to become noticeably slow. That is just ridiculous.
Posts: 1873
Joined: Tue Oct 11, 2011 8:38 pm
by Jim JKla » Sun Sep 02, 2012 7:36 am
jojopi wrote:On my main desktop system I have to restart Firefox from time to time because it reaches 6GB and starts to become noticeably slow. That is just ridiculous.


But isn't that a problem generated by Firefox, not by low memory? ;)
Noob is not derogatory; the noob is just the lower end of the noob-geek spectrum. Being a noob is just your first step towards being an uber-geek ;)

If you find a solution, please post it in the wiki; the forum dies too quick
Posts: 1935
Joined: Sun Jan 29, 2012 11:15 pm
Location: Newcastle upon Tyne UK
by mikerr » Sun Sep 02, 2012 9:22 am
Having plenty of RAM makes you use different programming methods too.

I remember one programming assignment on my CompSci course had a largish dataset.

Everyone - bar me - used the "read the entire file into a memory array, then process it" method, so the size of dataset they could handle was limited by memory.

I wrote a rather more complex one that worked on it as a stream - "read a bit, process, dump, read another bit..." - and so could handle an effectively infinite dataset.
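In Python terms it's roughly the difference between these two (a sketch only, with a made-up whitespace-separated numbers file called data.txt):
Code:
# 1) Read the whole file into memory, then process it.
#    Simple, but the dataset size is capped by available RAM.
def total_in_memory(path):
    with open(path) as f:
        values = [float(tok) for line in f for tok in line.split()]
    return sum(values)

# 2) Stream it: read a bit, process it, throw it away, read the next bit.
#    More fiddly, but memory use stays flat however big the file is.
def total_streaming(path):
    total = 0.0
    with open(path) as f:
        for line in f:                # only one line in memory at a time
            for tok in line.split():
                total += float(tok)
    return total

print(total_in_memory("data.txt"))
print(total_streaming("data.txt"))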

I didn't get any extra points for it though - the lecturer made the comment that it was just a teaching exercise, and that code simplicity was more important than capability :(

On the Pi, some of the RAM is grabbed by the GPU - when doing much GPU work you really have to go to a 128/128 split, which can be cramped - 512MB would allow a nicer 384/128 split. No doubt that will magically appear at some point, when 512MB chips are as cheap as or cheaper than 256MB ones.
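If you want to see what split a particular board is actually running, something like this does the job (a sketch; it assumes the firmware's vcgencmd tool is installed and supports get_mem, as recent Raspbian images do):
Code:
# Query the ARM/GPU memory split as reported by the firmware.
import subprocess

def get_mem(kind):
    # vcgencmd prints e.g. "arm=192M"; keep just the part after "=".
    out = subprocess.check_output(["vcgencmd", "get_mem", kind])
    return out.decode().strip().split("=")[1]

print("ARM: " + get_mem("arm"))
print("GPU: " + get_mem("gpu"))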
Got a Pi Camera? View it in my Android app - Raspicam Remote! No software required on the Pi.
Posts: 964
Joined: Thu Jan 12, 2012 12:46 pm
Location: NorthWest, UK
by Jim JKla » Sun Sep 02, 2012 9:54 am
mikerr wrote:the lecturer made the comment that it was just a teaching exercise, and that code simplicity was more important than capability


Philistine ;)
Noob is not derogatory; the noob is just the lower end of the noob-geek spectrum. Being a noob is just your first step towards being an uber-geek ;)

If you find a solution, please post it in the wiki; the forum dies too quick
Posts: 1935
Joined: Sun Jan 29, 2012 11:15 pm
Location: Newcastle upon Tyne UK
by itimpi » Mon Sep 03, 2012 6:09 am
I guess it might be a possibility. I remember the Foundation saying that a pin-compatible 512MB RAM package existed, but that it added far too much to the price to allow the target price the Foundation wanted to be achieved. The difference here seems to be $15, and that is probably less than it would have been 6 months ago, with the way memory prices seem to be continually dropping.
Posts: 1027
Joined: Sun Sep 25, 2011 11:44 am
Location: Potters Bar, United Kingdom
by Jim JKla » Mon Sep 03, 2012 6:36 am
It felt odd to me that the difference between the "A" & "B" included lower memory as well as dropping the Ethernet and twin USB. We may even get the "C" - or is it the "B+", or the "B512"? - before the "A" ;)

I would probably get one of the memory-expanded ones, but like the "A" I can see it being a wait. :)
Noob is not derogatory; the noob is just the lower end of the noob-geek spectrum. Being a noob is just your first step towards being an uber-geek ;)

If you find a solution, please post it in the wiki; the forum dies too quick
Posts: 1935
Joined: Sun Jan 29, 2012 11:15 pm
Location: Newcastle upon Tyne UK
by Jim JKla » Mon Sep 03, 2012 6:54 am
Wouldn't it just be a single component change on an "A", once the said "A" was available?

Just a curiosity question, not an inquisition ;)
Noob is not derogatory; the noob is just the lower end of the noob-geek spectrum. Being a noob is just your first step towards being an uber-geek ;)

If you find a solution, please post it in the wiki; the forum dies too quick
Posts: 1935
Joined: Sun Jan 29, 2012 11:15 pm
Location: Newcastle upon Tyne UK