Saving a file in RAM

Posted: Thu Nov 21, 2019 5:00 pm
by sonettguy
I am using a 4 GB Pi 4 running Raspbian 10 (Buster). I have two Python 3 programs running simultaneously, both with numpy imported as np. One periodically saves sensor data to a text file on the SD card; the second periodically reads that file and displays the data it contains on the surface of a three-dimensional sphere.

Code: Select all

Program A:
...
for k in range(sensors):
   ...
   SN=S1[0,k,:,:]
   filename="S1_ex4_b_src_"+str(k)+".csv"
   np.savetxt(filename,SN,fmt='%.7f',delimiter=',')

Program B:
...
for i in range(sensors):
   filename="S1_ex4_b_src_"+str(i)+".csv"
   curve=np.genfromtxt(filename,delimiter=",")
   ...
I now need to speed the process up by roughly 100x, so I want to move the file from the SD card to RAM. It is OK to lose it when I power down. This seems like a common problem, but when I search the forum for 'saving a file to RAM', 'sharing a file without SD card', or several similar phrases, I only find discussions of moving a file off the Pi or of slow upgrades.

How does one save a file to RAM instead of an SD card?

Re: Saving a file in RAM

Posted: Thu Nov 21, 2019 5:06 pm
by gordon77
Put it in /run/shm/

Re: Saving a file in RAM

Posted: Thu Nov 21, 2019 5:13 pm
by jahboater
You could create a "RAM disk". tmpfs is virtual memory, but it will be in RAM most of the time (except when memory is over-committed, in which case it gets paged out like any other memory).

I add these two lines to /etc/fstab.

Code: Select all

tmpfs /tmp tmpfs defaults,noatime 0 0
tmpfs /var/log tmpfs defaults,noatime,size=16m 0 0

So you could save your files to /tmp. Here /tmp will be half the size of memory, 2 GB.

However, I must say that you may not see much of a speed-up as the disk cache on Linux is remarkably effective.
When you write a file, it is always written to memory. (The system will write it to the disk later which has all sorts of benefits).
A file frequently read will also be in memory as the system retains a copy of the file after you have read it.

Python is known to be very slow. If you want a performance increase, consider another language such as C or C++.

Re: Saving a file in RAM

Posted: Thu Nov 21, 2019 5:16 pm
by jahboater
gordon77 wrote:
Thu Nov 21, 2019 5:06 pm
Put it in /run/shm/
Yes indeed, though I think you meant /dev/shm? /dev/shm is tmpfs.

Code: Select all

 $ df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/root        59G   10G   46G  18% /
devtmpfs        1.9G     0  1.9G   0% /dev
tmpfs           1.9G     0  1.9G   0% /dev/shm
tmpfs           1.9G   25M  1.9G   2% /run
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs           1.9G     0  1.9G   0% /sys/fs/cgroup
tmpfs            16M  156K   16M   1% /var/log
tmpfs           1.9G     0  1.9G   0% /tmp
/dev/mmcblk0p1  253M   53M  200M  21% /boot
tmpfs           386M     0  386M   0% /run/user/1000
pi@raspberrypi:~ $ 
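So from Python it is just a matter of pointing the filename at the tmpfs mount. A minimal sketch along the lines of the OP's Program A, writing to /dev/shm (the sensor count and array shape here are made up, standing in for the original S1 array):

```python
import numpy as np

# Illustrative stand-ins for the OP's real sensor data.
sensors = 2
S1 = np.random.rand(1, sensors, 4, 4)

for k in range(sensors):
    SN = S1[0, k, :, :]
    # Same filename pattern as before, but rooted in tmpfs so the
    # file lives in RAM and disappears on power-down.
    filename = "/dev/shm/S1_ex4_b_src_" + str(k) + ".csv"
    np.savetxt(filename, SN, fmt='%.7f', delimiter=',')

# Program B would read it back the same way it does now.
curve = np.genfromtxt("/dev/shm/S1_ex4_b_src_0.csv", delimiter=",")
```

No other changes are needed on the reading side; genfromtxt does not care what filesystem the path lives on.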

Re: Saving a file in RAM

Posted: Thu Nov 21, 2019 5:28 pm
by procount
Why not create a named pipe between the 2 programs, using mkfifo?
https://www.howtoforge.com/linux-mkfifo-command/
Open the pipe for writing in one program and reading in the other.
https://www.geeksforgeeks.org/python-os ... %20deleted.
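A rough sketch of the idea in Python, with the writer side simulated by a thread so it runs in one script (the fifo path and array contents are illustrative, not taken from the original programs):

```python
import os
import tempfile
import threading

import numpy as np

# Create a named pipe (fifo) that both sides agree on.
fifo_path = os.path.join(tempfile.mkdtemp(), "sensor_0.fifo")
os.mkfifo(fifo_path)

data = np.arange(6.0).reshape(2, 3)

def writer():
    # "Program A" side: opening a fifo for writing blocks until
    # a reader opens the other end, then the CSV text flows through
    # the kernel pipe buffer without ever touching the SD card.
    with open(fifo_path, "w") as f:
        np.savetxt(f, data, fmt='%.7f', delimiter=',')

t = threading.Thread(target=writer)
t.start()

# "Program B" side: read the CSV stream straight out of the pipe.
with open(fifo_path) as f:
    curve = np.genfromtxt(f, delimiter=",")
t.join()
```

Note that a fifo couples the two programs synchronously (each open blocks until the other end shows up), so it fits a producer/consumer pattern rather than the fully asynchronous file-polling the OP described.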

Re: Saving a file in RAM

Posted: Thu Nov 21, 2019 5:38 pm
by rpdom
jahboater wrote:
Thu Nov 21, 2019 5:16 pm
gordon77 wrote:
Thu Nov 21, 2019 5:06 pm
Put it in /run/shm/
Yes indeed. You meant /dev/shm I think??. /dev/shm is tmpfs.
/run/shm is a link to /dev/shm. They are the same thing.

Re: Saving a file in RAM

Posted: Fri Nov 22, 2019 3:34 pm
by sonettguy
Thank you, jahboater. I didn't know that about the OS, that it writes to RAM and sends the data to disk in the background. I wonder now if my slowness comes from one program hogging the file (that is, keeping it open so the other program cannot gain access). I'll have to study this for a while and run some timing tests. Good information.

And thank you, procount, for the pipe suggestion. Although I was planning on asynchronous operation, it might work just as well to speed things up. I am particularly thankful for the links.