Heater wrote: ↑Sat Nov 09, 2019 11:28 am
I'm curious, with all that detective work on your history and such, did you ever discover what it was that you did that messed it up?
Unfortunately not! I have been down this route twice now in the last few weeks, and both times I had intended to check something I thought was safe (setting up WiFi as an access point and testing the I2C connection to an RTC module). In the process some things were installed via apt that probably changed basic settings I have no idea about. Result: a broken system.
0) It's slow and tedious.
1) Generally there is a lot of empty space on the SD card file system, so one spends a lot of time and disk space backing up nothing!
2) Those image files are big and waste a lot of space on my PC drive.
3) When it comes time to restore, you run into problems if the SD card you want to put the image on is not big enough. And if it is bigger, you end up not using all of it.
Now, when I had to do the resurrection for the second time, I followed my notes from the first time, plus used the output of the history command to see what apt installs I had done. So I got to a new working system with my /home/pi data all restored.
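That history trawl can be partly automated. A minimal sketch — the grep pattern is an assumption about how the installs were typed, and it is demonstrated here on a temporary file standing in for the real ~/.bash_history:

```shell
# Build a throwaway "history file" standing in for ~/.bash_history:
hist=$(mktemp)
cat > "$hist" <<'EOF'
ls -l
sudo apt install i2c-tools
cd /etc
sudo apt install hostapd dnsmasq
sudo apt install i2c-tools
EOF
# Pull out the unique "apt install" lines so they can be replayed later:
installs=$(grep -E '^sudo apt(-get)? install' "$hist" | sort -u)
echo "$installs"
rm -f "$hist"
```

On a real system you would point grep at ~/.bash_history (or run `history` first, since the file only gets written when the shell exits).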
But now I used the GUI program to copy the system to an SD card, in this case half the size of the one I normally use (16 vs 32 GB).
This will serve as my rescue in the future if I am hit again.
And you are right, it IS very slow and tedious. And the final img file is very big and contains a lot of zeroed space.
So I used 7zip to compress the image file from 15 GB down to 4.5 GB as a zip archive.
Procedure:
1) SD Card Copier used in the GUI onto a smaller SD card than the original
2) Win32DiskImager on Windows 7 to create an image file from the SD card
3) 7zip to reduce the size of the img file from 15 GB to 4.5 GB
And this all took a loooong time.
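Done on a Linux box instead of Windows, steps 2 and 3 collapse into a couple of commands. A sketch — the /dev/sdX device name and file names are assumptions, and the runnable part below uses a throwaway zero-filled file standing in for the mostly-empty card:

```shell
# Real-card version (assumed device path -- double-check it before running!):
#   sudo dd if=/dev/sdX of=backup.img bs=4M status=progress
#   gzip -k backup.img     # the mostly-zero image compresses very hard
# Demonstration on a throwaway zero-filled file standing in for the image:
img=$(mktemp)
dd if=/dev/zero of="$img" bs=1M count=16 2>/dev/null
gzip -k "$img"                        # creates "$img.gz", keeps "$img"
orig=$(stat -c%s "$img")
comp=$(stat -c%s "$img.gz")
echo "original: $orig bytes, compressed: $comp bytes"
```

The long runs of zeros are exactly why the 15 GB image shrank to 4.5 GB.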
But should I ever have to re-image an SD card, Etcher can use a zip file as input, so there is no need to expand the compressed file anyway.
Now I don't do that. The main bulk of what you want to write back to a new SD card can be had from a fresh download and install of Raspbian pretty quickly. The stuff that you need backed up is often reproduced from a bunch of apt-get install commands. Or, if it's my own code, a quick pull from the GitHub repository it all lives on. Configuration changes are a matter of keeping records of what you have done and/or backing up copies of the changed config files, often saved in GitHub again for quick recovery.
Ideally one would script all this. Creating a personal configuration script that would download, install and configure everything just as you like. But that's a bit much to maintain when one is hacking and experimenting all the time.
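Such a script need not be fancy to be useful. A sketch — every package name, repository URL, and config path below is a placeholder, not anything from this thread, and DRY_RUN=1 makes it only print the commands rather than run them:

```shell
#!/bin/sh
# Hypothetical one-shot restore script. All package names, URLs and paths
# are placeholders. With DRY_RUN=1 it only prints what it would do;
# unset DRY_RUN to actually execute the commands.
set -e
DRY_RUN=1
run() { echo "+ $*"; [ -n "$DRY_RUN" ] || "$@"; }

run sudo apt-get update
run sudo apt-get install -y i2c-tools hostapd      # assumed packages
run git clone https://github.com/you/dotfiles.git  # assumed repo
run sudo cp dotfiles/dhcpcd.conf /etc/dhcpcd.conf  # assumed config file
```

Because it is just a shell script, keeping it in the same repository as the config files means the record of "what I did" and the means to redo it are one and the same.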
I agree in principle (I use Subversion though). But maybe I should have created a new system as described above from the downloaded image, and then only performed the actions that pertain to the system itself, like installing the stuff via apt install, doing nothing at all about the user files yet.
Then after that was done I could have created the backup system disk sans the /home/pi stuff.
Such an image would have been much smaller and would be simpler to create and use.
Then I would also maybe use tar to archive the complete /home/pi directory once it had been properly set up as a home.tgz file.
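The full /home/pi archive is a one-liner with tar. A sketch, demonstrated on a throwaway directory rather than the real /home/pi:

```shell
# Real version: tar czf home.tgz -C /home pi
# Demonstration on a stand-in directory:
d=$(mktemp -d)
mkdir -p "$d/pi"
echo "hello" > "$d/pi/notes.txt"
tar czf "$d/home.tgz" -C "$d" pi    # -C: store paths relative to $d
listing=$(tar tzf "$d/home.tgz")
echo "$listing"
```

The -C flag keeps the stored paths relative ("pi/..."), so the archive can be unpacked into /home on any future card with `tar xzf home.tgz -C /home`.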
In such a case, future file changes would probably only affect the /home/pi dir and could be handled more easily, unless they were caused by new apt installs, which would have to be kept track of.
If I am hit again in the future I will probably head that way.
Is there a way to archive away (using tar, for instance) only the changed files in the home directory?
Then in the future one could more easily back up only what has actually changed...
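GNU tar (the one shipped with Raspbian) does exactly that with its --listed-incremental mode: it records file metadata in a snapshot file, and later runs against the same snapshot archive only what changed since. A sketch on a throwaway directory standing in for /home/pi:

```shell
d=$(mktemp -d)
mkdir -p "$d/pi"
echo one > "$d/pi/a.txt"
# Level-0 (full) backup; tar records file metadata in the .snar snapshot:
tar czf "$d/full.tgz" -g "$d/home.snar" -C "$d" pi
# Later, a new file appears; a.txt is left untouched:
echo two > "$d/pi/b.txt"
# Level-1 backup against the same snapshot archives only the new file
# (plus directory entries, which incremental archives always carry):
tar czf "$d/incr.tgz" -g "$d/home.snar" -C "$d" pi
inc=$(tar tzf "$d/incr.tgz")
echo "$inc"
```

To restore, unpack full.tgz first and then each incremental archive in order, each with `--listed-incremental=/dev/null -x`, so deletions are replayed too.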