Whoever designed that has some strange ideas about 'scope inputs.
1. The article says: "The use of a BNC socket for the input ensures that you can use this with proper oscilloscope probe leads; these normally have an X10 switchable attenuator fitted, thus allowing voltages of +/- 25V to be measured."
Every switchable x1/x10 and every fixed x10 probe I have used since 1971 has been designed* for a scope input impedance of 1 Mohm**. The scope in the project has an input impedance of not more than 51 kohm. A x10 probe will function more like a x200 one. I cannot believe the designer would not have noticed that if he had actually tried using a x10 probe with his design.
(* The x10 is accomplished by little more than a 9 Mohm series resistor and a trimming capacitor.
** Strictly speaking the impedance is a resistance and parallel capacitance but at the very low frequencies the Magpi scope can handle the capacitance is not relevant.)
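The mismatch is easy to check from those figures. Assuming the probe is essentially a 9 Mohm series resistor (as in the footnote) and taking the 51 kohm input impedance quoted above, the overall division ratio works out like this (a quick sketch, using only the values quoted in this post):

```python
def probe_division_ratio(r_series_ohms, r_input_ohms):
    """Overall division ratio of a purely resistive probe
    feeding a scope input of the given impedance."""
    return (r_series_ohms + r_input_ohms) / r_input_ohms

# Into the 1 Mohm input the probe was designed for: exactly x10.
print(probe_division_ratio(9e6, 1e6))   # -> 10.0

# Into the MagPi scope's ~51 kohm input: about x177,
# i.e. much nearer x200 than the x10 marked on the probe.
print(probe_division_ratio(9e6, 51e3))
```

So a "x10" probe into this front end divides by roughly 177, which is where the "more like x200" figure comes from.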
2. What's with biasing the earthy end of the input socket to 2.5 volts? It achieves nothing beneficial. In fact it causes the 1 uF electrolytic to become reverse biased on positive input signals. And 99 times out of 100 the earthy side of a probe or test clip will be clipped to the 0 volt rail of whatever is under test. If the 0 volt rail of the scope and the 0 volt rail of the device being scoped are commoned, the bottom left 100k is shorted out anyway. I suggest omitting the two 47 uF capacitors and the 100k resistors across them, and connecting that side of the input to ground just like every conventional 'scope does.