
Any benefit to overclocking gpu_freq for purely headless users?

Posted: Fri Jun 26, 2020 12:21 am
by bassamanator
I thought I read somewhere that there might be benefits to increasing gpu_freq even for headless Raspberry Pi setups. Is this true?

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Fri Jun 26, 2020 12:59 am
by kerry_s
no

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Sat Jun 27, 2020 10:17 pm
by bassamanator
I put this to the test and, indeed, there are no benefits for headless users. I used xhpl to benchmark.

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Sun Jun 28, 2020 3:03 am
by LTolledo
It's like doing a speed test on a NOS-modded car engine without any wheels, axles, or drivetrain fitted to the car. :mrgreen:

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Sun Jun 28, 2020 8:53 am
by crossbar
Any benefit to overclocking gpu_freq for purely headless users?
ANY benefit?
Yes.

As always: the exact answer depends on the details.
Not every headless Pi goes without the GPU.

Example:
If your headless Pi uses the vector-processing QPUs in the VideoCore, there are benefits to increasing gpu_freq. The gains are not linear, because crossing clock-domain boundaries can (sometimes) introduce stalls.

Aside from this: for the "everyday" user, the answer is "no". (Note the lower case.)

michael

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Sun Jun 28, 2020 2:01 pm
by ejolson
crossbar wrote:
Sun Jun 28, 2020 8:53 am
If your headless Pi uses the vector-processing QPUs in the VideoCore, there are benefits to increasing gpu_freq.
Can you recommend a good tutorial on writing QPU code, running it and then checking the performance?

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Thu Jul 02, 2020 9:57 am
by Akane
ejolson wrote:
Sun Jun 28, 2020 2:01 pm
Can you recommend a good tutorial on writing QPU code, running it and then checking the performance?
py-videocore for the Raspberry Pi Zero/1/2/3 (VideoCore IV) and py-videocore6 for the Raspberry Pi 4 (VideoCore VI).
There are some example programs (matrix-matrix multiplication, memory copy, etc.) in the repositories.
You will see lower performance without force_turbo=1 in your /boot/config.txt; that setting disables dynamic frequency scaling of the V3D block.
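For reference, those settings live in /boot/config.txt; the values below are only an illustrative sketch, not recommended numbers for any particular board:

```
# /boot/config.txt (illustrative values, not a recommendation)
force_turbo=1    # pin clocks at maximum; disables dynamic frequency scaling
gpu_freq=550     # raises core/V3D/ISP/H.264 clocks together
```

Note that gpu_freq is a shorthand that sets several GPU-side clocks at once; you can also set core_freq or v3d_freq individually if you only care about one block.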

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Thu Jul 02, 2020 10:03 am
by jamesh
ejolson wrote:
Sun Jun 28, 2020 2:01 pm
Can you recommend a good tutorial on writing QPU code, running it and then checking the performance?
In my opinion you would be much better off learning ARM NEON. It's more standardised, and it gets faster with every version of the Pi.
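For a taste of what NEON code looks like, here is a minimal sketch using the arm_neon.h intrinsics rather than hand-written assembly. The vadd helper is a made-up example, and on a non-ARM host it simply falls back to the scalar loop:

```c
#if defined(__ARM_NEON)
#include <arm_neon.h>
#endif

/* Add two float arrays element-wise; processes 4 lanes per iteration
 * when NEON is available, with a scalar tail for the remainder. */
void vadd(const float *a, const float *b, float *out, int n) {
    int i = 0;
#if defined(__ARM_NEON)
    for (; i + 4 <= n; i += 4) {
        float32x4_t va = vld1q_f32(a + i);   /* load 4 floats from a */
        float32x4_t vb = vld1q_f32(b + i);   /* load 4 floats from b */
        vst1q_f32(out + i, vaddq_f32(va, vb)); /* store a+b */
    }
#endif
    for (; i < n; i++)   /* scalar tail (and non-NEON fallback) */
        out[i] = a[i] + b[i];
}
```

On a 32-bit Pi build, compile with something like `gcc -O2 -mfpu=neon`; on 64-bit ARM, NEON is always available and no flag is needed.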

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Thu Jul 02, 2020 11:46 am
by timg236
bassamanator wrote:
Fri Jun 26, 2020 12:21 am
I thought I read somewhere there might be benefits to increasing gpu_freq even for headless raspberry pi setups. Is this true?
Not really, unless the test is limited by memory bandwidth, in which case boosting the core frequency (also set by gpu_freq) to 550 MHz (the 4Kp60 setting) or 600 MHz might be of some benefit because of the increased AXI bus speed.

However, that's really something to try after you have optimized everything else.
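One way to check whether a workload is actually limited by memory bandwidth, as described above, is to measure large-copy throughput before and after changing the clocks. A minimal sketch in C (the copy_bandwidth helper is hypothetical, and memcpy throughput is only a rough proxy for a workload's real memory traffic):

```c
#define _POSIX_C_SOURCE 199309L
#include <stdlib.h>
#include <string.h>
#include <time.h>

/* Rough large-copy bandwidth probe: returns GiB/s moved by memcpy.
 * If this figure barely moves when you raise gpu_freq (and with it
 * the core/AXI clock), the workload is unlikely to be memory-bound.
 * Returns -1.0 on allocation failure. */
double copy_bandwidth(size_t n_bytes, int reps) {
    char *src = malloc(n_bytes), *dst = malloc(n_bytes);
    if (!src || !dst) { free(src); free(dst); return -1.0; }
    memset(src, 1, n_bytes);   /* fault the pages in before timing */
    memset(dst, 0, n_bytes);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < reps; i++)
        memcpy(dst, src, n_bytes);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    double gib  = (double)n_bytes * reps / (1024.0 * 1024.0 * 1024.0);
    free(src);
    free(dst);
    return gib / secs;
}
```

Call it with buffers much larger than the L2 cache (e.g. 64 MiB) so the copies actually hit DRAM rather than cache.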

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Thu Jul 02, 2020 1:02 pm
by Akane
I learned so many things from the Raspberry Pi and its QPU that I could never have learned otherwise, and with that experience I got a stimulating part-time job that changed my whole life, even though I'm still a student. So I think it's not bad to learn the QPU :D

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Thu Jul 02, 2020 2:39 pm
by jamesh
Akane wrote:
Thu Jul 02, 2020 1:02 pm
I learned so many things from the Raspberry Pi and its QPU that I could never have learned otherwise, and with that experience I got a stimulating part-time job that changed my whole life, even though I'm still a student. So I think it's not bad to learn the QPU :D
The problem is that it's a difficult-to-learn proprietary instruction set that is unlikely to develop as fast as NEON.

Re: Any benefit to overclocking gpu_freq for purely headless users?

Posted: Thu Jul 02, 2020 3:00 pm
by ejolson
jamesh wrote:
Thu Jul 02, 2020 2:39 pm
The problem is that it's a difficult-to-learn proprietary instruction set that is unlikely to develop as fast as NEON.
In my opinion, learning how to leverage heterogeneous computing architectures to accomplish a single task has become exceedingly relevant to machine learning and high-performance computation. It is nice to know that this important skill transcends instruction sets and can be developed using the GPU on the Pi.

This may be an extension of the rule that learning at least one machine language is needed to become good with computers, but it matters not which one.