spurious
Posts: 343
Joined: Mon Nov 21, 2011 9:29 pm

Re: Harness the power of the GPU?

Wed Jan 11, 2012 9:36 pm

Will enough details of the GPU be published for the more enthusiastic to get it to process stuff other than the screen output?

Tomo2k
Posts: 127
Joined: Mon Dec 19, 2011 10:00 pm

Re: Harness the power of the GPU?

Wed Jan 11, 2012 10:12 pm

"OpenGL ES2.0" is actually enough information to do this already.

The original GPGPU technique is:

1. Pass a texture (and optionally geometry) to the GPU.
2. Ask it to render to a texture, using a set of custom shaders that do the actual processing.
3. Read back the texture to get your results.

This will definitely be possible from day 1, as it is all standard OpenGL stuff.
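To make that concrete, here is a minimal sketch of those three steps in C against the stock GL ES 2.0 API. It assumes an EGL context is already current, and it skips error checking and the full-screen-quad draw that actually triggers the shader:

#include <GLES2/gl2.h>

/* Step 1: upload your data to the GPU as a texture. */
GLuint upload_data(int w, int h, const unsigned char *rgba)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* Nearest filtering: we want the raw values back, not blended pixels. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    return tex;
}

/* Step 2: point rendering at an offscreen texture via a framebuffer
   object, so the custom fragment shader writes its results there. */
GLuint attach_output(int w, int h)
{
    GLuint tex, fbo;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, 0);
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);
    return tex;
}

/* Step 3: read the results back into CPU memory. */
void read_back(int w, int h, unsigned char *results)
{
    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, results);
}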

LLVM compilers etc will have to wait a while.

foo
Posts: 52
Joined: Thu Dec 29, 2011 12:49 am

Re: Harness the power of the GPU?

Thu Jan 12, 2012 1:43 am

I don't think "wait a while" is quite right here.  The impression I get is that you will not, as a user, be able to write custom code on the GPU at all.  The APIs exposed to the OS will be what you get.

In theory, the Foundation or Broadcom could release an OpenCL driver or something, but unless you're under NDA with Broadcom, you won't get any information or tools for compiling straight GPU code.

(Whether or not there will be inroads into reverse-engineering portions of it remains to be seen.  It sounds like the code is all self-contained on the GPU, with the CPU only sending data to it, which would make it difficult.)

Svartalf
Posts: 596
Joined: Fri Jul 29, 2011 6:50 pm

Re: Harness the power of the GPU?

Thu Jan 12, 2012 2:16 am

foo said:


I don't think "wait a while" is quite right here.  The impression I get is that you will not, as a user, be able to write custom code on the GPU at all.  The APIs exposed to the OS will be what you get.

Heh...  You're not paying attention here.  You don't NEED OpenCL.  It makes things nicer and easier, but to do GPGPU you only need an API that provides programming of shaders via something like GLSL or HLSL.  You will be able to write custom code (just not AS flexible as OpenCL provides...) to run on the GPU, because shader coding is just that.  Once you show Broadcom the wisdom of maybe providing OpenCL after all, then you MIGHT get to see it.  I suggest a bit of reading on "old school" GPGPU coding before you make any further comments along the lines you've just espoused.
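For what it's worth, a GPGPU "kernel" in this world is just a fragment shader. Here's a sketch of what one can look like, written as GLSL ES source embedded in a C string (the uniform and varying names are illustrative, and the matching vertex shader that supplies v_texcoord is omitted):

/* Element-wise multiply of two data sets stored as textures: each
   output pixel holds four results (one per RGBA channel). */
static const char *multiply_kernel =
    "precision mediump float;\n"
    "uniform sampler2D u_a;   /* first input array  */\n"
    "uniform sampler2D u_b;   /* second input array */\n"
    "varying vec2 v_texcoord; /* which element this fragment is */\n"
    "void main() {\n"
    "    vec4 a = texture2D(u_a, v_texcoord);\n"
    "    vec4 b = texture2D(u_b, v_texcoord);\n"
    "    gl_FragColor = a * b;\n"
    "}\n";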

jamesh
Raspberry Pi Engineer & Forum Moderator
Posts: 26442
Joined: Sat Jul 30, 2011 7:41 pm

Re: Harness the power of the GPU?

Thu Jan 12, 2012 10:18 am

Svartalf said:


foo said:


I don't think "wait a while" is quite right here.  The impression I get is that you will not, as a user, be able to write custom code on the GPU at all.  The APIs exposed to the OS will be what you get.



Heh...  You're not paying attention here.  You don't NEED OpenCL.  It makes things nicer and easier, but to do GPGPU you only need an API that provides programming of shaders via something like GLSL or HLSL.  You will be able to write custom code (just not AS flexible as OpenCL provides...) to run on the GPU, because shader coding is just that.  Once you show Broadcom the wisdom of maybe providing OpenCL after all, then you MIGHT get to see it.  I suggest a bit of reading on "old school" GPGPU coding before you make any further comments along the lines you've just espoused.


OpenCL is a big ask, as it would be a large amount of work (a few man-years). That's expensive in engineering time, and unless you can come up with a decent case for making the money back, it's a bit of a non-starter. People like Nvidia can do it for desktop GPUs as they get used in big compute projects and can make the money back. Not so likely with a mobile GPU, even though it's pretty powerful.
Principal Software Engineer at Raspberry Pi (Trading) Ltd.
Contrary to popular belief, humorous signatures are allowed.
I've been saying "Mucho" to my Spanish friend a lot more lately. It means a lot to him.

spurious
Posts: 343
Joined: Mon Nov 21, 2011 9:29 pm

Re: Harness the power of the GPU?

Thu Jan 12, 2012 6:30 pm

Thanks, but sounds like the answer is no.

I didn't want to use the GPU for graphics, but to supplement the main CPU as raw grunt power to process anything you fed to it.

Oh well... bit of a waste of what sounds like a very fast bit of kit.

Still want a few of these babies for various home projects though, as they are still awesome.

Tomo2k
Posts: 127
Joined: Mon Dec 19, 2011 10:00 pm

Re: Harness the power of the GPU?

Thu Jan 12, 2012 7:03 pm

Sorry to be repeating myself, but it's clear you didn't understand the first time.

The answer is yes, you CAN do GPGPU tasks, because it's OpenGL ES 2.0.

Have a look here for some code examples:

http://gpgpu.org/developer/leg.....phics-apis

The GLSL versions will probably run on the RPi with little to no changes.

- I was really proud of myself for figuring out you could do this when I first started writing GLSL a few years ago. Then I found out it had been done for quite a while and realised that I'm not actually that smart!

OpenCL (and CUDA, etc.) doesn't give you any more power; it's just a framework that makes it a lot easier to write software with GPGPU portions that runs on a variety of GPUs (translating to GLSL variants etc. as required), and helps with assigning appropriate parts of the task to the CPU(s) and GPU(s).
- Mostly doing the housekeeping, to be honest.

As Svartalf said above: OpenCL is a nice-to-have and not (I repeat not) a requirement for GPGPU.

ReverseEsper
Posts: 1
Joined: Wed Sep 28, 2011 11:07 am

Re: Harness the power of the GPU?

Thu Jan 12, 2012 7:06 pm

I think you didn't quite understand.

A texture is nothing more than a set of data on which you can work. A shader is a set of instructions that you can order to be performed on that set of data. GPUs are not made for branch-heavy work, but they are perfect for working on streams of data. It's a bit far-fetched as it requires thinking outside the box, but you can use normal OpenGL instructions to do the math work.
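Concretely, "ordering" the work is just drawing two triangles that cover the output, which runs the fragment shader once per output element. A sketch in C (the program object is assumed to be compiled and linked already, and the a_pos attribute name is illustrative):

#include <GLES2/gl2.h>

/* A quad covering the whole viewport in clip space. */
static const GLfloat quad[] = {
    -1.0f, -1.0f,   1.0f, -1.0f,   -1.0f,  1.0f,
    -1.0f,  1.0f,   1.0f, -1.0f,    1.0f,  1.0f,
};

void run_kernel(GLuint program, int out_w, int out_h)
{
    GLint pos = glGetAttribLocation(program, "a_pos");
    glUseProgram(program);
    glViewport(0, 0, out_w, out_h);   /* one fragment per output element */
    glVertexAttribPointer(pos, 2, GL_FLOAT, GL_FALSE, 0, quad);
    glEnableVertexAttribArray(pos);
    glDrawArrays(GL_TRIANGLES, 0, 6); /* the "go" button */
}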

spurious
Posts: 343
Joined: Mon Nov 21, 2011 9:29 pm

Re: Harness the power of the GPU?

Thu Jan 12, 2012 7:15 pm

ok... I'm not used to accessing APIs to talk to a processor, and your example was about graphics. Hence I obviously got the wrong end of the stick.

I will read about the APIs (this goes against everything I know about writing directly to a processor)... you can see how I misunderstood.

Thanks for being patient, Tomo2k.

Tomo2k
Posts: 127
Joined: Mon Dec 19, 2011 10:00 pm

Re: Harness the power of the GPU?

Thu Jan 12, 2012 7:36 pm

One of the most obvious (thus first) tasks done using GPGPU is image processing - put your image into a texture, do edge detect or whatever on it, and use the output texture.
- This is stupendously fast compared to the same job on a single-core CPU, as many output pixels are computed simultaneously.

Now, take a step back and remember that a pixel is really four 8- or 16-bit numbers (usually RGBA), and thus an image is really just a 2D array of four-number fields.

- Those numbers can mean anything at all.

Thus the "texture(s)" are your data, and the "output texture" is your complete set of results.

The challenge is to package and process your data in such a way as to make efficient use of the GPU.
- For example, if your computation is a tight for-next loop where each iteration doesn't rely on any results from previous iterations, it is likely to be suitable for GPGPU (see the sketch below).
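For instance, a loop like the CPU version below maps almost mechanically onto a fragment shader, with four values (one per RGBA channel) handled per pixel. A sketch in C, with illustrative names:

/* CPU version: a tight loop where no iteration depends on another. */
void scale_add_cpu(int n, float gain,
                   const float *a, const float *b, float *out)
{
    for (int i = 0; i < n; i++)
        out[i] = a[i] * gain + b[i];
}

/* GPU version: the same computation as a GL ES 2.0 fragment shader,
   run once per output pixel over data packed into textures. */
static const char *scale_add_kernel =
    "precision mediump float;\n"
    "uniform sampler2D u_a, u_b;\n"
    "uniform float u_gain;\n"
    "varying vec2 v_texcoord;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(u_a, v_texcoord) * u_gain\n"
    "                 + texture2D(u_b, v_texcoord);\n"
    "}\n";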

Svartalf
Posts: 596
Joined: Fri Jul 29, 2011 6:50 pm

Re: Harness the power of the GPU?

Thu Jan 12, 2012 8:32 pm

JamesH said:

OpenCL is a big ask, as it would be a large amount of work (a few man-years). That's expensive in engineering time, and unless you can come up with a decent case for making the money back, it's a bit of a non-starter. People like Nvidia can do it for desktop GPUs as they get used in big compute projects and can make the money back. Not so likely with a mobile GPU, even though it's pretty powerful.

Yes and no on that one, James.  It's a large task, but apparently some of the mobile GPU vendors think it's quite important, at least to their customers:

http://developer.nvidia.com/opencl
http://www.uplinq.com/2011/pdf.....evices.pdf
http://www.arm.com/products/mu.....i-t604.php
http://www.imgtec.com/forum/fo.....sp?TID=194

If it were not something sought after, you wouldn't see those four links for stuff either in progress or already shipping... from the major players in this space.  This is the reason I keep bringing up the subject of people doing old-school GPGPU first: to see if there's really that much usable power in the device.  If there is, you're going to have a reason for doing it, whether Broadcom chooses to go down that path or not.

Svartalf
Posts: 596
Joined: Fri Jul 29, 2011 6:50 pm

Re: Harness the power of the GPU?

Thu Jan 12, 2012 8:40 pm

spurious said:


ok... I'm not used to accessing APIs to talk to a processor, and your example was about graphics. Hence I obviously got the wrong end of the stick.

I will read about the APIs (goes against everything I know about writing directly to a processor).. you can see how I misunderstood.

Thanks for being patient Tomo2k



Heh...  You don't directly write to a GPU unless you're doing the backend for a state machine for OpenGL/OpenCL/OpenVG/etc.  Not even on an open-sourced device like the Radeon or Intel's GMA/etc. GPUs.  You write code to an API that does the desired task, even with OpenCL.  OpenCL just makes it easier to make efficient code because it's more tuned to that task than OpenGL ES is.

selvmarcus
Posts: 18
Joined: Sat Jan 07, 2012 1:21 pm

Re: Harness the power of the GPU?

Tue Jan 17, 2012 11:15 am

Thanks for the helpful link.

Marcus

jamesh
Raspberry Pi Engineer & Forum Moderator
Posts: 26442
Joined: Sat Jul 30, 2011 7:41 pm

Re: Harness the power of the GPU?

Tue Jan 17, 2012 1:03 pm

Svartalf said:


JamesH said:


OpenCL is a big ask, as it would be a large amount of work (a few man-years). That's expensive in engineering time, and unless you can come up with a decent case for making the money back, it's a bit of a non-starter. People like Nvidia can do it for desktop GPUs as they get used in big compute projects and can make the money back. Not so likely with a mobile GPU, even though it's pretty powerful.


Yes and no on that one, James.  It's a large task, but apparently some of the mobile GPU vendors think it's quite important, at least to their customers:

http://developer.nvidia.com/opencl
http://www.uplinq.com/2011/pdf.....evices.pdf
http://www.arm.com/products/mu.....i-t604.php
http://www.imgtec.com/forum/fo.....sp?TID=194

If it were not something sought after, you wouldn't see those four links for stuff either in progress or already shipping... from the major players in this space.  This is the reason I keep bringing up the subject of people doing old-school GPGPU first: to see if there's really that much usable power in the device.  If there is, you're going to have a reason for doing it, whether Broadcom chooses to go down that path or not.


Well, the Nvidia link is a red herring - NVidia already have their desktop OCL stuff, so it's a *relatively* simple task to move to their mobile GPUs. Broadcom would need to start from scratch.

Arm - well, they do it once for multiple Arm licensees - makes it very worthwhile.

Qualcomm - interesting and quite relevant. Good ammo.

PowerVR - from that thread I didn't really see whether they provide OCL drivers or not.

That said, I would like to see OCL stuff for the VideoCore. I'll keep mentioning it, but we already have our hands very full indeed, so it would need the employment of three or four more OCL-experienced engineers, and years of time.
Principal Software Engineer at Raspberry Pi (Trading) Ltd.
Contrary to popular belief, humorous signatures are allowed.
I've been saying "Mucho" to my Spanish friend a lot more lately. It means a lot to him.

Neil
Posts: 98
Joined: Thu Sep 29, 2011 7:10 am
Contact: Website

Re: Harness the power of the GPU?

Tue Jan 17, 2012 2:02 pm

Svartalf said:

Yes and no on that one, James.  It's a large task, but apparently some of the mobile GPU vendors think it's quite important, at least to their customers:
http://developer.nvidia.com/opencl


http://forums.developer.nvidia.....anguage/p1

http://developer.nvidia.com/ar.....port-tegra

http://news.softpedia.com/news.....3746.shtml

http://www.uplinq.com/2011/pdf.....evices.pdf
Qualcomm....  Do let us know when you can download the OpenCL SDK.


http://www.arm.com/products/mu.....i-t604.php


That's an IP block.  Until it is licensed by an ARM customer and instantiated in silicon, it doesn't really "exist" in the physical world.


http://www.imgtec.com/forum/fo.....sp?TID=194


http://www.imgtec.com/forum/fo.....sp?TID=194

In summary: in theory it's possible to compile OpenCL programs to run on almost any GPU.  Whether it is practically possible depends on the support of the IP vendor and the silicon vendor.  Some silicon vendors consider it important enough to expend the considerable effort of adding OpenCL support to their drivers to support their customers; others don't.

zerth
Posts: 8
Joined: Fri Oct 21, 2011 3:13 pm

Re: Harness the power of the GPU?

Tue Feb 07, 2012 6:13 pm

I'm just going to leave this link here:

http://www.phoronix.com/scan.p.....px=MTA1Mzk

"Nouveau developers now having an initial working OpenCL implementation for NVIDIA GeForce graphics hardware on the driver that the Linux community developed themselves via reverse-engineering without NVIDIA's support."

spurious
Posts: 343
Joined: Mon Nov 21, 2011 9:29 pm

Re: Harness the power of the GPU?

Tue Feb 07, 2012 6:31 pm

I am grateful for all the info, and will be reading and trying all aspects of talking to the GPU once I get my hands on a board. I will see if I can get it doing some maths. If I can manage that, I will look at producing a simple GPU maths library, but I am not sure I have all the knowledge and time needed for this... we will see.

Also very grateful for the SoC PDF; not read through it yet, but it looks interesting.

Thanks to all

eric_baird
Posts: 37
Joined: Sun Dec 04, 2011 3:42 pm
Contact: Website

Re: Harness the power of the GPU?

Tue Feb 07, 2012 11:21 pm

Could you use this to do impulse response modelling, for audio?

The idea is to recreate any linear sonic environment by feeding a single "pulse" level-transition through the original environment, recording the waveform that comes back, and then using that waveform as a template that's triggered by any variation in your input signal.

So, if you stand in the middle of a cathedral holding a loudspeaker, feed half a squarewave through the speaker as a click, and record the series of click-echoes and general mush that comes back as a waveform, that gives you the template to load into your IR software. If you then feed any other digital signal through that template, you get the result of how that signal would have sounded if you were listening to it in that cathedral (with that particular signal-source position and listening position).

If you use dummy-head recording or in-ear microphones, your dumb-box-and-template can even recreate the deep sonic effect of the cathedral in 3D.

People use "impulse response" modelling for recreating all sorts of electronic effects (like "sampling" classic effects units or plate reverbs).

The trouble with IR modelling is that it takes lots and lots of repetitive calculations: every input sample value-change needs to be multiplied by every value in the template to produce its own resulting waveform, and then all the waveforms need to be summed. In other words, a straight convolution of the signal with the template, as the sketch below shows.
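A naive direct-form version in C, just to show where the cost comes from (names are illustrative):

/* Direct-form convolution: the output has n + m - 1 samples, and every
   one of the n input samples is multiplied by all m template values,
   i.e. n * m multiply-accumulates. That is why long templates (big
   reverbs) get so expensive. */
void convolve(const float *sig, int n, const float *tmpl, int m, float *out)
{
    for (int i = 0; i < n + m - 1; i++)
        out[i] = 0.0f;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < m; j++)
            out[i + j] += sig[i] * tmpl[j];
}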

Maybe a graphics processor could handle it, dunno. Raspberry Pi-based effects boxes might be cool.
