jamesh
Raspberry Pi Engineer & Forum Moderator
Posts: 20908
Joined: Sat Jul 30, 2011 7:41 pm

Re: Divide By Depth??

Fri Nov 16, 2018 10:58 am

DavidS wrote:
Fri Nov 16, 2018 10:28 am
jamesh wrote:
Fri Nov 16, 2018 10:18 am
DavidS wrote:
Thu Nov 15, 2018 11:32 pm
Remember that there is a fairly big price we are paying for compilers to be capable of doing that level of code analysis, as it can get rather complex. The code analysis eats RAM, CPU time, and disk space. This is why a simple compiler like TCC is so much faster, smaller, and more efficient in its own use of resources: it does not attempt to do any significant analysis.
Which is why people have build machines and build servers. You are slowing down the compile process in order to improve the final result, which is the important bit. You only need to buy a big build rig once, but the improvements to the final output are seen by ALL the machines running the resulting code.

Almost anything you can push into the compile process from the development and execution processes is worth doing.
No, you are speeding up the compile process. If you compile with a less-optimizing compiler in situations where the build with the better-optimizing compiler takes too long, and you optimize only one or two loops (which is usually all that is needed), then you are taking a 2+ hour build and turning it into a 1 hour or less build.

And it is well known that always needing a faster system to compile is a very poor excuse.

For more see my other replies.
Nope, entirely wrong.

Compilers nowadays are fantastic; you should use them and move as much as possible to compile time, even if your compile times increase.
1. Because the slower optimising compile is only done at the end of the development process.
2. Because you are trading off a long compile time for making your final product faster for everyone.
3. Because buying one big compile server is cheaper than having to improve the specs on every machine your binary runs on (both in actual cost and in environmental damage).
4. Because compilers are almost always better than hand-coded assembler, so it's a much faster dev process.
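
As a trivial illustration of moving work to compile time (a hypothetical one-liner; any modern GCC behaves this way):

    # Hypothetical example: the multiply below is folded at compile time,
    # so the generated code simply returns the constant 307200.
    echo 'int area(void) { return 640 * 480; }' > area.c
    gcc -O2 -S area.c    # inspect area.s: no multiply instruction remains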

These are all facts.
Principal Software Engineer at Raspberry Pi (Trading) Ltd.
Please direct all questions to the forum, I do not do support via PM.

jahboater
Posts: 3250
Joined: Wed Feb 04, 2015 6:38 pm

Re: Divide By Depth??

Fri Nov 16, 2018 11:02 am

DavidS wrote:
Fri Nov 16, 2018 10:55 am
And while I have not built gcc in a long time, I am willing to bet it takes quite a bit of time to build at the default optimization level when compiled using gcc itself.
You are right there! It takes about 4.5 hours to build on my Pi3+ (quad-core ARMv8) and about 2 days on a Pi Zero (single-core ARMv6).
(In practice, I cross compile for the Pi0 on the Pi3).
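
A minimal sketch of the idea (an illustration, not my exact setup): build on the Pi 3 while restricting the generated code to the Pi Zero's ARMv6 core, using the ARMv6 hard-float flags Raspbian itself targets:

    # Build on the Pi 3, but emit code the Pi Zero's ARMv6 core can run,
    # so the same binary works on both boards (hypothetical file name).
    gcc -march=armv6 -mfpu=vfp -mfloat-abi=hard -O2 -o prog prog.c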

But then, it's compiling and optimizing some 51 million lines of code at a rate of about 11 million lines of code per hour.
No human could possibly do that.

Actually I think it's pretty amazing that a credit-card-sized computer costing $35 can do it too. :)

Interesting chat!

DavidS
Posts: 3636
Joined: Thu Dec 15, 2011 6:39 am
Location: USA
Contact: Website

Re: Divide By Depth??

Fri Nov 16, 2018 11:38 am

Yes, a very interesting chat; thank you for the stimulation of thought.
The Raspberry Pi is an ARM-based computer that runs many different and varied Operating Systems, including Linux, RISC OS, BSD, and many more.

DavidS
Posts: 3636
Joined: Thu Dec 15, 2011 6:39 am
Location: USA
Contact: Website

Re: Divide By Depth??

Fri Nov 16, 2018 11:49 am

jamesh wrote:
Fri Nov 16, 2018 10:58 am
DavidS wrote:
Fri Nov 16, 2018 10:28 am
jamesh wrote:
Fri Nov 16, 2018 10:18 am


Which is why people have build machines and build servers. You are slowing down the compile process in order to improve the final result, which is the important bit. You only need to buy a big build rig once, but the improvements to the final output are seen by ALL the machines running the resulting code.

Almost anything you can push into the compile process from the development and execution processes is worth doing.
No, you are speeding up the compile process. If you compile with a less-optimizing compiler in situations where the build with the better-optimizing compiler takes too long, and you optimize only one or two loops (which is usually all that is needed), then you are taking a 2+ hour build and turning it into a 1 hour or less build.

And it is well known that always needing a faster system to compile is a very poor excuse.

For more see my other replies.
Nope, entirely wrong.

Compilers nowadays are fantastic; you should use them and move as much as possible to compile time, even if your compile times increase.
1. Because the slower optimising compile is only done at the end of the development process.
2. Because you are trading off a long compile time for making your final product faster for everyone.
3. Because buying one big compile server is cheaper than having to improve the specs on every machine your binary runs on (both in actual cost and in environmental damage).
4. Because compilers are almost always better than hand-coded assembler, so it's a much faster dev process.

These are all facts.
No, those are mostly facts, not all. Taking your list in order:
  1. Yes, the optimization is done at the end of the development process.
  2. The difference in performance between the object code produced by a less-optimizing compiler that is many times faster at compiling and the code created by an extreme, over-the-top optimizing compiler is not enough to justify the cost. 99% of the code does not need to be optimized, as it is rarely executed (often only once per run of the program); see the sketch below.
  3. What does that have to do with the topic at hand? We are talking about getting everything to run better on lower-spec machines, not improving the specs of everyone's machine.
  4. More true than it used to be, though still open for debate when dealing with those who eat/sleep/breathe optimization and assembly.
I will agree that keeping the codebase 100% in the HLL, and not including the hand-optimized assembler in the codebase that is distributed, is good practice; that preserves portability. Even then, it is only a very small fraction of the code that needs optimizing, so long as it is correctly identified.
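
To make point 2 concrete, a minimal sketch (hypothetical file names): compile the bulk of a project quickly and spend the optimizer's time only where it pays:

    # Most of the code runs rarely: compile it fast and unoptimized
    # (all file names here are hypothetical).
    gcc -O0 -c main.c ui.c config.c
    # Only the translation unit with the hot loop gets the slow, heavy treatment.
    gcc -O3 -c inner_loop.c
    gcc -o app main.o ui.o config.o inner_loop.o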

I thank you for the "BASH IT HARDER to make it fit" point of view you provide (use more powerful systems forever). Personally I prefer the "work it to do more with less force" approach (get it to use less processing time on lower-end systems [both the compiler and the end product]).

But I try to save power, because I want to see everyone on solar and wind power, and completely off-grid for electricity, like I am.
The Raspberry Pi is an ARM-based computer that runs many different and varied Operating Systems, including Linux, RISC OS, BSD, and many more.

Heater
Posts: 10236
Joined: Tue Jul 17, 2012 3:02 pm

Re: Divide By Depth??

Fri Nov 16, 2018 12:36 pm

Let's talk about saving power.

Optimized code takes less time to run, so it saves power on every machine that runs it. That is billions of machines if we are talking about operating system code. The guys at the likes of Google and Facebook love optimized code because it saves them huge amounts on their electric bills. Never mind Joe home-PC user worrying about his electric bill, or Jane phone user getting annoyed at short battery life.

Because it takes less time to run and service requests, you need fewer machines in your server farms. The guys at the likes of Google and Facebook love optimized code because it saves them buying more hardware and building bigger data centers.

Ergo, unoptimized code uses more power and consumes more resources. If you want to be green, then optimize.

Hand-written assembler would end up being the least optimized code. There just isn't the manpower to be hand-crafting assembler, and nobody would want to be doing it all day every day anyway. Besides, you can't beat the compiler at optimization.

Ergo, writing in assembler, especially Intel assembler, is bad for the environment and causes global warming. Only a bad person would ship hand crafted assembler. :)

I think a couple of things have been forgotten here. Certainly David is right that long compile times can be annoying, but:

1) If optimizing compilation is taking too long, then while iterating on your code development turn optimization off with "-O0" (see the sketch after this list).

2) Typically, when iterating on code during development, one is not recompiling the whole project with every little change. Projects are made of lots of small files; using make, only the file(s) one has just edited get recompiled. No time at all.

3) If it takes all night to recompile a big project, say a latest kernel or GCC or Qt for a Raspi, it's not a big problem; one does not do that very often and it can be run overnight.
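
For point 1, a sketch of the two build modes (hypothetical file name):

    # During development: fast compiles, full debug info (hypothetical file).
    gcc -O0 -g -c render.c
    # Final release build, done once at the end: let the optimizer take its time.
    gcc -O2 -c render.c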

DavidS
Posts: 3636
Joined: Thu Dec 15, 2011 6:39 am
Location: USA
Contact: Website

Re: Divide By Depth??

Fri Nov 16, 2018 12:39 pm

@Heater:
I agree with the concept of saving power by optimizing. That question is not at issue; the question is the balance between the power used by the optimization and the total power saved. This is one of the reasons I am opposed to the use-a-bigger-system method, and why I advocate the methods of optimization that I advocate.

See my rant thread in Off Topic. Just posted.

You should well know that I am pro-optimization, very much so. And I have no problem with a reasonable amount of help from the compiler, though not an unreasonable amount.
The Raspberry Pi is an ARM-based computer that runs many different and varied Operating Systems, including Linux, RISC OS, BSD, and many more.

jahboater
Posts: 3250
Joined: Wed Feb 04, 2015 6:38 pm

Re: Divide By Depth??

Fri Nov 16, 2018 1:02 pm

Heater wrote:
Fri Nov 16, 2018 12:36 pm
2) Typically, when iterating on code during development, one is not recompiling the whole project with every little change. Projects are made of lots of small files; using make, only the file(s) one has just edited get recompiled. No time at all.
Yes, that's the key.
People complaining about long compilation times have badly structured projects and/or don't know how to use make.
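
A sketch of what that looks like in practice (a hypothetical project with a conventional Makefile):

    # Edit (or here, just touch) one source file out of many, then rebuild
    # (file names are hypothetical):
    touch gfx.c
    make    # only gfx.c is recompiled and the program relinked;
            # every other object file is reused as-is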

DavidS
Posts: 3636
Joined: Thu Dec 15, 2011 6:39 am
Location: USA
Contact: Website

Re: Divide By Depth??

Fri Nov 16, 2018 1:29 pm

jahboater wrote:
Fri Nov 16, 2018 1:02 pm
Heater wrote:
Fri Nov 16, 2018 12:36 pm
2) Typically, when iterating on code during development, one is not recompiling the whole project with every little change. Projects are made of lots of small files; using make, only the file(s) one has just edited get recompiled. No time at all.
Yes, that's the key.
People complaining about long compilation times have badly structured projects and/or don't know how to use make.
Fair point, except it misses a big secondary issue:
While a rebuild may only take a few minutes, the initial build already took the full time, and the cumulative time of rebuilds adds up over the duration of a project.
The Raspberry Pi is an ARM-based computer that runs many different and varied Operating Systems, including Linux, RISC OS, BSD, and many more.

DavidS
Posts: 3636
Joined: Thu Dec 15, 2011 6:39 am
Location: USA
Contact: Website

Re: Divide By Depth??

Fri Nov 16, 2018 2:23 pm

This has gotten off topic to the point of absurdity. So I am going to bow out of this one.

I do thank everyone for the enlightening conversation. I hope everyone remembers to save power in all of their activities, including programming.
The Raspberry Pi is an ARM-based computer that runs many different and varied Operating Systems, including Linux, RISC OS, BSD, and many more.

Heater
Posts: 10236
Joined: Tue Jul 17, 2012 3:02 pm

Re: Divide By Depth??

Fri Nov 16, 2018 6:05 pm

DavidS,
This has gotten off topic to the point of absurdity.
Indeed it has.
I hope everyone remembers to save power in all of their activities, including programming.
Indeed we do.

Energy is money. And that is in short supply. That is why we make the case that we do.

I suspect that the reason we have had this debate is that, if one is living off grid with a solar panel and a battery, the solutions to saving energy locally may be a bit different from those for the rest of global civilization. Just guessing.
