Branchless Programming (Geek)
-
At first I thought it was an old, retro video talking about old tech and old techniques, but apparently it was made fairly recently.
Link to video
Some types of programmers will appreciate this or hate this more than others. I look at it as just another tool in the toolbox.
-
OK, I got through the first couple of examples and he has a point. The CPU will be doing more arithmetic, but fewer mispredicted branches, and the pipeline flush after a misprediction costs more than the extra work.
However, in many cases it sacrifices readability: the poor schmuck who comes along behind you has to dig in and figure out what you were trying to do, which costs far more human time, unless you are a great programmer and document your routines appropriately (which I always did when there was any complexity at all).
So you are left, as always, with the age-old choice of efficiency vs. maintainability.
-
This is nothing new. It's as old as my career in IT is long. I know this "technique" as boolean logic. We were forced to use it in our early days (early 1980s) as our development/runtime environment had no branching commands like If, etc.
My father, my brother, and I wrote an entire payroll and AR/AP system using boolean logic. Not a single If statement to be found.
-
Lol "Often it really pays to have a look at the assembler the compiler is producing to check whether your branchless code is giving any benefits".
He misspelled "this is a useless technique, of interest only out of curiosity, because compilers are better than you are at this anyway. Also, if you find yourself interrogating your compiler's assembly output for efficiency, I hope you're programming for a nuclear reactor where milliseconds count, and then asking yourself why you're using some random implementation of a C++ compiler anyway".
That's quite a misspelling, but it can happen. The probability of it happening is the same as the probability of a branchless technique being a good idea in modern software programming.
I'm old enough to have used inline assembly in Borland Turbo C++, for a game that wrote directly to the VGA Mode-X graphics buffer. I bought a book by Michael Abrash that explained how.