Bitwise Right Shift vs Divide by 2
807589 · Nov 30 2008 (edited Nov 30 2008)

Something that just wandered across my mind.
Bitwise operators have their uses, and I've always known about right/left shifting to divide/multiply by 2. I recently did some empirical analysis on an algorithm using this technique, and I had to iterate it over a trillion times before I saw any timing difference, and even then it was on the order of 0.0001 milliseconds.
Is the performance argument for using a bitwise shift over a standard divide still applicable on modern processors? Any algorithm I write that does this kind of operation probably won't loop more than a few million times. For readability/understandability I may as well stick with dividing by two.
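For concreteness, here is a minimal Java sketch of the two forms (the method names `halfShift`/`halfDivide` are just for illustration). One subtlety worth keeping in mind alongside the performance question: for negative odd values the two are not even equivalent, because `>>` rounds toward negative infinity while integer `/` rounds toward zero.

```java
public class ShiftVsDivide {
    // Divide by two using an arithmetic right shift.
    static int halfShift(int x) {
        return x >> 1;
    }

    // Divide by two using integer division.
    static int halfDivide(int x) {
        return x / 2;
    }

    public static void main(String[] args) {
        // For non-negative values the two agree.
        System.out.println(halfShift(10) + " vs " + halfDivide(10)); // 5 vs 5

        // For negative odd values they differ:
        // -7 >> 1 == -4 (rounds toward negative infinity),
        // -7 / 2  == -3 (rounds toward zero).
        System.out.println(halfShift(-7) + " vs " + halfDivide(-7)); // -4 vs -3
    }
}
```

So even before benchmarking, the divide is arguably the safer default unless you know the operand is non-negative (and a JIT compiler can typically do this strength reduction itself in that case).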
Perhaps it is only significantly faster in certain languages (e.g. C) or without a JIT?
Regards,
Rob.