Integer division, or float multiplication?
If one has to calculate a fraction of a given int value, say:
int j = 78;
int i = 5 * j / 4;
Is this faster than doing:
int i = 1.25 * j; // ?
If it is, is there a conversion factor one could use to decide which to
use, as in: how many int divisions can be done in the same time as one
float multiplication?
Edit: I think the comments make it clear that the floating-point math will
be slower, but the question is: by how much? If I need to replace each
float multiplication by $N$ int divisions, for what $N$ does it stop being
worth it?
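One way to get an actual number for a particular machine and compiler is to time both loops directly. A rough sketch (the iteration count and the volatile sink are my own choices, and the result depends heavily on compiler, optimization flags, and CPU, so treat the output only as a ballpark):

#include <stdio.h>
#include <time.h>

#define ITERS 100000000

int main(void)
{
    volatile int sink = 0;   /* volatile so the loops are not optimized away */
    clock_t t0;

    t0 = clock();
    for (int j = 1; j <= ITERS; ++j)
        sink = 5 * j / 4;    /* integer multiply + divide */
    printf("int mul/div: %.3f s\n", (double)(clock() - t0) / CLOCKS_PER_SEC);

    t0 = clock();
    for (int j = 1; j <= ITERS; ++j)
        sink = 1.25 * j;     /* double multiply + truncation back to int */
    printf("double mul:  %.3f s\n", (double)(clock() - t0) / CLOCKS_PER_SEC);

    (void)sink;
    return 0;
}

Note that with a constant divisor like 4 the compiler will typically replace the division with cheaper shift arithmetic, so the measurement may say more about the generated code than about a raw divide instruction.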