I am working through some book assignments and I am running into some terminology that doesn't make sense to me.

Implement this class...
Class Cis Float
a * b
14 (exp # bits)
113 (sigment # bits)
What does the author mean by "exp # bits", "sigment # bits", and "10000.....0 bias"? I have been programming for a long time and have never run into this terminology. I am tempted to just disregard these as an attempt to distract the programmer from the main objective, as I could implement this class easily without regard to these miscellaneous terms.

Please provide clarification if possible/applicable.
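For what it's worth, here is my guess at what the class might be parameterizing: a floating-point format described by its field widths. "exp # bits" would be the width of the exponent field, "sigment # bits" is presumably a typo for "significand # bits" (the mantissa), and "10000...0 bias" would be an exponent bias whose binary form is a 1 followed by zeros, i.e. 2^(exp_bits - 1). All the names below are my assumptions, not the book's:

```python
class Float:
    """Sketch of a float format descriptor (my guess at the book's intent)."""

    def __init__(self, exp_bits, significand_bits):
        self.exp_bits = exp_bits                  # "exp # bits": exponent field width
        self.significand_bits = significand_bits  # "sigment # bits": likely "significand"
        # "10000...0 bias": binary 1 followed by zeros, i.e. 2^(exp_bits - 1)
        self.bias = 1 << (exp_bits - 1)


fmt = Float(14, 113)   # the bit counts from the assignment
print(fmt.bias)        # prints 8192 (2**13)
```

If that reading is right, the terms are not a distraction: `a * b` on such a class would need the exponent widths and bias to align and round the product correctly.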