I’ll bet anybody $100 that there will be practically useful quantum computers before there’s a native hardware implementation of POSIT or UNUM in a widely available general-purpose processor (something a normal person can buy).

Every trend in the industry right now favors fixed-point or low-precision floating-point for AI. It’s known that one can build fast and accurate higher- and arbitrary-precision arithmetic from AI widgets (https://arxiv.org/abs/1904.06376), and this, not crazy new formats with which nobody has any serious experience, is the trend that numerical computing is going to exploit in the future.
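
To make that concrete, here is a toy sketch in C of the splitting idea (my own illustration, not the paper's implementation; the to_bf16 helper is just a truncation model of a bfloat16 unit): split a binary32 operand into bfloat16 pieces and accumulate the partial products in a wider register, and you recover most of the precision a single low-precision multiply throws away.

#include <math.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Model a bfloat16 value: keep the top 16 bits of the binary32 encoding
   (sign, exponent, 7 stored mantissa bits), zero the rest.  Truncation
   only, for simplicity. */
static float to_bf16(float x)
{
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);
    bits &= 0xFFFF0000u;
    memcpy(&x, &bits, sizeof bits);
    return x;
}

int main(void)
{
    float x = 1.0f / 3.0f;
    float y = 3.1415927f;

    /* Two-way split: x ~= xh + xl, with both pieces representable in bf16. */
    float xh = to_bf16(x), xl = to_bf16(x - xh);
    float yh = to_bf16(y), yl = to_bf16(y - yh);

    /* One bf16 multiply: operand truncation costs ~2^-8 relative accuracy. */
    float naive = to_bf16(x) * to_bf16(y);

    /* Three bf16 multiplies accumulated in FP32 (as a bf16 FMA unit with an
       FP32 accumulator would do); the xl*yl term is negligible and dropped. */
    float split = xh * yh + (xh * yl + xl * yh);

    double exact = (double)x * (double)y;
    printf("exact        : %.10f\n", exact);
    printf("one multiply : %.10f  (err %.2e)\n", naive, fabs(naive - exact));
    printf("three mults  : %.10f  (err %.2e)\n", split, fabs(split - exact));
    return 0;
}

A two-way split recovers roughly twice the bf16 mantissa, and more pieces buy more digits; that is the sense in which the low-precision widgets compose into higher precision on existing hardware.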

If somebody believes in new floating-point formats, they should release a library for a widely available FPGA and prove out the method on at least a few million lines of numerical code, to show the community why it should throw away decades of numerical software and rewrite a few billion lines of code in a datatype that has yet to be standardized.

Jeff
--
Jeff Hammond
jeff.science@gmail.com
http://jeffhammond.github.io/