Add an option to convert numbers between double and decimal in the same way as node.js, python3, ruby, rust or so #172

@kubo

Description

Request in short

Could you add an option to use dtoa.c by David M. Gay, or similar code, to convert values between DPI_NATIVE_TYPE_DOUBLE and DPI_ORACLE_TYPE_NUMBER?

dtoa.c is used by ruby and python3.

Though I don't know whether node.js uses dtoa.c, the ECMAScript specification points implementers to dtoa.c as follows.

Implementers of ECMAScript may find useful the paper and code written by David M. Gay for binary-to-decimal conversion of floating-point numbers:

Gay, David M. Correctly Rounded Binary-Decimal and Decimal-Binary Conversions. Numerical Analysis, Manuscript 90-10. AT&T Bell Laboratories (Murray Hill, New Jersey). 30 November 1990. Available as
http://ampl.com/REFS/abstracts.html#rounding. Associated code available as
http://netlib.sandia.gov/fp/dtoa.c and as
http://netlib.sandia.gov/fp/g_fmt.c and may also be found at the various netlib mirror sites.

Rust uses the same algorithm as dtoa.c as a fallback for the Grisu algorithm.

I think it can be included in ODPI-C because MySQL includes code based on dtoa.c.

Background

As far as I have checked, node.js, python3, ruby and rust convert numbers between double and decimal in the same way:

  • Different double values are converted to different string values, even when the difference is only one bit.
  • When a double value is converted to a string and that string is converted back to a double, the original and round-tripped double values are exactly the same.

Here is Python 3 code to check the above. Every value between 2.3 and 2.30000000000001 is converted to a different string, and no round-trip errors are printed. I wrote similar programs for node.js, ruby, rust and go; they printed exactly the same results.

import sys
from struct import pack, unpack

dstart = 2.3
dend = 2.30000000000001
istart = unpack('Q', pack('d', dstart))[0]
iend = unpack('Q', pack('d', dend))[0]

# increment bits in floating point number one by one from dstart to dend
for i in range(istart, iend + 1):
    dval = unpack('d', pack('Q', i))[0]
    sval = str(dval)
    round_trip_dval = float(sval)
    if dval != round_trip_dval:
        print("round trip error {} != {}".format(dval, round_trip_dval), file=sys.stderr)
    print("0x{:x},{}".format(i, sval))

I think this issue is resolved when decimal-to-binary conversion is exactly the same in ODPI-C as in the languages that use ODPI-C.

The floating-point number 2.3 in node.js, python3, ruby and rust consists of the bits 100000000000010011001100110011001100110011001100110011001100110.
It is inserted into an Oracle number column as 2.3. But when the column is fetched using DPI_NATIVE_TYPE_DOUBLE, the fetched value consists of the bits 100000000000010011001100110011001100110011001100110011001100111 (the last bit is set). It is displayed as 2.3000000000000003, because different double values are converted to different string values in those languages. If ODPI-C converted the Oracle number, represented as a decimal number, to a floating-point number the way those languages do, it would be displayed as 2.3.
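The mismatch can be reproduced directly by reinterpreting the two bit patterns. Below is a small Python 3 sketch in the same style as the script above; the hex constants are the IEEE 754 encodings of the two bit strings quoted in this paragraph:

```python
from struct import pack, unpack

# 2.3 as encoded by node.js/python3/ruby/rust: 0x4002666666666666
orig = unpack('<d', pack('<Q', 0x4002666666666666))[0]
# the value fetched back via DPI_NATIVE_TYPE_DOUBLE: last bit set
fetched = unpack('<d', pack('<Q', 0x4002666666666667))[0]

print(str(orig))     # 2.3
print(str(fetched))  # 2.3000000000000003
```

The two doubles differ only in the last mantissa bit, yet a shortest-round-trip converter must print them differently, which is exactly why the fetched value shows the long string.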

FYI

Ruby includes dtoa.c from util.c as follows, renaming its public function names.

In https://github.com/ruby/ruby/blob/v3_1_1/util.c#L610-L616:

#undef strtod
#define strtod ruby_strtod
#undef dtoa
#define dtoa ruby_dtoa
#undef hdtoa
#define hdtoa ruby_hdtoa
#include "missing/dtoa.c"

ruby_strtod is used to convert a string to a double.
ruby_dtoa is used to convert a double to a string.
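For comparison, Python's built-in float() and repr() already play the same two roles as ruby_strtod and ruby_dtoa, since python3 uses David M. Gay's code as noted above. A minimal sketch of the round-trip guarantee:

```python
# float() parses with correct rounding (the strtod role);
# repr() produces the shortest string that round-trips (the dtoa role).
d = float("2.3")
s = repr(d)
assert float(s) == d  # the round trip is exact
print(s)  # prints: 2.3
```

This pairing is the behavior the request asks ODPI-C to optionally match when converting between DPI_NATIVE_TYPE_DOUBLE and DPI_ORACLE_TYPE_NUMBER.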
