Jackson 2.16: faster BigDecimal reads?

@cowtowncoder
4 min read · Nov 23, 2023

(actually already included in 2.15, just not yet measured)

TL;DNR

Reading of JSON documents with heavy BigDecimal usage may be faster by up to 20–25%, AS LONG AS you enable StreamReadFeature.USE_FAST_BIG_NUMBER_PARSER for JsonParser.
Support for this feature was added in Jackson 2.15; performance was tested on Jackson 2.16.0 running on JDK 21.

Background: faster FP reads in 2.15

Earlier I wrote about improved read performance of floating-point values (“Jackson 2.15: yet faster floating-point reads”) — and in particular doubles, including test results for Jackson 2.15.0.

But although the title suggests improvements all around, I did not actually test whether there is a speed-up for reading BigDecimal values. This matters because internal handling of binary (2-based) floating-point values (float, double) differs significantly from handling of decimal (10-based) floating-point values (BigDecimal).

So let’s have another look, this time measuring whether BigDecimal reads too are faster with Jackson 2.16.0.

How and why does BigDecimal differ from Double/double?

But before investigating performance differences, let's consider why there might be some. To do that it is necessary to understand the differences between 2-based (double/Double, float/Float) and 10-based (BigDecimal) floating-point numbers.
A more detailed explanation can be found in Baeldung's "Java Double vs BigDecimal" article; for now let's consider the pros and cons of double:

Double/Float: benefits

  • Fast arithmetic operations: due to fixed size (32/64 bits) and a standardized format (IEEE-754, from 1985!), there has been CPU-level hardware acceleration for the past 30+ years
  • Low memory usage: double and float have compact representations (64/32 bits == 8/4 bytes) — this also speeds up operations via better memory locality
  • Wide value range (for double) — due to the "floating" part of the definition, double in particular can represent a very wide range of big and small numbers (but with caveats regarding precision — there are still only 64/32 bits to differentiate values)

Double/Float: drawbacks

  • Lossy and slow conversions to/from textual representation: whereas written numbers like 126.42 use 10-based representation (digits 0–9), double/float are 2-based — the fractional part is represented as a sum of fractions of the form 1/2^n. This means that numbers like 0.25 (1/2²) can be represented exactly; but some seemingly simple numbers like 0.1 cannot be represented 100% accurately at all: they are actually approximations! As a result, seemingly simple operations — most notably monetary calculations — do introduce non-trivial errors (see the snippet right after this list).
    And not only can there be conversion errors; the conversion process itself is slower due to the need to find the best possible approximation
  • Limited precision: despite impressive dynamic range (for double), precision is still limited, so calculations involving very big and very small numbers in particular lead to significant additional precision loss
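
To make the representation issue concrete, here is a minimal snippet (my own illustration, not from the article) showing how binary doubles only approximate a simple decimal value:

double sum = 0.1 + 0.2;
System.out.println(sum);        // prints 0.30000000000000004 (0.1 and 0.2 are binary approximations)
System.out.println(sum == 0.3); // prints false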

Compared to this, BigDecimal has the following benefits/drawbacks:

  • Benefit: Unlimited precision (with expanding size)
  • Benefit: 100% accurate conversion to/from textual representation due to 10-based representation (illustrated in the snippet right after this list)
  • Benefit: Potentially faster conversion to/from textual representation (due to native 10-based representation)
  • Drawback: higher memory usage, both due to the flexible (expanding) structure and the use of 10-based representation
  • Drawback: slower arithmetic operations
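
As a small illustration of the "exact textual round-trip" and "unlimited precision" points (again my own snippet, not from the article; java.math.BigDecimal is used fully qualified to keep it self-contained):

String text = "12345678901234567.89123456789";
System.out.println(new java.math.BigDecimal(text)); // prints all digits back, exactly
System.out.println(Double.parseDouble(text));       // prints 1.2345678901234568E16: rounded, trailing digits lost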

So: although BigDecimal is generally considered "much slower" than Double/Float, it is actually not certain that the specific case of JSON deserialization should be slower with BigDecimal.

Enabling Fast BigDecimal reading

One important thing to note is that support for fast BigDecimal reads is enabled separately from fast double/float reads. You may want to enable both, but they are controlled separately.
To turn both on, you’d use something like:

JsonFactory f = JsonFactory.builder()
    .enable(StreamReadFeature.USE_FAST_DOUBLE_PARSER)
    .enable(StreamReadFeature.USE_FAST_BIG_NUMBER_PARSER)
    .build();
ObjectMapper mapper = new JsonMapper(f);

// Or, with an existing JsonMapper, enable for an ObjectReader:
ObjectReader r = mapper.reader()
    .with(StreamReadFeature.USE_FAST_DOUBLE_PARSER)
    .with(StreamReadFeature.USE_FAST_BIG_NUMBER_PARSER);
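
And as a quick usage sketch, assuming a hypothetical Price POJO with a BigDecimal amount field (not part of the article or the benchmark), reading with the configured mapper would look something like this:

public class Price {
    public java.math.BigDecimal amount;
}

ObjectReader priceReader = mapper.readerFor(Price.class)
    .with(StreamReadFeature.USE_FAST_BIG_NUMBER_PARSER);
Price price = priceReader.readValue("{\"amount\": 19.99}");
// price.amount is the exact decimal 19.99, parsed with the faster algorithm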

Measurements

For measurements, I added new options to the "Currency" tests I used for earlier articles (part of the jackson-benchmarks project: https://github.com/FasterXML/jackson-benchmarks/).

After cloning the project and building it (mvn clean package), the actual invocation is:

java -Xmx256m -jar target/perf.jar ".*Json.*StdReadVanilla.readCurrencyBig.*" -wi 3 -w 1 -i 5 -r 1 -f 5

and with JDK 21, I get the following results on my Mac Mini:

Benchmark                                          Mode  Cnt      Score     Error  Units
JsonStdReadVanilla.readCurrencyBigDecPojoDefault  thrpt   25  54281.246 ± 457.707  ops/s
JsonStdReadVanilla.readCurrencyBigDecPojoFast     thrpt   25  60107.856 ± 234.320  ops/s

which is basically a 10% speed-up (54,281 → 60,108 ops/s, or about +10.7%) for the test case.

Using similar reasoning as with the double test case (in "Faster FP reads in Jackson 2.15"), we can guesstimate that the "raw" decoding speed-up might be around +50% (since maybe 30% of the total time for the test case is spent on actual decoding).
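
As a rough back-of-the-envelope check of that guesstimate (the ~30% decoding share is an assumption, not a measured value), using the benchmark numbers above:

// seconds per operation, computed from measured throughput
double baselineTime = 1.0 / 54281.246;          // default parser
double fastTime     = 1.0 / 60107.856;          // fast parser
double saved        = baselineTime - fastTime;  // time saved per op (~9.7% of baseline)

double decodeBefore = 0.30 * baselineTime;      // assumed decoding share of baseline time
double decodeAfter  = decodeBefore - saved;     // if all the savings come from decoding
System.out.println(decodeBefore / decodeAfter); // prints ~1.48, i.e. roughly +50%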

Note, too, that for a truly FP-heavy use case — transferring vector data (big arrays of FP values) for, say, GenAI usage — the improvement should be higher, maybe 20–25%, since the majority of the content is numbers.

Takeaways?

Same as in the earlier case, your instinct may be that a 10% speed-up — or even 20–25% for heavier use — is not particularly impressive.

But then again, it's effectively "free": you just need to enable faster reads and that's it.

So it seems like something you would probably want to enable if reading JSON content with BigDecimal values.
