Description
Saving entities that contain byte arrays (byte[]) is very slow, especially for larger arrays.
When debugging this, one can see that the byte[] is converted to a Byte[] along the way. So I assume that every byte in the source array is boxed into an object, which is slow and consumes a lot of memory.
I created a reproducer for this: https://github.com/daspilker/spring-data-jdbc-reproducer
Saving an entity with a 20MB byte array takes more than 3 seconds against an in-memory H2 database. See https://github.com/daspilker/spring-data-jdbc-reproducer/actions/runs/15320075825/job/43101704776#step:4:749
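For context, the setup boils down to an entity with a single binary column; the names below are illustrative, the reproducer linked above contains the actual classes:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.relational.core.mapping.Table;
import org.springframework.data.repository.CrudRepository;

// Illustrative only: a plain entity with a large binary column is enough to
// hit the slow write path.
@Table("example")
record ExampleEntity(@Id Long id, byte[] payload) {}

interface ExampleRepository extends CrudRepository<ExampleEntity, Long> {
}
```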
I created a workaround that bypasses the conversion for byte[] by using a custom JdbcConverter. See https://github.com/daspilker/spring-data-jdbc-reproducer/blob/main/src/test/java/org/example/ExampleRepositoryWorkaroundTest.java#L48
In that case, saving a 20MB byte array only takes a few milliseconds. See https://github.com/daspilker/spring-data-jdbc-reproducer/actions/runs/15320075825/job/43101704776#step:4:791
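The idea behind the workaround is roughly the following. This is only a sketch: the exact hook and method signatures depend on the Spring Data JDBC version, and the linked test contains the actual implementation.

```java
import java.sql.JDBCType;
import java.sql.SQLType;

import org.springframework.data.convert.CustomConversions;
import org.springframework.data.jdbc.core.convert.JdbcTypeFactory;
import org.springframework.data.jdbc.core.convert.JdbcValue;
import org.springframework.data.jdbc.core.convert.MappingJdbcConverter;
import org.springframework.data.jdbc.core.convert.RelationResolver;
import org.springframework.data.relational.core.mapping.RelationalMappingContext;

// Sketch only: short-circuit the write path for byte[] so the array is handed
// to the JDBC driver unchanged instead of being boxed element by element.
class ByteArrayPassThroughConverter extends MappingJdbcConverter {

    ByteArrayPassThroughConverter(RelationalMappingContext context, RelationResolver relationResolver,
            CustomConversions conversions, JdbcTypeFactory typeFactory) {
        super(context, relationResolver, conversions, typeFactory);
    }

    @Override
    public JdbcValue writeJdbcValue(Object value, Class<?> columnType, SQLType sqlType) {
        if (value instanceof byte[] bytes) {
            // Pass the array through as VARBINARY without any per-element conversion.
            return JdbcValue.of(bytes, JDBCType.VARBINARY);
        }
        return super.writeJdbcValue(value, columnType, sqlType);
    }
}
```

Such a converter can then be plugged in by overriding the jdbcConverter bean method of AbstractJdbcConfiguration in the application's JDBC configuration.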
Is there a better way to avoid the conversion? Or is there a better representation for BLOBs than byte[]?
PS: I know that storing large byte arrays in a database is not the best idea, but it should not be unnecessarily inefficient.