Hi,
When iterating over a collection with a large number of documents (> 1 million), our pod ran out of memory. This happens even when we only consume the MongoCursor without performing any other action:
```java
void republishFromDatabase(Consumer<StoreVehicleOperations> onNext, boolean changeStream, boolean stateStore) {
    try (var cursor = mongoTemplate.getCollection(Vehicle.COLLECTION_NAME)
            .withReadPreference(ReadPreference.secondaryPreferred())
            .find()
            .batchSize(batchSize)
            .cursor()) {
        while (cursor.hasNext()) {
            var document = cursor.next();
        }
    }
}
```
With this configuration:
```java
@Bean
public MongoClientSettingsBuilderCustomizer mongoCustomizer() {
    return builder -> builder
            .contextProvider(ContextProviderFactory.create(observationRegistry))
            .uuidRepresentation(UuidRepresentation.JAVA_LEGACY)
            .retryWrites(false)
            .addCommandListener(new MongoObservationCommandListener(observationRegistry));
}
```
I found out that this happens due to the usage of the MongoObservationCommandListener. This listener adds a getMore observation for each getMore command and links them together, so the chain grows with every fetched batch:

```
getMore
getMore
getMore
...
```
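For illustration, here is a minimal sketch of that pattern using Micrometer's Observation API; it is not the listener's actual code, and `simulateGetMoreChain` and `batches` are made-up names for this example:

```java
import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationRegistry;

class GetMoreChainSketch {

    // Hypothetical helper, not part of Spring Data or Micrometer: creates one
    // observation per getMore command and links each one to its predecessor.
    static void simulateGetMoreChain(ObservationRegistry registry, int batches) {
        Observation previous = null;
        for (int i = 0; i < batches; i++) {
            Observation getMore = Observation.createNotStarted("getMore", registry);
            if (previous != null) {
                // Each new observation keeps a strong reference to the previous
                // one, which references the one before it, and so on: the whole
                // chain stays reachable as long as the newest observation does.
                getMore.parentObservation(previous);
            }
            getMore.start();
            getMore.stop();
            previous = getMore;
        }
    }
}
```

With more than a million documents and a moderate batch size, thousands of linked observation contexts accumulate this way, which would explain the steady memory growth during iteration.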
After I removed `.addCommandListener(new MongoObservationCommandListener(observationRegistry))` from my MongoClientSettingsBuilderCustomizer, it works fine.
This configuration works:

```java
@Bean
public MongoClientSettingsBuilderCustomizer mongoCustomizer() {
    return builder -> builder
            .uuidRepresentation(UuidRepresentation.JAVA_LEGACY)
            .retryWrites(false);
}
```
I tried this with:

- spring-boot-starter-data-mongodb-reactive 3.1.0
- spring-boot-starter-data-mongodb 3.1.0