
ValueCache calls are made inside the batch load call #99


Merged
merged 9 commits into master on Aug 9, 2021

Conversation

bbakerman
Member

The ValueCache is an async affair.

Previous PRs such as #91 handled it badly, because they made the "load" call asynchronous with respect to the method returning a CF that might need to be dispatched.

This PR fixes that by approaching it differently.

The BatchLoader call is already asynchronous, so we can consult the ValueCache at that point: given 10 keys, if 6 are found in the cache first, only the remaining 4 need to go to the batch call afterwards.

The results are then combined back together.

This opens up another possibility: "batch cache" lookups. These have been added to ValueCache, and if an implementation can't do a batch lookup it simply delegates back to the singleton get calls (see the sketch below).

For something backed by, say, Redis (which can do batch cache calls) this may improve performance.

The batch size used here is never bigger than the max batch size option.
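
As a rough sketch of that delegation idea (the interface name and signatures below are assumptions for illustration, not the library's exact ValueCache API), a batch lookup can default to falling back on the per-key get calls:

import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

// Hypothetical interface, for illustration only - the signatures approximate
// the idea described above, not the actual ValueCache contract.
interface BatchCapableValueCache<K, V> {

    // per-key lookup that every implementation provides
    CompletableFuture<V> get(K key);

    // batch lookup; the default simply delegates back to N singleton get calls
    default CompletableFuture<List<V>> getValues(List<K> keys) {
        List<CompletableFuture<V>> singleGets = keys.stream()
                .map(this::get)
                .collect(Collectors.toList());
        return CompletableFuture.allOf(singleGets.toArray(new CompletableFuture[0]))
                .thenApply(ignored -> singleGets.stream()
                        .map(CompletableFuture::join)
                        .collect(Collectors.toList()));
    }
}

An implementation backed by a store with a real multi-get (Redis MGET, for example) can override getValues and answer the whole batch in a single round trip.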

@bbakerman bbakerman changed the title ValueCache calls are made before the batch load call ValueCache calls are made inside the batch load call Aug 8, 2021
- private ValueCache<Object, V> determineValueCache(DataLoaderOptions loaderOptions) {
-     return (ValueCache<Object, V>) loaderOptions.valueCache().orElseGet(ValueCache::defaultValueCache);
+ private ValueCache<K, V> determineValueCache(DataLoaderOptions loaderOptions) {
+     return (ValueCache<K, V>) loaderOptions.valueCache().orElseGet(ValueCache::defaultValueCache);
Member Author

not sure why this was <Object,V> - weird

.whenComplete(setValueIntoCacheAndCompleteFuture(cacheKey, future));
}
}
});
Member Author

This was the old async code with respect to the load call. It has been removed; the cache lookup is now done in batch inside the "batch load" call that happens because of a call to dispatch.
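
A minimal sketch of the new flow (the class and its BatchValueCache interface below are invented names standing in for the internal helper; only BatchLoader is the real library interface): when dispatch triggers the batch load, the value cache is consulted first for the whole batch, only the misses go to the BatchLoader, and the results are stitched back together in key order.

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Optional;
import java.util.concurrent.CompletableFuture;

import org.dataloader.BatchLoader;

// Sketch only - CacheThenBatch and BatchValueCache are illustrative names,
// not the PR's actual code; Optional.empty() marks a cache miss here.
class CacheThenBatch<K, V> {

    interface BatchValueCache<K, V> {
        CompletableFuture<List<Optional<V>>> getValues(List<K> keys);
    }

    private final BatchValueCache<K, V> valueCache;
    private final BatchLoader<K, V> batchLoader;

    CacheThenBatch(BatchValueCache<K, V> valueCache, BatchLoader<K, V> batchLoader) {
        this.valueCache = valueCache;
        this.batchLoader = batchLoader;
    }

    CompletableFuture<List<V>> load(List<K> keys) {
        // consult the value cache first, in one batch call, when dispatch fires
        return valueCache.getValues(keys).thenCompose(cached -> {
            List<K> missedKeys = new ArrayList<>();
            for (int i = 0; i < keys.size(); i++) {
                if (!cached.get(i).isPresent()) {
                    missedKeys.add(keys.get(i));          // only misses go to the batch loader
                }
            }
            if (missedKeys.isEmpty()) {
                List<V> all = new ArrayList<>();
                cached.forEach(opt -> all.add(opt.get()));
                return CompletableFuture.completedFuture(all);
            }
            return batchLoader.load(missedKeys).toCompletableFuture().thenApply(loaded -> {
                // stitch cached hits and freshly loaded values back into key order
                List<V> combined = new ArrayList<>();
                Iterator<V> loadedIter = loaded.iterator();
                for (Optional<V> opt : cached) {
                    combined.add(opt.isPresent() ? opt.get() : loadedIter.next());
                }
                return combined;
            });
        });
    }
}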

@@ -2,28 +2,26 @@

import org.dataloader.annotations.Internal;

Member Author

changed to supplier assert messages - better performance, since the message string is only built if the assertion actually fails
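
A minimal sketch of such a supplier-based assertion helper (class name and exception type are illustrative, modelled on the nonNull call visible in the diff below):

import java.util.function.Supplier;

// Sketch only - not the library's actual Assertions code.
final class AssertSketch {
    // the message string is only constructed on the failure path,
    // so the happy path pays no string-building cost
    static <T> T nonNull(T value, Supplier<String> messageSupplier) {
        if (value == null) {
            throw new NullPointerException(messageSupplier.get());
        }
        return value;
    }
}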

}
}

- CompletableFuture<V> invokeLoaderImmediately(K key, Object keyContext) {
+ CompletableFuture<V> invokeLoaderImmediately(K key, Object keyContext, boolean cachingEnabled) {
Member Author

This is used when batching is off. We can still use the value cache here.
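
A rough sketch of what the new flag enables (the helpers invokeLoader and getCacheKey, and the convention that a cache miss completes the returned future exceptionally, are assumptions here, not confirmed details of this PR):

// Sketch only - with batching off, a load can still consult the value cache
// first and populate it after calling the loader.
CompletableFuture<V> loadImmediately(K key, Object keyContext, boolean cachingEnabled) {
    if (!cachingEnabled) {
        return invokeLoader(key, keyContext);        // no caching: straight to the loader
    }
    Object cacheKey = getCacheKey(key);
    return valueCache.get(cacheKey)                  // try the external value cache first
            .handle((cachedValue, cacheMiss) -> {
                if (cacheMiss == null) {
                    return CompletableFuture.completedFuture(cachedValue);   // cache hit
                }
                return invokeLoader(key, keyContext)                         // cache miss
                        .thenCompose(v -> valueCache.set(cacheKey, v).thenApply(ignored -> v));
            })
            .thenCompose(cf -> cf);                  // flatten the nested future
}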

boolean completeValueAfterCacheSet = loaderOptions.getValueCacheOptions().isCompleteValueAfterCacheSet();
if (completeValueAfterCacheSet) {
return nonNull(valueCache
.setValues(missedKeys, missedValues), () -> "Your ValueCache.setValues function MUST return a non null promise")
Member

promise is confusing: You should stick with CF imho

@bbakerman bbakerman added this to the 3.x milestone Aug 9, 2021
@bbakerman bbakerman merged commit 8836d7e into master Aug 9, 2021