ValueCache calls are made inside the batch load call #99
Conversation
…d function and not the load method
- private ValueCache<Object, V> determineValueCache(DataLoaderOptions loaderOptions) {
-     return (ValueCache<Object, V>) loaderOptions.valueCache().orElseGet(ValueCache::defaultValueCache);
+ private ValueCache<K, V> determineValueCache(DataLoaderOptions loaderOptions) {
+     return (ValueCache<K, V>) loaderOptions.valueCache().orElseGet(ValueCache::defaultValueCache);
not sure why this was <Object,V> - weird
        .whenComplete(setValueIntoCacheAndCompleteFuture(cacheKey, future));
        }
    }
});
This was the old async code with respect to the load call. It has been removed, and the cache lookup is now done in batch inside the "batch load" call that happens because of a call to dispatch.
@@ -2,28 +2,26 @@
import org.dataloader.annotations.Internal;
Changed to supplier-based assert messages - better performance, since the message string is only built if the assertion actually fails.
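To illustrate the comment above, here is a minimal sketch of a supplier-based assertion helper. The `nonNull` helper here is illustrative, not the project's actual implementation; the point is that the `Supplier<String>` lambda is only invoked on failure, so the happy path pays no string-building cost.

```java
import java.util.function.Supplier;

public class AssertSketch {
    // Hypothetical helper: the message is supplied lazily, so the
    // string is only constructed if the check actually fails.
    static <T> T nonNull(T value, Supplier<String> msg) {
        if (value == null) {
            throw new NullPointerException(msg.get());
        }
        return value;
    }

    public static void main(String[] args) {
        // Happy path: the lambda is never invoked, no concatenation happens.
        String ok = nonNull("hello", () -> "value for key " + 42 + " must not be null");
        System.out.println(ok);
    }
}
```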
- CompletableFuture<V> invokeLoaderImmediately(K key, Object keyContext) {
+ CompletableFuture<V> invokeLoaderImmediately(K key, Object keyContext, boolean cachingEnabled) {
This is used when batching is off. We can still cache here.
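A simplified, synchronous sketch of the idea behind the new `cachingEnabled` flag: even with batching off, the loader can consult the cache first and store the result it produces. The names and the `Map`-backed cache here are illustrative stand-ins, not the actual `DataLoader` internals.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.function.Function;

public class ImmediateLoadSketch {
    // Hypothetical stand-in for the value cache.
    static Map<String, CompletableFuture<String>> cache = new HashMap<>();

    // Sketch of invokeLoaderImmediately(key, keyContext, cachingEnabled):
    // batching is off, but a cache lookup still happens before the loader runs.
    static CompletableFuture<String> invokeLoaderImmediately(
            String key, boolean cachingEnabled, Function<String, String> loader) {
        if (cachingEnabled && cache.containsKey(key)) {
            return cache.get(key);  // cache hit: skip the loader entirely
        }
        CompletableFuture<String> loaded =
                CompletableFuture.completedFuture(loader.apply(key));
        if (cachingEnabled) {
            cache.put(key, loaded); // remember the result for next time
        }
        return loaded;
    }
}
```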
boolean completeValueAfterCacheSet = loaderOptions.getValueCacheOptions().isCompleteValueAfterCacheSet();
if (completeValueAfterCacheSet) {
    return nonNull(valueCache
        .setValues(missedKeys, missedValues), () -> "Your ValueCache.setValues function MUST return a non null promise")
"promise" is confusing: you should stick with CompletableFuture (CF) imho.
The ValueCache is an async affair.

Previous PRs such as #91 handled it badly because they made the "load" call asynchronous with respect to the method returning a CF that might need to be dispatched.

This PR fixes that by approaching it differently. The BatchLoader is an async method, so we can call the ValueCache then: given 10 keys, if it gets 6 from the cache first, it need only make a batch call for the remaining 4 afterwards. The results are then combined back together.
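The cache-first flow described above can be sketched as follows. This is a simplified, synchronous illustration (the real code works with CompletableFutures), and `loadAll` and its parameters are hypothetical names, not the library's API: take cache hits, batch-load only the misses, then stitch results back together in the original key order.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class BatchCombineSketch {
    // Given N keys: answer cache hits from the value cache, make ONE batch
    // call for the misses only, then recombine in original key order.
    static List<String> loadAll(List<String> keys,
                                Map<String, String> valueCache,
                                Function<List<String>, List<String>> batchLoader) {
        List<String> missedKeys = keys.stream()
                .filter(k -> !valueCache.containsKey(k))
                .collect(Collectors.toList());
        // Single batch call covering only the cache misses.
        List<String> missedValues = batchLoader.apply(missedKeys);
        Iterator<String> it = missedValues.iterator();
        List<String> results = new ArrayList<>();
        for (String key : keys) {
            results.add(valueCache.containsKey(key) ? valueCache.get(key) : it.next());
        }
        return results;
    }
}
```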
This opens another possibility: being able to do "batch cache" lookups. A batch lookup method has been added to ValueCache, and if an implementation can't do batch lookups it simply delegates back to the singleton get calls. For something backed by, say, Redis (which can do batch cache calls), this may improve performance.
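The delegation pattern described above can be sketched as a default interface method. This `SimpleValueCache` is a simplified illustration, not the actual ValueCache interface (signatures and error handling may differ): implementations that cannot batch inherit a default that falls back to one `get` per key, while a Redis-backed one could override it with a single multi-key round trip.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;

// Simplified sketch: a batch lookup with a default implementation that
// delegates back to singleton get calls, one per key.
interface SimpleValueCache<K, V> {
    CompletableFuture<V> get(K key);

    default CompletableFuture<List<V>> getValues(List<K> keys) {
        List<CompletableFuture<V>> singles = new ArrayList<>();
        for (K key : keys) {
            singles.add(get(key)); // fallback: one singleton lookup per key
        }
        return CompletableFuture.allOf(singles.toArray(new CompletableFuture[0]))
                .thenApply(ignored -> {
                    List<V> values = new ArrayList<>();
                    for (CompletableFuture<V> cf : singles) {
                        values.add(cf.join()); // already complete at this point
                    }
                    return values;
                });
    }
}
```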
The batch size used here is never bigger than the max batch size option.
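Splitting keys so that no chunk exceeds the configured maximum can be sketched as below; `partition` is an illustrative helper, not the library's internal code.

```java
import java.util.ArrayList;
import java.util.List;

public class PartitionSketch {
    // Split the keys into batches no larger than maxBatchSize, so each
    // call to the batch loader stays within the configured limit.
    static <T> List<List<T>> partition(List<T> keys, int maxBatchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < keys.size(); i += maxBatchSize) {
            batches.add(keys.subList(i, Math.min(i + maxBatchSize, keys.size())));
        }
        return batches;
    }
}
```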