Feature: Allow stream custom maxsize per batch #2063
Conversation
✅ Deploy Preview for badger-docs canceled.
716d7b0 to acba512 (Compare)
@simon28082 The linter is not happy. Could you check that, please? And could you also pull in the latest changes from the
@mangalaman93 Will do it.
Change maxSize type
@mangalaman93 I've done it, please review.
@@ -315,7 +323,7 @@ func (st *Stream) streamKVs(ctx context.Context) error {
// Send the batch immediately if it already exceeds the maximum allowed size.
// If the size of the batch exceeds maxStreamSize, break from the loop to
// avoid creating a batch that is so big that certain limits are reached.
if batch.LenNoPadding() > int(maxStreamSize) {
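
As an aside, here is a minimal sketch of how this check could honor a per-stream limit instead of the package-level `maxStreamSize` constant. The field name `MaxSize` and the zero-value fallback are assumptions about the intent of this PR, not lines taken from the diff:

```go
// Hypothetical helper: prefer the per-stream limit when the caller set one,
// otherwise fall back to the old hardcoded default.
func effectiveMaxSize(custom, hardcodedDefault uint64) uint64 {
	if custom == 0 {
		return hardcodedDefault
	}
	return custom
}

// Inside streamKVs, the size check could then read (sketch only):
//
//	if batch.LenNoPadding() > int(effectiveMaxSize(st.MaxSize, maxStreamSize)) {
//	    // flush the batch via st.Send before it grows past gRPC limits
//	}
```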
Could you try upgrading the action version in .github/workflows/ci-golang-lint.yml:29 to the following?
uses: golangci/[email protected]
Problem
I'm trying to read data using Badger and use gRPC for node sync, but I can't change the maximum batch size of the stream because it is hardcoded.
Solution
Add a property to Stream that allows a custom batch size.
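
A hypothetical usage sketch of the proposed API for the node-sync case, assuming the new property ends up as a `MaxSize` field on `badger.Stream` (the exact name and type may differ from the final PR) and using import paths as in badger v4:

```go
package main

import (
	"context"
	"log"

	badger "github.com/dgraph-io/badger/v4"
	"github.com/dgraph-io/ristretto/z"
)

func main() {
	db, err := badger.Open(badger.DefaultOptions("/tmp/badger-sync"))
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	stream := db.NewStream()
	stream.LogPrefix = "NodeSync"

	// Assumed field added by this PR: cap each outgoing batch so it stays
	// under the gRPC max message size used by the node-sync service,
	// instead of relying on the hardcoded maxStreamSize.
	stream.MaxSize = 3 << 20 // ~3 MB per batch (illustrative value)

	// Send receives each completed batch of length-prefixed pb.KV entries;
	// a real node-sync service would forward the buffer over gRPC here.
	stream.Send = func(buf *z.Buffer) error {
		log.Printf("batch ready: %d bytes", buf.LenNoPadding())
		return nil
	}

	if err := stream.Orchestrate(context.Background()); err != nil {
		log.Fatal(err)
	}
}
```

If the zero value keeps the old default, existing callers would be unaffected and only streams that need a different cut size would set the field.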