
Direct batch and multi last client API and examples #310

Open · wants to merge 3 commits into base: main

Conversation

@scottf (Collaborator) commented Nov 5, 2024

No description provided.

@scottf scottf changed the title Direct batch and multi last API and examples Direct batch and multi last client API and examples Nov 5, 2024
@MauriceVanVeen (Member) left a comment:

LGTM

@ripienaar (Contributor) left a comment:

LGTM

* API: `batch: number, max_bytes: number, sequence: number, subject: string`
* Request: `{"batch":3,"max_bytes":2002,"seq":4,"next_by_subj":"foo.>"}`
1. Gets up to `batch` messages at or after the start time that match the specified subject, limited by `max_bytes`.
* API: `batch: number, max_bytes: number, start time: time, subject: string`
(Member) left a comment:

start_time is not supported in batch requests (though possibly it should be)

(Collaborator, Author) replied:

I have unit tests that say otherwise.


(Member) replied:

The start time is ignored.

(Collaborator, Author) replied:

I have unit tests for all of these and they all work as expected.

  1. batch only
  2. batch with starting sequence
  3. batch with start time
  4. batch with max bytes
  5. batch with max bytes and starting sequence
  6. batch with max bytes and start time

For multi last I have these tests

  1. just subjects
  2. subjects with up_to_seq
  3. subjects with up_to_time

Here is the server validation code (Stream.go line 4266):

```go
// Check we don't have conflicting options set.
// We do not allow batch mode for lastFor requests.
if (req.Seq > 0 && req.LastFor != _EMPTY_) ||
	(req.Seq > 0 && req.StartTime != nil) ||
	(req.StartTime != nil && req.LastFor != _EMPTY_) ||
	(req.LastFor != _EMPTY_ && req.NextFor != _EMPTY_) ||
	(req.LastFor != _EMPTY_ && req.Batch > 0) ||
	(req.LastFor != _EMPTY_ && len(req.MultiLastFor) > 0) ||
	(req.NextFor != _EMPTY_ && len(req.MultiLastFor) > 0) ||
	(req.UpToSeq > 0 && req.UpToTime != nil) {
	hdr := []byte("NATS/1.0 408 Bad Request\r\n\r\n")
	mset.outq.send(newJSPubMsg(reply, _EMPTY_, _EMPTY_, hdr, nil, nil, 0))
	return
}
```

Relevant notes on that code:

  • `seq` and `last_for` cannot both be present
  • `seq` and `start_time` cannot both be present
  • `up_to_seq` and `up_to_time` cannot both be present

(Collaborator, Author) commented:

It appears that batch works to limit the multi_last response, so I have to add those variants as well.

@aricart (Member) left a comment:

Need to do more investigation, but for direct batch:

  • multi_last
  • batch
  • seq
  • max_bytes
  • (up_to_seq | up_to_time) - they are mutually exclusive

Only multi_last is required.
All the other options can be specified in the request, but only one of the `up_to` kind may be given.

A better way to understand is that a batch:

  • Always has a multiple filter specified (can be [">"])
  • Can specify a start seq (otherwise grabs from the end of the stream)
  • Can specify multiple limits at the same time: `batch`, `max_bytes`, and one more sentinel condition, `up_to_seq` or `up_to_time`

@scottf (Collaborator, Author) commented Nov 7, 2024

Added multi last for variants with batch

@MauriceVanVeen (Member) left a comment:

LGTM, one minor comment

@@ -115,6 +115,30 @@ After the batch is sent a zero length payload message will be sent with the `Nat

When requests are made against servers that do not support `batch` the first response will be received and nothing will follow. Old servers can be detected by the absence of the `Nats-Num-Pending` header in the first reply.

There are 4 viable API calls for a batch. All require a batch amount greater than 0 and a subject which may include a wildcard.
(Member) commented:

This should now be 6* viable API calls.
