Replies: 4 comments 9 replies
-
@dimanzver please never again use issues for questions in repositories that have Discussions enabled.

It is technically possible to reuse a consumer function (or object) across N queues. Consumers on a shared channel can also have priorities. However, there is no mechanism for "priorities across N queues", and there never will be. The design you have in mind is a terrible idea (with 11 years on the RabbitMQ core team and some ≈ 14 years as a contributor, I feel confident about those strong words). Using a single queue for N message types, or a single consumer for N message types, often runs into a wall, and imitating priorities across N queues is guaranteed to not be worth the effort.

This is a consumer coordination problem. Use N consumers across N queues, with whatever prefetch they need (it could be 1 if you understand the extent to which that setting affects throughput), and coordinate them however you like in your own code. Or use a stream, which lets consumers read from a shared source and filter out what they want. Again, if they need to coordinate, how best to do that is up to you, but repeatable reads of streams will generally help. I haven't been following RabbitMQ Stream protocol PHP client development; to my knowledge our team does not maintain one.
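A minimal sketch of the "N consumers across N queues" approach, using Python and pika purely for illustration (the queue names p1/p2/p3 and the quorum queue arguments are assumptions; how the callbacks coordinate is entirely up to the application):

```python
# Sketch only: three independent consumers on one channel, one per queue.
# Coordination between them (e.g. pausing lower-priority work) is left to the app.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

for name in ("p1", "p2", "p3"):
    channel.queue_declare(queue=name, durable=True,
                          arguments={"x-queue-type": "quorum"})

# A prefetch of 1 keeps at most one unacknowledged message per consumer,
# at the cost of throughput (as noted above).
channel.basic_qos(prefetch_count=1)

def handle(ch, method, properties, body):
    # Application-specific processing and coordination goes here.
    print(method.routing_key, body)
    ch.basic_ack(delivery_tag=method.delivery_tag)

for name in ("p1", "p2", "p3"):
    channel.basic_consume(queue=name, on_message_callback=handle)

channel.start_consuming()
```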
-
I was going to post a similar question, but I found this discussion. I have a similar setup: a single consumer reading from a single queue with 3 levels of message priority and a prefetch count of 1. This lets me process the highest-priority messages first and, only when none remain, process the lower-priority messages in order of priority.

I see that quorum queues don't support message priorities, and the recommendation at https://www.rabbitmq.com/docs/quorum-queues#priorities is instead: "To prioritize messages with Quorum Queues, use multiple queues; one for each priority." I'm trying to figure out how to structure my consumers to prioritize messages across multiple queues. Based on the discussion above, I believe the recommendation is to set up a separate consumer for each queue. Does that mean all 3 priority levels would be processed at the same time by their respective consumers? I only want to process lower-priority messages when there are no high-priority messages. Can you please give an example of how to coordinate the consumers to support message priority with multiple queues? Thank you for your help.
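For reference, my current single-queue setup looks roughly like this (a pika sketch; the queue name tasks and the priority bound of 3 are just for illustration):

```python
# Current approach for reference: a classic queue with message priorities.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# x-max-priority is a classic queue feature; it is what I would be giving up
# by moving to quorum queues.
channel.queue_declare(queue="tasks", durable=True,
                      arguments={"x-max-priority": 3})

# Prefetch of 1 so the broker always hands over the highest-priority ready message.
channel.basic_qos(prefetch_count=1)

def handle(ch, method, properties, body):
    print(properties.priority, body)
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(queue="tasks", on_message_callback=handle)
channel.start_consuming()
```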
-
https://github.com/rabbitmq/rabbitmq-server/releases/tag/v4.0.0-beta.4 is out for those who want to try priority support in quorum queues. Note: it is not exactly the same behavior when there is a backlog of messages with different priorities, but it should be enough for many cases.
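For anyone trying the beta, a minimal sketch of publishing per-message priorities to a quorum queue (pika; the queue name and priority values are illustrative, and the actual ordering semantics are the ones described in the release notes, not anything this snippet defines):

```python
# Sketch: publish messages with an AMQP priority property to a quorum queue.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

channel.queue_declare(queue="qq", durable=True,
                      arguments={"x-queue-type": "quorum"})

for prio, payload in [(0, b"low"), (5, b"high")]:
    channel.basic_publish(exchange="", routing_key="qq", body=payload,
                          properties=pika.BasicProperties(priority=prio))

connection.close()
```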
-
@michaelklishin This result can be obtained by consuming one of the queues several times, i.e. registering the consumer on the higher-priority queue more than once. For example, subscribing three times to queue p1 and once to queue p2 gives a ratio of 3:1 (3 processed messages from queue p1 for every 1 processed message from queue p2).
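A pika sketch of that idea (the callback and queue names are illustrative): subscribe three times to p1 and once to p2, and while both queues have a backlog the deliveries arrive in roughly that proportion.

```python
# Sketch: weight deliveries roughly 3:1 in favour of p1 by subscribing to it three times.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.basic_qos(prefetch_count=1)

def handle(ch, method, properties, body):
    print(method.routing_key, body)
    ch.basic_ack(delivery_tag=method.delivery_tag)

# Three subscriptions to p1, one to p2 -> roughly a 3:1 processing ratio
# while both queues are non-empty.
for _ in range(3):
    channel.basic_consume(queue="p1", on_message_callback=handle)
channel.basic_consume(queue="p2", on_message_callback=handle)

channel.start_consuming()
```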
-
Is your feature request related to a problem? Please describe.
Is it possible to set things up so that the consumer receives messages first from the first queue, then from the second, then from the third? That is, so that messages in different queues have different priorities. As soon as the first queue is empty, we consume messages from the second queue; when it is empty too, we consume messages from the third queue. If a message appears in the first queue in the meantime, we process it first. Yes, I know that message priorities exist, but we plan to use queues of the "quorum" type, which do not support message priorities.
Describe the solution you'd like
I tried this, but it does not work correctly: the consumer alternates, receiving 1 message from queue p1, then 1 from queue p2, then 1 from p1 again, and so on, instead of draining p1 first.
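The behavior I am after is roughly what this polling sketch produces (pika and basic_get, with queue names p1/p2/p3; inefficient, and only meant to make the desired ordering concrete):

```python
# Sketch of the desired behavior: always drain higher-priority queues first.
# Polling with basic_get is inefficient; it is only meant to show the ordering.
import time
import pika

QUEUES = ["p1", "p2", "p3"]  # highest priority first

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

while True:
    for queue in QUEUES:
        method, properties, body = channel.basic_get(queue=queue, auto_ack=False)
        if method is not None:
            print(queue, body)  # process the message here
            channel.basic_ack(delivery_tag=method.delivery_tag)
            break               # restart the scan from the highest priority
    else:
        time.sleep(0.5)         # all queues empty, back off briefly
```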
Describe alternatives you've considered
No response
Additional context
No response