-
In my application I want to set a maximum page number for pagination, so that when a user requests a page number beyond the maximum I set, the request still returns that maximum page. How can this be achieved? Thank you very much.
-
As you probably already know, there's likely no performance advantage in limiting the collection on a well-behaving DB. But let's assume you have a different (good) reason to do so. The technical trick with pagy is quite simple: just do a count query on the collection, and cap it at a maximum number of records.

```ruby
max_records = max_pages * records_per_page

count = collection.count
count = max_records if count > max_records

@pagy, @records = pagy(collection, count: count)
```

You may also want to put some caption explaining to the user what you are doing, and maybe also why you limit the access to the first pages.
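For context, here is a minimal sketch of the same capping idea inside a Rails controller. The `ProductsController`, `MAX_PAGES` and `PER_PAGE` names are hypothetical, and it assumes a Pagy version (before 9.0) where the per-page variable is still called `:items`:

```ruby
# app/controllers/products_controller.rb (hypothetical example)
class ProductsController < ApplicationController
  include Pagy::Backend

  MAX_PAGES = 10   # hypothetical cap on the number of pages
  PER_PAGE  = 40   # hypothetical page size

  def index
    collection  = Product.all
    max_records = MAX_PAGES * PER_PAGE

    # Cap the count so pagy never builds links beyond MAX_PAGES
    count = [collection.count, max_records].min

    @pagy, @records = pagy(collection, count: count, items: PER_PAGE)
  end
end
```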
-
Well well well... It looks like you are trying to use tools that you don't know, on something that you know even less about. What about dedicating a bit of time to actually learning the tools the old way (i.e. RTFM?) before trying to use them by guessing?
"Hey, this carpet doesn't fly. Is it because it is not supported yet?" No, it's because it's a carpet. 😀
"Hey, is there a way to put this carburator in this EV and make it run on gas?" What? 🙃
I better stop "supposing" that, because it looks very likely that we have a XY problem here. So please, explain well why do you want to limit your results. |
-
Thank you for your advice and guidance on my application. I will try another way to solve this problem.
-
Just found out
-
Back to Pagy: I read pagy's documentation and code again and made the following modifications to achieve my purpose. Add the following to the Pagy configuration:

```ruby
require "pagy/extras/overflow"
Pagy::DEFAULT[:overflow] = :last_page

require "pagy/extras/searchkick"
Searchkick.extend Pagy::Searchkick

module PagyExtension
  # Create a Pagy object from a Searchkick::Results object,
  # keeping any :count passed in vars instead of always using the real total
  def new_from_searchkick(results, vars = {})
    vars[:items]   = results.options[:per_page]
    vars[:page]    = results.options[:page]
    vars[:count] ||= results.total_count
    Pagy.new(vars)
  end
end

Pagy.extend PagyExtension
```

This is done mainly because the stock helper always takes the count from the results, so a custom count cannot be passed. Use passive mode:

```ruby
@products = Product.search("*", page: param_page, per_page: param_limit)
@pagy = Pagy.new_from_searchkick(@products, count: 400, items: param_limit)
```

Now, with items set to 40, the total number of pages is limited to 10.
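As a quick check of that arithmetic, here is a standalone sketch, assuming a Pagy version before 9.0 where the per-page variable is still `:items`:

```ruby
require "pagy"

# With the count capped at 400 and 40 items per page,
# Pagy reports exactly 400 / 40 = 10 pages.
pagy = Pagy.new(count: 400, items: 40)
pagy.pages  # => 10
```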
-
... and now you have to maintain a totally unnecessary hack, just for the sake of not following (or maybe not understanding?) the advice of doing "that directly in elasticsearch", which would also have other, bigger advantages.

BTW, your hack is incomplete, because it will ALWAYS return a fixed number of pages, even if you have less or none. And you managed to overflow the …

Happy that you found your very own perfect solution! 👍🏻
-
From 8.1+ pagy implements the max_pages variable. However, as already explained, that's not the right tool against crawling, and in this specific case doing it at the source (i.e. in elasticsearch) would have more advantages; but at least it's not a half-baked hack to maintain.
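A hedged sketch of that variable, assuming Pagy >= 8.1 (where `:max_pages` was introduced) and the pre-9.0 `:items` name for the per-page size:

```ruby
# Regardless of the real count, the pagination stops at 10 pages.
@pagy, @records = pagy(collection, items: 40, max_pages: 10)
```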