Once a crawl gets into the tens of thousands of URLs, the viewer takes too long to render results.

The backend becomes slow and uses a lot of RAM (bad, but tolerable for a single-user app), and the results page grows too big to render in a reasonable amount of time (much worse, since past a certain point it makes the viewer unusable).
- The list of crawls goes away, at least in its current form as the front page.
- The results screen becomes the front page, with the search filters expanded to include the list of crawls.
- The results will no longer show script-level data; all data will be collapsed to the domain level.
- The results page will probably be a dynamically rendered, sortable table. Rendering only the visible rows (windowed/virtualized rendering) is like pagination 2.0: the performance benefits without the extra burden of pagination controls.
- There will be a new domain detail report page, reachable from the main results page. This page will present script-level findings and provide additional information, such as the exact properties accessed.
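The visible-rows-only idea above boils down to a pure windowing calculation: given the scroll position and viewport size, compute which slice of rows actually needs DOM nodes. A minimal sketch (hypothetical helper names, not the viewer's actual code):

```typescript
// Which rows of a large result table need to be rendered right now?
// Everything outside this window is skipped, so DOM cost stays
// roughly constant no matter how many domains the crawl produced.
interface RowWindow {
  start: number;    // index of the first row to render
  end: number;      // index one past the last row to render
  offsetPx: number; // top padding that stands in for the skipped rows
}

function visibleWindow(
  totalRows: number,
  rowHeightPx: number,
  scrollTopPx: number,
  viewportHeightPx: number,
  overscan = 5 // render a few extra rows to avoid flicker while scrolling
): RowWindow {
  const first = Math.floor(scrollTopPx / rowHeightPx);
  const visibleCount = Math.ceil(viewportHeightPx / rowHeightPx);
  const start = Math.max(0, first - overscan);
  const end = Math.min(totalRows, first + visibleCount + overscan);
  return { start, end, offsetPx: start * rowHeightPx };
}
```

For example, with 50,000 rows at 24 px each in a 600 px viewport scrolled to 12,000 px, only rows 495–529 get rendered; the rest of the table is a spacer. On scroll, the window is recomputed and the handful of rendered rows is swapped out.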