Is Search Paging Smelly

Is having "paged" result sets a user-interface design smell?

For example, you run a query (such as a QueryByExample) and there are, say, 1000 result rows or items. Some systems will give the user the first 50 or so rows, plus a list of pages along with "next" and "previous" options.

Personally, I would rather have all 1000 rows come up and scroll down in the browser than rely on app-implemented paging. I'll call this "full-scroll" as a working term.

It has the following advantages: [pejorative language removed]

The first issue, losing the column titles as you scroll, is going to be a problem anyhow because a paper-sized page does not fit on the screen (unless you are using a large screen or micro-fonts). But I agree it is less of a problem. It would be nice if HTML defined "locked" column placement that always stayed at the top of the screen or panel (and of printouts) without having to rely on DHTML and JavaScript hacks. That would let one more or less mirror the desktop-GUI "grid-box" solution, where column titles are always at the top of the box. Or HTML could simply include a data-grid widget as part of the standard. A work-around for full-scroll is to repeat the column headings every X rows, perhaps at a user-controllable frequency.
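Here is a minimal sketch of that repeat-the-headings work-around, assuming the table HTML is being generated in script; the function name, the row format, and the every-50-rows default are purely illustrative, not from any particular library:

  // Re-emit the heading row every `repeatEvery` data rows so the column
  // titles stay near the viewport while "full-scrolling" a long result table.
  // (Real code would also HTML-escape the cell text.)
  function renderTable(headings: string[], rows: string[][], repeatEvery = 50): string {
    const headerHtml = "<tr>" + headings.map(h => `<th>${h}</th>`).join("") + "</tr>";
    const parts: string[] = ["<table>"];
    rows.forEach((row, i) => {
      if (i % repeatEvery === 0) parts.push(headerHtml);   // repeat the headings
      parts.push("<tr>" + row.map(c => `<td>${c}</td>`).join("") + "</tr>");
    });
    parts.push("</table>");
    return parts.join("\n");
  }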

Having paper-friendly printouts from webpages is a huge moose of a problem beyond just paging. I'm almost inclined to suggest a dedicated paper-oriented reporting side-system if you want paper-friendly printouts, but such systems tend to be either expensive or immature. Optimizing output for both interactive screens and paper at the same time is a nearly impossible task.

--top


Pagered search results favor common human behavior. Most of the time, most people will either find what they're looking for in the first result set or two, or they'll refine/rework their query. They don't want to wait for all 350,000 results to be rendered and transmitted. Current web SoTA (early 2011) seems to be using AJAX to load the next chunk of results while you're looking at the first chunk, and so on, giving you the full set if you're willing to wait, but quick response if you don't need everything.
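A rough sketch of that chunk-while-you-read idea in the browser might look like the following; the /search endpoint and its offset/limit parameters are assumptions for illustration, not a real API:

  // Render the first chunk immediately, then keep fetching the rest in the
  // background; a short (or empty) chunk signals the end of the result set.
  async function loadAllChunks(query: string,
                               onChunk: (rows: string[][]) => void,
                               chunkSize = 50): Promise<void> {
    let offset = 0;
    while (true) {
      const resp = await fetch(`/search?q=${encodeURIComponent(query)}&offset=${offset}&limit=${chunkSize}`);
      const rows: string[][] = await resp.json();
      onChunk(rows);                        // caller appends rows as they arrive
      if (rows.length < chunkSize) break;   // fewer rows than asked for: done
      offset += rows.length;
    }
  }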

For small bounded sets that grow slowly (such as a list of all the current countries in the world), the non-pagered direct approach is good. But for open-ended sets or those which may grow quickly, some kind of pager is likely to be needed.

It also depends upon whether searching is more common than accessing. If you're likely to do one search and then access 20 to 50 of the results, the all-in-one might make sense. On the other hand, if you typically do another search after accessing only 1 to 3 results, the pager makes more sense.

I didn't suggest 350,000. Limit it to, say, 2,000 and display a notice if the results exceed that. That's still usually quicker than loading JS libraries. And I imagine this is a case of personal preference: I'd personally rather scroll than click paging buttons. Google etc. pagify because paging gives them a way to display more ads, lining their pockets, not because it's what people prefer.
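A sketch of that cap-and-notice approach: fetch one row beyond the cap so the code can tell whether the results were truncated. The helper names and the 2,000 figure are illustrative only:

  declare function fetchRows(query: string, limit: number): Promise<string[][]>;  // assumed helper
  declare function renderRows(rows: string[][]): void;                            // assumed helper
  declare function showNotice(message: string): void;                             // assumed helper

  async function loadCapped(query: string, cap = 2000): Promise<void> {
    const rows = await fetchRows(query, cap + 1);   // e.g. SQL "... LIMIT 2001"
    renderRows(rows.slice(0, cap));
    if (rows.length > cap) {
      showNotice(`More than ${cap} results; showing the first ${cap}. Please narrow the query.`);
    }
  }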

I prefer scrolling too, typically with around 100 items per page (I used to have a script that scraped the top 10 result pages from Google and combined them into one). But even if you limit it to 2,000, that's still a pager, just with 2,000 items per page. (What's really bad are the sites that break a short one-page article up into 5 pages to get more ad impressions, and that also usually make it hard to find the pager/navigation.)


It might be nice to have an option... e.g. a 'Sample Query' button, or perhaps a 'get full results' hyperlink. For working from a browser, it might also be nice for the full results option to support a few common formats (such as CommaSeparatedValue?).
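If the full-results option does offer CSV, the escaping rules are easy to get wrong; here is a small sketch using the common quoting convention (fields containing commas, quotes, or newlines are wrapped in double quotes, with embedded quotes doubled). The function name is just illustrative:

  function toCsv(headings: string[], rows: string[][]): string {
    const escape = (field: string): string =>
      /[",\n]/.test(field) ? '"' + field.replace(/"/g, '""') + '"' : field;
    return [headings, ...rows]
      .map(row => row.map(escape).join(","))
      .join("\r\n");
  }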

The ideal "fix" would be a kind of TableBrowser GUI widget with an options menu for exporting to CSV etc. A smart one could incrementally load the next set of rows as one scrolls fairly close to the end or boundaries on either side. For example, if the first load is 500 records and you scroll to record number 400, it triggers the table browser to request another sub-set, say rows 500 to 1,000. Unless the user scrolls really quickly, they won't know about the incremental loading under the hood: it would look like one big table. And if they do scroll quickly, they just have to wait for an hour-glass icon or the like. In this case we could say that the grid mechanism has a boundary threshold of 100 records and a load size of 500. These two attributes would ideally be developer configurable.

I've seen such things written in AJAX - several, even, where the rows at the top are unloaded while those at the bottom are loaded. A technique to minimize observable delay is to have the previous page, the current page, and the next two pages in the current direction of scrolling all loaded... i.e. if a 'page' is 300 records, you might have 1200 loaded at any given instant. The scroll-bar itself must be replaced with something proportional to the total record count rather than the number of loaded records. I understand you have objections to HtmlDomJsCss, but the basic idea seems sound.
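The window of loaded pages in that scheme might be computed roughly like this; the keep-four-pages rule (about 4 x 300 = 1200 records) follows the description above, and everything else is illustrative:

  // Which pages to keep resident: the previous page, the current page, and the
  // next two pages in the current direction of scrolling; everything else is unloaded.
  function pagesToKeep(currentPage: number, scrollingDown: boolean, lastPage: number): number[] {
    const wanted = scrollingDown
      ? [currentPage - 1, currentPage, currentPage + 1, currentPage + 2]
      : [currentPage + 1, currentPage, currentPage - 1, currentPage - 2];
    return wanted.filter(p => p >= 0 && p <= lastPage);
  }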

Given your interest in TableBrowser, I'm a bit surprised you've never bothered to write yourself some sort of NSAPI/NPRuntime plugin that will add nice TableBrowser capabilities to compatible web browsers.

Since I don't do SystemsSoftware for a living, I'm much slower at that kind of thing. I am, however, working on an open-source QueryBrowser? web tool that is roughly half Toad and half DesktopDatabase. I've yet to find a decent open-source AJAX data grid; I've listed the problems elsewhere on this wiki.


Yes. The problem, as so often, is the repetition of abstractions. Searching can be done...


The "lost column headings" scrolling issue can be semi-mitigated with the following:


See also: ResultSetSizeIssues


CategoryWebDesign JanuaryEleven

