
Pagination of PQs.


JeremyR


I can understand why PQs are limited to 500 results, but these days most GPS devices can store far more than that, with many able to hold 1,000 or more.

 

At the moment, if I wanted to have the nearest 1000 caches to home on my GPS, I would have to craft many PQs that overlap and use them to build a composite database with GSAK or similar before sending them to my GPS. There's a lot of waste involved in doing that, both on my part and in terms of processing on geocaching.com's database servers.

 

A much simpler solution would be to allow us to set up two PQs with the same basic criteria (a centrepoint, for example) but have the second one set to return 'the next page' as it were (i.e. results 501 through 1000).
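As a rough illustration of what "the next page" could mean under the hood, here is a sketch assuming a MySQL-style backend; the table and column names (`caches`, `dist_from_centre`) are made up for the example and are not Groundspeak's real schema.

```python
# Sketch: two PQs with identical criteria paging through one result set.
# Table and column names here are hypothetical.

def pq_sql(page: int, page_size: int = 500) -> str:
    """Build the query for one 500-result 'page' of the same search."""
    offset = (page - 1) * page_size
    return (
        "SELECT * FROM caches "
        "ORDER BY dist_from_centre "
        f"LIMIT {page_size} OFFSET {offset}"
    )

print(pq_sql(1))  # caches 1-500
print(pq_sql(2))  # caches 501-1000
```

Because both pages share the same ORDER BY, the second PQ is guaranteed to pick up exactly where the first left off.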

 

Such a process would make it much easier for us to build a larger cache database on our GPS devices and would reduce the database hit involved in doing so considerably whilst not increasing the amount of data that each user can pull down.

 

Any thoughts?

Link to comment
(snip)

 

Good suggestion. If that could be pulled off, it would save server resources. Splitting the output into two zip files would also help keep RAM and CPU usage down.

Link to comment

Yes, I understand that possibility, but it's not simple for the average Joe - it takes quite a bit of messing about with dates to get each 'older caches' PQ just under 500 results, and it requires monitoring and adjusting the 'newest caches' PQ as it approaches the limit.

 

What I'm suggesting above is a tick-box 'so easy a caveman could do it' solution that wouldn't need checking up on or altering every time 500 new caches are published. It's quite simple to achieve from a database/SQL point of view too.

Link to comment
Any thoughts?

I think it's an excellent idea! (I've mentioned it before.)

 

Your post did prompt another thought on the subject. Some folks have requested a "donut" PQ. If the PQ mechanism were changed to allow a starting record, that would solve two problems.

 

All queries at present start at record 1. (I'll not get into the actual record number for clarity.) The query returns up to record 500. But if we were able to start the query at record 501 then we could create that donut. String a few of these together and one could create pagination manually. In fact, one could pre-design a query where the return is zero until the number of records that fit the rest of the query reaches the specified minimum.

 

This is probably the simplest solution to date, both for the site to implement and for the end user. No changes to the size or number of returns, and minimal programming for the site. It would only require passing a single number to the query, and would affect just the LIMIT part of the query.

 

On the user side, it would be much simpler to set up and understand than the present "separate by date" scheme.

 

In fact, the queries might run faster. The less internal data there is to sift through, the faster a query runs. By removing the search on date and simply changing the start record, I suspect these PQs would run a hair faster.

 

As an example: at present, if you're looking for all the caches within, say, 50 miles and there are 821 caches, you'll create two PQs with the caches divided by date. The first PQ starts at "Geocaching Day One" or earlier (or at least the day the first still-active cache was placed), and its end date is manipulated until you get just under 500 caches. The second PQ starts the day after the first PQ's end date.
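To make that date fiddling concrete, here is a small sketch with made-up data of how one would pick the first PQ's end date so it returns just under 500 caches; note that several caches placed on the same day are what make this brittle in practice.

```python
from datetime import date, timedelta

def first_pq_end_date(placed_dates, limit=500):
    """End date so the first PQ returns at most `limit` caches.

    `placed_dates` holds every matching cache's placement date; this
    mimics the manual trial-and-error described above, not any real
    site feature.
    """
    placed = sorted(placed_dates)
    return placed[min(limit, len(placed)) - 1]

# 821 caches placed on consecutive days (synthetic example)
dates = [date(2005, 1, 1) + timedelta(days=i) for i in range(821)]
cutoff = first_pq_end_date(dates)
print(sum(d <= cutoff for d in dates))  # 500 caches fall in the first PQ
```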

 

The problem comes when the number of caches grows and older caches go away. Soon there are 905 caches, but there are more than 500 caches from the start date of the second PQ to the present, and the number of caches in the first PQ has dropped to 367. You have to go back and massage the end date of the first PQ and change the start of the second; otherwise, you have to add a third PQ to handle the overflow of the second.

 

This is time consuming and, quite frankly, a clumsy solution. It's not worthy of a geeky techno hobby.

 

Enter the above mentioned solution. You create a PQ to gather all caches within 50 miles. You know there are more than 500 caches, so you duplicate the first PQ and change the start record to 501 (from 1). You set it and forget it until the number of caches reaches 1001. There is no need to change the first (or second) PQ--ever. When the number of caches within 50 miles reaches 1001, simply duplicate the first PQ again and change the duplicate's starting record to 1001.
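The arithmetic behind that duplication step is trivial; a sketch (illustrative only) of which start records you'd need as the cache count grows:

```python
def start_records(total_caches: int, page_size: int = 500) -> list:
    """Start record of each PQ copy needed to cover every result."""
    return list(range(1, total_caches + 1, page_size))

print(start_records(821))   # [1, 501]        -> two PQs
print(start_records(1001))  # [1, 501, 1001]  -> time to duplicate again
```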

 

This is a "set it and forget it" solution with minimal programming that is easier to understand and easier on the servers.

Link to comment

To add a little more about why using dates for this sort of thing isn't efficient if you're just looking for your closest 1000: I've just been playing with my local area, which is reasonably cache-dense, and to achieve the desired cache database I reckon I'd need a huge number of PQs (I'd hazard a guess at 10-15). At the current rate of placement, a new PQ would be required roughly every 8-10 weeks (500 publications 'ago' in my area was December 4th '08). If the current upwards trend in publications continues, you'd run out of available PQs very quickly.

 

Please Groundspeak, make it simple for us :D

Link to comment

If you live in a cache-dense area, it becomes even more of a chore, since you also have to deal with adjusting the radius as well as the two date ranges. It would be great if you could select if a PQ is a "single" or a "double". The "double" would count as 2 PQs in your daily total, and would give you up to 1k caches.

Link to comment

(snip) At the current rate of placement, a new PQ would be required roughly every 8-10 weeks.

Wow. That does seem like a bunch of new caches in your area. Is it possible that you are running your PQs wide open and keeping the distance at 100 miles?

Edited by sbell111
Link to comment
(snip)

 


This is a "set it and forget it" solution with minimal programming that is easier to understand and easier on the servers.

 

This sounds like an excellent solution! Nate, Raine, what say ye?

Link to comment

If you live in a cache-dense area, it becomes even more of a chore, since you also have to deal with adjusting the radius as well as the two date ranges. It would be great if you could select if a PQ is a "single" or a "double". The "double" would count as 2 PQs in your daily total, and would give you up to 1k caches.

I really like this idea.

 

We can now get 2500 caches a day. What difference does it make whether we get those in 5*500 or 1*2500 doses?

 

The only thing we have to take into account is the size of the PQ emails, because they could become too large for some mail servers. Splitting the actual result into batches of 500 caches would solve that as well, though.
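Splitting one big result into 500-cache mails is a one-liner; a quick sketch (the 500 figure is just the current PQ size, not anything the site actually does):

```python
def mail_batches(caches, size=500):
    """Split one large result set into mail-sized chunks."""
    return [caches[i:i + size] for i in range(0, len(caches), size)]

parts = mail_batches(list(range(1300)))
print([len(p) for p in parts])  # [500, 500, 300]
```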

Link to comment

Sorry to bump the thread, but I figured out a better way of doing these: a way to make a set area instead of a radius circle. This eliminates areas that I know I will never search for a cache in and saves space on my GPS for the areas I would search in. For example, I have a route at http://www.geocaching.com/my/userrouteedit...35-8d5168bd0b6a .

I used MapSource and the Track Draw function to make the route, zig-zagging back and forth on the map to fill the radius of my search area. I made sure the route was less than 500 miles long and had fewer than 500 points (the points will never be a problem here). I measured the largest gap between passes to determine the distance from the route; in my map of St. Louis, the biggest gap was just under 7 miles, so I divided that in half and made my search area 3.5 miles from the route. I then saved that as a GPX file, uploaded it to Geocaching.com as a route, and made a Pocket Query of the route with the 3.5-mile search area along it.

Now we get to the problem of the area having more than 500 caches. The way to deal with this is to create multiple queries of the same route and use dates to prevent overlapping. I set the dates so that the first query ends up with slightly fewer than 500 caches, then make another one starting the day after the last one's date range ends, until I have about another 500 caches. I keep doing that until I get through all the available dates (this means going into future dates) in the query. The dates I used for my St. Louis area queries are January 1, 1999 (the earliest PQ date) - March 7, 2007; March 8, 2007 - August 2, 2008; and August 3, 2008 - December 31, 2010. Across all three searches I end up with about 1,300 caches in that area, well below the 2,000-cache limit on my Garmin Colorado. This seems to be working for me right now.
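The spacing arithmetic in that description boils down to two checks; a sketch using the limits quoted above (illustrative only, not any official tool):

```python
def route_within_limits(length_miles, n_points):
    """Check a drawn route against the PQ route limits cited above."""
    return length_miles < 500 and n_points < 500

def search_radius(largest_gap_miles):
    """Half the largest gap between passes, so adjacent corridors meet."""
    return largest_gap_miles / 2

print(route_within_limits(480, 120))  # True
print(search_radius(7.0))             # 3.5
```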

Link to comment
This topic is now closed to further replies.