
(My Finds) Pocket queries


Ursprung


I do not understand why the .gpx of the found caches can only be downloaded once a week. If you go geocaching daily, this is a very long period of time.

I would prefer the following solution:

The "My Find" .gpx should be part of the normal pocket queries. I agree for the limit of 5 downloads daily, but the user should be able to define if he downloads 5 standard pocket queries or 4 pocket queries plus the "My Find" file.

 

Over a week this would generate nearly the same traffic as before:

 

Old: 7x5 pocket queries plus 1x "My Finds" = 36 .gpx files

New: from 7x5 pocket queries down to 7x4 pocket queries + 7x1 "My Finds" = 35 .gpx files
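A quick sanity check of those weekly totals (a rough sketch, assuming the 5-downloads-per-day cap described above):

```python
# Rough sketch of the weekly totals in the proposal, assuming the
# 5-downloads-per-day cap mentioned above.
old = 7 * 5 + 1      # 5 regular PQs every day, plus the one weekly "My Finds"
new = 7 * (4 + 1)    # 4 regular PQs + 1 "My Finds" file each day
print(old, new)      # -> 36 35: roughly the same traffic either way
```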

 

What do you think?

 

matthias (from Switzerland)

Link to comment

The "my finds" PQ is limited to once a week because of its size and complexity. Users with 5000, 6000 , 10000 or more finds could hog a lot of resources by running it daily. (10,000 finds would be equal to running 20 normal PQs)

 

Why would you need the data more often anyway?

Link to comment

Why would you need the data more often anyway?

 

First of all, I doubt that most cachers have 10,000 finds... Most of us will have around 200-2,000 (in Europe).

Second, resources are a problem. In the evening (Europe) we often have problems reaching http://www.geocaching.com. Maybe it is time to upgrade the server :=) (and not only for the .gpx pocket queries).

 

When I go geocaching for a day I have about 1-2 dozen new finds. I would like to see my effort directly in my profile. Downloading the individual .gpx files is very tedious.

Link to comment

Why would you need the data more often anyway?

 

First of all, I doubt that most cachers have 10,000 finds... Most of us will have around 200-2,000 (in Europe).

Second, resources are a problem. In the evening (Europe) we often have problems reaching http://www.geocaching.com. Maybe it is time to upgrade the server :=) (and not only for the .gpx pocket queries).

 

When I go geocaching for a day I have about 1-2 dozen new finds. I would like to see my effort directly in my profile. Downloading the individual .gpx files is very tedious.

What "profile"? - geocaching.com doesn't offer any stats in the profile. If you are using a 3rd party app for stats - you just need to live with weekly.

 

It doesn't matter if the "average" cacher doesn't have 10,000 - there are several who do, and that would tax the system. And even at your 2,000 number, that is equal to 4 regular PQs. Would you be willing to give up 4 of your PQs to get a daily "My Finds"?

Link to comment

When I go geocaching for a day I have about 1-2 dozen new finds. I would like to see my effort directly in my profile. Downloading the individual .gpx files is very tedious.

 

Assuming you're using GSAK to generate your stats, you don't have to download individual GPX files.

 

I create a PQ to download caches I have found, with an appropriate center point, and I also check "found in the last 7 days" so I don't get EVERYTHING (since I'm one of those 2,000+ find types). Then I load that PQ into GSAK in a database named MyFinds, delete the extra logs (those which aren't mine), and re-run my stats (I use FindStatGen).
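If you ever need to do that log clean-up outside GSAK, a minimal sketch of the same idea is below; it assumes the PQ .gpx uses the Groundspeak extension elements (groundspeak:logs, groundspeak:log, groundspeak:finder), and the file and account names are placeholders:

```python
# Minimal sketch (not GSAK itself): keep only your own logs in a PQ .gpx file,
# mirroring the "delete the extra logs" step described above. Assumes the
# Groundspeak GPX 1.0 extension schema; file and account names are placeholders.
import xml.etree.ElementTree as ET

GS = "http://www.groundspeak.com/cache/1/0"     # Groundspeak extension namespace
ET.register_namespace("groundspeak", GS)        # keep the original prefixes on output
ET.register_namespace("", "http://www.topografix.com/GPX/1/0")

def keep_only_my_logs(in_path, out_path, my_name):
    tree = ET.parse(in_path)
    # Walk every <groundspeak:logs> container and drop logs by other finders.
    for logs in tree.getroot().iter(f"{{{GS}}}logs"):
        for log in list(logs.findall(f"{{{GS}}}log")):
            finder = log.find(f"{{{GS}}}finder")
            if finder is None or finder.text != my_name:
                logs.remove(log)
    tree.write(out_path, encoding="utf-8", xml_declaration=True)

# Example (placeholder names):
# keep_only_my_logs("myfinds_pq.gpx", "myfinds_clean.gpx", "YourCacherName")
```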

 

Problem solved. :unsure:

Link to comment

Another approach, if using GSAK, is to merely check the found box for those you find "today" and then move them to a separate "My Found Caches" database. Then use that database in the FindStatGen macro.

 

Although this does not include your found log, it does provide all the other data the macro requires. The result is then pasted into your online profile.

 

Then, once a week, download the normal "My Finds" PQ and all your own logs will come with it.

Link to comment

Another approach, if using GSAK, is to merely check the found box for those you find "today" and then move them to a separate "My Found Caches" database. Then use that database in the FindStatGen macro.

 

Although this does not include your found log, it does provide all the other data the macro requires. The result is then pasted into your online profile.

 

Then, once a week, download the normal "My Finds" PQ and all your own logs will come with it.

 

The only issue with that method is that, without the logs, the finds don't always appear in the correct order. That's why, if I find a LOT of caches, I run my own PQ to play catch-up, deleting the extra logs... If I only find a handful of caches, then I just grab the GPX files for those few caches and again delete the extra logs.

Link to comment

Another approach, if using GSAK, is to merely check the found box for those you find "today" and then move them to a separate "My Found Caches" database. Then use that database in the FindStatGen macro.

 

Although this does not include your found log, it does provide all the other data the macro requires. The result is then pasted into your online profile.

 

Then, once a week, download the normal "My Finds" PQ and all your own logs will come with it.

 

The only issue with that method is that, without the logs, the finds don't always appear in the correct order. That's why, if I find a LOT of caches, I run my own PQ to play catch-up, deleting the extra logs... If I only find a handful of caches, then I just grab the GPX files for those few caches and again delete the extra logs.

Point taken. I guess I haven't been that concerned with that level of accuracy since I know I'll get the weekly PQ and things will correct themselves.

 

I find, on average, about 25 caches per week and haven't felt the need to maintain my milestones that closely. On significant ones, like #3000, I might actually go to the effort of downloading the individual GPX files just to make sure I entered them into GC.com in the right order so I can fix it immediately rather than wait until the weekly update to figure it out.

 

Otherwise, I just wait until Friday.

Edited by Cache O'Plenty
Link to comment
(10,000 finds would be equal to running 20 normal PQs)

This is not true. This data service returns significantly fewer logs per cache. Looking at my PQ files, I find that 10,000 finds would equate to approximately 10 regular PQs, not 20, when considering just email attachment file size.
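As a back-of-the-envelope illustration of that size argument (the per-cache ratio below is purely hypothetical, standing in for "your own log only" versus "cache plus several recent logs"):

```python
# Back-of-the-envelope sketch of the file-size argument above. The 0.5 ratio
# is purely hypothetical: a "My Finds" entry carries only your own log, while
# a regular PQ entry carries the cache plus several recent logs.
regular_pq_cap = 500      # assumed caches per regular PQ
size_ratio = 0.5          # hypothetical My-Finds-to-regular per-cache size
finds = 10_000
equivalent_pqs = finds * size_ratio / regular_pq_cap
print(equivalent_pqs)     # -> 10.0, i.e. roughly 10 regular PQs by attachment size
```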

 

Furthermore, from a database management perspective, a regular PQ would require orders of magnitude more processor time to identify which caches are to be selected out of the database. The AllFinds query does no subsetting except for "found by me". Generating a list of caches to send is as simple as going into the logs table and finding all the logs owned by the same account; any DBMS can do this extremely fast because of something called "indexing". Reading the string data out of lat/lon fields and then doing 3-dimensional distance calculations over the surface of a spheroid to compare with a central location or distance to a route takes much longer. So if CPU time is the resource you are focused on, then I would say that you have some work to do to prove your point.
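To illustrate the contrast being described (this is not Groundspeak's actual code; the names and data structures are hypothetical): an indexed "all logs by this account" lookup versus a per-cache great-circle distance test for a centre-point PQ.

```python
# Hypothetical illustration of the contrast described above; not Groundspeak's
# actual code or schema.
from math import radians, sin, cos, asin, sqrt

# "My Finds"-style selection: with an index keyed by the finder's account id,
# fetching all of one account's logs is a direct lookup.
logs_by_finder = {}                      # finder_id -> list of (cache_id, log_text)
def my_finds(finder_id):
    return logs_by_finder.get(finder_id, [])

# Regular centre-point PQ: every candidate cache needs a great-circle
# (haversine) distance calculation against the chosen centre.
def haversine_km(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))    # Earth radius ~6371 km

def caches_near(caches, centre_lat, centre_lon, radius_km):
    # caches: iterable of (cache_id, lat, lon); each row costs trigonometry.
    return [cid for cid, lat, lon in caches
            if haversine_km(lat, lon, centre_lat, centre_lon) <= radius_km]
```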

Edited by Hynr
Link to comment