PQ for all of OWNED caches


Rhialto

Recommended Posts

It would be great to also have a button to get a PQ of all our owned caches that would include all logs (not just the last 20). I think it would be an even better idea if those were included in the current PQ of our finds, which we can run only every 7 days.

 

I wanted to know if it will ever happen. Yes, it's for GSAK use and for stats. Tonight I was doing a big update in GSAK and was looking at my own caches and had to redownload each GPX individually, but I skipped that part for tonight.

 

GSAK may not be a Groundspeak product, but how many geocachers use it? I would say A LOT. Isn't the PQ for all of our FINDS meant for GSAK use? Why else would someone want to download such a PQ? Unless I'm missing something.

 

Crossing fingers...

 

I'm pretty sure it has been asked before, and I'm too lazy to search for it tonight.

Link to comment

An "All Finds" PQ contains just my logs plus the cache description. I shudder to think of the file size of an "All caches, all logs" pocket query for a prodigious hider who keeps their caches maintained for many years, so they accumulate many long logs (think BrianSnat), or for someone who hides a "101 Dalmatians" type of power trail that gets ten logs every weekend from numbers runners.

 

There are other uses for an All Finds PQ besides GSAK. My favorite use is uploading my query to the "It's Not About the Numbers" website so I can study my maps and statistics there.

 

I would have zero interest in combining an "All Finds" and "All Hides" search into a single query.

Link to comment

It would be great to also have a button to get a PQ of all our owned caches that would include all logs (not just the last 20). I think it would be an even better idea if those were included in the current PQ of our finds, which we can run only every 7 days.

 

I wanted to know if it will ever happen.

 

It has already happened. It is called a bookmark. You put your owned caches in a bookmark list, and then at the bottom you click the generate PQ button. Granted, you're limited to the 5 logs per PQ, but once you're up to date on your logs, a weekly run of the PQ keeps everything shipshape.

 

Jim

Link to comment

No, I think he really meant combined. He said, "I think it would be an even better idea if those were included in the current PQ of our finds, which we can run only every 7 days."

 

He's trying to save a PQ run. Just one instead of two separate ones.

 

I really don't see a need for all the logs on all of my finds. OMG, that would be huge! With 3300 finds, just think. And what would I ever do with all that info? (Ideas are welcome here.)

 

That said, I do see a use for a special PQ for all hides with all logs: for the stats generator that analyzes who has found my caches and who has found the greatest number of my caches. For that you need the whole history.

 

I currently run an "Owned" PQ (as suggested above) once a week, but if I have a cache that suddenly gets a rush of finders (or logs of any kind), I could easily miss a couple of finders with it limited to just the last 5 logs.

 

So, I'd go for a "once a week" PQ of owned caches with all logs: just a second button at the bottom, where the "My Finds" query button is currently located.

Edited by Cache O'Plenty
Link to comment
He's trying to save a PQ run. Just one instead of two separate ones.
Yeah, I meant combined, for the good reason that when we update our stats with the FSG macro in GSAK, everything would be included in this one PQ. The current PQ updates my finds, but we have to manually update our hides to get that section updated. Yes, there is a macro that can do it, but I'm pretty sure everyone would like to stop using it and use a PQ generated here instead.
Link to comment
An "All Finds" PQ contains just my logs plus the cache description. I shudder to think of the file size of an "All caches, all logs" pocket query for a prodigious hider who keeps their caches maintained for many years, so they accumulate many long logs (think BrianSnat), or for someone who hides a "101 Dalmatians" type of power trail that gets ten logs every weekend from numbers runners.

I really don't think this would be a very significant problem, but I'd be happy to have an "All Hides" query with no logs. You probably know how I feel about GSAK, so you know it's not for that purpose! It's actually because it's nice to have the coords for all my hides in my GPS so that I can pop by and do maintenance when I am in the area.

 

I do it now by maintaining a bookmark list of all my hides. Works fine, but it would be nicer to have it automatically generated by gc.com.

 

Oh, and I think that the combined query is a bad idea. I'm generally against gc.com providing services to compensate for shortcomings in third-party software.

Edited by fizzymagic
Link to comment

That said, I do see a use for a special PQ for all hides with all logs: for the stats generator that analyzes who has found my caches and who has found the greatest number of my caches. For that you need the whole history.

 

But it's a waste of server resources and bandwidth to download ALL the logs every time you get an update. You only need ALL logs for your hides once; after that you should be able to keep accurate logs with a weekly PQ, or with a RefreshAll macro in GSAK if a group logged a significant number of finds. All that old, stale data would be downloaded again for no purpose if we got ALL logs every time this special PQ was run.

Link to comment
An "All Finds" PQ contains just my logs plus the cache description. I shudder to think of the file size of an "All caches, all logs" pocket query for a prodigious hider who keeps their caches maintained for many years, so they accumulate many long logs (think BrianSnat), or for someone who hides a "101 Dalmatians" type of power trail that gets ten logs every weekend from numbers runners.

I really don't think this would be a very significant problem, but I'd be happy to have an "All Hides" query with no logs. You probably know how I feel about GSAK, so you know it's not for that purpose! It's actually because it's nice to have the coords for all my hides in my GPS so that I can pop by and do maintenance when I am in the area.

 

I do it now by maintaining a bookmark list of all my hides. Works fine, but it would be nicer to have it automatically generated by gc.com.

 

Oh, and I think that the combined query is a bad idea. I'm generally against gc.com providing services to compensate for shortcomings in third-party software.

As an owner of many caches I would *love* to have a PQ of my hides. Please implement this, guys! :mad:

Link to comment

As an owner of many caches I would *love* to have a PQ of my hides. Please implement this, guys! :mad:

 

Either that, or allow a special bookmark list for your own hides that can exceed 500 caches. I find the PQ from my bookmark list works pretty well for me. I just have to remember to add the new ones, and I like that it automagically handles archived caches. If I recall right, from the time I archived a cache the PQ did include the archived cache. The only problem (and I will agree with Rhialto on this one) is that downloading of *all* logs should be allowed, at least once, for newly adopted caches.

 

Jim

 

P.S. Yes, the bookmark PQ does include archived caches every time.

Link to comment

If you use GSAK, the ADDLOGS macro already does this.

I suppose it would still be rather tedious for excessively prolific hiders, but that's what you get when you spew.

I can update my 103 active hides in much less than 30 minutes.

 

With a PQ you don't need to run addlogs. The PQ will keep things up to date. The only time I ran addlogs was on the four caches I adopted; otherwise it is my bookmark PQ. I update my 32 hides in mere seconds.

 

Jim

Link to comment

If you use GSAK, the ADDLOGS macro already does this.

With a PQ you don't need to run addlogs. The PQ will keep things up to date. The only time I ran addlogs was on the four caches I adopted; otherwise it is my bookmark PQ. I update my 32 hides in mere seconds.

And if there are more than 5 logs since the last time you received a PQ update for your hides, please use the RefreshALLGPX2_0 macro, instead of AddLogs, to get caught up. RefreshAllGPX grabs the last 20 logs by using the GPX file, rather than scraping the site.

Link to comment

I think this is a great idea too. :ph34r:

If the date/time that the PQ ran was recorded, then the next time you ran the PQ you would only receive the additions/changes recorded after that date/time. The timestamp must be recorded internally, I would have thought.

After the initial run this method would most likely produce a smaller gpx file than a regular PQ of all owned caches.

You could run it as frequently or infrequently as you required.
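A minimal sketch of that delta idea in Python, assuming hypothetical log records with an "updated_at" field and a stored timestamp of the last run (nothing here reflects how Groundspeak actually stores its data):

    from datetime import datetime

    def delta_since(logs, last_run):
        """Keep only the log entries added or changed after the last PQ run."""
        return [log for log in logs if log["updated_at"] > last_run]

    # Hypothetical log records for one owned cache.
    logs = [
        {"cache": "GC1234", "finder": "Alice", "updated_at": datetime(2009, 1, 5)},
        {"cache": "GC1234", "finder": "Bob", "updated_at": datetime(2009, 3, 2)},
    ]

    # Only Bob's log falls after the last run, so only it goes into the incremental file.
    print(delta_since(logs, last_run=datetime(2009, 2, 1)))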

Link to comment

The only data hangup with the "Update" method is the rare occasion when an older log (further back than the last 5 logs in a PQ update, or 20 in the case of an individual GPX download) is changed, updated, added, or deleted.

 

I only run the "Addlogs" macro once in a very great while and only when I know there have been some historical changes. Otherwise, my update PQ run once a week is more than adequate.

Link to comment

I know that this is a bit "off topic" in that most of the posts speak to maintaining owned caches, but as a person relatively new to geocaching, though with an information systems background, I am trying to automate my search selections, etc. Specifically, I have a PQ that I use to find all the caches within a given number of miles of my house. When I used the print function for an individual cache before I got GSAK (I'm trying to go paperless), I had the option to print with no logs, 5 logs, or 10 logs: "Simple [No Logs] [5 Logs] [10 Logs]". I've wondered why the programming powers couldn't just add a similar option to the Pocket Query selection criteria. My guess is that 10 logs would encompass 99.9% of all logs if used on owned queries weekly. It would also allow me to construct a PQ for my DNFs that would download more logs to help me get a few hints on my not-founds. A "governor" could be put into the PQ process to set an absolute limit on the gross number of records downloaded at one time, to prevent truly huge downloads.

 

Bottom line - some flexibility in allowing more logs to be downloaded with appropriate limits might address the concerns of all the posts so far...

 

Just a geocaching newbie's $.02 worth (from one who has worked in information systems development for almost 40 years).

 

Jim

 

My greatest fun in life, now that I've retired, is taking my granddaughters geocaching...

Link to comment

I know that this is a bit "off topic" in that most of the posts speak to maintaining owned caches, but as a person relatively new to geocaching, though with an information systems background, I am trying to automate my search selections, etc. Specifically, I have a PQ that I use to find all the caches within a given number of miles of my house. When I used the print function for an individual cache before I got GSAK (I'm trying to go paperless), I had the option to print with no logs, 5 logs, or 10 logs: "Simple [No Logs] [5 Logs] [10 Logs]". I've wondered why the programming powers couldn't just add a similar option to the Pocket Query selection criteria. My guess is that 10 logs would encompass 99.9% of all logs if used on owned queries weekly. It would also allow me to construct a PQ for my DNFs that would download more logs to help me get a few hints on my not-founds. A "governor" could be put into the PQ process to set an absolute limit on the gross number of records downloaded at one time, to prevent truly huge downloads.

 

Bottom line - some flexibility in allowing more logs to be downloaded with appropriate limits might address the concerns of all the posts so far...

 

Just a geocaching newbie's $.02 worth (from one who has worked in information systems development for almost 40 years).

 

Jim

 

My greatest fun in life, now that I've retired, is taking my granddaughters geocaching...

 

Probably bandwidth. Logs can be 4,000 characters; times 10 logs, times 500 caches, that is a whole lot of bandwidth. It would probably bring the email server to its knees.
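A rough worked figure under that worst-case assumption (every log at the 4,000-character limit, a full 500-cache query, one byte per character):

$$4000 \times 10 \times 500 = 2 \times 10^{7}\ \text{characters} \approx 20\ \text{MB per query, uncompressed}$$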

 

Jim

Link to comment

That said, I do see a use for a special PQ for all hides with all logs: for the stats generator that analyzes who has found my caches and who has found the greatest number of my caches. For that you need the whole history.

 

But it's a waste of server resources and bandwidth to download ALL the logs every time you get an update. You only need ALL logs for your hides once; after that you should be able to keep accurate logs with a weekly PQ, or with a RefreshAll macro in GSAK if a group logged a significant number of finds. All that old, stale data would be downloaded again for no purpose if we got ALL logs every time this special PQ was run.

 

It was a pain in the butt to get all of the original data into GSAK. In fact, it took a couple of hours on consecutive nights. Once it is done, however, I see no need to replace the entire database on a weekly basis, which is what this proposed PQ would do.

 

I have a PQ arrive late Tuesday night that contains my caches and the last five logs. I import it into GSAK on Wednesday morning before heading off to work. I make sure to click "show full log" (exact wording may not be correct) and then scroll through to make sure that none of the caches has had five logs added. If so, I will need to ADD the extra logs for that single cache. The entire process takes less than a minute.

 

In essence, I saw a need for such a PQ in one instance. It would have been helpful and might have saved me a couple of hours of work. After running it that one time, I would see no need to ever run it again. In fact, it would be cumbersome, as it would essentially contain 95% of the data that I already had.

 

BTW, since I have fewer than 500 placed caches, I have been simply using a standard PQ. I never even thought about the bookmark method. That's a good idea.

Link to comment

I'd also like to see this feature implemented. Sounds like a great idea.

 

Which feature, Tyler? There have been a few suggested.

 

An interesting thing: I have 139 caches with about 3100 logs on them. I was concerned about the "server load and bandwidth" argument. In fact, I was buying into it. Well, I have all of the data in GSAK, so I exported it to a GPX. It came out as a 485K file. Out of curiosity, I zipped it up: 83% savings. It turned into an 85K Zip file.

 

I know nothing about "server load", but I don't need to be an expert to realize that bandwidth would not be an issue.

 

Boy, did I screw that up. When I exported the GPX, it was set to include only my logs. The correct numbers are 2.2M for the GPX, which zips down to 456K.

 

(Edit for content)

Edited by Don_J
Link to comment
If you use GSAK, the ADDLOGS macro already does this.

I know, that's the macro I was talking about, and when I wrote:

I'm pretty sure everyone would like to stop using it and use a PQ generated here instead.
I wanted to say that I believe Groundspeak would also like that macro to go away. There is a warning shown when running that macro because it sends many requests to GS servers, and we have no control over when and how many times that macro is run.

 

That's why the PQ I'm asking for should make everyone happy. Of course there are MANY people who would not need to run it because they see no use for this particular PQ, but those who like to update their stats once a week would enjoy it.

Link to comment

If you use GSAK, the ADDLOGS macro already does this.

With a PQ you don't need to run addlogs. The PQ will keep things up to date. The only time I ran addlogs was on the four caches I adopted; otherwise it is my bookmark PQ. I update my 32 hides in mere seconds.

And if there are more than 5 logs since the last time you received a PQ update for your hides, please use the RefreshALLGPX2_0 macro, instead of AddLogs, to get caught up. RefreshAllGPX grabs the last 20 logs by using the GPX file, rather than scraping the site.

 

I've just offered my method of keeping up to date, but I really like this method. I DL'ed the updated macro, and when I ran it, I instantly noted that it was much more streamlined than the version I was using. I guess I need to sit down and change the ADD button to this one.

 

I'm curious about one thing. Doesn't a straight GPX download off a cache page contain the last fifteen logs?

You mentioned twenty. Am I mistaken?

Link to comment

I know that this is a bit "off topic" in that most of the posts speak to maintaining owned caches, but as a person relatively new to geocaching, though with an information systems background, I am trying to automate my search selections, etc. Specifically, I have a PQ that I use to find all the caches within a given number of miles of my house. When I used the print function for an individual cache before I got GSAK (I'm trying to go paperless), I had the option to print with no logs, 5 logs, or 10 logs: "Simple [No Logs] [5 Logs] [10 Logs]". I've wondered why the programming powers couldn't just add a similar option to the Pocket Query selection criteria. My guess is that 10 logs would encompass 99.9% of all logs if used on owned queries weekly. It would also allow me to construct a PQ for my DNFs that would download more logs to help me get a few hints on my not-founds. A "governor" could be put into the PQ process to set an absolute limit on the gross number of records downloaded at one time, to prevent truly huge downloads.

 

Bottom line - some flexibility in allowing more logs to be downloaded with appropriate limits might address the concerns of all the posts so far...

 

Just a geocaching newbie's $.02 worth (from one who has worked in information systems development for almost 40 years).

 

Jim

 

My greatest fun in life, now that I've retired, is taking my granddaughters geocaching...

 

Probably bandwidth. Logs can be 4,000 characters; times 10 logs, times 500 caches, that is a whole lot of bandwidth. It would probably bring the email server to its knees.

 

Jim

 

That's what I meant about putting in "appropriate limits". In the PQ process, a limit could be programmed to stop at a given file size. I could suggest what the file size limit should be, but I would need some stats on server load and normal PQ size to judge. I'm sure the programming powers that be could come up with a reasonable limit. BTW, your example is a worst-case scenario, so it's not a good argument. To be realistic, what you need are the average number of characters per log, the average number of logs, and the average number of caches requested per PQ. You could do a standard deviation calculation to come up with the "appropriate limits" that would cut off the PQ when it reached that size: perhaps 500% of the standard, depending upon the server load.
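A minimal Python sketch of that kind of statistical cutoff; the sample data and the multiplier are purely illustrative, not anything Groundspeak has published:

    import statistics

    def size_cap(pq_sizes_kb, k=5.0):
        """One possible "appropriate limit": the mean PQ size plus k standard deviations."""
        return statistics.mean(pq_sizes_kb) + k * statistics.stdev(pq_sizes_kb)

    # Hypothetical historical PQ sizes, in kilobytes.
    history = [320, 410, 150, 500, 275]
    print(f"Cut off any PQ larger than {size_cap(history):.0f} KB")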

 

It could be done, and if you want more logs for all your hidden caches than the limit allows, then you could schedule the 1st half to run on Mondays and the 2nd half to run on Thursdays, and base the selection on bookmarked lists; i.e., PQ First would be the first 100 caches and PQ Second would be the remaining 100.

 

I really don't have an idea of what the development cost would be to set a max file size, but I would guess it would be reasonable.

 

Jim

Link to comment

Well, I guess another answer is that TPTB have set the limit at 500. They have not responded to multiple lengthy threads about why it should be higher. I accept that the limit is now 500 and will probably be 500 for some time into the future. I'm sure the reason is not lack of programming skills or development effort.

 

Jim

Link to comment

I'm curious about one thing. Doesn't a straight GPX download off a cache page contain the last fifteen logs?

You mentioned twenty. Am I mistaken?

An individual GPX will grab the last 20 posted logs, plus all of your logs if they aren't already in the last 20.

Link to comment

With the news of users being banned for using a program to refresh their local store of cache data, I feel the need for a way to collect all of the logs for owned caches becomes even greater.

 

I have placed a good number of caches and use the addlogs macro in GSAK about once every 3 months to refresh my database.

 

I am worried that if I attempt it again I will get banned. I do try to be gentle on the website and do my information gathering late at night and during the middle of the week.

 

Come on, Groundspeak! It wouldn't be too hard to implement this feature, and it would remove one of the causes of website overload and of users being banned.

 

By the way... Great job on the new features that have been gradually added to the website (the ability to save gpx files on the website and the ability to add PQ names to the zip files)! You guys (and gals) do listen. :lol:

 

Thanks!

Link to comment

With the news of users being banned for using a program to refresh their local store of cache data, I feel the need for a way to collect all of the logs for owned caches becomes even greater.

Why? For the need to become "greater", doesn't it first need to be "great"? I see it as being non-existent. Why do you need a local copy of 3-year-old logs posted to your cache?

Link to comment

With the news of users being banned for using a program to refresh their local store of cache data, I feel the need for a way to collect all of the logs for owned caches becomes even greater.

 

I have placed a good number of caches and use the addlogs macro in GSAK about once every 3 months to refresh my database.

 

I am worried that if I attempt it again I will get banned. I do try to be gentle on the website and do my information gathering late at night and during the middle of the week.

 

Come on, Groundspeak! It wouldn't be too hard to implement this feature, and it would remove one of the causes of website overload and of users being banned.

 

By the way... Great job on the new features that have been gradually added to the website (the ability to save gpx files on the website and the ability to add PQ names to the zip files)! You guys (and gals) do listen. :lol:

 

Thanks!

 

So set up a bookmark list of your placed caches and generate a PQ from that. Then you can run the PQ every day if you wish and be perfectly legal. Running the PQ frequently will keep your logs up to date.

 

Added: You're apparently a GSAK user; there is a macro, bookmark.gsk, that if run from your placed-caches database will help you set up the bookmark. It is not automatic, but at least it steps through the database and brings each cache up one at a time so you can add it to a bookmark.

 

Jim

Edited by jholly
Link to comment
Why do you need a local copy of 3-year-old logs posted to your cache?

If you own a challenge cache (compilation type, where the prerequisite list is fixed) and maintain a list of those making progress, such as in Spinal Tap, then you need to be able to automate counting, which GS doesn't provide.

 

Situations where PQs do not suffice include: more than five logs on one day, a find log post-dated so that it appears more than five logs deep when initially posted, and an ancient note being changed to a find. All three situations have occurred on one or the other of my challenge caches. (Note that the "last five logs" in a PQ are the most recently dated logs, not the most recently posted logs.)

 

Doing it manually is not feasible, so the choices are to use some method which digs deeper into the logs, or to forget including the progress report.

 

Other than that, I agree with Prime Suspect -- I don't know of another reason to keep a local copy. In fact, I do not have local copies of the logs for the caches I own (except for the one linked above, and that's only because it's on the bookmark list for the prerequisites). And the progress report is not an essential part of the cache, but I think it's an interesting part of the cache listing. There probably are other valid reasons, but they are likely to be as obscure as mine.

 

Still, it would be useful if a PQ could "send everything that's changed since the last one". However, there are so many complications to this that I'm not even going to start listing them -- and it would amount to supporting an offline database, which GS has stated they won't do. I'd be happy if GS would simply support compilation challenges by running the small number of SQL statements needed (which I'd be happy to provide) and sending me the results. But I'm not holding my breath.
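Purely as an illustration, a minimal Python sketch of the kind of counting involved, with a hypothetical fixed prerequisite list and hypothetical find logs (not the actual GSAK macro, and not Groundspeak's schema):

    from collections import defaultdict

    # Hypothetical fixed prerequisite list for a compilation challenge.
    PREREQUISITES = {"GC1111", "GC2222", "GC3333"}

    def progress(find_logs):
        """Tally how many prerequisite caches each finder has logged."""
        found = defaultdict(set)
        for finder, cache in find_logs:
            if cache in PREREQUISITES:
                found[finder].add(cache)
        return {finder: len(caches) for finder, caches in found.items()}

    logs = [("Alice", "GC1111"), ("Alice", "GC2222"), ("Bob", "GC1111")]
    print(progress(logs))  # {'Alice': 2, 'Bob': 1}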

 

Edward

Link to comment
Why do you need a local copy of 3-year-old logs posted to your cache?

If you own a challenge cache (compilation type, where the prerequisite list is fixed) and maintain a list of those making progress, such as in Spinal Tap, then you need to be able to automate counting, which GS doesn't provide.

"Fixed list" challenge caches are no longer being published, so any need for this going forward will be diminishing, not growing. And this could have easily have been automated, with a bookmark list and a free Gmail account. Create a duplicate of the exiting bookmark list (yes, a little work, but you only have to do once) and set it to email logs. On your account's email client, set up a filter to look for the bookmark name, and forward that email to the Gmail account. Gmail would then have a database of everyone who has logged any of the cache on the bookmark list. Huzzah!!! You could then use Gmail's very powerful search to check for compliance, in a matter of seconds. Don't you wish you had thought of that a year ago?

Link to comment
Why do you need a local copy of 3-year-old logs posted to your cache?

If you own a challenge cache (compilation type, where the prerequisite list is fixed) and maintain a list of those making progress, such as in Spinal Tap, then you need to be able to automate counting, which GS doesn't provide.

"Fixed list" challenge caches are no longer being published, so any need for this going forward will be diminishing, not growing. And this could have easily have been automated, with a bookmark list and a free Gmail account. Create a duplicate of the exiting bookmark list (yes, a little work, but you only have to do once) and set it to email logs. On your account's email client, set up a filter to look for the bookmark name, and forward that email to the Gmail account. Gmail would then have a database of everyone who has logged any of the cache on the bookmark list. Huzzah!!! You could then use Gmail's very powerful search to check for compliance, in a matter of seconds. Don't you wish you had thought of that a year ago?

You don't see that word every day! Nice! :)

Link to comment
Why do you need a local copy of 3-year-old logs posted to your cache?

If you own a challenge cache (compilation type, where the prerequisite list is fixed) and maintain a list of those making progress, such as in Spinal Tap, then you need to be able to automate counting, which GS doesn't provide.

"Fixed list" challenge caches are no longer being published, so any need for this going forward will be diminishing, not growing. And this could have easily have been automated, with a bookmark list and a free Gmail account. Create a duplicate of the exiting bookmark list (yes, a little work, but you only have to do once) and set it to email logs. On your account's email client, set up a filter to look for the bookmark name, and forward that email to the Gmail account. Gmail would then have a database of everyone who has logged any of the cache on the bookmark list. Huzzah!!! You could then use Gmail's very powerful search to check for compliance, in a matter of seconds. Don't you wish you had thought of that a year ago?

I don't see that solving his problem. His challenge caches are lists of older caches. He allows finds on these caches from before the date the challenge caches were published. He could have a Gmail database of all logs made after the cache was published, but he would still need to get the three-year-old logs to see who found the caches three years ago, if he wanted to track those as well.

 

I'd expect the people who want the three-year-old logs on their owned caches want them because they keep statistics: who found my caches? How often does each cache get found? Etc. There are probably better ways to maintain this kind of data than an all-my-hides pocket query. The main problem, again, is getting the initial data when you start doing this if you have many old caches with lots of logs.

 

Finally, I hadn't heard that these challenges are no longer being published. I'm not even sure what a "Fixed List" challenge is, though I have an idea. There was some speculation when the ALR changes were put in that Groundspeak might have some better definition of what a challenge is. If there are "secret" guidelines that have been given to the reviewers as to what sorts of challenges can or can't be published, it would be nice to share them with us. I'm always a bit surprised to find out these things from a comment in the forums that is addressing a different issue.

Link to comment

I don't see that solving his problem. His challenge caches are lists of older caches. He allows finds on these caches from before the date the challenge caches were published. He could have a Gmail database of all logs made after the cache was published, but he would still need to get the three-year-old logs to see who found the caches three years ago, if he wanted to track those as well.

 

I'd expect the people who want the three-year-old logs on their owned caches want them because they keep statistics: who found my caches? How often does each cache get found? Etc. There are probably better ways to maintain this kind of data than an all-my-hides pocket query. The main problem, again, is getting the initial data when you start doing this if you have many old caches with lots of logs.

Easily handled by requiring anyone who wants to use an old find to relog it as a Find, note the original find date in the text, and then delete the log. The Gmail entry is made, and everyone's cache count is correct. Life is good. Huzzah again!!!

Edited by Prime Suspect
Link to comment
"Fixed list" challenge caches are no longer being published, so any need for this going forward will be diminishing, not growing.

Hmm, since when? I hadn't heard about that. Reference to an announcement, or something in the guidelines, or to a rejected submission? I just reviewed the listing guidelines and the definition of a challenge actually appears to have been widened, not narrowed -- it is now explicit that the "geocaching-related qualification" can include something based on Waymarking.

 

And this could easily have been automated, with a bookmark list and a free Gmail account. Create a duplicate of the existing bookmark list (yes, a little work, but you only have to do it once) and set it to email logs. On your account's email client, set up a filter to look for the bookmark name and forward that email to the Gmail account. Gmail would then have a database of everyone who has logged any of the caches on the bookmark list. Huzzah!!! You could then use Gmail's very powerful search to check for compliance in a matter of seconds.

Fails on several counts.

 

gc.com does not guarantee that notification emails will always be sent out, and we've certainly seen the service break on occasion.

 

I don't want to place silly requirements on the seekers like "you have to relog your finds" -- some people may take challenges seriously enough not to be bothered by that, but I'd rather they be out geocaching than attending to the details of logging just right because of my inadequate system.

 

Finally, you don't say whether Gmail's search is powerful enough to build an HTML table to insert in the cache description, which is what I'm doing with a GSAK macro.

 

As I said before, I don't HAVE TO insert that table. I don't HAVE TO check for compliance -- in fact, as I state in my descriptions "this is for fun and you are on the honor system", though I think the seekers appreciate my confirming their figures. But IF I'm going to display that table, the gmail method doesn't work as far as I can tell.

 

Don't you wish you had thought of that a year ago?

No. No huzzah.

 

Edward

Link to comment

Fails on several counts.

 

gc.com does not guarantee that notification emails will always be sent out, and we've certainly seen the service break on occasion.

Easily handled by setting up an auto-responder to send back a confirmation note. You seem to want to do everything for the people who are supposed to be working a CHALLENGE cache.

 

I don't want to place silly requirements on the seekers like "you have to relog your finds" -- some people may take challenges seriously enough not to be bothered by that, but I'd rather they be out geocaching than attending to the details of logging just right because of my inadequate system.

Posting existing finds is hardly more trouble than setting up a Bookmark of existing finds, as most challenge caches request.

 

Finally, you don't say whether Gmail's search is powerful enough to build an HTML table to insert in the cache description, which is what I'm doing with a GSAK macro.

 

As I said before, I don't HAVE TO insert that table. I don't HAVE TO check for compliance -- in fact, as I state in my descriptions "this is for fun and you are on the honor system", though I think the seekers appreciate my confirming their figures. But IF I'm going to display that table, the gmail method doesn't work as far as I can tell.

Not natively, but a GM script could do it. But as you said, it's not really required. You're bending over backwards to do the work that the finders should be doing. That's your choice to add unnecessary bells and whistles to your page, but it's not Groundspeak's responsibility to assist you. After all, the people attempting the challenge should already KNOW what they have and haven't completed, and should be keeping their own records. All that's really required is verification, and my system will do that quite well, and it's always up to date - no waiting for PQs to run.

Link to comment
Why do you need a local copy of 3-year-old logs posted to your cache?

If you own a challenge cache (compilation type, where the prerequisite list is fixed) and maintain a list of those making progress, such as in Spinal Tap, then you need to be able to automate counting, which GS doesn't provide.

"Fixed list" challenge caches are no longer being published, so any need for this going forward will be diminishing, not growing. And this could have easily have been automated, with a bookmark list and a free Gmail account. Create a duplicate of the exiting bookmark list (yes, a little work, but you only have to do once) and set it to email logs. On your account's email client, set up a filter to look for the bookmark name, and forward that email to the Gmail account. Gmail would then have a database of everyone who has logged any of the cache on the bookmark list. Huzzah!!! You could then use Gmail's very powerful search to check for compliance, in a matter of seconds. Don't you wish you had thought of that a year ago?

You don't see that word every day! Nice! :lol:

And it's worth 30 points in Scrabble!

Link to comment

Several of the caches in paleolith's challenges are also caches that were milestones for me. I don't think I'd want to relog these and then backdate the new logs, as my milestones would then be messed up when I process my My Finds PQ. I know, it's simple enough to use third-party software where I could reorder my finds as I like so the milestones are correct.

 

I'm really not sure what we are arguing about now. Do a one-time manual run of the GSAK add logs macro on each of the caches in the challenge to get the old logs into GSAK. After that, run the bookmark list PQ weekly to update the database. If some cache is found more than five times in a week (or if someone posts a backdated log on a cache), you get a notification, so you can handle that one cache specially (either a manual GPX file with the last 20 logs or, if that isn't enough, a run of the add logs macro). IMO, the add logs macro, if run as it should be (manually, for one cache at a time), is not screen scraping any more than any of Prime Suspect's Greasemonkey scripts.

Link to comment

Several of the caches in paleolith's challenges are also caches that were milestones for me. I don't think I'd want to relog these and then backdate the new logs, as my milestones would then be messed up when I process my My Finds PQ. I know, it's simple enough to use third-party software where I could reorder my finds as I like so the milestones are correct.

What I suggested would do nothing to your milestones. I never said anything about backdating logs. I said to post finds with the original find date in the text of the log. Once you post the log, you delete it. This generates an email to the bookmark owner, but leaves your find count the same as it was.

Link to comment
This topic is now closed to further replies.