
pppingme

+Premium Members
  • Posts

    1238

Everything posted by pppingme

  1. I DO choose to read every single email that is sent to the email address that I supplied to gc.com for communicating with me. I have a separate email address for PQ's myself, and that's all that should go to that address. Nowhere did I even come close to indicating to gc.com that I wished to receive any other communication at that address except PQ's. Which is why gc.com needs to communicate through the email address that I supplied for that purpose, not through an address that I clearly indicated is only for sending a PQ to.
  2. Actually, NO, it's not... I expect a PQ email to be just that, a PQ. If TPTB want to send me a message, they should send it separately to my primary email address. There is no reason the PQ generator couldn't send two emails: one for the PQ (and that's it), and a second for the message. PQ's are very often automated. I would suspect (no stats) that more people use programs/macros/whatever to pick up their PQ emails than don't. If true, that means the majority of the people the message was intended for did NOT see it. Watching the forums supports that: many people complained about their PQ's getting unchecked before a single individual, several days later, noticed the message. If TPTB @ GC wish to send a message, it needs to be separate. I do not believe the PQ emails are the place for that, even if the message involves PQ's. The messages WILL be missed.
  3. If you know ahead of time which caches you plan to do, you can pull individual .gpx files off the cache pages, and these will have 20 logs in them. Assuming you get your caches to your Palm using GSAK, it's real easy to "merge" individual .gpx files into the database, then do your cachemate export from there. But if you cache like me (I usually don't know where I'm going to be from day to day, so I just load the closest couple hundred unfound caches into my PDA (I use gpxsonar) and GPSr), it's hard to predict. But I also generate a closest-unfound PQ every day, and GSAK accumulates the logs, so for my closest unfound caches I have a full set of logs. I also run weekly PQ's for areas I'm not in as often but like to keep loaded up, so my logs aren't complete for those areas.
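The "merge, and the logs accumulate" behavior described above can be sketched in Python. This is a hypothetical data model (plain dicts keyed by GC code and log id), not GSAK's actual database format:

```python
# Sketch of how a tool like GSAK can accumulate logs across PQ runs.
# Each PQ carries only the most recent logs per cache; keying logs by
# id means re-importing never duplicates and old logs are preserved.

def merge_logs(database, new_pq):
    """Merge a freshly downloaded PQ into the local database.

    database: {cache_code: {"name": str, "logs": {log_id: text}}}
    new_pq:   same shape, typically only a handful of recent logs.
    """
    for code, cache in new_pq.items():
        entry = database.setdefault(code, {"name": cache["name"], "logs": {}})
        entry["name"] = cache["name"]        # refresh cache details
        entry["logs"].update(cache["logs"])  # accumulate, never drop, logs
    return database

db = {"GC1234": {"name": "Old Cache", "logs": {1: "Found it"}}}
pq = {"GC1234": {"name": "Old Cache", "logs": {2: "DNF"}},
      "GC9999": {"name": "New Cache", "logs": {3: "Found it"}}}
merge_logs(db, pq)  # GC1234 now holds both logs; GC9999 is added
```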
  4. Yes, but it results in more than 500 miles and 500 results. The idea here is that if I plan a cross-country trip, I'm obviously not going to do the entire trip in one day. Let's say I plan on driving I-70 across the country; I may plan (more or less) a day in Kansas, a day in Missouri and Illinois, and so on. I realize these should be separate PQ's, as a trip this large will generate a lot of results. BUT, on the other hand, it's a pain to construct a dozen individual routes. So my request is one "master route" and the ability to break it down by state (i.e., use my main route, but only give me the KS results in one PQ, then the next day the Missouri and Illinois results, and so on). This keeps route generation simple, keeps the PQ's simple, and just seems a more logical approach than trying to build a dozen routes. You still have the same number of PQ's, but only one route to base them on.
  5. There probably could not have been a worse way to do this. A solid number of people never even look at these emails and depend on other programs, scripts, etc. to pull them and extract the .gpx files. Are TPTB listening here?? It would have been more appropriate to send an email to the primary account (yes, I have my PQ's going to another unmonitored account that GSAK checks). PQ emails are NOT the place for communication, as they are (for everyone I know) mostly unmonitored.
  6. I'd like to see the ability to filter by state on caches along a route, the same type of filtering that is available in a regular PQ.
  7. I noticed the same thing. I went in to change the order of some of my PQ's, and noticed that "SOME" of my PQ's had gotten unchecked. Not all of them, only some. I'll watch and see what happens tomorrow..
  8. It is about as affordable as a pack of gum... I'm only paying $4.99 a month for WAP access, and the really cool part is I can tether it to my laptop (via bluetooth, no cable needed) as well... Slow (about dialup speed), but unlimited internet for $5/month isn't a bad deal.. If I ever upgrade my phone, it should be even faster... My carrier claims about a 4x increase in speed if I do upgrade the phone, with no additional monthly cost.
  9. OK raine, one more technical question. Let's say I have a route that is 600 miles. Am I guaranteed (assuming other factors keep the PQ results below 500) the first 500 miles, OR is there any chance it could go random? I would like to run a route, and if I "lost" the last 100 miles I would be OK with that (well, I would run the route in reverse and lose the other 100 miles). This seems to reflect what I'm seeing now. OR, do I really need to figure out a point that keeps the route to less than 500 miles?
  10. When I first set up a caches-along-a-route query, I had a route that was slightly over 500 miles, but everything worked. Now I notice that caches at the very tail end of the route are missing. When I measure to the last point, I still get slightly over 500 miles (512 miles), but I'm still missing the very tail end. This raises the question: is 500 a hard limit or a suggestion? Since this used to work, I'm wondering if this is a change since caches along a route was implemented.
  11. I know regular PQ's can be redirected to another email address, but how did you redirect the "My Finds" PQ?? I don't see a way to do it??
  12. I do live in a somewhat cache-rich area. I'm in the Kansas City area, very close (within 4 miles) to the state line. I like to keep a list of all of the KS and MO caches. This currently consists of 9 PQ's, and it covers over 4000 caches that I have not found yet. By staggering the PQ's, I keep both states up to date in my local list and nothing is ever more than a week out of date (I run 2 PQ's a day on weekdays). To keep my close caches up to date, I also have a PQ of just my closest unfound caches. On top of that, I also run the 500 closest caches of each state surrounding me (7 more PQ's), which gives me a couple thousand more caches. Add a few more PQ's for another part of the country that I frequent, and this gives me a list of around 10,000 caches that I could make into a day trip or an overnight trip pretty easily. Now if I'm looking for a specific type of cache (say caches with the word cave, or any other word I choose to search on), I can quickly get a list of everything in my state, my closest neighboring state, and the closest caches in the states that surround me, none of the data being more than a week old, and of course I would probably pull up a live cache page on anything that I decide to visit if I'm only going after one or two. All very easy, since I download PQ's on a regular basis and keep my local database up to date.
  13. No kidding. Pocket queries cannot search by keyword, and they limit themselves to 500 results. My search for the word "cave" came up with over a thousand results, which means if I could do it as a pocket query, I would have missed over half of what I was really looking for. NOT cool. Although there may be over a thousand, there are no distance limitations when doing a search from the website, so you're probably going to get caves on the other side of the world. With a PQ, you download the closest 500 caches to you (depending on how dense your area is, this may be plenty, or you may have to get creative to cover a larger circle, still easy). You then use a tool that can work with GPX files (may I suggest GSAK). From there you can search, filter, and sort by any parameter you want.
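The search-and-filter step at the end can be sketched in Python using only the standard library. The GPX sample here is heavily simplified (a real PQ file also carries a Groundspeak extension with full descriptions), but the namespace-aware lookup is the same idea:

```python
import xml.etree.ElementTree as ET

# Minimal sketch: filter waypoints in a PQ-style .gpx by a keyword.
# The namespace is the standard GPX 1.0 one; the two waypoints below
# are made up for illustration.
GPX = """<gpx xmlns="http://www.topografix.com/GPX/1/0">
  <wpt lat="39.0" lon="-94.5"><name>GC1</name><desc>Bat Cave by Al</desc></wpt>
  <wpt lat="39.1" lon="-94.6"><name>GC2</name><desc>Park Walk by Bo</desc></wpt>
</gpx>"""

NS = {"g": "http://www.topografix.com/GPX/1/0"}

def filter_by_keyword(gpx_text, keyword):
    """Return the names of waypoints whose <desc> contains keyword."""
    root = ET.fromstring(gpx_text)
    hits = []
    for wpt in root.findall("g:wpt", NS):
        desc = wpt.findtext("g:desc", "", NS)
        if keyword.lower() in desc.lower():
            hits.append(wpt.findtext("g:name", "", NS))
    return hits

print(filter_by_keyword(GPX, "cave"))  # ['GC1']
```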
  14. I think the point of the OP is that we as users shouldn't get a full-length SQL error just because we enter a bad date. Something should validate the input before it hits the SQL engine.
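A minimal sketch of the validation being asked for, assuming a Python back end and a US-style MM/DD/YYYY date field (both are assumptions; the site's actual stack and format are not known from the post):

```python
from datetime import date, datetime

# Reject a bad date before it ever reaches the SQL engine, instead of
# letting the database raise and leaking a raw SQL error to the user.
def parse_user_date(text):
    """Return a date, or None if the input is not a valid MM/DD/YYYY date."""
    try:
        return datetime.strptime(text, "%m/%d/%Y").date()
    except ValueError:
        return None  # caller shows a friendly "invalid date" message

assert parse_user_date("02/30/2006") is None          # impossible date caught
assert parse_user_date("02/28/2006") == date(2006, 2, 28)
```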
  15. I think the difference is that a program like GPXcomplete (I'm not familiar with it) is a screen scraper, which causes database hits and CPU time for rendering HTML and so on. On the other hand, pulling a .jpg (or whatever) from an already defined URL (the one that geocaching.com put in the PQ they sent us) doesn't cause the same database/CPU/rendering hit; it's much, much easier on the servers. I also take the view that if geocaching.com didn't want us pulling the .jpg images (or again, whatever is linked in), they would strip those links from the PQ before sending it. So again, we have inferred permission to pull the picture, but we DON'T have permission to scrape additional info off the site.
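The distinction above is that a picture grabber only reads URLs geocaching.com already embedded in the PQ, rather than spidering any pages. A minimal Python sketch of that extraction step; the cache description HTML here is made up for illustration:

```python
import re

# Hedged sketch: collect the image links already embedded in a PQ's
# HTML cache description, so they can be fetched directly with no
# page scraping. The description below is a fabricated example.
DESC = ('<p>Park nearby.</p>'
        '<img src="http://img.geocaching.com/cache/abc123.jpg">'
        '<img src="http://img.geocaching.com/cache/def456.jpg">')

def image_links(html):
    """Return the src URL of every <img> tag in the description."""
    # A real tool might use html.parser; a regex is enough for a sketch.
    return re.findall(r'<img\s+src="([^"]+)"', html)

print(image_links(DESC))
```

Fetching each returned URL (e.g. with urllib) then touches only the image servers, not the main site.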
  16. To take the "it's supplied in the PQ" viewpoint further: when you use a program like GSAK and have a particular cache highlighted (assuming you have the "view" window open), the cache displays in the view window with the data/text from the .gpx that GSAK loaded, but if you have an active internet connection, it will grab the pics from gc.com... Does this constitute a violation of the TOS?? If so, then probably the great majority of PQ users are violating the TOS. I'd be curious to hear your viewpoint on this.
  17. Although it is a general statement, and without official word from TPTB, I would lean toward it being OK to grab pictures. The policy says "robot, spider, scraper, other automated means." The argument could be made that you're running the "picture grabber" (whatever you run, there are a couple out there) in an interactive mode, and therefore it is not a scraper, spider, etc. A spider by definition follows links through the web pages; you already know the picture link and are going straight for the picture. A scraper reads the page and collects information, usually for republishing. A robot, well, this one gets grey: you're telling a program what to do and it's doing it. Automated, well, you're usually running it interactively, but again this could be grey. I really do think this policy is geared against collecting data from the pages, not against grabbing the pictures, which (assuming you're using a PQ) you were already given the links to by geocaching.com. By supplying a list of links via the PQ, geocaching.com could be seen as inferring permission to grab the picture. Also, part 2 of this policy I think is geared toward keeping the load down on the site. All images are hosted on a separate set of servers (maybe just one server?), and therefore pulling the pictures does not put a load on the main web servers.
  18. You're probably going straight to http://forums.geocaching.com Go to http://www.geocaching.com then click on Forums, then click on the link at the bottom of that page to get into the forums. From my understanding, it's this last step that actually creates the forum account. The forums don't link back to the geocaching user database but have their own, and TPTB have disabled manually creating forum accounts; this link is the "programming" that does that step. You will have the same problem if you change your password.
  19. One last plea to TPTB... Please Please Please make the list sortable by date.
  20. My guess would be that the cache owner did not intend to archive this cache, or intended to unarchive it. So it's not really that I'm worried about the reason, or about viewing his log; it was the complete absence of a log that bugged me. It is my understanding that a log that creates an action ("needs maintenance" is a good example) can NOT be modified, and I've seen several people struggle with this in the forums (again focusing on the needs-maintenance example). But if an "action" type log can't be modified, why CAN it be deleted?? I guess this is more of a bug report, because the complete lack of the log can create confusion, as is obvious on this cache: it's still there, and there are still people finding it since I discovered that it was "flagged" as archived. By the way, the local reviewer has since "unarchived" the cache, as it has had at least 4 finds in the last month (the cache was apparently archived a little over a month ago).
  21. I was watching a cache (manually, not via a watch list). One day I looked at it, and at the top of the cache page it said it was archived. I looked through the logs, and except for a DNF or two, there was nothing to indicate why it was archived. This cache had also had several find logs against it since it went to archived status. I wrote the local reviewer and asked him what he thought. He indicated that it appeared the owner archived it, then deleted the archive log. It appears he was able to undelete that log entry, which didn't really contain any info (the owner didn't indicate why), but at least it adds to the history of the cache, indicating that it is truly supposed to be archived. Should these types of log entries be deletable? Another possibility I thought of: did the cache owner think that by deleting the archive entry he was re-enabling the cache?
  22. "A couple of dozen out of 34 thousand could hardly be called 'strong support.' More like a vocal minority." Very strong; I'm basing that on the people who have spoken out. Go back and count; you will see much stronger support from people who prefer it in date order than from people who prefer it in name order. We all know that the forums represent only a small fraction of the users, but the forum users do "tend" to be the more active and involved ones. I do think it's accurate to say that date order is much preferred over name order, but I'll throw in my suggestion for making it user selectable again.
  23. This issue happened two weeks ago, and there have been at least two threads about it. None of those threads seems to have a reply (unless I missed it) from anyone on the geocaching.com side. I've also expressed my request for the page (I prefer it sorted in date order, but would also be happy to see it user selectable, i.e., click the column header you want to sort by). I've indicated this in the past as a feature request as well, all with no replies. I may be stretching a guess here, but I really don't think they want to change how this page works, or comment on why it works the way it does, even though reading through the forums there seems to be very strong support for having the page sorted by date.
  24. I would let the unit sit out in the open sky for about 30 minutes and see what happens. My guess would be that it doesn't have a complete almanac (a mapping of where the satellites are). Downloading one can take about 15 to 30 minutes.
  25. I want to speak out too.. I much prefer them to be in DATE order. My 2nd option would be being able to sort them on demand (i.e., click the column that I want to sort by, be it date, name, whatever).