Everything posted by Team Red Roo

  1. It's disappointing that the person(s) who made the decision to stop paid members from using the API via GSAK are hiding behind their anonymity instead of explaining that decision to us. I guess I'll just have to vote with my wallet and stop paying for the data you're no longer supplying.
  2. Trying to stop cheating by a minority by upsetting the majority is not good business practice. GS needs to remember that we pay for premium membership, and that membership entitles us to the extra data available. Instead of addressing the problem, GS has jumped the gun and stopped API access by apps like GSAK, when they could simply have blocked access to the waypoint questions (which, by the way, no one needs until they reach the waypoint). There have always been cheats and there always will be; GS will never be able to stop them.
  3. I see nothing specific from GS about this change. Is this forum just there to give your paying members a means of belly-aching, or is it supposed to be a means of communication?
  4. While I've only been fortunate enough to attend one Mega Event (prior to Lab caches), I like the idea of Temporary Event Caches (or perhaps even Temporary Caches) with a code requirement for logging. They appear to be accepted by a good part of the geocaching community who have attended these Events, even though they are not your usual cache in regard to stats etc. Temporary caches could also be made available for normal Events as well. With 5 letters (and the 0-9 numbers) still available for future cache types, I think it is time these caches became 'mainstream' caches and were given a proper GC (GS) code, ID, cache type and icon.
  5. Any chance of expanding on this, or perhaps giving a forum link?
  6. Then until he decides to listen to what the people are saying, Groundspeak will continue to lose money.
  7. Then perhaps someone who does represent Groundspeak can come and discuss it with us. There are many people who want this feature added in. Wayne
  8. In other words, Groundspeak have made a decision, and are not prepared to change it, even if it's found to be bad?
  9. No - not their responsibility. But they know of the problem and CAN do something to assist - yet they refuse to help. Because not everyone uses or wants PRC files. If you want PRC files directly from GC, just start a thread and let me know - I'll join in and oppose it using the same arguments as I have seen on this thread (I know they don't hold water, but Groundspeak will most likely take notice of them). Wayne
  10. I'm not so sure. My primary reason for getting involved in this thread is not so much for myself, though I would benefit a little. It's for the (probably) thousands of geocachers who don't know enough to maintain their offline databases, let alone download and install a macro so that it's easy to use. Leaving them high and dry is just a cop-out, while the change requested would benefit them greatly. Wayne
  11. "And I have yet to see where including an archived cache in a PQ is necessary for keeping offline data fresh." Wayne
  12. Groundspeak can intend whatever they like - but geocachers around the world will still do what they want with the data supplied in pocket queries. Offline databases will continue to exist, and when a cacher upsets a land owner, who do you think gets the blame? The way that I see it... Groundspeak acknowledges that many geocachers have and maintain their own offline databases with data from pocket queries. Groundspeak supports the use of offline databases, both by advertising these databases on its website and by continuing to produce GPX file data. Groundspeak acknowledges that there is a problem with offline databases. Groundspeak can include the requested extra data in pocket queries, but continues to refuse to. In my non-legal opinion, that makes Groundspeak an accessory before the fact when new geocachers use old data from their offline database. Wayne
  13. Mr Markwell, Thank you for your point of view without stooping to sarcasm (as often happens on these forums); however, I stand by the points that I made. I also assume that you are on the Groundspeak payroll (no problem with that - in fact I feel as if I'm being listened to!), so perhaps you can get the following to Jeremy and/or TPTB. Groundspeak may have their rules/policies on what data they include in GPX files, but we are your paying customers and we are not asking for very much. Like many, many others, I have an offline (GSAK) database - it allows me to do what GC.COM does not, and like some others, I am quite computer literate and, having been a professional programmer, I can deal with Archived caches. The thing that concerns me most is that there are no requirements to start geocaching - anyone can start any time they want. In time, new cachers will hear about things like offline databases from older cachers and organise their own. If they do not know what is missing, they will hunt for all/any caches, including Archived caches. Like many, I plan a day out caching the night before, with whatever data I have available and my latest GPX files. And occasionally I end up looking for a cache that has been archived. Sure, it's my fault, but I know how to find out which caches are archived. To quote you > A land manager comes by and finds that a cache has been placed illegally. They don't want that particular location attracting visitors because the... - I agree, but the simple fact is that Archived caches DO get looked for, and Land Managers DO get 'annoyed' - probably because some of us are using old data. And Groundspeak CAN help to reduce that from happening. Land managers' demands should be heard and acted on, but I don't think they have the right to expect cache details to be removed from the system. Why should they have a right above and beyond those of paying members?
They do have the right, though, to expect that we all stop going onto the property/into the area looking for that cache. Your system as it is does not ensure that. Geocaching the GC.COM way probably does happen, but certainly not by all geocachers. Geocaching out here has evolved to use offline databases for various reasons, including that internet access is generally line-based. I think it's time for Groundspeak to catch up and solve some of these problems, instead of stone-walling and saying it's our problem/responsibility. Keep in mind that currently I can get Archived cache data in GPX files in two different ways. We are only asking for minimal information to be included in pocket query GPX files, and then only for a short period, and Groundspeak is the only one that can do that. Offline databases are a fact of life now, and they DO (sometimes) contain old data after a cache has been Archived. Groundspeak CAN do something about it. What is so hard? Why can't we have that? Wayne
  14. I suppose that sooner or later someone needs to come up with an idea that Groundspeak will take hold of. But first - a few points. 1 - Archived cache data belongs in the general cache database. 2 - Archived cache data is available in GPX files from the cache page. 3 - Archived cache data is available from a GPX file of my found caches (which I run every so often). 4 - Archived caches are valid caches until they are marked as Archived in my (and others') offline DB, and they may be hunted/looked for. Groundspeak, I am one (of many) of your customers who want Archived cache data in regular GPX files. I'm not asking for much - no log data, just the cache ID and the archived flag. There are many of us who want this extra data to make geocaching simpler, and in some cases to save us time and money. You don't have the right to tell us how we have to deal with the data that you supplied, because you never supplied us with a program to use the data with. Instead you left that to other sources, and now we have GSAK, Cachemate and a host of other database programs. Give us what we are asking for. Charge a little more for the data if you want, but stop stone-walling and give us fully up-to-date data. It could be incorporated in a Pocket Query for "Any Activity in the last 7 or 14 days" (cache status and all logs over the period). If this information was available to me in one pocket query, I would disable most of my queries each day, knowing that I wouldn't lose any information. It would be a major benefit to you as well. Wayne (getting down off the box now)
  15. It's a fact of life that many cachers have evolved to use their own offline database such as GSAK - and if a GPX file does not have archived cache data, then it is not the 'newest freshest data available'. It's also a fact that if all cachers searched the GC database for every cache they wanted to hunt, the server probably would not be able to keep up with the demand (especially on weekends). Without Archived cache information, cachers may find themselves searching in sensitive areas when the cache has already been removed. This definitely goes against the idea of removing a cache because of damage to the area or, even more importantly, because a landowner has asked for the cache to be removed. Also consider the time lost/wasted by people searching for something that isn't there. There is no reason for us to be constantly denied archived cache information within a GPX file. There is no need to include log data - just the cache ID and the archived flag. I'm sure this would be enough to satisfy people and allow them to update their offline database in the knowledge that it was in fact "the newest freshest data available". My point of view - Wayne
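The "just the cache ID and the archived flag" idea in the post above is easy to consume on the client side. Here is a minimal sketch, assuming such a status-only feed would reuse Groundspeak's GPX 1.0 cache extension (whose `<groundspeak:cache>` element does carry an `archived` attribute); the offline-database layout is invented for illustration.

```python
import xml.etree.ElementTree as ET

# Namespaces as used in Groundspeak GPX 1.0 files (assumption: a
# status-only PQ would reuse the same schema).
NS = {
    "gpx": "http://www.topografix.com/GPX/1/0",
    "gs": "http://www.Groundspeak.com/cache/1/0",
}

def apply_status_update(gpx_text, offline_db):
    """Mark caches in a toy offline database as archived, based on
    the 'archived' attribute of each cache element in the GPX."""
    root = ET.fromstring(gpx_text)
    for wpt in root.findall("gpx:wpt", NS):
        code = wpt.findtext("gpx:name", default="", namespaces=NS)
        cache = wpt.find("gs:cache", NS)
        if cache is not None and code in offline_db:
            offline_db[code]["archived"] = cache.get("archived") == "True"
    return offline_db

# Minimal sample file: one active cache, one archived (codes invented).
SAMPLE = """<gpx xmlns="http://www.topografix.com/GPX/1/0">
  <wpt lat="-31.95" lon="115.86"><name>GC1234</name>
    <cache xmlns="http://www.Groundspeak.com/cache/1/0" archived="False"/></wpt>
  <wpt lat="-32.05" lon="115.74"><name>GC5678</name>
    <cache xmlns="http://www.Groundspeak.com/cache/1/0" archived="True"/></wpt>
</gpx>"""

db = {"GC1234": {"archived": False}, "GC5678": {"archived": False}}
apply_status_update(SAMPLE, db)
```

After the update, GC5678 is flagged archived in the local database while GC1234 stays live - exactly the "cache ID plus flag" refresh the post asks for, with no log data needed.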
  16. I agree with you - it wasn't the best way to notify users. And it still isn't notifying many users; I found out about it in the GSAK forum. I personally think that it is a good idea, but every 30 days is a bit too much - 90 days would be more acceptable. There must be other ways of lowering the overhead on your PQ system than by doing this, though, and you're paying money (I assume) for programmers and system engineers. Several good suggestions have been made above and should be seriously considered - but here's my bit... I live in Western Australia. Like most cachers over here, I run two queries per day to keep my GSAK database up to date, though from time to time (as now) I keep tabs on just a little more. A single PQ containing all updated records for, say, 14 days could be run and sent to every cacher (premium members only, of course) in Western Australia who requests it, on a daily/weekly/fortnightly basis, saving perhaps 50 PQs per day from running. Not much of a saving, but looking at it from a global position, there would have to be a considerable reduction in the number of PQs run on a daily basis. There would have to be a PQ option to select an unlimited number of caches with recent updates within a preset period (e.g. all logs for the last 7 or 14 days), for a specified state or country, and it would override any/all other selected options. This type of PQ would be run first each day (top priority) and the resulting GPX file would be distributed to everyone selecting it for that state. The PQ would only run once, with your mail server handling the distribution task. The end result is that you could give us as customers more (caching logs) for less (PQ machine overhead). E.g. if I wanted to keep a database on all of Australia, I could keep it up to date with the data from a run-once PQ for all of Australia that includes me in its distribution list. Just my thoughts - Wayne
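The "run once, distribute to all subscribers" scheme in the post above amounts to simple arithmetic. This back-of-the-envelope model is purely illustrative; the function name and figures are hypothetical, not anything Groundspeak publishes.

```python
def pq_runs_saved(subscribers, per_cacher_daily_pqs=2):
    """PQ executions avoided per day if one shared state-wide query
    replaces each subscriber's individual daily queries; the shared
    query itself still has to run once."""
    return subscribers * per_cacher_daily_pqs - 1

# Roughly matching the post's estimate: if ~25 WA cachers each run
# 2 daily queries, one shared run replaces 50 of them.
saved = pq_runs_saved(25)
```

With 25 subscribers the shared query saves 49 runs a day for one state; scaled across all states and countries, the reduction compounds, which is the post's "more for less" point.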
  17. How is this coming along? There seem to be a lot of people (including myself) waiting for this feature before they go any further.
  18. Oops - it appears that this may be the result of Zone Alarm Pro renaming the file extension. Wayne
  19. I have just re-registered and my PQs are coming through again, but the attached PQs have a .zm9 extension that GSAK doesn't pick up. Is there a simple explanation for this?
  20. Brilliant - almost as good as bottled beer
  21. Thanks for the reply Allen - I hadn't even considered indexes. The last database (Pick) that I worked on didn't have that sort of thing.
  22. No. Not true. 1. The number of 2500-cache queries would increase dramatically, whether or not the geocacher needs them. 2. 5 PQs of 500 apiece, or 1 PQ of 2500, is constant and involves the same processing power, if not more, as the query time for each PQ would increase the time before another PQ could be processed. Two low-hanging-fruit reasons why this idea does not help the issue. It seems to me that one search of the database to find perhaps 2500 caches would be quicker than 5 searches for the same data, especially when PQs are structured to report up to 500 caches but are set to never find all 500. As an example, I have two PQs for Western Australia. The first PQ selects 500 caches set prior to a particular date (about 15th May 2005). The second PQ selects 500 caches set after that date. Neither search reports 500 caches, so each PQ must search the entire database. It's true that increasing the selection count from 500 to whatever would slow down the running of that PQ, but that may be preferable to having a multitude of PQs searching exactly the same data over and over again when it isn't necessary. I also have 11 PQs that report on unknown-type caches in the US. Each PQ is structured to report on up to 500 caches, but never finds that many. I run these PQs on demand only - perhaps once every 6 months, since I only look for ideas from them. But the point is that the entire database is searched for each PQ: that is 11 searches instead of just 1. There has to be a compromise somewhere, though. Setting the number too high may cause dramatic slowdowns in PQ operation times, and may also report on caches not required by users. There may also be limitations on PQ reporting because of email attachment size.
A suggestion is to increase the number of caches reported on to perhaps 1000 or 1500 or even 2500, then email everyone who creates PQs informing them of the change and requesting them to alter their PQs to suit. Then sit back and see if the load increases or decreases.
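The arithmetic in the post above can be made concrete with a toy model - purely illustrative, assuming each PQ costs one pass over the candidate set regardless of how many caches it actually returns.

```python
def scans_needed(total_caches, cap):
    """Number of pocket queries (each modelled as one full database
    pass) needed to cover total_caches when each query may return at
    most cap results. Uses ceiling division."""
    return -(-total_caches // cap)

# The post's examples: 2500 caches under a 500 cap take 5 passes;
# under a 2500 cap, a single pass.
five_passes = scans_needed(2500, 500)
one_pass = scans_needed(2500, 2500)

# The 11 on-demand US "unknown" queries collapse to 1 pass if the
# cap covers their combined result set.
us_passes = scans_needed(11 * 500, 5500)
```

Under this (admittedly crude) cost model, raising the cap trades a slower single query for many fewer full scans, which is exactly the compromise the post argues Groundspeak should experiment with.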
  23. Marvellous - I have received my update (all of a sudden). Do I have to post a request/prompt here each time I want a GPX update, or will it come through automatically?