
xafwodahs

+Premium Members
  • Posts: 33
  • Joined
  • Last visited
Everything posted by xafwodahs

  1. Had the same issue on GCXE77. Adding a <br/> at the end of the short description seemed to help.
  3. Hello, my cache at http://www.geocaching.com/seek/cache_detai...26-f5458acf1f18 seems to have some garbage characters, and I can't figure out where they came from. They appear right after this text: "Part of the Aux Plaines Crossing series. Bonus Cache." If this could be cleaned up, that would be appreciated. Thanks.
  4. Browsing caches is often a tedious process because the only way to get an idea of which caches you want to search for is to read the descriptions - the cache name alone is usually insufficient. This means clicking and opening a large number of cache pages. I thought it might be a neat feature to allow a cache owner to upload a single photo that attempts to capture the *essence* of the cache. Upon upload, a suitably sized thumbnail could be generated, and the cache browsing on the website could then allow photo browsing as well - for example, each page could show a 4x5 grid of thumbnails with cache names (a rough sketch of the thumbnail step appears in the notes after this list). Any thoughts?
  5. Is there a way to do a pocket query of caches that have not only not been found by me, but also not been found by a specified friend? For example, let's say today I am going caching with FriendX - I want a query to give me a list of caches that *neither* of us has found. And tomorrow I might go caching with FriendY, and I'll want a similar query of caches with that friend's finds excluded... So far, I've been using a Perl script that *subtracts* someone's "all found" GPX from a regular pocket query (a sketch of that subtraction appears in the notes after this list), but that requires all my caching friends to email me their "all found" GPX files on a regular basis, which is a pain. If the pocket query page could have a "Not found by" text field that allows a (space- or comma-separated) *list* of geocacher names, that would be very nice indeed.
  6. Has there been any posting as to *why* this "ain't gonna happen"? All I have seen from the administrators is "nope, sorry". If this were a completely free service, then people would just have to use what's available. The problem is, once you start taking people's money for a service, those people are going to want a say in how the service is provided. As I've said in a previous post, I don't think it's unreasonable for people to expect a little better discussion than "No, we won't do it." If the reasons have been explained in other postings, please provide a link.
  7. I must admit I think I just discovered a feature that had eluded me. When I mentioned iteratively changing PQs based on which ones are maxed out, I didn't realize that if you select the "Preview Pocket Query" icon, the resultant page seems to contain a cache count. If this count is accurate, it would speed up the iterative process considerably, since in the past I scheduled the query and waited for the results (usually the next day) before I knew whether it had maxed out. It would still be an iterative process, but not nearly as painful as it was. I've never really used the preview function much and didn't think about it until now, but if there is a count, then that means a database query is done every time I run the preview. My initial thought was: isn't this compounding the query overload problem? However - and I only have moderate database experience - I would suspect that getting the count may be a much simpler/quicker query than pulling all the information necessary for the GPX file, especially since the logs are probably stored in a different table and multiple queries may be needed to generate each <wpt> for the GPX file. And that's probably an overly simplistic view.
  8. There is clearly a desire from a number of members (certainly not all) to have an easier method of getting GPX files with lots of caches. And as the newer model GPSrs come out and as the cache density increases, that desire will probably increase as well. (My Garmin 60CSx will essentially hold an unlimited number of caches as custom POIs.) Many methods have been proposed here and elsewhere. For some, those methods are sufficient. For others, they are insufficient. The PQ-by-date method is fine, until you find that 90% of the caches in the area were placed in the same year and you have to iteratively break down the maxed queries into multiple queries (highly inefficient). The pay-extra-for-multiple-accounts method is fine, except that most people's need for such large queries is infrequent at best (vacations/trips), so most people don't feel that is warranted. If you do want more caches than the current system is designed to offer, your choices are: 1. Give up and live with the limitations. 2. Iteratively modify many PQs until you get a bunch of GPX files that are sufficient. Don't get me wrong - I am grateful for the services that *are* provided and I'm glad that there are people who take the time to work on this site - I'm sure they are underpaid. However, I don't think it's unreasonable for paying members to ask for additional services, including an easier method of obtaining large GPX files (or many smaller GPX files). Clearly, those requests need to be considered within the framework of existing performance limitations, but they should be considered. I haven't read *all* the postings on this issue, but my general feeling has been that the administrators are not interested in having much dialog. I know the one post I made on this issue a long time ago was shut down with no explanation: http://forums.Groundspeak.com/GC/index.php...p;p=entry That post is only one suggestion. As I'm sure TheCarterFamily would agree, there are almost certainly a number of ways that the site could at least partially alleviate this problem, and do so without significantly reducing the performance of the more standard queries. I hope that this issue gains enough momentum that it is addressed by the administrators. I don't know how many users want this, and I don't know how the administrators decide which features to prioritize, but I can say that for me personally, this is the feature I would most like to see implemented.
  9. I decided to wait until this morning to post yesterday's finds, but all morning I have been getting a server error when I submit my log information. I can view the cache, I can open the log page and fill it out, but when I hit submit I get the error below. I have tried a couple of different caches and have tried 3-4 times over the last several hours:

     Server Error in '/' Application.
     Object reference not set to an instance of an object.
     Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
     Exception Details: System.NullReferenceException: Object reference not set to an instance of an object.
     Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
     Stack Trace:
     [NullReferenceException: Object reference not set to an instance of an object.]
        Geocaching.UI.LogBookPanel.ValidateNewLog(Boolean IsEdit) +275
        Geocaching.UI.LogBookPanel.CreateNewLog(Boolean CheckConfirm) +350
        Geocaching.UI.LogBookPanel.LogButton_Click(Object sender, EventArgs e) +43
        System.Web.UI.WebControls.Button.OnClick(EventArgs e) +108
        System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument) +57
        System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +18
        System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData) +33
        System.Web.UI.Page.ProcessRequestMain() +1292
     Version Information: Microsoft .NET Framework Version:1.1.4322.2300; ASP.NET Version:1.1.4322.2300
  10. One other point I forgot to make: Some would argue that all these queries would overload an already overloaded query machine. However, I would argue that having these pre-generated gpx files would reduce the use of custom queries and that the total number of queries may go down. I don't know how many queries are currently done, so I can't say for sure. Also, these queries could be run at any time of day - whenever the query machine is least loaded - and the user wouldn't be waiting for the results. The ultimate effect would be to transfer some of the custom query processing (which can require a lot of computing power) over to file downloading (which requires relatively little computing power).
  11. I started running a few numbers for storage size. I wrote a quick script to calculate the average size of each <wpt>...</wpt> element in a GPX file (a rough sketch of such a script appears in the notes after this list). After running it on a couple of GPX files (both with 500 caches), I get an average of about 5300 bytes per wpt. The main geocaching page currently reports 212101 caches. 212101 * 5300 bytes = ~1.05 GB. That doesn't seem like it would be a problem. The bigger problem seems like it would be defining a grid structure that strikes a good balance between the number of GPX files (and hence the number of queries) and the average/max number of caches per GPX file. However, if this were implemented, I don't see a reason why any particular GPX file would have to be limited to 500 caches...
  12. And what happens when one of those queries, say the one for 2004, comes back with 500 caches? That most likely means it maxed out at 500 and there are more caches from that year. If you want them all, then you have to split the query for 2004 into two queries, maybe one for the first half of 2004 and the other for the second half (a sketch of this splitting appears in the notes after this list). Whether you define your queries based on time or location is irrelevant - you still might get 500 caches in a result, which means you're missing some, and you'll have to reduce the scope of those queries until each result is under 500 caches. Some would argue that since you're going to be filtering out some caches anyway, who cares about those missing caches? But maxing out at 500 caches doesn't give the user the choice of *which* caches to filter out.
  13. I periodically take vacations to other states for the sole purpose of geocaching (actually sightseeing, but the geocaches provide the specific destinations...). Anyway, every trip is the same: it's a big state, and I don't have any set itinerary, so I don't know where I'll be going. So I want my GPSr to have all the caches for the entire area (or at least as many as will fit into my GPSr). I end up setting up multiple pocket queries that partially overlap and together cover a wider area. If a pocket query returns 500 caches, I shrink the max distance for the query, re-adjust the other queries so there are no coverage gaps, and run all the queries again. Sometimes I alter the parameters of the query to try to reduce the number of caches. I repeat this process until I'm satisfied with the results. I then have a Perl script which I've written to combine all the GPX files, remove the duplicates (due to the overlapping), and generate a final GPX file. From there I can use other tools to filter out some caches until I have <1000, which is how many my GPSr can hold. Running and rerunning the pocket queries is tedious and can take days. It occurred to me on this last trip that having pre-generated GPX files available for download from geocaching.com might be an answer. (Note: I haven't yet tried to guesstimate any numbers on this to see if it's actually practical; I'm hoping the site maintainers will comment on this...) It could work this way (a rough sketch of the step 1 grid mapping appears in the notes after this list):

      1. Divide cacheable areas into simple grids. Define a simple grid structure. For example, each grid block could be 30 minutes of latitude by 30 minutes of longitude.

      2. geocaching.com runs automatic queries. Periodically (weekly, maybe), the geocaching.com query machine runs queries for each grid block and stores the resultant GPX file.

      3. User downloads GPX files. Through some browser interface, the user can specify the bounds of the area they're interested in and get a list of these pre-generated GPX files, which the user can then download.

      4. User uses PC software to do the rest. The user can then use PC software to combine the results, experiment with filtering out different types of caches to reduce the cache count, etc.

      The ultimate goal here is to get all the caches in an area of interest onto the user's PC. At this point, the user can try different filters (e.g. terrain difficulty of 3 or less) to get the final list of caches they want to take on their trip. Comments are welcome.
  14. I'd like to think I'm commenting more than complaining... I agree that many people do go to a lot of trouble to place caches, and I appreciate their efforts greatly. I also think some people don't put much effort at all into their caches, and I'm just curious if other people think so as well.
  15. Many of these posts suggest simply not going to the uninteresting caches. However, it is often difficult to tell whether a cache will be interesting until you get there. Difficulty and terrain don't really provide a good, consistent measure - some of the best caches are easy ones - and while the description might give you a good idea, sometimes, instead of spelling the whole thing out before you even arrive, the cache has only a short description so that you're surprised when you get there. I realize "interest" is in the eye of the geocacher, so to speak, but I also think most people will generally agree on what's interesting and what isn't. Personally, in 2.5 years of geocaching, I have only placed one cache (I have adopted a couple more from someone else). I want to place more, but I haven't found anything I think is interesting or unique enough yet, so I'm holding off until I do. I'm not saying every cache needs to be a masterpiece, but a park bench that's clearly visible in a popular park? Come on...
  16. Just wondering what other people's opinions are on this... I live in northeast Illinois - north of Chicago, a fairly populated area. I have done over 500 caches - many of those near my home. And I am getting the feeling more and more that there are simply too many caches around here. Or to be more accurate: too many uninspired caches. It seems I am always reading descriptions like: "I was just on my way to so-and-so and decided to drop off a cache in the little park nearby." How much forethought was there? Why was this place special enough to put a cache there? Usually these quick-and-dirty caches aren't particularly interesting, and the only value of them in my mind is to increment my cache-found count (which to me has little value). I feel like I've been to every park bench and 10-foot creek bridge (and yes, even fallen tree) in the county - after a while they all look the same. And worse: more and more caches are blatantly put in easy view of muggles. There are several *series* of caches (meaning many caches) in my area that are just stuck on the back of street signs and so forth right near busy intersections and shopping malls. Of course I'm going to be seen doing one of these caches! These caches rely on the fact that most people won't bother to actually investigate what you're doing. In my area now, if I do a dozen or so caches in a day, I feel lucky if 2 of them were memorable. Often, none are. So I feel compelled to travel to more remote areas for my caching expeditions. Of course, this requires more time, and thus I am able to do it less often. I welcome your comments.
  17. The information would merely satisfy my curiosity. I think cache placers would like to know who is interested enough in their caches to watch them. Granted, there may be some cases like yours where you're watching a particular cache not because of its construction but merely because of its location. If a cache owner cannot see 'who' is watching a cache, what is the value of knowing 'how many' people are watching (which is provided)? Personally, I watch caches that were placed by people I know, so that I can discuss the finds with them.
  18. Is there a way to determine who is watching a cache? I don't think this information should be provided to everyone, but it seems that a cache owner should be able to find out. So far, I have found no such feature. If this feature doesn't exist, should it?
  19. 'Road to Ruin' was finally approved. I want to say thank you to anyone and everyone who helped - especially Team Purdy for moving their cache.
  20. Hi Team Purdy- I just emailed the ILAdmin for a status update. I'll let you know as soon as I know anything. Thanks again.
  21. Hi Team Kender - Yes, I did read that log. That's quite an amazing series of events for Hillwilly. I'm not worried about such an event related to this cache, however. First (and probably least important), my container is a green Tupperware container - probably less intimidating to the casual observer than an ammo box. Second, because the bridge is elevated and in dense trees, a cacher would not be visible to anyone on or near the train. Third, I think by now 9/11-based fears are at least a little bit diminished, and hopefully someone looking around on an already destroyed bridge will not cause an over-reaction. Fourth (and probably most important), there is a fence which defines the separation of properties, and the cache is definitely on the 'away' side of the fence.
  22. Thanks, and thanks for your willingness to move Middlefork. I'll let you know what they say. Ya, I think it is very interesting historically. I found the bridge the same way GyozaKing recently did while looking for Middlefork (see log on Middlefork). The road you see, if I'm not mistaken, was part of John Ogden Armour's 2 mile driveway to his house (which is now the Lake Forest Academy). This multicache is designed to walk you along the driveway from the gates off of highway 43, past the restored bridge, to the ruined bridge. It is only because of the unique historical aspect that I'm being such a pain-in-the-butt regarding this cache. I think other people would enjoy it. Anyway, thanks again for your assistance. I'll let you know what they say.
  23. Hi Team Purdy - Thanks very much for looking into this. 'Middlefork' is about 380 feet north (and slightly east) of the bridge. [I don't have my GPS with me, otherwise I'd give you an exact bearing.] So if you move it, the safest bet would be to move it a couple hundred feet north. However, I received an email from the Groundspeak personnel that sounded like the only real sticking point was going to be the proximity to the tracks. I've emailed back and I'm waiting on more details. So, you could move it now, or you could wait to see what answer I get from the Groundspeak personnel. Again, I really don't like the idea of asking someone to move a cache, and if my cache weren't on an immovable object, I would simply move mine elsewhere. Either way, I appreciate your participation in this matter.
  24. Thanks for replying, Team Purdy. I tend to agree with you. The .1 mile rule is supposed to prevent saturation, but I don't think the area is saturated yet. It has been suggested in the above posts that I ask you to move 'Middlefork' a couple hundred feet, but I really don't want to do that unless the Groundspeak personnel refuse to allow the cache as is. So far, I have not heard from them. Again, thanks for entering the discussion.
  25. I feel compelled to point out that a recent new cache in the area (GCGZ7Q - Red Raspberry) is only 74 feet by my measurement from the very same railroad tracks. And because I know the area, I can tell you that it is by a trail which actually crosses the tracks, so there is no fence. If this could get approved, then it doesn't seem like the railroad should be any issue for my attempted cache. That only leaves the proximity to the 'Middlefork' cache. But it has been pointed out that this is often a guideline to prevent saturation, not a 100% rule. I would also like to point out that 'Middlefork' and 'Red Raspberry' are 2 of 8 caches in this area, all placed by the same person and all within the last 3 weeks! I have no problem with someone enjoying a place and putting caches there, but at this rate, I'm concerned I'll have more than just 1 cache within .1 mile of the bridge by the time this discussion has concluded!
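
A rough illustration of the thumbnail step suggested in item 4. This is only a sketch, assuming the Pillow imaging library; the file names and the 150x150 pixel target size are made up for the example, and this is not anything geocaching.com actually provides.

      # Sketch of the thumbnail idea from item 4 (assumed library: Pillow;
      # file names and the 150x150 size are invented for illustration).
      from PIL import Image

      def make_thumbnail(src_path, dest_path, max_size=(150, 150)):
          """Create a browsing thumbnail that keeps the photo's aspect ratio."""
          img = Image.open(src_path)
          img.thumbnail(max_size)   # shrinks in place, preserving aspect ratio
          img.save(dest_path)

      make_thumbnail("cache_photo.jpg", "cache_photo_thumb.jpg")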
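
A minimal Python sketch of the "subtract a friend's finds" step from item 5, standing in for the Perl script mentioned there. It assumes both files are standard GPX 1.0 exports with one <wpt> per cache whose <name> child holds the GC code; the file names are hypothetical.

      # Remove from a Pocket Query GPX every cache that appears in a friend's
      # "all found" GPX, matching on the GC code in <wpt>/<name>.
      import xml.etree.ElementTree as ET

      GPX_NS = "http://www.topografix.com/GPX/1/0"   # assumed GPX 1.0 namespace
      ET.register_namespace("", GPX_NS)

      def cache_codes(gpx_path):
          """Return the set of GC codes listed in a GPX file."""
          root = ET.parse(gpx_path).getroot()
          return {wpt.find(f"{{{GPX_NS}}}name").text
                  for wpt in root.findall(f"{{{GPX_NS}}}wpt")}

      def subtract(query_gpx, found_gpx, out_gpx):
          """Write query_gpx minus every cache whose code appears in found_gpx."""
          found = cache_codes(found_gpx)
          tree = ET.parse(query_gpx)
          root = tree.getroot()
          for wpt in root.findall(f"{{{GPX_NS}}}wpt"):
              if wpt.find(f"{{{GPX_NS}}}name").text in found:
                  root.remove(wpt)
          tree.write(out_gpx, xml_declaration=True, encoding="utf-8")

      subtract("pocket_query.gpx", "friend_all_found.gpx", "remaining.gpx")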
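
A rough Python equivalent of the "quick script" described in item 11: estimate the average size of a <wpt>...</wpt> element in a Pocket Query GPX file and project the total from the cache count quoted in that post. The file name is hypothetical and the regex assumes the usual one-<wpt>-per-cache layout.

      # Estimate bytes per <wpt> element and project the total storage need.
      import re

      def average_wpt_bytes(gpx_path):
          """Average size in bytes of the <wpt>...</wpt> blocks in one file."""
          data = open(gpx_path, "rb").read()
          wpts = re.findall(rb"<wpt\b.*?</wpt>", data, flags=re.DOTALL)
          return sum(len(w) for w in wpts) / len(wpts)

      avg = average_wpt_bytes("pocket_query.gpx")   # about 5300 bytes in the post's sample
      total_caches = 212101                         # count quoted in item 11
      print(f"~{avg:.0f} bytes per <wpt>")
      print(f"projected total: {total_caches * avg / 2**30:.2f} GiB")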
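
A sketch of the iterative splitting described in item 12: when a placed-date query comes back at the 500-cache maximum, bisect the date range and try each half again. Here run_query(start, end) is a hypothetical stand-in for running a Pocket Query limited to that placed-date range and counting its results; it is not a real API.

      # Recursively split a placed-date range until every sub-range returns
      # fewer than the 500-cache Pocket Query limit.
      from datetime import date, timedelta

      MAX_RESULTS = 500

      def covering_ranges(start, end, run_query):
          """Return date ranges whose queries each come back under the limit."""
          if end - start <= timedelta(days=1) or run_query(start, end) < MAX_RESULTS:
              return [(start, end)]
          mid = start + (end - start) // 2
          return (covering_ranges(start, mid, run_query) +
                  covering_ranges(mid + timedelta(days=1), end, run_query))

      # e.g. covering_ranges(date(2004, 1, 1), date(2004, 12, 31), run_query)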
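
A minimal sketch of the 30-minute grid from step 1 of item 13: map a cache's coordinates to the grid block it falls in, so that one pre-generated GPX file could be kept per block. The (row, col) naming scheme is an assumption for illustration, not anything geocaching.com defines.

      # Map latitude/longitude to a 30-minute-by-30-minute grid block.
      import math

      def grid_block(lat, lon, cell_minutes=30):
          """Return the (row, col) of the grid cell containing the point."""
          cell = cell_minutes / 60.0            # cell size in degrees
          return (math.floor(lat / cell), math.floor(lon / cell))

      # Example: a cache near Chicago (41.88 N, 87.63 W) falls in block (83, -176).
      print(grid_block(41.88, -87.63))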