Posts posted by caderoux

  1. Will this be working again (right now I get a blank page)? If not, will the link be removed/disabled?

     

    Also, the forum will not let you search on "+gpx +file +approved" because it contains a 3-letter word. I know gc.com did not write the forum software, and cannot be blamed for the bad design, but it would seem that the three character limit is being misapplied given the nature of the specific search criteria I gave.

  2. I really don't see what the driving force is behind the need to have every single cache that has ever existed in your area in a private database.

     

    I think most people only want the active caches plus the archived caches they've hunted or are tracking for whatever reason.

     

    We can get notifications of new caches (immediately and/or weekly). Would getting similar type notifications for caches being archived be a possible solution?

     

    This is already possible and I'm sure other people are using it as am I. This method is especially good for paper cachers, who can use the notification to be able to purge their binders.

     

    BTW: I've never had the problem of archived caches in my GSAK database because I always use only my new PQs instead of adding them to the existing database.

     

    That results in your PQs probably being larger and more frequent than those designed only to add to and update an existing database. There was a time not long ago when PQs were extremely unreliable due to load issues, and we were told that the determining factor in measuring the load was the number of caches returned.

  3. I'm starting to think about a tech night for local cachers. After our last event, there are a lot of people who want to get more out of their GPSrs, go paperless, or just use gc.com's notifications properly (anecdotal evidence that most of the people at the event couldn't get theirs working may point to the interface not being very friendly!).

     

    I've got a big mind map of all the topics I want to cover (it might be a series of tech nights - and I won't be able to cover all the topics myself - I'm going to draft Garmin and Magellan experts to deal with their GPSrs), and it's too big to post here.

     

    What I'm looking for is any resources you may already have gathered related to holding such an event. Especially:

     

    Powerpoints or HOW-TOs

    Web pages with tutorials

     

    Also, if you've held such an event yourself, any lessons learned would be useful - not just about what to teach; there are also some logistics I've got to work out.

  4. We were told that Waymarking was the solution for the moratorium on Locationless caches. Then we were told that Waymarking would also be the solution to the subjective "wow" requirement on Virtuals. But then the PURITANS said "Locationless and Virtuals are not caches and never were caches". I personally think that Waymarking can be the solution to the problems of locationless and virtuals. I am even willing to not have waymarks count in my find count. But using the argument that waymarks are not geocaches and therefore there should be no integration of statistics or searching between the two sites makes it seem like we were lied to for two years while we waited for the "solution". The PURITANS' solution is to make anything they don't like go away. I used to like geocaching because different people could participate in different ways. No more raspberry flavored ice cream. Only chocolate and vanilla are served here :)

     

    I hear ya. I'm vanilla in Chocolate City myself.

     

    gc.com tends to go in cycles.

     

    A new thing comes out and then it needs to be accommodated and then it gets out of control and there's a backlash and then it gets a clampdown. All because no one really knows how something is going to evolve until about the third iteration.

     

    All these things you mention about the puritans are all manifestations of this.

     

    There are tons of examples which have repeated the pattern here:

    Micros - tons of haters - accommodated with size filters, power trails forbidden

    Virtuals - "it's not the basis for the activity" - now moved to Waymarking

    Locationless - ditto

    Geocoins - big backlash brewing on this one for the last month - with all the "virtual" geocoins and posts about events becoming coin-centric - accommodated with new trackables page

     

    And then there are odd little areas where gc.com has tried to embrace related things like benchmarks and letterboxing, but they end up kind of trailing behind, and no one really knows what their future is.

     

    I think you've just got to sit back and enjoy the ride. There are always other sites which will arise to handle things which have been completely ruled out here:

     

    Terracaching: Community policing, scoring, FTF codes, verification codes

    Geodashing: GPS sightseeing (no physical cache)

  5. Is there some reason why people cannot just create HTML documents for their cache pages, using code that browsers understand?

    I'm sure TPTB have reasons - one of which is that arbitrary code could allow people to launch malicious attacks on viewers' computers that appear to come from Groundspeak.

     

    Notice that no IFRAME or SCRIPT tags are allowed - they have potential for good but also for evil.
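To illustrate why those two tags get singled out, here's a minimal tag-whitelist filter in Python. This is a sketch of the general technique only, not Groundspeak's actual code, and the allowed-tag set is made up:

```python
from html.parser import HTMLParser

# Hypothetical allow-list; a real site would tune this carefully.
ALLOWED = {"p", "b", "i", "a", "img", "br"}

class TagFilter(HTMLParser):
    """Keeps whitelisted tags, drops everything else.

    SCRIPT and IFRAME are dropped along with their contents, since both
    can run or embed arbitrary code under the host site's origin.
    """
    def __init__(self):
        super().__init__()
        self.out = []
        self.skip = 0  # >0 while inside a script/iframe element

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "iframe"):
            self.skip += 1
        elif tag in ALLOWED and not self.skip:
            self.out.append(self.get_starttag_text())

    def handle_endtag(self, tag):
        if tag in ("script", "iframe"):
            self.skip = max(0, self.skip - 1)
        elif tag in ALLOWED and not self.skip:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if not self.skip:
            self.out.append(data)

def sanitize(html):
    f = TagFilter()
    f.feed(html)
    return "".join(f.out)

print(sanitize('<b>hi</b><script>alert("evil")</script>'))  # <b>hi</b>
```

The same idea extends to stripping dangerous attributes (onclick, javascript: URLs), which is why hand-rolled filters like this are risky and real sites lean on battle-tested sanitizer libraries.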

  6. gc.com can fix their HTML stripping issue.

     

    As far as browsers, I think we'll always have some weirdness. (Same in Firefox).

     

    For instance take this character: あ (& #12354;)

     

    After I typed it in as an entity and then went to preview, the textbox presents the data as a single glyph. The underlying HTML is the entity, but it doesn't show as the entity any more in the text box. If I copy this to the clipboard and paste it into notepad, I then get prompted that the file must be saved as one of the unicode options or data will be lost.

     

    The rest of the entity/display behavior is specific to this forum. After repeated round trips to the server, the data is always the entity in the stream. It's probably stored that way in the DB for the forum. The forum is PHP - and PHP's unicode support is poor - and this page uses <meta http-equiv="content-type" content="text/html; charset=iso-8859-1" /> and the form uses no accept-charset so the forum is not necessarily a good example of how the regular gc.com handles it (or even how it should be handled.) For instance, when I paste clipboard unicode into the forum - it converts it TO an entity.
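The entity conversion the forum appears to perform can be mimicked in a couple of lines. This is a Python sketch of the general behavior, not the forum's actual PHP code:

```python
# A character that doesn't fit in an iso-8859-1 page gets turned into a
# numeric character reference ("entity") on the way out. Python's
# xmlcharrefreplace error handler performs exactly that fallback.
text = "\u3042"  # あ (HIRAGANA LETTER A, code point 12354)

as_entity = text.encode("iso-8859-1", "xmlcharrefreplace")
print(as_entity)  # b'&#12354;'

# Decoding the page back yields the literal entity text, not the character,
# which is why the entity stays an entity in the stream on every round trip:
print(as_entity.decode("iso-8859-1"))  # &#12354;
```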

     

    The cache editing page uses charset=utf-8, but does not specify an accept-charset on the form ( <form name="Form1" method="post" action="report.aspx?guid=2d33e6d4-9b37-48e0-a1b9-bd96ad87ed1e" language="javascript" onsubmit="if (!ValidatorOnSubmit()) return false;" id="Form1"> ). I'm not sure what effect this has, but when Unicode is pasted in, the characters are not converted to entities - they stay - and when you view source in Notepad, they do show the correct characters. After repeated saves, though, the data is lost. It looks to me like an accept-charset could help a lot here, but some of it might depend on the code: (1) any time you do an explicit or implicit conversion, it's going to be done in the current locale context of the thread handling the request (usually this only screws with dates and currencies to and from text); and (2) if the code sends the string through a layer not expecting Unicode (like perhaps HTML Tidy, or God forbid, DAO), the strings could go through ANSI marshalling and have problems.
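As a generic illustration (not a trace of gc.com's code) of how repeated saves can progressively destroy text when one layer misreads the encoding:

```python
# The browser posts UTF-8 bytes, but some layer reads them as Latin-1.
# Each save then mangles the text a little further.
text = "\u00e9"  # é

step1 = text.encode("utf-8").decode("latin-1")   # first save:  'Ã©'
step2 = step1.encode("utf-8").decode("latin-1")  # second save: worse

print(step1)                              # Ã©
print(len(text), len(step1), len(step2))  # 1 2 4 - the text keeps growing
```

An accept-charset (or consistent UTF-8 handling end to end) removes the guesswork that makes this failure mode possible.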

     

    (Edited to correct the URL for the form tag to move it to the paragraph about report.aspx, not about the forum)

  7. gc.com is definitely mangling the data.

     

    Browsers are also part of the roundtrip problem budd-rdc reports, because the entities get changed into native Unicode characters, which are then posted to the form in whatever encoding IE decides is appropriate based on the accept-charset (which should be utf-8). The web server should be able to take utf-8 and convert it to internal Unicode (assuming the site is written in .NET, which uses UTF-16 internally) and then post it back to the page in some encoding. It may be utf-8, but the browser doesn't detect it automatically because the header specifies a charset. It may be set to a font which doesn't have those characters. It may be that it isn't even utf-8. These tend to show up as boxes or ? for the glyphs.

     

    We had a whole system running one time, thinking the data was saved correctly since it re-displayed fine - and then we found out it wasn't storing the text as UTF-16 (nchar/nvarchar) in the database at all; each database character held only one byte of a UTF-16 character - half a glyph.

     

    How did we fix it? We output it all to the screen in IE, copied the data off the screen (so it was stored in the clipboard, presumably as UTF-16), then fixed the system and pasted it into a form with the accept-charset set to utf-8.
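Here's a sketch, with made-up data, of the kind of bug described above: raw UTF-16 bytes stuffed into a non-Unicode column re-display fine through the same code path, even though the database never actually holds Unicode:

```python
text = "\u3042\u3044"  # あい

raw = text.encode("utf-16-le")  # bytes 42 30 44 30
stored = raw.decode("latin-1")  # what a non-Unicode char column ends up holding
print(repr(stored))             # 'B0D0' - each character is half a glyph

# The UI "works" because reading back reverses the same transformation:
assert stored.encode("latin-1").decode("utf-16-le") == text
```

The trap is that everything looks correct until some other consumer (a report, an export, a different driver) reads the column as the text it claims to be.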

  8. Their editors store it in UTF-16 (two bytes per character for these characters) or UTF-8 (one or more bytes per character). These get stored in the file, and IE can display either encoding. Of course, before Unicode, JIS and Shift-JIS were far more common for Japanese text, so I'm sure a LOT of them still use native encodings.

     

    Like you can write & #65; (the space is there only because I cannot get the forum to show &#65; literally otherwise) or you can write A in the file. The first will take 5 bytes and the second will take 1 byte (assuming the file/stream is encoded in ASCII or UTF-8, which is the same as ASCII for these characters). Now if the file is encoded in UTF-16, the first will take 10 bytes and the second 2 bytes, but every second byte will be 0.

     

    And IE will sort it all out or use the header. There is also an accept-charset in forms which can indicate what is accepted.

     

    It can get kind of complicated since the browser will replace entities in the stream with the single glyphs even in text boxes - so sometimes, you can't tell if the underlying data is entities or binary encodings.
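The byte arithmetic above is easy to verify in Python:

```python
entity, literal = "&#65;", "A"

print(len(entity.encode("ascii")))      # 5 bytes: & # 6 5 ;
print(len(literal.encode("ascii")))     # 1 byte
print(len(entity.encode("utf-16-le")))  # 10 bytes
print(len(literal.encode("utf-16-le"))) # 2 bytes

# In UTF-16-LE, every second byte of ASCII-range text is zero:
print(entity.encode("utf-16-le"))       # b'&\x00#\x006\x005\x00;\x00'
```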

  9. Actually I think it's a problem with Google Earth.

     

    The kml file is correctly marked:

    <?xml version="1.0" encoding="UTF-8"?>

    so that should allow the diacritics to be there. If Google Earth is barfing on them then Google Earth needs to be fixed.

    My example may or may not be a problem with Google Earth - it involves those stupid "smart" quote marks, which should be substituted or escaped.
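One way to do the substitution and escaping - a hypothetical helper, not part of any gc.com tooling - would be:

```python
from xml.sax.saxutils import escape

# Curly "smart" quotes are legal UTF-8, but a consumer that chokes on them
# can be appeased by swapping in straight quotes before escaping the
# XML-special characters. These four are the usual culprits.
SMART = {0x201C: '"', 0x201D: '"', 0x2018: "'", 0x2019: "'"}

def clean_for_kml(text):
    return escape(text.translate(SMART))

print(clean_for_kml("\u201cCache & Dash\u201d"))  # "Cache &amp; Dash"
```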

  10. The security of people's private lives is important too... you could find out that I go caching every Saturday... and determine the best time to break into my house... or if you see that I signed up for an Event Cache that I will be out for the evening.... or maybe you are tracking an ex-love interest and monitoring who they go caching with now... there are dozens of reasons why people wouldn't want this to be available.

    It's already available on the site; it's just that no one can use it that way, so they have to use third-party tools offline.

     

    The only thing stopping people building their own systems for everyone to use (like Google Maps, Frapper, Flickr etc) is the site TOU and a lack of an API or web service.

  11. A feature similar to flickr's contacts where you can see their recent logs and photos all together in a river of news type format would be good for keeping up with who's having fun in the local area or friends who are elsewhere. I'd like it to combine caches on my watchlist AND cachers on my watchlist - and just serve it up - photos and logs.

     

    No stats involved. No competition involved either. Just about community.

     

    Individual bookmark pages are fine for people with time on their hands, but a lot of people already use RSS aggregators and are familiar with using similar technologies.

     

    I find it odd that competition is always raised as an argument against this kind of tracking, given that the small portion of geocachers who frequent the forums are the ones who most see geocaching as a community activity, and they would most likely be the ones to see the use of the feature.

  12. If this will require GPX files to be imported before use, any chance that a PC-based tool can create the databases without having to do the work on the PPC?

     

    Right now, I have a macro in GSAK which creates a GPX, converts it to Mapopolis MLP, and copies both files to a card - and I just go.

     

    GPXSonar reads the GPX file directly, so no more work to be done.

  13. I can't tell if this has been raised before, but leading zeros are left out of the timestamp string, and it looks kind of weird.

     

    Please release 1.5 as it is if it has the fix to losing the notes. Right now I tend to exit from the program on a regular basis and then back up the notes, because I've been bitten too many times by the bug.
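I don't know GPXSonar's internals, but the symptom is easy to reproduce; in Python terms, it's the difference between unpadded fields and zero-padded format directives:

```python
from datetime import datetime

ts = datetime(2006, 7, 4, 9, 5, 3)  # example time: 9:05:03 AM

# Formatting the fields without padding produces the odd-looking string:
print(f"{ts.hour}:{ts.minute}:{ts.second}")  # 9:5:3

# Zero-padded strftime directives give the conventional form:
print(ts.strftime("%H:%M:%S"))               # 09:05:03
```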

  14. In fact, one idea I've had as a trade item is a pair of lithium batteries vacuum-packed with a note inside stating "Emergency GPS Food: place in the bottom of your pack and use only when all else fails." Much better than a McToy, this would be a true geocaching survival item. We keep a pair of lithiums in the bottom of our packs in case the NiMH batteries run out on a long hike, so we each have an emergency pair. (Of course, when we start getting low, one of us will turn off our unit and we use only one.) We've run out before and the lithiums helped. The situation also prompted me to ask Sissy to make the battery holders to organize the rechargeables. Let's face it, we don't always go completely prepared with batteries fully charged. Emergency GPS Food is pretty much an essential item for your pack, IMHO.

    Similar idea: Have a set of caches where there is a story you have to listen to from those mini voice recorders (like Mission Impossible), with spare sets of batteries vacuum packed.

     

    I was wondering if vacuum packing is harmful to batteries.
