
Posts posted by pppingme
-
-
The proper way to handle this (and I continue to be amazed that gs doesn't support it; I have brought it up before) is for the cache owner to be able to "hide" (not encrypt) the log contents or pics. That way the finder doesn't lose a find (as they shouldn't), the cache owner gets to prevent spoilers (as they should), and there are far fewer hurt feelings. Then the log owner (yes, I believe the log belongs to the log owner, not the cache owner, so no, I don't support cache owners editing logs) can edit it to everyone's satisfaction.
-
Imagine if google deployed a "whoa page" on searches?
They actually have, and it works quite effectively without hurting real users, which is the reason most people don't even know it's there.
-
We're seeing less than .01% of users/hr being throttled. We have yet to determine how many of those are valid. We're still seeing reports from users that they're being throttled, but very few in comparison to before the time sync issue. We plan to release an update to the throttling code tomorrow to gather more information to help determine what issues still exist.
80,000 premium members, so .01% would be a grand total of 8 users being throttled?
I kinda think from the volume of this thread that may be off just a tad.
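For what it's worth, the arithmetic behind that estimate is trivial to check. Both inputs here are assumptions: the 80,000 membership figure is just the number floated in the forums, not an official count.

```python
# Sanity-check the throttling estimate. The membership figure is a
# community guess, not an official Groundspeak number.
premium_members = 80_000
throttle_rate = 0.0001  # "less than .01% of users/hr"

print(round(premium_members * throttle_rate))  # 8 throttled users per hour
```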
-
Wouldn't it be silly and counterproductive to name this Greasemonkey script, or any of its robot cousins?
No, it wouldn't be.
If people knew they were running a bad script, they might stop running it.
Also, without naming it, most people will still read this as a generic blaming of greasemonkey scripts overall, which suggests that gs is still guessing at why their throttling code is so buggy.
-
This is a good argument for why notifies should have home coordinates as an option. Someone is regularly posting with this exact problem. GS, are you listening?
-
The .img server seems to be going REALLY REALLY slow, but it's not down completely.
-
The option "Are available to all users" specifically excludes pm caches, do you have that selected?
-
I'm not a big fan of IE6 nor do I use it, but I find this to be an interesting decision considering these points:
- IE6 is STILL a current and supported product by Microsoft
- IE6 is still in use by 12% of users overall
- GS has stated that 10% of its users are still using IE6
Is GS really wanting to cut 10% of their users?
Using numbers that have floated around, there are about 80,000 premium users, so approx 8,000 users will be impacted; that's revenue of $240,000/year (8,000 times $30).
Wow, doesn't seem like the smartest decision I've ever seen.
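The revenue figure above is easy to reproduce. Note that all three inputs (the 80,000 premium users, the 10% IE6 share, and the $30/year fee) are figures from the discussion, not official numbers:

```python
# Rough revenue-at-risk estimate; all inputs are community/forum figures,
# not official Groundspeak numbers.
premium_members = 80_000
ie6_share = 0.10    # GS's stated share of users still on IE6
annual_fee = 30     # premium membership price, in dollars

affected = int(premium_members * ie6_share)
print(affected)               # 8000
print(affected * annual_fee)  # 240000 dollars/year at risk
```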
-
These aren't, strictly speaking, bike trails but routes to take if you are on a bike. (What I am seeing is mostly just following regular streets, avoiding the interstates or larger highways.)
When I turn on the bike routes I see two different things.
A dashed green line, which seems to be what you are describing, and a solid green line, which marks true bike/pedestrian paths and routes (not accessible to cars at all).
Are you seeing both or just the dashed lines?
Since you have the option to switch to "bicycling" when the Google Maps are displayed, is there a reason to have a separate Groundspeak option?
The gs google maps can show multiple caches, whereas the real google maps only show one cache.
-
So by that logic, they should also stop providing .loc files, stop providing support for older browsers, and drop any of numerous other things that don't fit into what you personally find of value. They are a business trying to make some money. You need customers to make money. Not all customers are going to have the same tools. They're doing their best to support us all. I guess my needs and thousands of other customers' needs are just a waste of time.
What about Delorme supporting their customers? After all, gs put out attributes on 1/12, that's two months ago. In the meantime, gs has implemented a workaround for Delorme's broken software (although the workaround has been found to be buggy). DURING THAT TIME, WHAT HAS DELORME DONE TO FIX THE PROBLEM? From what I've seen, nothing.
This isn't a matter of what I find of value; it's a matter of something that shouldn't have broken to start with, because a 3rd party app doesn't correctly read the files, and now gs has had to waste development time dealing with the situation.
The site has to make progress, a broken 3rd party app should not keep progress away.
-
But everyone is dogpiling on DeLorme here over the XML and ignoring the real issue which Groundspeak can address - for users who have not selected a GPX format, Groundspeak changed the default version they get from 1.0 to 1.0.1 without notice, and without any indication in the user's preferences. In your preferences, it still shows 1.0 if you haven't made a selection!
Why should gs spend ANY time implementing a workaround for an already fragile PQ system? With the workaround, certain pieces of code have to be written twice: once for attributes, once without.
The reality should be that gs pulls the ability to choose, puts the .gpx out with attributes, and quits wasting time on something that isn't their problem to start with.
PQ's are already fragile; gs needs to spend time fixing that, not waste time making workarounds for someone that can't properly read an xml file.
-
On the flip side, Groundspeak first introduced attributes in GPX files in the fall without giving adequate warning to others.
Since the change was a simple addition of a new tag, no one needed advance notice (I'm NOT saying gs shouldn't have put out notice, they very much should have, BUT it shouldn't have mattered). I would agree with you if a tag were being removed or changed, since some program could be dependent on that tag, but that's not the case here.
No application that is reading an xml file should break just because there is a new/unexpected tag, period.
So, notice or not (and yes, they should have notified), the app shouldn't have broken, and it did.
I also disagree with gs having made a workaround to accommodate a broken app; those are development hours that would have been much better spent elsewhere. GS's response should have been to ignore the complaints after clarifying that the .gpx is indeed good, clean xml data.
-
People continue to post in the technology forum and on the Delorme forums asking for help, thinking the problem is on their end.
This really IS a delorme problem, and they need to fix their software.
The gpx files are simply xml files, nothing fancy, version 1.0.1 is simply an extra tag (attributes).
The "problem" is that delorme can't handle an unknown tag (in this case the attributes) in the file.
The PROPER way to handle this is to ignore the tag, not choke on it.
This issue really should be raised on the delorme side for them to fix their software to politely ignore rather than choke on an unknown tag, as the rest of the world does when processing xml files.
Does this mean every time gs wants to include something new that they have to wait for delorme to catch up? Absolutely not. The rest of the world didn't choke on the new data, and delorme shouldn't have either.
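To illustrate the point, here's a minimal sketch of how a tolerant XML reader naturally skips tags it doesn't recognize. The snippet below is a simplified, made-up GPX-like document, not the real Groundspeak schema; the `<attributes>` element just stands in for the new tag added in 1.0.1:

```python
import xml.etree.ElementTree as ET

# A simplified, GPX-like snippet. The <attributes> element stands in for
# the new tag; structure and names here are illustrative only.
doc = """<gpx>
  <wpt lat="40.0" lon="-105.0">
    <name>GC12345</name>
    <attributes><attribute id="1" inc="1"/></attributes>
  </wpt>
</gpx>"""

root = ET.fromstring(doc)
for wpt in root.findall("wpt"):
    # A reader that asks only for the tags it knows about is untouched
    # by the unknown <attributes> element.
    print(wpt.findtext("name"))  # GC12345
```

A parser that instead walks every child element and raises on anything unexpected is exactly the failure mode being complained about here.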
-
I don't think this has anything to do with the fact you just updated yours.
I haven't updated mine since January, and they are showing up the same way for me. Not sure the last time I looked (I doubt it's been more than a couple days), but this was fine last time I looked.
So, it appears GS is making changes again (huh, didn't Jeremy say something just yesterday about being more conscious about notifying us before making changes?).
-
Wow, you got everyone's hopes up.
Is there any chance it was a corrupt file? Do you zip the files?
-
It's been longer than 2 days; it's been since the "database upgrade" or whatever they did on 2/23.
What makes it worse is that no one seems to be actively monitoring the situation over the weekends. There was a very large gap between the time people complained on the forums and the first "we are looking at it" post.
One would think that a company that just made major changes (database "upgrade" on 2/23 and a site upgrade on 2/24) to an already unstable system (the pq generator has been acting up for quite a while, but the most recent upgrades seem to have all but broken it) would at least monitor it during peak usage times (the weekends).
-
1) Why do you need 5000 caches a day in your PQs?
Because I'm a cacher of opportunity.
Look here:
http://www.geocaching.com/seek/nearest.asp...617&dist=25
More than 5000 caches within 25 miles.
My daily work could easily take me in a circle larger than that.
-
Got my new query run yesterday (Saturday) at 1:22:29 PM PST. Never did get the results from Friday.
You won't. Once Friday is over, that's it. You have to reschedule for Saturday (or now it's Sunday).
-
Huh, interesting.
No maps, no description, and sidebar and stuff are at the bottom.
I'm running FF 3.0.18 and it's not quite right on there either.
-
Since PM status isn't in PQ's, I rarely even know if a TB/coin I'm picking up or dropping off is in a PM cache or not.
Which makes me wonder, do most people even pay attention?
-
Maybe cache owners pay $1 per year per cache, and if their insured cache goes missing they get $100 to replace it.
A quick analysis shows that a $1 premium wouldn't even come close to supporting a $100 payout.
Think about this:
If 100 caches are "insured", then there is $100 in revenue. Now, over the span of a year, if even ONE of those caches comes up missing, that's a $100 payout; if a second goes missing, the insurance company is now broke.
If you pick 100 caches (even set high standards, like the last 3 or 4 logs must be finds with no dnfs in the mix) and monitor those for a year, you will find that at least 20 will come up missing.
Here is a quick stat: for every 2 caches that get "approved", one is archived. There are currently 900k active caches, yet 1.6 million gc codes have been handed out.
It would take a premium of over $25/year, probably around $35 to $40, to support a $100 payout, just for the "insurance company" to break even.
Even assuming it could be done for $25/year (which it really can't), I just don't see most CO's willing to pay that much.
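A quick sketch of the expected-loss math above. The 20% annual loss rate is the post's own estimate, not a measured figure:

```python
# Expected annual cost per insured cache, using the post's assumptions.
payout = 100             # dollars per missing cache
annual_loss_rate = 0.20  # "at least 20" of 100 monitored caches go missing

expected_cost = payout * annual_loss_rate
print(expected_cost)  # 20.0 dollars/cache/year, before any overhead
```

So even before administrative overhead or margin, the break-even premium is $20/cache/year, which is why a $1 premium is off by more than an order of magnitude.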
-
In my opinion, if the cache description says that the cache itself is hidden in honour of a soldier, that's over the line.
So you think a cache hidden in honor of a soldier that teaches about history from a global war shouldn't exist?
Wow, based on that there are 52 caches honoring Obama that should be archived immediately; every one of them offends me.
-
When a new feature is requested in the forums, a discussion starts. Reading these discussions, I walk away with a sense that what gets implemented is based on who screams the loudest.
I thought a better system would be something like the Dell IdeaStorm or CodeWeavers voting system (http://www.codeweavers.com/compatibility/browse/name/?app_id=6293), where someone can present an idea and it can be voted on. That way the developers can get a clear sense of which features people find most useful. The second part, which CodeWeavers has implemented, would be a pledge system. I've had many show-stopper issues come up that have hindered my ability to cache paperless effectively. I personally wouldn't mind throwing some money at an idea to get it put in, if that's what it takes.
Sadly, features and bug requests are 2nd to revenue; they focus on revenue first (ads on the pages and in forums, making sales pitches out of the gps reviews "feature", etc), and only after they quit working on that do they finally take a serious look at bug fixes and features.
It's nothing about what current paying customers want; they figure you are already on the hook, and new customers aren't aware of the bugs and much-requested features till they are on the hook too.
The only thing that changes companies' attitudes on things like this is a mass exodus of customers, and too many people want their pq's more than they want to make a statement to get bugs fixed and usable features implemented.
I'm not saying features and bug fixes never happen; they are just low on the totem pole and gc has no incentive to prioritize them.
For example, it's been asked since I started gc'ing for the keyword search to return results in order of distance, or at least limit them to a state or area. Currently, and as it has been since I started using the site, keyword searches are returned by date, a totally useless order; I couldn't care less about caches that are 5000 miles away. When I search for the word "bridges" I have to page through 10 or 15 pages before even finding one in my area. No one has stated they like it the way it is. The problem? There is no revenue incentive to fix it.
Another example: what have they stated they have been working on for an upcoming release? PQ's. Why? Because they know they've lost paying customers because of the overall unreliability and weakness of the current pq system.
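For the record, sorting results by distance is cheap to do. A minimal sketch with made-up home coordinates and made-up result names (this is a standard haversine great-circle distance, nothing specific to the site's code):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    # Great-circle distance in miles (mean Earth radius ~3959 mi).
    rad = math.pi / 180
    a = (math.sin((lat2 - lat1) * rad / 2) ** 2
         + math.cos(lat1 * rad) * math.cos(lat2 * rad)
         * math.sin((lon2 - lon1) * rad / 2) ** 2)
    return 2 * 3959 * math.asin(math.sqrt(a))

home = (39.74, -104.99)  # made-up home coordinates (Denver-ish)
# (name, lat, lon) tuples standing in for keyword-search hits
results = [
    ("bridges cache far away", 34.05, -118.24),   # ~Los Angeles
    ("bridges cache near home", 39.75, -105.00),  # ~1 mile out
]
results.sort(key=lambda c: haversine_miles(*home, c[1], c[2]))
print([name for name, _, _ in results])  # nearest first
```

Computing this per result and sorting is trivial compared to generating the search results in the first place.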
-
The two most common causes of this are:
1 - A double log, you've logged one cache twice
2 - Latency, several people have stated that if they run their my finds pq right after logging, it may not pick up the most recent logs.
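A double log is easy to spot mechanically. A minimal sketch over a made-up list of GC codes; a real check would parse the actual My Finds export instead:

```python
from collections import Counter

# Hypothetical list of GC codes from a finds export; one code repeats.
finds = ["GC1ABC", "GC2DEF", "GC1ABC", "GC3GHI"]

dupes = sorted(code for code, n in Counter(finds).items() if n > 1)
print(dupes)  # ['GC1ABC']
```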
-
I too have reported this bug and seen a couple threads on it, with NO RESPONSE from gs addressing this issue.
THIS IS A BUG. If I choose the check box that says "Are not on my ignore list", then obviously I don't want the caches, HOWEVER, if I don't choose that option, THEN I EXPECT THOSE CACHES TO SHOW.