
Feature Request: Redesign of the PQ Feature


Blue Savina


Without using the recently banned macros, there is no way to keep your own database up to date.

It is necessary to recognize archived caches, but there is no way to include them in PQs. Therefore my first request is to add an option to include them in PQs (this could be a PM feature).

 

Furthermore, I would appreciate a complete redesign of the PQ feature. For example, to build a local DB for a larger area around your home coordinates, you have to use some tricks to get this working. There are several workarounds (e.g. splitting the PQs by "placed between" dates, or creating several PQs around different locations; see the sketch below), but none of them is easy. I am really looking forward to a user-friendly way to create a local DB that can be kept up to date.
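To illustrate the kind of trick I mean, here is a minimal sketch (in Python, with made-up dates, not anything Groundspeak provides) of the "placed between" workaround: given the placed dates of the caches in an area, it works out date ranges so that each PQ stays under the 500-cache limit.

from datetime import date, timedelta

def placed_between_ranges(placed_dates, limit=500):
    # Sort the placed dates and cut them into chunks of at most `limit`
    # caches; each chunk becomes the "placed between" range of one PQ.
    dates = sorted(placed_dates)
    ranges = []
    for start in range(0, len(dates), limit):
        chunk = dates[start:start + limit]
        ranges.append((chunk[0], chunk[-1]))
    return ranges

# Example: 1,200 caches placed on consecutive days need three PQ date ranges.
demo = [date(2004, 1, 1) + timedelta(days=i) for i in range(1200)]
for first, last in placed_between_ranges(demo):
    print(first, "to", last)

In practice, adjacent ranges that share a boundary date need a little extra care, which is exactly why a built-in, user-friendly way of doing this would be welcome.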

 

I have a few suggestions:

 

- No more 500-cache limit for a single PQ search. If more than 500 caches match one search, the search is split up AUTOMATICALLY into several PQs. These individual PQs could be counted against the quota the same way as today. (Or, instead, there could be a maximum number of caches whose data can be used in personal PQs per week.)

Lots of redundant data (PQs with intersecting radii around different locations) could be avoided.

 

- The possibility to create a PQ for a user-defined polygon, similar to the polygon filters available in GSAK (see the sketch below).

With these polygons the user can also narrow down the data further.

For example, there would be no need to request a big radius just because one small area at its edge is needed.
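As a rough illustration of the polygon idea (a sketch only, with invented coordinates and GC codes, not a feature of the site): once you have cache coordinates from a GPX file, a simple ray-casting test is enough to keep only the caches inside a user-defined polygon.

def in_polygon(lat, lon, polygon):
    # polygon: list of (lat, lon) vertices; returns True if the point is inside,
    # using the standard ray-casting (even-odd) rule.
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        y1, x1 = polygon[i]
        y2, x2 = polygon[j]
        if ((y1 > lat) != (y2 > lat)) and \
           (lon < (x2 - x1) * (lat - y1) / (y2 - y1) + x1):
            inside = not inside
        j = i
    return inside

# Illustration data only: a rectangle-ish area and two caches.
area = [(48.10, 11.40), (48.10, 11.70), (48.25, 11.70), (48.25, 11.40)]
caches = {"GC10001": (48.15, 11.55), "GC10002": (48.40, 11.55)}
wanted = [gc for gc, (lat, lon) in caches.items() if in_polygon(lat, lon, area)]
print(wanted)  # only GC10001 falls inside the polygon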

 

 

I know these points mean a lot of work for you, but maybe some of these ideas will be useful.

Link to comment

:)

That's a huge help, thank you.

 

I was wondering about the recent ban on the macro and its users as well.

I couldn't find an official statement as to why those users were banned, and which macros shouldn't be used anymore.

 

Even though I keep a relatively small area in a home database, I use the Email Reader to get the notifications of archived caches into GSAK and can therefore delete the archived ones.

I can also exclude the disabled ones when feeding the Oregon.

 

Was I just too blind to find the thread?



Having the PQ policy changed would make things so much easier for us paperless cachers.

There is no way to identify archived caches in a database. They don't come in the PQs, so they stay in the database indefinitely with me none the wiser...

Link to comment
The owner of this website is on record as stating that PQs were not created to help anybody create and maintain an offline database.



And that is for a purely pragmatic reason: if they made it easy to keep an offline database, they would be giving away their only asset.

 

More flexibility in PQs would be nice. The ability to keep an offline database would be useful for some scenarios... not me, but I can see where it might be useful.

Link to comment
I was wondering about the recent ban on the macro and its users as well.

I couldn't find an official statement as to why those users were banned, and which macros shouldn't be used anymore.

I don't know what "the macro" was, but I am guessing it was a page scraper that automatically retrieved pages from the site. Automated page retrieval has always been specifically prohibited by the Groundspeak Terms of Use that you agreed to when you registered at the site.

 

There is a link to the TOS at the bottom of every cache page.

 

Here is the relevant part:

 

You agree that you will not use any robot, spider, scraper or other automated means to access the Site for any purpose without our express written permission.
Link to comment

First, I'm not sure what macro was recently banned. Spidering and screen scraping have always been a violation of the Terms of Use. I've always felt that the GSAK macros that get one page and parse it to update the GSAK database are fine. The problem is when you automate these to update lots of caches in your offline database.



I doubt that Groundspeak is interested in changing the PQ system to make it easier to maintain an offline database. For sound business reasons that have been discussed many times, they want to limit the size of the offline databases that people keep for personal use. They likely believe that the number of caches that can be downloaded with the current PQ system is plenty to keep any geocacher busy. For those who want the convenience of being able to go looking for any cache in their state or country, there are alternative solutions. The iPhone application and others like it let you get the cache information over a cellular connection. If you don't have a smartphone that runs one of those apps, you probably have a phone that can access the Geocaching.com WAP site. Nobody actually needs to keep an offline database of their whole country. At least nobody has made a convincing argument for that. Many geocachers don't even use PQs to keep an offline database. They simply request a PQ for the area they are going to a few hours ahead of time. They cache with confidence that they have the latest caches and none of the caches are archived.

Edited by tozainamboku
Link to comment

Nobody actually needs to keep an offline database of their whole country. At least nobody has made a convincing argument for that. Many geocachers don't even use PQs to keep an offline database. They simply request a PQ for the area they are going to a few hours ahead of time. They cache with confidence that they have the latest caches and none of the caches are archived.

I agree, I don't need the whole country.

 

Trouble is, the PQs don't come quickly enough for any spontaneous caching. If I can't be sure to get the PQ within a few hours, I have to request it days ahead. And even then there is no guarantee they'll arrive...

 

Mostly I try to keep my home area in a GSAK database. But my home area is so densely populated with caches that I would need several PQs to cover the whole area.



For holidays it's also nice to have local caches in GSAK, as I don't have an iPhone and certainly wouldn't pay the extra roaming charges they ask for abroad. So when outside Germany, I'm offline.

 

Now, when I get a new PQ of an area already in my database, how do I recognize the archived ones? Am I supposed to delete all files?

Link to comment

Sorry for the :) icon but this has been addressed many times. As others have already said, Groundspeak has decided this issue in their favor as it's a business decision.

 

As to how to clean your GSAK database of archived caches, use the Last GPX column. Any cache with a Last GPX date prior to your latest PQ file will most likely be archived.



I use 11 PQs for my home area (30-mile radius) and, yes, it takes a three-day cycle to get all of them. Then any cache with a Last GPX date four or more days old is suspect as being archived.
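For anyone who wants to see that logic spelled out outside GSAK, here is a minimal sketch (invented dates and GC codes): any cache not refreshed within one full PQ cycle plus a day of slack gets flagged as an archive suspect.

from datetime import date, timedelta

# Last GPX date per cache, as GSAK would track it (made-up data).
last_gpx = {
    "GC1AAAA": date(2009, 9, 20),   # refreshed by this week's PQs
    "GC1BBBB": date(2009, 9, 6),    # not seen for two cycles -> suspect
}

cycle_days = 4                      # e.g. a three-day PQ cycle plus one day of slack
today = date(2009, 9, 22)

suspects = [gc for gc, seen in last_gpx.items()
            if today - seen > timedelta(days=cycle_days)]
print(suspects)  # ['GC1BBBB']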

Link to comment

As to how to clean your GSAK database of archived caches, use the Last GPX column. Any cache with a Last GPX date prior to your latest PQ file will most likely be archived.



I use 11 PQs for my home area (30-mile radius) and, yes, it takes a three-day cycle to get all of them. Then any cache with a Last GPX date four or more days old is suspect as being archived.

That's an option.

 

I'm aware that spidering is not acceptable, and I didn't use that banned macro.

At the moment I'm using the Email Reader macro.



Still, with the recent bans, I would like to know which macros GS frowns upon.

Link to comment

Sorry for the :) icon but this has been addressed many times. As others have already said, Groundspeak has decided this issue in their favor as it's a business decision.

 

As to how to clean your GSAK database of archived caches, use the Last GPX column. Any cache with a Last GPX date prior to your latest PQ file will most likely be archived.



I use 11 PQs for my home area (30-mile radius) and, yes, it takes a three-day cycle to get all of them. Then any cache with a Last GPX date four or more days old is suspect as being archived.

 

I have a similar practice, listing caches within 50 miles of me. But in practice I find caches may not be logged for weeks, and so do not appear in my PQs, which look for updated caches. When I search on the Last GPX column I find that the vast majority of caches (with dates up to 6 weeks old) are still active.



As PQs can take days to arrive, it is impractical to look out of the window in the morning, decide it's good caching weather, and then download all the available caches in the desired area. Expecting us all to buy a smartphone as well as a GPSr just to find a cache nearby takes no account of the price charged for such items or the cost of the phone calls.

Link to comment
The point of a GPX file is to give you a list of caches you can search for, not a list of dead caches.

Exactly. I'm curious why so many people have a fascination with seeing/keeping a database of old, archived caches.

That is not what we want to achieve. We want to maintain an up-to-date database without archived caches. But if you generate a pocket query and run it constantly, at some point it will overflow with new caches. Sure, you could just delete all caches that haven't been updated within the last X days, but this will often delete caches that have merely been pushed out of the PQ by the new ones. You could add a new pocket query, but at some point you will reach the limits of the PQ system.
Link to comment

Sure, you could just delete all caches that haven't been updated within the last X days, but this will often delete caches that have merely been pushed out of the PQ by the new ones.

So what? Unless you're searching right at the fringe of the PQ radius, it's not an issue. I pick a general area I want to cache at, run a PQ for it, and go caching. Life's good.

Link to comment
We want to maintain an up-to-date database without archived caches.

That's what your PQ results are. An up-to-date data file without archived caches.

 

So your goal and the tools align, right?

 

Oh, wait: you want to keep an offline database because you have Got Some Application (Killer) that doesn't work right unless you do that.

 

Maybe, perhaps, you could consider taking that up with the author of said application instead of asking Groundspeak to completely change their business model?

Link to comment

First, I'm not sure what macro was recently banned. Spidering and screen scraping have always been a violation of the Terms of Use. I've always felt that the GSAK macros that get one page and parse it to update the GSAK database are fine. The problem is when you automate these to update lots of caches in your offline database.



I doubt that Groundspeak is interested in changing the PQ system to make it easier to maintain an offline database. For sound business reasons that have been discussed many times, they want to limit the size of the offline databases that people keep for personal use. They likely believe that the number of caches that can be downloaded with the current PQ system is plenty to keep any geocacher busy. For those who want the convenience of being able to go looking for any cache in their state or country, there are alternative solutions. The iPhone application and others like it let you get the cache information over a cellular connection. If you don't have a smartphone that runs one of those apps, you probably have a phone that can access the Geocaching.com WAP site. Nobody actually needs to keep an offline database of their whole country. At least nobody has made a convincing argument for that. Many geocachers don't even use PQs to keep an offline database. They simply request a PQ for the area they are going to a few hours ahead of time. They cache with confidence that they have the latest caches and none of the caches are archived.

 

Have you been spying over my shoulder?

Link to comment

Nobody actually needs to keep an offline database of their whole country. At least nobody has made a convincing argument for that. Many geocachers don't even use PQs to keep an offline database. They simply request a PQ for the area they are going to a few hours ahead of time. They cache with confidence that they have the latest caches and none of the caches are archived.

I agree, I don't need the whole country.

 

Trouble is, the PQs don't come quickly enough for any spontaneous caching. If I can't be sure to get the PQ within a few hours, I have to request it days ahead. And even then there is no guarantee they'll arrive...

 

Mostly I try to keep my home area in a GSAK database. But my home area is so densely populated with caches that I would need several PQs to cover the whole area.



For holidays it's also nice to have local caches in GSAK, as I don't have an iPhone and certainly wouldn't pay the extra roaming charges they ask for abroad. So when outside Germany, I'm offline.

 

Now, when I get a new PQ of an area already in my database, how do I recognize the archived ones? Am I supposed to delete all files?

 

I just ran a PQ on a Sunday morning. It took three minutes from the time I chose the day of the week to run to the time it was delivered. If you choose to write a new one every time, they come in fast. Usually just a few minutes. For some reason, if you recycle old PQs it seems to take longer, often much longer.

 

I always use GE to look for an area I want to cache in and then use the PQ system to support that choice. I never have "stale data" when caching.

 

I think that many of you who expend so much effort maintaining an offline database would be surprised at how easy it is to DL fresh, reliable data on your way out the door.

Link to comment

 

I just ran a PQ on a Sunday morning. It took three minutes from the time I chose the day of the week to run to the time it was delivered. If you choose to write a new one every time, they come in fast. Usually just a few minutes. For some reason, if you recycle old PQs it seems to take longer, often much longer.

 

I always use GE to look for an area I want to cache in and then use the PQ system to support that choice. I never have "stale data" when caching.

 

I think that many of you who expend so much effort maintaining an offline database would be surprised at how easy it is to DL fresh, reliable data on your way out the door.

What a joke.

 

The thread about GC down time in the German Forum is 270+ pages.... B)

The site is often not accessible, especially when you're in a hurry because you're ready to head out.



The only solution is to be prepared. Don't rely on regular PQs; they are the last in line to be sent.

Newly created PQs come quickly enough, but not reliably.

For one business trip this summer I got the PQ for the area, but not the (for me) more important one for the caches along the route... Requested within minutes of each other, one made it, the other never came.



To avoid upset, I keep a GSAK database of my home zone at least.



In the meantime I found out that the banned macro isn't needed for the Email Reader to get archive logs, so the reader can still show me the archived caches, as no GPX file is loaded for those...

Link to comment

I'd vote for archived caches in PQs.



They don't have to be part of a regular PQ, but an option to also include archived caches, or to have an archived-caches-only PQ once a week, would be great.

TPTB have stated that this is not going to happen. The primary reason is that when a property owner/manager contacts GS to have a cache on their property archived, they want to know that the listing will no longer turn up in searches on the site. By not including archived caches in PQs, they attempt to strike a compromise. I don't know how effective that is, but it does keep archived caches out of PQs.

Link to comment

 

I just ran a PQ on a Sunday morning. It took three minutes from the time I chose the day of the week to run to the time it was delivered. If you choose to write a new one every time, they come in fast. Usually just a few minutes. For some reason, if you recycle old PQs it seems to take longer, often much longer.

 

I always use GE to look for an area I want to cache in and then use the PQ system to support that choice. I never have "stale data" when caching.

 

I think that many of you who expend so much effort maintaining an offline database would be surprised at how easy it is to DL fresh, reliable data on your way out the door.

What a joke.

 

The thread about GC down time in the German Forum is 270+ pages.... B)

The site is often not accessible, especially when you're in a hurry because you're ready to head out.



The only solution is to be prepared. Don't rely on regular PQs; they are the last in line to be sent.

Newly created PQs come quickly enough, but not reliably.

For one business trip this summer I got the PQ for the area, but not the (for me) more important one for the caches along the route... Requested within minutes of each other, one made it, the other never came.



To avoid upset, I keep a GSAK database of my home zone at least.



In the meantime I found out that the banned macro isn't needed for the Email Reader to get archive logs, so the reader can still show me the archived caches, as no GPX file is loaded for those...

 

It is seldom that a fresh, not recycled, PQ takes more than a few minutes to run. On those few occasions that it does I still have the information from my last one right at hand.

 

But go ahead. Build your DB. Just don't blame GS for the failings of GSAK.

 

Edit to ask: How big a database do you need to maintain to cover your caches-along-a-route needs?

Edited by GOF & Bacall
Link to comment

 

I just ran a PQ on a Sunday morning. It took three minutes from the time I chose the day of the week to run to the time it was delivered. If you choose to write a new one every time, they come in fast. Usually just a few minutes. For some reason, if you recycle old PQs it seems to take longer, often much longer.

 

I always use GE to look for an area I want to cache in and then use the PQ system to support that choice. I never have "stale data" when caching.

 

I think that many of you who expend so much effort maintaining an offline database would be surprised at how easy it is to DL fresh, reliable data on your way out the door.

What a joke.

 

The thread about GC down time in the German Forum is 270+ pages.... B)

The site is often not accessible, especially when you're in a hurry because you're ready to head out.



The only solution is to be prepared. Don't rely on regular PQs; they are the last in line to be sent.

Newly created PQs come quickly enough, but not reliably.

For one business trip this summer I got the PQ for the area, but not the (for me) more important one for the caches along the route... Requested within minutes of each other, one made it, the other never came.



To avoid upset, I keep a GSAK database of my home zone at least.



In the meantime I found out that the banned macro isn't needed for the Email Reader to get archive logs, so the reader can still show me the archived caches, as no GPX file is loaded for those...

 

If for some reason I'm unable to get a PQ, I just go to my email client and grab the latest GPX file I have for the area I want to cache in, keeping in mind that there may be a few archived caches in the list. Far fewer headaches than trying to maintain a database using something that was never intended for that purpose.

Link to comment

Oh, wait: you want to keep an offline database because you have Got Some Application (Killer) that doesn't work right unless you do that.

I keep an offline database because that's the easiest way to maintain updated coordinates for solved puzzles and multis-in-progress.


I find bookmark entries work even better. B)

 

Sure they do, but that isn't as funny. I guess I once again neglected to tack on the appropriate emoticon, huh?

Link to comment

Nope. Don't want it.

 

I want to get the latest information when I need it and load up my GPS. I would have no desire to see archived caches in my PQs, because I'm not going to look for those.

 

I'd rather not see the PQ redesigned to fulfill the desires of the minority of people who choose to use PQs in a manner for which they were not designed.

Link to comment

I like to record information from solved puzzles etc. in a database. I know it's frowned upon, but we are in an information technology society, so let's use what is available.



I like to be able to search such records; the database facilitates this in a way pen and paper never will.



I could greatly reduce my PQ usage if I could seed GC codes into a PQ generated from database filters based on my own added criteria.

 

To expand on this with an example: there are many puzzles listed on the GC site.

Many I will never be able to solve. If I run a PQ with the Unknown cache type selected, I will get every puzzle cache matching the other search criteria, downloaded each time I run the PQ, which is a great redundancy to me.

I could set up a bookmark list for those I think I might be able to solve, but I still need a database, or to visit each cache page, to record and mark such criteria.

A bookmark-generated PQ might return far fewer caches than the permitted total limit.

A database-initiated PQ would allow me to fill each PQ to its limit by combining several criteria, reducing the number I submit.

 

Could a Groundspeak representative please pick up on this and comment? B)

Link to comment
Without using the recently banned macros, there is no way to keep your own database up to date.

My understanding is that no macros have been "banned", because Groundspeak cannot tell if you are using any given GSAK macro. Some users have had their accounts suspended because they were downloading a very large number of GPX files. Many of these may have been using a particular macro, and Groundspeak has suggested to Clyde that it might be a good idea to stop distributing that macro, because some people who are using it may not realise that it will generate a volume of traffic which can lead to problems.

 

Of course, there are plenty of ways to keep an offline database "up to date", for some value of "up to date". If you want every cache in Germany, you're going to need to accept that some of the data will be a month old, but I really can't believe that anybody can maintain that sort of database and still have time to actually go out geocaching.

 

It is necessary to recognize archived caches, but there is no way to include them in PQs. Therefore my first request is to add an option to include them in PQs (this could be a PM feature).

As others have said:

- You can spot archived caches pretty quickly - they're generally the ones which are in your DB and no longer in the PQ (see the sketch after this list)

- How would the PQ know which archived caches to send you? Think carefully about this, because it's not a trivial computational problem.

- In any case, Groundspeak has stated that for many reasons, some of them to do with potential legal issues, archived caches are not going to be in PQs other than "My Finds".
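A minimal sketch of that first point (illustrative GC codes only): compare the set of codes in your offline DB with the codes in a fresh PQ covering the same area, and whatever dropped out is an archived (or newly excluded) candidate.

# Offline DB codes vs. codes parsed from the newest PQ GPX (made-up data).
offline_db = {"GC1AAAA", "GC1BBBB", "GC1CCCC"}
fresh_pq   = {"GC1AAAA", "GC1CCCC", "GC1DDDD"}

archived_candidates = offline_db - fresh_pq   # in the DB but no longer delivered
new_caches = fresh_pq - offline_db            # delivered but not yet in the DB
print(archived_candidates)  # {'GC1BBBB'}
print(new_caches)           # {'GC1DDDD'}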

 

The thread about GC down time in the German Forum is 270+ pages.... B)

Even if we knew how many posts there are per page, and how long the thread has been running, and how many different people have posted to it, and how many different downtime incidents it covers: exactly what does that prove, other perhaps than that when the Geocaching.com site is down, some of the 100,000+ German geocachers who at that moment cannot by definition use the site are reasonably likely to post, in a forum, that the site is down?

 

To expand on this with an example: there are many puzzles listed on the GC site.

Many I will never be able to solve. If I run a PQ with the Unknown cache type selected, I will get every puzzle cache matching the other search criteria, downloaded each time I run the PQ, which is a great redundancy to me.

I could set up a bookmark list for those I think I might be able to solve, but I still need a database, or to visit each cache page, to record and mark such criteria.

You could also add each cache which you aren't able to solve to your ignore list. Caches on that list can then be excluded from PQs. Yes, you need to visit each such cache page once and click "ignore listing", but you generally have to visit the cache page once anyway - you can have GSAK bring it up in the lower pane.

 

A bookmark-generated PQ might return far fewer caches than the permitted total limit.

A database-initiated PQ would allow me to fill each PQ to its limit by combining several criteria, reducing the number I submit.

It would also be a substantial amount of code to satisfy a small number of people - and their requirements are probably pretty specific on an individual basis, so it might not satisfy them anyway. I'm guessing that Groundspeak won't be in here to answer this request, but you can take it as read that other things will have priority.

Edited by sTeamTraen
Link to comment
Nope. Don't want it.

 

I'd rather not see the PQ redesigned to fulfill the desires of the minority of people who choose to use PQs in a manner for which they were not designed.

 

This "redesign" would just be a selection option for caches already in the database. Not sure you would agree that anything the majority of users don't use should be removed from the PQ options.

 

But I know it isn't going to happen...

Link to comment

Oh, wait: you want to keep an offline database because you have Got Some Application (Killer) that doesn't work right unless you do that.

I keep an offline database because that's the easiest way to maintain updated coordinates for solved puzzles and mutis-in-progress.

 

That doesn't require a database of caches, just a database of corrected coordinates. Had GSAK been designed properly, it would not require you to have the cache loaded into the database to enter corrected coords.

 

FWIW, I have my own program that applies the corrections and outputs a modified GPX file. In addition to being quite simple to use, it is roughly 30 times as fast as GSAK. Let me know if you are interested.
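For anyone curious what such a tool boils down to, here is a rough sketch of the general idea (this is not fizzymagic's program; the file names, GC code, and coordinates are placeholders, and it assumes the usual PQ layout where each waypoint's <name> holds the GC code): overwrite the lat/lon attributes of selected waypoints in a PQ GPX file and write the result back out.

import xml.etree.ElementTree as ET

# Corrected coordinates for solved puzzles, keyed by GC code (placeholder data).
corrections = {"GC1PZZL": (48.1375, 11.5755)}

GPX_NS = "http://www.topografix.com/GPX/1/0"
ET.register_namespace("", GPX_NS)          # keep the default GPX namespace on output
ns = "{" + GPX_NS + "}"

tree = ET.parse("weekly_pq.gpx")           # placeholder input file from a PQ
for wpt in tree.getroot().iter(ns + "wpt"):
    name = wpt.find(ns + "name")           # the <name> element carries the GC code
    if name is not None and name.text in corrections:
        lat, lon = corrections[name.text]
        wpt.set("lat", str(lat))
        wpt.set("lon", str(lon))
tree.write("weekly_pq_corrected.gpx", xml_declaration=True, encoding="utf-8")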

Link to comment

This is the typical scenario of "if it doesn't apply to me, then let's just keep things the way they are"...



I can't understand why people speak out so vocally against those who, for whatever reason, wish to maintain an offline database, for which it's pretty obvious the current system is not adequate.



Why are we limiting the choice of the folks who wish to have an offline database? You chose not to have one; why are you so against other people having the choice between having or not having an offline database?



That's what I don't understand about this issue...

Link to comment

That doesn't require a database of caches, just a database of corrected coordinates. Had GSAK been designed properly, it would not require you to have the cache loaded into the database to enter corrected coords.

 

FWIW, I have my own program that applies the corrections and outputs a modified GPX file. In addition to being quite simple to use, it is roughly 30 times as fast as GSAK. Let me know if you are interested.

For that matter, even Spinner has this ability (and no database required there either).

Link to comment

This is the typical scenario of "if it doesn't apply to me, then let's just keep things the way they are"...



I can't understand why people speak out so vocally against those who, for whatever reason, wish to maintain an offline database, for which it's pretty obvious the current system is not adequate.



Why are we limiting the choice of the folks who wish to have an offline database? You chose not to have one; why are you so against other people having the choice between having or not having an offline database?



That's what I don't understand about this issue...

I don't see most people saying any such thing - only pointing out that it isn't going to happen for some valid business reasons and decisions.

 

Others point out ways around the limitations.

 

Others point out that you don't really need an offline database.

 

I don't see anybody trying to limit your choices.

Link to comment

This is the typical scenario of "if it doesn't apply to me, then let's just keep things the way they are"...



I can't understand why people speak out so vocally against those who, for whatever reason, wish to maintain an offline database, for which it's pretty obvious the current system is not adequate.



Why are we limiting the choice of the folks who wish to have an offline database? You chose not to have one; why are you so against other people having the choice between having or not having an offline database?



That's what I don't understand about this issue...

I think you misunderstand the motives. We have seen this asked before, many times. We know what the answer is, and all we can do is give you the scoop and suggest the workarounds. So we are actually trying to help.

Link to comment

This is the typical scenario of "if it doesn't apply to me, then let's just keep things the way they are"...



I can't understand why people speak out so vocally against those who, for whatever reason, wish to maintain an offline database, for which it's pretty obvious the current system is not adequate.



Why are we limiting the choice of the folks who wish to have an offline database? You chose not to have one; why are you so against other people having the choice between having or not having an offline database?



That's what I don't understand about this issue...

I think you misunderstand the motives. We have seen this asked before, many times. We know what the answer is, and all we can do is give you the scoop and suggest the workarounds. So we are actually trying to help.

 

Yes, but there is evolution in everything.

How many times since owning a computer have you found you needed to upgrade or replace it? Is your choice limited to one model?



Likewise, in geocaching there are many variations of how we achieve an end, and individuals will always have their preferences and, rightly so, should have that choice.



As soon as someone makes a request for a change, there is always a profusion of comments saying it will never happen; well, unless it is voiced, it never will. I've replied referencing one quote, but many equally apply, so no implicit inference to the one quoted.



Let's look forward to evolving methods to suit our own individuality; it might be your workaround or third-party software that is banned next. B)

Link to comment
As soon as someone makes a request for a change, there is always a profusion of comments saying it will never happen; well, unless it is voiced, it never will.

While I am generally sympathetic to your point, and agree that there are many good suggestions that are met with negativity, this particular request is different.

 

First, this request has been made probably about 50 times in the past. Yet the OP shows no evidence of having done even a minimal search for related topics.

 

Second, TPTB at Groundspeak have made it abundantly clear that they do not and will not support the maintenance of offline databases.

 

Third, many people have developed work-arounds for the issues raised and shared those in the thread.

 

As a result, when a "new" idea like this is raised, it is generally met with the response you have seen.

Link to comment
Let's look forward to evolving methods to suit our own individuality; it might be your workaround or third-party software that is banned next. B)

Again: nothing has been banned here. When you signed up, you agreed not to use tools that scrape the site. A number of people who were using such tools have had their accounts suspended. Groundspeak has said that they will contact those people, accept an "I didn't know it was scraping" from those who were using the GSAK macro in question - compare that with the attitude of your local law enforcement or bureaucrats in comparable circumstances! - and reinstate their accounts once things are clear.

 

This game is meant to be about finding tupperware in the woods. Once you need better tools than the site provides because you want to keep track of hundreds or thousands of puzzles, I suspect we're some way from the core business.

Link to comment
What's the definition of "scraping", anyway?

I read in this forum post that the use of GC Tour is illegal since it uses information from gc.com. It seems to be forbidden to provide a better user interface than the site does.

Scraping means using a tool other than your eyes to load a page and extract its content, with a view to aggregating it. For example, travel agencies that want to make commission from low-cost airlines will use robots that load every possible combination of airports and dates from the low-cost airline's site, to build a timetable from which they can then resell you tickets. See http://news.bbc.co.uk/2/hi/business/7549547.stm.

 

A part - and I'm guessing, probably a substantial part - of Groundspeak's revenues comes from the advertisements on Geocaching.com - either the Google ads shown to non-members, or the ones which Groundspeak hosts and which appear on the left of each site page for everybody. These ads are billed to the people who place them on a per-click and/or a per-view basis. If they're billed per-click and you are scraping the site, you will never click them, and Groundspeak will lose revenue - that may not concern you personally because perhaps you never click them, but if you distribute a script which allows thousands of people to do the same, it will have an impact. On the other hand, if the advertisements are billed per-view and you are scraping the site, Groundspeak will bill the company which placed the ad, and at that point they could be said to be committing fraud. :) If you were an advertiser paying per-view, how would you feel if you thought that a large percentage of the "views" for which you were getting billed, were in fact being generated by site scraping? You would want to see Groundspeak doing something about it.

 

It can come as a shock or a disappointment to many geocachers to discover that Groundspeak is a business whose aim is to make money. Perhaps many imagine that it would be cool if it could be done as some kind of collective, but that's not where we are right now. For people who would like the world to be that way, there are other sites which are run on that basis. In the meantime, Groundspeak's brand of capitalism is one which I can put up with. Ask me again how I feel about it when they double the PM subscription rate because Jeremy wants a bigger yacht.

Link to comment

GC Tour doesn't take any data to use it somewhere else and doesn't change anything about the advertising; it just retrieves and displays the listings in a more user-friendly manner.

I sincerely hope that the aim of GS is not to make navigation and the user experience so difficult and annoying that they simply maximise advertising page views.



If advertising is a concern, one should also prohibit the use of Firefox or any browser which allows you to block advertisements. Same thing for PQs, since no advertising is involved.

 

As said in earlier blogs, I don't mind paying a higher PM fee as long as I get something considerable in return.

Link to comment
If advertising is a concern, one should also prohibit the use of Firefox or any browser which allows you to block advertisements. Same thing for PQs, since no advertising is involved.

I guess that's factored in - I'm sure that the online advertising world knows that X% of Firefox users have AdBlock installed, for example, and also Firefox users like you and me are much smarter people who don't click on ads, so all that would affect the price per view.

 

As for PQs, you paid $30 to get those, which covers a lot of clicks - plus, there are no advertisers getting billed for clicks which didn't happen.

 

I'm not sure what the formal argument against GC Tour is, but generally Groundspeak don't take action against "complementary stuff" unless there is a real threat to their business model.

Link to comment

I'm not sure what the formal argument against GC Tour is, but generally Groundspeak don't take action against "complementary stuff" unless there is a real threat to their business model.

 

I'm really glad to hear that.

It would be nice to see an official statement about this here and in this forum thread, in order to let people know that the 'illegal' overstatement is nothing to worry about.

Link to comment
This topic is now closed to further replies.