
Geocaching Premium members get sneak peek at new Advanced Search


Rock Chalk


I'm pretty pleased with the Advanced Search so far. Most of what I have tried has worked. One addition I would like to see is the ability to search on attributes for caches within an area. Related to that, I would like caches recognized as Geocache of the Week to get a special attribute. I am presently planning a trip to Norway. I remember there was a Geocache of the Week there quite a while ago that I would like to get, and I am currently digging through the blog to find it. I want an easier way to find those extra special cool caches.

Ah, nice idea. I wonder if there's a bookmark list of those already though? Seems like that would be a better method for keeping track of them - Groundspeak would just need to make that historic list more prominent and findable for people looking for those caches.

 

Here's one I found with Google (searching for "geocache of the week inurl:bookmarks"):

http://www.geocaching.com/bookmarks/view.aspx?guid=f32958c5-965c-499c-a044-7155f547abc0


Lots of situations where a 30 mile radius is not enough. Maybe it is OK for a generic search for all caches, but once a filter or two is added all I get are "DNFs". The ability to ask for a certain and reasonable number of caches that fit the criterion, regardless of distance, would make the search more useful.


Lots of situations where a 30 mile radius is not enough. Maybe it is OK for a generic search for all caches, but once a filter or two is added all I get are "DNFs". The ability to ask for a certain and reasonable number of caches that fit the criterion, regardless of distance, would make the search more useful.

 

I agree! My caching radius is about 100 km. There's no way I can use this in a useful way without running many, many searches, always copy-pasting the GC code or coords of a cache nearby, as there are no places.


Playing with the advanced search today in anticipation of a trip, I wanted to find some trackables while in the area. No such option is available in the search, so I had to resort to the "hotel" keyword. I would like the ability to filter caches with/without trackables.


I would also like the search to include a trackable filter option. Presently, on a PC, you need to go directly to a cache page to see if there are trackables, which is a painstaking process. It is available in a Pocket Query, so it should be easy to include in an advanced search. I would also like to see a trackable icon in the popup when clicking on a cache on the map. Thanks.


It looks like in the old search engine I can find, for example, all webcams in a country, but in the new search engine I cannot.

 

Yes you can: on the first screen, don't enter a location, then hit "Add Filters". In the "Geocache Types" section on the left, deselect everything except Webcam. In the "Search Only in" box at the bottom of the third column, enter the country (e.g. United Kingdom) and hit "Update Search".

 

Edit: Having looked closer it seems that the USA and Germany don't have an option to select the whole country, just individual states, whereas other countries (e.g. UK, Spain, Portugal, France) appear in the list as individual states and whole countries - which is odd.


It looks like in the old search engine I can find, for example, all webcams in a country, but in the new search engine I cannot.

 

Yes you can: on the first screen, don't enter a location, then hit "Add Filters". In the "Geocache Types" section on the left, deselect everything except Webcam. In the "Search Only in" box at the bottom of the third column, enter the country (e.g. United Kingdom) and hit "Update Search".

 

Edit: Having looked closer it seems that the USA and Germany don't have an option to select the whole country, just individual states, whereas other countries (e.g. UK, Spain, Portugal, France) appear in the list as individual states and whole countries - which is odd.

 

That isn't much different than how the existing "search by country" works. If you use the "Hide and Seek a Cache" page and select a country which has states/provinces, it goes to an Advanced Search page where you have to select the individual states. Given that the basic search and current advanced search don't allow you to filter on cache type, size, or D/T ratings (which are only available when creating a PQ), I'd say it's a significant improvement. With the new Advanced Search form I can get a list of all virtual caches in Canada with just a couple of clicks. Previously I would have had to create a PQ. Given the number of caches in the U.S. and Germany, perhaps GS didn't provide an "all states" option for performance reasons.

 

 


With the new Advanced Search form I can get a list of all virtual caches in Canada with just a couple of clicks. Previously I would have had to create a PQ. Given the number of caches in the U.S. and Germany, perhaps GS didn't provide an "all states" option for performance reasons.

 

The currently existing search form allows such requests for Germany. So the new search form is a step back, rather than an improvement.


With the new Advanced Search form I can get a list of all virtual caches in Canada with just a couple of clicks. Previously I would have had to create a PQ. Given the number of caches in the U.S. and Germany, perhaps GS didn't provide an "all states" option for performance reasons.

The currently existing search form allows such requests for Germany. So the new search form is a step back, rather than an improvement.

One step back doesn't necessarily negate all the other improvements. While searching the entire country of Germany may be a bit more difficult in the new tool, the rest of the functionality is greatly improved.


With the new Advanced Search form I can get a list of all virtual caches in Canada with just a couple of clicks. Previously I would have had to create a PQ. Given the number of caches in the U.S. and Germany, perhaps GS didn't provide an "all states" option for performance reasons.

The currently existing search form allows such requests for Germany. So the new search form is a step back, rather than an improvement.

One step back doesn't necessarily negate all the other improvements. While searching the entire country of Germany may be a bit more difficult in the new tool, the rest of the functionality is greatly improved.

 

Agreed. The old search form doesn't have the ability to filter on cache size, difficulty, terrain, cache type, or favorites. It doesn't provide any means to show only caches with corrected coordinates or personal notes. Although nobody from GS has indicated where the new Advanced Search form is headed, for the most part we have just been asked to evaluate the user interface and see what results we get when we search using different criteria. The feature we are evaluating is a *beta* release, so we should expect that what goes into production will hopefully be based on the feedback provided during the premium member sneak peek. Although we're just seeing the search interface and the results, there has obviously been some back-end code developed which provides the ability to include search criteria not previously available (range of favorite points, corrected coordinates, personal notes...). Presumably these same filters could be available on the PQ search page (assuming that they're going to retain the PQ functionality as we know it) or as filters on the map page. Given how much additional functionality there is on the new advanced search page, some of which is currently only available on the PQ page, is anyone else concerned that the intent of the new advanced search page is to replace the existing PQ search page?

 

I do have to wonder how a basic member can characterize the new form as a step back when the sneak peek is only available to premium members.

 

Like it or not, the fact that GS has made a sneak peek of a beta version of a major piece of functionality available for feedback is, IMHO a *huge* step forward.

 


The currently existing search form allows such requests for Germany. So the new search form is a step back, rather than an improvement.

One step back doesn't necessarily negate all the other improvements. While searching the entire country of Germany may be a bit more difficult in the new tool, the rest of the functionality is greatly improved.

 

The fact is that the type of searches I typically made for Germany are apparently no longer possible in the new search tool.

For fancy wildcard searches I've gotten used to using project-gc anyway.


 

I do have to wonder how a basic member can characterize the new form as a step back when the sneak peek is only available to premium members.

 

 

It's as simple as that: I believed you and Moun10Bike and others when you stated that searches for all of Germany are not possible in the new system.

I'm aware of newly added functionality in the new system but I do not care about those at all.

If the new system is not going to replace the old search function on the website, then fine - I won't care. Up to now I understood that the idea is to replace the old search function with the new one (I'm not talking about PQs, about which I don't care either).


Interesting.

 

The United States was always different on the existing "Find a Geocache" form: there is no choice for United States - you need to do it by state.

 

With the new search, it seems Germany (which is #2 in number of caches) is added as a special case. Canada (#3) is not affected.

 

The top 4 countries by number of caches (excluding archived; source: project-gc) at the moment I ran the query are:

 

1. USA 1,026,790

2. Germany 339,846

3. Canada 210,433

4. UK 179,094


Interesting.

 

The United States was always different on the existing "Find a Geocache" form: there is no choice for United States - you need to do it by state.

 

With the new search, it seems Germany (which is #2 in number of caches) is added as a special case. Canada (#3) is not affected.

 

 

This is pure speculation, but perhaps for the beta release they decided to implement it such that the two countries with the highest numbers of caches require one to enter each state, and left the rest such that just entering the country returns all results which match the other criteria for that country. That might not be how the released version will work, but by presenting it as they did, it gives us the opportunity to comment on the behavior.

 

The implementation of auto-suggest for the "Search Only In:" filter is really nice, especially because the "all country" search term appears first (e.g. Italy, then Italy: Lazio). I just noticed that when the auto-suggest list is shown (for example, try typing Ita), the regions aren't sorted alphabetically. From looking at a couple of other countries, it looks like the order is based on the number of caches in the region, in descending order. For example, the first region for South Africa is Gauteng, where Johannesburg is located, and the first for Italy is Lazio, where Rome is located.

 

 

The top 4 countries by number of caches (excluding archived; source: project-gc) at the moment I ran the query are:

 

1. USA 1,026,790

2. Germany 339,846

3. Canada 210,433

4. UK 179,094

 

I've poked around in project-gc quite a few times and have looked for a way to generate a list like this. How did you do this?


The implementation of auto-suggest for the "Search Only In:" filter is really nice, especially because the "all country" search term appears first (e.g. Italy, then Italy: Lazio). I just noticed that when the auto-suggest list is shown (for example, try typing Ita), the regions aren't sorted alphabetically. From looking at a couple of other countries, it looks like the order is based on the number of caches in the region, in descending order.

 

Strange. In the old search, the option "all" is always the top option and the states are ordered alphabetically, which is exactly what I would expect and prefer.

I do not have a use for auto-suggest and do not like it either in or outside of geocaching, but of course that's a personal preference.


What I miss is searching on attributes of a geocache. That would be something!!!

 

While that might make a worthwhile addition, it is not available in the current Seek A Cache search, which *I believe* this is designed to replace. That is currently something you can only do from a Pocket Query.


What I miss is searching on attributes of a geocache. That would be something!!!

 

While that might make a worthwhile addition, it is not available in the current Seek A Cache search, which *I believe* this is designed to replace. That is currently something you can only do from a Pocket Query.

 

Which is why I'm a little concerned that GS intends to use the new Advanced Search page not only to replace the existing search page but the PQ search page as well. We've been told for a few years that they don't plan on putting more development into PQs because the API is available.


This would probably require a significant amount of work, as it involves a change in how the page is designed, but hear me out.

 

As the Advanced Search page is currently designed, the "Update Filters" button brings up an overlay on top of the existing search form and results list. On a lot of online retail sites, what GS is calling "Filters" are referred to as facets. When one enters a search term (for example, men's pants for an online clothing store), a list of results shows below. Additionally, what we see in the Filters overlay appears in a column to the left. In the case of an online clothing store, there might be "facets" or filters for style (long/short sleeve), size, color, price range, and so on. As one selects a facet/filter, the results list is updated. There is no bouncing back and forth between selecting filters and viewing the results. Some might find having everything on one page cluttered but, personally - and a lot of user testing with sites I've worked on confirms it - many find it a lot more user friendly and intuitive.
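The facet pattern described above can be sketched in a few lines. This is an illustrative sketch only: the cache records, field names, and values are invented for the example, not Groundspeak's actual schema.

```python
from collections import Counter

# Hypothetical cache records; the fields are made up for illustration.
caches = [
    {"name": "Tubular Falls", "type": "Traditional", "size": "Small"},
    {"name": "Tubular Bells", "type": "Mystery", "size": "Micro"},
    {"name": "River Walk", "type": "Traditional", "size": "Regular"},
]

def facet_counts(results, field):
    """Count how many results fall under each value of a facet field."""
    return Counter(r[field] for r in results)

def apply_filters(results, **filters):
    """Narrow the result list; facet counts are then recomputed from it."""
    return [r for r in results if all(r[k] == v for k, v in filters.items())]

# Selecting a facet narrows the results, and the remaining facet
# counts are recomputed from the narrowed list - no page bouncing.
narrowed = apply_filters(caches, type="Traditional")
print(facet_counts(narrowed, "size"))  # → Counter({'Small': 1, 'Regular': 1})
```

The key design point is that the counts shown next to each facet are always derived from the currently filtered result set, which is what keeps the single-page layout coherent.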

 


I'm new to this whole geocaching thing and was beginning to wonder why there wasn't a more advanced method to search for caches, and then I stumbled onto this discussion. Still, I'm wondering: why the limitations? A really good search engine would let a person search by any criteria - e.g. location, owner, date, keyword - without so many limits, like a 30 mile radius. What if someone wanted to search for ALL geocaches that reference Bigfoot (keyword), of a particular size, say regular and larger (you got that covered), all over the globe? Can't do it here! Come on, GC, pull out all the stops so that we can really have fun with it!


What if someone wanted to search for ALL geocaches that reference Bigfoot (keyword), of a particular size, say regular and larger (you got that covered), all over the globe? Can't do it here!

Groundspeak says the poor performance of their current database is the reason for these restrictions. With modern database programs and servers, a database with a few million records generally shouldn't be hard to search, including keyword searches. Their database program must be ancient, their servers very slow, and/or their IT department doesn't know how to index.


Groundspeak says the poor performance of their current database is the reason for these restrictions. With modern database programs and servers, a dataset with a few million records generally shouldn't be hard to search, including keyword searches. Their database program must be ancient, their servers very slow, and/or their IT department doesn't know how to index.

:) Without a glimpse into their actual database structure, I'd either favour the latter, or give them the benefit of the doubt that their database is sufficiently complex that they haven't yet felt the need to prioritize an overhaul in order to optimize advanced complex searches (relative to their current database schema).


What if someone wanted to search for ALL geocaches that reference Bigfoot (keyword), of a particular size, say regular and larger (you got that covered), all over the globe? Can't do it here!

Groundspeak says the poor performance of their current database is the reason for these restrictions. With modern database programs and servers, a database with a few million records generally shouldn't be hard to search, including keyword searches. Their database program must be ancient, their servers very slow, and/or their IT department doesn't know how to index.

 

Many modern search engines don't search a database. Instead, data is extracted from a database to populate a search index, and the search query goes against the search index. I implemented a search engine for a site I developed almost 10 years ago using that mechanism. Our university library catalog (for which I was also a developer) uses a similar index built from a database. It has a little over 8 million records in it, and the server that it runs on isn't especially robust.

 

 


Many modern search engines don't search a database. Instead, data is extracted from a database to populate a search index, and the search query goes against the search index. I implemented a search engine for a site I developed almost 10 years ago using that mechanism. Our university library catalog (for which I was also a developer) uses a similar index built from a database. It has a little over 8 million records in it, and the server that it runs on isn't especially robust.

For most modern database programs, search indexing is an integral part of the software. No special programming is required to construct your own separate search engine. These days, most database developers are familiar enough with indexing to take advantage of this rather basic software feature.
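As a small illustration of the kind of built-in indexing this refers to, here is a sketch using Python's bundled SQLite. The table and rows are invented for the example; a real cache database would obviously be far larger and more complex.

```python
import sqlite3

# Toy table standing in for a cache database; data is invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE caches (gc_code TEXT, name TEXT, region TEXT)")
conn.executemany(
    "INSERT INTO caches VALUES (?, ?, ?)",
    [("GC100", "Tubular Falls", "Washington"),
     ("GC101", "River Walk", "Oregon"),
     ("GC102", "Tubular Bells", "Washington")],
)

# A plain B-tree index: with it, equality lookups on region no longer
# require scanning the whole table. This is the "basic software feature"
# the post mentions - one statement, no separate search engine.
conn.execute("CREATE INDEX idx_caches_region ON caches(region)")

rows = conn.execute(
    "SELECT gc_code FROM caches WHERE region = ? ORDER BY gc_code",
    ("Washington",),
).fetchall()
print([r[0] for r in rows])  # → ['GC100', 'GC102']
```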


Many modern search engines don't search a database. Instead, data is extracted from a database to populate a search index, and the search query goes against the search index. I implemented a search engine for a site I developed almost 10 years ago using that mechanism. Our university library catalog (for which I was also a developer) uses a similar index built from a database. It has a little over 8 million records in it, and the server that it runs on isn't especially robust.

For most modern database programs, search indexing is an integral part of the software. No special programming is required to construct your own separate search engine. These days, most database developers are familiar enough with indexing to take advantage of this rather basic software feature.

 

My point is that if the performance bottleneck is due to the complexity of the database, then rather than restructuring the database and how it's indexed (which could have far-reaching impacts on other areas of the application), it may be simpler to just use the database as a data store and write a simple application that extracts the data into a separate index like Solr or Elasticsearch and search against that. I developed a system which does just that. The system has just under a half million research articles, with a search/browsing interface with multiple facets. The application for searching these documents doesn't have a database at all, and the entire system is based on open source software.
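The extract-and-index pattern described here can be sketched very compactly. In a real deployment Solr or Elasticsearch would play the index role; a plain dictionary-based inverted index stands in below, and the table and rows are invented for the example.

```python
import sqlite3
from collections import defaultdict

# The database stays the system of record; data is invented.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE caches (gc_code TEXT, title TEXT)")
db.executemany("INSERT INTO caches VALUES (?, ?)",
               [("GC100", "Tubular Falls"), ("GC101", "River Walk")])

# Extraction step: pull rows out of the database and populate a
# separate inverted index mapping each term to the documents holding it.
index = defaultdict(set)  # term -> set of gc_codes
for gc_code, title in db.execute("SELECT gc_code, title FROM caches"):
    for term in title.lower().split():
        index[term].add(gc_code)

# A query never touches the database - only the index.
print(sorted(index["tubular"]))  # → ['GC100']
```

The point of the separation is exactly what the post argues: the database schema can stay as it is, and only the extraction step needs to know how to flatten it into searchable documents.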

 

 


My point is that if the performance bottleneck is due to the complexity of the database, then rather than restructuring the database and how it's indexed (which could have far-reaching impacts on other areas of the application), it may be simpler to just use the database as a data store and write a simple application that extracts the data into a separate index like Solr or Elasticsearch and search against that. I developed a system which does just that. The system has just under a half million research articles, with a search/browsing interface with multiple facets. The application for searching these documents doesn't have a database at all, and the entire system is based on open source software.

That doesn't seem like a very applicable solution for live, immediate, up-to-date searching of an active database. You could run it on a periodic snapshot of data, but there's still got to be a translation step from database complexity to indexed simplicity before the search can be executed.

Unless you mean updating the data in both sources simultaneously, so the new, indexed one can be searched. But then why not just replace the old one, if all the data will be mirrored there? Or you could add code to only update the new database with search-relevant data, but that would still be a significant code overhaul to gc.com for any function that modifies relevant data... =/


My point is that if the performance bottleneck is due to the complexity of the database, then rather than restructuring the database and how it's indexed (which could have far-reaching impacts on other areas of the application), it may be simpler to just use the database as a data store and write a simple application that extracts the data into a separate index like Solr or Elasticsearch and search against that. I developed a system which does just that. The system has just under a half million research articles, with a search/browsing interface with multiple facets. The application for searching these documents doesn't have a database at all, and the entire system is based on open source software.

That doesn't seem like a very applicable solution for live, immediate, up-to-date searching of an active database. You could run it on a periodic snapshot of data, but there's still got to be a translation step from database complexity to indexed simplicity before the search can be executed.

Unless you mean updating the data in both sources simultaneously, so the new, indexed one can be searched. But then why not just replace the old one, if all the data will be mirrored there? Or you could add code to only update the new database with search-relevant data, but that would still be a significant code overhaul to gc.com for any function that modifies relevant data... =/

 

I've got a couple different systems that I've developed that use an external search index.

 

For one of them, it *does* update both sources simultaneously. For example, if a record is inserted/deleted/modified in the database, it calls a service which updates the search index. All the search index functionality is encapsulated in a service that handles all CRUD operations for the search index, so it only requires a few lines of additional code to call the SearchIndexService to perform the necessary operation.

 

For the other system, the datastore (implemented with database technology) and the search index are completely separate. I have a similar search index service for building the search index from scratch (from the database) or updating records in the index. However, the search index is a lot more static, since the box the application runs on is designed to work in environments with marginal or no internet access. Currently, the search index is only updated annually.

 

One of the advantages of this mechanism is that not all the data in the search index has to come out of the database. For the second system I described, metadata about research articles comes out of the database, and full text is extracted from the articles (they're all PDF files) and indexed as well. Some of the data in the search index is derived on the fly: after the full text is extracted, that text is autotagged from a controlled vocabulary, and those tags are added to the index as well. It takes a few hours to initially create an index of a half million articles, but updates to a specific record in the index are extremely fast.
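The autotagging step can be sketched as follows. The vocabulary and the matching rule (simple word overlap) are illustrative assumptions; a real system would use a more sophisticated matcher.

```python
# Illustrative controlled vocabulary; real ones are much larger.
CONTROLLED_VOCABULARY = {"hydrology", "geology", "cartography"}

def autotag(full_text):
    """Return the vocabulary terms that appear in the extracted text."""
    words = set(full_text.lower().split())
    return sorted(CONTROLLED_VOCABULARY & words)

# The tags are derived on the fly from the extracted full text and
# stored alongside the database metadata in the index document.
doc = {
    "title": "Stream gauging methods",
    "full_text": "A survey of hydrology and cartography techniques",
}
doc["tags"] = autotag(doc["full_text"])
print(doc["tags"])  # → ['cartography', 'hydrology']
```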

 

This is diverging quite a bit from the topic, but it just seems like every time discussion comes up about searching on the GS site, many assume that it's just a matter of changing the SQL used to query the database, and my point is simply that there are other ways to implement a search engine.

 

 

 


This is diverging quite a bit from the topic, but it just seems like every time discussion comes up about searching on the GS site, many assume that it's just a matter of changing the SQL used to query the database, and my point is simply that there are other ways to implement a search engine.

Completely agreed. And yeah, what you describe is a much more technical description of the concepts I very simplistically implied :) I haven't implemented any systems to the degree you've described, but I understand, and yep - I think the amount of work required to build an external search index, using either of the methods you described, is much more than Groundspeak is willing to invest at the moment, since all the data is in their database (afaik) as opposed to being contained in sources/files outside the database.

 

But I do think the most time-efficient option would be to set up an indexed search database that mirrors the searchable parameters, and build in a service or trigger that updates any relevant search data whenever there's an adjustment to specific data in the primary database. The search function would run on that indexed source. Full text searches would be a bigger thing, though.

 

Basically, they'd be recreating a Google-search type function. Google doesn't search the whole internet - they just keep their own optimized database of searchable content and push the user to the original source in the results.

I wonder if Groundspeak could license google's search technology for the geocache database :)


Yes, but that's not implemented in-house; that's just searching through Google's database of publicly accessible and analyzed web pages. I mean, deploy the Google search farm internally, directly interacting with the database, as opposed to just limiting Google search results to public-access web pages (for example, Google doesn't return results for PMO cache pages). Does Google even offer that level of product (and/or hardware) licensing? *shrug* :)


This is diverging quite a bit from the topic, but it just seems like every time discussion comes up about searching on the GS site, many assume that it's just a matter of changing the SQL used to query the database, and my point is simply that there are other ways to implement a search engine.

Completely agreed. And yeah, what you describe is a much more technical description of the concepts I very simplistically implied :) I haven't implemented any systems to the degree you've described, but I understand, and yep - I think the amount of work required to build an external search index, using either of the methods you described, is much more than Groundspeak is willing to invest at the moment, since all the data is in their database (afaik) as opposed to being contained in sources/files outside the database.

 

But I do think the most time-efficient option would be to set up an indexed search database that mirrors the searchable parameters, and build in a service or trigger that updates any relevant search data whenever there's an adjustment to specific data in the primary database. The search function would run on that indexed source. Full text searches would be a bigger thing, though.

 

Basically, they'd be recreating a Google-search type function. Google doesn't search the whole internet - they just keep their own optimized database of searchable content and push the user to the original source in the results.

I wonder if Groundspeak could license google's search technology for the geocache database :)

 

They don't have to. The mechanism I described includes a query parser that implements Google-search syntax by default. So one can enter "Challenge size:small difficulty:5" and get results with Challenge in the title, a size of small, and a difficulty of 5.

 

The way it works for a full text search is that when you build the index, you define what goes into the default "text" field. For example, it could take the cache title and the long and short descriptions and concatenate the strings into one field. The title could also be a separate field. A search for "park" would return results where "park" occurred in the title or the long or short description; a search for "title:park" would only return caches where the word park was in the title.
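A toy version of that query syntax, with invented field names and a naive substring matcher, might look like this (real query parsers such as Lucene's are far more capable):

```python
def build_doc(title, short_desc, long_desc, **fields):
    """Assemble an index document with a default 'text' field that
    concatenates title and descriptions, plus separate named fields."""
    doc = dict(fields, title=title)
    doc["text"] = " ".join([title, short_desc, long_desc]).lower()
    return doc

def parse_query(query):
    """Split 'Challenge size:small difficulty:5' into (field, term) pairs;
    bare terms target the default 'text' field."""
    pairs = []
    for token in query.split():
        field, _, value = token.partition(":")
        if value:
            pairs.append((field.lower(), value.lower()))
        else:
            pairs.append(("text", field.lower()))
    return pairs

def matches(doc, query):
    """Naive matcher: every (field, term) pair must hit its field."""
    return all(term in str(doc.get(field, "")).lower()
               for field, term in parse_query(query))

doc = build_doc("Challenge of the Park", "a small hide", "bring a pen",
                size="small", difficulty="5")
print(matches(doc, "Challenge size:small difficulty:5"))  # → True
print(matches(doc, "title:park"))                          # → True
```

This mirrors the behaviour described in the post: "park" alone searches the concatenated text field, while "title:park" restricts the match to the title field.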


Yes, but that's not implemented in-house; that's just searching through Google's database of publicly accessible and analyzed web pages. I mean, deploy the Google search farm internally, directly interacting with the database, as opposed to just limiting Google search results to public-access web pages (for example, Google doesn't return results for PMO cache pages). Does Google even offer that level of product (and/or hardware) licensing? *shrug* :)

 

Google Search Appliance


I'm really happy you guys are putting some thought into this. This is something we've really needed. I appreciate it!!

 

We've really been needing a search engine where we can do a search on something within our areas, like "Tubular" within Seattle.

 

I also think it's a great idea to check with a few people and get feedback.

 

 

Now all that being said, I can't get it to work at all.

 

I tried caches within the state of Washington with the name "tubular" in the title. It said, "DNF".

 

So I tried all caches anywhere with "tubular" in the title. "DNF"

 

So then I tried clearing all filters and tried just "Dayspring". "DNF"

 

Maybe it's down today.

 

I went back to the old search engine and tried "tubular" and got a whole lot of caches, but not Dayspring's caches. I freaked out and went to the map to see if they were all archived, but easily found the ones I've found.

 

I'd love to find all of his caches that I haven't found already. I can use the old search for now, but it was a good test of the new one.


I tried caches within the state of Washington with the name "tubular" in the title. It said, "DNF".

 

So I tried all caches anywhere with "tubular" in the title. "DNF"

 

So then I tried clearing all filters and tried just "Dayspring". "DNF"

 

Maybe it's down today.

 

Try this: https://www.geocaching.com/play/search?originomitted=True&kw=tubular&owner=Dayspring&r=48

 

Were your searches set up using the same criteria?


I tried caches within the state of Washington with the name "tubular" in the title. It said, "DNF".

 

So I tried all caches anywhere with "tubular" in the title. "DNF"

 

So then I tried clearing all filters and tried just "Dayspring". "DNF"

 

Maybe it's down today.

 

Try this: https://www.geocachi...=Dayspring&r=48

 

Were your searches set up using the same criteria?

 

Since this issue has come up several times, it probably indicates that the feedback given to the user is not sufficient. I think there is some confusion regarding how the search behaves when entering a location in the search form versus entering a location in the "Search only in" filter.

 

Here's how I would explain it.

 

When entering a location in the search form (City, State, coordinates, GC code), that location is geocoded (converted from a place name to a set of lat/long coordinates, if it can be) and then the proximity limit takes effect. In this case, the maximum search radius is 30 miles relative to the lat/long coordinates determined by the geocoding. If I enter Washington into the search box, assuming it converted Washington to a set of lat/long coordinates, those coordinates (i.e. a center point) are likely somewhere in "the middle" of Washington state. Since the other criteria are set to caches with "tubular" in the title owned by Dayspring, it's not going to show caches in Seattle, because they're well beyond the 30 mile radius from the lat/long coordinates for Washington.

If, however, I enter Seattle, Washington in the search box, the center point is now some place in Seattle, and I not only get the three Dayspring "tubular" caches but also see each one's distance from that center point. The advantage of specifying a location in the search form is that you get a distance value from the center point; the disadvantage is that you only see caches within 30 miles of it. If instead the "Search only in" filter is used (selecting United States: Washington) and the location string is removed from the search box, the list of caches will be the same (because all the Dayspring "tubular" caches are in the United States: Washington region), but you won't see a distance value. That is because, instead of showing caches within proximity to a center point, it shows caches whose countryID or stateID matches the specified region.
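
The two modes described above can be sketched in a few lines of Python. The cache data, field names, and center coordinates here are made up for illustration; only the behavior (a 30-mile radius around a geocoded point versus a plain region-ID match) follows the explanation, not Groundspeak's actual implementation:

```python
from math import radians, sin, cos, asin, sqrt

def miles_between(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(h))

caches = [
    {"name": "Totally Tubular", "pos": (47.61, -122.33), "state": "WA"},  # Seattle-ish
    {"name": "Tubular Bells",   "pos": (47.66, -117.43), "state": "WA"},  # Spokane-ish
]

# Mode 1: location in the search box -> geocode to a center point, 30-mile cap.
center_wa = (47.25, -120.74)  # roughly mid-Washington (illustrative)
near = [c for c in caches if miles_between(center_wa, c["pos"]) <= 30]

# Mode 2: "Search only in" filter -> match the region ID, no distance involved.
in_region = [c for c in caches if c["state"] == "WA"]

print(len(near), len(in_region))  # the radius search misses both; the region match keeps both
```

Both example caches are well over 30 miles from mid-state, so the radius search returns nothing while the region match returns both, mirroring the "DNF" reported earlier in the thread.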

Combining a search location with the "Search only in" filter produces a result that was previously not possible. Try this:

 

Enter Buffalo, NY in the search box, then click "Change Filters". Select the maximum radius (30 miles), select "United States: New York" in the "Search only in" filter, and click "Update Search". When you get results, click "Map these results". You'll see a bunch of caches within 30 miles of Buffalo, NY, but none that require crossing the border into Canada.
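
The Buffalo example boils down to keeping only caches that pass both tests: within the radius of the center point AND inside the chosen region. A tiny standalone sketch (the cache data and distances are invented for illustration):

```python
caches = [
    {"name": "Near Buffalo, NY side", "miles_from_buffalo": 12, "state": "NY"},
    {"name": "Across the border",     "miles_from_buffalo": 9,  "state": "ON"},
    {"name": "Rochester-ish",         "miles_from_buffalo": 60, "state": "NY"},
]

# Combined filter: proximity to the center point AND region-ID match.
results = [c for c in caches
           if c["miles_from_buffalo"] <= 30 and c["state"] == "NY"]

print([c["name"] for c in results])  # the nearby Ontario cache is filtered out
```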

Link to comment

I tried caches within the state of Washington with the name "tubular" in the title. It said, "DNF".

Try not specifying anything in the initial "City, state, coordinates, GC code" box. Instead, simply click on the "Add filters" box.

 

Geocache name includes... tubular

 

Search only in... United States: Washington

 

For me, this yielded six results, including three in Seattle, one near Spokane, one near Bellingham, and one near (sort of) Sunnyside.

Link to comment

Yes, having "State" in the empty-text label for the search field is confusing - entering a state name does not become a search for all caches in that state; it is translated to whatever the GPS coordinates are for the state, which are then used as the center point.

 

Perhaps a label more indicative of the search field's purpose would be not "Search location" but "Search from center point", or something of that sort.

Link to comment

One of the greatest desires I have is the ability to do a search, which can be downloaded as a GPX file, of a wide area without power trails.

 

Yeah, power trails are all micros, but sometimes I'm looking for old micros, or micros for a challenge, or some other reason. Power trails saturate my searches, and it would be nice to find all caches within 500 miles not hidden by (power trail publisher 1, power trail publisher 2, ...), giving a nicely scattered result of every cache in the radius that is not part of a power trail.

 

(all this because we don't have an attribute for Power Trails :) )

 

It would be nice to be able to see what Challenges are around the area I'll be visiting. Stuff like that.

 

I travel often and wish to generate some decent queries, but they are often distorted by what I don't want.

Link to comment
