Everything posted by Zor

  1. OK, so I am just now getting back to this. What is the redirect URI? Is this something GSP was supposed to ask for, or can we put in anything we want to bounce back to here?
  2. I've been using the API for several years but drifted away from this work for a while and was unaware that a "new" API had been implemented. As a result, none of my API apps work anymore. I had an OAuth PHP script that I used to generate a token that I could then use in my apps when calling the GC API. That script still generates OAuth tokens, but when I try to use those tokens, all I get is "not authorized". I'm guessing that either the tokens aren't valid because the script is using the old API, or my app is using the tokens incorrectly because it's accessing the old API. Can someone take a look at this code and give me any insight as to what needs to change for me to get tokens using the new API, or whether I can still use it at all? Here's the PHP I was using to generate an OAuth token:

```php
<?php
// OAuth 1.0a request/access token dance against the old mobile endpoint.
include_once "OAuth.php";

$key     = 'MyKey'; // prod
$secret  = 'MyKey'; // prod
$geoauth = 'https://www.geocaching.com/OAuth/mobileoauth.ashx'; // prod

$sig_method    = new OAuthSignatureMethod_HMAC_SHA1();
$auth_consumer = new OAuthConsumer($key, $secret, NULL);
$oauth_token   = isset($_GET['oauth_token']) ? $_GET['oauth_token'] : '';

if (empty($oauth_token)) {
    // Step 1: fetch a request token, then redirect the user to authorize it.
    $params = array('oauth_callback' => 'http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']);
    $req = OAuthRequest::from_consumer_and_token($auth_consumer, NULL, "GET", $geoauth, $params);
    $req->sign_request($sig_method, $auth_consumer, NULL);

    $ch = curl_init($req->to_url());
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $output = curl_exec($ch);
    curl_close($ch);

    // parse_str() into an array: the one-argument form was removed in PHP 8.
    parse_str($output, $resp);
    if (empty($resp['oauth_token']) || empty($resp['oauth_token_secret'])) {
        echo 'Could not get an auth token<br>';
        echo 'Error returned: ' . $output;
        die();
    }
    setcookie('auth', $resp['oauth_token_secret'], 0);
    header('Location: ' . $geoauth . '?oauth_token=' . urlencode($resp['oauth_token']));
} else {
    // Step 2: exchange the authorized request token for an access token.
    $auth_token = new OAuthConsumer($key, $_COOKIE['auth'], NULL);
    $params = array(
        'oauth_verifier' => $_GET['oauth_verifier'],
        'oauth_token'    => $oauth_token,
    );
    $req = OAuthRequest::from_consumer_and_token($auth_consumer, $auth_token, "GET", $geoauth, $params);
    $req->sign_request($sig_method, $auth_consumer, $auth_token);

    $ch = curl_init($req->to_url());
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $output = curl_exec($ch);
    curl_close($ch);

    parse_str($output, $resp);
    if (empty($resp['oauth_token'])) {
        echo 'Could not get an OAuth access token';
        die();
    }
    echo 'The OAuth access token is ' . $resp['oauth_token'];
}
?>
```
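For comparison, the newer Groundspeak API uses OAuth 2.0 rather than OAuth 1.0a, so the HMAC-SHA1 request signing above goes away entirely. A minimal sketch of the standard OAuth 2.0 authorization-code flow follows; the endpoint URLs and the exact parameter set are assumptions here, so check the current API documentation before relying on them:

```python
from urllib.parse import urlencode

# Placeholder endpoints -- verify against the current API documentation.
AUTHORIZE_URL = "https://www.geocaching.com/oauth/authorize.aspx"
TOKEN_URL = "https://oauth.geocaching.com/token"

def build_authorize_url(client_id, redirect_uri, state):
    """Step 1: send the user here; the provider redirects them back to
    redirect_uri with a one-time ?code=... query parameter."""
    return AUTHORIZE_URL + "?" + urlencode({
        "client_id": client_id,
        "response_type": "code",
        "redirect_uri": redirect_uri,
        "state": state,  # echoed back; lets you reject forged callbacks
    })

def build_token_request(client_id, client_secret, redirect_uri, code):
    """Step 2: POST this form body to TOKEN_URL to exchange the code for a
    bearer access token (no per-request signing needed afterwards)."""
    return {
        "grant_type": "authorization_code",
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
        "code": code,
    }
```

Once the token response comes back, the access token goes into an `Authorization: Bearer ...` header on every API call, which is likely why old OAuth 1.0a tokens now return "not authorized".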
  3. I just recently learned about the CHS from a reviewer and lackey at a mega I was at a few weeks ago. When I saw that rewards were being distributed across the globe, I knew right away that this "score" would play a factor in how they were assigned. I am sure it was not the only factor, but I am betting that the better the score on the cache, the better the chance. Combine that score with additional information such as how many finds the owner has, how many hides, whether they've cached in other places, the general saturation of caches in their area, the population of cachers in that area, and other factors unbeknownst to us, and you get a formula that would spit out the candidates that made the most sense. I also suspect that the list it created may have been vetted afterwards to see if TPTB agreed with the names it produced. Honestly, I think this was the best and fairest way to pick people. I didn't get one, nor did anyone in my entire province from what I know, but every area is different and some areas were going to be overlooked just by the nature of it being a calculated selection. If GSP had hand-picked people, there would have been a huge argument about why they chose those people and not others who many felt were worthy. By letting the computer do it, they removed the subjectivity and tried as best as they could, within the parameters they set, to be fair. Just my two cents on it.
  4. People have been BEGGING for something like this for ages. It is SO nice that HQ is acknowledging this. I am sure there will be issues, like there is with anything really, but this is something that at least allows something new in this realm to happen.
  5. I've hosted a ton of events and I have had an event where no one showed up. It was in my own hometown and for whatever reason, people never showed up. I still logged an attended as I was there. To me, it's about intention. If the intention is to host an event so others will come, that's one thing. If the intention is to host an event just to get the smiley, that's a different story.
  6. So for us, the trip is about getting all 48 states with a bit of vacation time in. The biggest thing we wanted to do was define a "route" that maximized our chances of getting all 48 states, while also going through a few areas that we wanted to see vacation-wise. You can see our route and website here. We are sticking to caches we can get on our route or near to it. We'd like to include as many virtuals per state as we can, but also want to have traditionals. The idea is to make sure we get at least a couple of caches per state so we know definitively that we get each state. We're 18 days from our departure, and we've basically decided to do a combination of CaaR PQs, range PQs in larger areas like Vegas, Seattle, etc., and I'm also using the CaaR macro suggested above to get another view of the CaaR caches. Plus we know that things will come up and change what we decide to go for, so we're not committed to just the route. The real commitment is to make sure we get all 48 states.
  7. I'm actually the youngest in our group coming in at the young age of 43. Never too old to lose your mind on a road trip
  8. I have been working with three other cachers on a major road trip in September. Inspired by the 48-in-a-week trip that was done a while ago, we're doing all 48 states in two weeks. Throwing in some vacation time in the middle of the trip, and trying to snag a good combination of virtuals and physical caches, we're trying to make it as interesting as we can. The biggest part of this project has been fine-tuning our route to make sure we can maximize the use of our time. The other big challenge is making sure we capture as many caches in our pocket queries and GSAK searches as we can. We don't want to rely on just live map updates with cell phones, so we're covering as much as we can.
  9. We are actually taking a lot of time to "smell the roses" and enjoy what we are surrounded by. Yes, the route we are taking is one we developed which you can see here. This whole trip started out as a way to get caches in all 48 states (despite the fact that I have almost half of them already) but now that we are getting closer, the trip is less and less about geocaching and more about 4 friends enjoying a wicked road trip. The places we will see and landscapes we will be exposed to will be amazing. It's not often the opportunity to go on a trip like this comes around so we will be absorbing as much of everything around us as we can.
  10. What got me to start looking at options like these was that it appears in some cases the official CaaR routes and PQs skip or miss certain caches. I found it hit and miss when running the queries even with a very refined distance between my route and the surrounding area. When I ran the macro you mentioned and compared the caches to the official listings via Google Earth, they are identical. This makes me think the API method is more reliable than the official PQ method. Snagging most of the US with a DT filter would be fine in a lot of cases, but we're going to have situations where we'll be looking at caches beyond the smaller range so we want to include as much as we can. I think what is going to happen is it will end up being a mix of macros like these (which are awesome btw) and official PQs and CaaR PQs.
  11. Trying the route macro now. Only issue I had was that all of my content was in KMZ format. I had to convert it to GPX but the GPX format wasn't compatible. I did manage to find a way to tweak the GPX manually to allow it to load into the route macro. I've got that one running now and will try the others after. Thanks again!
  12. How did you do that particular load in GSAK?
  13. A lot of great suggestions here, thanks! We will have 4 mobile devices plus multiple GPS units, but with things like limited battery power and areas with no cell coverage, we wanted to make sure we covered ALL of our bases. I had no idea there was a GSAK macro for doing CaaR. I am definitely going to try that.
  14. Me and three other geocachers are doing a road trip in September inspired by the 48-in-1-week trip that was done previously. We too are doing all 48 states in one trip but are doing it over two weeks. Looking for any recommendations on the best way to collectively do PQs for this trip. We've got our route finalized and can do CaaR for the trip but wanted to know if anyone here has any suggestions on the best way to tackle something of this scale? All four of us are pretty seasoned cachers but I figured I'd ask here for additional suggestions. What we've ended up doing so far is splitting the route among each of us and doing multiple CaaR PQs for our part of the route. Each PQ is 1000 caches on either side but only within 1km (we're Canadian) of the road. I've also got PQs for large population areas (NYC, Vegas, San Francisco, Seattle, etc) and a few extra ones for things like virtuals we want to get and the ET highway, etc. That seems to be covering as much as we think we need but I'm wondering if anyone here who has done large road trips has had issues missing caches via the PQs, or other unknowns we haven't heard of. Any suggestions or info is appreciated. Thanks.
  15. In my area, we've talked about this exact issue many times, even on our podcast. Others have said it, but here's my two cents. It's just my own opinion...

      If you go back to when geocaching started, the log book wasn't just for signing your name. The original first few caches had logs in them so people could write about how they found it and tell a little story about their adventure. This is something the owner could read and smile at. It was also a way for the owner to know that user X actually found the container. This is what I read and learned about when I first started caching (2007). I wasn't around in those days, but I would guess that the owners back then may have occasionally read their log book for the stories.

      But nowadays, the log books are literally just a piece of paper for your name. There's no real incentive, or need, for a cache owner to look at the log. Most cache owners (not all, but definitely most of the ones I know personally) never check their logs and compare them to the online ones. Why? Because for most owners, it's not the end of the world if someone doesn't sign the log on a cache in a lamp post or guard rail. In 10 years of geocaching, I have only checked a couple of caches and compared the written logs to the online logs. In fact, a lot of the events we have in our province have no log book at all.

      The "rule" may say that you can't log online without signing the physical log, but it's up to the owner whether or not that needs to be enforced. For me, the "rule" is a way for Groundspeak to have something to fall back on when there's an issue between an owner and a finder. If an owner deletes a log because they checked the physical log and saw that the finder didn't actually sign it, and the finder then disputes it with Groundspeak, Groundspeak can say, "We can't restore your log because our guidelines say you have to sign the physical log to log it online." If they didn't have that rule, anyone could log any cache and a cache owner would have nothing to back up their reason for deleting a log.

      There are always going to be people who stick hard and fast to the exact word as written. But even the actual law around us is not that black and white. How many times do judges interpret the true meaning or intention behind a law to address a specific case? It's never black and white, no matter what anyone says. So for me, I take that rule as something written to protect owners, and finders, for when the situation becomes cloudy. Again, just my opinion.

      There's also the fact that the entire point of geocaching is to go out and "find" a container. That's the actual point of this activity. We FIND things. If I find the physical container, to me, that IS a find: I found the container by navigating to the location given to me. But even that, for me personally, is sometimes not enough. There have been times where I found the container, or saw it (which is technically finding it), and did not log it because I knew the point of the cache was for me to do something else to find it. Underwater caches or caches that require climbing are prime examples. I would not feel comfortable signing those, knowing the owner went to great lengths to place them with the intention of them being found a certain way. Not everyone works like that, but I have my own thresholds of what I think makes sense. Just my own take on the whole logging thing...
  16. I'm looking at writing an application that pulls information from the GC API and then generates a GPX file to be used by a mobile device or GPS. My understanding is that the "Groundspeak:" extension to the GPX format is a private extension owned by Groundspeak. I found the XSD at http://static.Groundspeak.com/cache/1/0/1/cache.xsd, where it indicates it is private. My question is: what are the terms of usage for this format? I know tools like GSAK are able to produce GPX files readable by GPS units and mobile devices. Do I need a specific license from Groundspeak to use these extensions, or are they made available as part of the API agreement? I really just want to make sure that if I create Groundspeak-compatible GPX files from my own app, I'm not violating some other license. Any information is appreciated. Thanks.
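For reference, the shape of such a file is simple enough: a standard GPX waypoint with a `groundspeak:cache` element nested inside it. Here's a rough sketch assuming the element names from the cache.xsd linked above (only a subset of the schema's fields is shown, and the licensing question still stands regardless):

```python
# Namespace from the XSD referenced in the post; verify the version
# number against the schema you actually target.
GS_NS = "http://www.groundspeak.com/cache/1/0/1"

def cache_waypoint(code, lat, lon, name, ctype, diff, terr):
    """Render one GPX <wpt> carrying a minimal groundspeak:cache extension.
    Assumes inputs are already XML-safe; real code should escape them."""
    return f"""<wpt lat="{lat}" lon="{lon}">
  <name>{code}</name>
  <type>Geocache|{ctype}</type>
  <groundspeak:cache available="True" archived="False" xmlns:groundspeak="{GS_NS}">
    <groundspeak:name>{name}</groundspeak:name>
    <groundspeak:type>{ctype}</groundspeak:type>
    <groundspeak:difficulty>{diff}</groundspeak:difficulty>
    <groundspeak:terrain>{terr}</groundspeak:terrain>
  </groundspeak:cache>
</wpt>"""
```

Wrapped in a `<gpx>` root element, output like this is what GSAK-style tools emit, which is why the question of whether producing it independently needs a license is worth asking.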
  17. The edits I did were today and the test snapshot I tried to create was today as well. In fact, I just tried it again and I am still getting the error. I can send you the link to my adventure via email if you want.
  18. So up until today, I have had no issues with creating the lab caches and environment for our upcoming mega. Then today, I go to add our last cache, and noticed the Cache Trigger was new. I modified my labs to all have a cache trigger and then tried to create a test snapshot.. No go. After I click create test snapshot, I get this error: 500 {"Message":"Question is required"} Nothing else, and no snapshot. I have emailed Groundspeak directly but I need to find out if this can get fixed immediately. Our mega is this Saturday and we have been heavily promoting the Lab caches as part of this event. Prior to the addition of the Trigger, all my snapshots worked just fine. And yes, I did add in the trigger and questions as required by each of the cache listings. Anyone else having this issue?
  19. In 7 years of using geocaching.com, I could probably count on one hand how many times I have used the Advanced Search feature. Yes, it is handy once in a while, but I never use it. However, the List Newest in my home province I use almost every day. I know MANY people who use it, because they go to their "my" page and click that link to see the most recent caches and recent events. I use it for the events more than anything, but I also use it when we do the Cache Up NB podcast to see what caches have been hidden in the last month. Log into profile, click a link, done. It doesn't need to be more complicated than that. I don't understand how one link can be such a bother when time and effort could be spent on other things. If the idea is that many folks don't populate the state field correctly, maybe you should adjust the Find Nearest link to use home coordinates before removing it entirely. For those of us who use it, we could continue to do so until you have modified your code to use the home coords instead. There's no need to remove it and then re-add something similar later using home coords.
  20. It doesn't surprise me that they have these kinds of conditions. It's mostly about liability and getting sued by people who claim the network ripped off their idea, etc. I know at one time when Star Trek was on the air they had an open script policy where anyone could send one in. The other option would be to actually submit a script through an agent of some kind. Seems like a long process to get geocaching on TBBT tho. It would be pretty awesome to see Sheldon's reactions to geocaching.
  21. Does anyone know the GC code of the very first official MEGA event? I'm trying to do a bit of history on the subject and can't find much on the first mega.
  22. Does anyone here know if a change was made recently that disabled the sending of emails when you now authorize GSP via the API? The website I run uses a GC login plugin that leverages the API. Everytime a user logs in, it would present the "authorize" screen from GSP's API, and then send them an email if they clicked allow. It seems these emails are no longer sent. I'm just wondering if this was intentional or by accident. I actually prefer not to get the email but am just wondering if it should be doing that.
  23. Instead of GSAK 101, do Geocaching Software 101 and include GSAK as one of the topics. I think that makes it less commercial.
  24. This was exactly how I approached my first WiG cartridge. I wanted something that wasn't just a tour or walkthrough, but something that had a taste of those exact elements. I ended up combining The X-Files and LOST to create an adventure where someone had to solve a mystery around one of our local lakes. It takes almost 2 hours to complete the whole thing, but man, was it satisfying to make. I really wish more of those kinds of WiGs were out there.
  25. Did anyone ever get an answer on this? Do we know if the project is dead or just in limbo at the moment?