
Augmented Reality Caches



Areas of Poor Reception:

 

I was in a sheltered valley with poor reception, testing out a Metaverse experience. Metaverse requires a constant internet connection, and I was having a difficult time with the app cutting in and out. Since you are able to download and save a TaleBlazer (TB) game to your device, I thought I would test that out with the same concept. But even with TB, I had some difficulty. TB programming gives you the option of a live map (Google) or a custom map. The live map is nicer because you can zoom in or out and even look at areas not related to the experience, but it requires a constant connection. You could opt for a custom map instead, and if your play area is small and confined, that would be fine: you can complete the game with just your phone's GPS. But mine was a nature trail about a mile long, and a screen capture of the satellite image of that entire area results in poor viewing quality when you try to zoom in. Although you can complete the game without the map and without internet connectivity, it's not a great experience, especially if the cacher doesn't know to download the game before coming to the posted coordinates.
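Incidentally, the blurry screen-capture problem can be worked around by stitching standard map tiles at a high zoom level into one big custom-map image before you head out. A rough sketch in Python, assuming the public OpenStreetMap tile server plus the requests and Pillow libraries (mind the OSM tile usage policy if you try this):

```python
# Stitch OpenStreetMap tiles into one large offline map image.
# Assumes the standard OSM tile URL scheme and the requests/Pillow
# libraries; check the OSM tile usage policy before bulk-downloading.
import io
import math
import requests
from PIL import Image

def deg2tile(lat, lon, zoom):
    """Convert latitude/longitude to Web Mercator tile indices."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

def stitch(lat_min, lat_max, lon_min, lon_max, zoom=16):
    """Download every tile covering the bounding box and paste into one image."""
    x_min, y_min = deg2tile(lat_max, lon_min, zoom)  # NW corner tile
    x_max, y_max = deg2tile(lat_min, lon_max, zoom)  # SE corner tile
    mosaic = Image.new("RGB", ((x_max - x_min + 1) * 256,
                               (y_max - y_min + 1) * 256))
    for x in range(x_min, x_max + 1):
        for y in range(y_min, y_max + 1):
            url = f"https://tile.openstreetmap.org/{zoom}/{x}/{y}.png"
            resp = requests.get(url, headers={"User-Agent": "offline-trail-map-demo"})
            tile = Image.open(io.BytesIO(resp.content))
            mosaic.paste(tile, ((x - x_min) * 256, (y - y_min) * 256))
    return mosaic

# Example with a hypothetical trail corridor:
# stitch(44.95, 44.97, -93.10, -93.07, zoom=16).save("trail_map.png")
```

At zoom 16 each tile spans roughly 600 m, so a mile-long trail is only a handful of tiles and stays sharp when you zoom in, unlike a single full-area screen capture.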

 

So for areas of poor reception, I recommend Wherigo.

Link to comment

Note that the navigation would be the same experience with TaleBlazer or Wherigo - Wherigo would need data to load the live map as well. I suppose map caching would depend on the device/OS.

 

23 minutes ago, Team Christiansen said:

Metaverse requires a constant internet connection, and I was having a difficult time with the app cutting in and out.

 

Yeah, that's on the owner, I'd say. You can't create an experience that requires internet and then place it somewhere without internet :P  I don't think owners can be required to test their experience before the cache is published (how could that be verified?), but they can certainly be strongly encouraged to do so, much like the checkbox stating sufficient permission has been obtained for any geocache placed.

 

AR creators should be aware of whatever limitations exist for the platform they're using.

Even Wherigo cartridges can be buggy, and the creator needs to be aware of those bugs and fix them before the cache is published.

Link to comment
55 minutes ago, Team Christiansen said:

I was in a sheltered valley with poor reception, testing out a Metaverse experience. Metaverse requires a constant internet connection, and I was having a difficult time with the app cutting in and out.

Yeah, Intercaching runs into the same issue. It's a browser-based system, so it runs in the mobile browser and requires a constant internet connection.

 

Actually, a lot of smartphone apps seem to assume a constant internet connection. One of the things I miss about Palm PDAs is that PalmOS worked really well in situations where you didn't have an internet connection. Of course, it was designed for that situation, where the device had an internet connection only when it was connected to the docking station plugged into your desktop computer. But it's frustrating when apps don't cache data properly so they can work when you're offline.
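That offline-first behavior is simple enough to sketch: serve data from a local cache when the network is unavailable, and refresh the cache whenever a fetch succeeds. A minimal example in Python (the URL and cache file name are placeholders, not any particular app's internals):

```python
# Offline-first fetch: fall back to the last cached copy when the network fails.
import json
import pathlib
import urllib.request

CACHE = pathlib.Path("cache_data.json")  # placeholder local cache file

def fetch(url, timeout=5):
    """Return (data, source); source is 'live' or 'cached'."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = json.loads(resp.read())
        CACHE.write_text(json.dumps(data))  # refresh the local copy
        return data, "live"
    except OSError:  # URLError subclasses OSError: no network, DNS failure...
        if CACHE.exists():
            return json.loads(CACHE.read_text()), "cached"
        raise  # offline and nothing cached yet

# data, source = fetch("https://example.com/cache_details.json")
```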

  • Upvote 1
Link to comment
4 minutes ago, niraD said:

Actually, a lot of smartphone apps seem to assume a constant internet connection.

 

Yes, that really bugs me. Especially a few geocaching apps *cough* that don't actually provide a completely offline geocaching experience ;):ph34r:  (other than OS-level uncached map sources)

But yeah, the assumption that smartphones are always connected gives me the feeling that developers don't necessarily even consider optimizing data usage. I like to know when my data is being used and for what. It's another reason I dislike using cloud-based services, especially ones that work transparently. Such a data hog.

Link to comment

Well, I'd made up my mind to print and laminate a Zapcode and fix it at the starting location....

...then I remembered that a Zapcode would probably be classed as a physical stage and it would then fall foul of proximity rules.  :(

Enthusiasm drifting away....

 

M


 

Link to comment
On 6/14/2018 at 5:26 PM, Team Christiansen said:

 

One attempt at getting cachers to follow a particular sequence of multiple Metaverse experiences is explained by the cache owner in the cache description of this cache.

I don't know how this particular CO solved it, but you can easily force a sequence by using the 'Has completed sequence' block.
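For anyone curious, that block amounts to plain completion-state gating: each experience records that it was finished, and later experiences refuse to run until their prerequisites are done. A minimal sketch of the logic in Python (illustrative only; `completed` and `run_experience` are made-up names, not the Metaverse API):

```python
# Completion-state gating, the idea behind a 'Has completed sequence' check.
completed = set()  # IDs of experiences the cacher has finished

def run_experience(exp_id, prerequisites=()):
    """Run an experience only if all prerequisite experiences are done."""
    missing = [p for p in prerequisites if p not in completed]
    if missing:
        print(f"{exp_id}: complete {missing} first.")
        return
    print(f"{exp_id}: running...")
    completed.add(exp_id)

run_experience("stage 2", prerequisites=("stage 1",))  # blocked
run_experience("stage 1")                              # runs
run_experience("stage 2", prerequisites=("stage 1",))  # now runs
```

(In the real app the completion state would persist per player; this sketch just keeps it in memory.)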

Link to comment
On 6/15/2018 at 10:35 AM, Team Christiansen said:

Yay!! Finally, my TaleBlazer AR cache just got published.

I'm going to be in your neck of the woods July 6th. My son and I are flying in from Huntsville, AL to ultimately go to Yellowstone for a couple of days, and we will be spending the day in Ogden on the 6th. I look forward to finding your AR cache while there!

  • Upvote 1
Link to comment

I'm a developer, so of course I got all excited about creating my own experiences..... especially ones that can be used with a modern mobile browser... no app required. But then their clause of "Allow users to upload their own AR overlays" shut that down :( Granted, I could really go deep and develop something that lets others upload their own content.... hmmm, maybe a future idea if these become a permanent fixture.

 

Anyway, I would still be able to make it work using the app "Argon", so I have messaged my reviewer to ask whether that would be allowed, since users are able to provide their own weblink (i.e. upload their own overlay?), and to see if that qualifies. Guess I'll just have to wait and see; that way I hopefully won't be limited by the features of the other apps.

Link to comment

For those of you who are working with Metaverse, they have a community forum you can find here. Because so many geocachers were posting questions, I posted a feature request asking for a geocaching-specific topic category, which was granted (it can be found here). Because a single "experience" can only contain one waypoint, you have to "group" related experiences together to have multiple waypoints for a single cache. As such, I posted a question there -- well, 3 actually -- about putting it all together. I repost them here to let you know of the ongoing discussion there and to invite consideration here:

 

 

Quote

 

Dear fellow geocachers – three questions:

  1. What have you seen as the most effective way to reveal the final coordinates of your cache after a cacher opens up and interacts with all of your grouped experiences?

  2. What is the most effective way to ensure the cacher has actually interacted with each of the experiences you have grouped for the cache before revealing the final coordinates?

  3. If your cache requires opening each of the grouped experiences in a specific sequence, what is the most effective way of making that happen instead of just saying “do it in this order” on the cache page?

 

 

On a side note, TaleBlazer does not have a community forum, but it does have an email address for technical support. I have used it to ask them to give game creators the ability to use words other than "agent" in the heads-up view. They liked the idea and will implement it. I will now ask if they will create a community forum so that creators, such as geocachers, can ask questions and share ideas. I will let you know how that goes.

Link to comment

I've been able to tinker around a little more with Zap Logic, and I really like the builder; it seems capable of some pretty creative stuff. I haven't dug into anything too deep yet, but from what I've seen/used, I'm pretty impressed. But there are some cons:

 

As thebruce mentioned, the scan limit might be a real issue. Jumping from free to $45 a month, yeah, that's not going to happen for me. I just looked at my zapcode that I'm building and just from previewing, I've already scanned the code 15 times.

 

I was looking at building a multicache using the zapcodes as the waypoints, but without the ability to use the app offline, that puts too much trust in other cachers having a data connection where the cache is placed. I can live with just adding the zapcode to my cache page (I haven't published my zapcode yet, so right now if I scan the code without using the preview button, it just says "Coming Soon") and having the cacher go through the experience sitting at their desk, but I was hoping to make it a more interactive experience.

 

You can upload a tracking image, and if someone points their camera at the image, whatever you have as your experience will play against it as a backdrop. But you only get to choose one tracking image. I've created five scenes so far, but I can't seem to change the tracking image for each individual scene.

 

I guess I'm going to have to try out some of the other builders. None of the issues with Zap Logic are a deal-breaker for me, but I haven't tested any of the other platforms to any real extent, so maybe they offer more features with fewer restrictions.

 

Link to comment

I had been following the thread for a while and tried Metaverse and TaleBlazer a bit, so I got a rough idea of what can be done through these AR apps.

At this moment, I guess it is very similar to Wherigo, which allows a cacher to load an "experience" at a predefined waypoint, ask a few questions, something like that.

I just wonder, do any AR apps (the 3 allowed by HQ, or other potential apps) provide a feature similar to the one in the YouTube video?

Sorry, I don't know the exact term for this feature; please correct me if I am wrong.

 

Image overlay (am I correct?):

The feature allows a cacher to visit a predefined waypoint where there is a specific "object", maybe a sign or a poster (like the one in the YouTube video). When the app recognizes that "object", it displays extra information to the cacher.

 

 

Link to comment
19 minutes ago, Team Christiansen said:

 

Yes.

The app you demonstrated in the YouTube video is one of the approved apps. Aurasma has been renamed HP Reveal.

Wow, thanks for the quick reply.

Oh, HP Reveal... according to the summary of the different apps in a previous post, it seems HP Reveal can't add a "waypoint" (it's not location-based), am I correct?

Link to comment
8 hours ago, Team Christiansen said:

Dear fellow geocachers – three questions:

  1. What have you seen as the most effective way to reveal the final coordinates of your cache after a cacher opens up and interacts with all of your grouped experiences?

  2. What is the most effective way to ensure the cacher has actually interacted with each of the experiences you have grouped for the cache before revealing the final coordinates?

  3. If your cache requires opening each of the grouped experiences in a specific sequence, what is the most effective way of making that happen instead of just saying “do it in this order” on the cache page?

 

I don't know if it's the most effective way, but I did it this way for my simple technology-demonstration three-stage multi in Metaverse. There are three experiences simply named first stage, second stage and third stage, which sort of implies the order; it is also explained in the description of the cache. The third stage waypoint is set some way from the final location, and once the third stage experience is completed, the cacher is given a description of the cache location plus coordinates, hint and spoiler, each accessible on a separate button. This is the easiest way to find the cache, as it is in an area with many, many possible hiding places. If the cacher went to the third stage directly (using the built-in map on iOS), it would be much harder (but not entirely impossible). Of course, they don't know that the third stage waypoint is not at the final coordinates...

The stages are connected by tasks and passwords. The cacher receives instructions at the end of the first stage for what to do at the second stage (calculate something) and is asked just for the result (an input number). If they went directly to the second stage, they wouldn't know what the task is. They could in fact start at any stage, but then they would learn the hard way that the order is from the first to the third stage. ;)

  • Upvote 1
Link to comment
5 hours ago, wanrex said:

Oh, HP Reveal... according to the summary of the different apps in a previous post, it seems HP Reveal can't add a "waypoint" (it's not location-based), am I correct?

 

I thought the same, but yes, in the studio you can lock an aura to a GPS location. When you upload the tracking image, you can provide GPS coordinates.

 

6 hours ago, wanrex said:

Image overlay (am I correct?):

The feature allows a cacher to visit a predefined waypoint where there is a specific "object", maybe a sign or a poster (like the one in the YouTube video). When the app recognizes that "object", it displays extra information to the cacher.

 

Yeah, generally that's called a tracking image. The app scans the camera view until it recognizes the image, and it can then recognize the image's size and orientation in every frame in order to build a 3D representation of the 'world', where a virtual object can be placed over top relative to that tracking image, "augmenting" the camera view accurately.
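Under the hood, that recognition step is typically feature matching plus a homography: find distinctive points in the reference image, match them in each camera frame, and compute the transform that gives the image's size and orientation. A rough sketch of the detection half in Python with OpenCV (generic computer vision, not any particular AR platform's internals; requires opencv-python and numpy):

```python
# Generic tracking-image detection: ORB features + RANSAC homography.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

reference = cv2.imread("tracking_image.png", cv2.IMREAD_GRAYSCALE)
ref_kp, ref_desc = orb.detectAndCompute(reference, None)

def find_tracking_image(frame_gray):
    """Return the 3x3 homography mapping the reference into the frame,
    or None if the tracking image isn't confidently visible."""
    kp, desc = orb.detectAndCompute(frame_gray, None)
    if desc is None:
        return None
    matches = matcher.match(ref_desc, desc)
    if len(matches) < 15:  # too few matches to trust
        return None
    src = np.float32([ref_kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H  # encodes the image's position, size and orientation in the frame

# An AR engine decomposes H (together with the camera intrinsics) every frame
# to anchor virtual overlays to the sign or poster.
```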

 

11 hours ago, Crow-T-Robot said:

I just looked at my zapcode that I'm building and just from previewing, I've already scanned the code 15 times.

 

My thoughts on this: if the zapcode or link isn't available until a person is on site at GZ, then it won't get internet scans. You'd have to have at least 250 visitors per month (give or take, if people try it multiple times) to hit the cap. And I've verified that it's not a hard cap: Zapworks is notified when a user crosses 250, so they may make contact and encourage an upgrade, but the code won't just stop working after 250 scans. I'm really hoping they come up with a cheaper uncapped per-code subscription plan, if not some way to do it for free.

 

I've also nudged them to implement a location-lock feature for their zapcodes. One response from HQ told me that, as of that email, Zappar wouldn't be approved because the experiences make no use of GPS location data. That hasn't stopped me from playing around with it, because if that's all that's needed, I'm hoping Zap adds that feature.

 

I'm also toying with a concept I've informed HQ about, and I'm just waiting for some kind of response, but I think (hope) they're discussing it...

Link to comment
7 hours ago, thebruce0 said:

 

I've also nudged them to implement a location-lock feature for their zapcodes. One response from HQ told me that, as of that email, Zappar wouldn't be approved because the experiences make no use of GPS location data

 

That's... disappointing. So if I were able to add the zapcode image to my cache page, and a person scanned the code on my page while at home, went through the experience I designed, obtained the coordinates, and then used their GPS to navigate to the cache, that wouldn't be approved?

 

I think I'm gonna go back and re-read the AR guidelines from the first post. Maybe I skimmed over the part that pointed out that an AR cache had to be done in the field.

Link to comment

Correct, the guidelines now include the point that the AR itself needs to provide an augmented overlay on the camera view and have a location-based element. So it can't be a form of puzzle that can be solved anywhere, or a play-anywhere Wherigo (which even so still uses GPS); the AR experience itself has to make use of GPS.

Link to comment
8 minutes ago, thebruce0 said:

Correct, the guidelines now include the point that the AR itself needs to provide an augmented overlay on the camera view and have a location-based element. So it can't be a form of puzzle that can be solved anywhere, or a play-anywhere Wherigo (which even so still uses GPS); the AR experience itself has to make use of GPS.

From how I understand their comments, that statement would be incorrect. You could have a play/solve-anywhere AR experience as long as a second part includes a GPS element. This could be, but is not limited to, using the AR to receive the final coords and then navigating via GPS to the final, like many other puzzles. I make this assumption based on the notes listed below from this page: https://www.geocaching.com/help/index.php?pg=kb.chapter&id=127&pgid=921

 

GPS usage is required for at least part of the search. GPS requirements are met, if:

  • Players need to navigate to GPS coordinates.
  • The AR experience is set to be location dependent, meaning that you can only see it at the location specified in the AR app.
Link to comment
1 hour ago, MTCLMBR said:

From how I understand their comments, that statement would be incorrect. You could have a play/solve-anywhere AR experience as long as a second part includes a GPS element. This could be, but is not limited to, using the AR to receive the final coords and then navigating via GPS to the final, like many other puzzles. I make this assumption based on the notes listed below from this page: https://www.geocaching.com/help/index.php?pg=kb.chapter&id=127&pgid=921

 

GPS usage is required for at least part of the search. GPS requirements are met, if:

  • Players need to navigate to GPS coordinates.
  • The AR experience is set to be location dependent, meaning that you can only see it at the location specified in the AR app.

 

By quoting the Help Center article, I don't know that you contradicted thebruce0 at all.

 

Looking at the article's comments on GPS:

 

Quote

 

To be allowed in Mystery Cache designs, an AR app must

  • ...
  • Allow location based AR experiences

...

GPS usage is required for at least part of the search. GPS requirements are met, if:

  • Players need to navigate to GPS coordinates.
  • The AR experience is set to be location dependent, meaning that you can only see it at the location specified in the AR app.

 

 

It seems to me that the easiest way to describe the "location based" requirement is to just say that, at some point in the program and for whatever purpose, the AR app itself actually accesses your device's GPS function to somehow further the game/experience.

Edited by Team Christiansen
spelling -- again
Link to comment

This is what I received from HQ:

Quote

One of the requirements for an AR app to be allowed in the AR experiment is that the app must "Allow location based AR experiences". Looking through the documentation for widgets, designer and studio I was not able to find a way to make that a location based AR experience (i.e. an AR experience that is only viewable when at the coordinates).

 

...Other than only providing the zapcode when physically at the location - and even then, if someone has the link or code anywhere else, they can run it away from the location - Zappar doesn't have location-based scripting.

 

Now, I have looked deep into Studio to find out if the script has access to the GPS location, so you could programmatically lock the AR to a location - that may be the solution - but the web-based widget and designer variants have no GPS ability. The documentation has no hits for GPS and nothing relevant for latitude or longitude; the forum has one person asking about GPS location, and one response implying it should be possible but without supplying any information about how.

 

In short, if it's possible, it's (currently) only in the most complex variant, only programmatically, and there doesn't seem to be any information as to how =P

 

It does seem rather odd that Zapworks doesn't include the ability to make use of gps.

 

 

On a similar note, Metaverse recently removed the functionality that exposed the GPS location to dynamic scripting, so the only GPS use Metaverse includes is the ability to lock an experience to within 40m of a GPS location. And that's it. I'm hoping they reinstate the user location values for dynamic scripting...
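For anyone wondering what that 40m lock boils down to, it's just a great-circle distance check between the device's reported position and the anchor coordinates. A minimal sketch in Python using the haversine formula (illustrative only; Metaverse doesn't publish its implementation):

```python
# Haversine-based proximity lock, the idea behind a "within 40m" gate.
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def experience_unlocked(device, anchor, radius_m=40.0):
    """True if the device's (lat, lon) is within radius_m of the anchor."""
    return haversine_m(*device, *anchor) <= radius_m

# Roughly 30 m east of the anchor at the equator -> unlocked:
print(experience_unlocked((0.0, 0.00027), (0.0, 0.0)))  # True
```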

Link to comment
19 hours ago, MTCLMBR said:

GPS usage is required for at least part of the search. GPS requirements are met, if:

  • Players need to navigate to GPS coordinates.
  • The AR experience is set to be location dependent, meaning that you can only see it at the location specified in the AR app.

 

To me, this isn't clear: do BOTH of these bullet points need to be true, or is fulfilling just one of them sufficient? In other words, is there an implied AND or OR between the bullets?

 

One near me was published yesterday. The AR is NOT location based; you can play it anywhere. However, the first thing you need to do is find a trackable item at specific coordinates and enter its code, and that uses GPS coordinates. After that, it leads you from one place to the next. So whilst you can "play it" anywhere, you can't get the final coordinates by playing at home. In this case, the AR is "logically" location dependent, even if it is not set as "location locked". So you could argue it meets both criteria.

 

I've seen others where the AR is designed to be played completely anywhere. You can play it at home, and when you complete it you get coordinates. Perhaps the intent is that these should not be allowed.

 

 

 

Link to comment
3 hours ago, redsox_mark said:

 

To me, this isn't clear: do BOTH of these bullet points need to be true, or is fulfilling just one of them sufficient? In other words, is there an implied AND or OR between the bullets?

 

One near me was published yesterday. The AR is NOT location based; you can play it anywhere. However, the first thing you need to do is find a trackable item at specific coordinates and enter its code, and that uses GPS coordinates. After that, it leads you from one place to the next. So whilst you can "play it" anywhere, you can't get the final coordinates by playing at home. In this case, the AR is "logically" location dependent, even if it is not set as "location locked". So you could argue it meets both criteria.

 

I've seen others where the AR is designed to be played completely anywhere. You can play it at home, and when you complete it you get coordinates. Perhaps the intent is that these should not be allowed.

 

 

 

I am new to AR and have only created 2 so far in Metaverse. Both require going to a physical location to activate the experience. However, I have a friend whose Metaverse experiences are both play-anywhere (because we live in areas with sketchy phone reception), and one of his experiences does reveal coordinates to the physical container. So I interpret the requirement as covering any use of GPS to get to the final geocache container.

Link to comment

This certainly needs to be clarified.

Because the example I provided HQ and asked about (Zappar) was specifically declined because it does not use GPS in the AR experience.

 

Essentially, at this point Zappar AR experiences are inherently play-anywhere. But the implication was that play-anywhere AR is not allowed.

 

However, a play-anywhere AR experience can still use GPS, just anchored to wherever you start. Zappar doesn't use GPS at all in the experience. That may be the difference.

 

6 hours ago, redsox_mark said:

I've seen others where the AR is designed to be played completely anywhere. You can play it at home, and when you complete it you get coordinates.

 

Yeah, if it still uses GPS location in the experience, then it's good to go. If it's just an image-overlay AR like Zappar, then that's different.

 

Edited by thebruce0
Link to comment
1 hour ago, J Grouchy said:

The first AR cache in Georgia was a bit... underwhelming. I'm going to give it some thought and see if I can come up with something a bit more interesting.

 

Yeah, I think a lot of people just wanted to get something out there and be one of the first with an AR cache, without actually putting much research or creativity into it.

The first one I found was just a one-step character asking a question. Correct answer, and you got the coordinates. Nothing uniquely AR other than the character floating on the camera.

Link to comment
20 minutes ago, thebruce0 said:

 

Yeah, I think a lot of people just wanted to get something out there and be one of the first with an AR cache, without actually putting much research or creativity into it.

The first one I found was just a one-step character asking a question. Correct answer, and you got the coordinates. Nothing uniquely AR other than the character floating on the camera.

 

The one I saw was just holding the camera up to a specific electrical meter box and the coordinates would pop up on the screen.  ~yawn~

Link to comment

Finally got my answer from HQ about Argon and about developing your own AR experience. This is their response:

Quote
Thanks for your message.
We have looked at Argon and decided that we will not allow it for the AR experiment at this time. The reason is that it is not an AR app that provides a complete solution, but rather an AR capable browser that requires coding to set up an AR experience. As such it makes the process of creating AR experiences quite difficult and restricts the use to those that have coding skills.
As an alternative, would it be an option for you to use an SDK such as ARToolKit or Wikitude? Creating an AR experience that is viewable in multiple browsers, e.g. Chrome or Firefox, would be guideline compliant and would not need to be part of the exception provided by the AR experiment.
I hope this helps.
Happy geocaching,
Geocaching HQ Admin

 

Link to comment

That sort of makes sense. The other AR products have graphical interfaces, and some have scripting abilities for more complex stuff. One that is scripting-only, and only within a certain browser, seems pretty restrictive. But it is a subjective judgment on their part: "too difficult", rather than a hard-line lack of a required capability. Hm.

Link to comment
On 6/22/2018 at 5:06 PM, J Grouchy said:

 

The one I saw was just holding the camera up to a specific electrical meter box and the coordinates would pop up on the screen.  ~yawn~

I was rather afraid that would be the norm as people rushed to publish. I do hope the first one I put out does not fall into that category.

Link to comment
11 minutes ago, Asgoroth said:
On 6/22/2018 at 2:06 PM, J Grouchy said:

The one I saw was just holding the camera up to a specific electrical meter box and the coordinates would pop up on the screen.  ~yawn~

I was rather afraid that would be the norm as people rushed to publish. I do hope the first one I put out does not fall into that category.

To be fair, some of the early Geocaching Challenges (not to be confused with Challenge Caches) were not very impressive, including some of the locationless ones created by Groundspeak. But after a while, people figured out how to do interesting things with them. Some of the ones I saw when I finally decided to take another look at them were actually pretty interesting. And then Groundspeak pulled the plug, and Geocaching Challenges vanished without a trace.

 

For AR caches, Groundspeak even came right out and said that they might be archived once "this experiment" has concluded.

Link to comment
On 6/22/2018 at 5:06 PM, J Grouchy said:

 

The one I saw was just holding the camera up to a specific electrical meter box and the coordinates would pop up on the screen.  ~yawn~

 

I imagined a target such as a building or something, where a clue like "100 feet tall" then pops up. Not to dis the coordinates on the meter, I'm sure that's fine :P, but I'm anticipating an amazing first impression when the clue pops up.

 

 

Edited by kunarion
Link to comment
On 6/22/2018 at 9:45 PM, thebruce0 said:

 

Yeah, I think a lot of people just wanted to get something out there and be one of the first with an AR cache, without actually putting much research or creativity into it.

The first one I found was just a one-step character asking a question. Correct answer, and you got the coordinates. Nothing uniquely AR other than the character floating on the camera.

 

I like this AR idea; I've created one and found 5 so far. The ones I found were good: a good story, multiple stages, good locations, etc., all using Metaverse. But really, the only thing that is "uniquely AR" in any of them is floating characters and a bit of Google vision to recognise objects. Otherwise, the same could have been done as a Wherigo. It seems a nice toolkit, and an alternative to Wherigo for creating interactive adventures, but the "AR" aspect itself doesn't seem a big deal to me. I've only looked at Metaverse so far, so maybe some of the other toolkits offer a more unique experience.

 

 

Link to comment
6 hours ago, redsox_mark said:

I like this AR idea; I've created one and found 5 so far. The ones I found were good: a good story, multiple stages, good locations, etc., all using Metaverse. But really, the only thing that is "uniquely AR" in any of them is floating characters and a bit of Google vision to recognise objects. Otherwise, the same could have been done as a Wherigo. It seems a nice toolkit, and an alternative to Wherigo for creating interactive adventures, but the "AR" aspect itself doesn't seem a big deal to me. I've only looked at Metaverse so far, so maybe some of the other toolkits offer a more unique experience.

 

Yes, exactly. Per my list earlier, each allowable app has an element or implementation of AR, even if it's the most basic form of overlaying graphics on the camera:


Metaverse: Wherigo-style experience with a basic AR overlay for interactions at waypoints

HP Reveal: tracking-image-anchored, single-location AR

TaleBlazer: Wherigo-style experience with an even more basic AR overlay showing other waypoints on screen

 

I am trying something a little different that is about 90% complete. Functionally it's done; I'm just tightening up the 'lore' aspect :)

 

What would be best is an AR app that has all of the above: a scripted experience like Wherigo, with the ability to overlay 3D AR elements on the camera view as if they were in the real world, and without the need for a marker or tracking image. We need that latter level of AR, compatible with iPhone and Android, in one app, for a full AR experience.

 

An example that's at least fully functional on iOS is Jurassic World: Alive, which renders dinosaur models in the real-world environment. It uses ARKit to automatically create a 3D representation of the camera view behind the scenes, live, in which dinosaurs can be placed, anchored to a surface, and viewed. So as you move around, you can look anywhere and they'll remain in place. I recently whipped up a fun little tech-demo-ish video playing with the game. That's the sort of AR that's next on the smartphone horizon (it takes a higher-tech phone, though, with lots of CPU).

Link to comment

I solved my first AR cache today. I will tell you my experience, although I need to preface it by saying I use cheap phones. I downloaded the app (Metaverse) and was informed the app might not work right until I updated Google Play, which I was not able to do. Anyway, I ran outside and started the game. All I could see on my phone was a camera view of my backyard. Immediately, three responses showed up for me to choose from (no question, just three options to choose). I picked one, and about 10 seconds later I got another choice of 3 "answers", again with no question or reference, so it was just guessing. I was told I was wrong, to try again, so I did and got it right. I hit continue. All this time, nothing was on my screen but my backyard. I walked around the driveway to see if anything would happen, and then the final coordinates appeared with a little Congrats note. Huh? I didn't do anything!!

 

I guessed at one answer (no idea what the question was), guessed at a second answer (no idea what the question was), and then my coordinates showed up.

 

I have no doubt the experience is much more than that, but that's all I got. And this has nothing to do with the CO or the game, I'm sure of that. This is just MY experience.

Link to comment
7 hours ago, Max and 99 said:

I solved my first AR cache today. I will tell you my experience, although I need to preface it by saying I use cheap phones. I downloaded the app (Metaverse) and was informed the app might not work right until I updated Google Play, which I was not able to do. Anyway, I ran outside and started the game. All I could see on my phone was a camera view of my backyard. Immediately, three responses showed up for me to choose from (no question, just three options to choose). I picked one, and about 10 seconds later I got another choice of 3 "answers", again with no question or reference, so it was just guessing. I was told I was wrong, to try again, so I did and got it right. I hit continue. All this time, nothing was on my screen but my backyard. I walked around the driveway to see if anything would happen, and then the final coordinates appeared with a little Congrats note. Huh? I didn't do anything!!

 

I guessed at one answer (no idea what the question was), guessed at a second answer (no idea what the question was), and then my coordinates showed up.

 

I have no doubt the experience is much more than that, but that's all I got. And this has nothing to do with the CO or the game, I'm sure of that. This is just MY experience.

 

Did you get any messages on the screen to turn left (or right)? Were you holding the phone upright?

 

I've found with Android that when there is a "character question and answer", it shows the answer options immediately. So it is possible to see the answers without ever seeing the characters to get the question. Which sounds like your experience.

 

With the iPhone, it doesn't show the answer choices until you turn the phone to see the characters.  

 

It's possible it doesn't work at all on your phone. Or you may just need to get used to how you have to hold the phone to see the questions and characters. Hold the phone upright and move it around in a circle at arm's length until you see a character.

Link to comment

Did you get any messages on the screen to turn left (or right)? Were you holding the phone upright? Yes, I was holding the phone upright (like I was taking a picture). No messages at all appeared about turning, and I saw no characters, if there were any. No questions.

 

I've found with Android that when there is a "character question and answer", it shows the answer options immediately. So it is possible to see the answers without ever seeing the characters to get the question. Which sounds like your experience.

 

With the iPhone, it doesn't show the answer choices until you turn the phone to see the characters.  

 

It's possible it doesn't work at all on your phone. Or you may just need to get used to how you have to hold the phone to see the questions and characters. Hold the phone upright and move it around in a circle at arm's length until you see a character. I held the phone upright and walked around. I also held the phone upright and turned in a circle. Saw nothing.

Link to comment

Welllllll, my first attempt on a multi was a bust... the app took a dump x3 on a morning attempt. Went back later in the evening for a second shot at specialness and had x2 more glitchy app experiences. Between "frustrations" I had the opportunity to do a single-stage variant and had a most positive experience.

 

Based upon the negative experience over extended visits to the same multi... I am not certain I will have another ride on the MULTI merry-go-round.

Link to comment
On 6/30/2018 at 4:56 AM, Max and 99 said:

All I could see on my phone was a camera view of my backyard. Immediately, three responses showed up for me to choose from (no question, just three options to choose). I picked one, and about 10 seconds later I got another choice of 3 "answers", again with no question or reference, so it was just guessing.

 

In Legacy rendering on iOS, the app tells you to turn left or right to see the character. I haven't figured out, though, how it chooses the direction in which the character is displayed when the experience starts.

 

The Metaverse app seems to behave in very different ways depending on what kind of hardware and software (ARKit on iOS, some AR support on Android) you use. For my AR cache, it seems that overly sophisticated AR support is not an advantage. :cute: The best way to do it on iOS is to switch to "Legacy rendering". We have to remember that these apps were not developed with geocaching in mind, so some nice AR features turn out a bit awkward. For example, with ARKit on, the character is displayed at a size proportional to the distance from the waypoint. For my cache, that means that at maximum distance from the waypoint, Signal is displayed so small that people can't see it against the lush green background. Even the treasure chest in the last step is hard to see unless the cacher is near the waypoint. In Legacy rendering, the character size doesn't change.

Also, on an Android device with some additional AR support (I have no idea what it is; I've never owned an Android phone; it's something one of the cachers downloaded in the field when I was giving him a tour), I've seen the character and dialog shown mirrored when the cacher moves past the waypoint, so we had to move back to the "right" side of the waypoint to be able to solve the stage... On that Android phone, the character and the dialog were also rendered too large for the screen very close to the waypoint. The map on Android seems to be working now, which greatly improves the experience with multiple stages.

 

I guess we all need to cut some slack to both the technology and the cache owners. It's still the early phase of testing this technology in geocaching. I've seen the question of why someone would use this if the same could be done (possibly better) in Wherigo. My answer would be: why not? We have a unique opportunity to try something new in geocaching this summer.

  • Upvote 1
Link to comment
On 6/30/2018 at 4:56 AM, Max and 99 said:

I solved my first AR cache today. I will tell you my experience, although I need to preface it by saying I use cheap phones. I downloaded the app (Metaverse) and was informed the app might not work right until I updated Google Play, which I was not able to do. Anyway, I ran outside and started the game. All I could see on my phone was a camera view of my backyard. Immediately, three responses showed up for me to choose from (no question, just three options to choose). I picked one, and about 10 seconds later I got another choice of 3 "answers", again with no question or reference, so it was just guessing. I was told I was wrong, to try again, so I did and got it right. I hit continue. All this time, nothing was on my screen but my backyard. I walked around the driveway to see if anything would happen, and then the final coordinates appeared with a little Congrats note. Huh? I didn't do anything!!

 

I guessed at one answer (no idea what the question was), guessed at a second answer (no idea what the question was), and then my coordinates showed up.

 

I have no doubt the experience is much more than that, but that's all I got. And this has nothing to do with the CO or the game, I'm sure of that. This is just MY experience.

Looks like an experience without a location added to it, which results in quite a pointless user experience imo. I'm surprised the guidelines allow for it.

Link to comment
On 7/1/2018 at 6:51 AM, icabrian said:

 

The Metaverse app seems to behave in very different ways depending on what kind of hardware and software (ARKit on iOS, some AR support on Android) you use. For my AR cache, it seems that overly sophisticated AR support is not an advantage. :cute: The best way to do it on iOS is to switch to "Legacy rendering".

 

How do you turn "Legacy Rendering" on or off?

 

On my own AR cache, on my iPhone, the characters are always the same size. Same on my wife's Android. But on a friend's Android, the characters got larger closer to the waypoint.

 

Link to comment
