
Hands-on with an alpha build of Google Maps' Augmented Reality mode – TechCrunch

I think most of us have had this experience, especially in a big city: you step out of a transit station, glance at Google Maps to see which way you're supposed to walk … and then somehow end up two blocks in the wrong direction.

Maybe the little blue dot wasn't in the right place yet. Maybe your phone's compass was borked and pointing the wrong way, because you're surrounded by 30-story buildings full of metal and other things that compasses hate.

Google Maps' Augmented Reality mode is built to end that scenario. It overlays arrows and signage onto your camera's view of the real world, and compares that view against Google's vast collection of Street View imagery to work out exactly where you're standing and which way you're facing, even when your GPS and/or compass are a bit off. It's currently in alpha testing, and I spent some hands-on time with it this morning.

A glimpse of what it looks like in action:

Google first announced AR walking directions at its I/O conference about nine months ago, but it has been pretty quiet about the feature since then. Much of that time was spent working out the subtleties of the interface. When an early version drew the route as a line on the ground, users tried to walk directly on top of the line, even when that wasn't necessary or safe. When the team tried floating particle effects in the air to depict paths and turns (as in an early prototype), a Google UX designer tells us, one user asked why they were following "floating trash."

The Maps team also learned that nobody wants to hold their phone up for very long. The whole experience has to be quick, and it's designed for short bursts; if you hold the camera up for too long, the app prompts you to put it down.

Firing up the AR mode feels like starting any other Google Maps trip: punch in your destination and tap the directions button, but instead of "Start," tap the new "Start AR" button.

A view from your camera appears on the screen, and the app prompts you to point it at the buildings across the street. As you do, a scattering of dots appears as it recognizes building features and landmarks that could help pin down your location. Quite quickly (a few seconds, in our handful of tests) the dots disappear, and a set of arrows and markers appears to show you the way. A small cutaway at the bottom shows your current location on the map, which makes the transition from camera mode back to map mode a little less jarring.

If you drop the phone into a more natural position (lower, and closer to parallel with the ground, as if you were reading it while walking) Google Maps switches back to the standard 2D map. Hold the phone up as if you were taking a portrait photo of what's in front of you, and AR mode kicks back in.
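The raise-to-activate behavior described above can be sketched as a simple tilt check. Google hasn't published how its implementation actually works; the thresholds, mode names, and hysteresis band below are assumptions for illustration only.

```python
# Hypothetical pitch thresholds in degrees; the real cutoffs are unknown.
AR_ON_PITCH = 60.0   # phone held upright, camera facing forward
AR_OFF_PITCH = 30.0  # phone tilted down, as when reading while walking


def next_mode(current_mode: str, pitch_deg: float) -> str:
    """Switch between "map" and "ar" based on how upright the phone is.

    Using two separate thresholds (hysteresis) keeps the UI from
    flickering when the phone hovers near a single cutoff angle.
    """
    if current_mode == "map" and pitch_deg >= AR_ON_PITCH:
        return "ar"
    if current_mode == "ar" and pitch_deg <= AR_OFF_PITCH:
        return "map"
    return current_mode
```

With these values, raising the phone past 60° enters AR, and it stays in AR until the phone dips below 30°, so small wobbles in between don't toggle the view.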

In our brief test (about 45 minutes in total) the feature worked as promised. It definitely works better in some scenarios than others: if you're near the curb with a clear view of the buildings across the street, it pins down your location quite quickly and with ridiculous accuracy. If you're in the middle of an open plaza, it can take a few seconds longer.

Google's decision to frame this as something you should only use for a few seconds at a time is the right one. Nobody wants to wander a city viewed through their phone's camera lens, whether because it makes you an easy target for would-be phone thieves or because you might walk into a light pole. I can see myself using it for the first block or two of a trip, to make sure I'm starting off in the right direction; after that, an occasional glance at the standard map is enough. It's about getting your bearings, not holding your hand the whole way.

Google didn't give a deep technical breakdown of how the system works, but in short: it takes the view from your camera and sends a compressed version to the cloud, where it's analyzed for unique visual features. Google already has a rough idea of where you are from your phone's GPS signal, so it can compare the camera view against nearby Street View data, looking for things it expects to be around you (distinctive building features, statues, or other permanent structures) and working backwards from those matches to your exact position and heading. There's also a lot of machine learning voodoo involved in ignoring things that are salient but not necessarily permanent, like trees, large parked vehicles, and construction.
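The matching step described above can be illustrated with a toy sketch. This is not Google's actual system: the feature labels, the mini Street View index, and the set-overlap scoring are all invented for illustration; the real pipeline operates on learned visual descriptors, not neat strings.

```python
# Feature classes treated as transient and ignored during matching
# (an assumption; the article only says such objects are filtered out).
TRANSIENT = {"tree", "parked_truck", "construction"}

# Hypothetical mini-index of Street View locations near the GPS fix,
# mapping each candidate spot to the permanent features visible there.
street_view_index = {
    "5th_and_main": {"arched_doorway", "clock_tower", "bronze_statue"},
    "6th_and_main": {"glass_facade", "clock_tower", "fire_hydrant"},
}


def localize(camera_features: set) -> str:
    """Return the candidate location whose indexed features best
    overlap what the camera sees, after dropping transient objects."""
    stable = camera_features - TRANSIENT
    return max(street_view_index,
               key=lambda loc: len(street_view_index[loc] & stable))
```

The GPS fix is what keeps this tractable: instead of searching all of Street View, the system only has to score a handful of nearby candidates.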

The feature is currently rolling out for feedback to "Local Guides," an opt-in group of users who contribute reviews, photos, and places, and help Google verify location information, in exchange for early access to features like this one.

Google repeatedly told us it has no timeline for when the feature will roll out beyond that group.
