You may remember that the launch of Apple Maps went poorly. After a rough first impression, an apology from the CEO, several years of patching holes with data partnerships, and a few bright spots with long-awaited transit directions and improvements to business, parking and location data, Apple Maps is still not where it needs to be to be considered a world-class service.
Maps needs fixing.
Apple, it turns out, is aware of this, and it is rebuilding the map part of Maps.
Every version of iOS will eventually receive the updated maps, which will respond more quickly to changes in roads and construction. They are also visually richer, depending on the context in which they're viewed, with more detailed ground cover, foliage, pools, pedestrian pathways and more.
This is nothing less than a full reset of Maps, and Apple began developing its new data-gathering systems four years ago. Eventually, Apple will no longer need to rely on third-party data as the foundation of its maps — one of the biggest pitfalls from the very beginning.
"Since we launched this six years ago — we won't rehash all the problems we had when we launched it — we've made big investments to bring the map up to date," says Apple SVP Eddy Cue, who oversees Maps, in an interview last week. "When we started, much of it was about directions: finding a place and getting directions to that place. We've made big investments to make millions of changes, to add millions of locations, and to update and change the map more frequently — all of those things over the past six years."
But, says Cue, Apple still has plenty of room to improve the quality of Maps — something most users would agree with, even given its recent progress.
"We wanted to take this to the next level," says Cue. "We've been working to create what we hope will be the best map app in the world, and to take it to the next step — which means building our own map data from scratch."
In addition to Cue, I spoke this week with Apple VP Patrice Gautier and more than a dozen members of the Apple Maps team at the company's mapping center in California about how it is rebuilding Maps in keeping with Apple's very public privacy stance.
If you're wondering whether Apple thought about building its own maps before launching Maps, the answer is yes. At the time, the decision was whether it wanted to be in the maps business at all. With the future of mobile becoming clear, Apple knew that mapping would touch nearly every aspect of its devices, from photos to directions to location services for apps. Once the decision was made, Apple went ahead and built a product based on a patchwork of data from partners such as TomTom, OpenStreetMap and other geo-data brokers. The result was underwhelming. Almost immediately after Maps launched, it became clear that Apple would need help, and it signed on a number of additional data providers to fill gaps in location, base map, point-of-interest and business data.
It was not enough.
"We decided to do this just over four years ago. We said, 'Where do we want to take Maps? What are the things we want to do in Maps?' We realized that, given what we wanted to do and where we wanted to take it, we needed to do it ourselves," says Cue.
Because maps are so important to so many features, success wasn't tied to a single function. Maps needed to be great for transit, driving and walking — and to serve as a utility for apps that use location services and for other system features.
Cue says Apple needs to own all of the data that goes into making a map, both for quality and for privacy reasons.
There is also the matter of corrections, updates and changes, which enter a long loop of validation when you work with external partners. The Maps team needed to be able to correct roads, paths and other features in days or less, not months — not to mention the competitive advantage of being able to build and update traffic data from hundreds of millions of iPhones, rather than relying on partner data.
Cue points to the proliferation of iOS devices, which now number in the hundreds of millions, as a crucial factor in the timing of the project.
"We felt that because the shift to devices had happened — building a map today the way we were traditionally doing it, we were able to significantly improve things, and improve them in different ways," he says. "One is being able to update the map faster, based on the data and the things we see, as opposed to driving again or relying on information that customers proactively tell us. What if we could actually see it before all of those things?"
I asked him about the pace of Maps updates and whether this new philosophy means faster changes for users.
"The truth is that Maps needs to be [updated more], and it is even today," says Cue. "With our new maps we're doing even more, [with] the ability to change the map in real time and often. We do that every day today. And this extends that, making it possible for us to change even the parts of the map that take longer to update.
"For example, a road network takes a much longer time to change. In the new map infrastructure, we can change that relatively quickly. When a new street opens, we can see it right away and make that change very, very fast. It's much, much faster to make changes in the new map environment."
So a new effort was begun to build its own base maps — the lowest building block of any truly good mapping system. After that, Apple would begin gathering live location data, high-resolution satellite imagery and brand-new, high-resolution image data from its ground vehicles, until it had what it considered a "best-in-class" mapping product.
There is really only one other big company on Earth that has built an entire map stack from scratch: Google.
Apple knew it would have to be the second.
Apple vans spotted
Though the overall project began earlier, the first glimpse most people got of Apple's renewed effort to build the best possible map product came from the vans that started appearing on streets in 2015, "Apple Maps" signs on their sides. Equipped with cameras and sensors, these vans showed up in various cities and sparked rampant discussion and speculation.
The new Apple Maps is the first to use the data collected by these vans to build and inform its maps. This is their coming-out party.
Some people have noted that Apple's rigs look more robust than the simple GPS-plus-camera arrangements on other companies' mapping vehicles — going so far as to suggest they look more like something that could be used to train autonomous vehicles.
Apple does not comment on autonomous vehicles, but there's a reason the arrays look more advanced: they are.
Earlier this week, I rode along in one of the vans as it ran a test route, collecting the kind of data needed to build the new maps. Here's what's inside.
In addition to a beefed-up GPS rig on the roof, four LiDAR arrays mounted at the corners and eight cameras shooting overlapping high-resolution images, there is also a standard physical measuring tool attached to a rear wheel that allows precise tracking of distance and image capture. In the back there is a surprising lack of bulky equipment. Instead, it's an uncomplicated Mac Pro bolted to the floor, connected to an array of solid-state storage drives. A single USB cable runs up to the dashboard, where the actual mapping-capture software runs on an iPad.
While mapping, a driver drives while an operator handles the route, ensures that the assigned coverage area is fully covered and monitors the image capture. Each drive captures thousands of images as well as a full point cloud (a 3D map of space defined by dots that represent surfaces) and GPS data. I later viewed the raw data in 3D, and it looks absolutely like the quality of data you'd need to train autonomous vehicles.
More about why Apple needs this amount of data later.
As the images and data are captured, they are encrypted on the fly and recorded onto the SSDs. Once full, the SSDs are pulled out, replaced and packed into a case that is shipped to an Apple data center, where a suite of software scrubs private information such as faces and license plates from the images. From capture through cleanup, the data remains encrypted, with one key in the vehicle and the other key in the data center. Technicians and software that are part of the mapping effort never see unsanitized data.
This is just one element of Apple's focus on the privacy of the data it uses in the new Maps.
Probe Data and Privacy
In every conversation I had with members of the team throughout the day, privacy was brought up — emphatically. This is clearly by design; Apple wants to impress upon me, as a journalist, that it takes this seriously. But that doesn't change the fact that privacy is evidently built in from the ground up, and I couldn't find a false note in any of the technical claims or the conversations I had.
Indeed, from the data-security people to the people whose job it is to make the maps genuinely good, the constant refrain is that Apple does not feel it is held back in any way by declining to suck up, store and analyze every piece of customer data it could.
The consistent message is that the team believes it can deliver a high-quality navigation, location and mapping product without the directly personal data that other platforms use.
"We specifically don't collect data, even from point A to point B," notes Cue. "We collect data — when we do it — in an anonymous fashion, in subsections of the whole, so we couldn't even say that there is a person that went from point A to point B. We collect segments of it. Honestly, we don't think it would buy us anything [to collect more]. We're not losing any features or capabilities by doing this."
The segments he's referring to are sliced out of any given person's navigation session. Neither the beginning nor the end of any trip is ever transmitted to Apple. Rotating identifiers, not personal information, are assigned to any data or requests sent to Apple, and it augments the "ground truth" data provided by its own mapping vehicles with this "probe data" sent back by iPhones.
Because only random segments of any person's drive are ever sent, and that data is completely anonymized, there is never a way to tell whether any given trip belonged to a single individual. The local device signs the identifiers, and it alone knows whom an ID refers to. Apple is working very hard here to not know anything about its users. This kind of privacy can't be bolted on at the end; it has to be woven in at the ground level.
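To make the idea concrete, here is a minimal sketch in Python of how trip segmentation with rotating identifiers might work. It is purely illustrative — Apple has not published its implementation, and the function name, segment sizes and data shapes below are all assumptions:

```python
import secrets

def anonymize_trip(points, segment_len=20, trim=10):
    """Illustrative only: chop a trip's GPS trace into short segments,
    drop the start and end entirely, and tag each segment with a fresh
    random identifier so segments cannot be linked to each other."""
    # Discard the beginning and end of the trip — they are never sent.
    core = points[trim:-trim]
    segments = []
    for i in range(0, len(core) - segment_len + 1, segment_len):
        segments.append({
            # A new random ID per segment ("rotating identifier"):
            # no user ID, no trip ID, no way to join segments back up.
            "id": secrets.token_hex(8),
            "points": core[i:i + segment_len],
        })
    return segments

# A fake 100-point drive through San Francisco.
trip = [(37.77 + i * 1e-4, -122.41) for i in range(100)]
segs = anonymize_trip(trip)

# Neither the first nor the last GPS fix of the trip survives.
assert trip[0] not in [p for s in segs for p in s["points"]]
assert trip[-1] not in [p for s in segs for p in s["points"]]
```

The key property, mirroring the article's description, is that the server receives disconnected slices under unlinkable IDs rather than a start-to-finish trace.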
Because Apple's business model does not depend on serving you, say, an ad for a Chevron along your route, it doesn't even need to tie advertising identifiers to users.
Any personalization or Siri requests are handled by the on-board processor of the iOS device. So if you get a drive notification telling you it's time to leave for your commute, that is learned, remembered and delivered locally — not from Apple's servers.
That's not new, but it's important to note given what comes next: Apple is harnessing the power of millions of iPhones passively and actively improving its map data in real time.
In short: traffic, real-time road conditions, road networks, new construction and changes in pedestrian walkways are about to get a lot better in Apple Maps.
The secret sauce here is what Apple calls probe data: essentially, little slices of vector data representing direction and speed, transmitted back to Apple fully anonymized, with no way to tie them to a specific user or even a specific trip. It gathers a minuscule amount of data from millions of users, giving Apple a holistic, real-time picture without compromising user privacy.
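Here is a hedged sketch of what aggregating that kind of probe data could look like on the receiving end. Apple's actual pipeline is not public; the segment names, threshold and data format below are invented for illustration:

```python
from collections import defaultdict
from statistics import median

# Each probe is (road_segment_id, speed_kmh, heading_deg) —
# no user identifier, no trip identifier, just a vector slice.
probes = [
    ("market_st_3", 12.0, 90), ("market_st_3", 14.5, 88),
    ("market_st_3", 11.0, 91), ("oak_st_7", 55.0, 270),
]

def live_speeds(probes, min_count=3):
    """Median observed speed per road segment, reported only where
    enough anonymous probes exist to say anything meaningful."""
    by_segment = defaultdict(list)
    for seg, speed, _heading in probes:
        by_segment[seg].append(speed)
    return {seg: median(v) for seg, v in by_segment.items() if len(v) >= min_count}

print(live_speeds(probes))  # {'market_st_3': 12.0}
```

Note that the aggregate (slow traffic on one segment) is useful even though no individual journey can be reconstructed from the inputs — which is exactly the trade-off the article describes.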
If you're driving, walking or cycling, your iPhone can already tell. Now, when it knows you're driving, it can also send relevant traffic and routing data in these anonymous slices to improve the overall service. This only happens if your Maps app has been active — say, you checked the map or looked for directions. If you're actively using GPS for walking or driving directions, the updates are more precise and can help with things like pedestrian paths through parks — augmenting the overall quality of the map.
All of this, of course, depends on whether you've opted in to location services, and it can be toggled off using the Maps location setting in the Privacy section of Settings. Apple says it will have a near-zero effect on battery life or data usage, because you're already using the relevant Maps features when the data is shared, and the probe data is only a fraction of what those activities consume.

From the Point Cloud Upwards
But maps cannot live on ground truth and mobile data alone. Apple is also gathering new high-resolution satellite data to combine with its ground-truth data for a solid base map. Satellite imagery is then layered on top to better determine foliage, pathways, sports facilities and building shapes.
After the ground data has been scrubbed of license plates and faces, it is run through a battery of computer-vision programs to extract addresses, street signs and other points of interest. These are cross-referenced with publicly available data, such as city addresses and new construction of neighborhoods or roadways that come from city planning departments.
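As a rough illustration of that cross-referencing step, an address string extracted by computer vision could be matched against official city records with simple fuzzy matching. The addresses and cutoff here are made up, and Apple's real system is surely far more sophisticated:

```python
import difflib

# Hypothetical official records from a city planning department.
city_records = ["1455 Market St", "1 Ferry Building", "850 Bryant St"]

def match_extracted(text, records, cutoff=0.8):
    """Match an OCR-extracted address string against official records,
    tolerating small recognition errors (e.g. a misread character)."""
    hits = difflib.get_close_matches(text, records, n=1, cutoff=cutoff)
    return hits[0] if hits else None

# "Markel" is a plausible OCR misread of "Market".
print(match_extracted("1455 Markel St", city_records))  # 1455 Market St
```

The point of such a step is that noisy vision output gets anchored to an authoritative source before it ever reaches the map.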
But one of the special-sauce ingredients in Apple's mix of mapping tools is the full point cloud that maps the world around the van in 3D. It gives the team all kinds of ways to better understand what objects are: a retroreflective rectangular object about 15 feet off the ground is probably a street sign — or a stop sign, or a speed-limit sign.
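A toy version of that kind of rule, with every threshold invented for illustration (Apple's classifiers are undoubtedly learned from data, not hand-coded like this), might look like:

```python
def classify_cluster(height_m, width_m, depth_m, reflectivity):
    """Toy heuristic in the spirit of the description: a flat, highly
    reflective, sign-sized rectangle a few meters off the ground is
    probably a road sign. All thresholds are made up."""
    is_flat = depth_m < 0.1                  # thin panel, not a solid object
    is_sign_sized = 0.3 < width_m < 1.5      # plausible sign width
    is_elevated = 2.0 < height_m < 6.0       # roughly 7-20 feet up
    if is_flat and is_sign_sized and is_elevated and reflectivity > 0.8:
        return "street sign"
    return "unknown"

print(classify_cluster(height_m=4.5, width_m=0.75, depth_m=0.02,
                       reflectivity=0.95))  # street sign
```

The 3D point cloud is what makes attributes like height, depth and reflectivity available at all — a flat image alone couldn't supply them.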
It could also allow for the accurate positioning of navigation arrows in 3D space for AR navigation, but Apple declined to comment on "any future plans" for such things.
Apple is also using semantic segmentation and Deep Lambertian Networks to analyze the point cloud, coupled with the image data captured by the car and synchronized with high-resolution satellite imagery. This allows for 3D identification of objects, traffic signs, lanes of traffic and buildings, and for separating them into categories that can be highlighted for easy discovery.
The coupling of high-resolution image data from car and satellite, plus the 3D point cloud, gives Apple the ability to create full orthogonal reconstructions of city streets with textures in place. This is massively higher resolution and visually easier to parse. And it's synchronized with the panoramic images from the car, the satellite view and the raw data. These techniques are used in self-driving applications because they provide a truly holistic view of what's going on around the car. But the ortho view can do even more for human viewers of the data, letting them "see" through brush or tree cover that would normally obscure roads, buildings and addresses.
This is hugely important for the next step in Apple's quest for extremely accurate and useful maps: human editors.
Apple has a team of tool developers working specifically on a toolkit that human editors can use to vet and parse data, street by street. The editor suite includes tools that let editors match buildings to specific geometries (think of the unique dome of the Salesforce Tower), making them instantly recognizable. It lets editors view real images of street signs, shot by the car, right next to 3D reconstructions of the scene and the computer-vision detections of those same signs, and immediately mark the detections as correct or not.
Another tool fixes addresses, letting an editor quickly move an address to the center of a building, determine whether it's misplaced and shift it around. Entry points can also be set, making Apple Maps smarter about the last 50 feet of your journey. You've made it to the building — but which street is the entrance actually on? And how do you get in from the driveway? With a few clicks, an editor can make that permanently visible.
"When we take you to a business and that business exists, we think the precision of putting you in the right building matters," says Cue. "When you look at places like San Francisco or big cities from that standpoint, you have addresses where the address is on a certain street, but the entrance to the building is on another street. They did that because they wanted the better street name. Those are the kinds of things our new maps are really going to shine on. We're going to make sure we get you to exactly the right place, not just close to it."
Water, swimming pools (new to Maps), sports facilities and vegetation are now rendered more distinctly and with more sophistication, thanks to new applications of computer vision and satellite imagery. So Apple had to build editing tools for those, too.
Hundreds of editors will be using these tools, in addition to the thousands of employees Apple already has working on maps — but the tools had to be built first. Now Apple no longer has to rely on third parties to fix and correct issues.
And the team also needed to build computer-vision and machine-learning tools to determine whether there were problems to fix in the first place.
Anonymous probe data from iPhones, visualized, looks like thousands of dots ebbing and flowing across a network of streets and sidewalks, like a luminescent mesh of color. At first, chaos. Then patterns emerge. A street opens for business, and nearby vessels pump orange blood into the new artery. A flag is raised, and an editor checks whether a name needs to be assigned to a new street.
A new intersection is added to the network, and an editor is flagged to make sure the left-turn lanes connect correctly across the overlapping layers of directional traffic. This has the added benefit of massively improved lane guidance in the new Apple Maps.
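The pattern the editors are responding to — probe traffic appearing where no road exists — can be sketched as a simple check. The coordinates, tolerance and function names here are assumptions for illustration, not Apple's code:

```python
def flag_new_roads(probe_points, known_roads, tolerance=0.0005):
    """If anonymous probe traffic keeps showing up where the map has
    no road, flag those locations for a human editor to review."""
    def near_known_road(pt):
        # Crude proximity test against known road points (a real system
        # would snap against road geometry, not point samples).
        return any(abs(pt[0] - r[0]) < tolerance and abs(pt[1] - r[1]) < tolerance
                   for r in known_roads)
    return [pt for pt in probe_points if not near_known_road(pt)]

known = [(37.7700, -122.4100), (37.7705, -122.4100)]
probes = [(37.7700, -122.4100), (37.7900, -122.4200)]
print(flag_new_roads(probes, known))  # [(37.79, -122.42)]
```

The output of a check like this is not a map change — it's a queue item for a human, which matches the human-plus-machine division of labor the article describes.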
Apple expects this combination of human and artificial intelligence to allow editors to first create the base maps and then maintain them, as the ever-shifting biomass wreaks havoc on streets, addresses and the occasional park.
Apple's new maps, like many other digital maps, display differently depending on scale. Zoom out and you get less detail; zoom in and you get more. But Apple has a team of cartographers working at more cultural, regional and artistic levels to make sure its maps are readable, recognizable and useful.
These teams have goals that are both concrete and a bit abstract — in the best tradition of Apple pursuits that straddle the technical and the artistic.
The maps must be usable, but they must also fulfill cognitive goals at a cultural level that go beyond what any individual user might articulate. In the US, for example, it is very common for maps to show relatively little detail even at medium zoom. In Japan, however, maps at the same zoom are absolutely packed with detail, because that increased information density is what users there expect.
This is the department of details. They have reconstructed hundreds of road signs to make sure the sign on your navigation screen matches the one you see on the roadway. When it comes to public transit, Apple licenses the fonts you see in your favorite subway systems — like Helvetica for NYC — and the line numbers appear in exactly the order you'll see them on the signs overhead.
It's all about reducing the cognitive load of translating the physical world you have to navigate into the digital world that Maps represents.
The new version of Apple Maps goes into preview next week, with only the Bay Area of California going live. It will be seamlessly folded into the "current" version of Maps, but the difference in quality should be immediately visible, based on what I've seen so far.
Better road networks, more pedestrian information, sports areas like baseball diamonds and basketball courts, more land cover — including grass and trees — represented on the map, as well as buildings whose shapes and sizes are more accurate. A map that feels more like the real world you're actually traveling through.
Search is also being revamped to make sure you get more relevant results (on the right continents) than ever before. Navigation, especially pedestrian guidance, gets a big boost. Parking and building details that take you the last few meters to your destination are included as well.
What you will not see at the moment is a complete visual redesign.
"You're not going to see huge design changes on the maps," says Cue. "We don't want to combine those two things at the same time, because it would cause a lot of confusion."
Apple Maps is finally getting the long-overdue attention it deserves. By taking on the project in full, Apple is committing to build the map that users expected from the very beginning. It has been a persistent shadow, especially on iPhones, where alternatives like Google Maps offer more robust features but can't compete fairly with the native app because they're denied deep system-level access.
The argument has been made ad nauseam, but it's worth repeating once more: if Apple thinks mapping is important enough to own, it should own it. And that's what it's trying to do now.
"We don't think there's anybody doing the level of work that we're doing," adds Cue. "We haven't announced this. We haven't told anybody about this. It's one of those things we've been able to keep pretty much a secret. Nobody really knows about it. We're excited to get it out there. Over the next year, we'll be rolling it out, section by section, in the US."