Google Translate has been behaving strangely lately. This isn't an opinion based on occasionally getting a slightly different translation, or even one that's only half accurate. Google Translate has been spitting out biblical-style doomsday prophecies and assorted messages that come completely out of left field. As apps sometimes do, something in the code gets mangled somewhere, or new training data changes the way an app responds to input, but changes as drastic as the ones being seen now are downright bizarre.

On social media, people are posting screencaps of the messages they've received from the app, and the results are by turns humorous and disturbing. Anyone can try it on their own device by simply typing a random combination of two or three letters repeatedly and letting the app do the rest. The more often the letter combination is repeated, the more detailed the translation. The most widely shared example comes from typing "dog" over and over.
Admittedly, you sometimes have to fiddle with the language settings to get results. In the dog example, for instance, you have to set the source language on the left-hand side to Maori. Sometimes it works with the default language-detection setting. The more combinations you try, the more strange translations you get.
While there was some speculation on social media that Google Translate was possessed by demons or ghosts, or had become an actual conduit for God to speak to people, the real answer is far simpler than that. It's just how the app learns, and that learning is not perfect. Google spokesperson Justin Burr told The Next Web exactly what's going on:
"Google Translate learns from examples of translations on the web and does not use 'private messages' to carry out translations, nor would the system even have access to that content. This is simply a function of inputting nonsense into the system, to which nonsense is generated."
Andrew Rush, an assistant professor at Harvard who works on natural language processing and machine translation, told Motherboard essentially what Burr said, but in more technical terms. Rush explains that the AI running Google's translation system is trained to produce something human-sounding no matter what is typed in. The translation it produces may not make sense, but the system is working within the limits of its programming and training.
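Rush's point can be illustrated with a deliberately crude sketch. This is nothing like Google's actual architecture; the training sentence, function names, and bigram scheme below are all invented for illustration. The idea is just that a decoder which has only ever seen fluent text will emit fluent-looking text even when its input is gibberish, because the fluency comes from the model, not from the input.

```python
import random

# Toy "translator": the decoder is just a tiny bigram language model
# built from whatever text it was trained on. (Illustrative only.)
TRAINING_TEXT = "the clock strikes twelve and the world ends at midnight".split()

# Record which word tends to follow which in the training text.
bigrams = {}
for a, b in zip(TRAINING_TEXT, TRAINING_TEXT[1:]):
    bigrams.setdefault(a, []).append(b)

def toy_translate(nonsense, length=6, seed=0):
    """Ignore the input entirely and generate plausible-sounding text."""
    rng = random.Random(seed)
    word = rng.choice(TRAINING_TEXT)
    out = [word]
    for _ in range(length - 1):
        # Follow the bigram table; fall back to any training word.
        word = rng.choice(bigrams.get(word, TRAINING_TEXT))
        out.append(word)
    return " ".join(out)

# Gibberish in, fluent-looking (if ominous) English out.
print(toy_translate("ag ag ag ag"))
```

Because the input never influences the output, typing "ag ag ag ag" and "xx xx" yields the same confident-sounding string: a caricature of why nonsense input can still produce coherent, even prophetic-sounding, translations.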
Between Burr and Rush, the idea that Google Translate's AI has developed to the point of adopting religion and is now spreading the word of a divine entity can be laid to rest. It's just a gap in the app's learning, and over time it's expected to teach itself to respond better to strange inputs. For the moment, most people who have tried coaxing weird answers out of the app say it's good fun.