In the 2070s the Internet of computers changed from a World Wide Web of information to the R Net of relevant knowledge. This meant that people were fed information triggered by position, direction, occupation and requirement. All this information was delivered through AR glasses and voice-activated AIs. Most websites disappeared, and activation engines replaced search engines.

There are already applications that access GPS and give you information relevant to your location, and map applications can track your position as you travel. Company servers hold databases of information relevant to your occupation. But how do you specify your requirements except through a search engine?
The answer is voice and image searches.
Because mobile is becoming the primary form of consumption, future search engines will make use of powerful sensing technologies such as the accelerometer, digital compass, gyroscope and GPS. Google recently bought a company called Behavio, which predicts what a user might do next using information acquired from the different sensors on the user's phone. (Quora)

55% of millennials use voice search once a day, and with the rise of voice assistants such as Google Home and Amazon's Echo, voice has now entered the mainstream.
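The idea behind Behavio-style prediction can be sketched in a few lines: log pairs of (sensed context, what the user did next), then predict the most frequent action for the current context. Everything below — the contexts, action labels and class name — is invented for illustration; it is not Behavio's actual method, just a minimal frequency model showing the shape of the problem.

```python
from collections import Counter, defaultdict

class NextActionPredictor:
    """Toy frequency model: given a sensed context (hour of day,
    rough GPS cell), predict the user's most likely next action."""

    def __init__(self):
        # context -> Counter of observed next actions
        self.history = defaultdict(Counter)

    def observe(self, context, action):
        self.history[context][action] += 1

    def predict(self, context):
        counts = self.history.get(context)
        if not counts:
            return None  # context never seen before
        return counts.most_common(1)[0][0]

predictor = NextActionPredictor()

# Simulated phone-sensor log: (hour, location) -> what the user did next.
log = [
    ((8, "home"), "open transit app"),
    ((8, "home"), "open transit app"),
    ((8, "home"), "open news app"),
    ((13, "office"), "search lunch nearby"),
]
for context, action in log:
    predictor.observe(context, action)

print(predictor.predict((8, "home")))  # -> open transit app
```

A production system would of course use a learned model over continuous sensor streams rather than exact-match contexts, but the input/output contract is the same.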
|Google Home Hub|
As technology advances, personal assistants will perform searches in the background based on information they pick up from conversations, consumer location and even biometric information. They will take into account a consumer's daily commute schedule and use real-time traffic data to recommend when they should leave home to make it to work on time. Personal assistants are already combing through emails and apps to produce info cards showing upcoming flight times, purchased movie tickets or incoming package shipments, so there is certainly scope for further development.
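The commute recommendation described above reduces to simple arithmetic: scheduled travel time, plus the current real-time delay, plus a safety buffer, subtracted from the required arrival time. A minimal sketch, assuming the delay figure has already been fetched from some traffic service (the function name and values are placeholders):

```python
from datetime import datetime, timedelta

def recommend_departure(work_start, commute_minutes, delay_minutes, buffer_minutes=10):
    """Suggest when to leave home: usual commute time plus the
    current traffic delay plus a safety buffer, counted back
    from the time the user must be at work."""
    travel = timedelta(minutes=commute_minutes + delay_minutes + buffer_minutes)
    return work_start - travel

# Must be at work by 09:00; usual commute 35 min; traffic adds 12 min today.
work_start = datetime(2024, 5, 6, 9, 0)
leave_at = recommend_departure(work_start, commute_minutes=35, delay_minutes=12)
print(leave_at.strftime("%H:%M"))  # -> 08:03
```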
Google, Amazon, Microsoft and Apple are all working on optimizing their artificial intelligence so that instead of you asking the questions, the assistant will ask you questions to help organize your life. It will remind you of appointments, advise on life choices and come close to being a truly advanced artificial intelligence. (Smart Insights)

Images
There are already early apps that allow the user to take a photograph and search for information relevant to the image. Adding location and other data will improve this further.
The Amazon Firefly App, for example, utilizes users’ smartphone cameras to identify products and then performs a search for that product.
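A photo-to-product search of this kind can be sketched as a two-step pipeline: classify the image, then run the resulting label as an ordinary text query against a catalog. The "classifier", catalog and product names below are all stand-ins — Amazon's actual implementation is not public in this form — but the pipeline shape is the point.

```python
def label_image(image_bytes):
    """Stand-in for an image-recognition model: a real app would
    run the photo through a trained classifier. Here we fake it
    with a lookup keyed on the raw bytes."""
    fake_model = {
        b"photo-of-red-sneaker": "red sneaker",
        b"photo-of-coffee-maker": "coffee maker",
    }
    return fake_model.get(image_bytes, "unknown")

# Placeholder product catalog, keyed by label.
PRODUCT_CATALOG = {
    "red sneaker": {"name": "Red Running Sneaker", "price": 59.99},
    "coffee maker": {"name": "10-Cup Coffee Maker", "price": 34.50},
}

def visual_search(image_bytes):
    """Photo in, product listing out: classify the image, then use
    the label as a text query against the catalog."""
    label = label_image(image_bytes)
    return PRODUCT_CATALOG.get(label)

print(visual_search(b"photo-of-red-sneaker"))
# -> {'name': 'Red Running Sneaker', 'price': 59.99}
```

Adding location and other context, as the article notes, would slot in between the two steps — narrowing which catalog, or which labels, are plausible for this user right now.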
eMarketer expects visual search to become a mainstream tool for retailers within one or two years. The future of visual search is more a question of when, not if.
For brands to benefit from visual search, they will need to improve their 'visual dictionary'. Visual search technology is already with us, but the biggest challenge right now is the limited image library. In addition, brands need to expose their visual dictionary to the search engines. (Smart Insights)

If you consider visual queries, you can then consider visual answers.
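One concrete, established way for a brand to expose its images to search engines today is schema.org structured data: Product markup with an `image` property, embedded in the page as JSON-LD. The product details below are placeholders; a minimal sketch that emits the markup:

```python
import json

def product_jsonld(name, image_url, description):
    """Build schema.org Product markup with an image property --
    one standard way to make product images machine-readable
    for search engines. All values here are placeholders."""
    return {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "image": image_url,
        "description": description,
    }

markup = product_jsonld(
    "Example Trail Boot",
    "https://example.com/images/trail-boot.jpg",
    "Waterproof hiking boot, sizes 6-13.",
)
# This JSON would go inside a <script type="application/ld+json"> tag.
print(json.dumps(markup, indent=2))
```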
|Crystal Eyes glasses|
You visit a new city and want to decide where to go tonight. You ask "Where's the best pub or nightclub in Los Angeles?" while wearing a headset — in this case probably augmented rather than virtual reality. You are given two or three options of dance floors and bars, presented in a far more engaging, visual way that shows what people there are actually doing. And the headset will notice, from the reaction in your eyes, which one you like the most. (Quora)