Epsilon CES 2017 Trend Recap
Post on 15-Apr-2017
Epsilon Agency, January 10, 2017
CES & Our View of 2017
Technology is now essential to our daily lives. Accessibility and empowerment have transformed how we connect and communicate, leading to new forms of user interaction that will usher in the business models of the future. CES 2017 showcased the tools and technology that will further empower consumers through new types of conversational experiences, the continued evolution of artificial intelligence, and the rapid rise of third-party ecosystems supporting virtual, augmented, and mixed reality. The trends we covered at CES 2017 outline the evolution of marketing in 2017 through the consumer-centric filters of connection, cognition, and immersion.

Connection: Trends that reimagine how we connect, enable, and empower consumers
Cognition: Trends where machine-based intelligence will disrupt and redefine data assets and how we work
Immersion: Trends that align technology and presence to evoke emotion, entertain, and power commerce

How we consume and interact via digital channels is about to be absorbed and redefined. The technology featured at CES 2017 aligns with our view of a coming convergence toward an ambient computing future built on new data types that will simplify complex tasks and predict need states rather than merely react to them.

Tom Edwards, Chief Digital Officer, Agency
Zone of Convergence
MACHINE LEARNING AS A SERVICE
PERVASIVE VOICE-BASED EXPERIENCES
BLOCKCHAIN + AI
SEARCH TO 1:1 PREDICTIONS
SIMPLIFIED CONVERSATIONAL EXPERIENCES
(RE) MIXED REALITY
THE DEMOCRATIZATION OF VR
Emergence of General AI Assistants
1. Connected Product Ecosystems
During the LG keynote, executives announced that 100% of LG's home appliances released in 2017 will be advanced Wi-Fi capable. This is the first time a megabrand such as LG has connected an entire line of its products, and others are sure to follow. Paired with the LG Hub Robot, these appliances can employ their own data and cloud services under a central control center that minds the environment while deploying independent devices based on environmental cues, user preferences, and cloud data. Implications: This marks the beginning of an era in which connected products integrate into branded smart ecosystems that become more than the sum of their parts. We're likely to see multiple brands fight for dominance in this space, each creating its own closed system.
LG's Hub Robot controls several of LG's smart home appliances, like their smart oven and washing machine. It also runs Amazon's Alexa.
Previous: LG announces that its entire suite of home appliances will be 100% advanced Wi-Fi capable, creating an entire ecosystem of connected products.
2. Alexa: The Interface for IoT
Alexa was the undisputed winner of CES this year. Hundreds of devices announced Alexa compatibility or complete Alexa integration: everything from smart humidifiers, refrigerators, ovens, vacuums, lights, switches, and TVs to, oh yes, cars. The huge volume of devices that announced compatibility shows a sharp turn from the trend of pushing independent mobile apps for IoT devices. The appeal of a single, voice-activated interface that can manage all the smart devices in a home may influence consumers to look for devices that are designed to work with Alexa. Implications: Voice interfaces are quickly becoming the remote control for the Internet of Things, and Amazon stands firmly at the head of the pack.
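The hub pattern behind this trend can be illustrated with a minimal sketch: one voice interface routing spoken commands to whichever registered device is named. All names here are hypothetical; this is not Amazon's actual Alexa Skills API, just the routing idea.

```python
# Illustrative sketch of a single voice interface managing many smart devices.
# All names are hypothetical; real Alexa integrations use Amazon's Skills Kit.

class SmartHomeHub:
    def __init__(self):
        self.devices = {}  # spoken device name -> handler function

    def register(self, name, handler):
        """Register a smart device under the name users will speak."""
        self.devices[name] = handler

    def handle_utterance(self, utterance):
        """Naive intent parsing: route to whichever device is mentioned."""
        text = utterance.lower()
        for name, handler in self.devices.items():
            if name in text:
                return handler(text)
        return "Sorry, I don't know that device."

hub = SmartHomeHub()
hub.register("lights", lambda u: "Lights " + ("on" if "on" in u else "off"))
hub.register("oven", lambda u: "Preheating the oven")

print(hub.handle_utterance("Alexa, turn on the lights"))  # Lights on
```

The appeal for consumers is exactly this single point of control: adding a device means registering one more handler, not installing one more app.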
VW's Alexa integration can give status updates, help with location assist, create reminders, and purchase items and parts for the owner's specific VW model, while LG's smart fridge (previous) will include all standard Alexa functions along with direct ordering from Amazon Pantry.
3. Secure Connected Home
A number of companies, including Norton, Bitdefender, and McAfee, introduced home network security routers aimed at protecting your personal devices and IoT systems from external threats. With an estimated 50 billion IoT devices to be connected by 2020, a strong security solution is a necessity. With botnets like Mirai using unsecured IoT devices to wage DDoS attacks on major websites and individuals alike, the security of IoT is more important than ever. Implications: Stories of hacked baby cams and even a hacked Jeep have caused concern over the safety of IoT devices. A solution that secures home networks and eliminates these threats is likely to increase confidence and result in greater adoption of IoT and smart home technology.
New routers from Bitdefender (previous) and Norton (above) promise to keep home networks and IoT devices safe from external threats including phishing websites and hacking attempts.
4. Projected Screens
Light-based screens projected onto various surfaces were found in many forms at CES. These projectors allow nearly any surface to become a screen, with many incorporating motion-sensitive cameras that can emulate the feel and interactivity of a touch screen. While touch-responsive projected screens aren't new at CES, the size, variety, responsiveness, and cost provide a welcome upgrade over previous generations and make them far more appealing for multiple use cases. Implications: We're likely to see a proliferation of screen "form factors" in everything from consumer devices to in-store retail experiences. The ability to turn any surface into a screen provides incredible flexibility and endless possibilities for interactive content.
The Beam Lab light bulb projector (previous) allows consumers to add a projector to their home without the need for a dedicated device.
Sony's Xperia projector turns any surface into an interactive touch screen, using a motion-sensing camera to understand movement and gestures.
5. Body and Mind as an Interface
From high-fidelity mapping and gesture tracking for virtual reality to EEG devices that monitor brain waves from inches away, there were several demonstrations of technology using the body and mind to control digital experiences. Implications: As more and more technologies begin to use the body and mind as an interface, digital experiences will become more immersive. These interfaces will also allow digital and virtual experiences to be far more mobile and universally accessible. In the long term they may lead to a decline in the use of traditional input methods like the mouse and keyboard.
SoftKinetic's platform (previous) showed high-fidelity models of users' hands, allowing extremely nuanced gestures and interactions with the environment.
Bodywave's EEG monitor actively measured the driver's brainwaves to assess attention to the road and stopped the car every time the user broke attention. This technology can be used to help eliminate drowsy driving.
6. Movement as a Platform
Sensors that track body position, velocity, force, traction, and other states of movement were found in devices designed to go beyond measuring the body's state. These devices provide the feedback users need to understand and modify their technique, and even to compare their movement and positioning with masters of their craft. Implications: This combination of sensors, cloud services, and instruction will allow people to learn skills and movements like dancing, martial arts, or even a golf swing with an accuracy that typically requires in-person instruction. This provides brands with a tremendous opportunity to expand their offerings in highly experiential and engaging ways.
Evalu measures minute changes in pressure, force, height, and velocity to track nuanced movement that can be used to coach proper form and mechanics.
Atomic Bands (previous) captures movement of dancers and martial artists to teach users the style and moves of experienced practitioners.
7. Pervasive Cognition
This year the halls of CES featured hundreds of devices that embed or provide data to an artificially intelligent voice assistant platform like Siri, Cortana, or Alexa, or even one of the new entrants from Baidu, Google, or Mattel. While it's exciting to see electronics imbued with even a limited form of "smarts" and automating worthwhile tasks, the real peek into the future of AI assistants came in the form of emotive robotics: assistants like Lynx and Jibo that recognize human emotion through facial recognition and respond accordingly. Implications: In a single year we've shifted paradigms from "everything will be connected" to "everything will be cognified." As more sensors become commercially available and AI assistants improve, we'll see the automation of increasingly complex tasks and the development of engaging robot personalities that will begin to resemble elements of what we've seen in science fiction movies. It will happen sooner than you think.
Autonomous lawnmowers incorporate machine vision, sensors, and cloud-based AI to understand the surrounding environment and adapt to changes over time.
The Lynx robot (previous) is an Alexa-enabled AI robot that can dance, play music, and do yoga, and uses facial recognition to acknowledge emotion.
8. Contextual Assistants
As Amazon's Alexa, Google's Assistant, and Apple's Siri compete for dominance across mobile devices and IoT, several AI-powered contextual assistants made an appearance at the show: from the cooking assistant Hello Egg, which plays recipe videos and assists with cooking tasks, to Mattel's Aristotle, which includes Cortana along with a separate, kid-friendly AI to read children stories and play games. Expect even more contextually focused and function-specific assistants to appear throughout the year. Implications: Amazon's Alexa may be dominating the voice assistant landscape, but that doesn't mean there isn't room for other assistants with purpose-specific form factors and AIs. Adding features like cameras, screens, and task-specific AI can provide additional functionality or context-specific capability that the leading platforms can't.
Mattel's Aristotle can identify when babies wake and soothe them back to sleep. It can also sing to them and teach them their ABCs.
Hello Egg (previous) is a cooking assistant for the kitchen. It can help users plan their meals for the week, organize their grocery list, and order produce.
9. Object Recognition
iDentifi uses convolutional neural networks to identify patterns and objects. It's available in 25 languages and, oh, it was written by a sixteen-year-old.
Poly (previous) also uses AI to identify objects in extremely complex situations with almost no latency because most of the image library is stored on the device.
Object recognition isn't anything new (Google released Goggles in 2010), but the technology on display this year was recognizing objects in far more complex scenarios, and with greater detail and speed, than we'd ever seen before. Interestingly enough, the versions that were most impressive were born from assistive technologies meant to help the blind and are capable of identifying a dizzying array of objects and all sorts of variations thereof. Implications: In order for augmented reality to deliver on its hype, devices will have to speedily identify billions of objects in all sorts of conditions and understand context. The examples seen on the show floor are likely to be the brains of the augmented reality headsets and devices of the near future.
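The convolutional networks behind systems like iDentifi are built from one core operation: sliding a small kernel over an image to produce a feature map. The toy example below is a minimal pure-Python sketch of that operation (a "valid" 2D convolution), not the product's actual code; real systems stack many such layers and run them on optimized hardware.

```python
# Minimal sketch of the core CNN operation: 2D "valid" convolution.
# Pure Python for illustration; production systems use optimized libraries.

def conv2d(image, kernel):
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):          # slide kernel over every valid position
        row = []
        for j in range(iw - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# A vertical-edge detector applied to an image whose right half is bright:
# the feature map lights up exactly where the edge is.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
print(conv2d(image, kernel))  # [[0, 2, 0], [0, 2, 0]]
```

Trained networks learn thousands of such kernels, with early layers detecting edges like this one and deeper layers combining them into object-level features.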
10. AI Goes Inside the Cabin
While most of the news around autonomous cars has focused on how they understand the roadway, several new technologies are bringing the focus inside the car and to the driver. From reading lips to reading emotions, machine vision and artificial intelligence are being used to assist drivers in getting to their destination, while monitoring and even impacting the emotional state of the driver. Implications: As AI begins to influence ever more elements of driving and transportation, it will radically transform the concept of what it means to drive a car. While we're years away from fully autonomous cars, we're entering a transition period that will begin to train some of the behaviors we'll take with us into the autonomous future. Pay careful attention to these, as they'll likely signal changes in media consumption in the years ahead.
Toyota's Concept-I Emotion Map uses facial recognition to log the driver's emotions and recommends routes that provide the most pleasant experience.
Nvidia's AI Co-Pilot (previous) uses lip reading to detect commands from the driver and gaze detection to understand when they are distracted from the road.
11. Autonomous Everything
While the world waits for autonomous cars to become commonplace, the revolution in autonomous vehicles is quietly happening with drones, personal mobility devices, and delivery vehicles. The show floor had everything from autonomous drones that navigated complex terrain without an external signal to super-high-tech, self-balancing, multi-axis unicycles from Honda. Implications: Autonomous vehicles are going to have a greater impact on our lives than the concept of autonomous cars alone. They'll create new opportunities for delivery, shipping, logistics, and personal mobility that were never possible before. This evolution will dramatically impact industries that rely on shipping or just-in-time delivery of parts and supplies. These vehicles will also begin to shape our expectations around delivery and transport in the same way on-demand digital services have over the past few years.
The Local Delivery Robot can deliver items 5-30 minutes from a central hub and is much cheaper than traditional last-mile delivery methods.
Honda's Uni-Cub is a self-balancing, autonomous, personal mobility device.
12. VR: Full Sensory Immersion
Devices that convey various sensations of touch and smell added further depth to the immersive worlds of VR. Haptic feedback simulated variations in texture and pressure, while robust 3D inside-out tracking allowed users to see and use their hands in virtual environments, an element sorely lacking from most VR experiences today. Implications: The addition of sensory stimulation will provide increasing depth within VR experiences. As these peripherals gain traction and increase in resolution and fidelity, we'll see experiences that are extraordinarily realistic and fool more than just the eyes and ears. This will also enable more realistic representations of products within VR, making the idea of shopping in VR more appealing than it is today.
This haptic feedback device allows users to feel various textures and pressures as they interact with the items in a virtual world.
The Taclim shoes (previous) mimic different terrain such as wood, wet grass, snow and sand.
13. VR: Spatial Freedom
Multiple booths were dedicated to solving VR's lack of spatial freedom. With high-end VR currently requiring a cumbersome wire, and mobile headsets dependent on stationary operation, the experience of space and mobility has been a challenge for the industry. Various mobile Vive and Oculus form factors allowed users to ditch the wire and move freely about a confined play space. Meanwhile, Zeiss presented a solution for mobile headsets that allows six degrees of freedom, adding the ability to crouch, jump, and move forward and backward through inside-out tracking. Implications: While initial impressions of VR tend to wow, significant experience with the device leaves players with a sense of restriction rather than infinite possibility. These enhancements to VR's capability will ensure even more immersive experiences that enable new modes of interaction and, ultimately, impact adoption and scale of the devices.
The Zeiss VR One enables players to move on all axes in mobile VR by using the device's camera to understand the player's relationship with the space around them.
Wireless adapters like DisplayLink (previous) allow players to engage in VR games without needing to worry about tripping over wires.
14. Augmented Reality Audio
Screens and headsets may get all the attention, but audio proved to be one of the most promising and capable media at the convention. Various headsets and earbuds incorporated smart filtering to allow users to selectively focus on or enhance certain sounds in their environment, like speech, or filter out sounds like sirens and office chatter. Other capabilities included directional audio, played through a smart device, that replicated many of the capabilities of an augmented reality headset, like directions or restaurant information, through the use of voice and audio. Implications: What we're witnessing is the viability of audio as a computing platform. With brands like Sony, Apple, and Doppler Labs creating experiences for the ears, we won't be confined to a series of screens to get data that might be better served through audio, and we can free our attention toward the environment around us.
Here One from Doppler Labs can isolate and cancel specific sounds or enhance others: for example, cancelling airplane noise but amplifying speech nearby.
Sony's Project N (previous) delivers private listening and AR audio experiences that layer over the sounds of the world around you without the need for a screen.
15. Alternative Interfaces Mature
Eye tracking and gesture detection have been around for years, but the processing power to leverage them for robust and complex interactions wasn't present in the right form factors. Now, eye and gesture tracking is feasible at incredibly high fidelity, allowing for more natural and intuitive interfaces and more immersive interactions. Implications: Although early prototypes fell far short of their hype and dampened enthusiasm, new incarnations of the technologies show incredible promise and will certainly impact the way we interact with our devices, allowing content providers to create novel experiences with new modes of interaction.
MSI and Alienware laptops have incorporated Tobii eye tracking into their monitors, allowing gamers to use their eyes to aim and direct the camera.
uSens' Impression.Pi (previous) brings hand and gesture tracking to mobile VR experiences.
ZONE OF CONVERGENCE
16. Car-Based Commerce
Over the past decade, each new digital platform has incorporated e-commerce in some way, and as cars become increasingly digital they will be no exception. With the assistance of blockchain, everything from paying for parking and buying gas to car sharing will be enabled by smart contracts initiated on the blockchain. Owners of autonomous vehicles will even be able to receive package deliveries via their cars by giving vendors authenticated access to their trunk, verifying the authenticity of the package, and keeping the rest of the car off limits. Implications: Transacting through a car on the blockchain will provide the necessary elements for autonomous cars to join car-sharing fleets, and will allow leasing and financing models to be based on driving variables such as mileage and wear and tear. This technology will dramatically change the way we own and share vehicles, while creating ever more frictionless options for mobility and delivery.
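A smart contract is essentially a small state machine whose terms execute automatically once agreed conditions are met. The sketch below models the parking-payment idea as plain Python to show the shape of the logic; all names are hypothetical, and a real deployment would run on a blockchain platform rather than in a local script.

```python
# Illustrative sketch of the smart-contract idea behind car-based commerce:
# a parking agreement as a tiny state machine. Names are hypothetical; a real
# version would be deployed to a blockchain so neither party can alter terms.

class ParkingContract:
    RATE_PER_HOUR = 2  # agreed price, fixed when the contract is created

    def __init__(self, car_id, garage_id):
        self.car_id = car_id
        self.garage_id = garage_id
        self.state = "OPEN"
        self.hours = 0

    def record_usage(self, hours):
        """The garage's sensors report how long the car was parked."""
        assert self.state == "OPEN", "contract already settled"
        self.hours += hours

    def settle(self):
        """Compute the payment owed and close the contract. On a real chain
        the transfer would execute automatically, with no invoice or cashier."""
        assert self.state == "OPEN", "contract already settled"
        self.state = "SETTLED"
        return self.hours * self.RATE_PER_HOUR

contract = ParkingContract(car_id="VIN123", garage_id="G42")
contract.record_usage(3)
print(contract.settle())  # 6
```

The same pattern extends to tolls, fuel purchases, and trunk delivery: the contract encodes who may do what, and settlement happens without either party invoicing the other.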
An eWallet concept from ZF, UBS, and Innogy will provide cars the ability to pay for parking and tolls, manage autonomous package delivery, and enable smart car sharing.
17. Cars as Health Platforms
Nearly every major car manufacturer addressed the element of health and well-being this year as it relates to driving and transportation. Facial recognition helped determine a driver's emotional state, while the car responded by changing the environment in the cabin to enhance alertness or calm, or by recommending routes the system knows will produce the most enjoyable ride for the driver. Implications: As we move toward a future with ever greater options in automobiles, from personal mobility devices to fully autonomous cars and beyond, the idea of what constitutes a luxury car will ultimately change. While the focus today is the act of driving, tomorrow's most favored brands will deliver the experiences that are most responsive to those inside the cabin while moving from point A to point B.
Toyota's Concept-I experience drove passengers through a virtual environment while plotting their emotions to a map by using facial recognition tracking.
Hyundai's Mobile Health concept attempts to impact driver mood by mixing scent and temperature to generate appropriate shifts in alertness or calm.
As another CES closes and the excitement and energy of the digital utopian vision on display in Vegas crystallizes in the rearview mirror, the sheer enormity of what just happened begins to settle in. Within a single year we've gone from a theme of "everything will be connected" to "everything will be cognified," and we're beginning to see the first consumer experiences in ambient computing, where the idea of an internet or digital fades to the background and the focus is less about individual devices, or even ecosystems, and more about the momentary experience of the consumer.

One thing is certain: just as quickly as the concept of digital disrupted what it meant to create ads, and mobile uncoupled context from circumstance, the convergence of Connection, Cognition, and Immersion is rapidly changing what it means to create an experience, in a way that dwarfs what we've seen in previous years. Digital isn't about channels or engagements that live behind a glass screen anymore. We're moving to a place where the word "experience" carries more weight and nuance than ever before, and we have an ever-growing set of tools and canvases on which to paint.

Ian Beacraft, VP, Digital Strategy
Questions or inquiries can be sent to [email protected] and [email protected].