How long will it take an Amazon Echo, mashed up with a Robotic Exoskeleton, to invade my house?
2017 was the year that robots truly arrived. Robots have escaped the factory floors and are now roaming the streets. In labs across the world, people are creating advanced robots, and they are developing so rapidly that it feels like the arrival of a new species.
They are starting to look like us, move like us, and beginning to think like us. Robots will first take our jobs and then take control of our lives.
Most of us are familiar with I, Robot, the 2004 American science-fiction action film directed by Alex Proyas. In the year 2035, humanoid robots serve humanity and are governed by the Three Laws of Robotics.
Del Spooner, a Chicago police detective played by Will Smith, hates and distrusts robots because of the cold logic a robot used when rescuing him from a car crash. In the movie, Spooner says:
“I was the logical choice. It calculated that I had a 45% chance of survival. Sarah only had an 11% chance. That was somebody’s baby. 11% is more than enough. A human being would’ve known that. Robots…”
Robotics is studied every day at leading universities, private institutions and military installations, and by hobbyists. The race to create a humanoid robot that can interact with us and carry out human tasks is one of the holy grails of the 21st century.
So where are we today? Can we expect a humanoid robot in our homes within a decade or two?
To simplify, this article breaks the problem into two major aspects: a 'humanoid robot' body, and a brain to power that body. I will start by focusing on how a company like Amazon could fill that niche.
According to Reuters, Amazon.com offers a range of products and services through its websites. The company operates through three segments: North America, International, and Amazon Web Services (AWS). Its products include merchandise and content that it purchases for resale from vendors, as well as products offered by third-party sellers.
Amazon has developed many products, but for now, I will be focusing on Amazon Echo, and its supported layers that power the ‘brain.’
Amazon Echo is the current leading digital assistant on the market.
But what is Alexa? Alexa is Amazon's voice-activated intelligent digital assistant, which makes use of machine learning and artificial intelligence.
Al Lindsay, from Alexa Engine Software, describes Alexa as, “A low-cost, ubiquitous computer with all its brains in the cloud that you could interact with over voice—you speak to it, it speaks to you.”
Launched in 2014, an Alexa-enabled device can perform a variety of functions, such as searching the web, creating calendar events, modifying to-do lists, taking notes, placing orders on the Amazon store, and acting as a gateway for IoT-compatible devices. That is just a small sample of the tasks this intelligent device can achieve.
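At its core, each of those functions is a matter of mapping a recognized voice intent to an action. Here is a minimal sketch of that routing idea in Python; every name in it (the intents, the handlers, the dispatch table) is hypothetical and is not the real Alexa Skills Kit API.

```python
# Illustrative only: how a voice assistant might route a recognized
# intent to an action. The intent names and handlers below are made up;
# the real Alexa Skills Kit uses its own SDK and JSON request format.

def add_todo(item):
    """Pretend back-end call that appends an item to a to-do list."""
    return f"Added '{item}' to your to-do list."

def web_search(query):
    """Pretend back-end call that searches the web."""
    return f"Here is what I found for '{query}'."

# Intent name -> handler function, registered by the skill developer.
HANDLERS = {
    "AddTodoIntent": add_todo,
    "SearchIntent": web_search,
}

def handle_request(intent, slot_value):
    """Dispatch a recognized intent (plus its slot value) to a handler."""
    handler = HANDLERS.get(intent)
    if handler is None:
        return "Sorry, I don't know how to do that yet."
    return handler(slot_value)

print(handle_request("AddTodoIntent", "buy milk"))
```

The hard part, of course, is everything before the dispatch: turning raw audio into that clean intent-plus-slot pair, which is where the machine learning lives.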
Consumer Intelligence Research Partners (CIRP) released data showing that 20 million Amazon Echo devices have been purchased by U.S. consumers. The device is an impressive piece of technology: Alexa has been built into multiple gadgets (Echo Dot, Tap, Echo Look, Echo Show, Echo Spot, and Echo Plus), and the range of Alexa-enabled devices keeps growing, with the technology now appearing in smartphones, cars and refrigerators.
Artificial Intelligence & Robotics
Artificial intelligence is intelligence demonstrated by machines, in contrast to the natural intelligence (NI) displayed by humans and other animals. With AI, each algorithm is optimised for an exact, predefined goal. These goals are limited: an algorithm can reproduce or copy an action, but it cannot create. Above all else, it has no feelings or empathy.
Robots, meaning the mechanical devices and their controls, are advancing at a much slower rate than AI. This is because of the multitude of hardware involved in tasks that look 'simple' but are highly complex, such as replicating the interplay of the nerves, ligaments and tendons in a human hand. Robots require sensors and a plethora of other components that are still too expensive, and current batteries cannot yet hold a meaningful charge efficiently.
Perhaps the biggest leap in hardware has been in sensor technology. Robots need to sense the environment not just with cameras, but with lasers that build a 3D map of the robot's surroundings. As of 2017, these kinds of components have become both far more powerful and far less expensive.
In an article by Wired, Ben Wolff, CEO of Sarcos Robotics, said, “I think it’s because we’re finally at that crossover point, where the cost has come down of components while the capability of the components has increased sufficiently.”
For example, in 2010 a sensor needed by Sarcos cost $250,000; today the same sensor costs $8,000. Other components, like actuators (the motors that move arm or leg joints), are also falling in price: an actuator that once cost $3,500 is now closer to $1,500. “And it’s actuators, perhaps more than any other component, that promise to take robotics to the next level in the very near future.”
A humanoid robot is a robot whose body is shaped to resemble the human body. Robotics is a research and development race that has received significant attention over the last few years and will continue to shape our future. Whatever the application, the fundamental requirement is human-like processing of information, together with the mechanisms and algorithms that let the robot act and mimic the human form. A tough order indeed.
Why does Amazon Alexa matter?
Digital assistants like Amazon Alexa are leading the AI and machine learning revolution. The past decade has brought enormous leaps in machine learning and speech recognition, which has finally made voice commands viable for consumer products. Amazon’s Alexa has jumped to the front of the pack.
What does this mean?
Let’s imagine that we were to interact with a humanoid robot.
The robot may not be able to express emotions, but it can learn from interactions, or 'conversations', with people, and that helps it develop a personality. A robot has several thousand speech behaviours and motions, linked together in a big hierarchical flowchart, that make up its mind. It is not just a script: the robot takes in data and stores it in a memory bank. That memory bank is constantly updated, and the controlled 'conversations' allow the robot to build positive and purposeful relationships with people.
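The 'hierarchical flowchart plus memory bank' idea above can be sketched in a few lines of Python. This is purely a conceptual toy, assuming a keyword-triggered behaviour tree and a list-based memory; no real robot vendor's design is being described here.

```python
# Toy sketch of a hierarchical behaviour flowchart with a memory bank.
# All structure here is illustrative, not any actual robot's architecture.

class BehaviourNode:
    """One node in the flowchart: a name, a spoken response, and children."""
    def __init__(self, name, response=None, children=None):
        self.name = name
        self.response = response        # what the robot says at this node
        self.children = children or {}  # trigger keyword -> child node

memory_bank = []  # every exchange gets appended here, 'updating memories'

root = BehaviourNode("root", children={
    "hello": BehaviourNode("greet", response="Hello! Nice to see you."),
    "weather": BehaviourNode("weather", response="It looks sunny today."),
})

def converse(utterance):
    """Match the utterance to a branch of the flowchart and remember it."""
    for keyword, node in root.children.items():
        if keyword in utterance.lower():
            memory_bank.append((utterance, node.response))
            return node.response
    memory_bank.append((utterance, None))  # unmatched, but still remembered
    return "Tell me more."

print(converse("Hello there"))
```

Even this toy shows why the result is more than a script: the memory bank grows with every exchange, so later behaviour can be conditioned on what was said before.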
Amazon’s Alexa is a robot brain
While Alexa has no physical motion yet, it already has access to over 20 million robot brains. Amazon senior principal scientist Nikko Strom spoke at the AI NEXT tech conference and shared details on Alexa and Amazon's broader artificial intelligence initiatives.
Amazon devices are collecting valuable data every second of the day, and all this data is being stored in Amazon’s S3 (Amazon Simple Storage Service).
In an article by GeekWire, Strom said, “We train these models on AWS EC2 (Amazon Web Services Elastic Compute Cloud) instances.” The company has to use “distributed training” across 80 GPU (graphical processing unit) instances in order to crunch the massive amount of data it receives.
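The core idea behind that 'distributed training' is data parallelism: split the data across workers, compute a gradient on each shard, then average the gradients before updating the model. Here is a deliberately tiny, pure-Python illustration of that averaging step, with four simulated workers standing in for GPU instances; it is a conceptual sketch, not Amazon's actual pipeline.

```python
# Toy data-parallel training: 4 'workers' each compute a gradient on
# their own shard of the data, and the averaged gradient drives one
# synchronized update of a one-parameter model y = w * x.
# Conceptual sketch only, standing in for 80 GPU instances on EC2.

def shard_gradient(w, xs, ys):
    """Gradient of mean squared error for y = w * x on one data shard."""
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

# Synthetic training data generated with a true weight of 3.0.
xs = [0.1 * i for i in range(1, 41)]
ys = [3.0 * x for x in xs]
n_workers = 4
shard = len(xs) // n_workers

w = 0.0
for step in range(100):
    grads = [
        shard_gradient(w, xs[k * shard:(k + 1) * shard],
                          ys[k * shard:(k + 1) * shard])
        for k in range(n_workers)
    ]
    # Averaging equal-sized shard gradients equals the full-batch gradient,
    # so the synchronized update behaves like ordinary gradient descent.
    w -= 0.05 * sum(grads) / n_workers

print(round(w, 3))  # converges to the true weight, 3.0
```

With equal shard sizes the averaged gradient is mathematically identical to the full-batch gradient; at real scale the payoff is that each worker only ever touches its own slice of the "massive amount of data".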
Alexa’s Brain is in the Cloud
Alexa learns from all of its interactions and 'conversations', which are stored in the cloud; as Strom explained, the machines are learning.
Robots Are Already Here
Amazon's Alexa and Google Home's assistant are starting to be integrated into a new class of robot: the 'Robo-Butler'. Buddy, Jibo, Lynx, Kuri, Pepper, Sanbot Nano and Yumi are just a sample of the robo-butlers available on the market today.
At CES 2018, LG introduced its own series of consumer robots. Its hub robot is powered by the same software that runs Alexa, modified with more personality: it has an animated face and swivels its head when being spoken to.
Yumi Is an Android-Powered, Alexa-Enabled Robot for Your Home
Meet Yumi, an Android-powered robot equipped with a five-inch touchscreen for a head and a couple of wheels for legs, and featuring Amazon's Alexa Voice Service. (See the full feature over at Hackster.)
So robots are already in our homes.
A new report covered by TechRepublic claims the market for humanoid robots will expand more than tenfold by 2023: current estimates put its value at $320.3 million, and it is projected to reach $3.9 billion within the next six years.
Perhaps they are not yet the full 'humanoid version', but they are already here. With time, engineers will figure out the mechanics. As components become cheaper and batteries improve, we will begin to see more human-like versions of these robots, and as more 'humanoid robots' are introduced, Amazon will buy the technology and integrate it with the Alexa brain and the AWS Cloud.
The first generation of humanoid robots may not be I, Robot, but they will be better than today's robo-butlers, and they will be delivered to our homes. It may take a couple of decades, but they will come. “Eventually, you are going to see the humanoid-type of robot, like in Isaac Asimov’s book, I, Robot. That’s definitely going to happen,” Rob Coneybeer, managing director of Shasta Ventures, said in an interview with Fortune. “It’s still 20 or 25 years out, but I think that type of robot will fit into the framework of what we think of as our traditional living environments.”
Now excuse me, I need to go and watch Westworld!
- Designing Emotional Intelligence: A Conversation with Amazon’s Head of UX for Alexa Skills
- Why Robots Should be Utilitarian and Not Humanoid
- Roboticist Gives Alexa a Face
- Amazon’s Alexa Reference Hardware
- Impatient Futurist: Your Domestic Robot Servant Has Finally Arrived (in a Fashion)
- Enhance Your Audio Skill Visuals for Echo Show and Echo Spot
- Inside Amazon’s Artificial Intelligence Flywheel
- The 10 Algorithms Machine Learning Engineers Need to Know
- Will Your Next Best Friend Be A Robot?
- Alexa Skill Blueprints Mean Everyone Can Have a Personal Alexa Skill with No Coding