A Deep Dive into Amazon Hunches and Deep Device Embeddings

Customers may use Alexa to manage hundreds of different smart-home devices with just their voices, including plugs, lights, switches, door locks, cameras, and thermostats. However, the Alexa smart-home interface is more than just a voice-activated remote.

Hunches is an Alexa feature that reminds you when Alexa believes you have forgotten to turn off a light or lock a door. It has since grown to include a proactive experience called Automatic Actions, which turns off lights and adjusts thermostats when customers are sleeping or leaving the house.

Alexa now initiates one out of every four smart-home interactions — without customers having to say anything — thanks to predictive and proactive capabilities like Hunches and Routines.

Source: https://www.amazon.science/blog/the-science-behind-hunches-deep-device-embeddings

Taking preventive steps in the home is a responsibility that Amazon researchers take seriously, and the machine learning models must meet high user expectations. Multiple models make decisions before an action is launched, such as estimating light levels, forecasting whether customers have gone to sleep or left the house, and determining which devices should be controlled.


The team learns about a device from how it is named and grouped, as well as how it is used. In the state-data timelines shown in the original post, consider the durations of the “on” states of the example devices. According to the researchers, the typical bathroom-light pattern is brief on-spans throughout the day and longer spans in the evening.
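
To make the pattern concrete, on-state durations can be read off a toy event timeline; the device, events, and hours below are invented for illustration, not the researchers' data:

```python
from typing import List, Tuple

# Hypothetical state timeline for one device: (hour_of_day, new_state) events.
# Short daytime on-spans plus a longer evening span suggest a bathroom light.
events = [(7, "on"), (7.2, "off"), (12, "on"), (12.1, "off"),
          (19, "on"), (22, "off")]

def on_durations(events) -> List[Tuple[float, float]]:
    """Return (start_hour, duration_hours) for each contiguous on-span."""
    spans = []
    on_since = None
    for t, state in events:
        if state == "on":
            on_since = t
        elif state == "off" and on_since is not None:
            spans.append((on_since, round(t - on_since, 2)))
            on_since = None
    return spans

print(on_durations(events))  # [(7, 0.2), (12, 0.1), (19, 3)]
```

A porch light, by contrast, would show one long overnight span, which is exactly the kind of difference the models can pick up on.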

Another device, in the living room, is used mostly in the evenings. A third device is simply named “Plug 1,” but its usage closely tracks the living room’s patterns, so Alexa may deduce that this outlet is in the living room. It might power a Christmas tree’s lights, for example; if the model were confident of this, it could offer the customer a Routine that syncs the tree lights with holiday music.

Two factors make proactive routines challenging:

  • Sparsity: While some customers may have a fully connected smart home, others may only have an Echo Dot and a smart light bulb. The team would like to offer them Hunches as well.
  • Diversity: There are hundreds of thousands of different configurations of Echo devices, smart-home equipment, and their functions. Even if two homes are set up identically, users may use their devices in quite different ways. A device labeled “bedroom light” could be in the master or guest bedroom, or it could simply be a closet light.

Many of the services boil down to determining what state users want their devices to be in: both Automatic Actions and Hunches are driven by this question. Researchers use an encoder model to create a device representation — an embedding — that encodes the device’s prior states as well as the home’s configuration, including device names, device types, and device groupings. A decoder model can anticipate the device’s future state based on that embedding.
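
A rough sketch of that encoder-decoder setup follows, in NumPy; the dimensions, features, and untrained random weights are all invented stand-ins for the learned model, not Amazon's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy input for one device: recent hourly on/off states plus a few
# one-hot configuration features (name, type, group). All sizes are
# illustrative only.
state_history = rng.integers(0, 2, size=24).astype(float)  # last 24 hours
config = np.array([1.0, 0.0, 0.0, 1.0, 0.0])               # hypothetical config
x = np.concatenate([state_history, config])

# Encoder: a single dense layer mapping the input to a low-dimensional
# embedding (the "deep device embedding").
W_enc = rng.normal(scale=0.1, size=(8, x.size))
embedding = np.tanh(W_enc @ x)

# Decoder: predicts the probability that the device is "on" next hour.
w_dec = rng.normal(scale=0.1, size=8)
p_on_next = 1.0 / (1.0 + np.exp(-(w_dec @ embedding)))

print(embedding.shape, float(p_on_next))
```

In a real system the encoder and decoder would be trained jointly on historical state sequences, so that the embedding captures whatever is predictive of future state.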

However, predicting the home state does not directly answer questions such as whether a light should be turned off or a Hunch should be issued. For this purpose, researchers use a model trained to predict whether a Hunch will be accepted. The Hunch-acceptance model reuses the encoder from the home-state prediction model, which can compute embeddings per home or per device. The per-device embeddings are what they call deep device embeddings.
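
A minimal sketch of how an acceptance classifier might reuse such an encoder (again NumPy with made-up weights; `encode`, `w_accept`, and the 0.5 threshold are assumptions for illustration, not Amazon's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

def encode(x, W_enc):
    """Shared encoder, notionally borrowed from the home-state model."""
    return np.tanh(W_enc @ x)

# Pretend these encoder weights were already trained for state prediction.
W_enc = rng.normal(scale=0.1, size=(8, 29))
device_features = rng.random(29)   # toy device history + config vector

# Hunch-acceptance head: a separate linear classifier on the embedding.
w_accept = rng.normal(scale=0.1, size=8)

z = encode(device_features, W_enc)             # deep device embedding
p_accept = 1.0 / (1.0 + np.exp(-(w_accept @ z)))
issue_hunch = bool(p_accept > 0.5)             # only notify when likely accepted

print(round(float(p_accept), 3), issue_hunch)
```

Reusing the pretrained encoder is a standard transfer-learning move: the acceptance head gets a usage-aware representation without needing its own large training set.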

In this model, the embedding encoder’s weights represent information collected across customers; devices with similar usage patterns produce representations in the embedding space that are close to one another. Some bathroom lights, for example, may be identified by their device names or groupings, while others may be identified only by their usage patterns.

The team can abstract these specific features by using an embedding space.
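
“Close to one another in the embedding space” can be made concrete with cosine similarity; the three low-dimensional embeddings below are invented for illustration:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical embeddings: two bathroom lights with similar usage, one porch light.
bath_a = np.array([0.9, 0.1, 0.0])
bath_b = np.array([0.8, 0.2, 0.1])
porch  = np.array([0.0, 0.1, 0.95])

print(cosine(bath_a, bath_b))  # high: similar usage, nearby embeddings
print(cosine(bath_a, porch))   # low: different usage, far apart
```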

Deep device embeddings are effective for predicting Hunches, but the researchers also found that visualizing the embedding space reveals fascinating structure: it contains clusters to which the team can assign usage meanings. The clusters carry semantic information; for example, the isolated island of devices formed by the “outside” and “porch” clusters (blue and brown in the original post’s visualization) most likely consists of outdoor equipment that is left on for extended periods of time.

While the labels assigned to clusters are the most common names associated with them, they are not the only ones. In the outside cluster, for example, the team finds devices named “First Light.” The embeddings can thus tag devices not just by their names and groups but also by how they are used.
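
One simple way to derive such cluster labels is to take the most common word among the member devices' names; the cluster assignments and names below are invented (note that a “First Light” device can still land in the porch cluster by usage rather than by name):

```python
from collections import Counter

# Hypothetical cluster assignments (e.g., from k-means on the embeddings),
# paired with the customer-given names of the member devices.
cluster_members = {
    0: ["porch light", "outside lamp", "porch", "porch heater", "First Light"],
    1: ["bathroom light", "bathroom", "bathroom fan"],
}

def cluster_label(names):
    """Label a cluster with the most common word across its device names."""
    words = Counter(word.lower() for name in names for word in name.split())
    return words.most_common(1)[0][0]

labels = {c: cluster_label(names) for c, names in cluster_members.items()}
print(labels)  # {0: 'porch', 1: 'bathroom'}
```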

As a result, the device embedding space provides an automated mechanism to map device usage patterns in diverse homes into a common area that can be used to power services like Automatic Actions and Hunches.


Despite the convenience Hunches and Automatic Actions provide, experts believe there are still ways to make the smart home smarter.

For example, the team currently applies common roles across all customers’ device usage. While this is useful when data is scarce, the researchers would like to offer more personalized smart-home services. Deep house embeddings, a newer representation developed by the researchers, integrate data from all the devices in a home and allow the team to design proactive automation that considers the state of the home holistically.
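
The post does not describe how deep house embeddings are computed. As one hedged illustration, a home-level vector could be pooled from the per-device embeddings; mean pooling here is an assumption, and the learned representation is surely richer:

```python
import numpy as np

# Hypothetical per-device embeddings for one home (values invented).
device_embeddings = np.array([
    [0.9, 0.1, 0.0],   # bathroom light
    [0.1, 0.8, 0.1],   # bedroom thermostat
    [0.0, 0.1, 0.9],   # porch light
])

# Pool every device's embedding into a single home-level representation.
house_embedding = device_embeddings.mean(axis=0)
print(house_embedding.round(2))
```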

Most smart homes begin with one or two Internet-connected devices and gradually expand. The researchers are constantly looking for ways to automate repetitive chores, deliver relevant reminders that improve customer convenience, and make meaningful recommendations to all Alexa Smart Home users.


Nitish is a computer science undergraduate with a keen interest in deep learning. He has completed various deep-learning projects and closely follows new advancements in the field.