The coolest AI right now: from air gestures to intent detection

In the ever-changing landscape of technology, the rise of artificial intelligence (AI) has sparked a profound transformation in the way we interact with machines. Say goodbye to complex commands and intricate interfaces, and welcome simplicity and intuition as the new rulers of the tech realm. CES 2024 showcased this evolution in three exciting areas: air gestures, long-lasting conversations, and intent detection. Here, technology harmonizes seamlessly with the human experience.

Air gestures for a hands-free future

Imagine being able to control your smart home devices, browse the internet, or even play games without ever touching a screen or button. With air gestures, all of this becomes possible. By simply moving your hands in the air, you can command devices to perform a variety of tasks, from turning on the lights to changing the channel on the TV. This technology goes beyond traditional interface navigation, offering a more natural and immersive way to interact with machines.

At CES 2024, Singapore-based startup Neural Lab revealed AirTouch, a technology enabling users to control interfaces with simple gestures, using only a webcam and an app.

Our latest innovation at Baracoda is BMind, the first AI-powered smart mirror for mental wellness. It provides personalized recommendations and experiences based on a user's mental state, as part of a seamless, touchless experience combining gesture recognition, voice commands, and intent detection. Using a built-in camera (whether on a mirror, screen, mobile phone, or tablet), computer vision algorithms detect the user's hand, while an AI model recognizes the specific gesture. Each gesture is then mapped to a corresponding command or action. For example, a wave of the hand might trigger a "next" or "previous" command, while a thumbs-up gesture could initiate a "start" command.
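
To make that pipeline concrete, here is a minimal sketch in Python of how camera frames can flow through hand detection and gesture classification into commands. The `detect_hand` and `classify_gesture` helpers are hypothetical stand-ins for the computer vision and gesture models, not BMind's actual implementation.

```python
# Minimal sketch of a gesture-to-command pipeline.
# detect_hand and classify_gesture are hypothetical stand-ins,
# not BMind's actual models.
import cv2  # OpenCV, used here only for webcam capture

# Map recognized gesture labels to interface commands, as described above.
GESTURE_COMMANDS = {
    "wave_left": "previous",
    "wave_right": "next",
    "thumbs_up": "start",
}

def detect_hand(frame):
    """Hypothetical: return a cropped hand region, or None if no hand is found."""
    ...

def classify_gesture(hand_region):
    """Hypothetical: return a gesture label such as 'wave_left' or 'thumbs_up'."""
    ...

capture = cv2.VideoCapture(0)  # the device's built-in camera
while True:
    ok, frame = capture.read()
    if not ok:
        break
    hand = detect_hand(frame)
    if hand is not None:
        gesture = classify_gesture(hand)
        command = GESTURE_COMMANDS.get(gesture)
        if command:
            print(f"Executing command: {command}")
```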

BMind smart mirror

Finnish startup Doublepoint unveiled the WowMouse app, bringing air-gesture detection to Android watch wearers. The app turns a smartwatch into a mouse for controlling connected devices, using the watch's integrated accelerometer to detect hand movements. It's a promising idea for smartwatch manufacturers, and Mudra Band presented similar technology integrated into a wristband that lets users control their Apple devices. While this approach is more intrusive, requiring the user to wear a wristband or watch, it's another step towards transforming the way we use our devices. "We're rewriting the rules for human-computer interaction," emphasizes Doublepoint's CEO.
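
For a sense of how accelerometer data can drive a pointer, here is a rough sketch under simple assumptions (a dead zone to filter out tremor, a linear gain); it illustrates the general idea only, not Doublepoint's actual algorithm.

```python
# Rough sketch: turning wrist accelerometer readings into pointer movement.
# Illustrates the general idea only, not Doublepoint's actual algorithm.

DEAD_ZONE = 0.05    # ignore tiny tremors (in g) -- assumed threshold
SENSITIVITY = 40.0  # pixels per unit of acceleration -- assumed gain

def accel_to_cursor_delta(ax, ay):
    """Map lateral (ax) and vertical (ay) acceleration to a cursor delta."""
    dx = SENSITIVITY * ax if abs(ax) > DEAD_ZONE else 0.0
    dy = SENSITIVITY * ay if abs(ay) > DEAD_ZONE else 0.0
    return dx, dy

# Example: a short stream of (ax, ay) samples from the watch's accelerometer.
samples = [(0.01, 0.02), (0.3, -0.1), (-0.2, 0.4)]
x, y = 0.0, 0.0
for ax, ay in samples:
    dx, dy = accel_to_cursor_delta(ax, ay)
    x, y = x + dx, y + dy
    print(f"cursor at ({x:.0f}, {y:.0f})")
```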

WowMouse app by Doublepoint

The seamless integration of air gesture technology into various devices, such as smart TVs, home appliances, and automobiles, demonstrated the potential for a more intuitive and user-friendly future. More importantly, it could also be a boon for people with disabilities – older adults or individuals with mobility issues would only need to move their hands to control their home devices. Tech companies can use it as an opportunity to make their products and services more inclusive.

Long-lasting conversations: Devices as companions

Large language models (LLMs) are ushering in a new era of human-machine interaction, where devices are not just functional tools but also conversational companions. These models, trained on vast amounts of text and code, enable devices to understand and respond to human language in a natural and intuitive way.

CES 2024 showcased Intuition Robotics' ElliQ, a tabletop robot designed to provide support, engagement, and social connections for older individuals. Equipped with an LLM and machine learning algorithms, ElliQ can engage in meaningful conversations, offer personalized advice, and provide reminders and assistance with daily tasks. It leverages its understanding of the user's preferences, emotions, and routines to create a truly personalized experience.

ElliQ 3.0 by Intuition Robotics

Amazon's Alexa is evolving into more than just a voice-activated command system. With advances in LLMs, Alexa can now engage in extended conversations, answer complex questions, and provide real-time information and assistance. Its ability to remember previous interactions and preferences allows for more natural and context-rich conversations.
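
Under the hood, this kind of conversational memory is often achieved by resending the accumulated dialogue with every turn. Here is a minimal sketch; `llm_complete` is a hypothetical stand-in for whichever LLM endpoint the assistant uses.

```python
# Minimal sketch of context-rich conversation: the assistant "remembers"
# by resending the accumulated message history on every turn.
# llm_complete is a hypothetical stand-in for an actual LLM API call.

def llm_complete(messages):
    """Hypothetical: send the message list to an LLM and return its reply."""
    ...

history = [
    {"role": "system", "content": "You are a helpful home assistant."}
]

def chat(user_text):
    history.append({"role": "user", "content": user_text})
    reply = llm_complete(history)  # the model sees the whole conversation so far
    history.append({"role": "assistant", "content": reply})
    return reply

# Because earlier turns stay in `history`, a follow-up like "and tomorrow?"
# can be resolved against a previous question about today's weather.
```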

As our digital companions gather more information, our relationship with them will become more interactive. Having an AI assistant can be useful in various fields to automate tasks, speed up work, and provide an interactive and enjoyable experience.

AI, read my mind: Decoding intentions and signals

The boundaries of AI continue to expand, raising the intriguing question: could AI one day know our thoughts? One of the most exciting frontiers of AI research is the development of brain-computer interfaces (BCIs), which allow computers to directly communicate with the brain. This technology has the potential to revolutionize the lives of people with disabilities, restoring mobility and communication to those who have lost it.

At CES 2024, the French research organization CEA introduced Wimagine, a brain-computer interface that uses brain-recording implants and specialized AI algorithms to track and interpret a patient's intended movements. This technology has the potential to restore mobility to people who have lost the ability to move due to injury or disease. In a clinical trial, the Wimagine BCI allowed a paralyzed patient to control a robotic arm and perform simple tasks, such as reaching for objects and drinking from a cup.
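
Wimagine's real decoders are far more sophisticated, but the general pattern of intent decoding can be sketched simply: slice the recorded signal into windows, extract features, and classify each window as an intended movement. The classifier below is a placeholder, not CEA's model.

```python
# Toy sketch of intent decoding from recorded brain signals: window the
# multichannel signal, extract simple features, classify each window.
# The feature choice and classifier here are placeholders, not CEA's model.
import numpy as np

WINDOW = 200  # samples per window (e.g. 200 ms at 1 kHz) -- assumed
INTENTS = ["rest", "reach", "grasp"]

def extract_features(window):
    """Per-channel signal power: one simple, classic feature."""
    return np.mean(window ** 2, axis=1)

def decode(signal, classifier):
    """signal: (channels, samples) array; classifier maps features to an intent index."""
    intents = []
    for start in range(0, signal.shape[1] - WINDOW + 1, WINDOW):
        features = extract_features(signal[:, start:start + WINDOW])
        intents.append(INTENTS[classifier(features)])
    return intents

# Example with synthetic data and a placeholder classifier.
signal = np.random.randn(64, 1000)  # 64 electrodes, 1 second of data
dummy_classifier = lambda f: int(np.argmax(f[:3])) % len(INTENTS)
print(decode(signal, dummy_classifier))
```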

Elon Musk recently announced that his company Neuralink has implanted its brain-computer interface in a human patient for the first time. The technology consists of ultra-small electrode implants that record electrical signals from the brain and transmit them to an external device, enabling the patient to control that device with their thoughts alone.

Neuralink's first neural implant

Another example of AI being used to decode intentions and signals is Cappella, an app that uses AI and machine learning to decode a baby's cries. Cappella can identify different types of cries, such as hunger, tiredness, and pain, and can even distinguish between the cries of different babies. This technology has the potential to make it easier for parents to understand their babies' needs and to respond accordingly.
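
Cappella has not published its model, but audio classifiers of this kind commonly follow a standard recipe: compute spectral features from a short clip and feed them to a trained classifier. The sketch below uses a hypothetical `trained_model` as a stand-in.

```python
# Toy sketch of cry classification: spectral features from a short audio
# clip, fed to a trained classifier. Cappella's actual model is not public;
# trained_model is a hypothetical stand-in.
import numpy as np

LABELS = ["hunger", "tiredness", "pain"]
SAMPLE_RATE = 16_000  # assumed sampling rate

def spectral_features(clip):
    """Average magnitude per coarse frequency band: a crude spectral fingerprint."""
    spectrum = np.abs(np.fft.rfft(clip))
    bands = np.array_split(spectrum, 32)  # 32 coarse frequency bands
    return np.array([band.mean() for band in bands])

def classify_cry(clip, trained_model):
    features = spectral_features(clip)
    probabilities = trained_model(features)  # hypothetical trained classifier
    return LABELS[int(np.argmax(probabilities))]

# Example with one second of synthetic audio and a placeholder model.
clip = np.random.randn(SAMPLE_RATE)
placeholder_model = lambda f: np.ones(len(LABELS)) / len(LABELS)
print(classify_cry(clip, placeholder_model))
```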

From controlling devices with our minds to understanding the needs of infants, AI is opening up new possibilities for communication and interaction. As this technology continues to develop, we can expect to see even more remarkable applications of AI in the years to come. As we embrace this new era of AI-driven technology, we must also consider the ethical implications – privacy, security, and the potential for job displacement are just a few of the concerns we should keep in mind to ensure that AI serves us in a positive and sustainable way.

Building the future of your industry

Baracoda's expertise in developing smart technologies with AI seamlessly combines innovation strategy with software and hardware development to help companies become pioneers in their industries. We support you in developing next-generation tech that optimizes conversion and ROI, improves the customer journey, and extends across your entire ecosystem.

Whether you're looking for a turnkey solution or considering building a new tech platform from scratch, our experts work with you to define the project, develop the right infrastructure, and manage product lifecycle challenges.