“AI” was everywhere at CES this year. You couldn’t swing a badge without hitting some company claiming generative AI was going to revolutionize your sleep, your teeth, or your business. However, a few applications of machine learning stood out as genuinely useful or surprising – here are some examples of AI that can actually do some good.
The idea that AI at CES might not be a total wash first came to me when I spoke with Whispp at a press event. This small outfit works to give voice to the voiceless, meaning people who have difficulty speaking normally due to a condition or illness.
The name refers to conditions where a person is able to form words, but whose vocal cords have been taken out of the picture, for example by throat cancer or injury. These people can whisper just fine, but they can’t speak – they often have to rely on a decidedly last-century electronic voice box. So Whispp’s first big feature is synthesizing their voices and turning those whispers into full speech.
The synthetic voice is created much the same way it is on other platforms: given a few old recordings of someone, you can tune a voice model to sound very much like them. Whispp’s main challenge, it seems, was building a speech recognition model that works well with whispers and other affected speech. Interestingly, this also works for people who stutter, since for whatever reason whispering often reduces stuttering greatly. You can read more about Whispp in our post about it here.
Around the corner from Whispp I ran into the extremely cheerful women of Louise, a French startup focused on fertility monitoring and advice for both men and women who want to improve their chances of conceiving.
Louise is very much a B2B affair, working with hospitals and fertility clinics to analyze patient data. It uses machine learning as a signal detector, sorting through thousands of data points from trials and research and looking for biomarkers that could provide insight into the complex process of improving fertility. Artificial intelligence is good at finding subtle correlations or patterns in large collections of data, and fertility is certainly one area that could benefit the most from this.
The company was actually at CES to promote its new app, Olly, which is its first B2C offering: an end-to-end “fertility journey” app for women and men (whose role in the process is often overlooked), covering everything from the decision to pursue it through to success. It tracks appointments, offers documentation on medications and strategies, and so on. And the icon is a cute little chick. Olly is scheduled for worldwide release on February 14.
The Rabbit r1 got plenty of hype at CES, as a candy-colored pocket AI assistant should. However, while it’s anyone’s guess whether the company will survive long enough to reproduce (the r2, one assumes), the capabilities of this little doodad may actually be more useful to the visually impaired than to the sighted who just don’t want to take out their phones.
Alexa, Siri, and other voice assistants have been transformative for countless people for whom navigating a smartphone or desktop computer is a pain due to their primarily graphical user interface. Being able to just talk and get basic information like weather, news and so on is a huge feature.
The problem is that these so-called helpers couldn’t do much outside of some strictly defined tasks and APIs. So you might be able to find out when a flight departs, but you can’t rebook it. You can book a car but not adjust the route for accessibility. You can look up vacation destinations, but not rent a beach cabin there. Things like that. The r1 is built not only to handle basic assistant queries through a ChatGPT-style voice interface, but also to operate any normal phone or web application.
If the device and service live up to the company’s claims, the r1 could indeed be a useful helper for anyone struggling to interact with a traditional computer. If you can talk, you can get things done — and if you use Whispp, you don’t even have to talk!
Elderly care is another area where some of the common criticisms of generative AI fall short. I don’t think anyone should rely on a computer for companionship, but that doesn’t mean the computers we already interact with couldn’t be a little better at it. Although I’ve made it clear (at length) that I don’t think they should pretend to be human, they can still be friendly and helpful.
ElliQ makes devices (“robots”) for places like assisted-care facilities, where having a gadget in the room that can remind patients of things or ask them how they’re doing is a worthwhile addition. The latest device uses a large language model to produce more natural conversation and, like ChatGPT and others, can talk about anything. Many seniors are starved for conversation or struggle to keep up with new developments in technology or the news, so this could be a really good way to stay engaged. Additionally, it adds an element of care and safety: it can listen for the person calling for help, relay requests to caregivers, or simply check in.
These aren’t all the good applications of AI at CES, but this sampling is enough to show that while much of the hype may be just that, it doesn’t mean there aren’t good ways to apply machine learning to everyday life.