
An interview with Dr. Rana el Kaliouby, Deputy CEO at Smart Eye

We were excited to catch up with Dr. Rana el Kaliouby, Deputy CEO at Smart Eye and Founder of Affectiva. Rana joined us at our inaugural InCabin event on 15th September 2022 with a session on ‘Humanising the in-cabin experience’. Continue reading to learn more.

1. You’re joining us for our brand-new conference, InCabin, this September in Brussels. Why is in-cabin monitoring such a hot topic right now?

Over the past several years, we’ve seen Driver Monitoring Systems (DMS) go from nice-to-have to must-have technology in cars. In-cabin sensing is the natural evolution of DMS and is seeing increased demand because of its many important applications in safety as well as entertainment, comfort, wellness and beyond. Instead of just focusing on the driver, interior sensing solutions have a full view of the entire cabin.  With a camera and other sensors, these systems provide human-centric insight into what’s happening in a vehicle, by detecting the state of the cabin and that of the driver and passengers in it.

In-cabin sensing is such a hot topic because it not only allows OEMs to meet the regulatory and rating requirements that are on the horizon, but also sets them up to differentiate their brands in a very competitive market.  In addition, car manufacturers can expand on the cameras, other sensors and machine learning-based algorithms that are already deployed for DMS. Building on that foundation, they can often cost-effectively add advanced safety features and engaging mobility experiences to existing platforms.

2. We’d love to hear a bit more about you – how did you get to be where you are, and can you tell us more about your role at Smart Eye?

I grew up in the Middle East. My Egyptian parents very much valued the education of their daughters, and contrary to what societal norms dictated, I studied computer science, and then went on to do my Ph.D. at Cambridge University.  It was there that I found my life’s mission of humanizing technology and started to lay the foundation for my pioneering work in artificial emotional intelligence (known today as the field of Emotion AI). I joined the MIT Media Lab as a postdoc to advance Emotion AI and explore its applications in many industries, for example in areas like health, autism and automotive.  We started to see commercial interest increasing, so in 2009, I co-founded Affectiva together with Dr. Rosalind Picard, and we spun out of MIT.

I eventually became Affectiva’s CEO. During that time, we raised over $50 million in venture capital for the company, we created the new technology category of Emotion AI and we became recognized as the leaders in this space.  We brought several products to market.  In the media & entertainment industry we served 28% of the Fortune Global 500, and in automotive we launched the first ever AI-based interior sensing solution: Affectiva Automotive AI.  Then last year (in 2021), we got acquired by Smart Eye and merged our two companies.

Today at Smart Eye I am Deputy CEO. In that role I collaborate with CEO Martin Krantz on defining and executing our strategy. I oversee our AI roadmap and ensure that our technology is built with ethics and DEI in mind. One of the things I love doing most is meeting with our OEM clients to brainstorm with them and educate them on the many critical applications of our Interior Sensing AI.  I always find these meetings so energizing!

Last, but not least, I try to free up time to pay it forward. It took me a few years to come to this realization, but I see now that as a female Egyptian-American scientist, entrepreneur, investor, author and AI thought leader, I have a unique profile in the technology and automotive industries, which are still overwhelmingly white and male.  I firmly believe that diversity fosters innovation and I want to be a voice for change. If I can be a role model and inspire young women (and men) of diverse backgrounds to follow their passions, just like I did, then change will follow.

3. We’re thrilled that you will be speaking at InCabin about humanizing the in-cabin experience.  Can you give us a sneak peek into the use cases that you are most excited about?

First and foremost, our focus should always be on advancing automotive technologies to save lives and I love how interior sensing enables several advanced safety features. One important functionality is child and child seat detection, which helps determine if a child is left behind in a vehicle unattended. I am hopeful that this can help avoid tragic deaths due to vehicular heatstroke. And, as anyone who has driven around with kids – or for that matter, pets – will know, they often cause distraction, so child and pet detection can provide important inputs to promote safer driving behavior. 

In addition, I have always been very interested in the intersection of healthcare and automotive. So, I am glad to see that, for example through Euro NCAP, attention is being given to sudden sickness.  What if interior sensing AI could detect driver impairment due to a medical event? For example, we already have early-stage capabilities for detecting heart rate variability using computer vision, and I am eager to see these types of technologies advance so that they can actually be deployed in vehicles.

Occupancy detection, which determines how many people are in the cabin and where they are sitting, is another interesting area of functionality. This too can help improve existing safety functionality, as it helps determine proper seat position and seat belt usage and can provide important analysis for airbag deployment.

I am also very interested in the “experience” features that in-cabin sensing will enable. The Emotion AI technology that we created at Affectiva, now part of Smart Eye, also unlocks massively interesting personalization.  By understanding the emotional and cognitive states of people in a vehicle we can adapt the environment to their needs in the moment. For example, if someone is getting drowsy, the lighting dims, the heating turns up and soothing music plays quietly in the background, creating a comfortable and restorative environment.

There are also fascinating backseat use cases around content recommendations, and even the monetization of advertising data.  Understanding passenger reactions and emotional engagement with music and video content can help refine content recommendations, making these more relevant to the user and their experience.  Where advertising is deployed, understanding viewer engagement with that content provides OEMs and advertisers with very valuable data.  I think users will be interested in opting in to that, if there is an incentive for them.  Imagine a scenario where you and your friends take a rideshare to a concert and engage with advertising in the back of the car.  In return, since the system knows you are taking a ride to a concert, it offers you a discount coupon for a band t-shirt or the like.

And, to wrap this up, a fun use case often requested by OEMs is “trip highlights”, where the Emotion AI detects smiles and joy so that an interior camera can take cool pictures at just the right moment. There is a lot of fun we can add to the overall experience with the right applications of this technology.

4. You recently celebrated 1 year since Smart Eye acquired Affectiva. What has been achieved in this first year, and what can we expect in the future?

It’s been an exciting year in which we achieved a lot. This acquisition was really a merger of two equals. Our vision, technology and teams were very complementary, and together we have set out to build a global AI powerhouse that can bring next-generation interior sensing to market better and faster than any other technology company. This is an ambitious goal, I admit, and it is certainly not easy, but we have made great progress.

First and foremost, we integrated our two teams.  Not a trivial feat when you consider that we now have departments spread across multiple offices, time zones and cultures. And we have integrated our technology stacks.  I am so excited that we now have an integrated demo that combines Smart Eye’s legacy driver monitoring with Affectiva’s Emotion AI-based interior sensing.  At InCabin in Brussels we will demo our Interior Sensing AI, showing driver monitoring, cabin monitoring and occupant monitoring features working together seamlessly. Make sure you come see that at our booth.

In the future, you will see our technology deployed in many more car models.  We already have 94 design wins from 14 OEMs for our driver monitoring technology and are being asked to bid on the broader interior sensing-focused RFQs that are now coming out.  And, of course, you will see Smart Eye adding more safety and experience features to our portfolio.

5. Smart Eye is conducting research on how machine learning and computer vision-based systems can be advanced to detect alcohol intoxication in drivers. How do you actually do this, given that these AI-based algorithms need real-world training data?

For me, intoxication detection is the next frontier in automotive safety systems. According to the World Health Organization, every year 1.3 million people die in road crashes around the world, and more than 20% of these fatalities are estimated to be alcohol-related. For over a century we have seen different strategies to combat drunk driving, from ignition interlocks to roadside alcohol tests, yet none of them has made a big enough difference in preventing accidents caused by intoxicated driving. I think we would all agree that this is a serious societal problem in need of much better solutions.

Today critical research is being conducted by government bodies, the automotive industry, academia and technology companies such as Smart Eye to determine more effective approaches. I believe that in-vehicle sensors and machine-learning based algorithms, similar to the ones we use for DMS, have a lot of potential to better detect impaired and drunk driving versus more traditional methods. As with any AI-based technology, we need large amounts of data to train these algorithms and improve performance. And, as you can imagine, collecting data of drunk drivers in moving vehicles is a very complicated thing to do.

At Smart Eye, we have been doing quite a bit of research on detecting drunk driving.  One interesting study is called Fit 2 Drive, a collaboration between Smart Eye and the Swedish National Road and Transport Research Institute (VTI). So far this project has collected data from more than 30 participants while we gradually increased their blood alcohol concentration. We then let the participants drive a car equipped with multiple sensors on an enclosed racetrack. This was our first collection of data that captures visual information from the driver, giving valuable insight into how different people behave at different levels of intoxication.

Getting this study off the ground was not easy. In Sweden, driving while intoxicated is illegal even in very controlled situations on an enclosed test track. To conduct the study, we had to get special permission from the Swedish government. But despite the complicated processes of collecting data from drunk drivers, these types of studies are absolutely necessary for future research on driver impairment.  If you would like to learn more about how we collected data from drunk drivers, have a look at this video.

As an industry we are certainly in the early stages of this type of research. However, driver monitoring systems that can detect alcohol intoxication offer real hope that drunk driving is a problem that can be solved with better technology.

6. The conversation around in-cabin technology is growing rapidly – how do you anticipate this impacting development and deployment to market?

We are already seeing OEMs issuing RFQs for in-cabin sensing beyond driver monitoring.  This will accelerate the development of functionality and we can expect to see this technology deployed in vehicles on the road in the next several years. This is only the beginning.  OEMs will initially implement basic in-cabin features and will expand those over time. 

 

Now is really the right time for the community to come together at InCabin. This thought-provoking event will offer a forum where industry leaders can have critical conversations and collaborate to advance automotive safety and mobility experiences.  I am excited to be part of this important work.

If you like what you read, join us in Phoenix this March to find out more about automotive interior intelligence.
Don’t miss out! https://auto-sens.com/incabin/pass
