Virtual sales meetings have made it difficult for salespeople to read the room. So some well-funded tech providers are stepping in with a bold sales proposition of their own: AI can not only help sellers communicate better, but also detect the “emotional state” of a deal, and of the people they're selling to.
In fact, while AI researchers have tried for decades to instill human emotions into cold, computational machines, sales and customer service software companies including Uniphore and Sybill are building products that use AI in an effort to help humans understand and respond to other people's emotions. Zoom also plans to provide similar features in the future.
“It’s very difficult to build rapport in a relationship in this kind of environment,” said Tim Harris, director of product marketing at Uniphore, of virtual meetings. The company sells software that attempts to detect whether a potential customer is interested in what the salesperson has to say during a video call, and alerts the salesperson in real time if someone appears to be more or less engaged with a particular topic.
The system, called Q for Sales, may indicate that a prospect’s sentiment or level of engagement improved when a salesperson mentioned a particular product feature, but then regressed when the price was mentioned. One competitor, Sybill, is also using artificial intelligence in an effort to analyze people’s moods during a call.
Uniphore’s software incorporates computer vision, speech recognition, natural language processing and emotion AI to capture behavioral cues such as someone’s tone of voice, eye and face movements and other nonverbal body language, and then analyzes that data to assess how that person is feeling.
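To make the idea of combining several behavioral signals into one reading more concrete, here is an illustrative sketch. It is not Uniphore's actual model; the signal names, weights, and score ranges are all assumptions chosen for the example. It shows the general shape of "multimodal fusion": per-modality scores (face, voice, language), each in [-1, 1], blended into a single sentiment reading for a slice of the call.

```python
# Toy multimodal fusion sketch -- NOT Uniphore's actual system.
# Each modality contributes a score in [-1, 1]; a weighted average
# produces one combined sentiment reading per time window.

from dataclasses import dataclass

@dataclass
class FrameSignals:
    face: float   # e.g., smile/frown score from computer vision
    voice: float  # e.g., tone/energy score from speech analysis
    text: float   # e.g., sentiment of the transcribed words

# Hypothetical weights; a real system would learn these from labeled data.
WEIGHTS = {"face": 0.4, "voice": 0.3, "text": 0.3}

def fuse(signals: FrameSignals) -> float:
    """Weighted average of per-modality scores, clamped to [-1, 1]."""
    score = (WEIGHTS["face"] * signals.face
             + WEIGHTS["voice"] * signals.voice
             + WEIGHTS["text"] * signals.text)
    return max(-1.0, min(1.0, score))

if __name__ == "__main__":
    # A moment where the face looks positive but the words skew negative.
    frame = FrameSignals(face=0.8, voice=0.2, text=-0.5)
    print(round(fuse(frame), 2))
```

The interesting design question such systems face is exactly the one this toy version dodges: how much to trust each channel when they disagree, as when a prospect smiles while voicing an objection.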
A real-time digital scorecard.
Displayed alongside the video feed of someone on camera during a virtual meeting, the Q for Sales app depicts emotion through fluctuating metrics indicating detected levels of sentiment and engagement, based on the system’s interpretation of signals of satisfaction, happiness, engagement, surprise, anger, disgust, fear or sadness. The software requires the video call to be recorded, and it can only assess someone’s feelings when the customer, or room full of potential clients, and the salesperson all agree to record.
Although Harris said Uniphore does not build individual profiles based on the data it captures and produces, its software does provide data that he says indicates the “emotional state of a deal,” based on the sentiment and engagement of all the purchasing-committee members present at meetings across the timeline of discussions with that potential client.
Always be… recording?
Just asking to record a virtual conversation can change a customer’s attitude, said Grace Briscoe, senior vice president of customer development at digital advertising firm Basis Technologies. “Once that recording notification pops up, it puts people on alert,” she said. “I think it will be off-putting to clients; they will be less candid. I don’t think it will be helpful for the kind of relationship-building that we want to do.”
Josh Dulberger, head of product, data and AI at Zoom, said that while some participants in sales meetings may be uncomfortable being recorded, others will be more open to it. “Part of it is the culture of the sales team,” he said, noting that recording may not be tolerated when selling into more sensitive industries such as financial services.
Zoom, the king of virtual meetings, said Wednesday that it is introducing a new set of features called Zoom IQ for Sales that provides sales meeting hosts with post-meeting conversation transcripts and sentiment analysis. Although some AI-based transcription services have been known to make mistakes, Dulberger said Zoom’s system was built in-house using automated speaker recognition and natural language understanding. The system integrates with Salesforce.
“We are looking at things like speaker rhythm and other linguistic factors to try to separate one speaker from another,” Dulberger said.
Currently, Zoom’s new features for salespeople do not assess emotions in real time during a meeting. Instead, they provide post-meeting analysis. For example, Dulberger said that an interaction might be categorized as “low engagement” if a potential customer didn’t talk much.
“You’ll be able to gauge whether they weren’t engaged,” he said, noting that salespeople aim for balanced conversations in which customers talk as much as the salesperson does.
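A talk-ratio metric like the one described is simple to compute once a call has been split by speaker. The sketch below is an assumption-laden illustration, not Zoom's implementation: the input format, the "customer" label, and the 35% threshold are all hypothetical choices for the example.

```python
# Illustrative talk-ratio metric -- not Zoom's actual implementation.
# Input: diarized call segments as (speaker, seconds) tuples.

def talk_ratio(segments):
    """Return the customer's share of total talk time, in [0, 1]."""
    total = sum(sec for _, sec in segments)
    customer = sum(sec for who, sec in segments if who == "customer")
    return customer / total if total else 0.0

def engagement_label(ratio, low=0.35):
    # Hypothetical threshold: flag calls where the customer spoke
    # less than roughly a third of the time as "low engagement."
    return "low engagement" if ratio < low else "balanced"

if __name__ == "__main__":
    call = [("rep", 300), ("customer", 60), ("rep", 240)]
    r = talk_ratio(call)
    print(round(r, 2), engagement_label(r))  # customer spoke 10% of the call
```

A ratio near 0.5 would correspond to the "balanced conversation" goal the article describes, with the customer talking about as much as the salesperson.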
Frustration detected. Show empathy.
Sentiment analysis is nothing new. Since the early days of social media, marketers have pulled text from posts, tweets and product reviews and analyzed its content to help determine what people think about consumer brands, restaurants or political candidates. Today, help desk and call center software uses voice recognition and natural language processing AI to prompt customer service representatives to speak more slowly or show more energy. For example, Amazon has partnered with Salesforce to offer sentiment analysis in apps used by customer service agents, and a product from Cogito uses in-call voice analysis to assess the emotional state of callers or service reps.
“Frustration detected. Show empathy,” reads an example alert displayed on Cogito’s website.
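The text-scraping form of sentiment analysis the passage opens with can be as simple as counting words against a lexicon. The sketch below is a deliberately minimal stand-in: the word lists are tiny invented samples, where production systems use large curated lexicons or trained models.

```python
# Minimal lexicon-based sentiment scorer -- an illustration of the
# decades-old technique used on social posts and reviews, not any
# vendor's product. The word lists are tiny stand-ins.

POSITIVE = {"great", "love", "helpful", "happy", "excellent"}
NEGATIVE = {"bad", "hate", "slow", "frustrated", "terrible"}

def sentiment(text: str) -> int:
    """Positive-word count minus negative-word count.
    > 0 reads as positive, < 0 as negative, 0 as neutral."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos - neg

if __name__ == "__main__":
    print(sentiment("Love the product, support was excellent!"))   # positive
    print(sentiment("Frustrated by slow, terrible service."))      # negative
```

The in-call systems described above go further, scoring audio features rather than just words, but the output is the same kind of rolling positive/negative signal.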
Questionable artificial intelligence to train basic human skills
But what companies like Uniphore, which recently raised a $400 million Series E round at a $2.5 billion valuation, and Sybill are doing goes well beyond customer service calls. Uniphore and Sybill aim to monitor human behavior during video calls in real time. They are betting that even seasoned salespeople can benefit from emotion AI coaching.
Dulberger said Zoom also has active research underway to integrate emotion AI into the company’s products in the future. He cited research that he said shows improvements in AI used to detect people’s emotions, including a study of technology that separates facial images from background imagery that can confuse computers, and a new data set combining facial expressions with physiological signals such as heart rate and body temperature, plus self-reported emotions.
“These are informative cues that can be useful; emotion-based metrics can be added to provide sales reps with a richer understanding of what happened during a sales meeting, for example by detecting that ‘we think sentiment went south in this part of the call,’” Dulberger said.
Briscoe said she recognized the potential value of emotion-based techniques as management tools to help identify salespeople who may be struggling. However, she said, “Companies should hire people with some level of emotional intelligence. If the people on our team can’t read that someone has lost interest, those are basic human skills. I don’t know why you need AI [to facilitate that].”
Even if emotional AI guidance is attractive to some sales teams, its validity is called into question.
“The claim that a person’s internal state can be accurately assessed by analyzing that person’s face is based on shaky evidence,” wrote Kate Crawford, an AI ethics researcher, research professor at USC Annenberg and senior principal researcher at Microsoft Research, in a 2021 article in The Atlantic. In that article, Crawford cites a 2019 research paper that states: “Available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than what would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation.”
“We are able to look at faces and categorize them into the various emotional expressions defined by psychologists that are pretty standard out there,” said Patrick Ehlen, vice president of AI at Uniphore.
You may be smiling and nodding, and in fact, you are thinking about your vacation next week.
Ehlen said the technology developed by Uniphore uses the same cues that people use to infer what others are thinking or feeling, such as facial expressions, body language and tone of voice. “We strive to do it like a human does,” he said. Uniphore’s software incorporates the computer vision and human sentiment analysis technology the company acquired when it bought Emotion Research Labs in 2021 for an undisclosed price.
Ehlen said Uniphore’s AI model was trained using proprietary and open-source data sets featuring images of people from diverse ethnic groups. Some of that data came from actual sales meetings the company held. To help the machine figure out which facial cues represent which types of emotions, the image data was labeled by people hired by Uniphore to make those annotations according to a set of guidelines created by the company, which were then adjusted based on whether the annotators’ labels agreed to certain thresholds.
“Going forward, there is always room to improve these things as the system gets into more hands at larger scale,” said Ehlen. The company is also conducting a study to validate the software.
But Ehlen acknowledged the limits of the technology. “There is no real objective way to measure people’s feelings,” he said. “You could be smiling and nodding, and in fact, you’re thinking about your vacation next week.”