FCAT RESEARCH
New Data Fuels AI Opportunities in a Remote World
By: SARAH HOFFMAN | MARCH 17, 2021
Now that so many of us are doing almost everything online at home -- shopping, work, doctor appointments, school, financial check-ins, parent-teacher conferences -- you may have noticed some not-so-subtle behavior changes among those around you. Maybe a colleague has suddenly started blocking her video feed during Zoom calls. Perhaps a customer has started speaking more slowly, making a lot of spelling errors, or typing at a different pace. Maybe you’ve noticed a change in the tone of a colleague’s MS Teams or Yammer posts, or in the style of a customer’s chatbot messages. All of this could signal something important, and AI is an ideal tool for picking up on these changes.

Of course, when we think about AI monitoring this kind of behavior, we call it corporate surveillance, and a slew of dystopian scenarios comes to mind: businesses snooping to find out who’s been working and who’s been playing hooky; data gathered to determine who gets fired, gets a raise, or gets promoted. But it doesn’t have to be this way. With all the new behavioral data these systems now have access to, a far more beneficial application of AI is feasible, one that could vastly improve:

Employee wellbeing. Managers know well the concerns raised by an employee who suddenly starts sending a lot of emails at 3 in the morning. AI systems can not only flag worrisome off-hours activity but also indicate when Zoom fatigue is likely or when a much-deserved break from a given task is a good idea. Changes in facial expressions can indicate even subtle mood changes, and AI systems could suggest taking a short walk, getting a cup of coffee, or even turning the next meeting into an audio-only or walking meeting. Some services, like Receptiviti’s, plug into email and messaging systems like Slack to search for signals that employees are depressed or burned out. Other companies, like Cornerstone OnDemand, go further and are experimenting with heart-rate data from wearables like smartwatches. They’re exploring ways to tie this data to entries from a person’s calendar or project-management software to determine whether certain meetings, projects, or even people correlate with elevated stress levels.1
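
A heavily simplified sketch of what such an off-hours flag could look like: compare the latest week’s share of late-night messages against the employee’s own baseline. The working-hours window, the jump threshold, and the toy data below are illustrative assumptions, not any vendor’s actual method.

```python
# Illustrative off-hours-activity flag; all thresholds are assumptions.
from datetime import datetime
from statistics import mean

WORK_START, WORK_END = 8, 18  # assumed local working hours

def off_hours_fraction(timestamps):
    """Fraction of messages sent outside the assumed working hours."""
    if not timestamps:
        return 0.0
    late = sum(1 for t in timestamps if not WORK_START <= t.hour < WORK_END)
    return late / len(timestamps)

def flag_shift(weekly_timestamps, jump=0.25):
    """Flag when the latest week's off-hours share jumps well above the
    average of earlier weeks (the employee's own baseline)."""
    fractions = [off_hours_fraction(week) for week in weekly_timestamps]
    baseline, latest = mean(fractions[:-1]), fractions[-1]
    return latest - baseline > jump, baseline, latest

# Toy data: three weeks of daytime email, then a week of 3 a.m. messages.
weeks = [
    [datetime(2021, 3, d, 10) for d in range(1, 6)],
    [datetime(2021, 3, d, 14) for d in range(8, 13)],
    [datetime(2021, 3, d, 11) for d in range(15, 20)],
    [datetime(2021, 3, d, 3) for d in range(22, 27)],
]
flagged, baseline, latest = flag_shift(weeks)
print(f"flag={flagged} baseline={baseline:.2f} latest={latest:.2f}")
```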

Customer experience. As more of our conversations with customers go digital, we can look at calls, video meetings, and chatbot conversations in new ways, watching for changes in customer behaviors and emotions to better understand their context and needs. We can detect in real time over a video call whether we’re losing a customer’s attention and prompt representatives to reengage them when needed. Automated summaries of video meetings could help prioritize product enhancements based on unmet needs, and transcripts of customer video calls can reveal more about what customers want, and what they find annoying or confusing. Changes in customers’ behaviors, whether in their voice, expressions, or keystrokes, can also be used to help customers who may be aging into certain disabilities. Researchers are studying whether AI tools that analyze typing speed and speech could better identify people with early-stage dementia.2
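
To make the attention cue concrete, here is a minimal sketch, assuming some upstream vision model already emits a per-frame attention score between 0 and 1; the window size, threshold, and toy score stream are all illustrative assumptions.

```python
# Illustrative real-time reengagement prompt over assumed attention scores.
from collections import deque

class AttentionMonitor:
    def __init__(self, window=30, threshold=0.5):
        self.scores = deque(maxlen=window)  # rolling window of recent frames
        self.threshold = threshold

    def update(self, score: float) -> bool:
        """Add one frame's attention score; return True when the rolling
        average drops low enough to prompt the representative."""
        self.scores.append(score)
        full = len(self.scores) == self.scores.maxlen
        avg = sum(self.scores) / len(self.scores)
        return full and avg < self.threshold

# Toy stream: attentive at first, then drifting away.
monitor = AttentionMonitor(window=5, threshold=0.5)
stream = [0.9, 0.8, 0.9, 0.4, 0.3, 0.2, 0.3, 0.2]
for i, score in enumerate(stream):
    if monitor.update(score):
        print(f"frame {i}: nudge the representative to reengage")
```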

Fraud detection. These digital conversations with customers can also surface other shifts in customer behavior. Changes in word choice or unusual syntax over chat could indicate that a customer’s identity is being used without their consent. ProctorU, an online exam-monitoring service, uses facial recognition software to match students to the image on their ID and verifies their identities with a typing test that confirms the speed and rhythm of a student’s keystrokes.3 Some insurance companies are using voice analytics to detect whether a customer is telling the truth when submitting a claim.4
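
A minimal sketch of the typing-test idea, assuming we only have raw keystroke timestamps: compare a session’s average inter-key gap against an enrolled profile. The distance measure and tolerance are illustrative assumptions, not ProctorU’s actual method (real keystroke-dynamics systems use much richer features).

```python
# Illustrative keystroke-rhythm check; thresholds are assumptions.
from statistics import mean

def intervals(key_times):
    """Inter-key intervals (seconds) from raw keystroke timestamps."""
    return [b - a for a, b in zip(key_times, key_times[1:])]

def rhythm_matches(enrolled_times, session_times, tolerance=0.05):
    """True when the session's average inter-key gap stays within
    `tolerance` seconds of the enrolled profile's average."""
    return abs(mean(intervals(enrolled_times)) -
               mean(intervals(session_times))) <= tolerance

# Toy data: the enrolled typist averages ~0.12 s between keys.
enrolled = [0.00, 0.12, 0.25, 0.36, 0.49]
genuine  = [0.00, 0.11, 0.24, 0.35, 0.47]
imposter = [0.00, 0.30, 0.55, 0.85, 1.10]
print(rhythm_matches(enrolled, genuine))   # True
print(rhythm_matches(enrolled, imposter))  # False
```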

Inclusive design. There may also be ways to use this technology to increase inclusion for employees and customers. For those who have difficulty focusing, an AI prompt could nudge the people they’re meeting with to repeat things that may have been missed. People who have difficulty hearing often make a facial expression that communicates their confusion; AI can read that expression and nudge a contact center agent to speak louder or shift to another interface, like text chat. Other expressions from a listener could signal that the speaker needs to slow down or explain things in simpler terms. All of this could be done through AI nudges sent in real time.
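
As a toy sketch of that nudge pipeline, the last step could be a simple mapping from a detected listener expression to an agent-facing suggestion. The expression labels and nudge wording below are hypothetical; a real system would take labels from an upstream vision model.

```python
# Illustrative expression-to-nudge dispatch; labels are hypothetical.
from typing import Optional

NUDGES = {
    "confused": "The customer may have missed that; repeat it or speak up.",
    "straining": "Consider shifting this conversation to text chat.",
    "overwhelmed": "Slow down and explain the last point in simpler terms.",
}

def nudge_for(expression: str) -> Optional[str]:
    """Map a detected listener expression to an agent-facing nudge."""
    return NUDGES.get(expression)

for label in ["neutral", "confused", "overwhelmed"]:
    tip = nudge_for(label)
    if tip:
        print(f"{label}: {tip}")
```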

Questions to Consider

Even with the best of intentions, deploying this kind of technology can be tricky. Companies need to carefully consider their approach to using the abundance of data we now have, especially in regard to privacy.

Do people want this? Different cultures and age groups may have different tolerances for this kind of relationship with an employer. Many employees are feeling more stressed and exhausted due to expanded corporate surveillance but are hesitant to speak up.5 In April 2020, Zoom removed an “attention tracking” setting, which alerted a call host when a participant was focused elsewhere, following a public uproar over its invasiveness.6 Schools faced a major backlash after using cheating-detection software that tracked eye and head movements while students took their exams remotely due to the coronavirus. Some students were so afraid that the testing system would brand them as cheaters that they cried from the stress, threw up in trash cans, and even urinated at their desks; students with dark skin shone bright lights at their own faces, worried that the systems wouldn’t recognize them.7

How do we monitor this? Clearly, for any level of buy-in, communication on this front is essential; we need transparent descriptions of what is being collected and how it’s used. But beyond that, how do we oversee performance? Using facial expressions to identify emotions sounds promising, but some argue that expressions don’t map to emotions consistently across cultures and contexts.8 One person may scowl when angry; another might smile politely. Someone may look downward as a sign of respect; another person may do so out of shyness. In December 2018, a study showed that emotion-detection technology assigned more negative emotions to Black men’s faces than to white men’s.9 In 2016, an algorithm rejected an Asian man’s passport photo for having his eyes “closed”.10 Perhaps video-analytics features will make similar mistakes. Do companies need an AI ethics board to oversee this?

Can we trust this data? As these technologies become more mainstream, people are getting more adept at faking the data they generate. A recent analysis found that companies are adapting the language in their forecasts, SEC regulatory filings, and earnings calls in response to the rise of AI that analyzes their words and derives signals from them.11 And this isn’t stopping with earnings calls. Before Zoom removed its “attention tracking” setting, people could easily fool the system by using a second device or by making sure to return to the Zoom window before 30 seconds passed. Presence Scheduler, which can set your Slack status as permanently active, saw sales and traffic double at the beginning of the pandemic (until Slack closed the loophole).12 To make people appear more engaged, Microsoft added a feature to its Surface Pro X that adjusts your eye position so it looks like you’re looking at the camera when you’re actually looking at on-screen faces.13 Photoshop’s Neural Filters let users change facial expressions, strengthening or reducing feelings like joy, surprise, or anger.14 It’s not hard to imagine this technology being incorporated into videoconferencing.

1 Cutter, C., & Feintzeig, R. (2020). Smile! Your boss is tracking your happiness. The Wall Street Journal.
https://www.wsj.com/articles/smile-your-boss-is-tracking-your-happiness-11583255617
2 Wang, S. (2020). AI May Help Identify Patients with Early-Stage Dementia. The Wall Street Journal.
https://www.wsj.com/articles/ai-may-help-identify-patients-with-early-stage-dementia-11604329922
3 Harwell, D. (2020). Mass school closures in the wake of the coronavirus are driving a new wave of student surveillance. Washington Post.
https://www.washingtonpost.com/technology/2020/04/01/online-proctoring-college-exams-coronavirus/
4 McCormick, J. (2019). What AI Can Tell from Listening to You. The Wall Street Journal.
https://www.wsj.com/articles/what-ai-can-tell-from-listening-to-you-11554169408
5 Harwell, D. (2020). Managers Turn to Surveillance Software, Always-on Webcams to Ensure Employees are (Really) Working From Home. Washington Post.
https://www.washingtonpost.com/technology/2020/04/30/work-from-home-surveillance/
6 Ibid.
7 Harwell, D. (2020). Cheating-Detection Companies Made Millions During the Pandemic. Now Students are Fighting Back. Washington Post.
https://www.washingtonpost.com/technology/2020/11/12/test-monitoring-student-revolt/
8 Schwartz, O. (2019). Don’t look now: why you should be worried about machines reading your emotions. The Guardian.
https://www.theguardian.com/technology/2019/mar/06/facial-recognition-software-emotional-science
9 Rhue, L. (2019). Emotion-reading tech fails the racial bias test. The Conversation.
https://theconversation.com/emotion-reading-tech-fails-the-racial-bias-test-108404
10 Cheng, S. (2016). An algorithm rejected an Asian man’s passport photo for having ‘closed eyes’. Quartz.
https://qz.com/857122/an-algorithm-rejected-an-asian-mans-passport-photo-for-having-closed-eyes/
11 Cao, S., Jiang, W., Yang, B., & Zhang, A. L. (2020). How to Talk When a Machine is Listening: Corporate Disclosure in the Age of AI. National Bureau of Economic Research.
https://www.nber.org/papers/w27950
12 Christian, A. (2020). Bosses started spying on remote workers. Now they're fighting back. Wired.
https://www.wired.co.uk/article/work-from-home-surveillance-software
13 Protalinski, E. (2019). Microsoft’s AI-powered eye gaze tech is exclusive to the Surface Pro X. VentureBeat.
https://venturebeat.com/2019/10/03/microsofts-ai-powered-eye-gaze-tech-is-exclusive-to-the-surface-pro-x/
14 Horwitz, J. (2020). Adobe’s Photoshop Neural Filters use AI to change faces, recolor photos. VentureBeat.
https://venturebeat.com/2020/10/20/adobes-photoshop-neural-filters-use-ai-to-change-faces-recolor-photos/