Can Crisis Text Line Help Improve Workplace Culture?

Crisis Text Line, a nonprofit that offers emotional support through text messaging, has spent four years connecting people in extreme emotional distress with online counselors. Now its founder is creating a startup called Loris.ai to help companies teach employees how to communicate.

“There are a lot of companies right now that are fearful of having hard conversations,” says Nancy Lublin, the founder of both Crisis Text Line and Loris.ai. “Managers are nervous having a one-on-one meeting with a direct report of a different gender, and that holds women back. People worrying about inclusion worry they’ll get it wrong, and that holds people back.”

Silicon Valley companies have been in the crosshairs of a nationwide debate over fairness and inclusivity after a string of well-publicized incidents related to workplace culture. Uber engineer Susan Fowler publicized complaints of discrimination; Google fired engineer James Damore over an internal post debating gender differences in ability; and lawsuits alleged discrimination at a Tesla factory. Against that backdrop, Lublin saw an opportunity to use what she had learned at her nonprofit to try to meet that need.

With Crisis Text Line, a person in a state of distress can send a text to 741741 and begin to chat with a volunteer counselor. To date the nonprofit group has processed 61 million messages between people in crisis and its counselors, producing an extensive database of emotionally charged conversations. Last year it announced a partnership with Facebook to bring crisis support to Messenger, increasing the service’s reach and its pool of data.

Using machine learning and other data analysis, Lublin’s staff have gleaned hints as to what kinds of crisis counseling work best. For example, they discovered that the words “smart,” “proud,” “brave,” and “impressive” tend to have a strong positive effect on people’s moods—what Lublin calls “magic words.” They also learned that when a person says they are feeling overwhelmed, effective responses tend to use the word “strong.” As her staff unearthed these clues, they incorporated them into the training for the organization’s approximately 12,000 counselors, honing their skills as the database of conversations grew.

Lublin says that she started getting inquiries from companies and a law-enforcement agency asking if her group could also train their employees on how to navigate difficult conversations. “I thought, that’s a great idea, but our not-for-profit doesn’t do that,” she recalls. She started contemplating how to build a software service that would codify the insights Crisis Text Line had gleaned from its data into empathy lessons. Companies that use Loris.ai will gain access to software that coaches users on how to navigate workplace drama. She named it after the slow loris, a cute little animal with a toxic bite. “If it bites you, it could kill you,” Lublin explains. “Just like, if you get [hard conversations] wrong, it can kill careers or companies.”

The software will offer training to help with such situations as a salesperson dealing with an angry customer, a boss giving sensitive feedback to an underling, or an employee needing guidance on how to interact with LGBTQ colleagues, Lublin says. She argues that past training efforts have typically relied on academic research or hunches, whereas Loris.ai uses data. Yet it’s unclear whether data drawn from conversations with people in acute emotional distress translates to most office scenarios.

“I’d hope that the workplace doesn’t have the intensity of life or death,” says Nicole Sanchez, CEO and founder of Vaya Consulting, which works with tech companies to improve their workplace culture. She adds, however, that people are now more open to discussing the toll of toxic work environments on mental health. “As people are ceasing to be silent about some of the horrific things that have happened in the workplace, there’s an obvious relationship between Crisis Text Line and these companies.”

Yet software can only do so much to promote inclusivity. It is unlikely to help unless a company’s top leaders support changing the culture and the software is supplemented with in-person training, she says. “How do you take something so personal, so intimate, like a conversation about identity, and scale it via technology—this is a very difficult thing to do,” Sanchez says.

Ann Miura-Ko, a partner at venture capital firm Floodgate, says Crisis Text Line’s training focuses on building trust, which applies equally to less fraught situations. “Those very same techniques that create empathy between a complete stranger and a person in an absolute crisis moment will actually be very effective in creating empathy between two employees or a customer-support person and an angry customer,” she says. Floodgate led the $2 million seed round that Loris raised.

Miura-Ko notes that Loris’s training becomes more useful if the software continues to observe a person’s interactions and suggests changes to behavior. It could monitor a customer-service worker’s language, for example, and offer alternate words or phrasings. “The best companies will have artificial intelligence and machine learning tightly integrated into your workflow and effectively give you superpowers,” Miura-Ko says. “One of those superpowers is very effective conversation.”

Lublin has donated her equity in Loris.ai to Crisis Text Line, so that the startup’s financial success will help fund the nonprofit’s work.
