This Earnings Season Robots Will Be Hanging On… Your… Every… Word

By Christopher Westfall
For CFOs and their finance staffs, earnings calls bring their own level of stress as teams try to anticipate what analysts and shareholders may quiz them on during the quarterly conversation.
But the earnings season that kicks off this month will bring a new level of scrutiny, as firms employing artificial intelligence (AI) systems plan to parse literally every syllable, pregnant pause and inflection of an earnings call in an attempt to squeeze out what financial preparers really “mean.”
Asset managers and hedge funds are leveraging AI in a new push to extract value out of the billions of pieces of “unstructured data” (data without a predefined format) released every day by financial preparers in the form of earnings calls, press releases and assorted financial disclosures.
“The asset management industry is going in the direction of collecting and extracting information from alternative data sets as the amount of unstructured data (i.e., heavy in text) grows exponentially on the internet,” says Frank Zhao, a Director with S&P Global Market Intelligence’s Quantamental Research group. “Investors, especially quantitative investors and even some fundamental investors, are starting to explore these new data sets en masse.”
Fog Indexes, Heatmaps and Tone
The technology being implemented on earnings calls is dubbed natural language processing (NLP); it has been around for decades and is used in consumer applications like dictation software and online assistants such as Apple’s Siri.
However, Zhao points out that NLP is being increasingly embraced by asset managers and hedge funds as a way to understand if managers are trying to skew conversations with analysts and investors, focusing on the upside while avoiding harder financial reporting issues.

For example, one way NLP can be utilized is to create a “sentiment analysis” that compares the proportion of negative words in a stock’s earnings call (decline, weakness, debt) to the number of positive words (progress, better, success), to glean meaning beyond a particular quarter’s headline number.
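In spirit, that kind of word-count sentiment score can be sketched in a few lines of Python. The word lists and transcript below are toy examples for illustration; production systems rely on curated finance-specific dictionaries and full transcript feeds.

```python
# Toy word lists drawn from the examples above; real systems use
# much larger, finance-specific dictionaries.
NEGATIVE = {"decline", "weakness", "debt"}
POSITIVE = {"progress", "better", "success"}

def call_sentiment(transcript: str) -> float:
    """Return the share of sentiment-bearing words that are negative."""
    # Normalize: lowercase and strip surrounding punctuation from each token.
    words = [w.strip(".,;:!?\"'()").lower() for w in transcript.split()]
    neg = sum(w in NEGATIVE for w in words)
    pos = sum(w in POSITIVE for w in words)
    total = neg + pos
    return neg / total if total else 0.0

# Hypothetical snippet of an earnings call.
transcript = ("We saw real progress this quarter and better margins, "
              "despite some weakness in Europe and higher debt costs.")
print(call_sentiment(transcript))  # 0.5: two negative words vs. two positive
```

A score near 1.0 would flag a call dominated by negative language regardless of the headline earnings number.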
NLP is being used in several different ways on earnings calls, according to an S&P Capital IQ report. For example, it can be used to:
  • Determine the language complexity of earnings calls using a “fog index” to determine if managers wanted “to soften bad news.”
  • Create a “heatmap” of industry sentiment trends based on recent earnings calls, revealing if a particular industry is facing similar performance challenges.
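The “fog index” in the first bullet refers to the Gunning fog index, a standard readability measure that rises with sentence length and the share of long words. A minimal sketch (using a rough vowel-group heuristic for syllable counting, which real implementations refine):

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of vowels as syllables.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fog_index(text: str) -> float:
    """Gunning fog index: 0.4 * (avg sentence length + % complex words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words if count_syllables(w) >= 3]
    return 0.4 * (len(words) / len(sentences)
                  + 100 * len(complex_words) / len(words))

# Hypothetical comparison: plain vs. hedged management language.
plain = "Sales fell. Costs rose. We missed our goal."
foggy = ("Notwithstanding unanticipated macroeconomic headwinds, "
         "comparability considerations complicated our performance narrative.")
print(fog_index(plain) < fog_index(foggy))  # True: hedged text scores far higher
```

A higher score suggests denser, harder-to-read language, which is why researchers use it as a proxy for managers trying “to soften bad news.”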
The overall use of NLP on earnings calls allows the user to determine the “tone” of the preparer, Zhao adds, explaining that feeding this information into larger AI systems gives investors insight into what preparers are really thinking.
“Investors are quantifying the tone of a firm’s earnings call by systematically going through earnings call transcripts and use the derived tones as an additional piece of information in their future investment decision process,” Zhao says.
What’s Next for AI, Data and Financial Disclosures? All of It.
As the push for financial disclosure has increased, unstructured data is being created at an exponential rate by thousands of companies every day. As the chart below reveals, the rise in unstructured data globally has already skyrocketed.

But that does not mean that senior-level financial executives should expect the investor community to stop at the earnings call.
Despite the amount of data being “created faster than it can be processed,” S&P Global’s Zhao argues that with greater computing power investors will look to other types of financial disclosures in addition to earnings call transcripts.
“In my opinion given the interest on the topics of alternative data and machine learning, investors will move to financial filings like the 10Ks and, in fact, they are starting already,” Zhao says.