In a turn of one-upmanship plausible only in an irreversibly social-media-dependent generation, Thomson Reuters has seemingly responded to Bloomberg’s bold move last year by adding select Twitter feeds to its terminal. The company has brought not just Twitter feeds to its Eikon platform but also a technology called sentiment analysis.
A massive amount of arbitrage stands to be gained by scraping data from the 140-character-per-tweet babble of millions of people about the health and future of any company. It’s no longer controversial that the messaging site Twitter holds certain keys to real-time insight into market shifts, and that sometimes this insight is fresher and better than that obtainable from conventional news sources, economic indicators and companies themselves.
Powered by automated sentiment analysis, Eikon now attempts to assess the Twittersphere’s reaction to the fortunes and misfortunes of publicly listed companies and to reduce that chatter to a series of quantifiable metrics that can be tracked over time. Eikon visualizes those metrics — the aggregate feelings of sometimes millions of users (Twitter has about 250 million active users) — in terms of relative positivity or negativity. Want to know how Twitter users feel about Twitter (the company)? Ask Eikon to show you a handsome bar graph. Make your trades accordingly, or at least with the confidence that you’ve had access to trends you might otherwise not have seen.
Sentiment analysis tools break down text and read its tone and affect, determining what is being talked about and how whoever wrote it felt about the subject. As a whole, this part of the financial technology industry’s mission is to take everything that people anywhere put online, in any language, and convert it from the qualitative to the quantitative — from an inconceivably enormous mass of human conversation and feeling into numbers. And once you’ve got numbers, you can start using them to produce models and run algorithms until the results are legible enough for human beings to understand, act and trade on.
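At its crudest, that qualitative-to-quantitative conversion can be sketched as lexicon-based scoring: count positive and negative words and normalize the difference. The word lists below are illustrative assumptions, and commercial systems like Eikon's are far more sophisticated, but the sketch shows how text becomes a number.

```python
# A minimal sketch of lexicon-based sentiment scoring. The word lists
# and scoring scheme are illustrative assumptions, not Thomson
# Reuters's actual methodology.

POSITIVE = {"beat", "surge", "bullish", "upgrade", "strong", "growth"}
NEGATIVE = {"miss", "plunge", "bearish", "downgrade", "weak", "lawsuit"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1.0, 1.0]: negative, neutral or positive."""
    words = [w.strip(".,!?#$") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos + neg == 0:
        return 0.0  # no opinion words found: neutral
    return (pos - neg) / (pos + neg)

print(sentiment_score("$TWTR beat estimates, strong growth"))        # 1.0
print(sentiment_score("Analysts downgrade $TWTR after weak quarter"))  # -1.0
```

Real systems replace the word lists with trained language models, but the output contract is the same: free text in, a tradeable number out.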
Although leading edge for a firm of its size, Thomson Reuters is arriving late to an already crowded space. Dozens of apps and the companies behind them already specialize in gathering and selling social sentiment analysis to a financial audience. Entire start-ups have built their businesses — not just a feature of a broader system — on structuring data from social media. Institutional Investor profiled one such firm, Dataminr, late last year.
Many firms today look beyond the by now well-combed postings on Twitter and Facebook to the farthest reaches of the blogosphere, consumer review sites and search engine queries — all of which get subjected to sentiment analytics by third-party providers catering to the financial services sector.
Late last year Boulder, Colorado–based Socialgist was the first to broker a deal with Beijing-based social network Renren, the Facebook of China, for access to its social data; Socialgist analyzes and sells that information to global brands to help target their marketing strategies on- and offline. That same intimate information about the habits, purchases and opinions of 194 million consumers in one of the world’s fastest-growing affluent markets is of obvious worth to analysts. Among its other products, Socialgist also constructs tickers that display sentiment data from Twitter — a service Thomson Reuters is only now bringing to its clients. Whereas Thomson Reuters draws from Twitter, Socialgist canvasses a broader range of platforms and message boards that might serve as advance predictors of price movements in high-volatility stocks.
In other words, institutional finance — or at least its big players and their service providers — seems perpetually late to the big data party. But late is often inconsequential when you have the distribution that smaller start-ups lack, as Thomson Reuters certainly does.
In a recent column I discussed the rise to prominence and influence of analysts on Twitter, a new breed of independent stock watchers commenting on the markets free of any institutional context. It can still be difficult for financiers to accept the informality and the unpredictability of tweets as reliable sources of tips, trends or news; they seem too far removed from thorough, thickly textured conventional research.
Eikon addresses this limitation by abstracting the information in a large series of tweets into something traders can better relate to: a quantified score of general sentiment. Eikon ingests the Twitter API-level feed and applies analytics to raw data that traders would otherwise need programming knowledge to grapple with; it also incorporates scores generated by influence thermometer Klout into its evaluations. The application can seek out significant or notionally trustworthy human actors as well as engage the flux of data on its own terms. This creates a certain kind of meaningful data, but it also suggests that Thomson Reuters expects its customers to feel reassured that the best human opinions (or at least the opinions of those humans other humans think are important) are weighed amid the chaotic chatter of anonymous users whose institutional affiliations and intentions are unknown. That Eikon uses Klout to inform its analysis implies it knows its customers hesitate to surrender to numbers derived entirely by machines from faceless users — many of whom, it has recently been shown, are themselves machines.
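Folding influence into the score amounts to a weighted average: each tweet's sentiment counts in proportion to its author's influence rating. The 0-to-100 scale and tuple format below are illustrative assumptions (Klout used a 0-100 scale, but how Eikon actually combines the two is not public):

```python
# A sketch of influence-weighted sentiment aggregation. Each tweet is a
# (sentiment, influence) pair: sentiment in [-1, 1], influence in
# [0, 100], Klout-style. The weighting rule is an illustrative
# assumption, not Eikon's actual formula.

def weighted_sentiment(tweets):
    """Return the influence-weighted mean sentiment of (score, weight) pairs."""
    total_weight = sum(influence for _, influence in tweets)
    if total_weight == 0:
        return 0.0  # no weighted voices: neutral
    return sum(score * influence for score, influence in tweets) / total_weight

# One influential bull outweighs two minor voices:
sample = [(1.0, 80), (-1.0, 10), (0.5, 10)]
print(weighted_sentiment(sample))  # 0.75
```

The design choice here mirrors the reassurance the column describes: an anonymous account with negligible influence barely moves the needle, while a widely followed analyst's tweet dominates the aggregate.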
Thomson Reuters focuses on the ticker as the unit of analysis and has modified its terminal so you can ask it how the broad user base of Twitter thinks a certain stock will perform (using a reductionist, if clear, bullish-or-bearish dichotomy). Eikon’s new feature is emblematic of the long-overdue mainstreaming of big data. (Among tech commentators, Thomson Reuters’s embrace of Twitter mining is expected to raise the value of Twitter’s data as the microblogging service considers how to sell it.)
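Reducing per-ticker chatter to that bullish-or-bearish verdict might be sketched as follows; the cashtag parsing and decision thresholds are hypothetical, not Eikon's actual rules:

```python
# A sketch of reducing scored tweets to a per-ticker bullish/bearish
# verdict. Cashtag extraction ("$TWTR") and the +/-0.1 thresholds are
# illustrative assumptions.
from collections import defaultdict

def classify_tickers(scored_tweets):
    """scored_tweets: iterable of (text, sentiment score in [-1, 1])."""
    scores_by_ticker = defaultdict(list)
    for text, score in scored_tweets:
        for word in text.split():
            if word.startswith("$") and word[1:].isalpha():
                scores_by_ticker[word[1:].upper()].append(score)
    verdicts = {}
    for ticker, scores in scores_by_ticker.items():
        mean = sum(scores) / len(scores)
        if mean > 0.1:
            verdicts[ticker] = "bullish"
        elif mean < -0.1:
            verdicts[ticker] = "bearish"
        else:
            verdicts[ticker] = "neutral"
    return verdicts

tweets = [("$TWTR looking great", 0.9),
          ("$TWTR overvalued", -0.5),
          ("$AAPL meh", 0.0)]
print(classify_tickers(tweets))  # {'TWTR': 'bullish', 'AAPL': 'neutral'}
```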
But Eikon’s new iteration may be simply a far more commoditized version of what already exists at the digital edge of technology-powered quant firms today. Firms like Two Sigma are widely reputed to be mining the Twitterverse already, as are likely dozens of others. These hedge fund firms aren’t afraid of ever-increasing automation: The best engineers across all sectors insist on automating every process they can to compress research cycles, reduce the cost of labor-intensive processes and derive insights through rapid analysis of large-scale datasets, driving harder and faster at every next phase of innovation.
Commoditized news and sentiment analysis tools like Eikon — built at a fraction of the cost of current quant hedge fund social media analytics infrastructures — disseminate advanced analytics to a wider user base, one that extends beyond traditional hedge funds. Better, faster visualization; smarter trend-spotting; the ability to learn from users’ habits and from the patterns implicit in social media data and adjust analytic parameters accordingly — tomorrow’s financial professionals will take these abilities for granted just as their predecessors counted yield curves and high-speed quotes among the necessary tools of the trade.