'Big data' is still only a little helpful

By Leonid Bershidsky, Bloomberg View

Published August 11, 2016

"Big data" is one of the tech world's ubiquitous buzzwords. In the old days, people just called it data, but in Silicon Valley it's not a thing unless it's big. It's not yet obvious, however, that data collected by various internet services is any more useful than those mined in more traditional ways -- through surveys, for example.

As consumers, we know the data collected by big internet companies can be used to push goods and services -- mostly things very much like those we have already purchased. They can also be studied to explain our behavior on social networks: Facebook's data science team does a lot of that kind of work on its own and with academic institutions. As Joonas Tuhkuri of the University of Helsinki put it in a recent paper, "Google data is one of the largest data sets ever collected. Forecasters and researchers alike need to know how useful it actually is." In other words, can this information say anything about real-world phenomena?

Seminal research in this area came from Hal Varian, Google's own chief economist, and Hyunyoung Choi, another employee of the search giant. They showed that trends in Google queries are good at "predicting the present," or "nowcasting." Actual data on, say, car sales, vacation destinations or unemployment claims takes time to collect and process, so it's released a month or so after the data collection period ends. One can, however, track the share of related search queries and predict the official numbers before they are released. Varian and Choi found that using the query data in prediction models dramatically improves "nowcasting" quality.
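
To make the mechanics concrete, here is a minimal Python sketch of the idea, not Varian and Choi's actual specification: a synthetic monthly series is nowcast once from its own previous value alone and once with a same-month "query index" added, and the errors of the two versions are compared. Every number and variable name below is invented for illustration.

    # A minimal sketch of nowcasting with a search-query signal.
    # All data are synthetic and the variable names are illustrative;
    # this is not Varian and Choi's actual model.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 120  # ten years of monthly observations

    # Synthetic "official" series (say, car sales), published with a lag.
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = 0.8 * y[t - 1] + rng.normal()

    # Synthetic query index: available immediately, noisily tracking
    # the same month's not-yet-published official number.
    q = y + rng.normal(0, 0.5, n)

    # Predict y[t] from y[t-1] alone, then from y[t-1] plus q[t].
    y_lag, y_now, q_now = y[:-1], y[1:], q[1:]
    split = 90  # fit on the first 90 months, evaluate on the rest

    def fit_predict(X_train, y_train, X_test):
        X_train = np.column_stack([np.ones(len(X_train)), X_train])
        X_test = np.column_stack([np.ones(len(X_test)), X_test])
        beta = np.linalg.lstsq(X_train, y_train, rcond=None)[0]
        return X_test @ beta

    base = fit_predict(y_lag[:split, None], y_now[:split], y_lag[split:, None])
    both = np.column_stack([y_lag, q_now])
    aug = fit_predict(both[:split], y_now[:split], both[split:])

    rmse = lambda e: np.sqrt(np.mean(e ** 2))
    print("baseline RMSE:        ", rmse(y_now[split:] - base))
    print("with query index RMSE:", rmse(y_now[split:] - aug))

On these made-up numbers the query-augmented regression comes out with the smaller error, which is the entire appeal of the approach; with real series, the gain depends on how closely the chosen queries actually track the target.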

Google's interest in offering these new tools to mainstream academics is clear: Once the approach becomes accepted, central banks and governments will start using it, consolidating the company's centrality to the economy and policy-making. A Bank of England study has already confirmed the relevance of search data as an economic predictor.

So far, of all conventional economic indicators, unemployment has lent itself most readily to prediction from Google Trends. Tuhkuri and his team at the Research Institute of the Finnish Economy have even turned their Google Trends-based model into a product. The description of ETLAnow, the institute's unemployment forecast for European countries, says the frequency of Google queries such as "unemployment benefits" or "labor market subsidy" in 22 European languages helps improve the accuracy of three-month jobless rate forecasts by as much as 39 percent.

The difficulty of using search data, of course, is that people use all kinds of search terms to ask the same thing. It's relatively easy with unemployment, where the relevant search terms are limited; it gets much harder when more variety is introduced. Varian and Choi, for example, looked for terms that predict consumer sentiment and found it was highly correlated with queries that fell into Google's "crime and justice" category. The Google scientists honestly admitted they couldn't understand the connection.
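
One common workaround, again sketched only as an illustration, is to standardize several related query series and average them into a single composite index before feeding it to a model. The term list below is invented; choosing and validating such a list is precisely the hard part described above.

    # A hedged illustration of combining many phrasings of the same idea
    # into one composite index. The query series are synthetic; a real
    # study would pull them from Google Trends and validate the term list.
    import numpy as np

    rng = np.random.default_rng(1)
    months = 120
    related_queries = {
        "unemployment benefits": rng.normal(50, 10, months),
        "jobless claims": rng.normal(30, 8, months),
        "how to file for unemployment": rng.normal(20, 5, months),
    }

    # Standardize each series so no single phrasing dominates, then average.
    standardized = [(v - v.mean()) / v.std() for v in related_queries.values()]
    composite_index = np.mean(standardized, axis=0)

    # The composite could now stand in for the single query series q
    # in the nowcasting sketch above.
    print(composite_index[:6])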

Another Google data scientist, Seth Stephens-Davidowitz, based his Harvard doctoral thesis on three studies of search queries; the one on electoral bias against black candidates was widely quoted in the press. It relied on the frequency of searches for the n-word and similar derogatory terms to gauge how racist an area of the U.S. was. Stephens-Davidowitz found that more "racially charged" searches meant a higher Republican turnout, a lower Democratic one and worse results for Barack Obama in the 2012 election. I was interested enough in this to run some back-of-the-envelope calculations at the height of the primary season and found a nice positive correlation between "racial searches" and Donald Trump's results. Yet I'm not sure the correlations are better predictors of liberal and unorthodox candidates' election results in the U.S. than are previous voting patterns. I'm not even sure -- and neither is Stephens-Davidowitz -- what the search patterns mean. If I search for a racially charged term, does that make me a racist?

Another Stephens-Davidowitz study appeared to show a connection between the damage the recent recession did to specific parts of the U.S. and child abuse, reflected in queries such as "dad hit me." The frequency of such queries appears to go up as economic conditions worsen, but the rate of reported abuse goes down. It's intuitively understandable: In tough economic times, kids and their mothers may be more tolerant of fathers' inability to control themselves because there's a clear reason for it. But I wonder if the conclusion is useful at all, except to a government willing to install a system of filters that would catch the "dad hit me" queries and immediately dispatch police to the homes where they originate.

The noisy, imprecise data collected by internet giants clearly contain lots of jewels if you know how to sift for them. Yet it's not easy to see how the sifting can be much more useful than traditional prediction methods, such as polling and modeling on the basis of historical data. For example, ETLAnow's unemployment rate forecast for the European Union as a whole is 8.89 percent in June and 8.9 percent in September. Bloomberg's consensus forecast, based on analyst predictions, is 8.7 percent for the second quarter, which ends in June, and 8.6 percent for the third quarter, ending in September. The difference isn't particularly significant.

Tuhkuri and his team wrote in the ETLAnow description that their tool was especially good around turning points, making it a useful measuring instrument during economic crises. "Big data" is probably better at reflecting unexpected changes than traditional models and polls could ever be: The models use historical information and pollsters don't always know what to ask about. Yet this property of internet data suggests a different approach to studying it: Researchers could probably use it to find out what to look for. A trending search term could prompt a survey question or a whole line of investigation. Trendspotting is more fun than nowcasting, anyway.

Leonid Bershidsky, a Bloomberg View contributor, is a Berlin-based writer.
