Too Much Data Strikes a Blow to Tableau

Tableau has historically been a leader in data visualization, but latency issues stemming from cloud deployment could hinder its ability to handle big data analytics.

All the data in the world is useless unless someone can analyze it. Tableau has led the charge in helping companies derive richer insights from the increasingly complex data sets generated by digitization. Tableau allows businesses’ internal teams to import data sets and use easy drag-and-drop features to create graphs, charts and tables. Historically, customers kept their proprietary data on their own physical servers (“on prem”), and Tableau’s software was installed similarly in order to improve connectivity.[1] As companies have begun to move away from capital-intensive server space into scalable, subscription cloud servers, Tableau has updated its operating model to include a cloud-based offering as well.[2]

By 2020, depending on whom you ask, there will be 20-30 billion connected devices comprising the “Internet of Things.”[3] These devices will generate exponentially more data than today’s sources, allowing businesses to better understand their customers through deeper and more robust analytics. In some cases, this data will be quite easy to understand – if my smart refrigerator tells Amazon I’m running low on eggs, it wouldn’t take a marketing genius to send me a reminder to buy eggs or to replenish them automatically through a subscription. However, in areas like the complex digital marketing sphere, deriving customer insights amid the sheer volume of noisy, real-time data (e.g., location data, health data, routine data, preference data) will be increasingly difficult and will require business intelligence tools to adapt.

Tableau’s in-memory engine is not equipped to store big data itself, so the software instead connects to cloud servers such as Amazon Web Services or Microsoft Azure[4] – 70% of Tableau’s connections are to cloud-deployed data sources.[5] However, using cloud-based data storage rather than on-prem storage adds two layers of latency to Tableau’s analytics: first, the latency inherent in any Internet-based connection relative to a hardwired local one – or “cloud service latency”[6]; second, the required translation of data between Tableau’s software and the non-native storage supplier.
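To make the compounding concrete, here is a back-of-the-envelope sketch of how per-query overhead adds up as a dashboard issues more data pulls. This is not Tableau’s actual architecture, and the millisecond figures are invented assumptions chosen only for illustration:

```python
# Illustrative sketch only: how per-query latency compounds with query volume.
# All millisecond figures below are hypothetical assumptions, not measurements.

ON_PREM_MS = 2        # assumed round trip to a local, hardwired data store
CLOUD_RTT_MS = 50     # assumed network round trip to a cloud-hosted store
TRANSLATION_MS = 20   # assumed cost of translating between the BI tool's
                      # format and a non-native storage layer

def total_latency_ms(queries: int, on_prem: bool) -> int:
    """Total added latency for a dashboard issuing `queries` data pulls."""
    per_query = ON_PREM_MS if on_prem else CLOUD_RTT_MS + TRANSLATION_MS
    return queries * per_query

# A real-time dashboard issuing 1,000 pulls:
print(total_latency_ms(1_000, on_prem=True))   # 2,000 ms total
print(total_latency_ms(1_000, on_prem=False))  # 70,000 ms total
```

Even with these made-up numbers, the point stands: a fixed per-query penalty that is negligible for a handful of pulls becomes the dominant cost once real-time, high-volume analytics multiply the number of pulls.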

These two issues align quite well with the effect referred to as “Data Gravity.” In a blog post coining the phrase, Dave McCrory states: “As data accumulates (builds mass) there is a greater likelihood that additional services and applications will be attracted to this data. This is the same effect gravity has on objects around a planet. As the mass or density increases, so does the strength of gravitational pull.” [7] The effect of the Internet of Things on business intelligence tools feeds directly into this framework – as exponentially more data is fed into companies’ databases, the “mass” of the data stored on cloud servers increases. As a result, Tableau’s business model and position as a non-integrated analytics layer will become increasingly weak whether it is deployed on prem or in the cloud. As the volume of data increases and real-time analytics for techniques such as location-based targeting require more and more pulls, the latency will add up.

The Data Gravity effect implies that applications and services will move toward the location of data storage in order to reduce that latency and increase throughput. Since data is increasingly being stored in Amazon’s and Microsoft’s clouds, it would make sense to build applications and services as close to those clouds as possible. Lo and behold – they are doing just that. Amazon’s QuickSight and Microsoft’s Power BI are business intelligence tools aimed specifically at big data analytics run on their respective clouds.[8] Microsoft’s Power BI is already on par with Tableau on Gartner’s Magic Quadrant, beating it on completeness of vision.[9] While these tools are better positioned to solve some of the latency and throughput issues, they serve the double benefit of making each company’s cloud storage customers stickier.

To eliminate these fundamental business model issues, Tableau would have to offer its own big data storage capabilities. This would be extremely capital intensive, and Tableau could never realistically hope to overthrow the established players. Alternatively, if Tableau were able to partner even more closely with a cloud data provider and run more directly on the data, it could mitigate some of the integration issues. However, given Microsoft’s and Amazon’s existing business intelligence products, it seems unlikely either would agree to a full partnership unless Tableau’s analytics were truly differentiated. The last option is simply to concede high-volume, cloud-based big data analytics and focus on enterprise customers that are more likely to remain on prem due to data security concerns (healthcare, financial services), while continuing cloud connectivity for customers with lower data volumes. This path still relies on Tableau having superior analytics to the competition, which is difficult but possible. Given that the company just replaced its CEO with Amazon Web Services veteran Adam Selipsky, it is not clear which path it will choose.[10]

(794 Words)

[1] Tableau. 2015 Annual Report. 2016. Web. Accessed November 2016. [Link]

[2] Ibid.

[3] “Ericsson Mobility Report.” Ericsson. November 2015. Accessed November 2016. [Link]; “Gartner Says 6.4 Billion Connected ‘Things’ Will Be Used In 2016, Up 30% From 2015.” Web. November 2015. Accessed November 2016. [Link]; “The Internet of Things: Sizing up the Opportunity.” December 2014. Web. Accessed November 2016. [Link]

[4] “Allow Direct Connections to Data Hosted on a Cloud Platform.” Web. Accessed November 2016. [Link]

[5] “The Cloud Data Brief: Big Data Transitions to the Cloud.” Web. Accessed November 2016. [Link]

[6] “What Is Cloud Service Latency?” July 2015. Web. Accessed November 2016. [Link]

[7] “Data Gravity: In the Clouds.” December 2010. Web. Accessed November 2016. [Link]

[8] “Amazon QuickSight – Fast and Easy to Use Business Intelligence for Big Data.” October 2015. Web. Accessed November 2016. [Link]

[9] “Magic Quadrant for Business Intelligence and Analytics Platforms.” February 2016. Web. Accessed November 2016. [Link]

[10] “Tableau Appoints Adam Selipsky as New CEO.” August 22, 2016. Web. Accessed November 2016. [Link]

2 thoughts on “Too Much Data Strikes a Blow to Tableau”

  1. This was a very interesting synopsis of where the value in big data analytics is heading in the years to come. I took a closer look at McCrory’s blog that you referred to in the post, and it was really interesting to see yet another example of a major corporation that is aiming to move closer to the epicenter of big data – their strategic construction of and Dreamforce plays right into the phenomenon of ‘data gravity’ that we have seen from Amazon Web Services and Microsoft Azure. What I found most fascinating about this discussion was that a company like Tableau could be positioned within one of the most dynamic growth sectors in the world [1] – yet face existential risks stemming from how it relates to the value chain within the big data analytics space.

    [1] McKinsey Report

  2. Truly interesting topic and a well-written article, Michael. In particular, I’m drawn to how firms are now competing to host third parties’ data in order to leverage the network effects of ‘data gravity’ [1]. It appears that Google, IBM, and Microsoft have been luring companies away from Amazon – with Apple and Spotify recent converts to Google. With increasingly analogous tools and analytics platforms, how do you believe these and other firms will begin to differentiate themselves in the space?
