- At its annual user conference, Tableau rolled out several data prep and management capabilities, highlighted by the ability for Tableau Prep Builder users to write out to third-party data stores, such as the popular Snowflake solution.
- This, along with several ongoing cloud and database initiatives, marks a significant philosophical shift for the vendor away from pure analytics and toward a more complete solution to help buyers establish a company-wide data culture.
For a company powered by analytics, Tableau put surprisingly few of them on display during its annual user conference in Las Vegas last week. Certainly, there were numerous stats to be found, particularly relating to the adoption of Tableau Online, which now counts more than 15,000 active customer accounts. What’s more, Tableau Online is maintaining 100% year-over-year growth, as reported by CEO Adam Selipsky – a fitting fact given that Tableau and its new, cloud-first parent company, Salesforce, are now free to talk integration and rationalization.
And yet, beyond these few simple facts, there was only one analytical insight to make the main stage, only one ‘idea’ that mattered, namely that enterprise buyers are struggling to get a handle on their data. In support of this, Mr. Selipsky (and many executives throughout the conference) cited a McKinsey study showing that only 8% of companies are achieving analytics at scale. Why are companies failing to put data to use?
Tableau believes the problem lies with people more than with technology. In Tableau’s view, companies don’t need shiny new chart types, more data connectors, or AI-infused recommendations to succeed (though all of those are important differentiators). Rather, companies must establish what Tableau refers to as a ‘data culture,’ where trust, commitment, talent, sharing, and mindset are addressed in equal measure. This may sound like a lot to unpack, but it’s quite straightforward when viewed through the lens of Tableau’s relatively new, human-oriented (i.e., not a product) offering, Tableau Blueprint.
Emerging from Tableau’s internal professional services group in June 2019, this step-by-step guide to building a data culture works a bit like a subway map with three major analytics destinations: agility, proficiency, and community. In essence, it codifies tribal knowledge specific to deploying, learning, and sharing analytics at scale. Though new, this holistic view of both data and analytics is already making an impact: it helped one of the company’s larger customers streamline its administrative operations by approximately 98%.
So, what’s a data visualization vendor like Tableau doing prioritizing the entire data and analytics lifecycle? Is Tableau turning into a data infrastructure player? The short answer is: yes, mostly.
Over the last year, the company has been building up a solid data preparation capability with Tableau Prep Builder and Tableau Prep Conductor, which work together to allow users to locate, refine, curate, and share data flows at scale. And this summer, Tableau took another step into the data management waters with the introduction of Tableau Catalog, a graph database that automatically ingests and ‘catalogs’ all metadata and analytics resources (data sources, dashboards, etc.), revealing the many relationships (e.g., dependencies) between those resources.
These solutions, of course, prioritize data sources and analytics assets found predominantly within Tableau environments, leaving plenty of room for Tableau’s sizable partner ecosystem to operate. Partners like Alation (data cataloging) and Alteryx (data preparation) don’t need to worry about Tableau undermining their value propositions. With Tableau Catalog, for example, Tableau worked with Alation and Collibra early on to allow for the efficient exchange of metadata.
Still, the vendor’s growing investments in data preparation and management signify that Tableau is no longer content to be seen as a visualization overlay, merely augmenting existing BI platforms. This is most apparent in how the company is evolving its in-memory data engine, Hyper. Previously, the company preferred to pull extracts from external data sources (everything from spreadsheets to Hadoop clusters) into Hyper for data modeling and processing. Now, however, it is seeking to work more directly with those external sources.
At this year’s show, Tableau announced the ability for Tableau Prep Builder users to write directly to the popular data store, Snowflake (note that more destinations will follow). More importantly, Tableau has recently opened up a read-write API for Hyper itself, which will allow the engine to move from batch extracts to a more direct relationship with external data sources. This is a big deal: data in Tableau is no longer proprietary to Tableau. To illustrate, Tableau rival Qlik could actually read data directly from Hyper.
Over the long term, opening up Hyper could create some very interesting scenarios for Tableau in support of its efforts to run across a wide array of hyperscale cloud platforms. Already the vendor supports AWS (a huge partner), Microsoft Azure, and Google Cloud; and it announced support for Alibaba Cloud this September. Obviously, customers will also see Tableau software well-represented on the Salesforce platform in the near future.
Certainly, there are many questions regarding Tableau’s future role on the Salesforce platform and its stature opposite Salesforce’s competing Einstein Analytics offering. So far, the two companies are pledging long-term openness, compatibility, and coexistence. Regardless, by building a more robust data preparation and management portfolio, writing out to existing data stores, and opening up Hyper as a data store in its own right, Tableau is slowly remaking itself into a more complete solution upon which enterprise buyers can establish a company-wide data culture.