Tableau Software CEO Adam Selipsky at Tableau Conference 2016 in Austin

15 Nov 2016

Tableau – The Database, Data Prep and Data Visualisation Company?

Weelin Lim

During the Tableau Conference 2016 Vision Keynote last week, Tableau announced that they are working on two new pieces of software: Hyper and Maestro. The news is a marked departure from their previous strategy of extending Tableau's built-in capabilities to deal with database performance (extracts) and data preparation (data blending, cross-database joins, basic data manipulation).

Hyper

The first new piece of software is called Hyper, a proprietary SQL-compliant database technology that was born from a PhD project. It takes a new approach to database technology in the way it tackles data storage, query execution and data movement. It is being reported as suitable for OLTP, OLAP and, in the near future, graph workloads all in one. Additionally, it has a unique approach to parallelisation, making it extremely efficient and fast. Loading data is also said to be remarkably quick and efficient.
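To make the idea of OLTP and OLAP in one SQL engine more concrete, here is a minimal sketch of the two workload styles side by side. Hyper itself was not publicly available at the time of writing, so the example uses Python's built-in sqlite3 module purely as a stand-in for any SQL-compliant engine; the table and data are made up, and the point is the mix of small writes and analytical scans, not the driver.

```python
import sqlite3

# Stand-in connection; with an engine like Hyper the same SQL would simply
# be pushed to that engine instead of sqlite.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Schema for a simple sales fact table.
cur.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")

# OLTP-style workload: many small writes as new rows arrive.
rows = [("EMEA", "Widget", 120.0), ("APAC", "Widget", 80.0), ("EMEA", "Gadget", 200.0)]
cur.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
conn.commit()

# OLAP-style workload: an aggregate scan of the same table, the kind of
# query a Tableau dashboard generates behind a visualisation.
cur.execute("""
    SELECT region, SUM(amount) AS total_sales
    FROM sales
    GROUP BY region
    ORDER BY total_sales DESC
""")
print(cur.fetchall())  # e.g. [('EMEA', 320.0), ('APAC', 80.0)]
```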

During the Tableau Vision keynote, we saw a demo of a Tableau dashboard connected directly to Hyper, running against 400 million rows. It was quick and responsive. At the same time, a large amount of data was being loaded into Hyper, and each time a refresh was triggered the data was immediately available in the dashboard with no perceptible lag.

This is a very interesting piece of software, which, to me, looks like an alternative to Vertica, Vector, Exasol and other performance-boosting database technologies that people currently consider when faced with Tableau dashboard performance issues.

Maestro

The second new piece of software is Maestro, a visual data preparation and blending tool that can be used to get your data ready for Tableau dashboard consumption.

With Maestro, you can visually inspect the data and correct it on the fly, even across large groups of rows. You can automatically join data for lookups and, as you build up your data cleansing process, you will see outliers and incorrect data highlighted visually.

Although this wasn't shown, I imagine you will be able to save and schedule your data prep processes for automated data blending.
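To give a rough sense of the kind of steps a visual data prep tool automates, here is a minimal pandas sketch covering a lookup join, a basic correction and a simple outlier check. The file names and columns (orders.csv, region_lookup.csv) are hypothetical, and nothing here reflects Maestro's actual interface or internals; it is just the shape of the work.

```python
import pandas as pd

# Hypothetical input files: transactional orders plus a small lookup table.
orders = pd.read_csv("orders.csv")            # columns: order_id, country_code, amount
regions = pd.read_csv("region_lookup.csv")    # columns: country_code, region

# Basic correction on the fly: normalise inconsistent country codes.
orders["country_code"] = orders["country_code"].str.strip().str.upper()

# Join for lookups: enrich each order with its region.
prepared = orders.merge(regions, on="country_code", how="left")

# Surface incorrect data: orders whose country code had no match in the lookup.
unmatched = prepared[prepared["region"].isna()]

# Surface outliers: amounts more than three standard deviations above the mean.
threshold = prepared["amount"].mean() + 3 * prepared["amount"].std()
outliers = prepared[prepared["amount"] > threshold]

print(f"{len(unmatched)} unmatched rows, {len(outliers)} potential outliers")
prepared.to_csv("prepared_orders.csv", index=False)
```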

Maestro is powered by Hyper, uses Tableau's connectivity and query engines, and its front end is currently in development.

Maestro sits in a space where quite a number of Tableau partner vendors operate, and it will be interesting to see how Maestro affects this area. Visually, it looks quite similar to Trifacta Wrangler, as opposed to some of the more advanced and complete data preparation tools such as Alteryx.

At the conference, we saw several new vendors showing their data blending software, and I imagine they will need to innovate considerably to stay in this market.

Questions

A number of questions need to be answered before we really understand where Tableau is going with these new pieces of software. Some of my main questions and the answers I could glean are listed below:

How will this software be integrated with Tableau?

Newer versions of Tableau will have Hyper integrated within the engine, to take advantage of the speed and performance improvements. It is still unknown whether this will affect hardware requirements.

Although Hyper will be fully integrated with Tableau, there is no reason why it could not be used for wider purposes, as a database technology.

Maestro uses some Tableau functionality and will, of course, integrate directly with Tableau, at least as well as other third-party vendors do by producing Tableau data extracts. I can imagine there will also be direct Tableau Server integration, allowing you to publish your prepared data straight to Tableau Server.
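As an illustration of what publishing prepared data straight to Tableau Server can look like today, the sketch below pushes an already-built extract through Tableau's REST API using the tableauserverclient Python library. The server address, credentials, project ID and file name are placeholders, and it is not known whether Maestro will work this way; this is simply the route available to third-party tools now.

```python
import tableauserverclient as TSC

# Placeholder credentials and server address.
tableau_auth = TSC.TableauAuth("username", "password", site_id="")
server = TSC.Server("https://tableau.example.com")

with server.auth.sign_in(tableau_auth):
    # Target project for the published data source (placeholder ID).
    datasource = TSC.DatasourceItem(project_id="hypothetical-project-id")

    # Publish the prepared extract, overwriting any existing copy.
    published = server.datasources.publish(
        datasource,
        "prepared_orders.tde",
        TSC.Server.PublishMode.Overwrite,
    )
    print(f"Published data source: {published.name}")
```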

How will the software be sold?

Both new tools will be subject to licensing, but I am unsure of the model when Hyper comes as part of a new Tableau purchase. Whether that gives you the full Hyper experience, or only an embedded version of Hyper, remains to be seen.

From speaking to Tableau developers, not much is known about Tableau's approach, but I expect there to be separate licensing requirements for both Maestro and Hyper. As mentioned, Maestro uses Hyper as part of its engine, and newer versions of Tableau will also use an integrated Hyper engine, so current customers who pay for Tableau maintenance will receive Hyper when that version of Tableau is released.

Can the software be stand-alone, without needing Tableau?

Maestro already uses Hyper and some Tableau functionality, so there will definitely be tight integration across the toolset. However, from what I understand, there is no reason why these tools cannot be sold and used independently. Each tool could have its own use without the need for Tableau (the data visualisation tool), and so could conceivably be used by a wider audience within an organisation.

When will they be available and how much will they cost?

There is currently no information on this at all, but expect the integrated Hyper engine to appear in Tableau before the stand-alone product. I believe Maestro should appear as a stand-alone product sometime in 2017.

Weelin Lim

About the author

Head of Business Intelligence at Concentra. I am a Business Intelligence and Data Warehouse specialist with over 15 years’ experience in a variety of industries. I enjoy visualising freely available Government data and, as a parent, I am particularly interested in Education data right now. My blog articles will (hopefully) be a mixture of business and technical experiences, as well as visualisations of interest to the wider public.
