Making Data Great Again - TC16
The Tableau Conference 2016 opened with newly appointed CEO Adam Selipsky taking the stage to welcome over 13,000 Tableau enthusiasts to Austin. Adam briefly spoke about the importance of data-driven decision making and the cultural aspects of a data-driven organisation before handing over to Christian Chabot, who took us back to when he co-founded Tableau 14 years ago. Tableau has certainly come a long way in that time, but Christian insisted 'we are just getting started'. To prove how much is still to come, he invited five members of his senior management team to showcase their vision across five core areas.
1. Visual Analytics
A whole range of new features was shown to the crowd, which will make Tableau an even more powerful tool for helping people explore their data in the way that suits them best. The vision for Visual Analytics centred on three themes:
- Instant Analysis: The tooltip functionality looks set for a major renovation, with a raft of new contextual information surfaced within tooltips to support speed-of-thought analysis. Furthermore, selecting a number of marks on the screen will display prebuilt dashboards to help the user test hypotheses and analyse patterns in their data.
- Time and Space: Mapping is due to be boosted by the ability to layer geographical dimensions from different data sources. This allows users to quickly build visualisations that can, for example, plot hospital locations from one source and patient locations from another, all on one map. Working with dates is also a key focus, with new drag and drop features that allow fast indexing and comparisons between time periods.
- Natural Language: Tableau's vision is to enable users to ask questions of their data in the same way people search in Google. They are leveraging deep language algorithms that allow common phrases to be used without referring to field names in the data or constructing a sentence that matches a pattern Tableau expects. For example, you could ask 'what are the most expensive houses in London last summer' and Tableau would look at the relevant fields (Date, Price Paid, and Location) to serve up an answer!
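To make the natural-language idea concrete, here is a deliberately tiny sketch of mapping a free-text question onto data fields. The field names and cue words are our own invention for illustration; Tableau's actual language algorithms were not described in detail at the keynote and will be far more sophisticated than simple keyword matching.

```python
# Toy sketch: match a free-text question to data fields via cue words.
# SYNONYMS and field names are invented for illustration only.
SYNONYMS = {
    "Price Paid": ["expensive", "cheap", "price", "cost"],
    "Location":   ["in", "near", "london"],
    "Date":       ["last", "summer", "year", "when"],
}

def relevant_fields(question):
    """Return the fields whose cue words appear in the question."""
    words = question.lower().split()
    return [field for field, cues in SYNONYMS.items()
            if any(cue in words for cue in cues)]

print(relevant_fields("what are the most expensive houses in London last summer"))
# → ['Price Paid', 'Location', 'Date']
```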
2. Data Engine
The second core area saw Tableau unveil their new in-memory data engine – Hyper. Hyper further enhances Tableau's capability to generate answers faster, with fresher data. We were first shown a live demo with 400m+ rows of data, and it was rapid – ad-hoc queries against a large data source, with results rendered at the speed of thought – scintillating!
However, Hyper does not stop there – as well as delivering high performance for data analysis, it will offer two other massive benefits compared to a Tableau Data Extract (TDE):
- Ingestion Speeds: We saw Hyper's ingestion capability demonstrated when a command was run to fetch new data – millions of rows ingested within seconds! It was noted that Tableau Data Extracts can be slow to refresh, and Hyper will target those who rely on large TDEs for their analysis.
- 'Live Querying': While data is not streamed into Tableau, Hyper does allow analysts and consumers to query newly available data even while it is still being ingested, unlike a TDE, where you only see new data once the refresh process has completed. This really does allow users to work with their data at its 'freshest', making data-driven decisions at the earliest possible opportunity.
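The contrast with a TDE-style refresh can be illustrated with a small sketch: a reader querying a store while a writer is still loading it, seeing each committed batch as it lands rather than waiting for the whole load to finish. SQLite is used here purely as a stand-in; Hyper's internals were not disclosed.

```python
import os
import sqlite3
import tempfile

# A shared on-disk store standing in for an analytics engine.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
writer = sqlite3.connect(path)
writer.execute("CREATE TABLE sales (id INTEGER, amount REAL)")

reader = sqlite3.connect(path)  # a second, concurrent connection

counts = []
for batch in range(3):  # ingest in batches, committing as we go...
    writer.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [(batch * 1000 + i, float(i)) for i in range(1000)],
    )
    writer.commit()
    # ...and the reader already sees each committed batch,
    # without waiting for the full load to complete.
    counts.append(reader.execute("SELECT COUNT(*) FROM sales").fetchone()[0])

print(counts)  # row count grows as ingestion proceeds: [1000, 2000, 3000]
```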
3. Data Management
Two main themes ran through the data management segment of the Vision keynote, both of which will help Tableau become the true tool of choice for organisations regardless of their size and reach.
- Governance: Tableau are looking to introduce 'Certified' data sets to push users towards using gold standard data within their organisation. It is acknowledged that users can sometimes be unsure which Tableau Data Source (TDS) should be used, so Tableau aim to make it easy for users to find the data that has been certified by the owner. A whole new interface for the data sources is also in the pipeline, with the ability for data connectors to suggest new calculations to be added to the certified data source with a simple click of the mouse.
- Data Preparation: Tableau lifted the curtain on 'Project Maestro' – a brand new product that allows analysts to profile, join, and clean their data in a simple-to-use interface before serving the end product to Tableau to visualise. Maestro continues the Tableau theme of clean, intuitive user interfaces. It will eliminate much of the data cleansing that analysts currently solve by writing complex logical calculations, and may stop them turning to third-party tools for data preparation.
4. Cloud
Tableau referred to their fourth core area as 'Cloud'. However, this was slightly misleading, as the content of this section was far broader than Tableau Online (hosted Tableau Server) or Tableau installed on a cloud service (Azure, AWS, etc.).
- Live Query Agent: Improvements are being made to the way data is refreshed in Tableau Online with the introduction of the Live Query Agent. Unlike the Sync tool, the Query Agent will allow Tableau Online users to pull data refreshes into Tableau Online, giving greater control over the freshness of their data.
- Prebuilt Content: Tableau already connects to standard cloud-based data sources – Salesforce, Marketo, Eloqua, and QuickBooks – and allows users to join tables and create visualisations. However, Tableau are now looking to leverage the structured, standardised way these databases store data by creating prebuilt dashboards, allowing analysts to quickly get a view of their data. This prebuilt content connects and joins the relevant tables in the cloud database and offers some out-of-the-box visualisations, which analysts can then tailor to their needs. Furthermore, if a user connects to two databases, Tableau will leverage its cross-database join functionality (released in v10) to offer a single integrated data source to work with in Tableau.
- Simplicity and Flexibility: Tableau Server administration is being simplified further, allowing Server Administrators to understand how close to capacity their server is running, with simple on-screen recommendations on how to remedy any issues. Tableau are also aiming for zero downtime when certain changes are made to the Server setup – for example, you will not have to restart the server when adding more processing power. Integration between Desktop and Server is set to grow stronger by truly connecting the products. Tableau also announced that Tableau will now work on Linux, broadening the range of platforms it can be deployed on.
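The cross-database join mentioned under Prebuilt Content is conceptually straightforward: rows from two separate stores combined on a shared key. A minimal sketch, with table contents and key names invented for illustration:

```python
# Illustrative cross-source join on a shared key, in the spirit of
# Tableau's cross-database joins (v10). All data here is made up.
crm_accounts = [{"account_id": 1, "name": "Acme"},
                {"account_id": 2, "name": "Globex"}]
billing = [{"account_id": 1, "invoice_total": 1200.0},
           {"account_id": 2, "invoice_total": 450.0}]

# Index one side by the join key, then merge matching rows.
by_id = {row["account_id"]: row for row in billing}
joined = [{**acct, **by_id[acct["account_id"]]}
          for acct in crm_accounts if acct["account_id"] in by_id]

print(joined[0]["name"], joined[0]["invoice_total"])  # → Acme 1200.0
```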
5. Culture of Analytics
The final pillar of the Tableau vision linked back to Adam's opening remarks – focussing on building a culture of analytics within organisations. It is clear that Tableau know that people interacting with their tool on a day-to-day basis is the key to embedding analytics within an organisation.
- Recommendations: Tableau are keen to ensure that people can leverage the work their colleagues have already done, and are boosting this by building an engine that can recommend data sources and workbooks of interest to Tableau users. Tableau will look at the dimensions and measures a user has on their visualisation and suggest published workbooks that may be similar – allowing the user to quickly identify workbooks of interest.
- Discussions: A key part of collaboration is conversations that are data-driven – allowing people to make data-driven decisions. Tableau are developing a new way of having data-driven discussions that allow you to talk directly to other people within your organisation, while still keeping the context in the data.
- Notifications: Tableau have been moving towards data-driven notifications for a while with the growing functionality of subscriptions. However, Tableau are looking to take this to the next level. Soon, a user will be able to define metrics and thresholds and receive an email or a text message when the measure falls outside the defined threshold. We will be describing how you can get some data-driven notifications in Tableau 10.1 shortly on this blog!
- Metrics: Many Tableau dashboard consumers have metrics that are completely cross-functional and come from many different data sources. This can sometimes mean the metrics that an individual needs to see at a glance can often be in disparate workbooks, spread across a server in projects and sites. A new function looks to allow users to choose which of the metrics on their dashboards are important to them, and pin these to their personal space on the server, giving them a quick glance of how the organisation is performing for the areas that are important to them.
- Sandboxes: Individuals and teams could be given access to their own personal sandboxes, giving them a safe place to play with their data before promoting their work to the rest of the organisation. The sandbox environments are intended to encourage users to truly explore their data without fear of other people making decisions from incomplete or unverified analysis.
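The recommendations idea above – suggesting workbooks whose fields overlap with what a user is currently working on – can be sketched with a simple similarity score. The Jaccard scoring and all workbook names here are our own illustration, not Tableau's actual engine:

```python
# Hedged sketch: rank published workbooks by field overlap (Jaccard
# similarity) with the user's current view. All names are invented.
def jaccard(a, b):
    """Similarity of two field sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

current_view = ["Sales", "Region", "Order Date"]
published = {"Exec Overview": ["Sales", "Profit", "Region"],
             "Shipping KPIs": ["Ship Mode", "Ship Date"]}

ranked = sorted(published,
                key=lambda w: jaccard(current_view, published[w]),
                reverse=True)
print(ranked[0])  # → Exec Overview (shares Sales and Region)
```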
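The threshold-based notifications described above boil down to a simple check: compare each metric against its bounds and alert on any breach. A minimal sketch, with metric names, thresholds, and the alerting step all invented for illustration (Tableau's actual alerting mechanism was not detailed):

```python
# Illustrative metric/threshold check. In a real system, a breach would
# trigger an email or text message; here we just return the breached names.
thresholds = {"daily_sales": (10_000, None),   # (min, max); None = unbounded
              "error_rate":  (None, 0.05)}

def breaches(metrics, thresholds):
    """Return the metrics whose value falls outside its defined threshold."""
    out = []
    for name, value in metrics.items():
        lo, hi = thresholds.get(name, (None, None))
        if (lo is not None and value < lo) or (hi is not None and value > hi):
            out.append(name)
    return out

current = {"daily_sales": 8_500, "error_rate": 0.02}
print(breaches(current, thresholds))  # → ['daily_sales'] → send alert
```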
It's worth noting that all of the features listed here are part of the Tableau vision, and there are no confirmed release dates. However, many of these features were demoed live on stage, and 2017 looks to be an exciting year for Tableau fans!
Stay current with more Tableau updates and tips & tricks with our blog!