Data preparation is one of the biggest challenges facing companies today. A recent Harvard Business Review study reports that people spend 80% of their time cleaning and shaping data, and only 20% of their time analysing it. Customers cannot readily analyse much of their data because it is in the wrong shape or residing in disparate sources. But getting it in a useful form can be a complicated and time-consuming process, often requiring specialised skills.
To address this concern, Tableau Software recently introduced new offerings that combine new and existing analytical capabilities into packages appropriate for everyone across an organisation, regardless of skill level. With a new way to buy and scale Tableau’s analytics platform, organisations now can cost-effectively achieve a critical objective: enabling the data-driven enterprise by putting the data in the hands of everyone.
The company also launched Tableau Prep, a brand new data preparation product designed to empower people to quickly and confidently combine, shape, and clean their data, further reducing the time from data to insight. The application features a direct and visual experience for data preparation, giving customers a deeper understanding of their data, smart features to automate complex tasks, and integration with the Tableau analytical workflow for faster speed to insight.
“Businesses need to promote a data-driven culture,” said TC Gan, Senior Director for Customer Consulting at Tableau Asia Pacific. “This can be started by showing people the impact data has made to something which resonates with them. People will recognise the benefits and ask what data can do for them, and the more this happens the further the culture will spread. For this to be truly effective, it’s also important for there to be support and endorsement from senior management.”
In an email interview with Networks Asia, Gan stresses that the key is to use a modern analytics tool flexible enough to connect to any data source without locking users into one platform.
“However, we are not using data to its full potential yet. For this to happen we need to transform in two ways: cultivate a change in mindsets and leverage the right tools and platforms.”
Excerpts of the interview follow:
We’ve had BI/BA tools for a while. They’re nothing new and neither is Big Data, so why are we still saying that businesses need more agility to find their own answers than traditional BI can offer, and IT needs to get away from being the bottleneck and move towards enabling the business? What are we still doing wrong? Are we going to have to spend more on IT? Are we being held back by legacy infrastructure and where does “modern” BI fit into IT?
TC: Traditionally, the primary objective of Business Intelligence (BI) tools was to create a top-down, single source of truth for businesses to centrally track KPIs and performance metrics. With traditional BI, these tools are in the hands of a select few – the IT or BI specialist teams. When a business function needs some analysis done, they have to specify their requirements, pass them to the IT or BI department and wait for the results, in the hope that these come through in the expected form and within a timeframe where the data is still relevant.
However, with the volume of data being produced now exploding, this approach sacrifices business agility. There is a significant lag between the questions being asked and the answers being provided. This type of delay results in lacklustre adoption and low overall business impact. The extra volume of data turns the IT or BI department into a bottleneck, meaning they can no longer feed business users timely information.
In order to keep up with business needs, IT professionals need to embrace the self-service BI model and enable the broader use and impact of analytics throughout their organisations.
The way to do this is not by investing more in IT, but in a modern self-service BI platform. This model has the greatest business impact because it allows data to be explored by the people who know the subject matter best, meaning users can transform their data into better insights and, ultimately, better business decisions.
We must remember, however, that the IT department will still play a vital role, as they will become the enabler – making sure that users are provided with clean, secure, governed data which is safe and useable.
Are we deriving the right value from the data we have? We are generating tons of data, whether from M2M communications, logs, customer and supplier interactions and so on, and IT departments have spent time and resources building out their data infrastructure, capturing and curating this data. But are we making the right use of it all? Are we effectively bridging the gap between structured and unstructured data and getting information from data to the right people when they need it?
TC: You do not have to look far to find examples of how great value is being delivered through data.
Take Grab, our largest Tableau Online customer in Asia Pacific. The ride-sharing company has used Tableau to centralise millions of rows of customer data and help the business make data-driven decisions. It is now used by data analysts, marketing teams, employees in product, research and development, as well as communications to provide a better overall user experience.
Data can also be used to drive social impact. For example, the United Nations (UN) and Tableau recently inked an unprecedented agreement that provides access to Tableau for the 44,000 United Nations staff working globally. Using Tableau, the UN will be able to use data to help expand its knowledge of fundamental issues, such as social and political instability, natural disasters and climate change, to improve decision-making and enhance its ability to predict crises.
In another example, last year the Tableau Foundation awarded its first grant in Singapore to the National Volunteer & Philanthropy Centre (NVPC). NVPC’s Research and Giving.sg teams are now using Tableau to connect and visualise governed data quickly, combine multiple views of data, gain actionable insights and share secured information with key NVPC staff for data-driven decision making on the go.
However, we are not using data to its full potential yet. For this to happen we need to transform in two ways: cultivate a change in mindsets and leverage the right tools and platforms.
Typically, businesses spend a great deal of time collecting, cleaning, moving, curating and storing data from various sources, only to lock it up. Users who want access to it are made to justify why and what they intend to do with it. This sort of mindset stifles innovation, because access might only be granted weeks later, by which time the questions they wanted to answer may not be relevant and they will have missed an opportunity.
Even with the right mindset, most businesses do not yet have access to a modern BI platform, and therefore cannot ask questions and solve business-critical problems in an agile, self-service manner. Instead they rely on the IT or BI team for insights. Counterintuitively, there are also many organisations that do have access to a modern BI tool yet are not using it properly – not querying the data themselves, and relying on IT to provide insights for them, just as with traditional BI.
Instead of this approach, Tableau believes it should be the people who know the subject matter who question the data. The IT or BI departments do not know what you know, so things can get lost in translation. This results in back and forth communication, which in turn leads to delays and potential missed opportunities.
Therefore, businesses need to promote a data-driven culture. This can be started by showing people the impact data has made to something which resonates with them. People will recognise the benefits and ask what data can do for them, and the more this happens the further the culture will spread. For this to be truly effective, it’s also important for there to be support and endorsement from senior management.
Are we tiering and storing data correctly and efficiently? Are we looking at data creation in the wrong way? How important has data generated from M2M communication or machine-generated data become, and how should we be dealing with it? In this Cloud era, with multiple endpoints, where does the onus of data management and security ultimately lie? As governments push for greater regulatory compliance, is data management best handled by a third party or service provider?
TC: When it comes to Big Data, data strategy becomes very important. Although people always talk about the 3 Vs (Volume, Velocity and Variety) of Big Data, a proper data strategy really depends on time sensitivity, level of granularity of the data needed for your analytics, as well as achieving a balance between performance and cost.
The key is to use a modern analytics tool that is flexible enough to connect to any of these data sources without locking users into one platform.
In terms of governance, Tableau this year identified crowd-sourced governance as one of the top 10 BI trends. We are increasingly seeing collaboration between IT and users, instead of IT alone defining the governance policy for the data. Under this model, governance is continually fine-tuned in agile ways, based on how the data is actually being used.
What is the future of data and IT? With the rise of deep learning and predictive IT, where is this “intelligence” taking us?
TC: Whilst we talk about reducing reliance on IT by scaling data across the business and putting it in the hands of everyone, IT will still play a key role.
For modern BI to work effectively, the IT department must act as an enabler, empowering business users by providing an agile analytics platform, coupled with clean, certified and governed data. IT must lead from the front, creating a community of practice and helping to drive a culture of analytics. IT professionals who embrace the opportunities that self-service analytics platforms provide will be a catalyst and deliver far greater value.
The next stage of modern BI is the age of smart analytics, powered by machine learning (ML) and artificial intelligence (AI). This type of analytics will reduce the mundane and manual work involved in working with data, but while this means computers will take much of the burden, it does not mean people will be replaced. This is because the questions and problems we are solving are complex. They require deeper thinking to make sense of the data, along with creativity and liberal arts skills to communicate insights effectively. Humans will be part of the equation for the foreseeable future – smart analytics will simply augment human intelligence, freeing people up to focus on high value analysis and decision making.