Here are some of the most critical trends to bear in mind looking ahead to a new year, and even beyond.
Don’t fear AI: How machine learning will enhance the analyst
Popular culture is fueling a dystopian view of what artificial intelligence can do. But as research and technology continue to improve, machine learning is rapidly becoming a valuable supplement for the analyst, providing assistance and driving efficiency. By automating simple yet labour-intensive tasks such as basic calculations, machine learning gives analysts time to think strategically about the business implications of their analysis and to plan next steps. It also helps analysts stay in the flow of their data: without stopping to crunch numbers, they can ask the next question and drill deeper. Machine learning’s potential to aid the analyst is undeniable, but it should be embraced where there are clearly defined outcomes. While some may worry about being replaced, machine learning will supercharge analysts and make them more precise and impactful to the business.
The promise of Natural Language Processing (NLP)
Gartner predicts that by 2020, 50 percent of analytical queries will be generated via search, natural language processing (NLP), or voice. NLP will empower people to ask more nuanced questions of data and receive relevant answers that lead to better insights and decisions. Simultaneously, developers and engineers will make greater strides in exploring how people use NLP by examining how people ask questions – from instant gratification to exploration. The biggest analytic gains will come from tackling this ambiguity and understanding the diverse workflows that NLP can augment. The opportunity will arise not from placing NLP in every situation, but making it available in the right workflows so it becomes second nature to those using it.
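To make the idea concrete, here is a deliberately toy sketch (not any vendor’s implementation) of how a natural-language question might be reduced to a structured query over tabular data: the words in the question are matched against known field values, and only the constraints the user actually mentioned are applied. The data, field names, and matching logic are all hypothetical.

```python
# Toy example: mapping a natural-language question to a filter-and-aggregate
# query over a small table of made-up sales figures.
sales = [
    {"region": "North", "year": 2017, "revenue": 120},
    {"region": "South", "year": 2017, "revenue": 95},
    {"region": "North", "year": 2016, "revenue": 100},
]

def answer(question, rows):
    """Naive keyword matching: any field value mentioned in the question
    becomes a constraint; unmentioned fields are left unconstrained."""
    q = question.lower()
    regions = {r["region"].lower() for r in rows}
    years = {str(r["year"]) for r in rows}
    wanted_regions = {v for v in regions if v in q}
    wanted_years = {v for v in years if v in q}
    hits = [
        r for r in rows
        if (not wanted_regions or r["region"].lower() in wanted_regions)
        and (not wanted_years or str(r["year"]) in wanted_years)
    ]
    return sum(r["revenue"] for r in hits)

print(answer("What was revenue in the North in 2017?", sales))  # → 120
print(answer("Total revenue in 2017?", sales))                  # → 215
```

Real NLP interfaces must handle far more ambiguity than this keyword matcher does — synonyms, comparatives, follow-up questions — which is exactly the gap the paragraph above describes.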
The future of Data Governance is crowd sourced
It’s an understatement to say self-service analytics has disrupted business intelligence, and the same disruption is happening with governance. As self-service analytics expands, a stream of valuable perspectives and information inspires new and innovative ways to implement governance. Governance is as much about using the wisdom of the crowd to get the right data to the right person as it is about locking down data from the wrong person. BI and analytics strategies will embrace this modern governance model in 2018: IT departments and data engineers will curate and prepare trusted data sources, and with self-service going mainstream, end users will be free to explore trusted, secure data.
The debate over multi-cloud rages on
According to Gartner, “a multi-cloud strategy will become the common strategy for 70 percent of enterprises by 2019.” As enterprises grow increasingly wary about being tied to a single legacy solution, evaluating and implementing a multi-cloud environment can determine who provides the best performance and support for each situation. However, while flexibility is a plus, this approach increases overhead cost by splitting workloads across providers and forcing internal developers to learn multiple platforms. With multi-cloud adoption on the rise, organizations must assess their strategy and measure adoption, internal usage, workload demands and implementation costs for each platform.
Rise of the Chief Data Officer
Data and analytics are becoming core to every organization. But in some cases, a gap forms between the CIO and the business as they weigh security and governance against speed to insight. As a result, the C-suite is becoming more accountable for creating a culture of analytics. For many, the answer is appointing a Chief Data Officer (CDO) or Chief Analytics Officer (CAO) to lead business process change, overcome cultural barriers, and communicate the value of analytics at all levels. The CDO/CAO role is outcome-focused, ensuring proactive C-level conversations about how to develop an analytics strategy from the get-go.
The Location of Things will drive IoT innovation
As a subcategory of IoT, the “location of things” covers devices that sense and communicate their geographic position. Capturing this data lets users consider the added context of a device’s location when assessing activity and usage patterns. The technology can be used to track assets and people, and even to interact with mobile devices like smartwatches or badges to provide more personalized experiences. For data analysis, location can be treated as an input to the analysis rather than merely an output of it. Where the data is available, analysts can incorporate it to better understand what is happening, where it is happening, and what they should expect to happen next.
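Treating location as an input can be as simple as joining device readings to the site where each device is installed and then aggregating by site. The device IDs, site names, and usage numbers below are entirely made up; this is only a sketch of the pattern.

```python
# Hypothetical sketch: grouping device usage readings by installed location,
# so location becomes an input dimension of the analysis.
from collections import defaultdict

device_sites = {"dev-1": "Mumbai", "dev-2": "Delhi", "dev-3": "Mumbai"}
readings = [("dev-1", 5), ("dev-2", 3), ("dev-3", 7), ("dev-1", 2)]

usage_by_site = defaultdict(int)
for device, usage in readings:
    # Look up where the device lives, then accumulate its usage there.
    usage_by_site[device_sites[device]] += usage

print(dict(usage_by_site))  # → {'Mumbai': 14, 'Delhi': 3}
```

The same join-then-aggregate step is what lets an analyst answer not just “what is happening” but “where is it happening”.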
Vulnerability leads to a rise in data insurance
According to a 2017 study by IBM and the Ponemon Institute, the average cost of a data breach incurred by Indian companies reached INR 110 million this year. For many companies, data is a critical business asset. As we have seen with recent and prominent data breaches, a threat to a company’s data can be crippling, causing irreparable brand damage. Data as a commodity means its value will only increase, and ultimately, drive new questions and conversations around how this raw material will propel companies to greater heights and advantages. And like any product, what good is it if it can be pilfered without consequence? Look for companies to wisely invest in cybersecurity insurance to make sure this asset is protected.
Increased prominence of the Data Engineer role
Data engineers will continue to be an integral part of an organization’s push to use data to make better decisions about its business. As of November 2017, there were over 1,700 open positions in India with “data engineer” in the title on LinkedIn, indicating growing and continued demand for this specialty. Data engineers are responsible for extracting data from the foundational systems of the business in a form that can be used for insights and decisions. As data volumes and storage capacity grow, someone with deep technical knowledge of the systems and architecture, and the ability to understand what the business wants and needs, becomes ever more crucial.
The human impact of liberal arts in the analytics industry
As technology platforms become easier to use, the need for deep technical specialization decreases. Everyone can work with data without the deep technical skills once required. This is where people with broader skills, including the liberal arts, come into the fold. They can drive impact where industries and organizations face a data worker shortage. An increased focus on and prioritization of data analytics will also put these data stewards in the position of helping their companies gain a competitive advantage. And, as analytics evolves to capture both art and science, the focus will shift from simply delivering the data to crafting data-driven stories that influence decisions.
Universities double-down on data science and analytics programmes
At the 2017 Big Data & Analytics Summit, Nasscom identified six areas of specialization in the big data analytics domain. Business analysts, solution architects, data integrators, data architects, data analysts and data scientists are expected to be key to the IT sector’s growth. With companies embracing a data-driven approach to decision-making across all functions, organizations are in dire need of professionals with data science and analytics skills. How are top universities responding? Leading institutes like IIM Bengaluru, IIM Calcutta, IIT Kharagpur and IMT Ghaziabad have developed robust programmes in analytics.
(Views expressed in this article are of Anand Ekambaram, Country Head, Tableau India)