Business leaders are aware by now that big data presents transformational business opportunities. Organisations that have already embarked on big data projects may have discovered that, like most transformational technologies, big data also presents challenges. The key to overcoming these challenges and delivering a data project with strong results is to be aware of the potential pain points before the project starts, and then to take appropriate steps to avoid, or at least mitigate, them.

There are many types of big data project, and each has its own challenges and opportunities. For example, an IoT sensor-based project measuring gas flow and supporting preventive maintenance analysis on an offshore platform faces different challenges from a marketing analytics solution looking at next-best-offer or churn analysis for a financial services company.

Insight discovery requires different skills, processes and technology than operational applications, for example. Many large enterprises are likely to have a mix of these scenarios: true data science use cases seeking to generate new insights, and production operations dealing with the external world as it stands today.

Embarking on a big data journey therefore requires a number of considerations and strategies:

  1. Understand what problem or opportunity you are trying to solve

    The starting point for any data project, as with any other business opportunity, should be a well-defined and structured strategy. To build one, an organisation typically needs to assess what data assets it has, what is happening in and across its industry, what insight or improved decision it would like to derive from its data, and what the size of the prize is.

    It’s very easy to expend a significant amount of data science effort analysing the wrong problem or answering the wrong question.

    Once the problem or opportunity is understood, a target state needs to be defined, which looks at people, process, technology, and data. Data governance always plays an important role, given the legal and regulatory rules around data sharing and management. The target state needs to be accompanied by a well-defined implementation road map and business case.

  2. Be agile and experiment

    Cloud technology and agile development methods mean organisations can easily experiment and fail fast. It’s important to invest time in spikes, research projects, and proofs of concept to test new ideas and try out new technologies and techniques. The rate of change of big data and analytics technologies means there are many opportunities to do things quicker, easier, smarter, and at a lower cost. The pace of innovation from the likes of Amazon Web Services and the Apache open-source community can also be overwhelming.

    To ensure maximum results for your investment, you need to cap the effort, make sure the right people are working on the initiative, and be clear up-front about the objectives, outcomes, and what success looks like. It’s important to quickly stop initiatives that are unlikely to give a positive outcome.

  3. Cloud first

    Big data and analytics projects are closely tied to cloud platforms, whether hybrid or fully public. They require the capability to process huge volumes of data on scalable and agile infrastructure, and trying to do this using traditional infrastructure approaches is neither economically sensible nor agile enough.

    Using cloud platforms lets businesses leverage the billions of dollars that big public cloud providers are investing rather than devote their own resources. Trying to match the capabilities being commoditised by AWS and others is too slow, and it can be difficult to find the right skills in-house.

  4. Partnering for success

    Today, it’s rare for any company to have all the answers, all the expertise, or all the data. Big data and analytics projects present huge opportunities as well as challenges.

    Deriving actionable insight from data typically requires a range of capabilities: data analysis and domain expertise; data engineering skills to wrangle the data into a form that supports analysis; and data science expertise to develop the algorithms that create the insight. All these stages require different functional and technical skills, with an overarching layer of governance, security, and control. Demand for these skills is high.
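    The stages above can be sketched in miniature. The following is an illustrative toy only, not a real pipeline: the records, field names, and functions are all hypothetical, and each stage is reduced to a few lines to show how data engineering (cleaning and typing raw records) hands off to a simple data science step (here, comparing churned and retained customers, echoing the churn-analysis example earlier).

    ```python
    # Hypothetical example: a minimal wrangle-then-analyse pipeline.
    # All record fields and helper names are invented for illustration.
    from statistics import mean

    # Raw, messy input as a data engineer might receive it.
    raw_records = [
        {"customer_id": "C1", "tenure_months": "24", "churned": "yes"},
        {"customer_id": "C2", "tenure_months": "6",  "churned": "no"},
        {"customer_id": "C3", "tenure_months": None, "churned": "yes"},  # missing value
        {"customer_id": "C4", "tenure_months": "36", "churned": "no"},
    ]

    def wrangle(records):
        """Data engineering stage: drop incomplete rows and coerce types."""
        clean = []
        for r in records:
            if r["tenure_months"] is None:
                continue  # a real pipeline might impute instead of dropping
            clean.append({
                "customer_id": r["customer_id"],
                "tenure_months": int(r["tenure_months"]),
                "churned": r["churned"] == "yes",
            })
        return clean

    def churn_insight(records):
        """Data science stage (trivially simplified): compare tenure of
        churned vs retained customers and compute the churn rate."""
        churned = [r["tenure_months"] for r in records if r["churned"]]
        retained = [r["tenure_months"] for r in records if not r["churned"]]
        return {
            "churn_rate": len(churned) / len(records),
            "avg_tenure_churned": mean(churned),
            "avg_tenure_retained": mean(retained),
        }

    insight = churn_insight(wrangle(raw_records))
    print(insight)
    ```

    Even in this toy form, the point holds: the wrangling and the analysis are different disciplines, and governance questions (which rows may be dropped, who may see customer identifiers) cut across both.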

Many organisations are looking to augment their operational data with interesting and unique third-party data, which improves the overall value of the analysis or even creates completely new and innovative business opportunities. This turns the sharing economy into the data sharing economy.