Operationalizing Data & Machine Learning
Everyone will tell you that Artificial Intelligence and Machine Learning are the next Big Thing. Data scientists' job satisfaction and salaries are both at an all-time high, and companies routinely cite data science as their biggest strategic priority and area of investment. So where is my infallible, intelligent personal assistant? Successful applications that make our everyday lives better don't quite match this hype and effort, in either abundance or quality.
Certainly, AI is hard: more advanced algorithms, better data modelling, and better engineering approaches all take time, effort, and knowledge. But the reality of industrial AI is that success mostly relies on functions that have nothing to do with sophisticated algorithms, and everything to do with product operationalization. Operationalization ranges from defining what problem to solve (market understanding, customer workflows, product requirements), to verifying that what was built actually solves it (quality assurance, product analytics, customer success), to ensuring that the result is a viable product (business intelligence, marketing, and operations).
The requirements of success are classic but, in the case of AI, more complex to implement, and they demand a higher level of data literacy and operational maturity from the implementing organizations. Let's explore together some of the specifics of data operationalization, through the lens of how various organizations have driven the development of their Machine Learning platforms, and attempt to extract some general principles (lean, shift left, culture shift) and practices (devops and dataops, measure everything) for successfully operationalizing data & ML.