Keeping Up With Data #80

Source: https://medium.com/news-uk-technology/the-0-1-done-strategy-for-data-science-3c1737de14b3

Human middleware — a term brought to my attention by Lauren Balik’s recent LinkedIn post — refers to the new roles needed to make things work better. She argues that the modern data stack is creating many of these roles, which handle complexities not managed by the individual tools — and that this obviously has a meaningful financial impact on the buyers of those tools. Not taking a strategic approach to data often leads to complicated data infrastructures, requiring many new tools to manage the complexities created, as well as new roles, all while the business impact of data operations stalls. Many of the modern data tools bring obvious benefits. But as always, even when it comes to data, we should be value-driven, not technology-driven.

Outcomes, not outputs — the theme of this week’s list.

  • Do you really have a data strategy? With the growing importance of data, many companies are developing their data strategies. Sometimes it’s a well-thought-out document, sometimes a vague vision, sometimes a collection of projects. The article asks a key question: “How can we be sure to have a robust data strategy?” The proposed answer lies in looking at five traits of a robust strategy. A solid strategy makes clear and visible decisions (mainly about which business problems to address with data), shows the way to achieve the set goals, proposes a sequence of actions (with a timeline), and ultimately articulates the benefits (not only the costs and resources) of executing the strategy. (Luca Condosta (PhD) @ TDS)
  • Winning With Data: Say No To Insights, Yes To Out-of-sights! Actionable insights — I remember that mantra from a job I held ten years ago. Avinash shares that, in his experience, what analysts present as insights are often things in plain sight — things everyone can see (like sales declining). True insights, or out-of-sights, should be novel, actionable, credible, and relative (so that there’s no doubt as to magnitude or urgency). Where to look for out-of-sights? One needs purpose — typically driven by key business questions. The focus needs to be on outcomes, not outputs. It doesn’t matter how many dashboards, analyses, and models are produced; data and analytics should be judged by their impact. (Occam’s Razor by Avinash Kaushik)
  • The “0 / 1 / Done” Strategy for Data Science: The cryptic heading stands for “0-day Handovers, 1-day Prototyping, and declaring projects Done only when Completely Done”, and it offers recommendations for achieving operational excellence in applied data science delivery. What stands out to me is the practice of declaring data science projects done only when they are completely done. Too often (still!) I see data scientists getting some data, putting a few lines of code together, training a first model, writing an extra script to score new data with the model, and calling it a finished project. That’s just asking for trouble. Again, it’s not about the number of projects finished; it’s about their impact on the business. (Marios Perrakis @ News UK Technology)
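To make the last point concrete: instead of a one-off scoring script bolted onto a notebook, training and scoring can live in the same small, tested module, with the fitted model persisted so the same code path serves both. A minimal sketch (pure Python, hypothetical example — the article itself prescribes no particular code):

```python
# Hypothetical illustration: package train + score together instead of
# leaving scoring in a throwaway script. Pure stdlib for simplicity.
import json
from statistics import mean


def train(xs, ys):
    """Fit a simple least-squares line y = a*x + b and return it as a dict."""
    mx, my = mean(xs), mean(ys)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return {"a": a, "b": my - a * mx}


def save_model(model, path):
    """Persist the fitted model so scoring can reload exactly what was trained."""
    with open(path, "w") as f:
        json.dump(model, f)


def load_model(path):
    with open(path) as f:
        return json.load(f)


def score(model, xs):
    """Score new data with the same code path used at training time."""
    return [model["a"] * x + model["b"] for x in xs]


if __name__ == "__main__":
    model = train([1, 2, 3, 4], [2, 4, 6, 8])  # fits y = 2x
    print(score(model, [5]))
```

The point is not the (trivial) model but the shape: one module owns fit, persist, and score, so the project can actually be handed over and declared done.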

Remember: outcomes, not outputs.

In case you missed last week’s issue of Keeping up with data.

Thanks for reading!

Please feel free to share your thoughts or reading tips in the comments.

Follow me on Medium, LinkedIn and Twitter.
