
A business case that improves data quality


Many data engineering teams spend a lot of their time struggling to deal with upstream data. That includes:

  • Implementing pipelines to “clean” data
  • Responding to unexpected schema changes
  • Trying to find the owner of the data to give them the business context they need

This is high-effort, low-value work.

Worse, it affects your ability to deliver higher-value work.

If you want this to change, you need to:

  1. Make this effort (and cost) visible
  2. Propose how to reduce that cost

You can do this by collecting data on how often these things affect your work, what the most common root causes are, and how much time they cost your team.

Once you know the time, you can convert that to dollars (e.g. hours × your team's average hourly wage).
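As a rough sketch of that calculation (with hypothetical incident data and a hypothetical hourly rate, purely for illustration):

```python
# Minimal sketch of the cost estimate. Figures are illustrative assumptions,
# not real numbers.

# Incidents your team logged over, say, a quarter: (root cause, hours lost)
incidents = [
    ("unexpected schema change", 6.0),
    ("missing business context", 3.5),
    ("data cleaning pipeline rework", 8.0),
    ("unexpected schema change", 4.0),
]

AVG_HOURLY_WAGE = 75  # assumed average hourly cost of the team

total_hours = sum(hours for _, hours in incidents)
total_cost = total_hours * AVG_HOURLY_WAGE

# Group by root cause to surface the most common (and most expensive) ones
by_cause: dict[str, float] = {}
for cause, hours in incidents:
    by_cause[cause] = by_cause.get(cause, 0) + hours

print(f"Total: {total_hours} hours ≈ ${total_cost:,.0f} per quarter")
for cause, hours in sorted(by_cause.items(), key=lambda kv: -kv[1]):
    print(f"  {cause}: {hours} hours")
```

Even a simple summary like this gives you the two numbers the business case needs: the total cost, and where it comes from.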

And once you have dollars, you can make a business case: propose an investment that will reduce those costs.

That business case could be to get the upstream team to do a bit of work (such as adopting data contracts) to save a greater amount of work for your team (such as responding to unexpected schema changes).
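To give a feel for the kind of upstream work that might involve, here's a minimal, illustrative sketch (not from the post, and with made-up field names) of a check a producing team could run before shipping a change, so breaking schema changes are caught before they reach your pipelines:

```python
# Minimal sketch of a contract check: compare a proposed schema against the
# agreed contract and fail on breaking changes (removed fields or changed types).
# The contract format and field names here are illustrative assumptions.

contract = {"order_id": "string", "amount": "decimal", "created_at": "timestamp"}
proposed = {"order_id": "string", "amount": "float", "created_at": "timestamp"}

breaking_changes = [
    f"{field}: {expected} -> {proposed.get(field, 'REMOVED')}"
    for field, expected in contract.items()
    if proposed.get(field) != expected
]

if breaking_changes:
    raise SystemExit("Breaking schema changes detected:\n" + "\n".join(breaking_changes))
```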

That’s a business case any CTO/CFO/CEO would want to approve.

A business case that improves data quality.


P.S. This was inspired by one of the many great discussions at my workshop in Utrecht yesterday. If you're interested in attending a workshop, or would like to have discussions like this with me directly, please feel free to reply to this email. I respond to every email :)

Andrew Jones
Author
I build data platforms that reduce risk and drive revenue. Guaranteed, with data contracts.