
BD #5 - How to Fix Underperforming Data Teams

The 3 immediate changes I make when helping a new team

Prompt: Robot buried in waste, cartoon (made with Midjourney)

Introducing free stuff! Take a look at the new referral program - as a way of thanks you can now earn rewards for referring the newsletter to other people. Refer enough people and you'll receive everything from exclusive guides and content to branded T-shirts and private community access.

We're really keen to share these insights with as many people and businesses as possible. Check the link at the bottom of this post for more details 👇.

The pain of underperformance

One of the most painful things to deal with, whether working in or with a data team, is underperformance. It often takes a lot of discussion and planning to get a data team off the ground - it's a serious investment. But data and analytics is hard.

Sometimes it's the team's lack of engagement or a mismatch of skills.

More often it's due to the environment they're operating in.

  • Lack of engagement from senior stakeholders

  • Limited domain knowledge or support

  • Complex data landscapes with no central ownership or governance

  • Appetite for overly complicated solutions across the organisation

  • Ad hoc and bespoke delivery processes

Even if you're met with every point on this list, there are three things I always prioritise in early engagements that get quick results. Their benefits stack, leading to a snowball effect of successful deliveries.

Start with deeply understanding the stakeholders

Nothing in data and analytics is more important than deep domain and context knowledge.

Nothing.

Genius data professionals will spend a lot of time and effort coming up with brilliant solutions to the most headache-inducing problems that experienced stakeholders know to ignore or shortcut. Learning these traps and the shortcuts around them takes years of experience.

For some teams, this knowledge might already exist within the technical team - a very lucky position to be in. For many others, you'll have to outsource it. Even if you're both the data professional and the domain expert, answer these questions:

Who is going to be using this solution? How?

If that person isn't you, reframe all of your development around their input. Find out what they want from the solution and how it will help them. You'll start to notice that many requirements either weren't described properly or try to solve something that's easily done elsewhere, using tools they're not familiar with.

Now we do want to avoid getting stuck in "this is how we've always done it" arguments that throw up barriers to progress - a danger in itself when dealing with experts. Just ensure you regularly reach out and listen to those who know the data, and how it's used, best.

Copy simple approaches to get early feedback

Once you understand the stakeholders really well, build the simplest possible solution that meets their requirements.

Many of us, being techies, get excited by the technology and want to deploy state-of-the-art solutions to deliver the best possible results. The challenge is that these are often difficult to deliver and come with a host of maintenance and support burdens.

A hidden danger of this approach is that end users and new team members have a much harder time understanding what's been done and how it works. This can slow down onboarding or even go so far as to erode trust.

If you start with the goal of delivering the smallest, simplest solution possible, you remove the risk of unnecessary complexity. You may need to adapt and extend this solution over time, but you'll have strong benchmarks and safety nets to fall back on when other approaches fail later in development.

This allows you to deliver smaller things more often, which leads to more feedback, which leads to better solutions.

Build pipelines and components that are easily deployed

It's also common to find a team that does all of the above but just gets weighed down with inefficiencies. This might be wrangling data access from multiple external functions, fighting against a messy data architecture, or having to manually hand-hold their end-users every time they need to use the solution.

In this case, you need to start chipping away at the low-value stuff.

Take any part of the painful delivery process and build templates and repeatable processes that reduce the pain and can be scaled across the team to every solution. This might be as simple as writing guidelines and checklists so any team member can do the work consistently, but the ultimate goal is automated tooling that handles it for you.
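To make this concrete, here's a minimal sketch in Python of what "a checklist turned into a repeatable process" could look like (all names and data here are hypothetical illustrations, not from any real team): a small pipeline template that bakes the same validation steps into every solution built from it, so quality is applied consistently rather than remembered ad hoc.

```python
from dataclasses import dataclass, field
from typing import Any, Callable

# A delivery checklist captured as code: every solution assembled from this
# template runs the same extract -> validate -> deliver steps.
@dataclass
class PipelineTemplate:
    name: str
    extract: Callable[[], Any]
    deliver: Callable[[Any], Any]
    checks: list = field(default_factory=list)  # (check_fn, description) pairs

    def add_check(self, check: Callable[[Any], bool], description: str):
        self.checks.append((check, description))
        return self  # allow chaining

    def run(self):
        data = self.extract()
        # Run every checklist item; fail loudly with the descriptions.
        failures = [desc for check, desc in self.checks if not check(data)]
        if failures:
            raise ValueError(f"{self.name}: failed checks: {failures}")
        return self.deliver(data)

# Usage: a toy solution built from the template with two standard checks.
pipeline = (
    PipelineTemplate(
        name="monthly_sales_report",
        extract=lambda: [{"region": "EU", "sales": 120}, {"region": "US", "sales": 95}],
        deliver=lambda rows: sum(r["sales"] for r in rows),
    )
    .add_check(lambda rows: len(rows) > 0, "dataset is not empty")
    .add_check(lambda rows: all(r["sales"] >= 0 for r in rows), "no negative sales")
)

print(pipeline.run())  # → 215
```

The point isn't this particular design - it's that once the checklist lives in a shared template, every new solution gets it for free instead of relying on each team member's memory.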

Again, start small with this, or you can run afoul of premature optimisation and get lost designing implementation tooling instead of actual solutions for your end users.

Once you've been doing this for a while you'll have a suite of tools, code, documents, and helper services that accelerate your implementation and allow you to bake in quality and scalability.

Final thoughts

It can be difficult to know where to start when a data team isn't delivering to its potential. I've found these three focus areas to be the best starting point. Each builds towards better all-round practices and continual improvement of the delivery machine.

Good luck,

Adam

When you're ready, there are a few ways I can help you or your organisation:

  • Sponsor this newsletter: see sponsorship options here to reach the 2,000+ data and AI leaders that subscribe to this newsletter.

  • Consultancy: for help with your data and analytics initiatives, get in touch via email to [email protected]

  • Free, daily insights: on LinkedIn here or Twitter here
