Ep. 248: Connie Siu - The Importance of a Data-Driven Culture

< Intro >

– Welcome to another
insightful episode of Count Me In.

Today, we're delving into the topic
of building a data-driven culture

with our esteemed guest, Connie Siu,

President of CDC Synectics Incorporated,

and an accomplished author.

Join us, as Connie shares her expertise

on essential elements of data-driven
culture within an organization,

and the significant impact it has
on today's business environment.

Stay tuned, as we explore
key challenges faced

during the transition,
and gain valuable insights

on assessing the effectiveness
of a data-driven culture.

This episode promises
to offer valuable insights,

into the power of data-driven

decision-making in shaping
organizational cultures

and driving business success.

Let's get started.

< Music >

Well, Connie, we want
to thank you so much

for coming back on
the Count Me In podcast.

And, today, we're going to be
talking about data-driven culture

and what that means.

And, so, maybe, we can start off,
you can elaborate what constitutes

having a data-driven culture
within an organization,

and why is it essential,
especially, in business today?

– That's a great start, Adam.

A data-driven culture is a consistent
set of values and beliefs

around distilling insights from data
to drive informed decision-making,

and that's happening across
the whole organization.

And I would offer three characteristics

that you can look for in an organization

where there's a data-driven culture.

The first one is you will
see individuals and teams

actively asking themselves questions like,

"What information we can draw on
to support and guide decisions."

You will see consistent efforts devoted

to pulling relevant data to analyze an issue.

And you will see open and frank
dialogues on understanding

the root cause of a problem
by looking closely at KPIs.

In terms of why it is essential
for businesses today,

there are four factors, two
external and two internal,

that are important to bear in mind.

The first external factor is
the competitive marketplace.

Companies need focused strategies

to target the right markets, to
differentiate themselves to compete,

and they need the market intelligence

to develop focused strategies.

The second external factor
is digital transformation.

The ability to adopt the right technologies

to drive business outcomes is critical.

Successful digital transformation

involves using technology
to capture relevant data

and analyze the results.

To automate processes, for instance,

companies need to know what
data is important and what's not.

The internal factors:

The first one is operational efficiency.

Businesses need to be efficient today,

and we are aware that costs
are going up: labor, materials.

And with the current inflation,

companies need to have a
good handle on the numbers.

The second internal factor is the
need to treat data as a strategic asset.

Every business has tons of data.

Imagine if you can mine
the data for intelligence;

you will uncover lots of opportunities
to make all kinds of improvements,

such as targeting
high-margin niche markets.

So these four factors require

an appreciation of making smart
choices from data analytics.

It is more important, than ever,
to build a data-driven culture.

– Yes, I think those are some great
factors to take into consideration,

especially, if you recognize
that your organization

doesn't have that data-driven culture.

Maybe we can talk about
some key challenges

that organizations face when
they're trying to transition to that.

Because it's not something
that happens overnight,

not something where you can flip a switch

and say, "Hey, we're a data-driven culture."

It's something that
builds over time, I'm sure.

– Yes, there are two key
challenges I'd like to share.

The first one is the lack
of technical capabilities.

And when I say technical capabilities,

they include the skills to identify what
data, or KPIs, are relevant to look at.

They include skills to
analyze the numbers.

For instance, how do you know you
have achieved efficiency improvement?

What would you look at to
monitor process performance?

Do you want to look at the
results on a weekly basis,

or does it make better sense to compare
month-over-month changes?

And there are many data
points you can look at,

but not all of them are relevant.
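
As a rough illustration of that weekly-versus-monthly question, here is a minimal Python sketch, with an invented cycle-time KPI and made-up numbers, of what a month-over-month comparison might look like:

```python
# Hypothetical sketch: month-over-month change for a simple cycle-time KPI.
# The KPI name and numbers are invented for illustration only.
monthly_cycle_time_days = {"Jan": 12.4, "Feb": 12.1, "Mar": 11.2, "Apr": 11.0}

months = list(monthly_cycle_time_days)
for prev, curr in zip(months, months[1:]):
    change = monthly_cycle_time_days[curr] - monthly_cycle_time_days[prev]
    pct = 100 * change / monthly_cycle_time_days[prev]
    print(f"{prev} -> {curr}: {change:+.1f} days ({pct:+.1f}%)")
```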

Once you have the data, you
need the tools to capture,

compile, and analyze them.

And many companies are still using
legacy systems that are not integrated.

So it is a tedious and often very
frustrating exercise to extract the data.

And to overcome that lack of technical
capabilities, start with training.

Training on the fundamental skills

of asking good questions to identify
what data we need to look at.

Training on the skills to analyze an issue.

And I would suggest training everyone

from the executives to people
working on the front line.

We don't need to train everyone
to be a data scientist,

but we do need them to have the
basic skills to ask good questions.

To understand what they need to look
at, and become good problem solvers.

And in terms of the legacy systems,

there's only so much you
can do patching them.

Eventually you need to
invest in modern technologies,

and there are so many
options out there today,

and, I want to emphasize this,
there's no need

to invest in
the most comprehensive ERP.

The key is to find the right applications
that meet your business needs.

Now, the second challenge I'd like
to talk about is the lack of buy-in.

When you don't have the support
of the senior management team

and the middle managers, it is
very difficult to make that shift.

Now, middle managers are accountable

for the team's performance.

So that fear of poor results is natural

because they reflect on
their leadership skills,

and no one wants to look bad.

When middle managers shy
away from results reporting,

they tend to do the minimum,
just what is needed.

Essentially, they create an environment

where there's little incentive for
the team to embrace analytics.

Now, when we look at the
senior management team,

when there's no buy-in
from them, on analytics,

you tend to see an authoritative
management style.

Top-down decisions will become
directives for the teams to execute.

And in this situation, the efforts made
on analytics are not valued at all.

To overcome the lack of support,

start with understanding
what the dynamics are today

and find your champion.

That champion could be a
team leader for a small group,

a middle manager, or an executive.

Someone who is receptive to analytics,

open to discussing results, and
also willing to devote the time

and effort to data analytics.

And once you have that champion,

pick a problem to tackle and develop
a game plan, and that game plan

has got to be practical, for folks
who will be doing the work.

Include, in your game plan:

1. How you're going to capture the data.

2. What tool you're going to use.

3. Who is going to do the analysis.

4. What forum you're going to use to bring
folks together to discuss the results.

5. Who is going to make
decisions on what action to take,

and implement the improvements.

And, then, go through the
cycle of monitoring the results

and refining your changes.

So those are the key points on
overcoming the lack of buy-in.

– Yes, that's a big one: making
sure you have that proponent,

that person, who can help lead
the change in the organization.

Because unless that's
coming from the top-down,

it's very difficult to drive
that change in the culture.

– Yes, definitely, and one
thing I forgot to mention is

share your success stories with
as many groups as you can.

Because the more you can broadcast
how analytics helps improve

business outcomes,
the more momentum

and excitement you will build around analytics.

– Now, one thing I
wanted to circle back to,

you were mentioning legacy systems,

and how it's hard to connect things
and there's a lot of manual data.

Maybe we can talk a little bit about

how companies should
strategically invest that money.

Especially if you're a medium
to small-sized business,

it's not always easy to
implement new systems,

you might not have the capital.

But you want to strategically invest
that money so that you can have

the right systems in place,

to foster that data culture
we've been talking about.

– Yes, there are three areas
I would offer for consideration.

The first one is to build the
capabilities within the organization.

So that goes back to training
employees on the skills that they need.

To ask good questions to
identify what data they need.

Train them on how to analyze results;

with those skills, they will take ownership

of the data capture and analysis.

The second area to
invest in is technology.

The key is to find
the right technologies.

Some companies will
spend thousands of dollars

and potentially millions to
invest in state-of-the-art ERP.

And, yet, they might be using only 10%
or even 5% of the functionality.

So any way you look at it, they're
not going to get the ROI on that.

And there are lots of smaller
applications out there,

cloud solutions, for instance,
today, that are very affordable.

Smaller businesses
might want to focus on those

and hone in on
the biggest functions

they need from that application,
and that's the best way to go.

And you want to make sure,
also, the tool is easy to use.

Those big ERPs, generally,
are clunky to use.

So the smaller and simpler the tool is,
the better the user adoption.

Because when users
use a tool haphazardly,

you end up with incorrect
and inaccurate data.

The third area to invest in,
it's got to be time and effort.

It takes time to do the work, capture data,

compile it, analyze, discuss, take
action, make improvements, et cetera.

So it's not something that you want
the staff to do for one month,

put it aside for a few months
and come back to it.

It doesn't work that way.

To build that culture,
you got to be consistent

and put in the time,
regularly, to build that habit.

So when you invest in these three areas,

technical capabilities, technology,
and the time and effort to build a habit,

you will build confidence
in your teams,

hopefully, across the organization,
to make the shift to a data-driven culture.

– Yes, no, that's great advice.

But when you think about all the data

that we have in organizations,
it can be very difficult.

And all that data is not
necessarily quality data.

The old adage "Garbage in, garbage out".

How can organizations
ensure that they possess

a complete set of accurate data?

And some of that time that you
were talking about putting in,

does that include cleaning up the data?

– That's an excellent question, Adam.

Quality data is a challenge
for many companies,

and it's nice to have
accurate and complete data.

But, in reality, most companies
still have a lot of work to do.

Of course, you can clean and
correct your historical data,

in your systems, but it is
usually a painful exercise

and often the game might
not be worth the effort.

So if you, indeed, need to make
decisions based on historical data,

I would suggest a couple of things.

The first one is to understand
where your data deficiencies are,

and incorporate assumptions
in your analysis.

Develop the worst-case
and the best-case scenario,

so you have the bookends.

And when you apply your business
savviness to your numbers,

you make better decisions.

For example, when Covid hit in 2020,

we know that in the
second half of the year,

the shipping costs went sky high.

So if you include the costs for
those six to eight months in 2020

when you calculate the average
margin for your portfolio,

you know the numbers will be off.

But you know the reasons, and
you can explain the anomalies.

The second option is you can exclude
those data points from your analysis.
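
To make those two options concrete, here is a minimal, hypothetical Python sketch, with invented margin figures, of keeping the anomalous months versus excluding them and treating the two averages as bookends:

```python
# Hypothetical sketch: average margin with and without known-anomalous months
# (e.g., a shipping-cost spike). All figures are invented for illustration.
monthly_margin_pct = {
    "2020-01": 32.0, "2020-02": 31.5, "2020-03": 30.8,
    "2020-07": 18.2, "2020-08": 17.5, "2020-09": 19.1,  # spike months
}
anomalous_months = {"2020-07", "2020-08", "2020-09"}

avg_all = sum(monthly_margin_pct.values()) / len(monthly_margin_pct)
clean = [v for m, v in monthly_margin_pct.items() if m not in anomalous_months]
avg_clean = sum(clean) / len(clean)

# The two results act as bookends for decision-making.
print(f"Average margin, all months kept:    {avg_all:.1f}%")
print(f"Average margin, anomalies excluded: {avg_clean:.1f}%")
```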

Now, the second part of making decisions

from historical data, as you mentioned:

If you have the time and manpower
to do the data cleanup, you can do it.

But I would suggest being very selective

about how much you want to do because
you don't want to get into a spiral.

Now, that's historical data.

Going forward, though, you have
more control over the data quality,

and there are two parts
to building good data.

The first part is to
capture meaningful data.

The second part is to have good data input.

Let's look at the first part first.

Capturing meaningful data:
that means training your staff

to have the skills to ask good questions,

so they know what data
they need to go after.

And, essentially, when
they're good at that,

they will become filters for
capturing meaningful data.

Now, the second part
is good data capture.

What is most critical here is to
have the tool that is easy to use.

Think about a worker in the field,
on a construction site.

They have limited amount
of time to enter data,

and you got to make it easy for them.

Use drop-down lists, for instance,
minimize the guesswork.

And if they're working out in the rain

and you're asking them to enter
20 data fields on a screen,

that's not going to happen.

So you want to ask for the
minimum amount of input,

and that goes back to
asking only for what is relevant.

Forget about what's not relevant
because it doesn't make sense

for them to do all that work.
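
As one possible sketch of that idea, here is a short, hypothetical Python example of a pared-down field spec for a site data-entry form, where the field names and drop-down options are invented, that keeps input minimal and constrains values to reduce guesswork:

```python
# Hypothetical sketch: a minimal data-entry spec with drop-down-style options
# and only a few required fields. Field names and options are invented.
ALLOWED = {
    "weather": ["dry", "rain", "snow"],
    "task": ["framing", "pouring", "inspection"],
}
REQUIRED_FIELDS = ["weather", "task", "crew_size"]  # keep the list short

def validate_entry(entry: dict) -> list:
    """Return a list of validation errors for one form submission."""
    errors = []
    for field in REQUIRED_FIELDS:
        if field not in entry:
            errors.append(f"missing field: {field}")
    for field, options in ALLOWED.items():
        if field in entry and entry[field] not in options:
            errors.append(f"{field} must be one of {options}")
    if "crew_size" in entry and not isinstance(entry["crew_size"], int):
        errors.append("crew_size must be a whole number")
    return errors

print(validate_entry({"weather": "rain", "task": "framing", "crew_size": 4}))  # []
```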

And you asked about
building trust in data, too,

and I would like to address that part:

how do you get people to
build trust in the data,

and, therefore, the output
that you generate from it?

One approach I suggest
is to look at the results

and do reasonableness tests.

For example, you can
use a subset of the data

and use that to verify the margins
for select SKUs in your portfolio,

and share the analytics with
as many people as possible.

Because the more pairs of
eyes you get on the results,

the better feedback you get, and
you can tweak your analysis.

The idea is not to go for perfection

because you don't
want analysis paralysis.

– Definitely, you don't want that,

and I think it's so easy when there's
so much data to get lost in the details.

And you can't talk about big data, you
can't talk about massive sets of data,

without talking about generative
AI tools like ChatGPT.

The ones that everybody's talking about.

But a lot of these tools, the ERP
systems that you're mentioning,

are incorporating

those types of generative AI to help
you with the analysis of the data.

So we've talked about how important
it is to have good data in your system.

Now, how can these tools help?

Obviously, they're not the end all, be all,

because with all AI you need
HI, Human Intelligence,

to make sure that they work together.

But how can these tools
help with reliable insights,

especially, with the power
of AI that's out there?

– ChatGPT has really created
a big buzz out there around AI.

And with ChatGPT and AI-driven
insights, data quality is very important.

And back in March, earlier this year,

OpenAI did share that the
fourth generation of GPT,

on average, makes up
stuff 20% of the time.

And you've heard about
ChatGPT hallucinations,

generating outputs based
on wrong information.

And I'd also like to mention
a couple of articles

that Microsoft had to take down, what
they claimed were unsupervised

AI-generated articles
on their travel website.

One of the articles offered recommendations

for travelers visiting Ottawa,
in Canada, our capital city,

and it suggested that you've
got to visit the food bank

with an empty stomach.

And the second article
was a recommendation

for visitors going to Montreal, in Canada,

and one of the suggestions was
you got to try mouth-watering

dishes such as McDonald's hamburger.

So you got to be careful about how

you're looking at the
AI-generated outputs.

Do your fact-checking and use your judgment
as well; see if it makes sense.

Because if you just use what is presented,

you could make poor decisions and
potentially expose the company

to legal and non-compliance risks.

Now, you talk about using
AI to generate content

and incorporating part of
that into in-house tools.

Using AI based on an internal data set

is probably, quote-unquote, somewhat

more reliable
when you have quality data.

But the same thing applies:
you need to make sure

that what you feed in is reasonable.

Also, check the performance of your
AI model, and there are metrics

out there that you can look at now:

accuracy,
precision, and F1 score, et cetera.
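
For readers who want to see what those metrics mean, here is a minimal, hypothetical Python sketch, with invented labels and predictions, that computes accuracy, precision, recall, and F1 from a small set of results:

```python
# Hypothetical sketch: basic classification metrics from invented data.
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp) if (tp + fp) else 0.0
recall = tp / (tp + fn) if (tp + fn) else 0.0
f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

print(f"accuracy={accuracy:.2f} precision={precision:.2f} "
      f"recall={recall:.2f} f1={f1:.2f}")
```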

So you need to be careful of
how you are using that model.

And if you look at a lot
of articles out there,

now, they're talking about how
companies are diving into AI,

but not all of them are deploying
it at a big scale, at this point.

Because of the concerns
about the accuracy

and how data could be misused,

and also about generating output that
could misguide decisions.

– Yes, that's a really good point.

Things to always keep in mind
when using any generative AI.

Now, what if there's a listener

listening to this conversation, right now,

and they're like, "Connie, I've done
all the points that you've made.

All the points you've made I've
implemented in my organization."

Now, how can they
assess the effectiveness

of this new data-driven culture that
they've created in their organization?

– There are three things
they could look for

to assess the effectiveness
of their data-driven culture.

The first thing is
enthusiasm around analytics.

Are people asking good questions to
verify observations? That's one thing.

When people share
data or information,

do others ask for justification
and verification?

Are they asking good
questions to pinpoint problems?

Are they getting clarity on work ideas?

When your boss tells you,
"Oh, we've got to be efficient,"

and right away you hear someone
ask, "What do you mean by efficiency?

Can you be more specific about it?"

Because once you hone
in on those specifics,

it will help you to identify,
"Ah, you're talking about the speed

of the process, or the errors
that we're making."

That will help you identify the data to
capture and, therefore, the KPIs

that you need to hone in on
for your analysis.

The second thing you'll
look for is transparency.

When you have a solid
data-driven culture,

people are very receptive
to what the data present.

They're very objective and impartial

when it comes to interpreting the results,

and they're ready to
share the information.

No reservation about it, good and bad.

Let's just look at it and be
open about discussing

what that means and how
we need to respond.

The third thing you can look
for is that trust in each other.

When people are very comfortable
in sharing results openly,

and they're very forthcoming,
focusing on issues

rather than personal attacks
or assigning blame.

When people are
not afraid to speak up,

you can see that you
have a data-driven culture,

where people are very forthcoming and,
in fact, collaborating well together.

Now, you've also asked about
the second part of that question,

whether the culture is fostering
better decision-making.

I would put the onus on the champion.

We talked about the champion before.

As the champion, he needs to reinforce

that discipline is in place to turn data
into actions and improvements.

He also needs to pay attention
to whether folks are committing

to measurements, and analysis,

and he will need to observe
how they make decisions.

The champion also needs to monitor
if the capabilities are in place.

You got to give people the tools
to do the work, track the impact.

Make sure time is
allocated to discuss the results,

take action, and then
continue to monitor it,

and I would make a comment on this.

Random improvements
are often short-lived,

but evidence-based
improvements are sustainable

because they indeed tackle a problem
that is important to the business.

– That's really important, and as
you have this data-driven culture,

you'll be able to
see things more quickly.

How important is it to swiftly
act on these new insights

that you're gaining,

as you're seeing the data
and seeing the big picture

but in a better way than you were before?

– It is super important because doing
the analytics is just part of the work.

Turning the analytics into action
and following through is very important.

And I'd like to share
a story on Alan Mulally

who is the former CEO
of Ford Motor Company.

When he joined Ford in
2006 and became the CEO,

Ford had lost $17 billion in
the previous fiscal year.

And over the course of eight years,

what he did was turn
the weekly executive team meeting

into a collaboration exercise.

Executives would come to the
meeting with the numbers,

with the issues, table them openly,

and ask for advice and ideas
on what they could do about them.

That's a big contrast to his predecessor.

What used to happen is the expectation

was, "You don't come into
this executive team meeting

without a solution to your problem."

So what happened then is there was
no incentive to share issues,

and that was really forcing, in a way
guiding, people to work in silos.

When Alan had his first
executive team meeting,

after he became CEO, he was shocked

when he looked at the dashboards
that folks brought to the meeting.

There were hardly any red lights.

If you think about the dashboards,

the green lights and red lights, there
were hardly any red lights.

And the first question he
posed to the team was,

"Folks, we know the
company is losing money.

How can we only have a few
red lights on this dashboard?"

So you can see that he really
turned the company around

when he exercised the regimen
of "come and bring the results,

whatever they are, green, red,
yellow, bring them all in."

You just need to identify
what the issues are.

If you have some ideas on what
you're going to do with them,

let's share them, openly, with the team.

Others will have ideas, or experience,

or people with a skill set that
will be able to offer some help.

So he really changed that whole culture,

driving the data-driven culture home
by actively promoting that every week.

So big kudos to him; when he
retired from Ford in 2014,

Ford was a money-making machine.

It had a profit of $7 billion when he retired.

So that speaks volumes to his leadership

and how he changed that
whole culture around.

So it also illustrates that, yes,
doing the analytics is one part.

But getting the folks together
to talk about the results openly,

no hidden agendas: "Let's be open and
honest about what's happening."

And let's solve and
tackle the issues together.

– That's so important,

and some people call those
the fierce conversations.

Those conversations that
may make you uncomfortable,

but they're super important
to being open and honest,

and to making your organization successful.

– Yes, definitely, because you can
only do so much doing analytics.

And, yes, you can have
a center of excellence,

building intelligence in
this particular group.

But if you don't have that arena
for people to talk about it openly

and share the information,
it's not going to help a whole lot.

– Well, this has been a
wonderful conversation.

Thank you so much for
sharing your insights

and the importance of having
that data-driven culture, Connie.

Thank you so much for
coming back on the podcast.

– Happy to be here, and
thanks for having me.

< Outro >

– This has been Count Me In,

IMA's podcast, providing you with the
latest perspectives of thought leaders,

from the accounting
and finance profession.

If you like what you heard,
and you'd like to be counted in

for more relevant accounting
and finance education,

visit IMA's website at www.imanet.org.

Creators and Guests

Adam Larson
Producer and co-host of the Count Me In podcast

Connie Siu
Guest | President of CDC Synectics Inc. | Keynote speaker | Author