
Data Knowledge Pioneers Episode 2: Taking on Fragmented and Tribal Knowledge

by Nick Freund
January 31, 2024

I am pleased to bring you the next episode of the Data Knowledge Pioneers podcast, where I speak with data leaders and practitioners about the problems they face in creating, curating, and disseminating knowledge about their data.

On today's episode, I speak with Michelle Ballen-Griffin, Head of Data Analytics at Future, and Scott Breitenother, the founder of Brooklyn Data Co.

We explore the problems organizations face with fragmented tribal knowledge about their data, and some of the solutions they have employed to break down those barriers. This is a fascinating conversation with data practitioners who have experience creating a data-driven culture in organizations both large and small. 

Listen on Apple Podcasts, Spotify, or YouTube.

Transcript

[00:00:00] Nick Freund: Hey everyone. Welcome back to Data Knowledge Pioneers, presented by Workstream.io, and we're again exploring how organizations create shared consciousness around their data. I'm Nick Freund and we're speaking with leaders and practitioners around the acute problems they experience in creating and disseminating knowledge about your data. And so, specifically today we're talking about the issue of fragmented and tribal knowledge, and how you can capture that knowledge, institutionalize it, and ultimately enable your team. And really excited to introduce two awesome data leaders who always make me think differently about these types of topics.

[00:00:50] So first off, we have Michelle Ballen, who's the head of Data Analytics at Future, which provides one-on-one digital training with fitness coaches. And Scott Breitenother, who's the founder of Brooklyn Data Company, which is a very large and fast-growing data consultancy. Which, among other things, gives out these awesome t-shirts. And so if you ever see Scott, you definitely should ask him for one of these. You probably can get one if you email him at – I think it's just Scott at Brooklyn Data Company. So anyways, Michelle, Scott, thanks so much for joining me today. 

[00:01:31] Michelle Ballen-Griffin: Thanks for having us. Yes, that is my go-to gym shirt. Scott, I keep meaning to send you a gym selfie when I'm wearing the Brooklyn Data Co shirt.

[00:01:40] Scott Breitenother: I appreciate it. We do t-shirts and data very well. Those are the two things we do. And everything else, it's okay. But t-shirts and data we do well. I'm glad you both like it. 

[00:01:50] Nick Freund: They're very comfortable. And I will plus one that I work out in this shirt quite a bit. Non sequiturs aside, I wanted to start by talking about the problem of tribal and fragmented knowledge about your data. So, what is this? And how would we define the problem? Michelle, do you have thoughts on how you would define the problem? I have my opinion, but as a practitioner, I would be really interested to hear how you define it. 

[00:02:24] Michelle Ballen-Griffin: Yeah. I mean, there's cross-functional work streams in different pockets of every organization. They have different initiatives that they're testing out. They're learning all the time. And how do you make sure that the knowledge that they're gaining via the initiatives they are testing, the hypotheses that they're validating, actually get disseminated and shared throughout the organization? Because something that the team might learn on a marketing initiative, maybe it was like a campaign creative – this really resonated with clients from an acquisition perspective – might be relevant to a member experience team who is doing ongoing sales, and keeping clients engaged and retained. 

And so how do we make sure that the learnings that are happening over there, are being shared throughout the organization? Everyone has that shared understanding of how the business is going, what we're learning every day about our users, so that we can make more informed decisions. And really have just more collaborative discussions around what we should be doing next to improve the business.

[00:03:20] Scott Breitenother: I totally agree with what Michelle is saying. And I think this is always a people, process, and technology challenge. You've gotta create the culture of sharing this knowledge. But you also have to create the structures for it. The intranets, documenting the code, and just build a culture of knowledge sharing, which is very challenging. But it's one of those things with culture: if you see a good culture, you can kind of recognize it. But it's hard to think about all the millions of components that went into building that knowledge sharing culture.

[00:03:57] Nick Freund: You know, Michelle, you had talked about that example of the marketing team has seen something and is building some knowledge. Maybe that's relevant to the customer experience team. Have you experienced this problem mostly with one team sharing knowledge to another? Or do you experience it a lot with the data team sharing knowledge or capturing knowledge from the business? Or do you see both? 

[00:04:22] Michelle Ballen-Griffin: I feel like I've definitely seen, frequently, teams giving their best effort to share with other teams. Like the product team: “We learned this thing. Let's put out release notes or send around an update.” I feel like, personally, and this is definitely true for my role at smaller startup organizations, the data team is super well equipped to play this role of making sure that this knowledge is disseminated and shared across the organization. And it's something that I kind of consider part of my team's mandate: We are business partners to every different operation. We are very close to them. We know what we're doing. We've been part of the strategy discussions around what data we wanna create, what questions we wanna answer, right? And so I think it makes a lot of sense for our team, who's already part of all of these disparate conversations, to come together, pull all those learnings together, and make sure that the team across the board is being made aware of new learnings. And also, I think it's our responsibility to facilitate the conversations to get people talking about these learnings. And then it kind of spurs and inspires new ideas and new questions and just gets this cycle going. So, I like to think of it as one of my team's responsibilities to create that shared knowledge and shared understanding. 

[00:05:44] Nick Freund: And do you scope that around data specifically, or do you see that as almost a broader mandate, given how your team works across Future or any other company you've worked at?

[00:05:57] Michelle Ballen-Griffin: I guess a broader mandate. I mean, we're working with the business stakeholders every day to learn new things and test out new hypotheses, validate those hypotheses. Then basically, how can we recap all those learnings and then make sure that everyone has access to them, and has an opportunity to ask follow-up questions, and have a follow up opportunity to have more in-depth conversations around, “What does this mean? Are we interpreting it correctly? How can we use it moving forward?” 

[00:06:29] Nick Freund: Scott, any responses to that? 

[00:06:32] Scott Breitenother: Yeah, I totally subscribe to the Michelle school of thought on this. I would say that the data team's mandate is not just to produce the data, produce a data warehouse, or produce reports. I kind of think the data team is accountable for how data is used across the entire organization, and creating a culture where the entire organization is making data-driven decisions. Which is kind of weird because that's a dotted line type of responsibility. 

How can a centralized data team be accountable for outputs and data-driven decisions that they're not in the room for? It's by being that hub of knowledge and disseminating the learnings. Being good listeners: “CX team, I hear you're thinking about this decision. This other thing that the marketing team did the other week might be really relevant.” 

And so it’s a multi-pronged kind of approach. You gotta push out reports and analysis. You’ve gotta present. The data team needs to be talking amongst itself and educating each other, and being ambassadors for the learnings, and pushing them out in the org. They've gotta be ambassadors for this culture of data and data-driven decisions, which is, again, hard to do. And you gotta build trust. Gotta build relationships with your stakeholders. But if you do it right, the data team essentially becomes this artery of insight and knowledge that's going out to all aspects of the organization.

[00:08:18] Michelle Ballen-Griffin: Absolutely. There's so much more to it than just documenting and pushing out the knowledge, like pushing out the write-ups or the recaps. It's involving the stakeholders early on in the process, right? Getting them to be part of the question generation: “What questions do we have?” Getting them excited about, and bought into, how we're going to capture the answer to this question. Or how we're gonna validate this hypothesis. And really getting them involved and making them feel part of the process, so that it feels that much more relevant to them. 

The way that we push it out, making it high-quality design, and something that people actually wanna consume. And then also, to Scott's point about being empathetic – really listening and creating a safe environment for people to have conversations and discussions around the data. That's all part of it. It's not just simply writing up the learnings and sharing them. It's creating the whole culture, really. 

[00:09:09] Nick Freund: The way I think about the role of the data and the analytics team, you become like the nerve center of the organization, right? Or of decision making. But I think just of the organization in general because of everything that both of you have been talking about. And with great trust comes great responsibility. Or great power comes great responsibility? Whatever they say in Spider-Man. But Scott, I'll set you up with this one. What do you think is the most important context that consumers of your data need to do their job? Or to make better decisions? Is there anything that you think is particularly important? 

[00:09:55] Scott Breitenother: I think just knowing about the universe of data that's available. And so I'm super into enablement lately. And I probably should have always been into it. At Brooklyn Data, we implement data strategies and data infrastructure for clients. And we do train the users: both the data team, so they can continue to build and maintain the infrastructure, and the end stakeholders. But now we're doubling down on the stakeholder enablement. We'll probably do 5, 6, 7, 8 two-hour long training sessions with the stakeholders to make sure they really feel comfortable using the tools, and understanding what data is available, and how to self-serve. I think that's so important. Building that understanding of: “Here's the category of decisions that stakeholders should be able to take on their own. Here's the category where you might want to ask the advice of the data team, because it might be kind of uncharted territory. And then here's the category of decisions that probably should be more data team-driven analysis.” And just building the knowledge and the relationships, so the stakeholders know when to raise their hand and ask for help, to phone a friend. 

One other plug I would say is that, I really recommend Michelle's blog post on adding annotations to analysis. I regularly send a link to that blog post out to people. It's really cool. More organizations should do this, which is essentially annotating key dates and milestones so that your data has context. Because without context, data is very hard to interpret, and easy to misinterpret. 

And then the challenge is, unless you document it, context walks out the door every single day. And when people leave the organization completely, you can lose context and never get it back. So I think annotations – I highly recommend Michelle's blog post.

[00:12:03] Michelle Ballen-Griffin: I mean, there's not much to the blog post. It's just: keep a list of dates and what happened on that day and make sure that that's [documented]. 

[00:12:10] Scott Breitenother: But sometimes it's the simple things that matter. 

[00:12:12] Michelle Ballen-Griffin: Totally. I think that's also part of creating knowledge, right? Something that I'm big on, and I think is very common now among data leaders is, how do I become a more proactive versus a reactive team? And so something for me, something that I did early on in my career, I would get the request: “Why is conversion rate down? Why repeat buyers?” 

[00:12:31] Scott Breitenother: It's probably for me. 

[00:12:32] Michelle Ballen-Griffin: Yeah, maybe. 

[00:12:34] Nick Freund: For everyone who's watching, Scott and Michelle used to work together way back. It was at Casper, where you two worked together? 

[00:12:41] Scott Breitenother: And we're forever friends now. 

[00:12:43] Michelle Ballen-Griffin: Forever friends. Getting that question and having to dig into the data, and you never found the answer. It was always, “We think it's a seasonal blip,” or something that's a known issue, like a known bug on the engineering team. And so keeping a record of those logs now just helps if we do ever see a shift in a metric – and this is something that I train the whole organization on, especially my team: when a metric shifts, we should already know why that was. 

Even if ideally we would probably be writing experiments so we could say, we launched this feature to 50% of people and we know that it was hurting conversion rates. So that's why we know conversion rate went down. But if we're not able to do that, at least having that record of, we sent a huge email blast that day. So it was significantly more traffic that was low quality, not high intent purchasers, and so it's all expected. Then that way we can avoid that reactive work. But it's also creating that knowledge of: “When I see this metric, I can quickly reference that change log and say, this makes sense.” And now we don't have to spend time doing the reverse engineering segmentation and stuff...

[00:13:45] Scott Breitenother: You can do it incrementally every day, too. Versus just do it as it happens.

[00:13:56] Michelle Ballen-Griffin: Yeah, exactly. And the whole organization contributes to it. And it's something that I've done at like my last five jobs. 
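A minimal sketch of the kind of change log Michelle describes might look like the following, assuming a shared CSV of dated annotations that the whole organization contributes to; the file and column names here are hypothetical, not anything Future or Brooklyn Data prescribes.

```python
import csv
from datetime import date, timedelta

# Hypothetical shared annotations file; one row per notable event
# (feature launches, email blasts, known bugs), e.g.:
#   date,team,note
#   2023-03-02,marketing,Huge reactivation email blast sent
ANNOTATIONS_FILE = "metric_annotations.csv"

def load_annotations(path: str = ANNOTATIONS_FILE) -> list[dict]:
    """Read the change log into a list of {date, team, note} records."""
    with open(path, newline="") as f:
        return [
            {"date": date.fromisoformat(row["date"]), "team": row["team"], "note": row["note"]}
            for row in csv.DictReader(f)
        ]

def explain_shift(metric_date: date, window_days: int = 7) -> list[dict]:
    """Annotations within a few days of the date a metric moved."""
    window = timedelta(days=window_days)
    return [a for a in load_annotations() if abs(a["date"] - metric_date) <= window]

# Conversion rate dipped on March 3rd? Check the log before reverse-engineering segments.
for event in explain_shift(date(2023, 3, 3)):
    print(f"{event['date']} [{event['team']}]: {event['note']}")
```

The point is less the tooling than the habit: appending one line per event, incrementally, so the context is already there when a metric moves.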

[00:14:04] Nick Freund: I think regardless of the exact methodology – and I will plus one that Michelle's blog post was great – I think that your point is about incrementally doing this, right? If it becomes a habit and you're building that knowledge on an ongoing basis, right? Like you then have it. 

As opposed to, wow, how do we create this knowledge from scratch? And when I talk to data leaders about, how do you build out knowledge, a lot of times that's what trips people up, is, how do we go from a state of zero to one here? It can be a lot of overhead if you haven't been investing in it the whole way. 

[00:14:48] Scott Breitenother: And I would just add, yes, it would've been better if you started a year ago. But the next best option is to start today. If you are a growing company, you're creating more data in the next six months than you did in the last two years. And so don’t cry or obsess about cleaning up the old data, or the missed opportunity to annotate the old data. Just move forward and focus on the new data. Because if you're growing, you're creating so much more data to the point that the old data is almost irrelevant. 

I remember when I started at Casper and we moved to a new setup, and I was like, “What about the data from the first 10,000 customers, you know?” Who cares? It's important, but just focus on the next 500,000. You know what I mean? 

[00:15:36] Michelle Ballen-Griffin: Yeah. 

[00:15:37] Nick Freund: Scott, I wanted to go back to one of the points you made about when you're at Brooklyn Data, when you're training stakeholders, and a lot of what you're trying to do is help them figure out: “What can I answer myself? What do I need to phone a friend for?”

Is that more idiosyncratic business to business? Or is there a broader framework that you have there that you rip and replace, and use regardless of who you're training? 

[00:16:04] Scott Breitenother: It’s two indices that decide which category it is. The first is your familiarity with the dataset. So, if business stakeholders working with the dataset are extremely familiar with its idiosyncrasies, they should be able to analyze it without phoning a friend. As soon as they start getting into new datasets – as much as we try in the data world to make sure every single dataset is documented and joins together perfectly, it's just unrealistic. And so as a business stakeholder starts to join unfamiliar datasets or new datasets, or join multiple datasets, that's when it's probably worth raising a hand.

And the other index is the importance of the decision. If we're deciding on something small, versus a board presentation. So it is a 2x2 matrix of familiarity or newness of the dataset, and importance of the decision. That's how you kind of decide when to raise your hand.

[00:17:15] Nick Freund: If nothing else, we know you run a consulting company because you just introduced a two-by-two matrix into the discussion. 

[00:17:20] Scott Breitenother: You know, if you can't solve a problem with a two-by-two matrix, it isn't worth solving.

[00:17:27] Nick Freund: So on one axis of the two-by-two here, you're saying technical complexity. And then one is business import or impact.

[00:17:37] Scott Breitenother: It's familiarity. If I'm a business stakeholder in marketing and I'm analyzing the marketing data source I always look at, I'm very familiar. If I'm analyzing shipping data, I'm unfamiliar. And so as you get to data you're less and less familiar with, that's when you should start raising your hand. And then it's business impact or importance. Which is not always the same thing. Again, a chart in a board presentation might have low business impact. But it might have high business importance. So as things become either more important, or you're less familiar with the dataset, that's when you should raise your hand. Michelle, what do you think? And two-by-two matrix answers are the only accepted form.
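Scott's two-by-two could be sketched as a simple lookup, something like the code below. The quadrant wording and the binary thresholds are illustrative, not a framework Brooklyn Data publishes.

```python
def when_to_raise_your_hand(familiar_with_dataset: bool, high_importance: bool) -> str:
    """A rough encoding of the 2x2: dataset familiarity vs. importance of the decision."""
    if familiar_with_dataset and not high_importance:
        return "self-serve"  # your everyday data, low stakes: just answer it
    if familiar_with_dataset and high_importance:
        return "self-serve, but have the data team sanity-check the result"
    if not familiar_with_dataset and not high_importance:
        return "phone a friend on the data team for advice"
    return "hand it to the data team for a data-team-driven analysis"

# Example: a marketer joining unfamiliar shipping data into a board deck.
print(when_to_raise_your_hand(familiar_with_dataset=False, high_importance=True))
```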

[00:18:31] Michelle Ballen-Griffin: Granted, I've only been at my current organization for seven or eight months now, so things might change. But the way that I've been approaching it, and my team has been approaching it, is we're building out these self-service tools and this foundation, so that people can be self-service to an extent. I would say opportunity sizing: we wanna send this email to all people who have been deactivated for X months, and I can quickly pull that list without being someone on the data team. 

In terms of doing deep analysis and trying to uncover new trends, I think the self-service tools make it possible for my team to move a lot faster. And I almost encourage more of that business partnership, where in the meeting together, the marketing analyst and the marketer will have a discussion, figure out: “What questions do we have?” And on the fly, use the self-service tools together with the data person driving and sharing their screen, almost like Figma. But with data answering those questions on the fly together. And that's why the self-service tools are so great, that we built out that foundation, and really encouraging them to lean more on the data. Cause it's just unrealistic for the marketing team to become experts in this data when they have all their tools that they need to become experts in and own. 

So, I really encourage more of the business partnership, where the exploration, the question generation, and the hypothesis generation happen together, with the data person driving the data side. We don't do as much training on the self-service tools. Again, quick opportunity sizing: I need to pull a list of all these users who I need to contact, or roughly how many people have used this feature, just to get a quick gut check. But in terms of doing analysis to uncover opportunities, and then especially for analyzing the incrementality of different efforts, that would be owned by the data person. And still using those same self-service tools and building on that foundation, it's almost like we're enabling ourselves to move a lot faster, and be more effective. Maybe over time I'll realize that this doesn't work and we'll need to do more training and lean more on the operators.
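The quick opportunity sizing Michelle mentions might look something like this in practice, assuming a simple exported users table; the file and column names are hypothetical.

```python
import pandas as pd

# Hypothetical export with columns: user_id, email, last_active_at
users = pd.read_csv("users.csv", parse_dates=["last_active_at"])

def deactivated_for(users: pd.DataFrame, months: int) -> pd.DataFrame:
    """Users who have been inactive for at least `months` months."""
    cutoff = pd.Timestamp.now() - pd.DateOffset(months=months)
    return users[users["last_active_at"] < cutoff]

# "We wanna email everyone who's been deactivated for six months": size it and pull the list.
lapsed = deactivated_for(users, months=6)
print(f"{len(lapsed)} users to contact")
lapsed[["user_id", "email"]].to_csv("reactivation_list.csv", index=False)
```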

[00:20:41] Scott Breitenother: When it gets bigger, I think.

[00:20:44] Michelle Ballen-Griffin: Yeah. 

[00:20:44] Scott Breitenother: As I've started to work with larger and larger organizations, it's become more and more apparent that the data team can't be everywhere all the time.

[00:20:53] Michelle Ballen-Griffin: Right. 

[00:20:55] Scott Breitenother: And I totally agree. I think when you're early and mid-stage size, having the data team as deeply plugged in makes sense. But then as you start to get a big company, you have the data team, you have the business stakeholders, and then you start to have a Business Analyst, which is this whole new role that is not in this world or that world, you know what I mean? 

[00:21:18] Michelle Ballen-Griffin: Yeah. And we do have evangelists on each team who are taking it upon themselves to learn the tools much deeper so that they can play that role as well. There was something else that you said, but now I'm losing my train of thought. I'll come back to it. 

[00:21:39] Nick Freund: We can come back to it if you think of it. 

[00:21:41] Scott Breitenother: We'll add it in post, you know? 

[00:21:43] Nick Freund: Yeah, exactly. We can always just say that, you know, we'll add it in post, and just makes it sound cool. 

[00:21:48] Scott Breitenother: It sounds cool.

[00:21:50] Nick Freund: Scott, you were talking about your perspective that the dynamics maybe changed a little bit as companies get bigger, right? I totally agree with that. Given your work, is there a threshold there where you think what Michelle has been talking about truly breaks down? Is it thousands of people? Is it less than that? Any perspective on enablement strategies and self-service strategies given company size?

[00:22:24] Scott Breitenother: Yeah, but I think it isn't a threshold company-wide. You might find that it happens on a team-by-team basis, as each team gets larger and larger, and has their own dedicated analytical resources. And also the data team starts to specialize. They start to become the Data Platform team, instead of this Data and Analytics team. You've got this Data Platform team whose whole job is landing data in the data warehouse, cleaning it up in the data model. You might even find at a certain size, you don't even have a Data Platform team. You have the Data Integrations team, and then the Data Modeling team. As companies get larger and larger, people inevitably specialize. And there's just more and more hops between the person that's modeling the data and setting things up, who might have one aspect of context, and the business stakeholder that's further and further removed. 

And that's why training and enablement, data catalogs, discovery tools, sharing reports and insights in monthly or weekly newsletters to the company – all of that becomes necessary. It's funny – you know, my background is in smaller companies – I would sit there and be like, this seems like the silliest thing in the world, to send out a newsletter of insights. I look around to my left, my right. Everybody that needs to know, knows. But it's the real deal. Now that I work with much larger companies, sometimes you sit there and you're just like, if they only knew what they knew. You start to go into larger companies, and not only is it an issue of getting data from this centralized core out into the far regions of the universe. They might actually have five data warehouses, and there might not even be just one centralized core that's having issues getting to the periphery of the business. There might be five decentralized, central nodes. It just gets more and more complex, to the point where training, enablement, pushing out knowledge, becomes really required. 

[00:24:40] Nick Freund: I think a lot of this ultimately boils down to the problem of information asymmetries. And there's a lot of them, I think, around data, in all the ways that you're talking about, right? But getting to the touchy-feely vision of shared consciousness around your data, right? Which is a standard we throw up sometimes when we talk at Workstream. It's about breaking down those asymmetries and those barriers to knowledge, and figuring out how you get knowledge to move across those organizational boundaries. 

Michelle, on that, when you think about – and this starts to get meta, so bear with me – when you think of all of the self-service tooling you've put out there, other pieces of data or data assets that you've put out there, enabled the organization on, are there times and places where you feel like you actually lack knowledge about how that's bringing value? Or data about the data, or have you not felt that yet because you're new or that's totally off? 

[00:26:05] Michelle Ballen-Griffin: Yeah, I think it's just that at small organizations I have an opportunity to talk to people one-on-one, and I also can, just by monitoring Slack, get a feel for how this is being used, and whether it's actually being absorbed and valued. So, we don't currently do any sort of qualitative surveys around how people are feeling about data or the data products. We might at some point. But I feel like I have a good sense just from the fact that it's a small organization, and I can kind of see everything that's going on.

[00:26:35] Nick Freund: What are you ultimately looking for there? Like what's used? What's valued? How do you break that down and try to understand the impact if you're being successful? 

[00:26:47] Michelle Ballen-Griffin: Certainly, when questions come up in Slack, are they being answered pretty quickly? Also, all of the effort that my team's doing, is it actually generating the output that the teams expect, or that their team is actually able to use? So, basically limiting the amount of wasted work that our team is doing. And that starts with, when a question comes in, are we part of the initial strategy discussion, and thus we understand the context? And we can make sure that we've worked with the team to think through and figure out what questions they wanna answer, and make sure that all the work we're doing is ultimately useful for the problem at hand, right? 

Before I learned how to be better about process, and working with stakeholders and understanding what their challenges were, there was a lot of wasted work. And I feel like today, we have basically no wasted work. And everything that we're doing builds on itself: when we get the next question, we can leverage this foundation that we built, and tweak it a little bit. Make it a little bit more flexible. And now we can open up a whole new area of opportunity for us to answer new questions. So, I think it's limiting time between asking a question and getting an answer, the amount of work that our team is doing that is actually useful, versus wasted effort, and obviously usage. But what good is usage of our tools if people are making the wrong decisions? So, I don't read too much into that. But currently, engagement is high. 

[00:28:24] Scott Breitenother: I have a question for you, Michelle. I find that it's a tough decision how generalizable and flexible to make something. Sometimes I've been so jaded by not making something flexible enough that I err too much on the side of making things flexible now, instead of going for quick and dirty.

[00:28:48] Michelle Ballen-Griffin: I'm not the right person to ask, because I am all about: Spend the time upfront. Invest the time to think through how this should be structured upfront. 

[00:28:57] Scott Breitenother: Totally. 

[00:28:57] Michelle Ballen-Griffin: And you're gonna save time. I understand, quick-and-dirty people wanna answer questions quickly. But I just think, if you spend two times the amount of time upfront, you're gonna accelerate everything in the future. And that's where we are today at my company. We did that. We invested that upfront work, and now we very rarely have to go in and make changes to our data models. Basically almost anything is possible. At least anything that we want is possible today. I am a big fan of move slowly to move faster in the end.

[00:29:27] Scott Breitenother: Me too. But sometimes I want to challenge myself – am I over-engineering?

[00:29:32] Michelle Ballen-Griffin: I know. 

[00:29:33] Scott Breitenother: Am I putting too much? You know what I mean? 

[00:29:34] Michelle Ballen-Griffin: Yeah, definitely there are people who would argue against it. But I see what happens is, it just creates a swirl. And it creates spaghetti code. And all this buildup of tech debt. Now you're having to break down the change management of, “Why is this different than that?” And I'm just a huge fan of being very thoughtful, deliberate upfront, and building systems at scale. 

[00:30:02] Scott Breitenother: I totally agree. I always try to keep myself in check, though. Am I making this too complex? But it's hard. You just never know the answer. But I think you and I have experienced the downside of not being thoughtful and not building in flexibility. 

[00:30:18] Michelle Ballen-Griffin: Yeah. I'm process-obsessed at this point. But I think it's working.

[00:30:25] Nick Freund: That kind of kicks off one of the last places I wanted to dive into with the two of you, which is, you think of these problems of tribal knowledge or facilitating institutional knowledge about your data. Is this fundamentally a people problem? Do you think it's a process problem? Is it a technology problem? Is it some combination of the above? I'd just be curious what the two of you feel there. 

[00:30:58] Michelle Ballen-Griffin: Scott, take it away. 

[00:31:01] Scott Breitenother: Well, perfect. I wanna go first cause you're gonna knock it outta the park with the answer, and I don't wanna follow you. I think it's people, processes and technology. The solution that's right for you with a fifty-person business is not the same as for the same business a year later, when they're a hundred people or two-hundred people. It's constantly evolving. Airbnb is a really great example. They're spinning out phenomenal tools constantly on how to navigate data, how to understand SLAs, how to enable people across the organization. They're also ripping out the old shit every couple years and putting something new in. Because they're a completely different company every couple years. The technologies are enabling you to do even more than they could a couple years ago. 

And so, it's a journey. But I guarantee it does not happen by accident. You have to have a strategy. You have to be thoughtful about how you put together the data infrastructure. Get everybody working on a single repository of data and code. Document the big changes and insights in one place. Create tools to help people explore data. Train them. Create the Slack channels where anybody can ask their data question and get a quick response. If you look at any high-functioning, data-driven organization, there's not one of those specific things that's driving it. It's the combination of all those things. And that's built intentionally and over time. 

[00:32:40] Michelle Ballen-Griffin: I completely agree. The only thing I would add, just a thought that I've been really trying to reinforce with my team over the past few months, is something that Emilie Schario mentioned in one of her blog posts: absolutely, it is people and process just as much as technology, if not more. I really encourage my team, even if they are not managers directly managing someone, they are still leaders at the organization. It is their responsibility to lead this charge and create this data culture, and really evangelize: How are we going to share this knowledge? How are we going to use it? How are we gonna be better and maximize the value of our data over time?

[00:33:29] Nick Freund: Before we wrap up, is there anything else around fragmented knowledge, tribal data knowledge, anything in this area that you wanna talk about, or it's tickling your brain and you have to share it with everyone?

[00:33:45] Michelle Ballen-Griffin: I guess the only thing I've been thinking about is this concept of monthly business reviews. I think at every organization that I've been part of, it's very focused on the metrics, and it's very focused on “Why did this happen? Why did metrics shift month to month?” And I don't know that those learnings are very actionable, or that they really translate into new decisions or new questions for the business. 

So, what we've been doing at Future is making that a small portion of the monthly business review. But the rest of it is all about, here's everything we learned this month. All the things we tried across all the different channels. It's focusing more on what we learned last month versus what performance was like last month. I'm just curious, and I don't know if this is an opportunity to put out a survey or maybe just a Slack post, asking people what the balance of their monthly business reviews is between just reporting on KPIs and recapping everything that was learned that month.

[00:34:48] Nick Freund: Definitely interesting. I think if you're focusing on learnings, what have we learned, arguably that will push you forward much faster, than just talking about the metrics. Because the learnings tell you what to do next. The metrics kind of just tell you what happened, and maybe you have to get a level past that.

[00:35:10] Michelle Ballen-Griffin: It's fun. 

[00:35:11] Scott Breitenother: The only thing I would add, in a completely different direction, is that sometimes I forget that stakeholders need enablement and training on how to make charts. How to interpret charts. How to interpret data. And so often we're quick to go straight to how to drag and drop in Looker, when we're skipping over this. Probably the most important thing is how to make a compelling chart. How to do an analysis. It's very funny, Michelle and I were at the Marketing Analytics Data Science Conference last month, and the very last talk was by a guy named Bill Shander, and he did like an hour and a half session on data visualization. I've been making charts forever. I thought it was spectacular. It was a really good structured hour and a half training or something like that on just how to interpret a chart. I actually think that would be beneficial for the vast majority of organizations. Too often we focus straight on the data. Let's focus on actually how to interpret data and how to build charts. Again, it seems silly, but too often – and I know I'm extremely guilty of it – I go straight to the numbers and skip that chapter of the enablement book entirely.

[00:36:32] Michelle Ballen-Griffin: I really like that.

[00:36:34] Nick Freund: Yeah, I was gonna plus one that one for sure. I find this as an early stage founder: I spend relatively little time building charts. I just don't spend that much time visualizing data anymore. I used to a ton, like 15 years ago. And so when I get in there and do it, I have lost my skillset, and it takes me 10 times longer to do anything. So I think it's a skill that is actually harder to do well than you would think. Especially if you're in it all the time. 

It goes back to Michelle's point about how you make the experience. If you're being thoughtful, this is as much about the experience as it is about the underlying data, right? 

Thanks again everyone for joining Data Knowledge Pioneers, and again, Scott, Michelle, thanks so much for coming on. To anyone who's listening, if you want to hear more, join us next time. Next time we're talking with Benn Stancil from Mode Analytics, and Danielle Mendheim, the director of Data Analytics at Dr. Squatch, and we're gonna be talking about the workflows between data and business teams and why and how they're often very broken. Thanks again, and have a great day. 
