Season 4 Episode 2
Welcome to Season 4 of the Law Firm Data Governance podcast. I’m CJ Anderson, founder of Iron Carrot, the law firm data governance specialist. I’m thrilled to have a new season of insights about working with information and data in law firms.
Data governance is the key to unlocking your law firm’s potential. But it’s not the only area of data activity that is important for your firm’s success. That’s why, in this fourth season, I’m pleased to share some information, questions, and top tips about the other areas you might want to consider.
Episode Transcript
CJ Anderson
Welcome to Season 4 of the Law Firm Data Governance podcast. I’m CJ Anderson, Founder of Iron Carrot, the law firm data governance specialists. I’m thrilled to have a new season of insights about working with information and data in law firms. Data governance is the key to unlocking your firm’s potential. But it’s not the only area of data activity that’s important for your firm’s success.
That’s why, in this fourth season, I’m delighted to share some of my recent data conversations. My guests this season are thought leaders in their own areas. Each has a unique perspective on the importance of data to law firms. Join us as we talk about capturing, finding, using and governing data in ways that can add meaningful value to the firm’s strategy, operational processes and everything in between.
Hello and welcome to another episode of the Law Firm Data Governance Podcast. My guest this week is James Markham, the Head of Business Partnering at Dentons. So welcome, James.
James Markham
Thanks for having me.
CJ Anderson
Let’s start by explaining, for people who don’t know you, a little bit about your current role, your career journey, how you work with data, and how you got here.
James Markham
How did I get here? There’s an existential question. Yeah. So, I suppose I’d probably best be described as having had a bit of a portfolio career over the last ten to fifteen years, with a mix of consultancy and senior management roles, mainly within UK law firms, the uniting thing being driving profit and driving cash flow. As you say, I’m currently Head of Business Partnering at Dentons, looking after innovation and legal tech, legal project management, commercial finance and practice management. So quite a broad range, and as you can imagine, a fair bit of data underlying all of those teams. And I’m getting ready to plan my next chapter in 2024. In terms of data, I’d probably go back almost to the beginning. I trained originally as an accountant and an auditor, so as a junior auditor I was really in the weeds of whatever shed in Stoke I happened to have as a client, ticking numbers back from what’s in the accounts through to the underlying invoices: sales invoices, purchase invoices, etcetera. And from about that point I had an increasing awareness of how technology could help to either interrogate the data or just reduce the manual effort around that kind of ticking and bashing. I then moved into an operational consultancy role, leading projects on a Six Sigma type basis in terms of discipline. And I think it’s fair to say the data became a bit messier, a bit more fluid, a bit more open to interpretation. So not so much “what did this widget cost, can I trace it all the way back to the invoice for the widget costing X”, but more “when did that event happen, and when did we record that event happening, and are those two things the same, and if they’re not, why aren’t they, and how do you measure quality?” Those sorts of more nebulous things.
And that contrast has stayed with me throughout: at face value the data is telling me this thing, but underlying that there’s a question of how we are measuring it. How can I validate what appears to be the face-value answer? How can I test it quickly before I act on it? And how can I make messy decisions where I don’t have all the data: when do I have enough data to act, and when do I need to test further? I think that’s a really interesting part of when we talk about data: what is it for, and how do we manage the messiness and the fuzziness that come with some of those areas.
CJ Anderson
So from your perspective, what are the kind of data challenges or data opportunities that you’re tripping over most in your current role?
James Markham
I think it’s probably the assumption that the report or the dashboard, or whatever it is, is “right”, whatever “right” means in quote marks, without necessarily kicking the tires of it to understand what it is. When you talk to people who maybe don’t work in professional services, you say, oh yeah, these lawyers record every six-minute unit of time they spend in a day, and they give you these really detailed narratives of exactly what they did in that six-minute unit. Their eyes light up: think of what you could do, think of the possibility for process improvement. But before you go rushing in, you need to understand that quite a lot of gap opens up between how long a thing took, what the thing was, and what got recorded in whatever your time recording system is. Understanding some of those behavioural dynamics around under- and over-recording, where the soft time codes are, where the admin codes or whatever the equivalent is in your law firm sit, seeing some of those dynamics and working out how you can unpick them to then make the decision: that, to me, is not just an issue in my current role at Dentons but a perennial issue across my whole career to date. At face value you’ve got lots of data and it’s really, really good, but when you start scratching the surface you realise maybe some of these foundations aren’t as solid as you would have hoped or had assumed.
CJ Anderson
It’s interesting that you talk about the foundations. When you’re thinking about process improvements or changes, how much of that focus is on the data that enables them, and how much is on the behavioural piece?
James Markham
Yeah, I mean, I’m probably going to cop out with a bit of an “it depends”, right?
CJ Anderson
Yeah
James Markham
I mean, that was predictable. So what does it depend on? I suppose it depends on what you are trying to achieve and what you have got to work with. To go back to the reference to Lean Six Sigma: that framework, the Define, Measure, Analyse, Improve, Control framework for moving through a process improvement project. The bit that I see consistently gets forgotten or missed, because we just keep going out of excitement to change the world or whatever it is we want to do, is the Measure bit. That’s about assessing the current state, the as-is, and testing the robustness of the measurement systems and the data. Where is that data being captured, right at the point of that time entry, at the point pen is being put to paper? How robust is it, so that we can say, OK, that is a robust understanding of the as-is? Then you can ask: what levers do you want to pull, what changes do you want to make? Make those changes, come back to it, and say, yes, we moved the needle, the profit went up, the cash flow went up, whatever it is, with some robustness that what you’re doing is valid, as opposed to a shoot-and-hope kind of approach. And when you drill into that measurement-system piece, it starts unpicking questions like: are there bits within this process where we’re just not measuring anything at all? I’ve had to do manual samples of things, looking for dates on invoices to trace through some sense of the standard we’re working to, to maybe plug the gaps. And then there’s the behavioural piece, culturally: is there pressure on fee earners to perhaps over-record their time, heaven forbid, or is there a profit pressure?
So maybe there’s a pressure to under-record time, you know? Seeing some of those bits around the edges of that process improvement piece, I think, starts informing where the focus should be. Do you need to build systems? Do you need to do training, or behavioural change, or whatever? I think it comes out of really looking at what you are measuring and whether it’s right.
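To make the Measure step James describes a little more concrete, here is a minimal sketch of one robustness check on a time-recording measurement system: how far apart are “when the work happened” and “when it was recorded”? All field names, the sample data, and the three-day threshold are invented for illustration; a real firm’s time recording system would look quite different.

```python
from datetime import date

def recording_lag_report(entries, max_lag_days=3):
    """Summarise the gap between doing the work and recording it.

    A large average lag, or a high share of late entries, suggests the
    'as-is' data is reconstructed from memory rather than captured at
    the point of the event -- a measurement-system weakness to fix
    before drawing process-improvement conclusions.
    """
    lags = [(e["recorded_on"] - e["worked_on"]).days for e in entries]
    late = [lag for lag in lags if lag > max_lag_days]
    return {
        "entries": len(lags),
        "avg_lag_days": sum(lags) / len(lags),
        "pct_late": 100 * len(late) / len(lags),
    }

# Invented sample: three time entries with worked vs recorded dates.
entries = [
    {"worked_on": date(2024, 1, 8), "recorded_on": date(2024, 1, 8)},
    {"worked_on": date(2024, 1, 8), "recorded_on": date(2024, 1, 15)},
    {"worked_on": date(2024, 1, 9), "recorded_on": date(2024, 1, 10)},
]
print(recording_lag_report(entries))
```

The point is not the arithmetic but the habit: quantify the measurement system itself before trusting what it measures.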
CJ Anderson
Do you get a sense that data quality, and I don’t want to put words in your mouth, but is data quality at the root of a lot of challenges, or is it a contributor to a lot of challenges in the law firm space, certainly as firms start to look at legal tech and different ways of doing things? Or is it just another factor in the mix?
James Markham
It’s always there. It’s like a lingering smell in the background, isn’t it? Maybe the smell is strong, maybe the smell is weak, depending on what it is you’re trying to achieve. But whatever the thing you’re trying to do, and you mentioned legal tech, if you’re putting a new bit of technology in to, I guess, make a process quicker or take out a step, whatever it is, you’ve always got to ask the question: what is the data quality like? You might kick the tires on it and find it’s fine for what we need to do, and we don’t need to do anything. Or it might be we’ve got none, it might be we’ve got some, it might be that it’s built a little bit on sand and we need to address that. But to my mind it needs to be an explicit question rather than an assumption. There’s a risk that you assume it’s all OK and then come a little bit unstuck further down that process improvement type initiative, where you’re not getting the benefits you were expecting because actually you weren’t measuring what you thought you were measuring. So in that kind of as-is to to-be move, it’s a really good idea to know where you’re starting from, to have a sense of whether you’re going to get to where you’re actually heading from where you’re starting.
CJ Anderson
So there’s that sense of wanting to keep score, and using decent data to get you from your as-is to your to-be, from where you are to where you want to be. Is it important to have a sense of data strategy, or data governance, or data management, whatever you want to label it, as a parallel stream supporting all of this, or is it something that naturally comes out through other activities?
James Markham
I think there are probably two answers to that, depending on when you ask me. If you’d asked me, say, ten years ago, data maturity was generally quite low. There was maybe an assumption that data was owned by IT, or by the finance systems team, or by HR, tied to whoever was the product owner for the system collecting the data, if you like. So for any kind of improvement effort, and to your point about keeping score, you kind of had to build in the data collection and the measurement piece as you went. To be fair, I think that is now getting better. You do see greater data maturity and data understanding, and more cross-functional ownership of end-to-end processes, rather than those hand-offs from one team to another where no one really knows what the downstream does, no one really knows what the upstream does, they’ve just been given something to work with. So I think it is getting better. But as with anything, probably in any sector, but certainly in legal, which I’m familiar with, that’s not a universal statement, right? There are still those old attitudes of “data, that’s IT, isn’t it”, or whatever mental shortcut the senior management team have. So, getting better, but a road still to go, which is probably music to your ears, right?
CJ Anderson
Absolutely, absolutely. But it’s interesting to think about what the barriers really are. I don’t know if the question is about taking data more seriously, or putting more of a, say, grown-up, senior-person focus on data. Because there’s a lot of that AI, large language model type noise: we want to do the cool things, but we’re not really bothered about how the data gets to where it gets. What do you think the challenge is to taking data seriously?
James Markham
Yeah, I think it’s probably an underlying data literacy point, and maybe a disconnect between the data as a thing and whatever the shiny toy is that we’re playing with at the moment, to your point about large language models. A few years ago it would have been: why can’t the intranet landing page just have a search bar like Google has? Well, because you haven’t spent billions in R&D to sort your data out to be able to surface that, right? Those conversations still happen, and the shiny thing has got everyone’s attention. What I sometimes see is an understanding that being data led in decision making is sort of a good thing, and conversely that not being data led would be a bad thing, but in terms of what that actually means, there’s probably an education piece for your grown-ups in the room, or the senior management team, or whatever. Because sometimes when we say data led, what we mean is: I’ve already decided I want to go over there, and now I’m going to look at the dashboards and reports that, confirmation-bias style, prove to me that that’s the thing I need to go and do. Or: I’ve done this thing because I thought it was a great idea, and I’ll look for the data to support it, rather than, on a more scientific method basis, looking for the data to tell me I got it wrong. So I think it’s a data literacy and education piece, which isn’t as exciting and sexy and glamorous as “here’s ChatGPT in a box for you”, or “here’s a Google-style search bar over the top of your intranet”, but I think that’s the difference.
I think the difference between teams that perform well with data and those that don’t is an understanding of that, not necessarily at a detailed level, but an understanding of the need to test: that report over there says we should go left, but that report over there says we should go right; how are they saying two different things, and can we unpick that? Just a bit of challenge around what gets surfaced in that management reporting suite or, increasingly, dashboards.
CJ Anderson
The drivers for making those changes, doing more with data and putting that education piece in place: are those drivers internal to the firm, or are they pressures coming from clients and outside, or a bit of both?
James Markham
Yeah, I think that’s a really good question. Over the last decade or so I’ve increasingly come to the conclusion that innovation, that change, is probably most powerful and profound when it’s client driven, and I think that’s true in the data space. Leave aside LLMs and AI and all the rest of it, and look at some of the requirements around e-billing, for example: the extensive use of narratives, and the automated screening of those bills, so that the client can say, you’ve charged me for two cups of coffee and photocopying, and that’s not within our letter of engagement, go back and try again. If you look at that as a feedback loop, assuming everybody plays ball with it rather than just throwing outputs at it and seeing what comes out the other end, it actually starts driving really good data behaviours. There’s that sense of: I, the upstream law firm, am giving you, the downstream client, some quite detailed data, and you’re validating it back against the underlying letter of engagement and telling me I’ve got it wrong for whatever reason. Right, there’s a feedback loop that makes me, as the law firm, ask: why did we get that wrong, and can I put some validation further upstream, before I’ve embarrassed myself by giving the wrong thing to the client at the end of the matter with the bill? Where that’s embedded, you see those really good practices of feedback loops correcting underlying data. And there’s probably consistency there with law firms that have a heavy e-billing presence, say with large banking practice groups or insurance practice groups, big users of that kind of e-billing set-up. I think you see that kind of data maturity there.
Then it kind of diffuses, doesn’t it, across other teams that aren’t necessarily in that world, because “why don’t we just get the bill right before we ship it to the client” is a fairly universally true thing; that would be a good thing to achieve, right? But to your point, I think where it’s driven by clients is where the change, that cultural change, that behavioural change, starts to stick, as opposed to the more internal effort, which can be a little bit one-shot-and-then-we’re-done: we’ll plaster over the data, make the change and move on. Where it’s driven by external forces, I think it requires a more considered approach.
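The “validate further upstream” idea James describes can be sketched in a few lines: check a draft bill against engagement-letter rules before it ships, rather than waiting for the client’s e-billing system to bounce it. The rule set, field names, and figures below are all invented for illustration, not any real e-billing standard.

```python
# Imagined engagement-letter rules for this sketch.
DISALLOWED = {"photocopying", "refreshments"}  # non-billable categories
RATE_CAP = 450.0                               # agreed hourly rate cap

def validate_bill(lines):
    """Return a list of problems found in draft bill lines.

    Mirrors the feedback loop described above: the same checks the
    client's screening would apply, run upstream by the firm itself.
    """
    errors = []
    for i, line in enumerate(lines):
        if line["category"] in DISALLOWED:
            errors.append(
                f"line {i}: '{line['category']}' not billable under engagement letter"
            )
        if line.get("rate", 0) > RATE_CAP:
            errors.append(
                f"line {i}: rate {line['rate']} exceeds agreed cap {RATE_CAP}"
            )
    return errors

# Invented draft bill: one clean line, one disallowed category, one rate breach.
draft = [
    {"category": "fees", "rate": 400.0},
    {"category": "refreshments", "rate": 0.0},
    {"category": "fees", "rate": 500.0},
]
for problem in validate_bill(draft):
    print(problem)
```

Each rejection a check like this would have caught is exactly the feedback that, per the conversation, drives better data behaviour upstream.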
CJ Anderson
I’m just going to jump back to something else you said and pick up on the opportunities for data, the opportunities for doing things differently and not doing the confirmation bias thing. Are you seeing opportunities to do data properly? Whether they’re taken or not is a different question, but is there a clear set of things that are, you know, law firm opportunities in this space that you see?
James Markham
Yeah. I think there is opportunity, and I’ll probably quickly follow that up with a question as to how well equipped we are to deal with it. Long-established law firms, and even not particularly long-established law firms, actually have a lot of data, whether that’s precedents and tools and templates that help them run their matters in a standardised way, or a wealth of time recording data that you can mine: on that sort of transaction it tends to take us this long, and on that sort of transaction it takes that long. Plus whatever else you’ve got in terms of qualitative notes within your case management system. The challenge is how you surface that, because all of that rich data has been produced over many, many years. “Big data” is probably not quite the right word; it’s more small-to-medium data. None of it was produced or entered into any particular system with a view to the outcome of, well, how can we leverage that, how can we leverage our unique history in this particular practice area, in this particular niche, within our marketing materials for our clients, for example. So what you’ve got is a combination of data in different systems that may or may not talk to each other. You’ve got data quality issues where people have set matters up and just recorded them as matter type “Other”, because they didn’t know that was potentially going to be an important field, and it might not have been an important field ten years ago. And I think there is a lot of hard work there, and it’s not very exciting.
It’s about cleaning that up, not to the point where it’s pristine and perfect, but to the point where it’s usable and you can start drawing meaningful insights out of it. To my mind, that’s the data elephant in the room: there’s a lot of data there. Again, to your point, can you just stick an LLM on top of it? Arguably that moves you along a little bit from where we were before, but it’s still not great, right? It’s still not as good as a properly structured set of databases that link into each other and give you real, meaningful insights about the markets you’re operating in. So I think there are opportunities there, but I do think, and I don’t know if it’s an elephant or an albatross, there is that need to clean it all up to get to a point where it actually becomes usable.
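A first step towards the clean-up James describes is usually just profiling: how much of the historical matter data is actually usable? The sketch below counts how matters are categorised, including the “matter type Other” problem from the conversation. The records and field name are made up for illustration.

```python
from collections import Counter

def profile_matter_types(matters):
    """Return the percentage share of each matter type in the records.

    A high share of 'Other' or missing values is a quick signal that
    the data needs enrichment before any meaningful mining of matter
    history can happen.
    """
    counts = Counter(m.get("matter_type") or "MISSING" for m in matters)
    total = sum(counts.values())
    return {mtype: round(100 * n / total, 1) for mtype, n in counts.items()}

# Invented sample records, including the 'Other' and missing-value cases.
matters = [
    {"matter_type": "M&A"},
    {"matter_type": "Other"},
    {"matter_type": "Other"},
    {"matter_type": None},
]
print(profile_matter_types(matters))
```

Even a crude profile like this turns “our data is messy” into a number you can put in front of a management team, which connects to the business-case point discussed next.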
CJ Anderson
And if you’re starting to think about making a business case for doing something with data, do you think “we need to clean it up to leverage the big tools” is a strong enough case? Do you think firms would buy into that as a case, or do you think you’ve got to glossy it up with “and here are the client outputs”?
James Markham
Well, whenever you can tie it to a client output, that is a different realm of interest; it starts being more relevant than “let’s do a bit of housekeeping”, which is how the internal stuff can sometimes land. And in terms of building that business case, as you know as well as I do, and I know you do this when you come in and help firms with the data governance bit, it’s about where the quick wins are. Where’s the little bit of data that, if we just spent a little bit of time to clean it up, or validate it, or reference it back to a standard taxonomy of definitions, gives you that quick win that enables you to say to your managing partner, your CEO, your finance director, whoever it is: look, we cleaned up this little bit of mess over here and it generated this insight over there; wouldn’t it be great if we could start tackling some of these bigger problem areas, and these are the sorts of insights we could drive? I think there’s a need to show and demonstrate that, because, and it probably goes back to your point earlier, there is a bit of a disconnect between the shiny new tool over here and the data that drives it over there. It needs that storytelling: if we did this little bit here, look at the value it drives for our clients, and therefore think how much you could achieve if you could clean up some of this other stuff that’s just lying down the back of the sofa.
CJ Anderson
Absolutely. I’m going to ask you now for a kind of final thought. If you were going to summarise a piece of advice around data in law firms for someone who’s never worked with law firm data before, what final thought would you want to leave them with?
James Markham
Yeah, it’s a good question. I think probably: don’t take the data as read; don’t take it that that is the answer. Have a bit of curiosity about where that data came from: what system, what person did a thing that created that data point? And a little bit of empathy, putting yourself in their shoes: what pressure was that person under when they put that bit of data into that bit of system that came downhill and wobbled out onto this dashboard at the end? Just a little bit of curiosity to pick at it, explore it, and understand it, and understand some of those cultural and behavioural dimensions that might be at play in whatever you’re ultimately trying to make a decision on from your dashboard or your report.
CJ Anderson
That’s really brilliant advice, James. Thank you so much for joining me on this podcast episode.
James Markham
No worries at all. Always a pleasure.
CJ Anderson
Thank you for joining me for this Law Firm Data Governance podcast episode. I really enjoyed chatting with James about data, and about the role of data within change and process improvement. If you liked this episode, please share, like and review it so that more law firm leaders can learn about data governance and how to manage data in law firms effectively. Don’t forget to subscribe so that you don’t miss any of this season’s data conversations with law firm data thought leaders. Or head over to ironcarrot.com to get in touch with your questions and ideas for future episodes.

- Do you want to learn more about the podcast?
- Are you curious about what’s coming up in future seasons?
- Do you want to listen to the latest episode?
Answers to these questions and more can be found on the podcast page.