
How We Did It: How to Use Assessment to Drive Development

The Need

A global client needed to combine assessment and development to drive deeper impact among their mid-level leaders and show the effects of development on their readiness.

The Solution

With DDI's Leader3 Ready® simulation-based assessment, this client was able to assess readiness against the competencies being targeted as part of its mid-level leadership development program.

The Result

The data has helped to elevate mid-level leaders coming out of development, and the client can see how leaders perform against benchmarks. Additionally, the client has good data to use for succession as they continue to build their leadership bench.

The addition of assessment has elevated this development program, giving participants good data and objective feedback coming out of the experience that they can take back to their manager and work together to continue to develop.

Patrick Connell, Consulting Manager - US Operations, DDI

In this How We Did It video, Patrick Connell, consulting manager, shares how DDI worked with a global client, teaching them how to use assessment to drive development.

The client already had an enterprise-wide assessment strategy in place for its selection processes, aligned to its enterprise-wide competency model. But the client wondered whether some of the assessments used on the selection side could also be leveraged during development, especially for its mid-level leaders.

Learn how this client partnered with DDI to integrate assessment into its mid-level leadership development strategy and where in the development process they used assessment (not where you might think!).

Learn more about DDI's competency-based leadership assessments.

Transcript:

Beth Almes:                        

Hi, everyone and welcome back to our series on How We Did It. Today, Patrick Connell is going to talk about one of the key issues so many companies have in that they might be doing assessment or development, but they really want to bring those two things together to be even more successful and accelerate their results. Patrick, welcome. I'm so excited to hear about this story.

Patrick Connell:                 

Thanks, Beth. Yeah, it's an exciting one because, like you said, it's always great when you can have different types of applications really talking to each other, and that's where we see clients really see the most impact overall.

This organization had done a lot of really good work putting together an enterprise-wide competency model and making sure that was ingrained in the organization. We also helped them develop an enterprise-wide assessment approach that was aligned back to that model, then specifically at their mid-level and senior-level parts of the organization. 

But there was also a separate development arm, and there was a desire to say, "How can we maybe leverage some of these assessments that have been put in place on the selection side? Is there an opportunity to leverage that on the development side?" And absolutely, that is the case, and that's the ultimate goal if it's possible.

There were a couple of programs where they saw the biggest need. One was their development program for mid-level leaders. They really wanted an assessment that added a little more teeth to the experience, for lack of a better phrase.

So they had included some assessments on the front end of this development program, which was really a learning journey that takes place over the course of three to four months. And so, the participants come into that experience having some insight about their personality and about their behaviors through 360 data, but they wanted to put something at the end of the program to kind of assess overall readiness.

Through discussions, we were able to identify the behaviors and the competencies that are really being reinforced in this program, and then configure the assessment to target those competencies. It's a similar experience, but it sits at the end of the program, so it's more of a capstone assessment, if you will, and it's a way to assess readiness against the competencies that are being targeted as part of the development program.

It's been a nice lift. It sort of elevates the program as well in terms of giving the participants something to really cap off the experience and some good data that they'll have coming out of that program, that once they have feedback, they can work with their manager on how to continue to take that forward from a development standpoint.

Beth Almes:                        

That's a really interesting approach to put it at the end of the program. I don't know that I've heard that that often. A lot of people think of it at the beginning, or they take an assessment when they get the new role, but it's not related to development. 

That's an awesome way to really pull together assessment and development, putting it right at the end. Tell me a little bit, you mentioned that it was a simulation-based assessment. I'm curious because there are so many assessments out there to choose from, whether it's personality assessments or shorter leadership tests that are all automated. What made them decide on a simulation-based approach?

Patrick Connell:                 

Well, one reason was that this assessment, our Leader3 Ready® Executive Assessment, had gotten good feedback around the depth of insight that comes from it on the selection side. They appreciated the holistic approach of looking at personality, motivation, experience, and behavior. That's sort of the beauty of a holistic assessment like L3. They wanted something that provided that same holistic depth of insight as part of the program.

There was a lot of really good feedback around the insight that came from the upfront personality inventories and the 360 survey data as well. But they really wanted an objective data point as part of this program. And as we know, 360s are extremely insightful, but they're more subjective in nature.

So that was the original north star, I guess: wanting something with a little bit more objective data. And the simulation, because we're looking at actual behavior, really adds that. It fit that need really nicely.

Beth Almes:                        

So people felt more confident in the results because it was objective, not just what your colleague observed about you, but really they felt like they had a really fair shot.

Patrick Connell:                 

You got it. And we're able to show their performance against benchmarks, which is also something that this client really valued as well. So that, again, that was an added benefit of using more of a simulation-based assessment.

Beth Almes:                        

What were the reactions and results as you combined this assessment with the development program?

Patrick Connell:                 

It's still early days, we've just been kicking it off, but so far it's been very positive. The participants themselves have given positive feedback about having this as part of the program. Our clients have been really excited about how it's evolved the program to the next level, giving them more insights and getting them thinking, "Okay, now how do we take this data and maybe use it in other places?"

It's not just a way to evaluate individuals coming out of the learning journey; it's a good data point for succession as well, as they continue to build their leadership bench. That's where the conversations are starting to go now: how can they continue to make the most of this data?

Beth Almes:                        

That's a fantastic story. I think it's one that is hard for so many companies to really do well. So it's been fantastic to hear about a client who was able to really master that integration of data to make a more powerful development process. Thank you for sharing that, Patrick.

Patrick Connell:                 

Sure. My pleasure, Beth.