OBM Business Hour #3: Training

Listen and subscribe on Anchor.fm, Spotify, and Google Podcasts.

In this episode we discuss:

  1. What is training?
  2. How to know when training is the right solution to your performance problem.
  3. When organizations typically request training from outside consultants.
  4. How to use training to ensure performance occurs.
  5. What to look for in a trainer, and what qualities make a good trainer.
  6. Tips and pointers to increase training effectiveness.

We also reference the following materials to further your growth:

  1. Analyzing Performance Problems – Robert Mager & Peter Pipe
  2. The New Mager Six-Pack – Robert Mager
  3. Instructional Design Approach – Dr. Barbara Bucklin
  4. Yendri Diaz at Skillometry


Transcript of Episode

Kyle Ditzian: [00:00:18] Yeah, sure. I think that training is perhaps the most common intervention that OBMers are asked to do on any given day. I think a lot of companies, a lot of managers, think that training is the end-all, be-all solution to every problem that's ever existed. If my employees are not acting correctly, then surely it must be because they simply don't know what they're supposed to be doing instead. Right. I think that we can all probably disagree on that one.

Kyle Ditzian: [00:00:54] And so, I mean, there are a number of topics that I know we wanted to get to today. But maybe the first one, and maybe the most important one in regards to training, is when should we do training? When is training actually the right solution? And maybe we can get to some resources to help you guys make those decisions. But for me, when I am considering whether training is the right solution versus literally anything else, the first question that I always ask is: is it a can-do or a won't-do problem? If I put a metaphorical gun to your head and asked you to do whatever it is that you're supposed to be doing, and you could do it under those conditions, it is not a training problem. It is probably some other kind of problem, and it's worth doing a little more investigating.

Kyle Ditzian: [00:01:51] But that being said, I'd be interested to hear either Manny's or Natalie's thoughts on this as well.

Dr. Natalie Parks: [00:02:03] Yeah, I mean, I work in a couple of environments where they talk about skill versus will, which I think is the same can't-do versus won't-do distinction. And I definitely think that's one of those: if it's a can't-do problem, training makes sense. I think there are other times that are probably super obvious, you know, any time you have somebody new. So if they're entering the organization, I always try to go with the general rule of thumb that yes, they probably have training in certain things and certain aspects, but we should go ahead and take them through training in at least doing it our way. That's just kind of a good check, as is any time people are asked to do something new or different, so if that's a promotion, an added job responsibility, that type of thing. And any time that there is a performance problem, I encourage leaders to always first treat it as if it's a training issue. So if this is the first time that it came up, just go with the general assumption that somewhere along the line training wasn't as adequate as we needed it to be, and so retrain. And usually that's because it allows me to talk with the leader about how we know if somebody has been trained and how we document it. So: has the person demonstrated the behavior fluently? Have they demonstrated the behavior under the contingencies of the actual job, and that type of thing? All of that seems to be wrapped up into that initial training.

Manuel “Manny” Rodriguez: [00:03:45] Yeah, I'll jump in, and I would just be echoing a lot of what Natalie and Kyle said, so maybe I'll build on it or add to it. One of the things that we see a lot in both OBM practice and in the OBM literature is the variety of what we call packaged solutions. It's almost like everything and the kitchen sink: if we can throw it at the problem in practice, we do, and that's why you see a lot of them. From a scientific standpoint, that makes it hard to tease out what actually made the impact happen. But in practice, when you do what's called a needs analysis, right, a needs analysis to understand what the organization needs and what the performer needs, more than just training comes up. So that's why practitioners typically have these packaged solutions. They come up with job aids, procedures, policies, and training as part of that whole package, along with feedback, performance management, data collection, and all that. So I think what's important around training is to understand that it is a multi-billion-dollar industry for a reason. And it's because if you work in a job, or you know somebody who works in a job, and that person doesn't get training, you know what happens? Nothing happens, right? They don't know what they're doing. They're kind of lost in the scheme of things. So training becomes kind of the necessary evil of the world of work. Right. And from a performance improvement standpoint, training also becomes necessary to make sure that, at a very bare minimum, whatever the foundational skills or the learning skills are that are needed to make performance happen, training is part of that solution package. So I always find that in practice there's never a moment, never an intervention, never something we're implementing with regards to organizational change, where training is not part of the equation. So I think that that's part of it.

Kyle Ditzian: [00:05:47] I think maybe a caveat there is that the word training can mean a lot of different things. Training can be a really complex, specific thing designed to create very specific behaviors in very certain circumstances. But it could also be something as simple as a brief antecedent; it could be a couple of bullet points in an email that constitute training in one thing or another. I mean, I would say that for probably any new intervention that we roll out, there's always some form of training, or at least clarifying of expectations, that comes along with it. Natalie, I'm actually pretty surprised by your approach there, to treat things as a training problem first. I feel like I go the exact opposite way, at least a good chunk of the time, and just try to avoid training, mostly because if you want to do training the right way, a thorough training, then that's often time- and resource-intensive. And I tend to shy away from that if there is something else that might be done that could address the problem. Maybe what Manny mentioned is also worth bringing up here: doing a bit of a needs analysis first.

Amanda Barnett: [00:07:12] I have a reference, and it looks like maybe you want to say something, so I'll jump in quickly. So the Mager [Oh, it just blacked out. All right.] So the Mager Six-Pack book Analyzing Performance Problems. That is a really great book. The Six-Pack is six different books, and this one really gets into analyzing performance problems and some of that needs assessment that Manny talked about earlier. Also, I kind of laughed for a second because my husband is making lunch and he is loud, and I was muted and he was messing with me and I was trying to keep a straight face. It was very hard. I apologize. And back to Natalie.

Kyle Ditzian: [00:07:55] Tell your husband to keep up the good work.

Dr. Natalie Parks: [00:07:57] Yeah, well, you know, Kyle, I think you're right in that training, at least the right way of training, is really labor- and resource-intensive. It's probably from my history of working with a lot of leaders who, for lack of a better term, are kind of finger-pointers: it's that person's problem, they're lazy, they're not willing to do it, they're this, they're that. And at least in my experience, more often than not (I don't have the exact data on it), when you go and ask the person to perform the skill, there are errors in it. And so, one, it buys me a little bit of time to do a further needs analysis and get some kind of strategies going almost immediately. But it also helps me refocus the leader: maybe it's not that this person is so dumb or so lazy or so whatever characteristic they want to put in there. Maybe they're just not set up for success. And so it's kind of my backdoor way of helping to get that frame of mind changed a little bit too. I don't know if it's the right way. It's just what I do.

Kyle Ditzian: [00:09:31] So if I understand correctly, you try to get a demonstration of the behavior itself from the performer?

Dr. Natalie Parks: [00:09:36] Yeah

Kyle Ditzian: [00:09:36] OK. Well, I would say that that's not necessarily defaulting to training so much as, honestly, best practice. In a best-case scenario we should probably all do it: hey, can you just show me the problem? It would be nice if I could see it, because then maybe there's something that you and I haven't thought of beforehand.

Dr. Natalie Parks: [00:09:56] Yeah. Yeah, I guess that's true. And maybe I'm wrapping a lot into the term training, right? I'm like... [uh] But I guess my definition, at least, of not being trained is that if you can't demonstrate the skill, then you aren't trained.

Manuel “Manny” Rodriguez: [00:10:17] Yeah, so that goes back to the can-do versus won't-do, right? So I think one thing we should also talk about, in terms of training from an organizational standpoint, is when does an organization decide to initiate training?

Manuel “Manny” Rodriguez: [00:10:35] Right. And so, you know, when it comes to the performer level, that can't-do or won't-do makes a lot of sense. Right. So you have a performer, or maybe you have a team of performers, and you have to decide: is training the right solution? Can't-do versus won't-do answers that. Makes total sense. But from a company standpoint...

Manuel “Manny” Rodriguez: [00:10:53] When does a company decide to spend thousands, tens of thousands, hundreds of thousands of dollars on training? And I find that there are a few things that really always seem to trigger that.

Manuel “Manny” Rodriguez: [00:11:06] And I'm just curious what the panel would say, but I'll give you my top two trigger points for when a company's ready to shell out that much money for training. The first is when they are looking to implement a huge change effort in regards to how people are going to do their day-to-day work. So whether it's a software system or a new protocol or maybe even a new product or service, if it's something new, or new enough, training is going to follow. They're going to say: we're going to spend a lot of time and money and effort doing training. And then the second trigger that seems to come up quite a bit is whenever they find that performance data is going in the wrong direction and they want a quick solution, something that they can implement within 30, 60, 90 days. And if you compare a training solution versus a coaching-and-feedback solution, training seems to be the one they go for, because it's quicker to implement. Not necessarily the one that has the most long-lasting impact, but definitely quicker to implement. So those are usually the two trigger points: a big change, or data going in the wrong direction where they want a quick solution. What do you guys think?

Dr. Natalie Parks: [00:12:26] I would agree with that. The only thing that I would add is when something just cost a whole lot of money, like a lawsuit. Or maybe it wasn't a generalized performance problem, but one person did something really, really bad and they want to make sure that nobody else does anything like that. That's the only one that I would add.

Kyle Ditzian: [00:12:49] Almost like a punishment contingency. Sort of.

Dr. Natalie Parks: [00:12:52] Yeah, kind of.

Manuel “Manny” Rodriguez: [00:12:55] It's like, we have to do something.

Dr. Natalie Parks: [00:12:59] [unknown] It’s the emotional responding.

Manuel “Manny” Rodriguez: [00:13:01] Yeah, it's almost like: rather than do nothing, do something, even something dumb. You know, a lot of companies jump on the training bandwagon just so they can prove that they're doing something. Look, look, we just spent five hundred hours on training. Whether it carries over to the job is another question, and somehow that's OK.

Kyle Ditzian: [00:13:18] Yeah, yeah, stuff like that comes up all the time with, like, ethics violations. For instance, if you've got a boss that is behaving unethically: oh, we'll just send them to sensitivity training. That'll fix the problem. Right.

Manuel “Manny” Rodriguez: [00:13:33] Right. That actually is a great point, because then let's talk about how you make training really, really effective. Right. Like, how do you get away from that scenario of bad, bad training? Because ultimately we do believe that training is a viable solution. But what behavior analysis and OBM teach us is that there are ways to make training really, really impactful so that it's not a waste of time or effort or whatever. And so in our preparation for today, we talked about maybe adding in a little bit of instructional design. Right. Because what we've learned from behavioral science is that with good instructional design and integrating behavioral skills training, we seem to get exponentially better results. So I thought maybe we'd talk about that. What do you think? Any initial ideas? Yeah? Maybe? OK. All right. So I'm going to borrow shamelessly from a colleague of ours who's not in the room today, Dr. Barbara Bucklin. Barbara wrote an article some time ago on principles of instructional design, where she talks about seven different principles. So I'll share the principles with the group, and if you guys want to talk about them, then, you know, let's talk about them. Quick seven, and I'll put them in the chat for those in the Zoom room here. One: focus on the learner. Two: write measurable objectives. Three: provide active, meaningful practice, because the student learns what the student does. Four: teach, practice, and test in context. I like that one. Five: chunk content in appropriate step sizes. Six: provide immediate, descriptive feedback. And seven: design effective tests.

Manuel “Manny” Rodriguez: [00:15:36] And like I said, I'll put that all in the chat, along with a link to the original article that Barbara wrote. See, not plagiarizing, just borrowing shamelessly.

Manuel “Manny” Rodriguez: [00:15:47] So what do you guys think about those seven tips? Does any one of them in particular resonate with you, one you think you've absolutely got to nail? What do you guys think?

Amanda Barnett: [00:16:00] Absolutely, I'm going to jump in here and just talk about some examples of what I'm currently doing and, you know, maybe some problems I've seen systematically that organizations are trying to fix with training. So with this particular client, we were brought in to help with training, and they were having issues with turnover. The idea was: if we trained people better, there would be less turnover and people would be more likely to stay.

Amanda Barnett: [00:16:28] After we got involved, it turned out there were a lot more things going on besides training. Yes, they probably needed to enhance training. So here's what I ended up doing. They have a matchmaking process for when you adopt a dog or a cat, and there are a lot of things that go into deciding whether someone is a good match or not. A lot of those things they were having difficulty capturing, and because they weren't able to write them down, they were teaching them inconsistently, and they had more than one trainer. This gets at another thing that probably adds to the problem: think about your trainers, what skill set you want them to have, and whether they are good at providing feedback. So basically, we first looked at what it is we want trainees to walk away with. Then we looked at the top three things that you really consistently do with every matchmaking process and what the consistent questions are. And at that point, we looked at measurable objectives. So the objective was: when doing a mock interview with a new trainee, they will be able to demonstrate, at least three times in a row with 80 percent or better consistency, hitting these certain targets that we had already carved out. So that's going to be the competency: when somebody says, well, I'm going to be gone 10 or 12 hours a day, the trainee should recognize that's not a good match.

Amanda Barnett: [00:17:58] So are they going to, you know, ask some additional follow-up questions, like who's going to take care of the animal while you're away, or how?

Amanda Barnett: [00:18:07] Tell me about being away for that long. "I don't think that's good for any animal." They probably wouldn't say that.

Amanda Barnett: [00:18:12] But so then, at that point, we're looking at something that is measurable and observable, and then we're able to make it meaningful, and then we're able to iterate on it as well. So I'm currently in the process of developing this; covid kind of took the wind out of it, because I'm not able to meet with these people right now, and we're looking at going forward. Then, when we look at the feedback: I was actually providing the trainers feedback, because one of the problems I typically see, and saw these trainers doing, is rattling off a bunch of stuff and talking super fast. When they were training somebody, they didn't say, here's what I want you to look for. They said, watch what I do. And they were doing probably 50 behaviors, and they weren't consistent themselves. And then, when they were interviewing the people who were getting matched, I often saw them not asking follow-up questions consistently either.

Amanda Barnett: [00:19:10] So they'd say, OK, cool, cool, cool, moving on. And I was like, whoa, stop for a second. Why are you moving on so fast? These are things that often got missed. So then I worked with them on: even before you set up the learning opportunity, review what your learning objectives are and how you can test that knowledge. So for me, I ask questions related to it rather than just telling them, if they're that far along in the process. I have a lot of examples; that's one of them. But I've worked in construction industries and others doing this same type of model. We talked about BST, behavioral skills training, for those who aren't behavior analysts; I know there are some in here who are not. That basically means: I'm going to teach you, I'm going to show you, and then we're going to practice with feedback until you're able to demonstrate that competency or skill set to a certain level of fluency, so to speak. So I hope, Natalie, Kyle, or anybody else, to be very honest, hopefully I did that some justice for the non-behavior analysts. But basically, it's taking a look at how we are training, using kind of a scientific method, and making sure that we're maximizing our training so that it's built for fluency and for mastery.
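
[Editor's note: As a rough, purely illustrative sketch of the mastery criterion Amanda describes above (at least three consecutive demonstrations at 80 percent or better), here is a minimal check in Python. The function name and data shape are our own inventions for illustration, not part of any client system.]

    def meets_mastery(scores, consecutive=3, threshold=0.80):
        """Return True if the most recent `consecutive` scores
        all meet or exceed `threshold` (e.g., 0.80 = 80 percent)."""
        recent = scores[-consecutive:]
        return len(recent) == consecutive and all(s >= threshold for s in recent)

    # Hypothetical mock-interview scores across successive practice sessions:
    print(meets_mastery([0.70, 0.85, 0.82, 0.90]))  # True: last three are all at or above 80 percent
    print(meets_mastery([0.85, 0.70, 0.90]))        # False: a sub-80-percent score breaks the run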

Kyle Ditzian: [00:20:25] So I think that Amanda has an excellent example there of ways to try to account for some of the issues that you're going to come up against: good design, best practices, and stuff like that. One thing that I want to chat about with this list of seven principles is that, in principle, these are all really, really good, right? Doing them should result in very effective instruction. I want to talk about how to do each of these different steps, because there are definitely some important things to consider in each of them. Pros and cons, well, maybe that's not worth discussing too much, but the common issues you run across when trying to engage in best practices here. Because we all know best practices for a ton of different things, but what often doesn't come up when teaching best practices is how to go about dealing with all the bullshit that life will throw at you to prevent you from doing it that way. And so I wanted to get into some of that. One of the first ones that I wanted to talk about was practice and testing. One of the big issues that, let's say, the OBM community faces all the time is coming up with good practice for brand-new OBMers. Right. We have a bunch of people who want to learn how to do OBM, who want to learn how to do performance improvement. But providing an opportunity for them to realistically practice those skills in a real-life scenario can be very difficult, because very few businesses are going to be like, yeah, give me the new guy, I want to get advice from him, you know. So I wanted to talk about that one in the context of, let's say, OBM, but also in a broader context: how do we come up with some solutions to that?

Kyle Ditzian: [00:22:32] Manny and Natalie, you guys have had to deal with this kind of stuff in a huge variety of organizations. Give me something juicy, give me some information.

Manuel “Manny” Rodriguez: [00:22:43] I'll start off and say, you know, in the broader context, how do you solve for teaching, practicing, and testing in context? One of the things that I really appreciate in the world of manufacturing and technical hard skills is the apprenticeship model.

Manuel “Manny” Rodriguez: [00:23:04] So if you think about when you're first learning a trade skill, so welding, pipefitting, you know, what have you: they get some classroom training, they get some education. Some of them even have certification programs or certificate programs. But when they start a job, no matter where they start in the world, they always go into an apprentice program. So they get paired up with a mentor, somebody that's been there, done that, done multiple jobs in that given trade. And they're prepared over a long period of time until they are granted movement up the chain, beyond being an apprentice. And I think that's an important model, because this whole idea of teaching, practicing, and testing in context I don't think can happen all in one sitting. You want to be able to teach, practice, and test in context, and sometimes the context requires time. That's why I really like the apprenticeship model. I like it so much that I'll share my own learning experience as an OBM professional, which was definitely as an apprentice. In my first couple of jobs out of grad school, I was paired with consultants who had been there and done that for 20, 30 years. So for the first couple of years of my career, I was doing whatever those senior consultants said I should be doing, and it was across multiple projects, multiple clients, multiple issues. Right. And there was never one project that was the same as another.

Manuel “Manny” Rodriguez: [00:24:46] So I was getting taught, I was definitely practicing my craft, and I was being tested in context. Essentially, I was an apprentice. And I think that model should inspire all of us to really think about how to make number four happen, and how to do it well. Not from the standpoint of the organization or efficiency or whatever, but from the standpoint of the learner, the individual who is being taught, who needs to practice, and who is being tested.

Manuel “Manny” Rodriguez: [00:25:15] So that's kind of my first response to that one: the way to make it work is really to look at it from the learner's perspective. How do you effectively teach them, get them enough practice, and test them in context over a period of time?

Kyle Ditzian: [00:25:33] So that... well, Natalie, you go ahead and then I'll...

Dr. Natalie Parks: [00:25:37] Oh, OK.

Kyle Ditzian: [00:25:38] I’m going to change topics when we come back.

Dr. Natalie Parks: [00:25:42] You know, I like the apprentice model as well. And in other professions, you can think of it as kind of like the practicum experiences you might have; even as BCBAs, our fieldwork experience is supposed to mimic that in some ways. I think the difficulty is that with apprentice models, the whole system is set up for that. Right. So no matter what company you're with, no matter what specific trade you go into, there's an understanding that you're going to have apprentices, and they serve a very specific kind of position. There's work for them to do; they're still adding to the company in some way while they're getting the training and the learning. And I think one of the things other industries face that makes this really difficult is figuring out how to support that level of oversight and training without the resources that maybe we need to be able to do that. I've spent a lot of time in the ABA world, but also in the educational world, and even in my training as a psychologist. Most of the actual talk-therapy training, and I don't do talk therapy in the work that I do...

Dr. Natalie Parks: [00:27:25] ...but I had to get trained in how to do it in order to become a licensed psychologist. And almost all of that training actually took place within my educational program. So I had classes; I remember my talk-therapy class, where my teacher told me to stop doing functional analyses and spitting out behavior plans for my clients, you know. So if you think about those training programs, and you think about trade schools and the apprentice-type training programs, to me that is a system built to create the training. The problem comes in when we have organizations where the model is not built that way. And interestingly enough, the autism world is not built for that. How often do we get behavior technicians or RBTs who are working one-on-one with clients, and everyone is saying, no, we can't have them overlap with somebody else to get the training; they have to be trained before they ever set foot with a client? But there's a problem with that, because there's no practice and there's no feedback when you're actually doing the work of the job.

Dr. Natalie Parks: [00:28:46] I don't know if I have a solution for that, but I think that's the problem a lot of other industries are trying to solve.

Kyle Ditzian: [00:28:59] So I think that probably one of the most common solutions I've seen to that issue, where we can't provide good on-the-job practice opportunities, is simulated on-the-job stuff. For instance, the way that I got a lot of my clinical training was with another behavior analyst in training sitting across from me, being the subject of my discrete trials, right, and giving me feedback on that. And that does help to some extent. I don't know that there's a perfect solution; there should always be some oversight, especially in the transition to having someone newly there. Another means of bridging that gap is just trying to make sure that that person has an open line of communication for all the problems that they are encountering. And of course, that is difficult to create, or at least to create well. It's easy to make, you know, a random box where people can put all the problems they're currently dealing with: let's put it in the box, great, thank you. But if you can actually create an organization or a context where people feel comfortable sharing the issues they are really facing, that can help bridge that gap. Is there any other input that maybe you, Amanda, or Manny have on that? And then I would like to talk about testing and objectives.

Amanda Barnett: [00:30:42] I love that, and that's a great segue to something I just did. I actually shared some snippets from the Six-Pack series; this one is actually on objectives. In the images I just shared is an objective checklist. And then I also shared something I found helpful: when you're carving out your learning objectives, try not to leave a lot of room for interpretation. So in one of the other images I shared, it's just words you can use that allow for less interpretation. So instead of saying "know," you would want to say "write."

Amanda Barnett: [00:31:17] So that with a lot more observable and so than the other things I share, it is just three characteristics to make sure that you’re learning objectives are really good. So, you know, look at the images. Hopefully, that’s not copyrighted. I didn’t violate any ethical laws here and I did a shout out to that. It’s the, you know, this Robert Mager, did I say it right, [hopefully I did] that it six-pack.

Kyle Ditzian: [00:31:42] Is it "Mah-ger" or "May-ger"?

Manuel “Manny” Rodriguez: [00:31:49] Mager. OK, and you didn't plagiarize, because we also included the reference, and you can buy the book on Amazon; we've been highly encouraging that. And our editor checked: nobody here gets any proceeds from that. So we're fine.

Kyle Ditzian: [00:32:03] I highly recommend the Mager Six-Pack, the six-pack of books written by this author.

Manuel “Manny” Rodriguez: [00:32:10] I think you'd be foolish not to. Also, it is quite pricey, so you probably want to save up for a couple of months.

Kyle Ditzian: [00:32:17] If you were going to get just one of them...

Manuel “Manny” Rodriguez: [00:32:21] That’s super weird.

Kyle Ditzian: [00:32:23] So I’m like way over here now.

Manuel “Manny” Rodriguez: [00:32:28] So while Kyle is lost in the blackness of the screen, one of the things that I wanted to reflect on, coming from an organizational standpoint: I think a couple of people mentioned that funding an apprenticeship model is very difficult, in our field but also just in general; a lot of companies won't invest in it.

Manuel “Manny” Rodriguez: [00:32:47] And from an OBM practice standpoint, a lot of times, if we really, truly want to advocate an apprenticeship model, the best thing we can do is show them the cost of their current training efforts, how much they spend and how much they invest in it now, versus what an apprenticeship model would cost, and the return on investment. And more often than not, I would guess (it's really a guess, I don't have data to prove it) that the apprenticeship model will have the bigger return on investment.

Manuel “Manny” Rodriguez: [00:33:19] It just takes a little longer to implement versus pure classroom training or on-the-job training, stuff like that. But for me, an apprenticeship model is part of a full training program for a company. It's not a substitute for training, because there's still training that you want to do in a classroom, versus web-based, computer-based training for things like compliance; that's more annual training, stuff like that.

Kyle Ditzian: [00:33:50] So one problem that I have seen with apprenticeship models, and learn-on-the-job-type models, is that oftentimes there is no structure to them, and that can make things feel really awkward for the new person, who has basically just landed in front of this experienced person and been given no direction for what to do there. Any advice for setting up a solid apprenticeship model that has, you know, some drills?

Dr. Natalie Parks: [00:34:26] I think it comes back to a conversation we all had a little bit earlier, thinking about how you standardize training in some respects: maybe looking at the set of skills that need to be demonstrated and in what context they need to be demonstrated. That's probably going to vary from company to company. But at least internally, as an organization, you could look at each job position and say, here is the set of skills that need to be demonstrated, at what fluency, how often, all of those things. And then you come up with, maybe, checklists for your objectives: to what criteria do these skills need to be trained, and in what context do they need to be demonstrated? That at least guides you. Even if I am random supervisor number one who got random candidate number two, I at least now have a set of instructions and checklists and procedures that I'm supposed to follow, you know. But that's my thought on it, at least in terms of how you ensure that it becomes a better, more consistent training model.

Kyle Ditzian: [00:35:54] So essentially you try to center it around objectives, and specifically decently written, active objectives?

Dr. Natalie Parks: [00:36:00] Yeah, I think so. I think objectives, and objectives that are tied to the demonstration of specific skills.

Kyle Ditzian: [00:36:09] We should talk about objectives.

Amanda Barnett: [00:36:11] This is a perfect segue.

Kyle Ditzian: [00:36:15] Yeah. So we talk about writing good objectives all the time, and I think that having good objectives is extremely important to having a good training. If you do it right, you get your objectives down first and then write your training to accomplish those objectives, to get people to actually be able to do them by the end. So I wanted to talk about best practices for actually writing good objectives, which, as you guys were privy to earlier, I have fairly strong feelings about.

Amanda Barnett: [00:36:51] Can I just frame things up real quick with image twenty-seven that I shared? Everybody, please look at those images I shared; they're going to be very beneficial. And then, Kyle, the image I shared is just an objective checklist that talks about performance, conditions, and criteria, and there are links related to that as well. I don't want to take away from what you were going to talk about, but I just wanted to mention it.

Kyle Ditzian: [00:37:17] Well, I'm having a hard time trying to open this up; that's just my own fault. So I think that the most important thing when actually coming up with these objectives is making them active. Even a ton of behavior analysts' objectives that I have borne witness to at one time or another are, one, not specific, and two, not active.

Kyle Ditzian: [00:37:44] And those are, in my opinion, the two most important things the objectives need to be. There should be some demonstration in those objectives where you prove to me that you can do the thing that I need you to do. And if the objectives aren't worded in a way that forces people to actually prove it... right, like if your objective says something along the lines of "learner will understand such and such," that's bullshit. Give me a demonstration. How do I know that you actually understand? The objective should have something that's clearly written, like "learner will be able to tell me these five different things," or "learner will do this." And if your objectives are not written actively like that, with some sort of specific, measurable criteria, then they should probably be updated, because otherwise you're going to end up with people who do five different things that all technically meet the terms of your objectives, you know, and you end up back at square one.

Dr. Natalie Parks: [00:38:46] I don't have anything intellectual to add, but when you said that, it reminded me of all these people I talk to where I ask, well, how do you know that they know? And they're like, because I told them. And I'm like, did they repeat it back to you? Did they demonstrate some sort of understanding? But I think that... [Oh my God, that baby is so cute.] Sorry, I got distracted by the baby. But I get similarly frustrated about the objectivity, or lack thereof, of objectives. I mean, it's everywhere, right? How do we take all of the subjectivity out of our measures? I think that goes into training, but it really goes into anything that we're going to use to measure what people are doing within an organization. We should be trying to take all of those subjective, unclear, not truly measurable things out.

Kyle Ditzian: [00:39:55] So you kind of talked about testing a bit. Oh, no, Manny, do you have some input? OK. You kind of talked about testing, and I wanted to tie objectives and testing together. Do you feel that for an effective test to exist, it requires the existence of good objectives?

Dr. Natalie Parks: [00:40:17] I would say yes. Otherwise, what are you testing, and how do you know if they have it? I could ask you a whole lot of things, right? I could create a test, but I don't necessarily know if that test is measuring your skills as they relate to what I want you to do. So yeah, I think it all starts with the objectives. And then there are two types of testing that we talk about a lot in organizational behavior management: performance-based versus competency-based. And I think your objectives tie back to that. Right. So if you want somebody to know or understand, then you might be able to do a competency-based test, in terms of their factual understanding or knowledge base of a topic area. But if you want somebody to be able to perform a skill or do something a certain way, then you probably need performance-based testing, which looks at how well they demonstrate that skill or how fluent they are at that behavior.

Amanda Barnett: [00:41:36] Oh, I'm not muted. OK, great. And apparently it helps if I stare really closely at my computer; you know, that always helps. So I think this gets into one of my main things, which is how you assess who's going to be the trainer. One of the things that I've been working on, with another construction client, is a situation where the trainer wasn't necessarily trained on how to test somebody's knowledge. And so it would just kind of turn into awkward ride-alongs, with the new hires right along with him, for months. We're talking months of wasted productivity, with no really clear objectives for what he was supposed to be showing these new hires. He just basically said, OK, I guess you're just going to watch what I'm going to do. And I see that a lot with apprenticeship models, going back to what Kyle was talking about.

Amanda Barnett: [00:42:26] So basically what I did is I went on the ride-along, kind of like what I did with the earlier animal-matching example, and I said, OK, what are the common things you come across? And then we started writing down not only what is in the SOP, but the things that aren't in the SOP that you have to decide on, that you honestly come across very often, that you try to capture in these ride-along trainings or shadow trainings, so to speak. So we created all these objectives, and then we had a mock area where I had the trainer come and show me what he would look at. And then we traded places: he had to train me, and then I had to train him. But mostly it turned into him realizing he wasn't asking tough questions and testing knowledge. He was just talking at them, saying, oh, well, that's what needs to get fixed and that's what needs to get done. And that begs the question: how do you know I was listening?

Amanda Barnett: [00:43:25] You know, what if I was thinking about dinner later, you know. No, you didn’t ask me, you know, so that so ended up being a fun training.

Amanda Barnett: [00:43:34] And, you know, really, it was I needed to write down prompts for him and what questions to ask as a trainer of like, OK, before I do this, walk around this property, what should I be looking for? So that was like another good one. And then you can test where that person is. And, you know, can they actually say what the procedure is or whatever it might be? The other thing that with this individual we were working on. So he had a hard time with questions, but he had a hard time with feedback, too. So, you know, if I missed something, I actually said, I want feedback. I want to know if I missed something. So I would intentionally, like, do something great. But then I would also intentionally miss something to give him an opportunity to provide me feedback, you know, and we made it fun. But it was one of those things that he got more comfortable with the more he practiced. So I was honestly using the best method. I had my competencies as a trainer. Here are the things that I want you to demonstrate. Went over it with them. I demonstrated that myself and then I had him repeat it. And then we provided feedback to each other. That also got sidebar by covid. So, you know, I was actually hoping to turn these two things I talked about into research that I would have been super cool, I anticipated was actually going to decrease the amount of time for training by 50 percent, actually, potentially for both clients.

Amanda Barnett: [00:44:49] But I digress. So maybe next time, I guess, you know, we’ll see. I also am not sure if you can see the images, so I’m typing out what some of those things are for objectives as well. But I guess a question back to you guys is what do you guys look for in a trainer and what are those competencies?

Manuel “Manny” Rodriguez: [00:45:09] So I'll start, I guess, with what I look for in a trainer. Definitely, the trainer has to have experience in what they're training. I think a trainer who is solely training by a book or by some type of literature won't be able to add the color commentary that really amplifies a good training and learning experience. Right. So whether it's examples or stories or, as Kyle was mentioning, the mishaps, the things that failed and why, a trainer with a good amount of experience in what they're training can do all of that. So I look for experience, that's for sure. It's kind of a broad statement, but if a trainer has the experience, I think that's probably the first thing. And then the second thing I'd offer is good, strong facilitation skills. We've probably all had an educator of some sort who was very monotone, very dry, not very enthusiastic, a lot more direct instruction than anything else. Those kinds of facilitators and trainers just bore the crap out of me. So for me, number two is: I want somebody, not to be entertaining per se, but definitely energetic, somebody who can hold themselves in front of a room and really get people to learn and enjoy what they're learning. So those are my two.

Dr. Natalie Parks: [00:46:47] I would agree with those wholeheartedly. The only thing that I would add, and I keep thinking about this, probably because I have a slight bias towards feedback right now, is feedback: not necessarily exactly how well, but how willing people are to receive and give feedback. Even thinking about training and using the BST model, the most effective component of that is the feedback. So when I'm looking at somebody who is going to be doing training, in addition to having knowledge about the subject area and experience in it, and being entertaining enough or passionate enough about what they're doing, I'm also looking at their willingness and ability to give and receive feedback.

Kyle Ditzian: [00:47:47] Let's imagine a scenario where you have plenty of competent workers, but none of them have that engaging personality or are good at delivering feedback. Where do you draw the line? Do you take someone who is plenty experienced but maybe boring, and then try to fix that part? Or do you try to bring someone else up to speed who is perhaps more interesting to put in front of people?

Kyle Ditzian: [00:48:22] Or is this an “it depends” kind of question?

Manuel “Manny” Rodriguez: [00:48:24] Look at Natalie's face; she's like, "I don't know."

Manuel “Manny” Rodriguez: [00:48:38] I've seen where both can work, so you have to know your audience. If your audience reacts better to someone who's more engaging versus someone who's more intellectual, then you put in the more engaging person. A great example of that is a professional academic conference, which many of us have attended: if you put up a highly engaging person who's not exactly scientifically sound, they get eaten alive, because the audience wants the scientifically sound speaker. Right. But if you put that same engaging speaker at a TED-talk-style convention, they're going to be more accepted, right? It doesn't mean the engaging person doesn't know their stuff; it just means they present differently. So I think you've got to know your audience. Both types of presenters have value; it just depends on the audience.

Amanda Barnett: [00:49:48] I have some tips and tricks I want to add; I'll type them up on the side. I actually learned these from Yendri Diaz, and she has a great organization called Skillometry. I highly recommend her trainings if you come across them. One of the things she talks about is that with an audience, you actually kind of lose them after about eight to 10 minutes. So make sure you're breaking things up and doing something active: an activity, some type of question, some type of worksheet. I personally like to follow that model as well. And there's research on this. It's not actually OBM research, oh my God, but in academia they talk about the best ways to stay engaging and keep your audience so they attend to you. So I just wanted to add that. And then I just wanted to sidebar: does anybody have any questions? Because this is an amazing panel of people, and I'm not including myself in that. You know, we have nine minutes left, and I want to make sure that we have enough time for questions for people who might be coming across training issues or things you want to problem-solve.

Amanda Barnett: [00:50:56] And if you do yourself if you can or just type up in the Zoom group chat.

Dr. Natalie Parks: [00:51:05] One more thing, going back to your question, Kyle, about the two types of people and who I would put in as the trainer. I agree with Manny's point about considering the audience, but I also think that there's something to be said for when people are being, and this is going to sound really non-behavior-analytic, I have to preface it that way, genuine to themselves. And so I think if you can teach somebody when and how to give feedback, meaning it should happen quickly after the behavior, just before the next behavior, and it should be behavior-specific, and all of those things, then when you get a trainer who maybe isn't as lively, but you have an audience that's used to somebody who's pretty lively and animated, you could still have a really good trainer if they have the good training behaviors present. And I think people actually respond to that; I think audiences and learners respond to that. They might say, oh, well, this person isn't as entertaining, but I still got a really good, high-quality experience. And so, as I thought more about it, I think maybe it wouldn't matter. I might pick the person who actually has more experience and is the better trainer. The other thing we can do is start shaping and reinforcing the trainer's behavior to be a little bit more lively. So if they have a little bit of intonation in their voice, we'll give better feedback and social reinforcement, as opposed to when they're super dry.

Kyle Ditzian: [00:53:06] So one thing that makes me think of: one of the common issues, in my personal opinion, with a lot of training research, especially outside the realm of behavior analysis, is that the most common evaluative measure of training success is subjective: how did you like the training? Do you think that there is value in collecting that data? How important is that data compared to actual effectiveness in behavior change?

Manuel “Manny” Rodriguez: [00:53:39] You ask a behavior change question to a bunch of behavior analysts, guess what you’re going to get?

Kyle Ditzian: [00:53:44] That’s fair. I did kind of walk into that one.

Manuel “Manny” Rodriguez: [00:53:48] I'll answer that. I think both data sets are important.

Manuel “Manny” Rodriguez: [00:53:52] If we think about behavior analysis 101, we definitely focus on behavior change. We definitely focus on wanting to make sure there's transfer of training from the training setting to the environment, right, in context. But we also talk about, and are very heavily focused on, social validity as well, and I don't think we should lose sight of that. If you sit through a training and, even if you learned something, you would never want to take that training again, or advocate to others to take it, or refer people to take it, then is it really a good training experience? So I think content is important, I think structure is important, and I think the transfer of learning is important. But I also think you have to like the training [It has to. It has to.]; it has to be socially valid in that respect. That's my two cents. Because if it's boring, you don't want to take it, and you don't want to send your friends to it.

Amanda Barnett: [00:55:12] You know, I have a story I'm just going to share; it's not very long. When I was younger... so I have ADD, for those of you who don't know.

Amanda Barnett: [00:55:22] And I’m highly distracted by things and I actually don’t test well, and my IQ, when they tested it when I was a kid, was like over 120 and not surprised the crap out of my teachers because they’re like, I don’t get it. She doesn’t pay attention in class. It was because a lot of the times they couldn’t get my attention and they didn’t realize that. And also, I mean, it could have been when I was learning fractions, I knew what it meant. But I also thought by half I could just draw a circle and color it in half and that. And so I turned in. Oh, my God. Look like hieroglyphics, you know? So I’m sure they’re like, there’s something wrong with this kid. But and then one, they couldn’t figure out what to do. They’re just like, well, if I can’t teach him how to teach them, because that’s going to go well, you know. So it’s like, you know. But when we talk about making sure you can get the attention of the audience, it’s so important to get your analysts know this. So he’s not going to tend to you forget about it. So you know what? I think the other thing we talked about, too, is, you know, how do we maintain this? So I guess my last question for the panel on this, if anybody else has a question and normally if I was facilitating this, I would posture like a good 30 seconds until somebody else spoke. I don’t have time for that. But how do you guys how would you measure to see if you had the long term impacts of a training that you hope for and making sure you’re saving up for maintenance in the future?

Kyle Ditzian: [00:56:51] Sure. Oh, no. Natalie, go ahead.

Dr. Natalie Parks: [00:56:53] No, you go ahead. I was just going to say I like how we all just paused. [laughing]

Kyle Ditzian: [00:56:58] Who's going to answer this? So let me get this down. To reframe your question: how do we make sure that training sticks, essentially. Yeah?

Kyle Ditzian: [00:57:07] [nods yes] OK. I mean, I think that the most important thing, the initially important thing, is going to be establishing some sort of competency and actually testing it. If we test and give feedback, and the performer has to actually demonstrate competency, I think that is very much the first step. The next step is having some sort of follow-up: either data collection on ongoing performance, with feedback on it, or a check-in some period of time after, or both. Anything that can add accountability to your training is going to dramatically help establish it in the long term.

Kyle Ditzian: [00:57:56] I have to bow out, but I wanted to leave on a quote that I am stealing from the coffee mug that I had this morning, and it happens to be very, very poignant, we'll say. This is from Fred Keller: "The student is always right. He is not asleep, not unmotivated, not sick, and he can learn a great deal if we provide the right contingencies of reinforcement." And I think that that is super, super powerful. So thank you, Fred Keller, for your time. As long as you can establish those contingencies and keep them going past the training, I think that is probably the most important thing.

Kyle Ditzian: [00:58:37] Contingencies maintain behavior. Training can only get them started.

Kyle Ditzian: [00:58:46] And with that, I will leave. Thank you.

Dr. Natalie Parks: [00:58:49] Mic drop! I think you said it perfectly. I mean, I think that was perfectly said, and we're also right at our time, so that might have been the perfect ending. Mic drop; we'll just end it there.

Amanda Barnett: [00:59:05] I wish I could have just dropped a smoke bomb and gone poof, and then you just left.

Kyle Ditzian: [00:59:08] Oh, I should have.

Amanda Barnett: [00:59:13] There you go. Yeah, that’s a mike dropper.

Manuel “Manny” Rodriguez: [00:59:15] I think Kyle got the nice last scoop of this panel. Thank you, Fred Keller, and thank you, Kyle Ditzian.

Kyle Ditzian: [00:59:28] Thank you, everyone. I just happened to pick the right coffee mug this morning.

Amanda Barnett: [00:59:33] Yeah, well, thank you all for joining us. We are doing this every Friday, and we're going to pick a different topic each time. For those of you who can join, great. I think we're going to try for 12 to one.

Amanda Barnett: [00:59:48] Hopefully this is a good time and then record it for if you want to use, you can get to use it and you can just go Behavior Leader university, is that right?

Dr. Natalie Parks: [00:59:56] University dot behavior leader dot com. (university.behaviorleader.com)

Kyle Ditzian: [01:00:02] All the cool kids are doing it. I highly recommend.

Amanda Barnett: [01:00:04] Highly recommend.

Amanda Barnett: [01:00:07] All right, well, thank you guys, and hope you have a lovely weekend.

Kyle Ditzian: [01:00:11] Thanks, everyone, for tuning in. Bye bye. [music]
