AASHTO re:source Q & A Podcast

Mastering AASHTO R 18: Common Pitfalls

AASHTO re:source Season 5 Episode 3

This episode reveals laboratories' most common pitfalls with implementing AASHTO R 18, such as calibration record-keeping, training and competency evaluation, and internal audits. Learn how to sidestep these issues by implementing robust quality management practices. Listen in to transform your laboratory practices and get tips to move beyond standard compliance.


Have questions, comments, or want to be a guest on an upcoming episode? Email podcast@aashtoresource.org.

Related information on this and other episodes can be found at aashtoresource.org.

Kim Swanson:

Welcome to AASHTO Resource Q&A. We're taking time to discuss construction materials, testing and inspection with people in the know. From exploring testing problems and solutions to laboratory best practices and quality management, we're covering topics important to you.

Brian Johnson:

Welcome to AASHTO Resource Q&A. I'm Brian Johnson.

Kim Swanson:

And I'm Kim Swanson, and we have another one. It's the first of this season, but it's another Common Findings episode. Which one are we talking about? What standard are we talking about today, Brian?

Brian Johnson:

We're talking about everyone's favorite standard, AASHTO R18, the basis and foundation of AASHTO accreditation.

Kim Swanson:

It is, and so that means that everyone in the AASHTO accreditation program is accredited to AASHTO R18. So this is a very wide reaching episode, I hope.

Brian Johnson:

It should be, and I did want to mention, if you had not heard yet: this is episode three of season five, and it is available not only in audio form but also video, so you can see us staring at a screen or a camera blankly while the other one talks, which is really exciting. It's riveting.

Kim Swanson:

It's a riveting video. So, yes, go check out our YouTube channel, and you can actually see this wonderful dynamic that Brian and I have. We're matching today, totally unplanned. So, yeah, it's a great time. Go check out our YouTube video if you can.

Brian Johnson:

Yeah, and that's not all. You can see what my basement office looks like in my house and you can see the cell that we hold Kim in.

Kim Swanson:

It does very much look like I am in a prison cell. It is just my living room or dining room. If I know what room I'm in, it's my dining room; it just happens to be a very blank wall behind me. So very exciting, but it does look a lot like a prison cell, yeah.

Brian Johnson:

So that's what you can check out on our YouTube channel. Yes, and I'm sure everybody is just frantically going.

Kim Swanson:

Yeah, I'm sure. I am sure everyone just stopped listening to however they were listening and went right to YouTube. So exciting things.

Brian Johnson:

Yeah, all right, let's get into it.

Kim Swanson:

Yes.

Brian Johnson:

Because we don't want to waste too much time. We're going to talk about common R18 findings and we're going to do this a lot like we've done the other common findings episodes, where I ask Kim questions that she couldn't possibly know the answer to, and then she guesses and then we see where it goes from there. So there will be some questions here and you can kind of play along at home and see if you do as well as Kim does, or maybe better. I hope you do better than me.

Kim Swanson:

I hope everyone that is playing does better than I, because I don't know anything. So this is going to be very exciting for me.

Brian Johnson:

All right. So first we're going to ease you into this and I will not ask the first question, but I'm going to tell you about the most common categories of findings, and they are calibrations, standardizations, verifications, checks, maintenance records, all of that stuff which, for the purpose of being concise, I am just going to call it calibrations, because most of our customers just call them that. They're not all categorized as calibrations, but I don't want to have to say that long string of words every time I mention this.

Kim Swanson:

Okay, all right, it gets tedious. Duly noted, you're going to refer to them as calibrations, but it's everything that you just said, so got it.

Brian Johnson:

Yeah, any equipment records.

Kim Swanson:

Okay.

Brian Johnson:

So this was a section of data that I pulled from our system over the course of, I think it was almost two years, it was like a two-year data set, and there were 3,717 findings written on calibrations. Next was training and competency evaluations, with 2,837 findings written on that. Typically that means the laboratories had not presented records on either training of staff or competency evaluations of staff, and that could be a number of issues going on that we'll get into. But those two categories were far higher than any other categories; those were somewhere around the 3,000 range. And then the next one, the third, goes all the way down to 371. So that shows the discrepancy. Yeah, wow, so super common issues, and then, not as common, organizational issues. That could be the org chart wasn't updated or they didn't have one, position descriptions were not in conformance with R18, a bunch of other things like that. Then we've got 306 findings on management reviews.

Kim Swanson:

Okay.

Brian Johnson:

I thought that would actually be a little higher. People really struggle with that one. And then 216 for internal audits.

Kim Swanson:

Okay.

Brian Johnson:

And I think that shows progress on the internal audit front that there were not as many as management reviews, because a lot of times those two are neck and neck with each other. So those are the most common categories for R18 findings in general.

Kim Swanson:

So, for some context, there are over 2,000 laboratories accredited to AASHTO R18. So those numbers, like the 3,000 findings or nonconformities, is that spread out over the entire 2,000 laboratories that are accredited?

Brian Johnson:

Not exactly. The way it worked is I pulled data from two years of assessment reports from AASHTO re:source and CCRL, so what that means is you may have some laboratories with no findings in a particular category and you may have some with 16 or more findings.

Brian Johnson:

Okay, on that, it's really very hard to get a sense of exactly what to expect if you were coming into the program based on these numbers, because it's really dependent on how the laboratory manages the quality program in their facility. I said I would add more information about what type of calibration issues there were, and I do have more data on that. And I think this is a little telling: so there were, sorry, 3,717 findings, and 2,406 were just not presented. There was a record straight up not presented. So they were operating with something just completely unknown. I wouldn't say out.

Brian Johnson:

So this is another misconception. People say, well, if they weren't calibrated, they must have been out. Not necessarily; it may have been fine and they just didn't know if it was or not. They didn't have a record to show that, which is not something you want to have as a policy, that we're just not going to do it and assume that it's in. Obviously, if you're accredited you can't operate that way, but there are lapses that happen occasionally, and there are times when people can't find the record at the time, which isn't good either, yeah, but it's not as bad as not having it at all. And then another big drop: we go from about 2,400 to 544 that were missing results. So that could be, let's use a thermometer, because that's a typical one: maybe they had a thermometer calibration record, but it didn't have the test points that were needed.

Kim Swanson:

Okay.

Brian Johnson:

So that would factor in on that count. Other ones: intervals not correct, so they might have had a 12-month interval when there was supposed to be a six-month interval. And then some just details missing: equipment IDs of the calibration equipment, or missing the ID of the equipment itself, or misidentifying it. We've run into this issue occasionally where people will replace a piece of equipment and give it the same ID number as the one that isn't there anymore, which you can't really do.

Brian Johnson:

No, I don't think you can do that. It seems like it would be very confusing, and it's hard to tell what you used if that's what you've done. But we've run into that more than a few times, where people really push back on the concept that each one has to have a unique identifier. But that's kind of the whole point: it has a unique identifier so you can trace back to what went wrong, if something did go wrong, or which equipment was used.

Kim Swanson:

So, going back just a little bit, you mentioned the intervals not being correct. Now, can laboratories have different intervals than what's stated in R18 if they have the records showing that they don't need to, like they've proven that they can go longer, or something like that? I believe I recall something like that. Am I wrong? Is this relevant, or no?

Brian Johnson:

It is. You're absolutely correct. It is possible to change the prescribed interval if you have data to support it, and this can go either way. So let's say there's an interval that says I have to do something every three months, and I have years of data showing that this is just too frequent.

Brian Johnson:

Like I never see changes on this; I want to extend it to six months or a year. If you have the data to support it, yeah, you can do that. And, conversely, if you've got something that you're doing, let's say you're calibrating it every year, but every time you find that it's out of spec, from overuse or misuse, you need to shorten that interval, and you don't need any special permission to do that. You've already proven that you needed to, because you're not able to show that it's still in spec: either it's drifted out, or some factor is playing into it that you have not resolved, or maybe you just do a ton of testing and it needs to be shorter. So you can make those changes on your own.
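Brian's decision rule, shorten the interval on any out-of-spec finding, extend only with a sustained history of passing data, can be sketched in code. This is an illustrative sketch only; the data layout and thresholds are invented for the example, not anything prescribed by AASHTO R 18 or R 61:

```python
from datetime import date

# Hypothetical as-found calibration history: (date performed, passed?)
history = [
    (date(2021, 3, 1), True),
    (date(2021, 6, 1), True),
    (date(2021, 9, 1), True),
    (date(2021, 12, 1), True),
    (date(2022, 3, 1), True),
    (date(2022, 6, 1), True),
]

def suggest_interval(history, current_months):
    """Suggest keeping, extending, or shortening a calibration interval.

    Illustrative rule only: any as-found failure means shorten; a long
    run of passes supports a data-backed case for extending.
    """
    results = [passed for _, passed in history]
    if not all(results):
        return current_months // 2 or 1  # found out of spec: shorten
    if len(results) >= 6:                # sustained passing history
        return current_months * 2        # candidate for extension
    return current_months                # not enough data yet; keep

print(suggest_interval(history, 3))  # six straight passes -> suggests 6 months
```

The point is that either direction of change is driven by recorded results, which is exactly the evidence an assessor would ask to see.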

Kim Swanson:

Now, do you have to have policies and procedures in your QMS describing how and why you change your intervals, documenting that process?

Brian Johnson:

You don't necessarily have to have a procedure written for how you did that, but you will have to show the data and show how you did it. There is an AASHTO standard that kind of takes you through that decision-making process, and you can follow that. I believe it's R61. See, now I'm going to have to check that and make sure that's accurate as well. Good news: it was R61. Other items with calibration include the lack of uncertainty on calibration records for measurement standards. There were 145 labs that did not have uncertainty, and 84 of them also did not have the 17025-accredited calibration, which is probably why they didn't have uncertainty. So people are still figuring out how to deal with those issues. So, question for you.

Kim Swanson:

Yes, okay. This is my time to play. Okay, ready.

Brian Johnson:

What do laboratories typically struggle with when resolving calibration nonconformities?

Kim Swanson:

I would say with proof or evidence that it was actually calibrated, like that the corrective actions they took were actually taken? No? Is that what you're asking?

Brian Johnson:

Well, that's a big part of it. I think a lot of times what happens is that they resolve part of it, or they don't really know what to do and they haven't looked back at all of the requirements. These things are often layered: you've got your R18 requirements, and you've got your requirements in the test method or whatever other standards the equipment is used in, and they may not look at those two items together to see, oh, what test points do I need? But, as it is with so many issues that occur, the root cause of the problem is often not identified, and a lot of it is communication.

Brian Johnson:

I think, if you're ever being asked about what is important or what could go wrong: there's a lack of communication between the laboratory and the agency they hired to perform those calibrations, or vice versa. So that would be one of the things.

Kim Swanson:

While the provider that the laboratory chose may be 17025 accredited, they didn't ask for a 17025 calibration, so they didn't get the information that they needed. Even though the calibration agency is accredited for it, they didn't provide that level of service because it was not specified by the laboratory. Is that what you're kind of talking about?

Brian Johnson:

That happens. That is part of what I'm talking about.

Brian Johnson:

The other thing is, okay, well, the calibration agency should ask the laboratory, what points do you need, what are you using your equipment for? And then they can have that conversation about how to deliver the services that that laboratory needs. Another issue that we've found can happen is the calibration agency's technician might show up and perform the work, and they might find some problems and not communicate those to the laboratory, and then leave them with a bunch of records, some saying that things meet specifications, some that don't. But if there's no communication about that, those can just go right into a folder or a file, or, even harder, a file on somebody's computer or in an inbox, and never get looked at until it's too late.
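The filed-and-forgotten failure mode Brian describes can be guarded against with a simple triage step, so nothing gets archived until the out-of-spec records have been reviewed. A minimal sketch, with made-up record fields:

```python
# Hypothetical calibration records as delivered by the agency.
records = [
    {"equipment_id": "TH-01", "in_spec": True},
    {"equipment_id": "TH-02", "in_spec": False},
    {"equipment_id": "OV-01", "in_spec": True},
]

def triage(records):
    """Split incoming records so failures are reviewed before filing."""
    needs_review = [r for r in records if not r["in_spec"]]
    ok_to_file = [r for r in records if r["in_spec"]]
    return needs_review, ok_to_file

needs_review, ok_to_file = triage(records)
for r in needs_review:
    # These are the records that must not go silently into a folder.
    print(f"REVIEW BEFORE USE: {r['equipment_id']} reported out of spec")
```

However the lab implements it, the design choice is the same: reviewing results is an explicit step, not a side effect of receiving the paperwork.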

Kim Swanson:

Yeah, so it's just assuming that, well, they didn't say anything, so it must be all within specification, which is not always the case. So the laboratory has to actually verify on the record what was actually said.

Brian Johnson:

That is correct.

Kim Swanson:

They can't assume no news is good news in this instance.

Brian Johnson:

Definitely not, and it is a challenge. I mean, I don't want to make it sound like it's a lack of effort. Sometimes it's just a lack of availability, because the contact person at the laboratory might just be running around all the time, yeah, and it's hard to find them. And the person's like, I have to leave these with somebody, or I have to send this to them after the fact. Typically they'll send them the official one after the fact anyway, but they might not be able to easily have that conversation that they need to, I don't know.

Brian Johnson:

There's a lot of things that can go wrong. That is a common issue. Okay, let's get into training and competency evaluation. So, all right, we talked about there being a lot of findings related to this; it's number two, with a big gap between two and three, on training and competency evaluations. What do you think the main issue was?

Kim Swanson:

I'm going to take the lead from the calibration one and say records were not presented.

Brian Johnson:

You nailed it. Yes, that is correct. So score one for Kim Swanson today. Most of them were just not presented, and then a big drop-off, and this one actually kind of surprised me: it was 2,482 instances of a record not being presented, and only 355 times that a test method or practice was missing from the record. So maybe they had a record for somebody, but it was missing one or two of the test methods that technician performs. Normally, the areas where a laboratory can struggle here are due not necessarily to a lack of communication, but just a lack of keeping up with what is going on at their laboratory. So they may have picked up a new technician, or they may have started doing new testing and forgot to document the training, and they may not have a great system in place for doing that.

Brian Johnson:

You know, one thing that we've been working on at AASHTO re:source is an improved onboarding process, and I think the equivalent would be putting together a checklist at your laboratory. This is one of those things that's outside of R18, it's outside of all of the standards, but you really have to think about those non-standard things that are helping you operate your quality management system better and implement those. So put a checklist together: when I hire somebody, this is what I need to do. And one of those things should be making sure that they have training records, making sure they've been trained on everything they're going to do for you.

Kim Swanson:

And a similar checklist for when you're adding a new test to the scope, right? Like making sure, are the people trained? Do we have it documented?

Brian Johnson:

That's right, not only with new hires but with new additions to your scope of testing. And the ones that have done this really well typically will maintain a matrix of all of their staff and all the tests that they perform, or all the activities that they carry out. That way they can easily keep track of who needs what, and whether they have any gaps between what they say they offer and what they can actually deliver.
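The staff-by-activity matrix Brian mentions is easy to keep as plain data and check for gaps. A rough sketch; the technician names and test designations here are purely illustrative:

```python
# Hypothetical accredited scope, and who has documented
# training/competency records for each test.
scope = {"T 27", "T 99", "R 76"}
training = {
    "Alice": {"T 27", "T 99"},
    "Bob":   {"T 27"},
}

def coverage_gaps(scope, training):
    """Return scope tests that no trained technician covers at all."""
    covered = set().union(*training.values()) if training else set()
    return sorted(scope - covered)

def technician_gaps(scope, training):
    """For each technician, list scope tests they lack records for."""
    return {name: sorted(scope - tests) for name, tests in training.items()}

print(coverage_gaps(scope, training))    # ['R 76'] -> nobody is trained on it
print(technician_gaps(scope, training))
```

Run against the real scope, the first check answers exactly the question in the episode: can the lab actually deliver everything it says it offers?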

Kim Swanson:

Yeah, and I think your clarification on the activities they perform, and not just the tests, reminds me of, I think, one of your pet peeves. Maybe not, maybe I'm just projecting, but the assumption that a technician knows how to calibrate or check equipment because they know how to run the test. Isn't that something, I don't know?

Brian Johnson:

I feel like we've had a conversation.

Kim Swanson:

I feel like we've had a conversation about that before.

Brian Johnson:

We have. So that was one of the main issues I was trying to get added to R18: that there needed to be documented training for any technician who's being asked to also calibrate, or, you know, the other words I mentioned in the beginning, standardize or check the equipment that they are using. Because that is often overlooked and often assumed by management to be something technicians will know how to do, but it is a quite different process than performing the test with the equipment, and so there does need to be care taken to make sure that they know what they're doing before you just say, here, do this.

Kim Swanson:

Yeah, but that is currently not in R18. That's just kind of a best practice that laboratories should follow.

Brian Johnson:

That's correct. But it's so critical and it comes up so often that, to me, it should be in R18. So we will continue to work on that and see if we can get it added. But for those of you who understand how the AASHTO standards are developed, you'll know why that can't just be done. And for those of you who don't, it's because I am a staff member at AASHTO; I do not vote on what gets added to the standards. It is the Departments of Transportation materials engineers, or their designees, that vote on the additions, removals, or edits to the AASHTO standards. So if the collective wisdom is to not include that, then it does not get included, even if I am trying to get a change made.

Kim Swanson:

Yes, you can only do so much, because they are consensus standards. So it is not just what Brian wants. That's right, it is the AASHTO standard, not the Brian standard, and the same with the AASHTO Accreditation Program.

Brian Johnson:

We are not operating based on my whims. There is a big process and we've got tons of content on that, so I'm not going to get too into it. All right, let's move on. This is another issue that comes up on this. Do you think that the problem lies with the implementation of the requirement or that there is a possible problem with the requirement itself?

Kim Swanson:

Oh, so, me knowing nothing, this is a great opinion question. I'm going to say, again, you said there's no right and wrong, but just from context clues I have a feeling that it's probably more in the implementation than the requirements. But it could be a good mix of both; I don't think it's just one or the other, most likely. But if I had to lean, I would say it was implementation, and lack of communication and lack of clarity on how to implement those correctly.

Brian Johnson:

Yeah, I think you're probably right on that one. And I know you say you know nothing, but you've been around; you've absorbed a lot of information that you might not want, but you have, over the years. I think a lot of it is implementation, based on the gross number of findings that there are on this topic. I say that because we're looking at these and the outliers are very high, but they're not just a case of, oh, you got unlucky there and you got some nonconformities. There are some problems with how difficult it is for people to keep up with this stuff. So it does make me wonder if there are some things that can be done to make it a little easier to conform. And when I say easier, I don't mean, oh well, it doesn't matter, we should just punt on this concept and people can do what they want. What I mean is maybe some of the specificity of the requirements is too stringent, and maybe there are some unnecessary items in the details.

Brian Johnson:

Because ultimately, if I felt like people were still doing a great job and this was still coming up, I'd say, oh, it's definitely a problem with the requirement. I don't feel that way. There are definitely some improvements that need to be made with the quality of testing, but I think that we do need to look at the standards any time we see a large number of nonconformances, to see if there are some changes that could be made. And I think that there are. But it's going to take some risks, and any time you loosen anything up, you're taking on some risk, right? Yeah. So there will have to be some reflection and some consideration of what risks we're willing to take on and whether they're worthwhile. I'm not sure how much we'd want to loosen up, but I think that there are probably some changes that could be made.

Kim Swanson:

Yeah, and maybe it's not loosening up but streamlining it, you know, like you're not loosening it, but you're making that process easier, so maybe they don't have to look at all of the other data points. Maybe it's a consolidation of things in one area instead of loosening up. Because I don't love that, as a user, as the public, enjoying the benefits of the testing, just driving my car down the road or across the bridge and not worrying about it collapsing, I don't love the idea when you say loosening up. But I would agree that there are always likely opportunities to streamline the process and make it more clear and concise, but also to ask, like you said before, are the requirements making a difference? Is there data to support whether it matters that you check this X, Y, and Z, or something like that?

Brian Johnson:

So yeah, and I think it's a mixed bag. There are some areas where I think there is very little evidence to support the need to do certain things, and there is plenty of evidence to support the need to do others. So it is a mixed bag, and we just have to keep thinking more about it and not just assume that everything that is already the way it is is there because it needed to be. That's all part of continual improvement of these processes, so we abide by that concept and we'll continue to work on it. Okay, for organizational issues, we've got 146 findings on the org chart, with 120 findings on position descriptions.

Kim Swanson:

Okay, so, with the org charts, why does it matter that they're accurate and up to date, and why should laboratories care about that?

Brian Johnson:

There are a couple reasons why. Number one, they need to know what their organizational structure is at that laboratory, and it's good for everybody on their staff to know what that is. Number two, it is a way for them to keep track of who is working there and who is under each person on the org chart, so who's supervising whom and how all of the different departments relate to each other. The third reason is that it also helps identify the requirements for certification. So if you've got a requirement for the technical director to have a PE license, then we would want to see if that is the case: is there a technical director identified on the org chart, and, if so, does that technical director maintain a PE license?

Kim Swanson:

I was asking because I can see why. Part of my job sometimes is the paperwork, I mean, not this type of paperwork, but if I don't understand the reason why it's important that something is up to date, that's moving to the bottom of my to-do list and I'll get to it when I get to it. So I think it's helpful for laboratories to know why it's important that the org chart is up to date, because it's not just a hoop you have to jump through. It's actually useful information, not only for accreditation but for their day-to-day laboratory practices.

Brian Johnson:

It is, and I would say there are elements of it where they're just not good at the paperwork part. Right? Like, sometimes people are terrible with the software that you have to use to make an org chart, you know, something as simple as drawing the lines connecting the different boxes of staff members: how do I organize it, our organization is really complicated, how do we make it look like this makes sense? One thing I don't love is that sometimes people will have one, and then they'll make us a special one to make it easier, or what they think is easier, to show conformance. And sometimes that means, and this is not allowed, not that we would always know, this is the hard part.

Brian Johnson:

Sometimes we wouldn't know if you're doing this, but what can happen is sometimes they will remove people from the org chart who aren't certified.

Brian Johnson:

So that we don't go asking about whether that person is certified or not, and you can't do that. You are basically hiding someone, or not disclosing information that you need to disclose. There are other issues that occur too, like they don't know who is in charge of the technician. So you might have a complicated situation where, well, it depends on the project: for certain projects, this engineer is in charge of what they do, you know, the quality of the work that they do, and on another project it might be somebody else, and you've got to figure out how you want to structure that. Maybe you have a box of supervisors and then they all go down to the technicians. That would be fine. That's one thing I want to make sure people understand, too: it's okay for us to ask you questions like, okay, well, how would this work? Or can you explain this in more detail?

Brian Johnson:

In some cases, you may have to add some kind of note to your org chart, or maybe you just have to have a document that supports it, to kind of include those little details. Or sometimes it's just going to be an explanation that you give us every time, and you might get tired of doing that. But if you do get tired of doing that, you can just document it. Or maybe we do just need to ask some questions.

Brian Johnson:

But where we've seen some real problems occur is where you've got a quality manager that's not really connected to the rest of the organization, and that is going to lead you down a path that is not quality oriented, and I would not like to see that. Ultimately, quality needs to be in line with the operations and the performance at that laboratory. So you never want a situation where, oh, we just have this quality person that kind of checks in here and there, gives some feedback, and then they're not really involved other than that. They need to be able to enact change, because if you are focused solely on productivity without quality, then you cannot maintain the quality mindset that you need in order to continually improve and, ultimately, to maintain your accreditation.

Kim Swanson:

Yeah, that definitely makes sense. Thank you for clarifying all that for me and our listeners, not just me. I hope our listeners got something out of that, because my day-to-day is not going to change, but hopefully this prompts somebody in a laboratory to take meaningful action.

Brian Johnson:

Yeah, yeah, you'd hope so, but if you have any other questions about that, you can talk to any of your quality analysts, or me or Amy, or even Kim.

Kim Swanson:

Apparently. I will say, you mentioned how people don't like connecting the boxes and lines together, and you can change the orientation to be portrait instead of landscape in PowerPoint as well, so you can do it. I'm sure there are YouTube tutorials on that as well if you have questions about it, but I find the easiest way to make org charts for me and my needs is in PowerPoint.

Brian Johnson:

All right. Next, we're going to get into management reviews, which is a common issue, but not as common as I expected. I think it's just because each lab only has one management review, right? So you're only going to have one or two findings on this one, whereas you could have a multitude on calibrations or training or copy files. So anyway, 218 just didn't present management reviews, which is not at all surprising to me, because people still struggle to figure out what the management review is. And in light of that, now, Kim, you have experienced some management reviews. You've participated in them, you have read them. So, what do you think the biggest misconception is about what a management review is?

Kim Swanson:

Well, I will say, from all of the webinars I've been a part of with Tracy Barnhart, our quality manager, talking about this, one of the big takeaways for me is that a management review is an input into an internal audit. They both look at the big picture, but they're different, as in, the management review is an input to an internal audit. But I think you asked what the biggest misconceptions were about it. I will say the frequency. I don't know, I don't feel strongly about that answer, but I'm going to say frequency.

Brian Johnson:

Okay, that's part of it, and I wouldn't say the frequency necessarily, but the when: when do we do it in relation to the internal audit?

Kim Swanson:

Oh, okay, so that's...

Brian Johnson:

So that is part of what people struggle with. Like, okay, we scheduled our internal audit and our management review on the same day. That's not going to work, because those are two totally separate things.

Kim Swanson:

Well, one is the input to the other, so you can't really have an internal audit before your management review, because that would make for an incomplete internal audit, would it not?

Brian Johnson:

Maybe it's a sampling.

Kim Swanson:

I mean, you can talk about the last one, right? Oh, yeah, I guess that's true.

Brian Johnson:

So they're both inputs to each other, in a way. When we're having a management review, we will talk about the internal audit results, the last internal audit results: how did we do on that, and how did we do on our improvement opportunities? You can talk about all sorts of things related to the last internal audit. And during your internal audit, you're going to check to see if you conducted your management review last year and whether you resolved the nonconformities from it. So you do talk about each in relation to the other, and that part is kind of interesting. But with the management review, I think a lot of times people just confuse it with the internal audit, and this is an area where I would say R18 is partially to blame, because the wording used to describe the management review makes it sound like it might be some kind of audit, and it really isn't.

Brian Johnson:

It's more of, as Tracy I think aptly describes it, kind of a State of the Union address for your laboratory. You take all this stuff, quality-related issues and otherwise (it could be productivity, acquisitions, facilities, all kinds of different things), and you present it to top management. It gives them an opportunity to ask you questions, to give you the go-ahead on making some changes or spending some money or using resources you didn't know you had, and it closes the loop between your laboratory quality and operations and top management. If you're wondering why that matters: if you don't do that, you're going to have some serious issues keeping up with everything.

Brian Johnson:

And we have seen that laboratories that don't do a good job with management reviews often can't keep up with their expectations, because they're not allowed to spend money on anything. And it might not even be that they're not allowed to; it's that management was never informed. So it gets back to the communication we talked about at the very beginning of this episode. If the top management that has to make the decision to allocate resources isn't informed that resources are needed, they're never going to come to you and say, hey, I've got a great idea, why don't you blow a bunch of money on this stuff I don't know anything about?

Kim Swanson:

Yeah, no, that won't happen.

Brian Johnson:

Instead, they'll spend it where they do know it's needed, right? So it's really important that that communication loop is closed so that things can function properly.

Kim Swanson:

Yeah, and I do know, and correct me if I'm wrong, because my brain might be misconstruing things, but we do have a policy for new laboratories gaining accreditation that we don't expect them to have a management review or an internal audit right away. Am I correct on that? I feel like we had an episode on that, but can you explain it more?

Brian Johnson:

I can, yeah. We allow a six-month kind of leeway for them to operate. If they're a new laboratory and they're trying to get accredited, we don't expect them to have conducted a management review or an internal audit before they're even working, because they really haven't had anything to evaluate yet. So we give them some time to do that. They can get accredited, and then six months later we would expect them to send us their completed internal audit and to have conducted a management review. Now, they can do it anytime in that period, but they have to at least show us that they know what it is and that they've had a run-through at least once.

Kim Swanson:

Yeah, and that's definitely an instance of: don't perform them on the same day, and don't wait until the last day. Maybe do the management review at three months and then, at five months, do the internal audit, or something like that, to make it work appropriately, because it will be a red flag if you say you did them on the same day.

Brian Johnson:

Yeah, for sure. And I think it's okay; you can do them however you want, because the real value is going to reveal itself later on anyway. You're going to see it in the subsequent ones, the next year, the next two years, the next three years and so on, and things should change over time. Don't just make this a static activity. Think about how you're doing it, and the management review is a good time to think about how you're doing it and ask around: Is this working for us? What else can we do? Does this make sense? Are we doing what we need to do? Does this follow our vision and mission? Those kinds of questions are great if you've got them.

Kim Swanson:

What's the next thing on the list? That was management reviews. Internal audits is the last one, right?

Brian Johnson:

Yeah, internal audits was the last one, and that's your basic expectation: people haven't been doing them, or they're missing some details. Now, for R18, this is an area that I think needs some improvement. It says that the entire QMS needs to be audited, and that wording is going to lead to some inconsistency in how that is carried out and how that is assessed. So one thing that I would like to do with an upcoming R18 ballot is to actually list out what needs to be included: this, this, and this. I think the list we talked about today is a good start for what needs to be included. And actually, recently somebody reached out to me and said, hey, I was written up because something was missing in my internal audit, and my immediate thought was, I don't know that I would have looked for that as an assessor. So I know there are some improvements we can make in the consistency of the way we're looking at this. That's something that will be in an upcoming R18 ballot. It was in a past one that didn't make it through, but for reasons not related to that exactly. One thing I did want to talk about with internal audits.

Brian Johnson:

Some people who feel that they don't have time to perform an internal audit will try to hire a consultant to do it. And they ask, is that okay? And it's like, well, probably not, but it depends. I'm not going to say a complete no on that, and I'll use Tracy as an example. Let's say she retired and said, you know what, I'm going to be a consultant now.

Brian Johnson:

And we say, oh boy, it would be really great, because she's such a great internal auditor and she knows our system inside and out, if we could hire her as a consultant to carry out those internal audits. Can we do that? I would say absolutely, that would be a great option. But if we hired an external auditor that we see once a year to perform an internal audit, I would say no, that's probably not appropriate, because that person is not going to have the knowledge required to carry out an effective internal audit. So, like a lot of things, there's no black-and-white answer on that; it's just going to depend on the situation. But I would say, if you're not sure, don't do it.

Kim Swanson:

Channeling my inner Tracy Barnhart: nobody knows your business better than you. That's the value of internal audits. And to challenge you, Brian, if you said you would hire Tracy as a consultant, I think that's only going to work for a couple of years, because our processes will change. So that's not a long-term solution; that's a temporary band-aid, because in five years she's not going to be as familiar as she is right after retiring.

Brian Johnson:

So yeah, and she's not going to want to do it anymore.

Kim Swanson:

No, she won't. I mean, I know that for sure.

Brian Johnson:

Yeah, but in the made-up scenario of that, right. So, like I said, it's not a static thing. You've got to constantly look at it and think about quality improvements. Other topics that have come up: not being enrolled in proficiency samples, not preparing test reports in accordance with the standards, and sometimes people not having the up-to-date standards. That happens. Actually, that last one I thought would have been a much higher number, but it was only 44 instances, which is not bad at all.

Brian Johnson:

I think the subscription services have helped a lot in that regard, in that more people are going to them, so if they need something, they can easily pull it up.

Kim Swanson:

And I will say AASHTO standards are only available digitally now, so there are no hard copies. Just an FYI, in case people didn't realize that.

Brian Johnson:

Right, yeah, that is good information, because some people still call it "the books." And then, this may surprise people: in case you're wondering how many times falsified records showed up, only 38. I say "only," but 38 should be zero, right? It's not, though. Human nature is what it is, and there were 38 instances of falsified records documented in those findings over the course of two years.

Kim Swanson:

And I will say, you said that's over two years, but that's across, you know, over 1,500 laboratories; I'm going to assume under 2,000.

Brian Johnson:

Yeah, probably.

Kim Swanson:

So percentage-wise that's not horrible. But yeah, obviously, in the continual improvement area, with integrity and trust and all of that, we would like it to be zero. But you're right about human nature; there will always be shady people doing shady things.

Brian Johnson:

There will. And I will say we only write up falsification if there's really no question about it. There are other times where there might be some wiggly wording used by the assessors, like "did not appear to," which was kind of inconclusive about the falsification. So there were probably more cases of it than just 38. But that's another thing to watch out for on those internal audits. Pay attention: hmm, why does this look like this? Why are these numbers so similar all the time? Keep an eye on that. And, as Kim said, you know your business better than anybody. You know your personnel better than anybody.

Brian Johnson:

You are more than likely going to be able to figure out what's going on a lot better than an external auditor would.

Kim Swanson:

Yeah, that was Tracy. I was channeling my inner Tracy; I'm not taking credit. That is all our quality manager, Tracy Barnhart. But that does bring up that your internal audits prepare you for the external audits, for the assessment. There shouldn't really be any surprises in the assessment, because you are doing the internal audit. You should be aware of those issues ahead of time and be trying to fix them, or be in the process of doing corrective actions for them, based on your internal audit.

Brian Johnson:

That's right. Yeah, if you do a good internal audit, you should really not have any problem. And definitely don't be afraid of the external audit. They're not going to find anything that you haven't already found if you're doing a good job.

Kim Swanson:

The internal audits for laboratories don't go over the specific test procedures, right? They don't do a demonstration of that, or do they? I can't recall.

Brian Johnson:

They don't. But what they would do instead is look at the competency evaluations and the training records, and if they look at their training program, they should be made aware of any need for additional training before the actual external assessment. But yeah, things can come up. I think a lot of times with the test demonstrations, people are running the wrong version of a standard. Maybe they have an old version they were trained on, and it's sometimes just hard to break old habits. Or maybe they're using a state method or a Corps of Engineers standard or something else. Those kinds of things can happen, and they're relatively minor issues. But it is possible to have a relatively clean report if you're keeping up with everything; we've seen it time and time again.

Kim Swanson:

Anything else we're going to talk about today? Because this is turning out to be a very long episode, I think.

Brian Johnson:

Yeah, sorry about going so long on some of those things, but through the magic of editing I'm sure you will be able to cut out some of the waste.

Kim Swanson:

We'll see. We'll see how that turns out.

Brian Johnson:

Thanks for hanging in there. I'm sure it's still going to be long, even if you cut it, but thank you for listening and checking us out on YouTube if you did that, and hopefully we'll see you on the next one.

Kim Swanson:

Thanks for listening to AASHTO Resource Q&A. If you'd like to be a guest or just submit a question, send us an email at podcast@aashtoresource.org, or call Brian at 240-436-4820. For other news and content, visit aashtoresource.org.