AASHTO re:source Q & A Podcast

June PSP Insights

AASHTO re:source Season 4, Episode 6

John, Ryan, and Joe are back again, this time to discuss the most recent Viscosity Graded Asphalt Cement and Performance Graded Asphalt Binder proficiency sample rounds and results. We also dive into why the samples are shipped twice a year and what happens to all of the optional data that is collected. 


Have questions, comments, or want to be a guest on an upcoming episode? Email podcast@aashtoresource.org.

Related information on this and other episodes can be found at aashtoresource.org.

AASHTO re:source Q&A Podcast Transcript

Season 4, Episode 06: June PSP Insights 

Recorded: June 22, 2023

Released: July 4, 2023 

Host: Kim Swanson, Communications Manager, AASHTO re:source 

Guests: John Malusky, Proficiency Sample Program Manager; Ryan LaQuay, Laboratory and Testing Manager; Joe Williams, Senior Quality Analyst, AASHTO re:source

Note: Please reference AASHTO re:source and AASHTO Accreditation Program policies and procedures online for official guidance on this, and other topics. 

Transcription is auto-generated. 

[Theme music fades in.] 

00:00:02 ANNOUNCER: Welcome to AASHTO resource Q & A. We're taking time to discuss construction materials testing and inspection with people in the know.  From exploring testing problems and solutions to laboratory best practices and quality management, we're covering topics important to you. 

00:00:20 KIM: Welcome to AASHTO re:source Q&A. I'm Kim Swanson. We are without Brian Johnson today, but we are going to soldier on. We're talking today about the BAC and PGB samples and the proficiency sample program, so with us today we have John Malusky, the Manager of the Proficiency Sample Program.

00:00:40 JOHN: Good morning, Kim.

00:00:42 KIM: We also have Ryan LaQuay, the Manager for Laboratory Testing...

00:00:49 RYAN: I'll take it. Laboratory Testing Manager.

00:00:51 KIM: No, I had it. I had it and then I messed it up. So, sorry about that, Ryan. I'm no better than Brian in that regard, but thank you for being here. And we also have Joe Williams, Senior Quality Analyst with the AASHTO Accreditation Program.

00:01:09 JOE: I manage nothing.

00:01:12 KIM: It's OK you have a shorter title, so I was able to recall that. So again, my apologies, Ryan. So, let's dive into the insights. Let's start with BAC samples. What can you share? Anything different with the sample?

00:01:26 JOHN: So yeah, this sample actually kind of surprised us a little bit. I'm going to go ahead and defer to Ryan right away. You can kind of talk about what we saw in the homogeneity and stability for this round. Let 'er roll.

00:01:37 RYAN: So, whenever we have a sample, we do in-house testing before, you know, we ship it out to everybody. We actually caught one of our samples as being an outlier. So we went back and checked what we thought were, you know, other samples in that general area. Those ones passed through. So, I thought maybe it might be just a one-off. Turns out that wasn't the case. We ended up having more of an issue than expected, but we did see it coming, and it was more impactful than we thought it was going to be.

00:02:03 KIM: What was the actual issue that you noticed?

00:02:06 JOHN: To give a little bit more background, we try to order the liquid asphalt about 8 months in advance of when we're going to use it. We order about 20 to 25 drums, somewhere in that range, based on what we need for participation, and then the proficiency sample program crew will package it into the quart cans, pint cans, gallon cans, you know, basically everything in advance for any of the samples that we need. Well, while they're going through those processes, Ryan and I create basically like an array table of where to pull the samples from, to make sure that when we're checking these samples for homogeneity we cover every single round or scope or scheme, however you want to call it. So, we went ahead and did that, and the crew poured the material in December for basically all of 2023, with the exception of the PGB samples. Since we're using a modified product for that, those are considered separate, and we order that material in a different lot.

00:03:02 JOHN: Yeah, this one was just kind of a little bit weird. We did our normal protocol and we saw that one anomaly, and Ryan ran a Q test to remove it. It failed the Q test, so we kicked it out. And I just want to say, when we pulled our numbers, we went back to a similar time and date when more of those cans in that area of the pour would have been tested, and we pulled another can, and it passed. So, we thought it may have been something related to our testing.

00:03:31 RYAN: Another two cans, even. But yeah, so when I ran through the testing, all of our samples had a specific gravity of 1.029 or 1.030, except for this one sample that had a gravity of 1.034, which in the scheme of things doesn't seem significant, but it is statistically significant enough that it was an outlier for us.
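For listeners who want to see what that kind of check looks like in practice, here is a minimal sketch of Dixon's Q test applied to specific gravity results like the ones Ryan describes. The exact statistic, software, and confidence level used in-house aren't stated in the episode, so the 95% critical values and the numbers below are illustrative.

# Minimal Dixon's Q test sketch (illustrative, not the in-house procedure).
Q_CRIT_95 = {3: 0.970, 4: 0.829, 5: 0.710, 6: 0.625, 7: 0.568, 8: 0.526}

def dixon_q_test(values):
    """Return (Q, outlier_flag) for the most extreme value in a small data set."""
    data = sorted(values)
    data_range = data[-1] - data[0]
    gap_high = data[-1] - data[-2]   # suspect point at the high end
    gap_low = data[1] - data[0]      # suspect point at the low end
    q = max(gap_high, gap_low) / data_range
    return q, q > Q_CRIT_95[len(data)]

# Illustrative specific gravities: most cans near 1.029-1.030, one can at 1.034.
gravities = [1.029, 1.030, 1.029, 1.030, 1.029, 1.030, 1.034]
print(dixon_q_test(gravities))   # Q is about 0.8, flagged as an outlier at 95%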

00:03:50 KIM: What were you seeing in the data that you were getting back from participants?

00:03:56 JOHN: Basically, we didn't see too much of an issue with anything that wasn't heavily related to viscosity. Yes, there was a slight variation between the averages of sample A and sample B, or 271 and 272, but the metrics were still applicable, and when we looked at the data analysis and ran our verification for those types of test properties, the sample checked out fine. We didn't see any issue; the data was still normally distributed. But when we looked at things like kinematic and absolute viscosity, that's where we saw the issue. There was a major difference in the averages for viscosity. The kinematic average for 271 was around 450, and the same value for sample 272 in kinematic viscosity was 524, roughly 75 centistokes difference between the two, and that was enough to show a variation between the two samples. The kind of weird part is that...

00:04:59 JOHN: With the random selection of samples, we must have hit dead in the middle when that can was chosen, because it appears that all of 271 was the same, and all of 272 except for like 20 cans. So yeah, 20 cans of 272 were identical to 271. It actually probably would have worked out fine if all of the cans from 272 had been, I don't want to call it the deviation, but the other material or whatever happened with it. So, we're not quite sure how it worked out, but it was enough of a difference that we saw a bimodal distribution for the kinematic and absolute viscosity tests and we had to suppress it.
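To illustrate what a bimodal round like that can look like in the data, here is a small sketch that simulates a kinematic viscosity round where a handful of cans labeled 272 actually contain the 271 material, then fits a two-component mixture to flag the two modes. This is only an illustration of the idea, not the actual analysis routine AASHTO re:source uses, and the counts and viscosities are rough numbers taken from the discussion.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Simulated sample 272 results: most labs got the ~524 cSt material, but
# roughly 20 cans actually held the ~450 cSt material sent out as 271.
results = np.concatenate([
    rng.normal(524, 10, size=200),
    rng.normal(450, 10, size=20),
])

# Fit a two-component Gaussian mixture; widely separated component means
# relative to the overall spread are one way to flag a two-material round.
gm = GaussianMixture(n_components=2, random_state=0).fit(results.reshape(-1, 1))
means = np.sort(gm.means_.ravel())
print(f"component means ~ {means[0]:.0f} and {means[1]:.0f} cSt, "
      f"overall 1S ~ {results.std():.0f} cSt")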

00:05:44 KIM: Going over to the accreditation side. So, you suppress the ratings. What have you seen on that side of things, Joe?

00:05:50 JOE: Yeah, that's actually interesting, because we did have a situation come up with that, and it was interesting to hear John talk about it. We had a laboratory that was actually previously suspended for T 201, and when a laboratory is suspended, we then take a look at their ratings on the following round to see how they do, and that rating was suppressed. We are going to reinstate that laboratory based on just the 271 data. We're basically just going to plug in what they got, what the average was, and the standard deviation for 271. But since the 272 data, as John mentioned, is sort of wacky, we're not paying any attention to that. That was just a one-off case for us. Suppressed ratings don't have anything to do with previous ratings, so if a laboratory received low ratings or didn't submit data on the previous sample, a suppressed rating won't impact them this time around.

00:06:51 JOE: In fact, for BAC in total, we only had one laboratory suspension for a test standard, and it wasn't T 201 or T 202, so there's not a huge impact there. But there was one interesting case where we did have to take a closer look at it for a lab.

00:07:06 KIM: That is interesting. So, for Ryan and John, what are your kind of lessons learned around this situation or what are you doing to kind of minimize the chances of this happening in the future?

00:07:16 JOHN: We're kind of at a loss as to what really happened, you know. It would be different if we saw the homogeneity issue throughout all of the samples that we tested, but it was one sample out of a specific portion of one of those 25 drums that we tested. And it wasn't like it was, you know, at the beginning of our homogeneity pour or at the very end. It was about, if I remember, Ryan, roughly two-thirds of the way through, or in the middle. So, we had some material in advance that tested out great, then this one sample, and then the materials that were tested out thereafter were all right in line. So we, like I said, thought it might have been on us, on our own end, as far as a testing error. That's why we Q tested it, found out it was a variant, threw the can out, and sampled, I think, Ryan, you said you sampled two more. And we received a satisfactory rating. So, we thought it just might have been a testing situation on our end.

00:08:12 JOHN: You know, we don't know if it was an issue with something coming from the supplier, we obviously can't pinpoint it, or if it was an issue with our sample handling when it got here. But it was just interesting that it was such an isolated incident, and it appeared to only affect about 70 to 100 cans. So just something so small and weird. I mean, I think the proficiency sample crew poured, I'm going to probably say, 2,000 to 3,000 cans. So for this one can to pop up, and, you know, with the amount of other tests that we ran on it, we're surprised that the BAC results came out like this. But at least we tried to do our best to address it, so there was no effect on the participants and they didn't see any effect on their accreditation.

00:08:54 KIM: How many labs are accredited for viscosity graded asphalt cement tests? So we can kind of get an idea of the scope of the sample.

00:09:03 JOE: I don't know. I don't know, Kim.

00:09:06 KIM: Thought we just looked that up, did we not? 

00:09:08 JOE: We looked at accredited labs enrolled in the program, accredited for any of the tests, and that was 247... we said, OK, 246. But for kinematic and absolute...

00:09:12 KIM: Any of the tests?

00:09:15 RYAN: 246 yeah.

00:09:21 KIM: I didn't mean to put you on the spot. I thought we had discussed that, so I...

00:09:23 JOHN: Ohh you totally put him on the spot.

00:09:25 KIM: Did I? I thought it was something we already discussed.

00:09:27 RYAN: It's like Brian's here.

00:09:29 JOHN: Harnessing your inner Brian, let's ask questions that no one's even reasonably prepared to answer, or some number that's so specific.

00:09:38 JOE: So, for either designation, AASHTO or ASTM, so that's T 201, T 202, D2170, or D2171, there are 84 laboratories accredited in our program for one of those test standards.

00:09:54 KIM: You said there was 260 something.

00:09:56 JOE: 246. 246 laboratories accredited that would require participation in BAC.

00:10:04 KIM: It's really not that big of a group of people or laboratories that were impacted by this in the grand scheme of things.

00:10:11 JOE: Not necessarily not impacted, because, you know, they might still look at their data and say, you know, why are these values so different? Even though there are two samples, they're usually pretty close, so there may be some impact as far as a questioning of their data. But as far as an accreditation impact, no, there's nobody impacted.

00:10:34 JOHN: The other interesting thing, and I had a little bit of a talk with Brian a few weeks ago about this: one thing that was really different with the viscosity tests was the standard deviation for both kinematic and absolute. Traditionally we see a smaller standard deviation, usually single digits, maybe double digits like low teens, but I think these were in the 20s and 30s, so the 1S value was at least 2 to 2 1/2 times larger than what you usually see, and that was for both 271 and 272. It wasn't like it was isolated to just one of the sides in the pair. So that's, you know, one of the weird situations that took place. I mean, maybe that was born from those, you know, 10 or 12 or 15 cans or however many there were where the samples were identical, so maybe that's where that came from. But it was just interesting to see how just that little bit made such a difference with the variation in the two materials.

00:11:31 KIM: Are there any other interesting or anomalous kinds of things happening with the samples or the analysis of the samples?

00:11:40 JOHN: I think for BAC, for viscosity graded, for this round, that was pretty much it. The rest of the round seemed to be very normal, even though it's apparent that we had material with a slight variation between the two. I don't know how relevant this is, but I spent a week at the National Transportation Product Evaluation Program meeting through AASHTO, and I sat in on a session about binder suppliers. There was a discussion about how the material is produced and manufactured, and it gave me a little bit more insight on the process control that those big facilities have to navigate and manage. So it was good to hear that, and it gave me a little bit more insight about how we need to be diligent when we order material and when we process material, and, you know, do our best to try to ensure that we're sending those homogenized materials out the door. So it was definitely interesting.

00:12:36 JOHN: There are a lot more hurdles than one would think when it comes to coordinating the processing and preparation of materials like that, in the grand scheme of things, for a refinery or terminal. So, definitely interesting, but I could see how some of these situations that we encounter can occur just because of the mass and scale of the facilities and what they're doing there. But like I said, we're going to do our best to try to minimize any kind of issue like this, make sure that we're blending appropriately, and mixing and stirring and heating, and trying to be as diligent as we can. Hopefully this is one of those things that doesn't happen again for years, unless we decide to design something like that on purpose, and then we'll design it so it's effective.

00:13:19 KIM: Yeah, not just trying to throw curveballs at laboratories for no reason.

00:13:23 JOHN: Yeah, we definitely don't do that, no.

00:13:25 RYAN: So, I just noticed this, because we also include the specific gravity numbers that we got in our in-house testing. John, did you happen to know what those numbers were?

00:13:33 JOHN: I did not.

00:13:34 RYAN: For 272, the grand average is 1.034. The outlier would have matched up. All the rest of our results were lower, as is tradition with our in-house testing.

00:13:45 JOHN: Our unknown bias? Yeah.

00:13:47 RYAN: For the past seven years now, we have been below average on our gravity testing. We don't know why. We've tried every possible way of looking into it: checking our equipment, checking our methodology, checking our environmental controls. We can't figure out why we're below average, but we...

00:14:04 JOHN: And it's not just one person. It's Ryan, myself, and one of the other quality analysts who does asphalt testing. So, the three of us are consistently low. We even thought it might have been a spreadsheet issue when we were doing our calcs. We have checked everything, but...

00:14:16 RYAN: Different pycnometers, different water, different rooms for cooling, different ovens for heating, everything.

00:14:22 JOHN: We have a bias to be below the grand average as it relates to the data, so it's interesting.

00:14:28 KIM: So, I have a question. The topic is adjacent, but John, why is viscosity graded asphalt cement labeled BAC in our things? Because the rest of the acronyms and letters kind of make sense with the names of the samples. But what's up with BAC for viscosity graded asphalt cement?

00:14:47 JOHN: So this is kind of just, I guess, old habits die hard for the proficiency sample crew. The sample was originally called bituminous asphalt cement. It wasn't set to be a specific grade, and then I believe once the viscosity test got in, we changed the name to viscosity graded asphalt but never changed the acronym. So it's something that we could probably fix extremely easily. And I think the crew even labels the cans viscosity graded, the label says viscosity graded, but the other stuff says bituminous, and bituminous is not the preferred language anymore. It should be asphalt. So at some point we'll get around to it. That's probably low-hanging fruit that we should really take care of, I think.

00:15:34 KIM: As I'm looking at it, like, I can make that change on the website, but, you know, that's only half of that change right now, but...

00:15:40 JOHN: I think you can take care of most of that, Kim, thanks.

00:15:43 KIM: I probably can. I'm just always going, like, why is this? Like, I think it's always been, since I've been here, viscosity graded asphalt cement, but yet the acronym was BAC, and I was confused from the beginning and I just never asked why.

00:16:00 JOHN: So see, there you go. You have the power.

00:16:03 KIM: Well, I mean, not all of the power. I have a lot of that power, but not all.

00:16:06 JOHN: Like 95% of the power.

00:16:08 KIM: Yes, I can't act on that power, though, without approval. I can't just go make changes.

00:16:14 JOE: Pretty sure you can if the name's already been changed.

00:16:18 KIM: We'll see. Our listeners to this will be like, oh, go to the website, see if Kim made that change. Is there anything else that I didn't ask that Brian would have asked about the viscosity graded asphalt cement samples, and then...

00:16:31 RYAN: Yeah, let's not go down that rabbit hole.

00:16:32 KIM: We'll move on to the performance graded asphalt binder samples, or PGB samples, and the most recent ones were again 271 and 272. So what was going on there? Anything new or interesting with the PGB samples this time, John?

00:16:49 JOHN: So, this year was our first year where we kind of changed the material a little bit. We had been, out of tradition, if you want to call it that, sending laboratories unmodified products, you know, typically PG 64s or PG 70s, for every spring sample, and every fall was the polymer modified material. Well, we've decided to go away from that traditional route and we are going to just start sending different kinds of material for every round, so we won't specifically identify that the spring is unmodified and the fall is modified. We are going to just send various performance grades, you know, each time, whether they're polymer modified or PPA modified or unmodified. So it's our first time doing that. Obviously, nothing really changed substantially other than just that protocol of us having to order a modified product in the late fall for pouring in the spring. But as it relates to the round, very, very normal.

00:17:49 JOHN: All standard round averages were consistent. Standard deviations were consistent with previous rounds, so we didn't see any kind of weird anomaly. It was a nice, clean round; analysis went extremely well. You know, we had our typical MSCR issue, which we see basically every round that we do this. We suppressed line item number 18 in the round, which was the percent difference in non-recoverable creep compliance, or the difference in Jnr, due to a bimodal distribution. I'm actually working through a little bit of a back-end analysis now to determine what the issue was. The consistent issue has been variation between the DSR manufacturers and software. So, we'll see what happens as I navigate through that sidebar analysis.

00:18:38 KIM: Joe, are there any interesting accreditation insights?

00:18:42 JOE: This is one of our lighter sample types as far as accreditation action goes. Really, for PGB and BAC we have to look at them in conjunction, because so many of those tests overlap between the two samples. We only had 14 total suspensions out of those 246 labs that are accredited in that sample, so only about 6% of labs, and really there's nothing interesting with this one. Again, it's one of our lighter samples as far as accreditation goes, and this one seemed pretty normal.

00:19:11 KIM: Thank you for that. Ryan, are there any insights? You're always good with some random and interesting facts.

00:19:18 RYAN: This one was a breath of fresh air after the BAC issue. So I'm taking this one as a win.

00:19:24 KIM: I don't blame you there. You did mention that these are both sample rounds that we send out twice a year, and that is one of the reasons why the sample round numbers are so high. They're, you know, in the 270s, because that means we've sent out 271 and 272 samples to participants. But John, why do we send out the PGB and BAC samples twice a year when the rest of them are just once a year?

00:19:48 JOHN: So when we started this program, you know, roughly 25 years ago, might even be longer than that, the rationale behind it was actually an industry push. The industry decided that it was of most importance to test these samples twice a year rather than once a year like every other round. Just kind of a recommendation, and, you know, we don't seem to have any issues with it. I don't believe we've ever been asked to only send them once a year. What we do get comments on with this sample round, specifically PGB and BAC, is the grade of the material. We receive a lot of comments about, you know, this material is way too modified, or unmodified, or what are you doing to us with this temperature, you know, this isn't anything I test in my state. So it ends up being, like I said, a very common comment that we get from the laboratories, and we need to try to do our best to make sure that we're sending a variation in material, you know, not that the materials have a lot of variability, but we want to send laboratories PG 76s, PG 58s, you know, going into the MSCR grades, the S, H, V, and E designations.

00:20:51 JOHN: We want to make sure that our laboratories are proficient across the board when materials come into the laboratory. I just want to make it a point that these samples need to be treated just like anything else, any other sample that comes into your laboratory; test it like it's one of those samples. It shouldn't be, quote unquote, special because it's a proficiency sample. I think that there are times, and I think Joe and Ryan can even comment, when, you know, we're in a laboratory doing assessments and there's additional pressure put on the staff to try to test it in a, quote unquote, better or different way. That anxiety causes tension with the technicians and may actually promote errors. You know, when it comes to this, we're talking about continual improvement here. You know, the accreditation program has the rules in place so that if there is an issue, you've got ample time to resolve it, and there are many ways that accreditation can be reinstated. So, you know, we understand the importance of it, but I think...

00:21:49 JOHN: There's a moment where participants need to take a step back and recognize that it's an overall process, and this is one of the ways to ensure that the quality is continuing in your lab. So have your technicians just test it like it's anything else. Don't put any extra pressure on them. Let them roll and see what happens, and then make the adjustments if you need to. More than likely, the statistics, basically the way our analysis works, one out of every 20 times your lab tests, it's going to be a low rating just because of the nature of stats. So, you know, we don't need to get bent out of shape. We don't need to get worked up. Just test it.

00:22:21 JOHN: Run it, relax, and see what happens. Not a big deal.
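To put a rough number on that one-in-20 point: when round data are close to normally distributed, about 5% of results land more than two standard deviations from the grand average even when every lab is testing properly. The actual AASHTO re:source rating scale and cutoffs aren't spelled out in the episode, so the two-sigma flag in this short sketch is purely illustrative.

import numpy as np

rng = np.random.default_rng(0)

# Simulate one test property for 246 labs that are all testing correctly.
grand_average = 450.0   # e.g., a kinematic viscosity in centistokes (illustrative)
one_sigma = 12.0        # illustrative 1S value for the round
results = rng.normal(grand_average, one_sigma, size=246)

# Standardize each lab's result against the round statistics.
z_scores = (results - grand_average) / one_sigma

# Treat anything beyond +/- 2 sigma as a "low rating" for this sketch.
flagged = np.abs(z_scores) > 2
print(f"{flagged.mean():.1%} of labs flagged")   # roughly 5% over many rounds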

00:22:25 KIM: That's a great point. Thank you, John. Again, I think it's important for laboratories, and especially accredited laboratories, to know that the proficiency sample program is just one of the methods to evaluate competency, you know, in your laboratory. So it is a continual improvement kind of thing, and I think it was in our last discussion on soils that it came up: if you just submit data, you're probably OK. Just don't not submit data for it. So good advice, John, to just treat it like any other sample. It may be different results than you're expecting because you're not familiar with that material, but everyone's getting a similar sample, so the results should be similar.

00:23:05 RYAN: I'll add two caveats to the treat-it-as-any-other-sample advice. One, read the instructions first.

00:23:11 RYAN: There might be something in there that isn't just the way you normally do it, but it's what we want you to do. Second, because I've gotten calls on this and have had people ask about this: these are not state methods. Don't run your state method. You will get bad scores.

00:23:25 KIM: Those are good suggestions. Thank you for those reminders.

00:23:27 KIM: Joe, do you have any tips?

00:23:28 JOE: One of my insights, especially recently, not with PGB or BAC, but with soil as the big one: keep in mind that we change the sample program. Sometimes we add tests to them, whether our DOTs want to see the data, or we get requests from other agencies, like ASTM, for our data to be used for precision and bias statements, and also for the AASHTO test standards. So we add test standards, and we always notify our accredited laboratories that, hey, you know, this standard you're accredited for is being included in this sample. So definitely pay attention to that as well. Sometimes there will be additional tests in there, and as Ryan said, read the instructions. Make sure you're looking at, oh, they added this test. Are we accredited for that? OK, we need to submit data for that now. That's not specific to PGB or BAC; soil is the big one that we've added a few things to over the past couple of years, but it's just something to keep in mind.

00:24:30 KIM: So, I know I've said to people and customers numerous times that you only need to submit data for the tests that you're accredited for, for proficiency samples. Is there any harm, for the laboratory or for the program in general, if laboratories submit data for all of the tests, even the ones that they're not accredited for?

00:24:51 JOHN: Yeah, it's entirely up to the laboratory. I mean, the whole program, you know, the emphasis is essentially a good faith effort that you're going to put your best foot forward to conduct the testing appropriately in accordance with the standards. So, whether your laboratory is accredited or not accredited for a specific test method, if you want to try and test, go ahead. There's no reason you shouldn't. You should never feel obligated to only test what you're accredited for, but nor should you feel obligated to test everything. It's up to you, it's up to the participants. We have some laboratories who, you know, maybe are on the cusp of looking at a new project, and they may need another test put in. You know, hey, let's go check our competency. It might be some test method that they have done before, just not for five or ten years, because there were no contracts out there that required that test method. Well, now all of a sudden there's a bid coming up, and you can check to see how proficient your technician is. Go ahead and run it.

00:25:50 JOHN: It's not a big deal. I mean, the way that our analysis works, with the invalid and outlier routine, if the data is really bad, it's not going to be included in the analysis, so it won't matter. If anybody wants to take a shot, go for it. And this actually brings up a really good comment, too, that we receive all the time, and the comment is: are the laboratories who are not accredited causing the bias and making my result poor? That's probably our second or third most substantial comment that we get, and I think the answer to that is no. Like I said, one, the analysis will help to weed out any of those issues. A majority of these other laboratories who are not AASHTO accredited are accredited in some other capacity, whether it be for something like, you know, the Army Corps or through another North American accreditation body. Even a lot of our other international participants are accredited by a major accreditor, and they still need to perform the testing...

00:26:49 JOHN: ...in accordance with AASHTO or ASTM, and they have to receive satisfactory ratings. So, I don't believe there's any nefarious action going on there where people are intentionally trying to skew the data. At least I hope not.

00:27:03 KIM: Well, and you described some safeguards in the analysis process to, you know, protect against any of that skewing. So, I think those are good insights.

00:27:12 JOHN: Yeah, we try to check for collusion with every single line item for every single round. We will look for clustering of data, and if a cluster appears to be a lot of laboratories from the same area of the state or country, or the same company, we really hone in on that and take a deeper look and investigate prior to making any kind of decision. And then if we see something that appears to be a little suspect, we'll notify the accreditation program to take additional steps in the investigation.
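As a rough sketch of what that kind of screen could look like, the snippet below flags any company whose labs report results that sit far more tightly together than the round as a whole. The actual screening procedure isn't described in detail in the episode, and the data, company names, and threshold here are hypothetical; a flag like this would only be a prompt for a closer look, not evidence of collusion.

import pandas as pd

# Hypothetical round data: one result per lab, plus affiliation metadata.
df = pd.DataFrame({
    "lab_id": [101, 102, 103, 104, 105, 106],
    "company": ["A", "A", "A", "B", "C", "D"],
    "result": [451.2, 451.3, 451.2, 462.8, 447.5, 455.0],
})

round_sd = df["result"].std()

# Flag any company with three or more labs whose results are packed far
# tighter than the round as a whole.
for company, grp in df.groupby("company"):
    if len(grp) >= 3 and grp["result"].std() < 0.1 * round_sd:
        print(f"Company {company}: {len(grp)} labs, spread "
              f"{grp['result'].std():.2f} vs round SD {round_sd:.2f}")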

00:27:46 KIM: Oh, that's good to know, too. Joe, on the accreditation side, John was just kind of talking about, like, yeah, if you can run the test, then submit data for it, even if you're just thinking about it or just seeing how you do. Will that impact the accreditation of a laboratory who has maybe submitted data for a test that they are not currently accredited for, or weren't accredited for in the last two years, but they submitted data anyway, and now they want to get accredited? Does the accreditation program look at that data, or is it a clean slate? What's that process like?

00:28:22 JOE: It is a clean slate. The only thing that we look at, if they're seeking accreditation for a new method or new methods, is that they're enrolled in the appropriate proficiency sample program. If they've been submitting data and there are some low ratings, but they've never been accredited for that test before, we will not deny their accreditation based on that. It's not until a laboratory is accredited for a test that we start including their data in our accreditation review of their ratings, which again is every two years. A good example of that is hydrometer. A lot of labs are previously familiar with T 88 and D422. D422 is a withdrawn standard now and has been replaced with D7928. Both of those are included in the soil proficiency sample program. Some labs are just starting to pick up D7928 and are submitting data even though they're not accredited, and we don't take a look at that when they seek accreditation for that new standard.

00:29:28 KIM: That might be helpful, too, for labs to know that's not going to hurt their future chances of getting accredited, correct? But to circle back to the BAC and PGB samples, are there any upcoming changes to the samples or the tests included in those sample rounds, John, that our listeners can keep an eye out for?

00:29:46 JOHN: I don't believe so, Kim. I think the only thing is that we have been considering this for a while, and that's to eliminate some of those properties on the multiple stress creep recovery test. It's been an interesting decade, decade and a half, of testing, you know, having data and analyzing data for those materials. You know, we're trying to get to the bottom of why we see a bimodal distribution or even a trimodal distribution. We keep investigating the DSR manufacturers and checking things like software, and I just looked at it this morning in a little bit of preparation for this. You know, we had roughly 240 labs submit data, and out of those roughly 240, I found over 20 different versions of software, so it's really, really hard to pinpoint where any bias would have come from when you have that many versions of software in only, you know, roughly 250 labs. So you're seeing roughly a 10% variation in software use alone.

00:30:44 JOHN: Just when it comes to those participants, it's a little bit challenging to try to pinpoint the exact cause of the bimodal distribution. But I think we're at a point now, and if I remember correctly, Ryan, you can either support or reject my claim on this, but we talked to our oversight group and we had discussed removing this from the scheme for the following years. One other thing that I think we might look at is the bending beam rheometer, line items 22 through 27 in this report. We had been asked to evaluate the individual trial beams as well; the accreditation program only looks at the average values of the beams, but we were asked to collect the data for beam one and beam two in the pair. So this would be two beams per side, so two beams for sample 271 and two beams for sample 272, just to collect that data and see if there is any major variation. But I think now, after having this in for five or six years, we are not seeing any suspect or potential variation between the two individual tests on each side of the pair.

00:31:48 JOHN: So, I think we can probably cut out trial one and trial two and just allow the laboratories to still do the testing in accordance with the standard, test two beams, but then simply report the average, and it will take four line items away from this analysis.

00:32:03 KIM: I haven't looked at the data sheets for many samples in a long time, so what other kinds of information are you asking for, and what is that used for?

00:32:11 JOHN: All kinds of stuff.

00:32:12 JOHN: Oh my gosh.

00:32:14 RYAN: Pull up a datasheet and let's run down that. So for PGB, for the rolling thin film oven, we're asking for elevation above sea level, the barometric pressure at the time of testing, the air flow rate exiting the air jet, and testing time in the oven. For MSCR, we're looking for the DSR manufacturer, DSR model, and the MSCR software version. For the PAV, we're looking for PAV manufacturer, PAV model, and whether or not the residue was degassed. And for the bending beam rheometer, we're looking for what type of mold was used. So that's what we're doing for that.

00:32:44 JOHN: That's just an array of things we ask for: equipment manufacturer, different types of compaction effort, oven temperature, thermometer used, elevation, barometric pressure. And all of that data is basically stored right now, but it's stored for the purpose of when we attend an ASTM committee meeting or the AASHTO Committee on Materials and Pavements meeting. If there is an entity in that group who comes up to us and says, hey, we need some data on this, or can you add this to a sample round to collect some data, you know, we go ahead and throw it in there and then send them the analyzed data, and they'll look at the evaluation of a specific method.

00:33:26 KIM: And that's optional, right? There's no requirement that laboratories submit information for those optional things, correct?

00:33:34 RYAN: No. I was actually just going to ask about that. Whenever, John, you were talking about MSCR, how many people actually responded with their information?

00:33:41 JOHN: I believe almost 200 laboratories submitted the software information, so quite a few laboratories are forthcoming with that. But that's what we need; you know, we need that additional information to help us provide information to the states and ASTM to make revisions to the standards and pull in those precision estimates to make the standards more repeatable and more reproducible between facilities.

00:34:07 KIM: Even though it is optional to submit that data, it is really helpful for the industry at large and for improving the test methods when laboratories submit all of the information that they can. Anything else that we missed, or any other questions that I should have asked and didn't know to ask?

00:34:26 JOHN: I don't think so, Kim. I think this one went really well. We had two rounds; you know, PGB obviously was a normal round, and we saw some anomalies with BAC, but I believe that we did our best to adjust and course correct on our end and make sure that there was no impact on any of the laboratories' accreditation. You know, one thing that I think I'd like to do here, while we have a few minutes left, is just add a little reminder to the participants who are listening about what rounds you've got out right now and what's closing. So right now we've got the Marshall mix design, rounds 79 and 80; that's out and it's going to close on July 20th. We have our three Hveem designs, California compaction, Colorado compaction, and Texas compaction; all three of those rounds also close on July 20th. And then we've got our last mixture design sample, the gyratory, rounds 55 and 56, and that closes on July 27th. So kind of a little monthly, or month-out, reminder for all the participants that we've got some of those closing out.

00:35:32 KIM: Yeah. I'm envisioning a very busy July for you, John.

00:35:36 JOHN: Yeah, Ryan and I will be extremely busy for the last two days before we go on vacation, so...

00:35:45 KIM: I like it. Well, Ryan, John and Joe, thank you for joining us for another PSP Insights episode.

00:35:52 ALL: Thanks. It was a good one. 

[Theme music fades in.]   

00:36:01 ANNOUNCER: Thanks for listening to AASHTO re:source Q & A. If you'd like to be a guest or just submit a question, send us an email at podcast@aashtoresource.org or call Brian at 240-436-4820. For other news and related content, check out AASHTO re:source's social media accounts or go to aashtoresource.org.