X-ray inspection isn’t just for PCBs anymore. In this week’s Fireside Chat with the Xperts, Creative Electron’s own Carlos Valenzuela lights our fire with Autonomous X-Ray Inspection of Medical Devices. Carlos shares Creative Electron’s wealth of experience and insights into examining a broad range of medical device products.
From software assisted automation to fully automated inspection, this discussion explores the range of options to meet the manufacturing and quality assurance demands of medical device manufacturers. Access additional Fireside Chats here.
Dr. Bill Cardoso: So it’s 10:00. Good morning. Welcome to another edition of Fireside Chat with the Xperts. Our host today is Carlos Valenzuela. My colleague, Carlos Valenzuela is our VP of Engineering. He’s been with us for six years now, Carlos? Developing-
Carlos Valenzuela: No, eight.
Dr. Bill Cardoso: Eight years.
Carlos Valenzuela: Yeah.
Dr. Bill Cardoso: Time goes by too fast. So he’s been with us for eight years, developing a whole series of technologies, from actuators and how our machines move, all the way to the development of deep learning and machine learning algorithms for the inspection of our samples. So the conversation today is going to be on the Autonomous X-Ray Inspection of Medical Devices, which is an area that Carlos and his team have been covering for the past several years now. And if you have any questions as we go, you can either unmute yourself and ask, or you can type the question in the chat and I’ll ask Carlos throughout the presentation. All right. So on that note, Carlos, it’s all yours.
Carlos Valenzuela: Cool. All right, well, thanks everybody. Thanks Bill, for the intro. And thanks for taking the time. Now that a lot of us are working from home, it’s a good idea to educate ourselves about new technologies and what’s out there. So I’ll get started. Like Bill said, if you guys have a question, feel free to post it in the chat and I’ll try to answer it as briefly as I can. If not, we can take it off the chat and start a new conversation.
Carlos Valenzuela: Do I have control? It’s not clicking. Okay. All right. So I do have a couple of slides here on… Oh, it jumped.
Dr. Bill Cardoso: Yeah, hold on one sec.
Carlos Valenzuela: Yeah, you can have control.
Dr. Bill Cardoso: Mm-hmm (affirmative). Sorry about that. Go ahead.
Carlos Valenzuela: Are you in the first one?
Dr. Bill Cardoso: Yeah, go ahead. If you need to change slides, let me know, because I’m bringing people into the meeting.
Carlos Valenzuela: Okay, yeah I’ll just let you know.
Dr. Bill Cardoso: Okay.
Carlos Valenzuela: Again, go back. You skipped it.
Dr. Bill Cardoso: So sorry about that.
Carlos Valenzuela: All right. I have a couple of slides on the history of the company. Oh, again.
Dr. Bill Cardoso: Go ahead.
Carlos Valenzuela: All right. Yeah, you can go to the next slide. All right, yeah.
Dr. Bill Cardoso: So you want the history of the company?
Carlos Valenzuela: Yeah.
Dr. Bill Cardoso: Okay. Go ahead.
Carlos Valenzuela: All right. So, we started in 2008 doing research for the government, and a lot of it was based on X-ray systems. A couple of years later, in 2010, that’s when I came in and we started focusing on X-ray machines for consumer electronics customers, not just government contracts. We are technically the largest US manufacturer; a lot of people are still manufacturing in China or overseas. We acquired one of our biggest competitors in 2016, and that grew our legacy of older machines and our service baseline. So we have about 1,000 machines worldwide.
Carlos Valenzuela: Another big one: in about 2018, early 2019, we became a Cognex and Fanuc PSI and ASI. Cognex supports us a lot with machine vision tools, barcode scanning, and things like that. And Fanuc has been an awesome partner as well with robotics, so cobots and regular robots to move things around. We have locations worldwide. San Marcos is our main headquarters. Poland takes care of Europe. In Tijuana, Mexico, we have sales and service; it takes care of Mexico and South America. Chicago and Norfolk are just sales on the East Coast, to cover different time zones. Cool. Next one.
Carlos Valenzuela: All right. So these are some of our partners. They’re some of the biggest in the industry. Like I said, Fanuc, Cognex, Hamamatsu and Varex are more on the X-ray side, and Allen-Bradley for PLCs for automation. These are brands that people know and are familiar with, so they’re going to feel more comfortable working with them, from a service and support standpoint. Good.
Carlos Valenzuela: This is our product line. For this presentation, we can think about this as more of the shell or the baseline that our system is going to go into. That’s basically the footprint, essentially, of what a machine is going to look like. If you’re doing a very small implant, you don’t need a Fusion machine that’s giant; you might want to stay with a Cube or a Prime. We have the flexibility to use whatever the customer needs. Good.
Carlos Valenzuela: All right. So here’s a quick example of how an X-ray image is produced. I’m starting more on the technical side, since some of the people on the call are not very familiar with X-ray technology. We have two big components: the source and the sensor. As the sample, in this case a medical device, gets closer to the source, we are producing magnification. If we move it farther away, we are reducing magnification and increasing our field of view. If you want to see more components of one device in one shot, then you get closer to the sensor. If we want to zoom in and magnify a very critical point, then you get closer to the source. And our manipulation will take care of all that: a robot can move it closer, a conveyor can move it away. All these little things happen. Good.
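The geometry Carlos describes can be sketched with a standard point-source projection model. This is a simplified, illustrative calculation, not Creative Electron’s software; the distances used below are made up for the example.

```python
# Geometric magnification in a point-source X-ray setup (simplified model):
# the closer the sample sits to the source, the larger its projected image.

def magnification(source_to_detector_mm: float, source_to_object_mm: float) -> float:
    """Return the projected magnification M = SDD / SOD."""
    if not 0 < source_to_object_mm <= source_to_detector_mm:
        raise ValueError("sample must sit between source and detector")
    return source_to_detector_mm / source_to_object_mm

# Moving the sample closer to the source (smaller SOD) increases magnification;
# moving it toward the detector shrinks the image and widens the field of view.
print(magnification(400, 100))  # sample near the source -> 4.0x
print(magnification(400, 350))  # sample near the detector -> ~1.14x
```

This is why a robot or conveyor repositioning the part trades off zoom against field of view: the same source and detector give different magnifications purely from where the sample sits.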
Carlos Valenzuela: So, like I said, there are two key components, and I talked about the source and detector. The cabinet was the previous slide where you saw our product line, from the Cube all the way to the Fusion, which is our bigger cabinet. These are fully leaded cabinets. They’re safe for operators, and they have all the tools that you need to produce a machine. And the last piece is automation and software: what does your application require to become automated and autonomous? Do you require machine vision tools? Do you require AI? Do you need a conveyor to move things around? Robots? A lot of medical devices are not there yet, so they use what we call software-assisted applications, where the software tells the operator, “Okay, I think this is bad, but you have the final call.” And that becomes easier for validation and things like that. Cool. Next one.
Carlos Valenzuela: All right, here are some examples of custom machines that we built. In this first case, we have a machine with two conveyors creating a 90 degree turn, which is a lot easier for radiation exposure because it’s not a direct exit, but mostly because of ergonomics. If you see where the keyboard and the monitor stand, it’s right next to the person that’s loading the machine. So they’re just sitting there, analyzing the image and loading samples. Very simple. The sample goes in, its serial number is scanned, it goes inside the machine and then comes out the other side. I have a little video of this later, but there’s a reject mechanism inside, so bad product stays inside the machine and drops into a locked bin. Only authorized personnel can remove bad and failed product. Cool. Next one.
Carlos Valenzuela: Here’s more on the robotics side. So here we have an X-ray machine with a robot inside. This is more for higher throughput. But a key thing with a robot is that, because we have full control of it, we can create not only a manipulator but also a sorting mechanism, where maybe your product doesn’t only have a pass/fail. Maybe there are three or four different categories. If you see on the right side, there’s a robot moving around, picking up the device that’s being X-rayed, and then sorting it depending on the status of that sample. It can be bad, good, marginal, too big, too small, missing one piece. That helps reduce scrap: if you’re only missing one component, you can push your product back up a few steps on your production line, add the device that’s missing, and basically reduce the amount of scrap. It’s not just trash. Good.
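The multi-category sorting Carlos describes boils down to routing each inspection verdict to a physical destination. A minimal sketch of that logic follows; the category names and bin assignments are illustrative assumptions, not Creative Electron’s actual interface.

```python
# Multi-category sort logic: instead of a binary pass/fail, each inspection
# verdict is routed to a bin. Names here are hypothetical examples only.

SORT_BINS = {
    "good": "pass_conveyor",
    "marginal": "review_tray",
    "missing_component": "rework_tray",  # can go back up the line for rework
    "bad": "locked_reject_bin",          # only authorized staff can open this
}

def route(inspection_result: str) -> str:
    """Map an inspection verdict to a physical sort destination."""
    try:
        return SORT_BINS[inspection_result]
    except KeyError:
        # Unknown verdicts fail safe: treat them as rejects.
        return "locked_reject_bin"

print(route("missing_component"))  # -> rework_tray
print(route("unknown"))            # -> locked_reject_bin
```

The design point is the "missing_component" branch: because the robot knows why a part failed, rework becomes a routing decision rather than scrap.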
Carlos Valenzuela: And then this is kind of showcasing our expertise with Fanuc. We’re pretty happy with that. The next one is something more inline. All three of these machines have conveyors. The one on the left side is our inline machine, and it talks to other conveyors in a production line, so this machine can be one step of a bigger process. It can come after an optical inspection, then X-ray, and then go on to final assembly or whatever it is. The one in the middle is a conveyor machine meant for bigger products. It has a larger field of view, so you can fit up to a 17 by 17 inch product and X-ray it. Technically you could fit something even bigger, but the field of view is 17 by 17. You can do a lot of processing and then it comes out. Very easy to use. You can see that all of them have the keyboard tray, they’re all touch screen, super simple. And the one on the right is the same concept, just in a smaller footprint.
Carlos Valenzuela: All right, so why do we need to X-ray these devices? I’ll try to summarize all these points in just a few sentences. These things haven’t changed through the whole crisis and the pandemic and all that. Every year our life expectancy is higher, right? People live longer, but they don’t live longer just because they’re healthier. They live longer because they depend on all these devices. People with diabetes, AIDS, and all these diseases that used to be almost like a life sentence are not anymore. They depend on small glucose monitoring systems, on all these devices, to live longer, essentially. So there’s a high demand for these devices. And the complexity of these devices keeps growing, because technology is so much better now. They’re smaller, they’re more efficient. If you see a pacemaker from 20 years ago, it’s almost like you’d have to wear a backpack, and now they’re tiny. Hearing aids are smaller and smaller. So you need new technologies to X-ray these components.
Carlos Valenzuela: And then the other side is that you rely on people making these harsh decisions, and they might make the right decisions for the first 20 minutes, but then they might get tired and just start passing everything, and things like that. So that’s where the automation and the autonomy come in. We create an automated system to make it as efficient as we can, to make it faster, and autonomy means we give the system the tools required to make the right decision, or to guide operators into making the right decisions. If they rely on putting a sample on an X-ray table and finding the perfect angle themselves, that’s a little too much. So what if you put in a robot, and the robot gives you the perfect angle, and the only thing the person has to do is press a green or red button? So there’s a gray area where we try to help customers make the right decisions. Cool.
Carlos Valenzuela: So here are some examples of how diverse medical devices can be. In the middle, we have a pacemaker, which is complicated. It has a battery inside, so the batteries have to be X-rayed. On the bottom, you have a hearing aid and a glucose monitoring system. And on the right, we have something more old school, a medical kit. It’s still very complex, it still has a lot of parts, and it’s very important, but it’s kind of the opposite of the other ones: they’re super small and tiny, and this one is huge, but it still has some complexity to it. And I have some real examples of these.
Carlos Valenzuela: All right, now onto some actual X-ray images. So we wanted to start with something that everybody is kind of familiar with, which is an EpiPen, essentially an auto-injector. You can see here, very basic, you guys have probably seen it before. You can go to the next one. So that’s what an X-ray of it looks like. It can be simple, but there are a lot of places where things can go wrong, things that you can’t inspect optically. We’re kind of creating this third dimension of quality inspection where you can see through the product after it’s been finally assembled. Go to the next one.
Carlos Valenzuela: All right. So here you can see an example of what a production line of an inspection could be. It can be a conveyor, and on the left side, the red device is the X-ray source. On the right side, the black little square is the detector. So we are penetrating the device and creating an image. This happens almost live, one part every second or so, and that can be pretty fast. You can see a little piston there that stops the product, then goes on to the next one, and it basically just keeps moving along. This is just a demo of how a production line can be created with our technology. Next one.
Carlos Valenzuela: All right. So here we have three key areas for errors or defects. We have a spring on the left side. We have the medicine, or whatever is going to be put in there. It doesn’t have to be an EpiPen, it can be whatever it is. And on the right side, another spring. So if we go to the next image, here you’ve got three examples of how things can go wrong: bent needle, missing dosage, and a bent coil. I have some more examples of springs, but springs are very important in medical devices, because they provide the motion in your device. And that motion has to be repeatable; you’ve got to count on it. If it’s bent, we’re talking about life and death kind of things. Cool. Next one.
Carlos Valenzuela: Here’s another example. This one doesn’t have an empty vial, but it has some big bubbles inside; you can see some bubbles. And on the left side, that’s our algorithm counting how many coils that spring has. The coils in a spring give you the torque, the repeatable action. So if one has 13 and the next one has 12, it might not work the same way. For an EpiPen, I think it’s more of a gross action, but for some other devices, you rely on the action being repeatable. Good, Bill.
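One plausible way to count coils, shown here as a toy sketch rather than the actual algorithm: along the spring’s axis, the transmitted X-ray intensity dips each time the beam crosses a coil, so counting the dips approximates the coil count. Real images need 2-D processing and calibration; this 1-D profile is purely illustrative.

```python
# Toy coil counting: count contiguous dark runs (dips) in a 1-D intensity
# profile taken along the spring axis. Values and threshold are invented.

def count_coils(profile, threshold=0.5):
    """Count contiguous runs of the profile dipping below `threshold`."""
    coils, in_dip = 0, False
    for value in profile:
        if value < threshold and not in_dip:
            coils += 1          # entering a new dark band = one more coil
            in_dip = True
        elif value >= threshold:
            in_dip = False      # back on bright background
    return coils

# Synthetic profile: bright background (~1.0) with three dark coil crossings.
profile = [1.0, 0.2, 1.0, 0.9, 0.1, 0.95, 1.0, 0.3, 1.0]
print(count_coils(profile))  # -> 3
```

A part would then fail if the count drifts from specification, e.g. 12 coils where 13 are expected.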
Carlos Valenzuela: So here’s a video of the reject mechanism, for when we find a bent needle. There’s an inspection area on the right side. We have a little piston, and if the part is bad, it pushes it into a locked bin under the machine, somewhere where operators can’t grab it. Then on the other side of the machine, there’s a key where you open it and grab the bad product. Hopefully you don’t have to use that too often. Cool. Next one.
Carlos Valenzuela: All right. So talking about bubbles, this is just an example of an implantable device. This goes inside your body, and inside there’s medicine, whether it’s HIV medicine, or insulin, or even birth control. Whatever it is, it’s inside. If you go to the next one, here’s a blown-up image of it, and if you blow it up more, you see those huge bubbles. Once you’ve seen them, if you go back one, they’re easy to spot because your brain got trained for it, but they’re complicated to find. And they’re complicated to locate even using a computerized system with algorithms. There’s not a lot of data there: on the X-ray image, we’re dealing with grayscale values, and where you have a bubble, the pixels are slightly lighter, but not by much. So for bubble detection, we’ve got to do things that are more complicated, and I’ll go over an example. You can go next.
Carlos Valenzuela: Here’s another example with more bubbles. One more. So, tackling the whole bubble issue: this is something we’ve been asked about a lot, but it’s hard to show examples when it’s an actual medical product; there are NDAs and all this other stuff. So we had this case study a few years ago where we analyzed the contents of inkjet cartridges. If you go to the next slide, here we have an example. You want to put the cartridge upside down, so if there are any bubbles, they migrate to the top, or centralize, essentially. You can see the detector in the image on the right, and the source closer to the bottom. Good.
Carlos Valenzuela: All right. So here we have two images. This is what an X-ray of an inkjet cartridge looks like. You see the one on the right side and the one on the left side; there’s something wrong with one of them. And if you go to the next slide, we’ll find out. There’s a huge bubble there, and you can see it’s kind of complex; there are a lot of internal components. So finding a bubble isn’t just finding it. It’s training the system to understand what a bubble looks like. If you look at the image on the left, there are a lot of differences in colors. There are some dark areas, some lighter ones, circles in the middle, all these complex things going on, and the system almost has to understand what to look for. So this is where we implement a deep learning approach, where we load data, take a lot of images, show it what a bubble looks like, and the system starts to understand. So if you go to the next one.
Carlos Valenzuela: So here’s a quick example. On the left side, we’ve got our data set. Of course, the data set has to be a lot bigger than that, but basically we’re showing our system what the defects are and tagging them: this is bad, this is good, and things like that. If you see on the left side, there are good ones and bad ones. There are a couple in the middle that are good, some that have a bubble at the top, some that have a bubble in the middle. Our computer analyzes them, and on the right side you see three different results: one good one and two bad ones. The good one doesn’t have a bubble, one has a bubble at the top, and one has a bubble at the bottom. And the benefit is that we’re not just training it to locate bubbles. It can be internal defects, foreign material inside. It can be anything you want to look for.
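The tag-then-train workflow Carlos walks through can be shrunk to a toy: each "image" below is a short list of pixel intensities with a label. A real system would train a deep network on many real X-ray images; a nearest-centroid classifier stands in here purely to show the shape of the workflow (label, train, classify), and every number and label name is invented.

```python
# Toy stand-in for the deep learning step: label examples, "train" by
# averaging each class, classify new samples by nearest class centroid.

def centroid(images):
    n = len(images)
    return [sum(img[i] for img in images) / n for i in range(len(images[0]))]

def train(labeled):
    """labeled: list of (pixels, label). Returns per-label centroids."""
    by_label = {}
    for pixels, label in labeled:
        by_label.setdefault(label, []).append(pixels)
    return {label: centroid(imgs) for label, imgs in by_label.items()}

def classify(model, pixels):
    def dist(label):
        return sum((p - q) ** 2 for p, q in zip(pixels, model[label]))
    return min(model, key=dist)

# Tagged examples: a bubble shows up as a lighter (higher-valued) patch.
dataset = [
    ([0.1, 0.1, 0.1, 0.1], "good"),
    ([0.2, 0.1, 0.1, 0.2], "good"),
    ([0.8, 0.7, 0.1, 0.1], "bubble_top"),
    ([0.1, 0.1, 0.7, 0.9], "bubble_bottom"),
]
model = train(dataset)
print(classify(model, [0.9, 0.8, 0.1, 0.2]))  # -> bubble_top
```

The point carries over to the real system: once the labeling pipeline exists, swapping "bubble" labels for "foreign material" or "internal defect" labels reuses the same machinery.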
Carlos Valenzuela: Because we hear the case like, “Oh, I can just weigh it. If it matches what it’s supposed to be, then it’s good.” And yeah, that’s an easier, simpler solution, but what if a bubble at the top is fine, while a bubble at the bottom breaks your printer? Maybe it can create clogs or things like that. Maybe one huge bubble is fine, but multiple little ones are the worst thing that can happen. So there’s a lot of information you can get from a system like this.
Dr. Bill Cardoso: Hey, Carlos. We have a question from one of our participants. So Jeff is asking, “What is the process to understand if this is going to work for my product?”
Carlos Valenzuela: So it depends on what the product is, but most of the time, if we’re dealing with complex applications, we do what we call a study, where we do as much as both parties feel comfortable with before moving forward. So instead of designing a whole machine in a week, we say, okay, let’s tackle a couple of things and analyze a couple of images. And if they like the results, then it’s easier to gauge what route to take. Is it just traditional machine vision? Is it AI? Is it a robot? All these things. There’s no clear answer, but it’s mostly about becoming engaged with the product and understanding the product itself.
Dr. Bill Cardoso: Cool. Thank you.
Carlos Valenzuela: There we go. All right. Here’s a video of the system processing images. See, on some of them I removed the indicators; you can show the indicators or remove them. Sometimes you’ll find a defect, but then you remove the indicator and you’re like, oh, I don’t think that’s a defect. That’s when you have to go back and feed more information to the algorithm so it understands that’s not a defect. Or sometimes something is proven to no longer be a defect, and you’re like, okay, we’re creating too many false positives. So images get processed very quickly, less than a second per image. Our processing is often the fastest part of the machine; the slowest part is the loading. Are we relying on a person to physically grab the part and put it on the conveyor? Are we waiting for something else? Are we waiting for another machine to finish what it’s doing? So we’re pretty efficient with how we use time. Good.
Carlos Valenzuela: All right, here’s another case study. This is a simpler one. This is something that goes inside an infusion pump. It’s some sort of device, and it has some tubing. What was happening is that the operators were coiling this tube and then packaging it. It would go through and get sterilized, then be put in a box and just sit in the warehouse until a customer wanted it. And days, months, or even years later, the doctor would get it, open it, bring the tube out, and it would be bent. It would have kinks that prevent fluid from going through it. It could be IV fluid, it could be blood, it could be whatever. So they had this huge lot of compromised material. And there’s no easy fix: once you open the seal, the sterilization is gone, right? You’d have to re-sterilize it, so it wasn’t worth it. So they had this warehouse of hundreds of thousands of devices. So if you go to the next one, Bill.
Carlos Valenzuela: There you go. So now you can see at the bottom, there’s a kink there. That bend right there was made a little too aggressively. Maybe if you bend it like that and open it the next day, it’s fine. But if it’s sitting in a hot warehouse or whatever, that environment was causing these bends to become kinks and the tube to become unusable.
Carlos Valenzuela: All right. So next one, another simple one: the packaging of a catheter. A catheter is a very simple device; there’s not a lot to it as far as X-ray goes. It goes inside this metallic bag, and that bag goes inside a box. They had some RMAs and some issues with them, and they were trying to figure out what it was. So if you go to the next image, Bill. That’s what an X-ray of a catheter looks like. My two-year-old son could detect what the issue with that is. If you go to the next one, you can see that there are two huge bends. And the problem was a machine a few steps back. So they were like, “Oh, I know exactly what machine that is.” They went and tweaked it, and it’s fine. But our machines are creating this closed-loop system where you can generate more data and feed it back to build a more efficient production line. You don’t have to X-ray things only because you have problems. You X-ray things to know more about your product and to find problem sources, or where problems can occur. Cool. Next one.
Carlos Valenzuela: All right. So another big thing in medical devices is springs. Go to the next one. We saw a few examples with the EpiPen. Here you can see a couple of images of springs, and you can see the annotations with the numbers counting the coils each spring has. The count starts at zero, so where you see 10, there are 11 coils. If for some reason there were 9 or 12, then that part would fail. Next one.
Carlos Valenzuela: And then again, here you can see another picture of an EpiPen. You can see the springs, but these are examples of really, really bad springs. The ones on the left are falling apart. Maybe they still work, maybe they were wound the opposite way and just look that bad, but you can’t take the chance of having that in your product. Next one. All right, I think this is the last example, but it’s a pretty interesting one. When we started, we saw this picture of a medical kit. And it doesn’t have to be a medical kit; I think it just shows our capability, our technology, to analyze a finished product.
Carlos Valenzuela: So go ahead. That’s what the kit looks like. Next one: that’s what it looks like open. And of course we X-rayed it sealed and sterilized and everything. One more. So that’s an X-ray image of it. We can see everything: vials, syringes, the scalpel, everything. Then, to explore the capabilities of an AI system, we again had to create a data set. You can go to the next one. Here’s an example of a bunch of different possibilities, and this became our data set, but also our test set, our control group, essentially. If you look around, you can see all sorts of different configurations: the scalpel fell, a vial is on top of something else, all of these. So if you only care about your product being complete and fully sealed, this is the right tool. But we also know location and orientation. So if your kit or your product requires not only being complete but also having everything placed in the same spot, then this tool is right. So if you go to the next one.
Carlos Valenzuela: So here, you can see every single component. Component zero is the big syringe; one is the three vials. So our algorithm will say: you need one zero, three ones, three fives. And from there, you can easily find out if you have a missing component, and of course you can also find location and things like that. Next one. See how things move around? And the last one is interesting: we have a few devices, the scalpel fell off, and we have a vial and a syringe on top of one another. We have a very complex scene going on, and it found every single component. So that’s the beauty of using AI.
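The completeness rule Carlos states ("you need one zero, three ones, three fives") reduces to comparing detected component counts against the kit’s bill of materials. A minimal sketch, with illustrative component IDs and quantities:

```python
# Kit completeness check: compare detected component counts against the
# required bill of materials. IDs follow Carlos's example (0 = big syringe,
# 1 = vial, 5 = a third component); quantities are from the talk.

from collections import Counter

EXPECTED = Counter({0: 1, 1: 3, 5: 3})  # component id -> required quantity

def missing_components(detected_ids):
    """Return {component id: how many are missing}; empty means complete."""
    found = Counter(detected_ids)
    return {cid: qty - found[cid] for cid, qty in EXPECTED.items()
            if found[cid] < qty}

complete_kit = [0, 1, 1, 1, 5, 5, 5]
short_one_vial = [0, 1, 1, 5, 5, 5]
print(missing_components(complete_kit))    # -> {}
print(missing_components(short_one_vial))  # -> {1: 1}
```

Location and orientation checks would layer on top of this, since the detector also reports where each component sits.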
Carlos Valenzuela: Traditional machine vision tools, like pattern matching or comparison, wouldn’t be able to get these sorts of results. When we use AI, and I don’t want to get too deep into AI here, we are teaching the system what our product is, but also what it’s not. So it’s understanding its environment; it’s understanding a lot of things. Machine vision just relies on “what am I looking for?” If one part is on top of another, its appearance changes. AI understands more: okay, if I’m on this side, I look like this; if I’m on that side, I look like that. So it’s a deeper understanding of the product. And I think that’s it. I don’t know if you… Ooh, four minutes too long.
Carlos Valenzuela: You’re muted, Bill.
Dr. Bill Cardoso: You want to tell us what you’re working on with Cognex coming up?
Carlos Valenzuela: Yeah. So, there’s a couple of people from Cognex that joined, they’ve been an awesome partner. So in a couple of weeks, I think the date’s still to be determined, there’s going to be another webinar hosted by Cognex and myself, showing a lot of the same examples, but going technically more into the details of how some of them happen and how we’ve been working together. Not only just on the X-ray side, but also the barcode scanning and things like that, helping us automate these machines.
Dr. Bill Cardoso: Right. Well, thanks so much, Carlos, for your presentation. And thank you all for participating in this Fireside Chat with the Xperts. Please connect again next Wednesday at 10:00 AM. We’re going to be talking about machine learning and artificial intelligence. I’ll talk to you guys later, bye.