Product In Action Webinar
Available On Demand
Implementing digital PCR (dPCR) in your research can offer unparalleled sensitivity and absolute quantification of DNA and RNA in samples of interest, overcoming many of the limitations of other technologies. But as with any laboratory tool, dPCR data can only be as good as the assays and instrumentation used to produce it. In this webinar, Frank Bizouarn, Market Development Manager from the Digital Biology Group at Bio-Rad, will describe the characteristics of ideal dPCR data, outline strategies for maintaining diligence and avoiding pitfalls, and explore the robustness of the QX ONE™ ddPCR System in supporting biodistribution studies for cell therapy products.
Key learning objectives
- Learn what ‘good dPCR data’ looks like
- Discover how Droplet Digital™ PCR (ddPCR™) overcomes challenges to produce superior data
- Understand how ddPCR technology can support biodistribution studies
Speaker
Frank Bizouarn
Market Development Manager,
Bio-Rad Laboratories
MaryBeth DiDonna 00:08
Hello, everyone, welcome to Lab Manager's Product Spotlight webinar series. My name is MaryBeth DiDonna and I'll be moderating this discussion, Achieving Quality and Avoiding Pitfalls in Biopharmaceutical Research. We like our webinars to be very interactive, so we encourage you to submit your questions to us at any point during this webinar. Our speaker will address your questions during the question and answer session following his presentation. To ask a question or leave a comment, simply type your query into the Q&A box located on the right-hand side of your screen. We'll try to address as many questions as possible during our time together, but if we happen to run out of time, I will forward any unanswered questions to our speaker, and he can respond to you directly. If possible, please visit the handout section on the right-hand side of your screen for supporting information for this presentation. I would like to remind you that this webinar recording will be available on demand shortly following this presentation, so please watch for an email from Lab Manager on how to access this free video once it's available. I would especially like to thank our sponsor, Bio-Rad, for their support, which allows Lab Manager to offer these webinars free of charge to our readers. So with that, I'd like to introduce our presenter for this webinar. Frank Bizouarn joined Bio-Rad in 2002 to support quantitative PCR technology as a field application specialist in the southeast United States. In 2006, he moved to the role of international FAS for Bio-Rad's global gene expression division to promote best practices in qPCR around the world. In 2011, Frank began working on Droplet Digital PCR applications and joined the Digital Biology Group. Currently, he is focused on supporting and promoting quantitation, detection, and sample discrimination applications that take advantage of the high resolution and sensitivity provided by the power of droplet partitioning. Frank, thanks for joining us today.
01:51
Hello, everyone, and thank you for joining us today for this webinar, where we're going to look at the importance of quality data and how to achieve it in the biopharmaceutical environment. For those of you familiar with digital PCR, this is going to be a bit of a short review. For those who are not, digital PCR is actually a relatively straightforward technique to quantify the amount of DNA or RNA present in your sample. The process is relatively simple: we take our sample and partition it into a whole bunch of sub-reactions. In our case, we use droplets, and the DNA or RNA present in that sample will be randomly distributed among all of those droplets or partitions. Now, some of the partitions will contain the target of interest, and some of them will not. We then PCR those droplets and, using a probe-based or a dye-based strategy, are able to see which droplets contained our target of interest and which ones did not. So it's simply a yes or no: did amplification of my target of interest occur within these droplets? Now, the data that is generated typically looks like this. We have a graph where we see the fluorescence amplitude on one axis and the partition on the other, and depending on the intensity, some will be high and some will be low. We set a threshold and simply say, okay, anything above the threshold is deemed a positive partition, anything below is deemed a negative partition. We then count the number of molecules and adjust using distribution statistics, basically the Poisson distribution. So single-color data typically looks like this; multiplex data is usually plotted two channels at a time. In the example you see here, we have channel one, which is FAM, and channel two, which is HEX, being used as a reporter dye. And here you will see different, what we call, clusters or groups of partitions or droplets. The ones at the bottom left-hand side are the droplets that did not contain the target of interest for FAM or for HEX; these had the target of interest for FAM; these had the target of interest for HEX; and these are the ones that, due to random distribution, have both targets of interest in them. Here again, we set a threshold for the FAM channel and a threshold for the HEX channel, and are able to easily quantify the number of target molecules present in the sample initially. So it's a very easy, very straightforward technology, and it is extremely robust and precise. As far as quantitative techniques for nucleic acids in a mixed population go, digital PCR will probably provide the highest precision and accuracy currently available. It's also a method for absolute quantification, in that we directly count the number of molecules, so there's no need for a standard curve or reference curve, as is typically used in other techniques; it's a standalone quantification method. It's very easy to multiplex compared to other techniques. It's highly reproducible. And because it's a PCR process that is read at endpoint, after 40 or 45 cycles, it's tolerant to minor inhibition, so efficiency is no longer a factor. And finally, it's easy to analyze.
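To make the counting step above concrete, here is a minimal sketch of the Poisson arithmetic (not Bio-Rad's actual software): it converts positive/total droplet counts into copies per microliter, using the commonly cited nominal ddPCR droplet volume of about 0.85 nL and illustrative counts.

```python
import math

def ddpcr_concentration(n_positive, n_total, droplet_nl=0.85):
    """Poisson-corrected concentration (copies/uL of reaction) from droplet counts."""
    p_neg = (n_total - n_positive) / n_total
    # Some positive droplets hold more than one molecule, so we can't just
    # count positives; the negative fraction gives the mean copies per droplet.
    lam = -math.log(p_neg)
    ul_per_droplet = droplet_nl * 1e-3          # nL -> uL
    conc = lam / ul_per_droplet
    # Approximate 95% CI (delta method on the negative fraction)
    se = math.sqrt((1 - p_neg) / (n_total * p_neg))
    half = 1.96 * se / ul_per_droplet
    return conc, conc - half, conc + half

conc, lo, hi = ddpcr_concentration(n_positive=2000, n_total=18000)
print(f"{conc:.0f} copies/uL (95% CI {lo:.0f}-{hi:.0f})")  # ~139 (133-145)
```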
To give you an example of how this technology is robust and easy, we ran a workshop in April, I think it was April, where we had multiple users run a 3-plex assay. What we're seeing on the screen here is a little more complicated than what we had before; we're actually amplifying three different targets. One was an adenovirus target, another was an Enterococcus target, and finally a Legionella target. These are the droplets that had one or another of these three targets, and then we have the double-positive populations and then the triple-positive population. So this was an experiment performed by three different researchers. And what's really interesting is that some of these were biopharma folks, and some of them had never, or I shouldn't say never, but had not lifted a pipette in years. So it was kind of interesting. But even though some of them were more skilled at lab techniques than others, if we look at the concentration counts between the different individuals, the results vary very, very little, in fact by less than 25%. These were the same samples handed out to the three operators, with, I guess, different pipetting skill levels. So we look at this and go, yeah, 25%, that's pretty big. But because they were amplifying and testing the same sample, if we use one as a reference and compare the other two to that reference, we see that the fractional abundance is exactly the same within statistical variability. I mean, this is pretty much almost identical. So this goes to show the reproducibility that is achievable with digital PCR across different users of different skill levels. And this is a bit of a repeat of an experiment that was run back in 2017, where, in collaboration with a metrology lab, six different samples were sent to 21 labs across the world. What was really impressive about the results was that, out of the 21 labs, 18 in essence got identical results for the same samples, even though they were spread all over the world. Three labs had different results, and it turns out those labs were not analyzing the data properly. When we reminded them how to properly analyze the data, two out of three fell back in line with the norm and got the same exact result. The one lab insisted on doing things their own way and was off by quite a bit. But nonetheless, this goes to show the reproducibility of this technology across users, across different days, and across different labs. So it's a really impressive technology, very exciting. But we've also seen, as this technology has matured, that there are things that are slowly going wrong. One is that it's simple to run, and this perceived simplicity means that a lot of researchers are taking shortcuts and are not doing all the pre-work that they should before running an important experiment. The other challenge that we're seeing is a reliance on others; in other words, not doing proper testing and proper validation, having somebody say, oh yeah, this is good, just use it, and it not being as good as it should be. And finally, a rush to get results. We often see different groups that are in a hurry, and obviously it's understandable; the society we live in makes the ability to spend weeks and months and years working on a project not viable. It's got to get done quickly. But this rush is actually a detriment, in that very often researchers don't do due diligence before running experiments. So we're seeing what is called the 'looks good enough' mindset. Interestingly enough, this mindset is slowly changing to 'well, now I think we should get good data.' But this is an example of average data. In this case, here we have a singleplex assay, and here we have a duplex assay. And right away, if you saw the previous image, those clusters were nice and rectangular, or orthogonal; this is not quite close to that.
So one of the things that we're very focused on at Bio-Rad is making sure that researchers realize that a little bit of optimization, a little bit of due diligence, really makes a huge difference. A classic example is the following. Here we have an assay in an individual well, amplifying a given target. And hypothetically we say, okay, well, I see a group of negative partitions at the basal level, and I see a whole bunch of positive partitions; therefore, I can separate the two by setting a threshold, and that would hypothetically be good enough. But what's interesting is, if I run the same reaction, just another aliquot of it, at a slightly different temperature, we can see that the same reaction mix gives us a significant difference in amplitude, as well as in the separation of the negatives from the positives, and in the number of partitions that fall in between the positives and the negatives. And if we quantify the samples, we can see that numerically, the difference between a poorly optimized assay versus a highly optimized assay can be significant. In this case, it's over a 30% difference. So when you're trying to quantify things accurately, obviously,
11:53
it's important to make sure that you stack the deck in your favor and make things as efficient, or at least as efficient and easy to run, as possible. Now, taking shortcuts is not something that is new; we've learned this from qPCR. If we look at a review from Nature, where an analysis was done of 1,700 papers against the MIQE guidelines, the bottom line was that more than 50% of these experiments were deemed irreproducible, poor, or absolute garbage. Which is not surprising, because we're just showing results, and we're not showing how we got the results. So there's a little bit of that shortcut mentality, and it obviously has ramifications. In the case of digital PCR, what was seen was poor experimental design, of course, poor assay design, poor sample preparation, poor amplification, incorrect analysis; the list goes on and on. So, not to dwell on qPCR, but what can we do in digital PCR? Well, this is what a really nice digital PCR profile looks like. As we saw before, we have a series of negative partitions and positive partitions, and ideally, in between, we have an area that is relatively clear of what we call rain. These little partitions in between are actually droplets where, due to either steric hindrance, some inhibition, or some random, nonspecific amplification within that droplet, the reaction didn't amplify as much as it should, and we didn't get as strong a signal as we wanted. Here it's actually almost negligible, but we try to minimize the amount of what we call rain between the positives and the negatives. So this would be an example of a good assay. This would be an example of a good duplex assay; we still see a little bit of rain, but again, we see four different clusters, roughly orthogonal, or in a rectangular shape. That is what a good assay looks like. Now, good assays are important because they allow you to see when things go wrong. If our assays were of poor quality, we wouldn't notice things like this, for example. In this case, we had our positive cluster separating really, really well, but all of a sudden there's noise within that cluster. We're seeing things that we do not expect; in other words, little pop-ups, or little gaps across the distribution. Well, if we have data that looks like that, we should immediately say, hey, something went wrong here, let's rerun that. In this particular case, the cause of the error was contaminating plastics from the pipette tips that got into the microfluidics. So, no big deal, but at least we know something went wrong, and we can correct for it. This would be an example of a hardware-related error. You can also have assay-related errors. This is an example of what should have been a singleplex assay, but in essence, the assay was actually binding to different variants of the same gene, and as a result, we're seeing the quantification of both targets. Here you can see the first target, the second target, and a doublet cluster, which has the first and second targets together. So this is an example of an assay failure. Good-quality assays allow you to see when things go wrong, which is really important. Now, there is a series of potential pitfalls that we've come across that researchers are not paying attention to. One of them, of course, is improper assay design, and also lack of optimization.
Unfortunately, there is also poor understanding of some of the mathematics behind digital PCR; it's relatively simple, but there are some rules that need to be followed. Then there are insufficient controls, the importance of good instrumentation and reagents, as well as data analysis. And some of the pitfalls are, unfortunately, not controllable, depending on the instrument you're using. To go through these relatively quickly, we're going to start with assay design. Assay design is very similar to qPCR, but a little different in that amplification efficiency is not critical, and we typically aim for short amplicons, more so if we're doing things like circulating cell-free DNA analysis or other types of analysis where we want to hit specific targets that may be fragmented. The assays should be designed with digital PCR in mind, in that the thermal profiles will probably be different from what is being used in qPCR. We've also noticed that DNA samples, if they are long, should be digested. This may seem surprising to some, but we've noticed over the years that undigested DNA is more difficult to amplify, and we tend to see more of these partitions that are in between the positives and the negatives when we don't digest the DNA. Looking back, it's probably a good idea to digest qPCR samples as well; it would probably give slightly better results along the way. RNA assays should use specific RT priming strategies, and of course it's important, just like in qPCR, to use well-quenched probes. What's interesting is that many of the off-the-shelf assays are just rebranded qPCR assays that have not been tested and have not been qualified in digital PCR, because the mindset is that, okay, this is qPCR, just slightly different. No, it can be significantly different. That's not to say that qPCR assays won't work; a good percentage of them will. But some of them may not work, or they may not work optimally. Optimization is a word that everybody dislikes to a certain extent, because it implies a lot of work, a lot of effort. But it's important to do a little bit of optimization in digital PCR. Of course, these optimization tests should be performed in vitro, not in silico or by other methods. The idea is to find the proper reaction conditions with regard to annealing temperature, extension time, and number of cycles, as well as to test with different sample types: true positives, true negatives, and of course no-template controls, which will tell us if we're amplifying things randomly. One of the easiest ways to test an assay and try to optimize it is simply to run a temperature gradient. In this example, we ran an assay between 55 and 65 degrees. What we see is that at the lower temperatures, we tend to get better separation of the positives and the negatives, as well as a little less rain in between. So from this information, we would say, okay, it looks like my best separation is between 55 and 59 degrees, and I would then look at the quantification of the sample. Of course, this is the same sample, just split across eight wells. Here we notice that at the coolest temperature, we're actually losing counts. So looking at this information, we would probably decide that the optimal condition is between 55 and 57 degrees, and these two reaction conditions here would probably be best.
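For illustration, a gradient like the one just described can be summarized programmatically. The sketch below uses made-up well statistics (none of these numbers are from the webinar): it keeps only temperatures whose concentration stays near the plateau, then picks the one with the widest normalized gap between clusters.

```python
# (temp_C, neg_mean, pos_mean, combined_cluster_sd, copies_per_uL) -- hypothetical
gradient = [
    (65.0, 1000, 2600, 420, 540),
    (63.0, 1000, 2900, 400, 575),
    (61.0, 1000, 3200, 380, 600),
    (59.0, 1000, 3500, 340, 612),
    (57.0, 1000, 3700, 320, 615),
    (55.0, 1000, 3750, 330, 560),   # coolest well is losing counts
]

plateau = max(row[4] for row in gradient)
# Discard wells that have dropped below ~95% of the concentration plateau...
candidates = [r for r in gradient if r[4] >= 0.95 * plateau]
# ...then favor the cleanest separation between negatives and positives.
best = max(candidates, key=lambda r: (r[2] - r[1]) / r[3])
print(f"suggested annealing temperature: {best[0]} C")   # 57.0 C here
```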
So as you can see, it's relatively easy to test and optimize an assay. This can also be done during the RT step. This is an example of testing different temperatures for the reverse transcription, and here you can see that the optimal reaction conditions fall in this area. The difference can be significant: from 69 copies per microliter to 177 copies per microliter of reverse-transcribed RNA. One of the things we also recommend looking at is the amount of rain in the assay, and this can be done easily by simply playing with the thresholds. As we see here, with the threshold all the way down at the bottom, we're getting the exact same result for the same sample run under two conditions. But if we look at the threshold set just below the positive cluster, we see that this sample here has more rain than that one, by the difference between my values here, 643 and 703. So we're looking at 60 versus about 26 or 27 here. So obviously, we have fewer
21:46
droplets in between, and these values are per microliter, so we multiply them by 20. Fewer droplets in between is going to give me higher accuracy and higher precision. So optimization is relatively easy and can yield good assays very quickly. Now, another point where we often see researchers having difficulty is distribution statistics. Very often, everybody wants to push the decimal; everybody wants to quantify five copies versus 10 copies versus 15 copies. The reality is that these are very hard to do, simply because of errors at low concentration. This error at the low end is driven by what's called subsampling error. In other words, taking a sample out of a larger volume always carries an error with it, and the smaller the sample, or the smaller the amount of material that you're trying to pull out of a larger sample, the greater that error is. So at very low levels, the CV from subsampling error alone can be huge. This is an important parameter to keep in mind, especially if you're quantifying at very, very low levels.
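To put rough numbers on that point: drawing an aliquot from a larger volume is effectively Poisson sampling, so the CV from subsampling alone is about one over the square root of the expected copy number, regardless of how good the assay is. A small sketch:

```python
import math

def subsampling_cv(expected_copies):
    # Poisson sampling: SD of the captured copy number is sqrt(N), so CV = 1/sqrt(N).
    return 1.0 / math.sqrt(expected_copies)

for n in (5, 10, 100, 1000, 10000):
    print(f"{n:>6} expected copies -> subsampling CV ~ {subsampling_cv(n):6.1%}")
# 5 copies -> ~45%; 10 -> ~32%; 100 -> 10%; 1,000 -> ~3%; 10,000 -> 1%
```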
Controls are another area where we see challenges. Very often we're presented with data and the question, well, why am I getting these results? And very often we ask things like, well, do you have a positive control? And very often we get the answer no; the same thing with negative controls. Of course, the importance of a positive control is to set the benchmark for positive clusters; in other words, we want to know what the data should look like if you have a real positive, and the same thing with a negative. The no-template controls are important as well, because they'll allow us to see if there's any ambient contamination, reagent contamination, or assay contamination. And spike-in controls allow us to determine loss during sample processing. We often skip over these, but they are very important. If you're in biopharma, you're probably saying, you know, we do this all the time, and that's perfect. But we do see, in the research environment, folks who sometimes skip these steps. The next area where we're seeing variability is in instrumentation and reagents. Here we have an example with one of our supermixes on our QX ONE platform; this is an example of a supermix running out of steam. On the left-hand side, you see a 2-plex assay, FAM and HEX, and in the right-hand panel it's the exact same assay, but this time we've added three more primer and probe sets. So this is a 5-plex assay, and we're only looking at FAM and HEX. You can see that with the duplex assay, the supermix had enough capacity, in terms of dNTPs and polymerase and so forth, to allow good separation of the clusters. But as I start amplifying more and more targets, some of that polymerase is going to be monopolized by other targets and won't be available, and the same thing with dNTPs. And sometimes primers will cross-react. So we tend to see more rain in these assays; the quantification is going to be close, but the data is not going to be as nice. This is an example of the importance of a proper supermix in your experimental setup, depending on what you're doing. Instrumentation is important, and I want to share this because this is sort of a user-beware. As I mentioned earlier, everybody wants to see a nice, clean gap between the negative partitions and the positive partitions. But sometimes the software will hide what is really going on. In the case of this manufacturer, their software hides any positive partitions above the negatives, and if you take the threshold and manually bring it all the way to the top, all of a sudden you see partitions that were not visible before. This actually affects the counts and affects your concentration call. So this is a bit of data massaging, and it's a little too much. Now, granted, there's a lot of data analysis that takes place in the software; there's gating, there's all sorts of behind-the-scenes math. But this is obviously not beneficial to the end user. Here is another example of software that can, in essence, mask things. Here's data collected on one version of the software, and this is the exact same data presented on the next version, the newer version, which in essence cleans up these poor-looking traces and sort of hides the poor data. As I mentioned before, seeing bad data is important, because if something goes wrong, you want to be able to immediately look at it and say, okay, something went wrong here, I don't care what it is, just repeat the sample and move on, as opposed to moving forward with incorrect or poor data. So that's very important. We also see data analysis errors: basically, letting the software collect the data and set the thresholds automatically, sometimes inaccurately. As we saw in that 21-lab experiment, where at the end 20 out of 21 all generated the same data, if the thresholds and the data analysis are not done correctly, we start seeing variability in the results. The analysis is pretty simple: just look at the well and verify the distribution looks normal, in this case four clusters in roughly the shape of a square, and that the thresholds are properly set, in other words, that they don't cross any of our clusters. Ideally, we want to analyze 10,000 partitions or more, simply because digital PCR is a combination of counting and distribution statistics, and to get good distribution statistics, the more events or partitions we have, the better; 10,000 and above is a good number for robust results. And of course, the software can calculate 95% confidence intervals, so it's important to carry this variability forward when reporting and doing additional analyses with the data. Now, all of this may sound a little daunting, but these things are actually pretty easy to do, and digital PCR has become a very popular technique in thousands and thousands of labs around the world because of its accuracy and precision, and it's moved into biopharma as well. So you're probably wondering, okay, I can probably do my viral titer with digital PCR; what else can I do? Well, digital PCR can of course quantify your titer and tell you what your concentration is, in copies per microliter of your reaction. It can also be used to determine residual host cell DNA. Depending on what type of cells you're growing your viruses in, there are limits put in place by the WHO and the FDA with regard to how much residual DNA can be there. You can use digital PCR to accurately quantify this residual DNA, and there are kits that allow you to put your manufactured product almost directly into the digital PCR reaction and quantify the sample without purification or other cleanup.
Digital PCR can also be used to measure your residual DNA fragment size, using tricks such as linkage and proximity assays, and I'll show you an example of that a little later. This is another area where digital PCR can really help verify and make sure that things are generated cleanly and properly. You can also use digital PCR for mycoplasma detection; there are kits out there that allow screening for many, many mycoplasma species all in one reaction, and of course this allows you to make sure that the product you're generating is as clean as possible. Next, vector integrity, where we look at what we call linkage distribution.
31:19
This allows us to look at whether the insert in the virus is full length or fragmented. The strategy in this case is to have one assay at one end of the insert and another assay at the other end; some groups will actually add additional assays within the different areas of interest. If your virus is intact and the contents are in one long fragment, then when partitioned into droplets, both targets will be contained in the same droplet. If, on the other hand, the molecule is incomplete, or there's fragmentation or degradation, these will appear in individual droplets, but not together. This allows us to run what we call a linkage assay, or an integrity assay, where we look at, in this case, the partitions that are negative, the partitions that are positive for, let's say, my forward fragment, the ones that are positive for my second fragment, and finally those that contain both. Through statistical analysis, we can actually determine the concentration of intact versus fragmented molecules. This can be done across multiple dye layers as well, so there's a little bit of complexity there.
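As a rough sketch of that statistical analysis, under Poisson loading the linked (intact) load per droplet can be recovered from the three negative fractions. The droplet counts below are invented, and real analysis software adds gating, dye-layer handling, and confidence intervals on top of this.

```python
import math

def linked_load(n_total, neg_a, neg_b, neg_both):
    """Mean intact (linked) molecules per droplet from negative-droplet counts."""
    la = -math.log(neg_a / n_total)       # load of all molecules carrying target A
    lb = -math.log(neg_b / n_total)       # load of all molecules carrying target B
    lab = -math.log(neg_both / n_total)   # load of molecules carrying A and/or B
    # lab = freeA + freeB + linked, while la + lb counts linked twice,
    # so the difference isolates the linked load.
    return la + lb - lab

linked = linked_load(n_total=18000, neg_a=17000, neg_b=17100, neg_both=16300)
print(f"intact molecules per droplet: {linked:.4f}")   # ~0.009 here
```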
And then finally, digital PCR can be used in biodistribution studies. Depending on whether you're looking at RNA, DNA, or expression levels within blood or different tissue samples, digital PCR provides the accuracy that will allow you to report accurate results to the regulatory agencies further downstream. A recent example is a study from Dr. Nakayama, looking at quantification methods using a single surrogate calibration curve across various biological samples, published this year. So with all of this in mind, if you're looking for the technology that is the most precise and accurate, even within digital PCR, by far the most accurate platform available is the QX ONE. It has a series of features that really enhance the accuracy and precision of the results. The workflow is relatively simple: you prepare your reactions and load them into the cartridge; the instrument then generates the partitions or droplets, cycles them, reads the droplets, and generates results. So it's relatively straightforward and easy. With this in mind, digital PCR is a wonderful technology. It's been around for quite some time but has only taken off in the last decade or so, and it's fantastic when used correctly. We've seen a lot of new players in this space; some instruments are actually good, some of them are average, and some of them are poor. The poor instruments will eventually fall out and disappear. The average instruments will probably stick around, but of course, they may not embody the best practices or best strategies, so you have to keep that in mind. We're also seeing increased confusion as to what these best practices are, so we think it's important to set a bar and remind users how easy it is to make sure things are done properly; you can then move forward at a quick pace. Again, a little attention to detail can have a tremendous impact on the accuracy, precision, and quality of results, so just bear that in mind. And of course, I'm preaching to the choir, in that there is enough complexity in the samples being analyzed without the assays and the hardware themselves being a challenge. Finally, I'd just like to point out what we've brought forward over the last 13 years at Bio-Rad. With regard to digital PCR, we have a whole array of instrumentation, and a whole array of assays and reagents, again specifically designed for digital PCR. We have software that can be used in both academic and regulatory environments to help with audit trails and so forth. And we have experienced members of our team at the local FAS level, the regional level, and the global level who can help you quickly achieve your goals. So with that, I'd like to thank you for your attention, and answer any questions you may have.
MaryBeth DiDonna 36:25
Okay, great. Thanks very much, Frank, for a wonderful presentation. At this point, we are ready to move into our Q&A session with the audience. Again, for those of you who may have joined us late, you can send me your questions by typing them into the Q&A box located on the right-hand side of your screen. Even if you don't have a question, we invite you to leave a comment; let us know how you enjoyed this presentation, if you found the information useful, and if you'd like to see similar content from Lab Manager in the future. And also, if you would like Frank or the Bio-Rad team to reach out to you following this webinar, you can leave a comment in that Q&A box and we will make sure that they receive your contact information. I'd also like to remind you that you can look under the handouts tab on the right-hand side of your screen for supporting information about this presentation. So Frank, thanks again. Let's jump into this first question here. This one says, what are the advantages of ddPCR?
37:13
Well, thank you. The advantages of Droplet Digital PCR versus other technologies are, by far, the accuracy, precision, and reproducibility. I've already brought these to your attention earlier, but I just can't stress enough that there really isn't any quantitative strategy for nucleic acids that allows us to have these levels of, again, precision, accuracy, and reproducibility from lab to lab and user to user. This is a technology that many metrology labs are considering as a first-order measurement tool, because it's not dependent on anything else; it's standalone, and it provides very accurate results. So definitely, it stands out.
MaryBeth DiDonna 38:07
Okay, wonderful, thank you. We have another question here that asks, how do you set the threshold?
38:14
So, we have that question asked quite often. The threshold is simply set between your positive and negative clusters. So whether you have a band of positive events and a band of negative events, or you have multiple clusters in a 2D plot, the concept is relatively simple: you want to put that line where the different groups are separated from one another. Now, of course, the assay quality, as well as how much optimization you've done, will have a very large impact on the ease of setting this threshold and on the ability of different users under different conditions to generate similar data, day in, day out. But generally, in a nutshell, you simply set that line, or series of lines, where you can separate one group from another as cleanly as possible.
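For a single channel, a toy illustration of that idea is below. This is just a two-mean split on simulated amplitudes, not the algorithm any vendor's software actually uses, but it captures the "line between the two groups" concept.

```python
import numpy as np

def simple_threshold(amplitudes, iters=10):
    a = np.asarray(amplitudes, dtype=float)
    lo, hi = a.min(), a.max()            # initial guesses for the two bands
    for _ in range(iters):               # two-mean (k=2) refinement
        cut = (lo + hi) / 2.0
        lo, hi = a[a <= cut].mean(), a[a > cut].mean()
    return (lo + hi) / 2.0               # midway between the two cluster means

rng = np.random.default_rng(0)
neg = rng.normal(1000, 60, 15000)        # simulated negative droplets
pos = rng.normal(3500, 200, 3000)        # simulated positive droplets
print(round(simple_threshold(np.concatenate([neg, pos]))))   # ~2250
```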
MaryBeth DiDonna 39:15
Okay, great, thanks. I think we have one more question: which applications can I run using ddPCR?
39:22
So, digital PCR allows you to run multiple types of applications. What we've seen today are generally quantitative analyses, where we quantify, for example, either residual DNA, or the virus itself, or some sort of construct, or the product of that infection downstream, such as in monitoring. But digital PCR can also be used, as in the one example I showed, for linkage and proximity studies, whether these are viruses or different point mutations on DNA that can be present in various disease states, muscular dystrophy being one example. These proximity or linkage assays are one application, as well as mutation detection. We often think of a sample as either being wild type, mutant, or a mix of both. When you have a mixed population, the mutation can be at very, very low levels. For example, if you take a liquid biopsy out of blood or some other bodily fluid, you may have 1,000 or 10,000 normal copies and only a handful of mutant copies, and detecting those low levels within the high abundance of normal material can be very, very challenging. Digital PCR allows you to detect at levels beyond what can be done with other techniques, even next-gen sequencing or standard qPCR. So there are quite a few applications, and the list is actually increasing every year. What started off about 20 years ago as 'oh, this is going to be used for copy number variation, and maybe a couple of other applications' is now expanding into many, many different areas of analysis.
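As a back-of-the-envelope illustration of that rare-mutant arithmetic (droplet counts invented): fractional abundance is just the Poisson-corrected mutant copies over total copies.

```python
import math

def well_copies(n_pos, n_total):
    # Total copies in the well: mean copies per droplet times number of droplets.
    return -math.log((n_total - n_pos) / n_total) * n_total

mut = well_copies(12, 18000)      # a handful of mutant-positive droplets
wt = well_copies(9000, 18000)     # abundant wild-type background
print(f"fractional abundance ~ {mut / (mut + wt):.3%}")   # ~0.1%
```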
MaryBeth DiDonna 41:29
Okay, great. Thanks very much. I also did get a few private messages asking if this will be available for on-demand viewing later, so people can watch it again, and the answer is yes, it will. We will alert you to that via email. We do have another comment here: "I joined this webinar to learn more about ddPCR, specifically because San Diego County uses this assay for water quality tests. It was very helpful to learn about the precision and accuracy and get a better understanding of how ddPCR is performed." So that's a nice one; I don't know if you want to follow up on that at all.
42:02
Well, actually, San Diego was one of the early adopters of digital PCR for water testing. If I recall, they were looking at different bacterial pathogens in beach water, Enterococcus and E. coli and a couple of other pathogens. So they've been at the forefront of using this technology in water testing. And of course, nowadays we see a lot of digital PCR being used for wastewater testing for pathogens and other potential diseases. Of course, COVID comes to mind, but also smallpox and a whole series of other potential diseases are being monitored using digital PCR. So yes, they were at the forefront of this many, many years ago.
MaryBeth DiDonna 43:04
Okay, great, wonderful. Thanks so much. So that does bring us to the end of this webinar. Just a reminder that this webinar will be available on demand shortly following this presentation; please watch your email for a message from Lab Manager once this video is available. On behalf of Lab Manager, I'd like to thank Frank Bizouarn for all the hard work he put into his presentation, and I'd like to thank all of you for taking time out of your busy schedules to join us. Once again, I would like to thank our sponsor, Bio-Rad, for their support, which allows Lab Manager to offer these webinars free of charge to our readers. For more information on all of our upcoming or on-demand webinars, or to learn more about the latest tools and technologies for the laboratory, please visit our website at labmanager.com. We hope you can join us again. Thank you, and have a great day.
43:45
Thank you