COVID-19 Study Update from Dr. John Sanders
Third round of antibody test validation yet again shows strong performance on avoiding false positives
Validation did yield false negatives, which suggests the Wake Forest study may have under-counted antibody positive results
Raleigh, N.C. — Senate Leader Phil Berger’s (R-Rockingham) office has periodically shared updates from Dr. John Sanders on the Wake Forest Baptist Health/Atrium study that has been underway since April. Though the information comes from Dr. Sanders, Sen. Berger’s office shares it through their media channels because reporters often contact Sen. Berger’s office with questions about the study.
When the study was announced, the guiding principles listed in the public announcement were:
- Execute the most transparent study possible;
- Be forthright about data, goals, challenges, next steps, and missteps;
- Don’t let the perfect be the enemy of the good.
Below is another update, this time focused on validation results for the antibody tests.
Bottom Line: The latest validation from the National Cancer Institute yet again showed strong performance on avoiding false positives, but it did show a larger number of false negatives than previous validations. This means that the number of study participants with antibodies may be slightly higher than previously estimated.
Background on Validation
COVID-19 antibody tests are relatively new. To confirm whether the antibody tests actually work, they must undergo a process called “validation.” The process runs samples whose status is already known, some confirmed to contain COVID-19 antibodies and some confirmed not to, through the test to see whether it performs accurately.
The validation process records two figures: specificity and sensitivity.
Sensitivity is a measure of how accurate a test is at confirming positive results among persons who have COVID-19 antibodies. A sensitivity of 85% means a test will correctly show positive results for 85 out of 100 persons with antibodies, and it will incorrectly show negative results (false negatives) for 15 of those 100 persons. A low test sensitivity would mean the antibody study is under-counting the number of people who in fact have COVID antibodies.
Specificity is a measure of how accurate the test is at confirming negative results among persons who do not have antibodies. A specificity of 70% means a test will correctly show negative results for 70 of every 100 persons who do not have antibodies, but it will incorrectly show positive results (false positives) for the other 30 of those 100 persons. A low test specificity would mean the antibody study is over-counting the number of people who in fact have COVID antibodies.
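As a worked illustration of how those two figures are calculated, the short sketch below simply mirrors the hypothetical 85% sensitivity and 70% specificity examples above; the counts are invented and are not results from this study.

```python
# Illustrative only: hypothetical validation counts, not data from this study.
true_positives = 85    # antibody-positive samples the test correctly flagged positive
false_negatives = 15   # antibody-positive samples the test incorrectly called negative
true_negatives = 70    # antibody-negative samples the test correctly called negative
false_positives = 30   # antibody-negative samples the test incorrectly flagged positive

sensitivity = true_positives / (true_positives + false_negatives)   # 85 / 100 = 0.85
specificity = true_negatives / (true_negatives + false_positives)   # 70 / 100 = 0.70

print(f"Sensitivity: {sensitivity:.0%}")  # 85%: 15 of 100 antibody-positive people missed
print(f"Specificity: {specificity:.0%}")  # 70%: 30 of 100 antibody-negative people misflagged
```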
Validation of the Antibody Tests Used for This Study
Over the last three months, the antibody tests used for the Wake Forest Baptist Health/Atrium study have undergone two separate validations: one by Wake Forest Baptist Health and one by LabCorp.
Each of those validations showed a specificity of 100% (meaning zero false positives) and a sensitivity of 90% (meaning 10 of every 100 antibody-positive samples incorrectly came back negative).
These validation figures are excellent. Specificity is much more important for a research study like this than sensitivity — it’s better to err on the side of under-counting than over-counting.
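A quick hypothetical shows why a modest specificity problem does more damage than a modest sensitivity problem when relatively few people have antibodies. The population, prevalence, and test figures below are invented for illustration only.

```python
# Hypothetical numbers, not study data: why false positives distort a
# prevalence estimate more than false negatives when few people have antibodies.
population = 1_000
true_prevalence = 0.05                              # assume only 5% truly have antibodies
with_antibodies = population * true_prevalence      # 50 people
without_antibodies = population - with_antibodies   # 950 people

sensitivity = 0.90   # 90% of true positives detected
specificity = 0.95   # a seemingly small 5% false-positive rate

false_negatives = with_antibodies * (1 - sensitivity)       # 5 missed positives
false_positives = without_antibodies * (1 - specificity)    # 47.5 spurious positives

# The spurious positives nearly equal the real positives, roughly doubling the
# apparent antibody rate, which is why near-100% specificity matters most here.
print(f"False negatives: {false_negatives:.1f}, false positives: {false_positives:.1f}")
```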
This week, Dr. Sanders received results from a third round of external validation testing of the antibody tests used for the study thus far. The results came from Syntron, the medical company that manufactures the tests, and the validation itself was conducted by the National Cancer Institute.
The validation recorded yet another 100% specificity (zero false positives) for IgG antibodies and a 97.5% specificity for IgM antibodies. The sensitivity for IgM antibodies remained good at 93%. However, the validation showed a lower sensitivity for IgG than the previous two validations: 77%. If accurate, this means that most positive results are indeed positive, but the lower sensitivity means the study may have under-counted the number of participants who are antibody positive.
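To make the under-counting effect concrete, here is a small sketch using the standard Rogan-Gladen correction for imperfect tests. Only the 77% sensitivity and 100% specificity come from the update above; the 8% raw positive rate is an invented figure, not a study result.

```python
# Hypothetical illustration of the under-count: the observed positive rate below
# is invented; only the sensitivity and specificity come from the latest validation.
sensitivity = 0.77
specificity = 1.00
observed_positive_rate = 0.08   # suppose 8% of participants tested antibody positive

# Rogan-Gladen correction for an imperfect test:
# true prevalence = (observed rate + specificity - 1) / (sensitivity + specificity - 1)
estimated_true_rate = (observed_positive_rate + specificity - 1) / (sensitivity + specificity - 1)

print(f"Raw positive rate:          {observed_positive_rate:.1%}")   # 8.0%
print(f"Sensitivity-adjusted rate:  {estimated_true_rate:.1%}")      # about 10.4%
```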
Dr. Sanders said, “We are constantly evaluating the quality of data we are generating and we will continue to let people know the limitations of the study. We chose the best available test at the time with an emphasis on a high specificity to make sure we do not overestimate the number of people who have been infected. As we said when we started the study, we expect to continue to make changes as we learn more about the characteristics of the test and better tests become available.”
Sen. Berger said, “When the legislature first funded this study, I asked Dr. Sanders to make it the most transparent study possible. Everybody understands that this work is happening under severely challenging circumstances. That’s why transparency — being forthright about successes but also challenges — is so important to building trust. I’m confident that Dr. Sanders and his team will continue to build out this world-class research study, which has now expanded to other states with major investments from the CDC.”
Moving forward, the study will no longer use this version of the Syntron antibody tests, even though they passed two previous validations. LabCorp will also rerun some of the backup blood samples collected from participants as a standard, additional data check. Because the specificity of the antibody test is so high, any corrections from the reruns are likely to add positives (catching earlier false negatives) rather than remove them, so the reruns may show slightly more antibody positives than previously recorded.