Challenges for NGS Diagnostics

NGS technology and its utilization have advanced rapidly, including clinical diagnostic applications in areas such as noninvasive prenatal testing (NIPT) and oncology. However, significant challenges remain for routine use of the technology in diagnostic testing labs. A paper published earlier this year in Applied & Translational Genomics, “Barriers to Clinical Adoption of Next Generation Sequencing: Perspectives of a Policy Delphi Panel,” by Donna A. Messner, PhD, et al., classifies the policy barriers into three domains: IP, coverage and reimbursement, and FDA regulation. As stakeholders shaping NGS diagnostics policy development and implementation, NGS instrument companies play an important role in this process through technology development, application investments and dialog with other stakeholders, such as payers, scientific organizations and regulators. In this article, IBO examines two major barriers, payer reimbursement and proprietary databases, and how instrument vendors are addressing them.

The Applied & Translational Genomics article is based on a modified policy Delphi study, an iterative process involving expert surveys and interviews as well as discussion groups. Round 1 included 48 experts, with the number decreasing in subsequent rounds. In Rounds 1 and 2, panelists evaluated a list of policy challenges; Rounds 3 and 4 focused on potential solutions.

Of the 19 challenges evaluated in Round 1, 10 related to coverage and reimbursement. Private payers’ reimbursement of NGS diagnostic tests varies by payer and test type. Dr. Messner, senior vice president of the Center for Medical Technology Policy, told IBO that coverage of NGS-based NIPT is widespread. However, reimbursement for oncology tests, the largest potential market for NGS diagnostics, varies widely but is improving, according to Dr. Messner. “We’re seeing a transition now because we are starting to get more evidence. Groups like Foundation Medicine have been working hard to try to do some of the studies and generate some of the evidence that payers would like to see.” Foundation Medicine provides large-panel NGS-based tests for tumor profiling.

Foundation Medicine is among the NGS test providers that have received reimbursement for their tests from certain private payers. Private payers that have announced limited reimbursement for specific tests include UnitedHealthcare and Aetna. However, as Dr. Messner emphasized, reimbursement is still restricted to particular circumstances. As an example, she noted, “they are writing very specific limitations around it, so they’re saying ‘for extenuating circumstances, like tumor of unknown primary site or advanced stage non-small cell lung cancer.’ There are certain conditions under which you’ve already exhausted other alternatives.”

More likely to reimburse for NGS tests are health system providers, such as Geisinger and Intermountain Healthcare. “The health systems that are integrated and have their own coverage mechanism, their own payment mechanisms, they’re not waiting around,” noted Dr. Messner. However, as she explained, Medicare has provided little reimbursement, although local Medicare contractors set their own policies. These contractors often rely on Medicare administrative contractor Palmetto GBA’s MolDx program, which provides test assessment and evaluation of clinical utility for molecular diagnostic tests, including NGS testing. In addition, Dr. Messner said that local Medicare contractors consult the Blue Cross/Blue Shield TEC [Technology Evaluation Center].

As Illumina President and CEO Francis A. deSouza stated in the company’s second quarter conference call regarding the NGS oncology market, “A lot of the growth we’re seeing today is not driven by any reimbursement, as you know. So it’s either a self-pay market, or it’s the large cancer centers that are offering these genomics tests as part of their strategy of differentiating from other cancer centers.” He added, “We’re still in the stage of the market where we’re seeing a proliferation of panels. I think at last count there were over 1,000 panels out there. And so, as we look forward, the next step in the process is to start to see some consolidation in the number of panels, which we think will accelerate the reimbursement in oncology.”

The article, “Challenges of Coverage Policy Development for Next-Generation Tumor Sequencing Panels: Experts and Payers Weigh In,” by Julia R. Trosman, PhD, et al., published earlier this year in the Journal of the National Comprehensive Cancer Network, is based on interviews conducted in 2013 with 14 experts, including 7 health plan executives. The reasons given for lack of reimbursement include the inclusion of novel targets, which leads to classification of the whole panel as investigational; “misalignment” with the single test/single result approach, including the lack of precedent for evaluating the bioinformatics component; proposed evidence methods that do not fit payers’ evidentiary standards; and concerns about the adoption and delivery of NGS diagnostics.

QIAGEN, which launched its GeneReader sequencer and associated workflow for clinical sequencing last year, is addressing cost and reimbursement challenges by making test costs more predictable. As Dr. Simone Günther, Senior Director, Content Strategy & Medical Scientific Affairs, told IBO, “Our ‘price per insight model’ and ‘one vendor’ strategy support efficient cost management.” Under the price per insight model, QIAGEN is paid for each successfully completed test result. As a single vendor, QIAGEN can provide a complete NGS workflow.

Within the domain of coverage and reimbursement, participants in Rounds 3 and 4 of the Delphi panel addressed the specific issue of payers’ differing evidence standards for assessing clinical validity. For this issue, the highest-scoring solutions were the use of multi-stakeholder consensus panels to set evidentiary standards and the development of evidentiary standards for all payers by expert panels. “Broadly speaking, payers have similar definitions of clinical utility, similar things that they are looking for,” Dr. Messner told IBO. “But when it comes down ultimately to specific decisions for specific testing products or testing protocols, they each want to reserve the right to use individual judgment in case-by-case situations.”

Although the greatest number of policy challenges in the Applied & Translational Genomics article related to coverage and reimbursement, the highest-scoring challenge in Round 2 was the ability of diagnostic companies to maintain proprietary variant databases. This refers to the databases that NGS laboratory developed test (LDT) providers compile based on internal research and testing services.

Shared variant information is used to discover new variants and to corroborate existing variant classifications. As the FDA wrote last year in a paper discussing the use of variant databases to establish the clinical relevance of human genetic variants, “A variety of internal and external resources, such as other public databases, functional data, evidence-based rules, associated clinical characteristics, population frequency, or in silico assessments, may be used to provide corroborating evidence, especially when a new, previously unseen variant is encountered.”
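To illustrate the kind of corroboration the FDA describes, the sketch below combines several hypothetical lines of evidence (public database assertions, population frequency and an in silico score) into a rough label for a variant. It is a minimal illustration only, written in Python; the field names, thresholds and labels are assumptions for this example and do not reflect the FDA’s framework or any actual database.

# Illustrative sketch only: a toy aggregation of the evidence types named in the
# FDA quote above. All fields, thresholds and labels are hypothetical.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class VariantEvidence:
    database_assertions: List[str]         # classifications reported by public databases
    population_frequency: Optional[float]  # allele frequency in a reference population, if known
    in_silico_score: float                 # pathogenicity prediction, 0 (benign) to 1 (damaging)

def corroborate(ev: VariantEvidence) -> str:
    """Return a rough label based on whether the lines of evidence converge."""
    # A variant common in the general population is unlikely to be pathogenic.
    if ev.population_frequency is not None and ev.population_frequency > 0.01:
        return "likely benign (common in population)"
    # Agreement between a database assertion and a high in silico score is treated
    # here as corroboration; real frameworks weigh many more criteria.
    if "pathogenic" in ev.database_assertions and ev.in_silico_score > 0.8:
        return "pathogenic assertion corroborated"
    if not ev.database_assertions:
        return "previously unseen variant: uncertain significance, needs curation"
    return "uncertain significance"

# Example: a rare variant with no prior database entry gets flagged for curation.
novel = VariantEvidence(database_assertions=[], population_frequency=0.00002, in_silico_score=0.9)
print(corroborate(novel))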

Proprietary databases are endorsed by test providers. But, as Dr. Messner told IBO, “A lot of companies that are newer in this space know all about the more common mutations and can readily identify those. But for some of those more unusual variants, they don’t necessarily know to look for them. They don’t necessarily know how to interpret them.”

The use of public databases of clinical variants received an endorsement in June, when the FDA released draft guidance addressing the use of public human genetic variant data to support the clinical validity of NGS-based IVDs. The Agency stated that it believes public databases can be used to provide evidence for the clinical validity of NGS tests as part of a regulatory review. The draft guidance describes a process under which databases could be voluntarily submitted to the FDA for official recognition.

Indeed, incentives may be needed to encourage the sharing of private databases. The solution to the proprietary database problem supported by more than 50% of respondents in Rounds 3 and 4 of the Delphi panel was to link data sharing and the possibility of independent verification to “approval/clearance, certification or accreditation.”

QIAGEN noted the challenge of clinical variant analysis. “In extreme cases, interpretation can take several months if each gene variant needs to be researched, evaluated and manually curated,” explained Dr. Günther. “Another challenge is that new variants or combinations of variants with unknown clinical relevance are constantly emerging.” To address such challenges, QIAGEN’s workflow solution integrates its bioinformatics expertise. “These workflows include data analysis and interpretation capabilities that are unique and developed for the clinical space.” The company is also aiming to increase data sharing. “We are also working closely together with many stakeholders to intensify the data sharing to advance translational medicine and clinical diagnostics,” he explained. “One example is the Allele Frequency Community, a landmark initiative that is creating an extensive, high-quality and ethnically diverse collection of human genomes to address a key challenge in interpreting sequencing data for research and clinical applications.”
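As a concrete illustration of why shared allele frequency data can ease the interpretation burden, the short Python sketch below triages variants by population frequency so that only rare or previously unseen variants are routed to manual curation. The variant identifiers, frequencies and cutoff are hypothetical and are not drawn from the Allele Frequency Community or any QIAGEN product.

# Illustrative sketch only: deprioritize common polymorphisms and route rare or
# previously unseen variants to manual curation. All values are hypothetical.
observed_variants = [
    ("chr17:g.41245466G>A", 0.32),    # common polymorphism
    ("chr13:g.32907420T>C", 0.0004),  # rare variant
    ("chr7:g.55249071C>T", None),     # never seen in the reference data
]

COMMON_CUTOFF = 0.01  # variants above this frequency are deprioritized

def needs_curation(frequency):
    """Rare or previously unseen variants are flagged for manual review."""
    return frequency is None or frequency < COMMON_CUTOFF

for variant_id, freq in observed_variants:
    status = "manual curation" if needs_curation(freq) else "deprioritize"
    print(f"{variant_id}: {status}")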

Both QIAGEN and Illumina provide bioinformatics products for creating databases, whether open or proprietary. In a February meeting discussing FDA oversight of clinical NGS testing, Mya Thomas, Illumina’s vice president of Regulatory Affairs, commented on proprietary databases, stating, “There’s always going to be some proprietary databases. I think with what the FDA did with the Myriad approval [in 2014, the FDA granted its first premarket approval of an LDT to Myriad’s BRACAnalysis CDx test] was to basically make sure there was a process in place for that data being curated appropriately. . . . [T]hat’s part of how we have to handle the public databases. But there are occasionally going to be products that come through that rely on proprietary databases and it seems we’ve found a way to make sure that works.”
