Category Archives: cost reduction

EHRs Support FDA Accelerated Approval

There are provisions for accelerated approval by the Food and Drug Administration (FDA) of new drugs for patients with serious illnesses. As an example, Genentech’s Avastin for metastatic breast cancer received accelerated approval on February 22, 2008. On June 29, 2011, the FDA’s Oncologic Drugs Advisory Committee voted 6 to 0 to recommend that the approval be withdrawn. As of this posting, the FDA has not announced a decision.

The approval was based on standard research for this type of medicine and showed sufficient benefit that accelerated approval was granted. Subsequent similar studies raised questions about effectiveness and risks that led to the recommendation for withdrawal.

Standard research methods have been tested and generally provide the information required for FDA action. However, samples are small—the largest sample in this series of tests was fewer than 800. Most of the critical measures are in the low single-digit percentages, which means that errors induced by small sample sizes can be significant.
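To see how much small samples matter at these event rates, here is a rough back-of-the-envelope sketch. The 3% event rate and the sample sizes are illustrative, not figures from the Avastin trials; the calculation is the standard normal approximation for a proportion’s 95% confidence interval:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of a normal-approximation 95% CI for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

p = 0.03  # a "low single digit" event rate, e.g., 3%
for n in (800, 10_000):
    m = margin_of_error(p, n)
    print(f"n={n:>6}: {p:.1%} +/- {m:.2%}")
```

At n = 800 the uncertainty is more than a full percentage point, roughly a third of the measured rate itself; at n = 10,000 it shrinks to about a tenth of the rate, which is the kind of precision a large EHR-derived sample could offer.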

Recent advances in electronic healthcare records (EHRs) are creating a new parallel path that can provide much larger samples to assist in the assessment of traditional research. What EHRs lack in precision, they make up in breadth in both the research target group and control groups. This greater breadth allows for analysis of factors that may contribute to adverse events or produce higher rates of effectiveness among sub-groups in the target population.

There are, according to material presented at the hearings, 45,000 patients diagnosed with HER2-negative metastatic breast cancer every year. All of these patients have healthcare records, although most of them are probably not yet in electronic format. Reasonable projections based on recent history and the continued efforts by the US government to encourage the use of electronic records suggest that EHR sample sizes for a population of this size could be well over 1,000 records in the immediate future, and significantly greater in the foreseeable future.

Research based on EHRs is relatively inexpensive compared to traditional research because the cost of data collection has already been covered. There are additional advantages as outlined in EHRs Meet Pharma’s Need for Longitudinal Research. I am not suggesting replacing traditional research with EHR based research, but rather, doing both in parallel. A model of such a testing program might look like this:

During Drug Development: Use EHR based research to assist in defining the scope and nature of the potential market for a new drug as soon as there is high probability that a request for FDA accelerated approval will be made. Begin to build a control group that includes all potential variables including current and prior medications used for the target disease. Maximize the size of the control group to minimize sampling error there. Include relevant findings from this analysis as part of the request for accelerated approval and agree to continue the research after approval with the addition of patients for whom the product is prescribed.

At Launch: Begin weekly tracking for market research to gain insights about who is prescribing the new drug and for which patients. Also track any adverse events to manage risk.

Three To Six Months Out: Reduce the frequency of reports to monthly or quarterly depending on market research needs and medical research requirements. Look for clusters of adverse results and clusters of effectiveness. Expand the control group with sub-groups to maintain sampling validity as smaller clusters are explored in depth. Explore options to reduce adverse effects including notices. Explore opportunities to focus marketing aimed at other clusters to improve overall performance in terms of satisfying FDA requirements and improving market penetration.

Map the EHR results to those from traditional research to build credibility for the smaller sample sizes of the traditional research or to explain any differences and their significance for FDA review and approval.

After FDA Approval: Continue the EHR research to manage risks associated with adverse events, continue to develop the market, and find new opportunities for new or improved products.

The rapid evolution and implementation of Web based, fully networked, database driven EHRs like Practice Fusion are creating a new set of tools to facilitate accelerated release and post-release research leading to final approval.




Cloud Based EMRs: Better Post-FDA-Approval Research

A recently closed longitudinal study of a medication to boost “good” HDL cholesterol concluded: “… that the HDL-boosting drug niacin failed to cut the risk for heart attacks and strokes.” The study was designed to track patients for 4 to 6 years but was terminated 18 months early based on the results to that point. The cost: $52.7 million or $15,500 per person for the 3,400 study participants. Fully networked, cloud based, electronic medical records (EMR) appear to offer a better solution.

The topic of this post is research about medications that have received FDA approval and are being prescribed for general use. The subject of such a study could be a new medication being tracked for risk-management purposes among patients who were not fully represented in the limited sample used to obtain FDA approval. It could also be an established medication where adverse events suggest that more needs to be learned, or where there is reason to believe it is not effective enough to justify continued sale. In this case, it was a matter of both effectiveness and risk.

The key to a better solution is the evolution of fully networked, cloud based EHRs that create a database that is large enough to provide meaningful statistics about specific occurrences. As an example, Practice Fusion now hosts electronic records created by 90,000 medical providers in a single database that has more than 12 million patient records and is growing. The data is being collected as part of physicians’ normal practice—the electronic version of historically hand written notes.

There are operational advantages:

• Data collection is conducted to serve the day-to-day needs of the physician and their staff so they have established procedures and a vested interest in quality.
• The use of the data is totally independent of its collection so there is no bias in the data collection process or the data; neither the doctor nor the patients are even aware of how the data may be used: a totally blind process.
• Separation of data collection and use removes any presumption of undue influence by the sponsor of the study.
• Data is uploaded by physicians daily so it can be made available in near real-time for use at checkpoints in the study.
• If an area of particular interest is discovered, e.g., women over 60 who are more than 20 pounds overweight, additional participants with those characteristics can be identified and added to provide a larger, more reliable sample of that group.
• The study can provide information about risks and effectiveness, increased levels of HDL, and continued use, i.e., prescription renewal.
• In many cases, patient history related to their disease including prior medications is available and information will be available about patients who stop taking the medication or change to a different medication.
• The same database can be used to create a control group of patients that are not taking the medication and have essentially the same medical conditions and demographics as those who are; if a member of the control group begins taking the medication they can be moved to the study group and replaced in the control group—there is no need to deny patients the opportunity to take the medication just to protect the integrity of the data.
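The last point above, moving a control patient into the study group when they begin the medication and backfilling the control group from a matched pool, can be sketched in a few lines. Everything here (the patient IDs, the candidate pool) is hypothetical bookkeeping for illustration, not any vendor’s actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Cohorts:
    """Toy model of the study/control group bookkeeping described above."""
    study: set = field(default_factory=set)
    control: set = field(default_factory=set)
    pool: list = field(default_factory=list)  # matched candidates not yet enrolled

    def start_medication(self, patient_id):
        """A control patient begins the medication: move them to the
        study group and backfill the control group from the pool."""
        if patient_id in self.control:
            self.control.remove(patient_id)
            self.study.add(patient_id)
            if self.pool:
                self.control.add(self.pool.pop(0))

c = Cohorts(study={"p1"}, control={"p2", "p3"}, pool=["p4"])
c.start_medication("p2")
print(sorted(c.study), sorted(c.control))  # ['p1', 'p2'] ['p3', 'p4']
```

The point of the design is visible in the last line: the control group stays at full strength without ever denying a patient the medication.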

The largest benefit is that there is no marginal cost for data collection. The data is already being collected. There are, of course, charges for extraction of the specific data required for the study, for HIPAA compliant de-identification of the data, and for the use of the data. The study costs cited in the introduction, $15,500 per participant, include costs in addition to data collection, but data collection is a major part of that cost and could be dramatically reduced through the use of fully networked, cloud based, electronic health records.

Better data, for the reasons noted above, at lower cost translates to better healthcare at lower cost.


Prediction: Pharmaceutical Litigation

Pharmaceutical litigation is an arena where huge lawsuits are common. I predict that this will change, and the change will reduce the damage to the bodies and lives of patients, reduce the cost of healthcare, and speed up the process of moving new medications from the laboratory to patients. What will cause that change?

Historically, the pharmaceutical manufacturers have been responsible for the collection, analysis and distribution of data related to benefits and side effects of new medications. Data collection has been an extra cost of doing business. Like all costs, there is pressure to minimize these costs. If, and when, publicly available data including anecdotal evidence begins to point at a serious problem, the plaintiff’s bar also begins to develop data about risks and damage.

The cost of data collection using today’s methods imposes economic limits on the amount of data gathered by manufacturers. Cost control argues for the smallest amount of data that will likely be required, but how much is that? The smaller the data sample the less likely it is that a risk will be identified early in the life of a new medication. If a risk is limited to just some patients, a small total sample makes it unlikely that a risk to a sub-set will be identified, e.g., males over 70. If the total sample is statistically small, a sub-set will be even smaller and less reliable as an indicator of both the need for action and as a guide to appropriate action.

The push by the federal government for healthcare providers to shift from paper records to electronic healthcare records (EHRs) has been seen as a potentially slow process. The common model assumes EHRs will be implemented first in hospitals and then spread to private practices because of two factors: first, the cost and complexity of the required computer systems, and second, the lack of the standards required to move meaningful data from system-to-system for consolidation and analysis.

The maturation of the Internet in terms of processing and security now means that data can be safely moved and stored. Cloud computing—remote data processing and storage—now allows service providers to grow new services rapidly without large up-front investments. The net result is a dramatic reduction in cost for the service provider and users, e.g. physicians. As an example, one company, Practice Fusion, provides a full function EHR to private practices totally free. Step one in the changes that are occurring is unexpected reductions in cost.

Hospitals generally have short term relationships with patients whereas private practices generally have longer term relationships. Private practices monitor more patients over longer periods of time than hospitals. Step two is a shift from hospitals first to private practices first for widespread implementation of EHRs which will provide better longitudinal data about the effectiveness and risks of new medications.

The original model for EHRs envisioned small libraries of patient data sitting on the hard drives in the offices of thousands of doctors. Solutions for the nightmares associated with moving and consolidating all of that data are still on the drawing boards. Rather than “stand-alone” systems, the Internet and cloud computing allow the use of a common set of databases and a single set of standards. As an example, Practice Fusion now has seven million patient records from 70,000 medical professionals in their database. That is nearly 10% of all of the doctors in the US and growing rapidly. The data from thousands of doctors is being monitored as it is received. The data is all in a common format and is available for analysis and reporting in near-real-time. Step three is the availability of large databases with one set of standards that dramatically reduce the time and cost required to convert data to useful information.

The data that was formerly collected by the manufacturers as an additional cost of doing business is now being collected as a routine part of patients’ visits to their healthcare providers. There is little or no extra cost to track a new medication. The data is being collected by nurses and doctors trained in diagnosis and documentation as a normal part of their medical practice. Step four is further lowering of costs; step five is better quality data.

The availability of more, better, and cheaper data offers opportunities to reduce the risks associated with new medications and may allow regulatory agencies to approve new products with less research, subject to close tracking and third-party analysis of results. Lower research costs and earlier market entry could represent significant cost savings for new medications.

As a new medication enters the market, feedback about its effectiveness and risks will begin to flow very shortly thereafter. Manufacturers who choose to will be able to measure and document effectiveness and will also be able to identify risks. As the patient population using the medication grows, the sample being tracked will also grow. This means that as risks become significantly large—however significance is determined—the database will be growing to support both risk assessment and mitigation. Manufacturers, if they choose, will have new opportunities to minimize damage by providing additional information to the medical community and patients and to develop specific responses to specific problems. Timely action will reduce compensatory damages and responsible action will reduce punitive damages. Timely responsible action will reduce damage to the brand in the marketplace.

Manufacturers who choose not to participate in the analysis and application of this data will probably find that plaintiffs’ attorneys are using the data to find opportunities for litigation. The plaintiffs’ attorneys will probably argue that the reluctance of manufacturers to make effective use of the data justifies demands for increased punitive damages.

My prediction: more, better, faster, and cheaper data will speed up the process of moving new medications from the laboratory to patients, reduce the cost of healthcare, and reduce the damage to the bodies and lives of patients who have adverse reactions to new medications. Changes in the way that healthcare data is being collected, processed and stored will reduce both compensatory and punitive damages related to pharmaceutical litigation.


EMRs as an Integral Part of Medical Research

The cost for collection and processing of data is a significant part of the budget for a typical medical research project. Use of data that is already being collected for other purposes provides opportunities to improve the quality of the available data, reduce the cost of obtaining it, and minimize the time required to get it to the analysts. Here’s where an EHR system can help.

As an example, suppose a research project wants to track the use and effectiveness of a new medicine to manage a particular illness over a period of five years. Let’s call that illness Alpha. Today, research is pretty much limited to people already diagnosed with Alpha unless the sample size is very large. With access to an EHR that has a large enough database, three types of patients can be tracked for the study.

The EHR can be used to find 1,000 patients who have the disease. They can be given the new medicine and tracked over the next five years using data from their EHR that is being collected as a routine part of visits to their doctor. Extra blood tests or other procedures may be required with a new medication. Reminders to the doctor can be included in the EHR and the results will then be tracked like any other data. The extra cost of obtaining and delivering the data will be relatively low.

A sub-project can be designed to get some of these patients to participate in additional research such as development of family histories of Alpha, or genetic testing. Recruiting patients for additional tests through their doctors will be less costly than most of today’s means of obtaining this type of data.

The EHR can be used to find 1,000 patients who have Alpha, are demographically very similar to the first set of patients and are not part of the test of the medication. Data from their EHRs can be used to provide a baseline against which to assess changes among the patients who are taking the medicine. Again, the extra cost of obtaining and delivering the routine data will be relatively low.
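A rough sketch of how such a demographically matched control group might be selected from de-identified records follows. The record fields, values, and matching rule are invented for illustration; a real study would query the EHR vendor’s database and match on many more variables:

```python
# Hypothetical, simplified record schema; real EHR queries would run
# against the vendor's database, not an in-memory list.
records = [
    {"id": 1, "dx": "Alpha", "age": 64, "sex": "F", "meds": ["newdrug"]},
    {"id": 2, "dx": "Alpha", "age": 61, "sex": "F", "meds": []},
    {"id": 3, "dx": "Alpha", "age": 35, "sex": "M", "meds": []},
    {"id": 4, "dx": "Beta",  "age": 62, "sex": "F", "meds": []},
]

def matched_controls(records, dx, age, sex, drug, age_window=5):
    """Patients with the same diagnosis and similar demographics
    who are NOT taking the study medication."""
    return [
        r for r in records
        if r["dx"] == dx
        and r["sex"] == sex
        and abs(r["age"] - age) <= age_window
        and drug not in r["meds"]
    ]

print([r["id"] for r in matched_controls(records, "Alpha", 63, "F", "newdrug")])
# -> [2]
```

Patient 1 is excluded because she is already on the medication, patient 3 for demographics, and patient 4 for diagnosis; only patient 2 qualifies as a control.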

The EHR can also be used to find newly diagnosed cases of Alpha over the course of the five years of the study. Newly diagnosed patients of the doctors with patients already in the study can be given the medicine and then tracked to see how effective it is if administered early. Newly diagnosed patients of doctors who are not in the study (and presumably are not aware of the medicine) can be found and tracked to provide a dynamic baseline for early use of the medicine. There would be no extra cost to obtain the data; it is already in patient EHRs. The cost of a wider search of the database to find these cases could be noteworthy but significantly less than any other way to build a baseline of newly diagnosed patients.

There is one other piece to a complete solution and that is access to a large enough number of electronic health records to find a limited number of cases. There are a number of organizations including the VA, Kaiser, and vendors of hospital systems that have large databases. There are also physician office systems like Practice Fusion that are database driven and can quickly draw information from millions of patient records today over a much more diverse demographic (location, age, and socioeconomic status).

A research project based on EHRs will have data collected by nurses and doctors who are trained to collect health-related data, which helps assure quality. Data can be delivered to the research team in a matter of days; the interim and final research results will be available for use substantially faster than is possible with most of today’s data collection methods. When an EHR is an integral part of a research project, the result is better data, delivered faster, at lower cost.


The Promise of EHRs for Pharmaceutical Companies

Today pharmaceutical companies conduct extensive research to obtain Food & Drug Administration (FDA) approval of a new medication. Typically this pre-approval research is structured in three phases: Phase I, a small group (20-100) of healthy volunteers to assess safety; Phase II, larger groups (100-300) to assess how well the drug works; and Phase III, randomized controlled trials on large patient groups (300-3,000 or more) to assess effectiveness vis-à-vis the current ‘gold standard’ treatment.

Phase IV trials involve the safety surveillance of a drug after it receives permission to be sold. The safety surveillance is designed to detect any rare or long-term adverse effects over a much larger patient population and longer time period than is possible during the Phase I-III clinical trials.

There are opportunities to use data from Electronic Health Records (EHR) for Phases II and III, but the larger opportunity, and the one addressed here, is Phase IV, post approval surveillance.

Theoretically it should be possible to use the growing body of data in EHRs to track patients for whom a new medicine is prescribed. However, EHR adoption is still very limited; almost all reports show adoption rates less than 10%. More important, there are more than 200 vendors providing EHRs, all of which are built using limited standards for the information to be acquired, the coding for diagnosis and treatment, and for the exchange of information.

Historically, it has proven very difficult to exchange information among multiple systems within a single organization. The generally accepted solution is an enterprise resource planning (ERP) system, which is complex, costly, and difficult to implement.

The collection and exchange of medical information is simpler in some ways and more difficult in others. The efforts of the Federal Government to develop and implement EHR standards will help but there is a great deal to do and the final definition and implementation of standards will take time—probably several years. Even with standards, there will be minor differences among systems developed independently by a large number of vendors that will limit the industry’s ability to exchange information and will raise questions about the quality of the data.

When a significant body of common data becomes available, pharmaceutical companies should be able to use that data to track a sample of patients using a new medication in “near real time,” perhaps within a week after a doctor sees a patient. This could provide four major benefits.

• First, regulatory agencies may give earlier approval subject to effective ongoing sampling and reporting of any adverse reactions—earlier to market.
• Second, any adverse reactions could be found early and evaluated to minimize damage to patients—minimize patient harm.
• Third, early warning could lead to examination of additional data from existing medical records of patients receiving the medication relatively quickly and at relatively low cost. This will further define the risk and whether or not there are patient, disease, or treatment conditions that create or reduce risk. It may be possible to eliminate a risk just by restricting use based on definable conditions—protection for both the patient and market value of the medication.
• Fourth, timely responses to an identified risk will make it difficult for plaintiff’s attorneys to successfully demand punitive damages. Compensatory damages will be smaller and punitive damages will be smaller or non-existent—reduced legal costs.

There is enough information about current and near-term EHR systems for pharmaceutical companies to develop strategies to use the more and better data those systems will offer. The time to begin developing those strategies is now.


Hospitals: reform, stimulus & codes

This month, August 2010, McKinsey has published several articles about the impact of changing regulations and other market dynamics on healthcare costs, investments, and benefits. Earlier I summarized and provided a link to one of their articles: Health Insurance: reform, stimulus & codes. That post addressed insurance companies; this one provides the same service for hospitals.

New regulations that require US health care providers to use electronic health records (EHR) and adhere to strict data-coding standards will force hospitals to spend billions of dollars over the next decade to upgrade their IT systems. The spending requirements risk squeezing hospital capital budgets already under strain from steadily rising costs. With government incentives covering only a small portion of the total, providers will be forced to quickly recover their investment dollars through operating changes.

… Our research shows that automating and standardizing health care information can bring benefits that extend beyond meeting demands for compliance. A provider that creates a best-practice IT platform to house and share medical records, to manage hospital resources more transparently, and to define precise guidelines for medically authorized tests and procedures can generate significant operating efficiencies. Such a platform minimizes paperwork, reduces the number of unnecessary treatments, and lowers the risk of drug and medical error.

The productivity and resource savings often pay back the initial IT investment within two to four years while also producing better health outcomes for patients. We estimate that total savings across the US provider landscape could be on the order of $40 billion annually. Achieving such a positive return on investment (ROI), however, requires distinctive change-management skills among hospital leaders, better governance, and sustained engagement from key clinicians.

Estimates suggest that a wave of US legislation and regulatory changes will affect up to 80 percent of the existing hospital IT applications. Among the most far-reaching of these developments are provisions, laid down by the American Recovery and Reinvestment Act (ARRA), requiring health care providers to implement IT capabilities such as electronic health records and computerized-physician-order-entry (CPOE) systems. While some providers use electronic health records on a limited basis, the new regulations standardize what is expected from them and make their use mandatory.

An accelerated timetable means that US health care providers have until the end of 2015 to make the investments or face fines starting at $2,000 a bed in the first year and up to $35,000 a bed by 2019. In addition, both revisions to the Health Insurance Portability and Accountability Act (HIPAA) 5010 and the switch to ICD-10 require providers to apply strict new data-coding standards—no small task given the number of databases, hospital systems, and clinicians affected.

…US hospitals will need to spend approximately … $80,000 to $100,000 per bed, for the required project planning, software, hardware, implementation, and training.

Our research shows that optimizing the use of labor, reducing the number of adverse drug events and duplicate tests, and instituting revenue cycle management can help typical hospitals generate savings of some $25,000 to $44,000 per bed a year. … The realization of the benefits from health care IT investments will require a radically new approach to IT on the part of the CIOs of health care providers, as well as the business leaders and clinicians those CIOs serve. Health care providers will need to use new approaches to achieve an inclusive governance process with streamlined decision-making authority, a radically simplified IT architecture, and a megaproject-management capability.
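The two-to-four-year payback claim can be checked directly from the per-bed figures quoted above: an $80,000 to $100,000 investment against $25,000 to $44,000 in annual savings.

```python
investment = (80_000, 100_000)     # per-bed IT cost range quoted above
annual_savings = (25_000, 44_000)  # per-bed annual savings range quoted above

best = investment[0] / annual_savings[1]   # cheapest build, biggest savings
worst = investment[1] / annual_savings[0]  # priciest build, smallest savings
print(f"payback: {best:.1f} to {worst:.1f} years")  # payback: 1.8 to 4.0 years
```

The best case rounds to just under two years, the worst to exactly four, consistent with the "two to four years" payback McKinsey reports.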

Many of the changes facing healthcare require new systems, new delivery processes, and new management strategies. In short: new ways of doing business. The sheer volume of work to be done and the need for new thinking suggest a need to bring in new talent to help those already in healthcare meet the wide-ranging challenges of the next few years.



EMR: Free? Really!

Free is a marketing term that typically evokes a mixed set of reactions ranging from an optimistic “You have my attention, tell me more,” to a cynical “There’s gotta be a catch,” to a pessimistic “There’s no such thing as a free lunch.” All three of these showed up when I heard about a free electronic medical records system offered by a company called Practice Fusion.

Their Web site referenced the book Free: The Future of a Radical Price, which includes an analysis of “How can healthcare software be free?” So I read the book. The basic theme is that the costs of data storage, transmission, and processing are falling so fast, on the order of 50% per year, that the costs associated with “bits” of data (as contrasted with “atoms,” or physical stuff) are heading for zero. With atoms, revenue usually has to be associated with cost. With bits, revenue can be loosely related to costs or even independent of costs.

At the level of a private or small group medical practice, the typical evolutionary path for medical records is from paper to a site-specific computer to networked systems. Most of the software being sold today is site-specific, which means the doctor has to pay the up-front costs and networks will be added on later. But only some of the value accrues to the doctor, and there is little or no broad agreement about what the networks will be or how they will be managed.

Practice Fusion sees the value in both the local data and data that is networked. Their basic premise is that by providing an EMR to a doctor the doctor’s data will be in a format consistent with the data of all of the other doctors using their system. With the appropriate consents and controls in place, the data can then be networked among subscribing doctors with full compatibility. Subscribing Doctor A can send a patient’s data to Subscribing Doctor B without translation, formatting or delay. Doctor A can send data to other doctors who do not subscribe to Practice Fusion with a similar level of ease or difficulty as using a site-specific system. Data can also be forwarded to billers and insurers.

With appropriate consents and controls in place, the data from multiple practices can be de-identified, consolidated, and shared with public health agencies and medical researchers to further increase its value at a very small increase in cost. The data can be sold at a higher price because it will be in a standard format and in larger quantities. A researcher, whether at a not-for-profit institution or a commercial company, who needs 1,000 records will be able to go to one place and quickly get records of a known quality. Practice Fusion will recover its investment and costs from advertising (optional to users) and the sale of the data.

Practice Fusion has placed itself in the enviable position of having a cost structure that is getting less expensive and a revenue stream derived from data that is becoming more valuable over time as it gains longitudinal range.

Free presents the following hypothetical model:

Medical associations conducting research on specific conditions require longitudinal health records for a large set of patients. Depending on the focus of a study (think white, middle-aged, obese males suffering from asthma) each patient’s anonymized chart could fetch anywhere from $50 to $500. A physician typically sees about 250 patients, so Practice Fusion’s first 2,000 clients translate to 500,000 records. Each chart can be sold multiple times for any number of studies being conducted by various institutions. If each chart generates $500 over time, that revenue should be greater than if Practice Fusion sold the same 2,000 practices software for a one-time fee of $50,000.
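The arithmetic behind that hypothetical model is easy to verify:

```python
practices = 2_000             # Practice Fusion's first 2,000 clients
patients_per_physician = 250  # typical panel size cited in the book

charts = practices * patients_per_physician  # total anonymized charts
data_revenue = charts * 500                  # $500 per chart over time
license_revenue = practices * 50_000         # one-time $50,000 software fee

print(charts, data_revenue, license_revenue)
# -> 500000 250000000 100000000
```

Under the book’s assumptions, selling the data ($250 million) would yield two and a half times the revenue of selling the software ($100 million), before even counting repeat sales of the same charts.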

[Practice Fusion is now reporting, “… 30,000 users across all 50 states and US territories.”]

Free is an option worth considering. Does that mean you should sign up? No.

The normal business process for selecting a system is to do a high level search and assessment to narrow the number of candidate systems for further study. The fact that free makes sense just means that a Practice Fusion system, or others like it, qualifies as a first round candidate.

The next step would typically be to prepare a cost/benefit study among the top few candidates. Because one of the systems is no-cost, the focus for the next step should be based largely on benefits.

Moving medical records from paper to a computer system provides opportunities to reduce office costs and improve both administrative and medical services to your patients. Benefits will include those directly related to the creation, storage, use, and networking of records plus those related to administration such as the non-medical part of patient records, appointments, billing interfaces, etc.

If you have already done your homework, now is the time to look at the benefits of a no-cost system. If you haven’t done your homework, check with other doctors for their reactions, both positive and negative, to the systems they use. Get vendor documentation for other systems and acquaint yourself with the benefits those systems offer. Make a list of must-haves, like-to-haves if the cost is reasonable, and nice-to-haves. Also make a list of things to avoid (negative benefits). Now go look at a no-cost option and compare the benefits.

If a pair of shoes doesn’t fit, it isn’t worth taking home even if it is free. If a system doesn’t provide the benefits you need, don’t waste time considering it further. You do not want to change your practice to save money on a system. You want a system that will allow you to get the benefits at the lowest cost in terms of dollars with no negative impact on performance.

If a no-cost system provides the benefits you need at least as well as a for-cost system you have your answer. If two systems have comparable benefits, the cost/benefit analysis will always be better for a no-cost system than one where you buy it or pay a license fee.

If a no-cost system meets your minimum requirements and a for-cost system provides better benefits, you need to judge whether the better benefits justify the cost. They may. If a free pair of shoes fits but is not your style, you will probably get more value out of a pair you like even if you have to pay for it. It is sort of the same thing with a computer system.

As a place to start, free is definitely worth considering. Be certain the vendor has a business model that makes sense. If it does, the next step is to get more information and be certain the system really meets your current and long term needs. But, that’s material for another blog post.

A footnote: On May 14, Chilmark Research, one of the healthcare blogs we follow, posted an entry titled Where’s the Beef about another company that is offering a free service. That company claims it will be “generally available” in August. “Imagine our disappointment when we clicked on the [site] to find very few concrete details as to what the platform would offer …” Free is a good place to start, but the real test is whether or not the system provides the services, protection, etc., that you need. Thanks, Chilmark, for helping us make the point that it takes more than just free to make a system attractive.
