When California health officials recently announced a 10 percent annual drop in hospital central line infections, they did not mention that they had found flaws in the facilities' reporting of one of the most serious infections that patients can suffer while hospitalized.

In fact, the state's own fact-checking of records from one-fourth of hospitals statewide had uncovered a series of errors, including an overall 38 percent undercount of central line infections. In response, state officials asked hospitals to correct the data.

The mistakes highlight a major question looming in the current national rush to improve the quality of patient care: How can states accurately count infections that occur in thousands of U.S. hospitals amid a tangle of differing definitions, counting techniques and plain human error?

Never before has the accuracy of such numbers been so important. Consumers can now choose among their local hospitals using brand-new rating systems posted on the Internet that rely on the same database used to record infections in California.

The stakes are huge for hospitals, too. Under the 2010 health care reform law, a new federal program kicks off this October that ties Medicare payments to the quality of hospitals' patient care. The goal over 10 years: an estimated $50 billion in Medicare savings.

Judging hospitals fairly will hinge on the accuracy of the same data that California and its hospitals are grappling with today.

California is relatively new to reporting infections, trailing by years a number of other states that discovered similar errors and learned to fix them. The state's program is a work in progress, and its quality will improve with each annual report, say regulators and hospital officials alike.

"It's an ongoing learning process. We've always said that this is going to take time," said Jan Emerson-Shea, spokeswoman for the California Hospital Association.

The massive 2011 report that the state Department of Public Health released Aug. 9 contained data on five types of hospital-acquired infections, or HAIs, for virtually all of the nearly 400 general acute care hospitals statewide.

The report was the third produced by California, which, under a 2008 law, became the 28th state to require such reports.

To hone its work, California sought $600,000 in funds through the federal Centers for Disease Control and Prevention to validate 2011 reporting from 100 hospitals that volunteered for the project. The state sent one or two infection experts to each hospital in summer 2011 to review records and confer with the staff.

"The goal is not to go into a facility and say, 'Gotcha,'" said Dr. Arjun Srinivasan, CDC associate director for health care-associated infection prevention programs. "What we've seen is if you do this in a collaborative way, if you use it as a teaching tool, the reporting gets better."

State officials wrote in an email note last week, "This project was done to assess and assist hospitals with surveillance and reporting. Improving the quality of HAI data will also reduce HAIs."

Hospitals welcomed the validation project; more volunteered than the state had slots for, Emerson-Shea said.

"The fact is, we have hospitals volunteering to work with the state, opening themselves up, being willing to hear they're not doing it right and getting guidance on how to do it better," she said.

News of the validation study was first reported by California Watch in an Aug. 10 article.

One researcher who has worked extensively with state data said officials should have disclosed the study detailing the errors.

"If they had the data at the time they released the report, they should have had a section on the validation, say, in the technical report," said Dr. David Zingmond, an associate professor at the UCLA David Geffen School of Medicine who has done research on hospital quality and the epidemiology of health care.

A state spokesman said that the validation work was indeed made public, citing a link on a department web page under a section labeled "Information for Infection Prevention Programs."

The link leads to a slide show that state officials presented to hospital officials and others in 17 cities from May to July of this year. The 38 percent undercount of line infections can be calculated from raw numbers shown on slide 23.

"The 38 percent is a disappointing number, because they could have done better," Zingmond said after reviewing the slides.

Asked why the state reported a 10 percent decrease, Zingmond said, "I would say that there was a decrease, but quantifying it is harder due to the undercount."

He urged that the state repeat the validation work.

"You have to keep going back to see how well they're doing their reporting," he said.

Experts have estimated that thousands of patients may die of such central line infections annually, at a cost of billions of dollars. The infections affect some of a hospital's sickest patients, those who are fed or medicated through lines inserted close to the heart.

The state's 2011 validation study also turned up errors in the reporting of cases of Clostridium difficile (C. diff), vancomycin-resistant enterococci (VRE) and methicillin-resistant Staphylococcus aureus (MRSA), although the central line undercount was the largest.


The CHCF Center for Health Reporting (www.centerforhealthreporting.org) is an independent news organization. It is based at the University of Southern California's Annenberg School for Communication & Journalism and funded by the nonprofit California HealthCare Foundation.