How the CDC’s handling of an infectious outbreak led to a Big Data epiphany

In 1998 I participated in a triathlon in Springfield. The swim portion was about a mile in Lake Springfield. It had rained heavily in the days leading up to the race, and the runoff into the lake had made the water very turbid. In the heat of the moment I did not give much thought to the swim until I got a call from the CDC a few weeks later.

I learned that some participants in the race had been diagnosed with a bacterial infection called leptospirosis. The runoff into the lake contained sewage from neighboring hog and cattle farms. Such waste is a carrier of bacteria of the genus Leptospira. Humans can be infected by contaminated water through ingestion, or through contact via broken skin or the mucous membranes around the eyes.

The CDC wanted me to submit a blood sample at the university health center to be shipped to Atlanta. I did so, and thankfully that’s the last I heard from them. I was somewhat lucky: 11% of the blood tests came back positive for leptospirosis. This 2001 journal article details the outbreak and the work done to contain the risk.

“We investigated an outbreak of leptospirosis among athletes and community residents after a triathlon was held in Springfield, Illinois. A telephone survey was conducted to collect clinical information and data on possible risk factors, community surveillance was established, and animal specimens and lake water samples were collected to determine the source of the leptospiral contamination. A total of 834 of 876 triathletes were contacted; 98 (12%) reported being ill. Serum samples obtained from 474 athletes were tested; 52 of these samples (11%) tested positive for leptospirosis.”

The linked journal article details the process by which the outbreak was identified and traced. The symptoms were first observed among three participants in Wisconsin. The Wisconsin Department of Health recognized the symptoms of leptospirosis and passed blood serum samples on to the CDC. Upon confirmation of the infection, the CDC issued media advisories to alert the 876 triathlon participants and other recreational users of the lake. The contact information for all participants was obtained from the triathlon organizers and a telephone survey was conducted. This is quite remarkable, since the athletes resided in 44 US states, Canada, and Austria. After the survey, a subset were asked to submit blood samples to their local health care providers.

All of the above is quite mind-boggling to me. This was 1998. We were not all equally plugged into the grid. Some people worked really hard to reach all of us and make sure we were OK. For this I am grateful. Thank you!

Bullet dodged, I could not help but think at the time that it was not any individual but the system that worked so well. There was a game plan that was triggered when the cases were first reported. The process did not require an exceptional act or singular heroics to succeed. That was the epiphany around process intelligence. The guiding vision is that it is neither sophisticated algorithms nor the genius analyst, but the process itself, with embedded intelligence, that drives value.

This idea has been top of mind recently amid the discussion around Big Data. Thankfully the hype is dissipating and wise minds are urging calibration of expectations. Here’s a relevant excerpt from an article on Big Data, Healthcare and the Human Lens.

Underlying much of the Big Data hype is an implicit, and dangerous belief that “feeding big data to algorithms will yield superior and actionable insight.” 

Of course, one need not have waded through raw sewage to arrive at the above conclusion, but in my case it helped. Personally, that singular incident has been a touchstone in the years since. In 1999 I framed the idea as a process model in a dissertation. However, it was only in 2009, with cloud technologies becoming affordable, that we launched process intelligence solutions targeting key business processes.

What is the intrinsic difference between this and the offerings of, say, SAS or IBM? Fundamentally, the emphasis is on the process rather than the analysis algorithms or the hardware. We standardize the business process and deploy it through a web-based interface, embedding analytics at key decision points throughout. This guides the end user in their day-to-day activities without requiring any data analysis or number crunching on their part. It’s fast, effective, and easy to use. The links below showcase process intelligence deployed in three different industries.

Underwriting process intelligence case study
Retailer integrated intelligence in campaign processes, grew revenues per contact
Warranty claims process flow with embedded analytics
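To make the idea of "intelligence embedded in the process" concrete, here is a minimal sketch in Python. All names are hypothetical and the scoring rule is a toy heuristic, not any vendor's model; the point is only that each decision point in the process applies the analytics itself, so the end user receives a routing recommendation without doing any analysis.

```python
# Hypothetical sketch: analytics embedded at a decision point in a
# standardized claims process. The model and thresholds are illustrative.
from dataclasses import dataclass


@dataclass
class Claim:
    claim_id: str
    amount: float        # claimed dollar amount
    prior_claims: int    # claimant's prior claim count


def risk_score(claim: Claim) -> float:
    """Stand-in for an embedded analytic model (a toy heuristic here)."""
    score = 0.1
    if claim.amount > 5000:
        score += 0.5                               # large claims raise risk
    score += min(claim.prior_claims * 0.15, 0.4)   # history, capped
    return min(score, 1.0)


def route(claim: Claim) -> str:
    """Decision point: the process, not the user, applies the analytics."""
    return "refer-to-specialist" if risk_score(claim) >= 0.6 else "auto-approve"


print(route(Claim("W-101", 7500.0, 2)))  # large amount + history: referred
print(route(Claim("W-102", 300.0, 0)))   # routine claim: auto-approved
```

In a deployed system the heuristic would be replaced by a fitted model, but the structure is the same: the analytics live inside the process step, and the user only sees the resulting guidance.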
