For several years, the Partnership for Public Service and the IBM Center for The Business of Government have partnered on Conversations on Big Data, a series of discussions designed to broaden perspectives on quantitative analytics and share lessons about what worked well and what did not. As I was reviewing the transcripts of some of our “Conversations” to prepare to write this blog, it occurred to me that although data analytics techniques are often complex, some of the lessons that really make an analytics program successful are rather straightforward.
As an example, let’s look at the Conversation Lara Shane and I had recently with Malcolm Bertoni, the Assistant Commissioner for Planning at the FDA. While analytics has always been used at the Agency in the context of drug trials, the Prescription Drug User Fee Act of 1992 (PDUFA) mandated a partnership between the FDA and drug companies to fund the work required to improve the agency’s performance evaluation process. The results were successful: PDUFA has been renewed several times, and performance management analytics has expanded throughout the Agency over the last 14 years. Here are some of the ‘tips’ Malcolm shared, all of which are things we learned in kindergarten.
You must share with others. While professional data analysts are necessary to build analytic logic models and algorithms, they are not intimately familiar with the data to which those algorithms are applied. The analysts have to turn to the people who use the data on a daily basis. Malcolm pointed out that “people do feel some ownership of their data … It is kind of human nature that they’re responsible and accountable for their program, and they don’t want people getting into their business who don’t understand it.” When asked how he helps people get over that fear of sharing, Malcolm said that building good relationships helps, but that there also needs to be a “top-down authority” figure to point to. The authority figure (or teacher) has the power to reward the subject matter experts who share and to penalize those who do not.
You must learn to work with others. An important feature of most analytics programs is that the data involved is created or hosted, or both, in multiple business units across an organization. Malcolm and his team had to look at “processes or issues that cut across the organizational lines, so there wasn’t any one owner.” They had to “bring together all the people from the different programs … and start measuring it and having the quarterly discussions about it.” These discussions were also used to disseminate “best practices and different techniques for recruiting and managing the process and the cycling that goes in and out.” Malcolm and his team had to lead participants from across the FDA in sharing ideas as well as data and working together to find the best solution.
Creativity is fun and important. Malcolm says “we want to be innovative. We want to look at the new technologies and opportunities and be creative and at the same time be grounded in what’s practical. And if we can kind of get the best of both worlds, I think there’s some really exciting opportunities in this field. Easy problems get solved. It’s the complex problems, particularly in this day and age, that tend to require … different thinking perspectives.” Designing and building successful quantitative analytics programs often demands “out of the box” thinking. There is no such thing as a “bad” idea; all options should be considered, and none rejected without review.
Do not stop asking why. Most of us have had those conversations with children where every response generates another instance of the question “Why?” Children are always questioning because they have not yet learned to be embarrassed by not knowing the answer. Of course, this could be why children learn so much at an early age! This reluctance to admit ignorance must be conquered for an analytics program to succeed. The entire point of quantitative analytics is to find the answers to a specific question or set of questions. People may think they know the answers, and sometimes a “gut feeling” does represent knowledge, but those gut-feeling answers must be backed up by findings and results that tie back to metrics and numbers if they are to be accepted by an organization. Another area where it is critical to keep questioning is data quality. Malcolm compares his analytics team to a “quality assurance function that independently looks at data. But I think the most important thing to improve quality is to use the data … not just reporting it, but actually having conversations with the managers and asking questions, gee, I noticed there seems to be something jumping up here; why is that? A lot of times, they’ll know the answer, and other times, they’ll say, thanks for asking; we went and looked into it, and we found an anomaly, or there was a problem with the data. So engaging in that dialogue – and you know, that’s really the way to assure good quality data.”
Connect the Dots. Many of us will remember doing “connect the dots” exercises as children. Malcolm warns that it is not enough to know the beginning questions and the end results of an analytics program. There are lots of “dots” in between, such as processes and integration points, that have to be thoroughly understood, or the integrity of the results may be questionable. He said, “I’m looking for these intermediate impacts that still have a pretty direct line of causation.” Each of these intermediate impacts must be identified, defined, and assessed, and then connected to the next to understand the overall picture.
Keep on Learning. In the end, Malcolm says that his mission at the FDA is “really about being a learning organization and a very intentional goal-seeking organization that’s trying to do the best for the public health with the resources and tools that you have.” The success of quantitative analytics at the FDA came from bringing together a group of people with varying skills and knowledge to share resources and work toward a common goal: being creative, asking questions, discovering the dots and connecting them in order to learn as much as they can, and, hopefully, enjoying the experience enough to want to tackle the next set of questions.
For the complete Conversations on Big Data blog series, please click here.