In December 2013, a little-known viral hemorrhagic fever emerged in the forest region of Guinea, triggering what would soon become a global outbreak. Ebola virus disease (Ebola), a zoonotic disease, had jumped from animals to humans dozens of times since it was first identified in 1976. Most of what is known about the epidemiology of Ebola is derived from studies of large outbreaks in Kikwit, Democratic Republic of the Congo, and Gulu, Uganda. Yet unlike all prior outbreaks, this one would reach into densely populated capital cities, where it would mushroom into “large and explosive outbreaks,”[1] and go on to become a large-scale epidemic.
Ebola cases first moved from rural Guinea into the capital city, Conakry, and subsequently appeared in the capital cities of Monrovia, Liberia, and Freetown, Sierra Leone. Although the vast majority of cases originated in urban areas, rural areas were hit hard as well. This was particularly the case in the tri-border nexus of Guinea, Liberia, and Sierra Leone, where people frequently cross borders because of trade and family connections. These countries were hit hardest due at least in part to weak health systems and physical infrastructure that remained undeveloped after years of war and civil conflict.[2] Other countries were not immune either: the disease spread to Nigeria, Senegal, Mali, and beyond.
The government of Guinea, where the first case had emerged deep in a forested region, officially announced its outbreak in March 2014. As the outbreak escalated, in August 2014 the World Health Organization (WHO) declared the Ebola outbreak a “Public Health Emergency of International Concern” (PHEIC).[3] Despite the declaration, the international response failed to keep pace with the disease’s spread, due to multiple factors, including weak health systems, limited human resources for health, the presence of unknown chains of transmission, community resistance, unsafe burials, porous borders, and a mutation of the virus that increased infectivity and contributed to increased mortality in Liberia and Sierra Leone.[4] In September 2014, Joanne Liu, the International President of Médecins Sans Frontières (MSF), called the response “lethally inadequate” and urged the massive deployment of military and civilian teams.[5] Two weeks later, the United Nations established its first-ever emergency health mission, the UN Mission for Ebola Emergency Response (UNMEER).[6]

The fatal consequences of the relative unpreparedness of both national and international actors to respond to an outbreak of this ferocity and scale were quickly revealed. In early fall 2014, the number of people infected with Ebola continued to climb, doubling approximately every three weeks. At the height of the crisis in Liberia in late September 2014, approximately 500 people were infected with Ebola in one week, with hundreds more exposed.[7] By the end of April 2016, the toll of the outbreak reached 30,057 Ebola cases and 10,990 deaths. 

The Perils of Forecasting the Spread of the Ebola Epidemic

As international responders scrambled to cope with the Ebola outbreak, researchers worked to model the spread of the disease in order to better understand its potential scale and impact. In a September 2014 forecast, U.S. Centers for Disease Control and Prevention (CDC) researchers estimated that Sierra Leone and Liberia would have approximately 550,000 Ebola cases by January 2015, with a worst-case scenario of 1.4 million cases when corrected for underreporting.[8] A separate WHO estimate, published in a September 2014 New England Journal of Medicine article, projected a cumulative total of just over 20,000 confirmed and probable cases in Guinea, Liberia, and Sierra Leone by early November 2014.[9] The dire estimates were published around the time the UN Security Council and General Assembly discussed the Ebola outbreak, and ultimately triggered the rapid scaling of the Ebola outbreak response in the fall of 2014.[10]
What accounts for these diverging estimates? To understand the differences, it is crucial to understand the assumptions built into the models: seemingly small differences in assumptions can compound into large differences in model estimates.

The Impact of Modeling Assumptions on Outbreak Forecasts

In the early fall of 2014, those modeling the outbreak did not know whether the disease would spread in a linear or exponential fashion, or when the peak would occur. It was clear at the time that reaching zero Ebola cases in West Africa would take months. In September 2014, cases were doubling every 15.7 days in Guinea, every 23.6 days in Liberia, and every 30.2 days in Sierra Leone.[11]
Both the CDC and WHO models assumed the number of Ebola cases would increase exponentially, but other variations in the assumptions informing these two models resulted in divergent estimates. First, an important difference between the CDC and WHO models related to underreporting. The CDC forecast included a correction factor for underreporting of Ebola cases, whereas the WHO model did not. Based on an analysis from late August 2014, the CDC estimated this factor to be approximately 2.5. In other words, the CDC estimated the true case count was 2.5 times greater than the reported case count.[12] 
A second difference between the CDC and WHO models was the timeframe used for the projection. At the time (August 2014), the CDC model was the only published model to extrapolate (with and without interventions to stop Ebola’s spread) beyond December 2014. Other (non-public) estimates reached a projection similar to the upper CDC estimate of 1.4 million cases, assuming an exponential increase in cases, a downturn occurring six months after the projection’s release (estimated as February 2015), and an underreporting factor of 2.5.[13] In contrast, the WHO model forecasted only to late November/early December 2014.[14] If the WHO had extrapolated its estimate to approximately February 2015, the estimate would have been about 400,000 cases, absent further scaling up of interventions, similar to the results of the CDC model.[15]
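The back-of-the-envelope arithmetic behind these diverging headline figures can be sketched in a few lines. Everything below is illustrative rather than a reproduction of either model: the roughly 4,500-case starting point is an assumed round number for the reported caseload in mid-September 2014, while the approximately three-week doubling time and the 2.5 underreporting factor come from the figures cited above.

```python
def project_cases(current_cases, doubling_time_days, horizon_days,
                  underreporting_factor=1.0):
    """Project a cumulative caseload forward, assuming unchecked
    exponential growth with a fixed doubling time and, optionally,
    a multiplicative correction for underreported cases."""
    doublings = horizon_days / doubling_time_days
    return current_cases * 2 ** doublings * underreporting_factor

# Assumed starting point: roughly 4,500 reported cases in mid-September 2014.
reported = 4500

# A short horizon (~7 weeks, to early November) with no underreporting
# correction stays on the order of the WHO's ~20,000-case estimate.
short_horizon = project_cases(reported, doubling_time_days=21, horizon_days=49)

# A long horizon (~19 weeks, to late January) combined with the CDC's 2.5
# underreporting factor lands within the CDC's 550,000-1.4 million range.
long_horizon = project_cases(reported, doubling_time_days=21, horizon_days=133,
                             underreporting_factor=2.5)

print(f"short horizon: {short_horizon:,.0f} cases")  # roughly 23,000
print(f"long horizon:  {long_horizon:,.0f} cases")   # roughly 900,000
```

The point is less the specific numbers than the compounding: stretching the horizon by roughly three months and applying the underreporting correction together inflate the projection by more than an order of magnitude, which is why superficially similar exponential models produced such different headline figures.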

A third difference related to the impact of interventions to control and eventually stop the epidemic. The CDC model compared the number of cases that might occur with no interventions to the number that might occur with additional interventions, such as patient isolation in treatment units or safe burials. The WHO estimates assumed “no change in the control measures for the epidemic.”[16] The CDC model suggested that approximately 70% of Ebola patients must be effectively isolated, in either Ebola treatment units or the community, with safe burials when needed, to control the epidemic.[17] Behavior change practices also proved essential to the control of the epidemic.[18]

Limited Availability of Case Data for Forecasting

For epidemiologists and researchers in and outside the formal response, gaining access to case data to predict the trajectory of the disease and the eventual Ebola caseload presented both data and technical challenges. The most accurate disease models require individual case-level data about the date of exposure, diagnosis, and outcome (i.e., recovery or death) in order to understand disease behavior and transmission. These data were not always accessible, however, and available data frequently contained inconsistencies. Publicly available data did not include detailed case data for a variety of reasons, including patient privacy concerns, a lack of identifiers required to differentiate between unique Ebola cases, and, in some instances, a lack of data sharing agreements.[19] Moreover, the publicly available situation reports containing case data were aggregated, released in intervals, and retroactively corrected to reflect updated information about probable, suspected, or confirmed cases. Finally, reporting delays made the incidence of Ebola difficult to calculate accurately. Together these factors obstructed a clear picture of caseload data, a crucial input in models produced to predict the disease’s trajectory and eventual caseload.
A further aspect of the response that complicated the use of case data for disease forecasting was the publication of aggregated summaries in non-machine-readable PDF documents. As a result, caseload data had to be manually converted to Excel or .csv files to be of use to epidemiologists and others for research purposes. In one case, Caitlin Rivers, then a researcher at Virginia Tech, manually converted the data published in the daily and weekly Situation Reports (Sit Reps) from PDF into Excel spreadsheets. She posted these caseload data to GitHub, a website and tool that open-source software advocates and developers use to create, share, and re-use data and software code.[20] Until these data were publicly available, outside researchers were unable to verify or replicate existing forecasting models, nor could they easily create their own. Such peer review from outside experts could have provided additional verification and analysis to help inform interventions designed to stop the spread of the disease.[21] Despite the perils of forecasting the Ebola epidemic, modeling was an important decision-making tool in planning and implementing interventions designed to stop the spread of Ebola.
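Once transcribed into a machine-readable format, even these aggregated data required careful handling, because cumulative counts were revised retroactively and could decrease between reports. The sketch below illustrates that step with a toy snippet of sit-rep-style data; the column names and values are hypothetical, not drawn from the actual published files.

```python
import csv
import io

# Hypothetical rows in the style of a transcribed situation report:
# a reporting date and a cumulative case count for one country.
raw = """date,cumulative_cases
2014-09-07,450
2014-09-14,620
2014-09-21,590
2014-09-28,810
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# New cases per reporting interval = difference of consecutive
# cumulative counts. A negative difference signals a retroactive
# correction (e.g., suspected cases reclassified), not a real
# decline in transmission.
incidence = []
for prev, curr in zip(rows, rows[1:]):
    delta = int(curr["cumulative_cases"]) - int(prev["cumulative_cases"])
    incidence.append((curr["date"], delta))

for date, delta in incidence:
    note = "  <- retroactive correction?" if delta < 0 else ""
    print(f"{date}: {delta:+d}{note}")
```

Analysts working from such transcriptions faced exactly this ambiguity: without case-level identifiers, a drop in a cumulative count cannot be distinguished from a data correction except by consulting the narrative text of the original report.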

About This Report

As the Ebola outbreak in West Africa appeared to be spiraling out of control in September 2014, with unexplained peaks and valleys in Ebola case counts and dramatically differing forecasts of the potential spread of Ebola, then-USAID Administrator Rajiv Shah charged a group within USAID with identifying why the data picture was so unclear. The pages that follow detail the many factors that complicated the collection, management, and analysis of paper-based and digital data used in the Ebola outbreak response. These factors clouded a clear picture of caseload data as the outbreak evolved, and continued to stymie efforts to retroactively resolve data discrepancies even after the crisis phase of the response. Indeed, discrepancies in major data sources (e.g., laboratory data; databases used to track patient case data at the facility, local, or national level; national situation reports; and contact tracing lists) are unlikely to ever be fully reconciled. 

Effectively containing the Ebola virus outbreak required multiple, coordinated interventions, including case management, disease surveillance, and contact tracing, as well as community engagement and social mobilization. The timely transfer of information about each of these interventions was critical to effective coordination and communication across sectors and among the range of actors involved in managing the response and ensuring recovery. How did digital technologies help to address this problem? At the height of the outbreak, USAID (and specifically the U.S. Global Development Lab’s Center for Digital Development) received requests from partners operating in West Africa seeking to effectively integrate digital technologies into their response efforts. Those requests, and the corresponding interest in and need for guidance, prompted this work.[22]

To better understand the root causes of an unclear data picture, and the opportunity of digital technology to strengthen information flows and data-driven decision-making, the report addresses the following questions:

  • What contributed to the “fog of information”[23] that characterized much of the early stages of the Ebola outbreak response?
  • What can be learned from the use of data and digital technologies during the Ebola outbreak response? How and where were data and digital technologies effectively used in the outbreak response?
  • What should be done to strengthen the use of digital data and information flows in emergency contexts, to support long-term recovery, and to build resilience against future shocks like the recent Ebola outbreak in West Africa?

In answering these questions, the report highlights lessons and recommendations particularly for the humanitarian assistance and health-focused members of national and international organizations that respond to crises. These include:

  • responding organizations, such as UN agencies and nongovernmental organizations (NGOs)
  • donors, both bilateral and multilateral 
  • local, national, and regional government actors mounting an epidemic response in their countries

Although the authors recognize that a broader swath of actors (such as citizens, frontline workers, and remote responders) played critical roles in the Ebola outbreak response, the report focuses its recommendations--including those based on the engagement of these other actor groups--on major government and international responders.

Linking to Broader Conversations

This report builds on other reports in recent years that have tracked and advanced the practice and discourse of information management, technology, and communications in both global health and humanitarian assistance, including those that elevate the importance of communicating with affected communities in health crises and emergency response operations. As early as 2005, the World Disasters Report focused on information in disasters and called for a more people-centered disaster response.[24] The 2009 New Technologies in Emergencies and Conflicts report[25] and the follow-on 2011 report, Disaster Relief 2.0,[26] examined how technology was reshaping information flows in emergency responses, and the growing role of “digital humanitarians” (also referred to as the volunteer and technical community) in emergencies.[27] These and other reports explicitly acknowledge the explosive growth and use of mobile phones and other digital technologies globally, in both development programming and natural disasters. In alignment with the objectives of the UN Sustainable Development Goals, this movement has sought to harness the potential of mobile technology to “end isolation, amplify the voices of the disadvantaged, and … connect even the poor to information and services that enable them to improve their livelihoods and quality of life.”[28]
In the global health context, a reform movement has emerged in support of strengthened health systems.[29] More recently this has included a stronger focus on the information systems that undergird well-functioning health systems.[30] Yet, as the Ebola outbreak demonstrated, much work remains to be done. The West African Health Organization’s (WAHO) Director-General Dr. Xavier Crespin noted that the Ebola outbreak “exposed the weaknesses of national health systems in general, and health information systems in particular,” and called for strengthened mechanisms to quickly and reliably share information about epidemic-prone diseases at national and regional levels.[31] As part of this effort, a movement has emerged to unlock the potential of mobile and electronic technologies to strengthen a variety of aspects of health systems and health information systems, from disease surveillance and response to issues related to health information and service delivery. It also encourages connecting citizens, health workers, and governments in real time. When used appropriately, this integration of digital technologies can lead to many benefits, including increased effectiveness of health services, and the expansion of health worker participation in community disease surveillance.[32]
The recommendations and lessons drawn from this report’s findings identify both opportunities for and challenges to restructuring information flows in emergencies, and suggest how, over the long term, digitized data and information flows could become more dynamic and less hierarchical and could support greater resilience in the face of future disease outbreaks, natural disasters, or other emergencies. They buttress calls for reform tied to the May 2016 World Humanitarian Summit, which highlighted the importance of local actors in disasters and emergency response[33] and championed a system that must become more flexible and adaptable.[34] The “Grand Bargain,” which emerged from the Summit, represents a step toward these reforms and includes a commitment from donors to provide more flexible funding to local organizations and to publish transparent data about humanitarian funding.[35] Digital technologies can support these aims.


Methodology

The research adopted a mixed-method, qualitative approach, consisting primarily of semi-structured interviews, case study analysis, and a review of related literature and other lessons-learned reports and initiatives, with particular attention to those touching upon data, information flows, or digital systems.[36]
The authors conducted semi-structured interviews with more than 130 individuals between November 2014 and February 2016. Although not a representative sample of responders, interviewees included individuals from:

  • national-level actors (including those within national ministries of health and the coordination bodies in the three most-affected countries)
  • NGOs and international organizations (e.g., the International Federation of Red Cross and Red Crescent Societies, the West African Health Organization), the UN system (e.g., UNMEER, OCHA, WHO, UNICEF)
  • the digital humanitarians (e.g., Humanitarian OpenStreetMap, Digital Humanitarian Network)
  • private sector actors (e.g., GSMA, sQuid)
  • various USAID Bureaus involved in the response (Office of U.S. Foreign Disaster Assistance, Global Health and Africa Bureaus, the U.S. Global Development Lab, the Ebola Secretariat/Africa Ebola Unit) and other U.S. Government responders (Department of State, Department of Defense, and the U.S. Centers for Disease Control and Prevention)

Most of these interviews took place in person, in Washington, DC and New York; Accra, Ghana; Geneva, Switzerland; and during field visits to Conakry, Guinea, Freetown, Sierra Leone, and Monrovia, Liberia. Other interviews were conducted over the phone or by Skype. A list of those formally interviewed and their organizational affiliations is available in the Acknowledgements. Interview questions focused on documenting experiences, examples, and challenges in managing data and information flows during the response, and understanding projects and activities that involved data or digital technologies to support Ebola response efforts. Interviews were coded and analyzed thematically and are referenced throughout the report.[37]
In addition to the perspectives and insights gathered through formal interviews, field research, and a literature review, this report was informed by the authors’ work at USAID. From September 2014 to August 2016, Larissa Fast was an American Association for the Advancement of Science (AAAS) Policy Fellow with the Center for Digital Development; Adele Waugaman is Senior Advisor for Digital Health in the USAID Bureau for Global Health. The authors drew upon observations and direct participation in select USAID planning meetings and reporting, as well as access to and participation in a variety of U.S. government interagency and outside events related to the Ebola response.


Limitations

The research is limited in several respects. First, although the report captures a variety of aspects of the response over time, it is not comprehensive in its rendering of the various phases or aspects of the response. The response period covered in this research includes three key phases: (1) the initial declaration of the outbreak in March 2014 to the declaration of a PHEIC in August 2014; (2) the period of rapid transmission and dramatic increases in case numbers (summer to late fall 2014); and (3) the decline of cases beginning in 2015 and ending in 2016. Interviews for this report, which began in late 2014 and continued into early 2016, captured respondents’ perceptions at various points in time, including their recollections of past events. On the one hand, the duration of the research enabled an investigation of issues related to the transition from emergency response to longer-term recovery and resilience. On the other, it presented challenges in terms of retrospective recollections of events and shifting contexts and perspectives over time.
Second, USAID aims to foster data-informed and data-driven decisions that are adaptive, transparent, responsible, and responsive to and inclusive of the needs of populations and decision-makers at all levels. This perspective has shaped the report and its findings. The research aimed to capture examples of programming that used data or information with the goal of enabling a more agile and flexible approach. It also sought to learn from examples of the use of digital technologies to shorten the timeframe between the collection, sharing, analysis, and subsequent use of data of all types, or to facilitate a more inclusive ecosystem by lowering barriers to regular communication with multiple actors, which in this context includes citizens, frontline health workers, and other key stakeholders who were central to the Ebola outbreak and response.

Given this approach, the study documents the opportunities and challenges of using data and digital technologies, rather than analyzing the underlying conditions that either supported or deterred their use in the first place. The report analyzes nine case studies as well as learnings and observations on this topic from more than 130 members of the response community. It does not provide a comprehensive assessment of the degree to which data, information, or digital technologies enabled a more effective response across the three affected countries. Nor does the study capture all of the innovative examples of information flows or of the collection and use of data or digital technologies in the Ebola response. Instead, we present a series of illustrative examples rather than a representative or complete catalog of the many uses of data and digital technologies in the response.[38] The examples presented here capture persistent challenges and opportunities that surfaced repeatedly in interviews and across the case studies this report examines. Wherever possible, the authors have verified information and case studies with interviewees.
Finally, the study is primarily interview- and participant-observation-based. It is not designed to provide a detailed, field-based investigation of digitized data and information flows. Although the study did include field research in West Africa (Freetown, Conakry, and Monrovia), it was not possible to adequately capture the perspectives of those outside the formal response efforts or the perceptions of communities affected by Ebola. This represents a significant gap in the research.

We use the term “formal response” to refer to the response organized and implemented by government, international, and NGO actors. We use the term “informal response” to refer to efforts undertaken by local communities, the volunteer and technical community, and other actors.


[1] World Health Organization, “A fast-moving Ebola epidemic full of tragic surprises,” press release, Ebola at 6 months.
[2] Marc DuBois and others, The Ebola Response in West Africa. See also World Health Organization, “Final Report of the Ebola Interim Assessment Panel,” prepared by a panel of independent experts (July 2015), and David P. Fidler, “Ebola Report Misses Mark on International Health Regulations,” Chatham House, The Royal Institute of International Affairs (July 17, 2015).
[3] WHO Statement, “Statement on the 1st Meeting of the IHR Emergency Committee on the 2014 Ebola Outbreak in West Africa,” World Health Organization, August 8, 2014.
[4] W. Diehl et al., “Ebola Virus Glycoprotein with Increased Infectivity Dominated the 2013-2016 Epidemic,” Cell 164, no. 4 (2016): 1088-1098.
[5] James Gallagher, “Ebola Response Lethally Inadequate, says MSF,” BBC News website, September 2, 2014, accessed May 18, 2016. See also Doctors Without Borders/Médecins Sans Frontières (MSF), "Ebola: International Response Slow and Uneven,” MSF USA press release, December 2, 2014, accessed March 24, 2016.

[6] United Nations, “UN Mission for Ebola Emergency Response (UNMEER),” Global Ebola Response (accessed May 18, 2016).
[7] According to WHO data from December 2014, the peaks occurred in Liberia (509 cases), Sierra Leone (748 cases), and Guinea (292 cases) at epidemiologic weeks 38 (September 14-20), 46 (November 9-15), and 41 (October 5-11), respectively. See CDC Morbidity and Mortality Weekly Report (MMWR), “Update: Ebola Virus Disease Epidemic — West Africa, December 2014,” Centers for Disease Control and Prevention, 2014, accessed May 18, 2016. For more on the complexities of data, see Table 2 and the discussion about case data.
[8] Meltzer et al., “Estimating the Future Number of Cases in the Ebola Epidemic,” CDC Morbidity and Mortality Weekly Report (MMWR) 63, no. 3 (September 26, 2014), supplement: 3. See also Meltzer et al., “Modeling in Real Time During the Ebola Response,” CDC Morbidity and Mortality Weekly Report (MMWR) 63, no. 3 (July 2016): S85-S89, and Thomas R. Frieden and Inger K. Damon, “Ebola in West Africa – CDC’s Role in Epidemic Detection, Control, and Prevention,” Emerging Infectious Diseases 21, no. 11 (November 2015): 1897-1905.
[9] WHO Ebola Response Team, “Ebola Virus Disease in West Africa: The First 9 Months of the Epidemic and Forward Projections,” New England Journal of Medicine 371, no. 16 (October 2014): 1481-1495.
[10] See also Norimitsu Onishi, "Empty Ebola Clinics in Liberia Are Seen as Misstep in U.S. Relief Effort," New York Times, sec.1 (April 11, 2015); Makiko Kitamura and Elise Zoker, "U.S. Ebola Clinics in Liberia to Open with Few Patients," Bloomberg Business, December 18, 2014; Laurie Garrett, “Ebola’s Lessons: How the WHO Mishandled the Crisis,” Foreign Affairs Magazine, August 18, 2015.
[11] WHO Ebola Response Team, “Ebola Virus Disease in West Africa: The First 9 Months of the Epidemic and Forward Projections,” New England Journal of Medicine 371, no. 16 (2014): 1481-1495.
[12] Meltzer et al., “Estimating the Future Number of Cases in the Ebola Epidemic,” CDC Morbidity and Mortality Weekly Report (MMWR) 63, no. 3 (2014): 2. The authors calculated the underreporting factor by comparing a predicted number of beds in use to the actual number of beds in use at the end of August 2014, when Ebola cases were increasing exponentially. This provided a range of 550,000 to 1.4 million cases.
[13] Interview with USG official, June 2015.
[14] WHO Ebola Response Team, “Ebola Virus Disease in West Africa: The First 9 Months of the Epidemic and Forward Projections,” New England Journal of Medicine 371, no. 16 (2014): 1481-1495.
[15] Email correspondence with CDC officials, August 2016.
[16] WHO Ebola Response Team, “Ebola Virus Disease in West Africa: The First 9 Months of the Epidemic and Forward Projections,” New England Journal of Medicine 371, no. 16 (2014): 1481-1495. Actual cases from November 5, 2014 were below the WHO estimates. See the WHO Situation Report from November 5 (accessed 8 September 2014).
[17] Meltzer et al., “Estimating the Future Number of Cases in the Ebola Epidemic,” CDC Morbidity and Mortality Weekly Report (MMWR) 63, no. 3 (2014): 2.
[18] On behavior change, see Mary Engel, “A Warning Heeded Yields Good News on Ebola,” Fred Hutch News Service, January 9, 2015; Sabin Russell, “Tracking the Rise and Fall of Ebola in Sierra Leone,” Fred Hutch News Service, March 28, 2016; and Li-Qun Fang et al., “Transmission Dynamics of Ebola Virus Disease and Intervention Effectiveness in Sierra Leone,” PNAS 113, no. 16 (2016): 4488-4493, doi: 10.1073/pnas.1518587113.
[19] Interviews with national and international responders, June 2015. The lack of data sharing agreements presented as an issue most prominently at the beginning of the emergency, and was not universal. For example, although not initially in place, the U.S. CDC quickly implemented data sharing agreements with each of the national governments and with the WHO (email correspondence with CDC officials, August 2016).
[20] Interview with Caitlin Rivers, June 2015.
[21] Thanks to Parviez Hosseini and Beth Linas, as well as Martin Meltzer, Leah Fischer, and Scott Santibanez for their comments on this section.
[22] This work builds on two prior publications commissioned by the U.S. Global Development Lab. The first is an October 2014 assessment of the reach and capacity of information and communication technology (ICT) infrastructure in Liberia, including the use of software systems to support various aspects of the response. The assessment details how, as the Ebola outbreak progressed, mobile network capacity was strained by dropping revenues, rising operating costs, and skyrocketing demand from the international response. See NetHope, GBI, and USAID, “Information and Communications Technology Response to the Ebola Crisis: Desk Review and Recommendations for Private Sector Engagement” (Washington, DC: USAID; NetHope/GBI, 2014). The second is a July 2015 report assessing the capacity of digital infrastructure in each of the three countries hardest hit by the Ebola crisis to leverage digital systems for the response. See Gobee Group, “Regional, Real-Time Data Infrastructure for the Ebola Response: An Assessment of On-the-Ground Data Systems and Realistic Opportunities for Transformation in Guinea, Liberia, and Sierra Leone,” July 2015. See also a blog post by Eric King, “Fighting Ebola with Information,” USAID Impact Blog, May 28, 2015.
[23] “Fog of information” is a variation of the term “fog of war,” first attributed to the Prussian military strategist Carl von Clausewitz and more recently popularized in the documentary film of that title that explored the difficulties of decision-making in the midst of conflict, when full situational awareness is often absent. We use this term, which several interviewees used, to describe the lack of timely, accurate, and accessible data, which clouded situational awareness, impeded effective decision-making, and stymied the response.
[24] International Federation of Red Cross and Red Crescent Societies, World Disasters Report: Focus on Information in Disasters (Geneva: ATAR Roto Presse, 2005).
[25] D. Coyle and P. Meier, New Technologies in Emergencies and Conflicts: The Role of Information and Social Networks (Washington, D.C. and London, UK: UN Foundation-Vodafone Foundation Partnership, 2009).

[26] Harvard Humanitarian Initiative, Disaster Relief 2.0: The Future of Information Sharing in Humanitarian Emergencies (Washington, DC, and Berkshire, UK: UN Foundation & Vodafone Foundation Technology Partnership, 2011). See also OCHA Policy and Studies Series, Humanitarianism in the Network Age, including World Humanitarian Data and Trends 2012 (Geneva: United Nations, 2012), and OCHA Policy and Studies Series, World Humanitarian Data and Trends 2014 (Geneva: United Nations, 2014).
[27] Patrick Meier, Digital Humanitarians: How Big Data is Changing the Face of Humanitarian Response (Florida: CRC Press, 2015).
[28] NetHope and United Nations Foundation, SDG ICT Playbook: From Innovation to Impact (Geneva, 2015), 26.
[29] For more information, see the 2007 WHO report “Everybody’s Business: Strengthening Health Systems to Improve Health Outcomes.”
[30] For more information, see the June 2015 Wilton Park report, “(Re)Building health systems in West Africa: what role for ICTs and mobile technologies?”
[31] WAHO, “Health Systems that Can Talk to Each Other Respond Better in Emergencies,” May 20, 2015, accessed September 6, 2016. See also Marc DuBois and others, The Ebola Response in West Africa: Exposing the Politics and Culture of International Aid, HPG Working Paper (London, 2015); and Simon Wright, Hanna, and Mailfert, A Wake-up Call: Lessons from Ebola for the World’s Health Systems (London: Save the Children UK, 2015), vii, accessed June 24, 2016.
[32] See United Nations, Protecting Humanity from Future Health Crises: Report of the High-level Panel on the Global Response to Health Crises, advance unedited copy, 2016, 51, accessed March 23, 2016; and WAHO, Annual Joint Meeting of NHIS and IDSR Managers with Technical and Financial Partners in the ECOWAS Region, General Report, 2015, 7.
[33] United Nations Office for the Coordination of Humanitarian Affairs (OCHA) and CDA Collaborative Learning Projects, Leaving No One Behind: Humanitarian Effectiveness in the Age of the Sustainable Development Goals (Geneva: United Nations, 2016), 7. The report highlights the need to “enable and empower national actors and institutions, not to substitute for them.” See also International Federation of Red Cross and Red Crescent Societies, World Disasters Report: Focus on Local Actors, the Key to Humanitarian Effectiveness (Geneva, 2015); and World Bank, World Development Report 2015: Mind, Society, and Behavior (Washington, DC: World Bank, 2015), doi:10.1596/978-1-4648-0342-0.
[34] ALNAP, The State of the Humanitarian System, ALNAP Study (London: ALNAP/ODI, 2015), VII, 112.
[35] See World Humanitarian Summit, “The Grand Bargain - A Shared Commitment to Better Serve People in Need” (Istanbul, 2016).
[36] The following studies provide a more in-depth assessment of various aspects of the international Ebola response efforts. See World Health Organization, Final Report of the Ebola Interim Assessment Panel, prepared by a panel of independent experts (July 2015); Inger Damon, MD, PhD, 2014 Ebola Outbreak Overview & Lessons Learned, CDC, presented at the 2015 USPHS Symposium, May 20, 2015; Marc DuBois and others, The Ebola Response in West Africa; United Nations, Protecting Humanity from Future Health Crises: Report of the High-level Panel on the Global Response to Health Crises, advance unedited copy, 2016, accessed March 23, 2016; World Health Organization, Sixty-eighth World Health Assembly, Ebola Interim Assessment Panel: Report by the Secretariat, in pursuance of resolution EBSS3.R1, A68/25, 2015; and World Health Organization, WHO Secretariat Response to the Report of the Ebola Interim Assessment Panel, August 2015. References to these and related literature appear throughout the report.
[37] Unless necessary to name the individual(s) interviewed for context or other reasons, we refer to all interviewees by type or category in order to respect the confidential nature of the interviews. The interviews were coded and analyzed using the qualitative data analysis software MaxQDA.
[38] For example, in this report we only briefly mention supply chain and laboratory data, and do not provide details about many of the innovations created during the response. These include Google’s development, with MSF, of a sanitizable tablet for Ebola treatment center medical records, and the USAID Ebola Grand Challenge for Development, which resulted in new innovations to improve point of care, such as the “Smart Band-Aid” and the DripAssist technology. Nor do we address data issues related to vaccine or clinical trials and research during or after the response. See Médecins Sans Frontières, “Ebola: MSF and Google develop ‘Ebola-proof’ tablets,” Stories from the Frontline, March 23, 2015, accessed May 18, 2016; USAID, “United States Announces Results of Grand Challenge to Fight Ebola,” press release, December 12, 2014, accessed September 5, 2016; Lance Ulanoff, “This smart ‘band-aid’ could help the world beat Ebola,” Mashable, March 14, 2015, accessed September 21, 2016; and USAID, “DripAssist: Shift Labs,” Ebola Grand Challenge, accessed September 21, 2016. See also Phil Sneiderman, “Johns Hopkins, DuPont Join Forces to Produce Improved Ebola Protection Suit,” Johns Hopkins Magazine, September 28, 2015, accessed September 1, 2016.