
Lessons from designing a monitoring system for the Philippine Department of Health

We share our experience designing a monitoring system to support the Philippine Department of Health in rolling out Universal Health Care.

Philippine Department of Health officials and IDinsight team at a Theory of Change workshop. ©IDinsight

Health systems across the Philippines have historically faced challenges due to fragmentation in health financing and service delivery models. In 2018, the Universal Health Care Act was introduced to address these long-standing challenges. Over the next five years, the Philippines will implement a set of reforms leading to universal access to basic healthcare for all 100+ million Filipinos. A key step will involve changing the way that health systems are organized, by requiring greater coordination and ownership at the provincial and city level, rather than at the municipal level, as is the current practice. The objective of this national reform is to streamline health care delivery, improve access to primary care, and ultimately improve health outcomes. This presents a significant opportunity to create a robust monitoring system that can help inform the successful roll-out of Universal Health Care in the Philippines. In this post, we share what we learned supporting the Philippine Department of Health (DoH) in this endeavour.

From June to December, IDinsight was contracted by the World Health Organization to design a monitoring system for tracking progress toward provincial and city health systems integration as part of the Universal Health Care reforms. Our mandate was to work closely with DoH to design (but not build) a system that would help monitor progress in all Universal Health Care Integration Sites (UHC-IS). Following our engagement, a software developer would be procured to build the monitoring system based on the technical specifications we developed.

At the beginning of the engagement, we realized that we needed to design a monitoring system¹ that was flexible enough to account for potential changes in UHC-IS and future UHC delivery models; thus, we faced the challenge of designing instructive, yet open-ended technical specifications for the firm building the system. In this post, we’ll share our design process and our experience drawing on best practices in public sector software procurement, which we hope can be useful to other organizations and governments looking to build user-centric monitoring systems.

Addressing challenges in traditional public sector software procurement

Traditional public sector software procurement involves specifying a fixed set of feature requirements up front, with little room for iteration with future users. This approach can result in clunky, user-unfriendly software, and it is particularly unsuited to projects that require a customized solution or that have ambiguous technical requirements. In the context of this project, we knew the UHC-IS Monitoring System would need to be a tailored, user-centric system with built-in data quality checks and relatively sophisticated dashboard capabilities. Given the uncertainty around the eventual scope of the monitoring system, DoH needed to procure a software solution in which the selected developer could work closely with DoH to iterate on the future system’s design: researching user needs, building and A/B testing the system, and updating features as appropriate.
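To make "built-in data quality checks" concrete, here is a minimal sketch in Python of the kind of validation rule such a system might run on submitted data before it reaches a dashboard. The record structure, field names, and rules are our own illustrative assumptions, not part of the actual technical specifications.

```python
# Illustrative data quality check for submitted monitoring data.
# The record structure and rules are assumptions, not the actual specification.
from dataclasses import dataclass


@dataclass
class Submission:
    site_id: str
    total_providers: int
    licensed_providers: int


def validate(submission: Submission) -> list:
    """Return human-readable issues; an empty list means the record passes."""
    issues = []
    if submission.total_providers < 0 or submission.licensed_providers < 0:
        issues.append("Provider counts must be non-negative.")
    if submission.licensed_providers > submission.total_providers:
        issues.append("Licensed providers cannot exceed total providers.")
    return issues


# A record like this would be flagged for review rather than silently accepted.
print(validate(Submission(site_id="province-01", total_providers=10, licensed_providers=12)))
```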

As part of our design work, we applied best practices in public sector software procurement to develop the Terms of Reference (ToR) for the monitoring system. To shape our approach, we drew from the work of organizations like the United States Digital Service (USDS) and 18F, which have helped government agencies create and procure more effective, user-centric digital services. We also drew on the TechFar Handbook, which places a particular emphasis on how governments can choose the right contractor.

Applying principles of agile software development

The following two principles of agile software development were most important in shaping how we designed the technical specifications for the ToR:

1. Ensuring active user involvement:

Our key objective at the beginning of the design process was to gain an in-depth understanding of how provincial and city health system leaders, the users, would report data into the monitoring system. We visited several provinces and cities and interviewed leading health officials to understand how they currently use monitoring systems. We also gathered data on the existing technical capabilities within provincial and city health offices, in areas such as connectivity, device use, data and reporting workflows, and computer literacy. This research laid the groundwork for the development of indicators for the monitoring system and our initial decisions about the system’s technical specifications.

While drafting the monitoring system indicators, we held numerous working sessions with DoH and other stakeholders to solicit feedback on indicator language and ensure they reflected shifting policy decisions. For example, since policies related to setting up integrated Health Care Provider Networks (HCPNs) were continually developing, we worked closely with PhilHealth Officers familiar with the policy changes to ensure that the monitoring system indicators reflected up-to-date policies².
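As an illustration, the first indicator quoted in footnote 2 could be computed from provider-level records along the following lines. This is a hypothetical sketch in Python; the field names and sample records are assumptions for illustration only, not the system’s actual data model.

```python
# Hypothetical computation of the HCPN licensing/accreditation indicator.
# Field names and sample records are assumptions for illustration only.
providers = [
    {"name": "Provider A", "licensed_by_doh": True, "accredited_by_philhealth": False},
    {"name": "Provider B", "licensed_by_doh": False, "accredited_by_philhealth": False},
    {"name": "Provider C", "licensed_by_doh": False, "accredited_by_philhealth": True},
]

# "Out of [X] Primary Care Providers in the HCPN, how many Providers are
# not licensed by DoH nor accredited by PhilHealth?"
total = len(providers)
neither = sum(
    1
    for p in providers
    if not p["licensed_by_doh"] and not p["accredited_by_philhealth"]
)
print(f"{neither} of {total} HCPN providers are neither licensed nor accredited")
```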

We also held a Data Flow Mapping Session³ with DoH to discuss and align on each stakeholder’s goals in using the monitoring system for performance management and decision-making. We encouraged our counterparts to take ownership of the design of the system and institutionalize decision-making processes to ensure that monitoring data would be used effectively in the future. As a result, our client team made concrete suggestions on the design of the monitoring system dashboards and began discussions on internal decision-making processes.

Finally, we helped identify a product owner (a Monitoring System Administrator) within DoH to manage the monitoring system on an ongoing basis, drawing on recommendations from organizations like the USDS and 18F. Identifying a product owner within a government agency to manage the software contracting and development process, as well as the ongoing usage of the system, will help ensure that it is built according to user needs and remains functional over time.

2. Defining software requirements based on user goals:

When developing the ToR, we focused on describing the main objectives of the monitoring system’s software and technical components, rather than their actual design requirements, in order to best position the software developer to build a responsive and user-centred system. In particular, instead of describing specific design requirements for the monitoring system interface, we created ‘user stories’ for each monitoring system user that described their key functions and goals for navigating the system’s interface (e.g. reviewing and approving reported data). For example, we included the following goals and functions for the DoH Monitoring System Administrator’s user story:

As the DoH Monitoring System Administrator, I want to:

  • Review and flag submitted data, with explanations for flagging, so that UIS Site Monitoring System Administrators can correct errors
  • View metadata related to data collection and resolutions in tabular format, so I can understand whether UIS Site Monitoring System Administrators follow reporting protocols
  • Manage the users of the monitoring system, so I can ensure that users have appropriate permissions and add and delete user accounts as needed

By presenting these user stories in the ToR, we were able to frame the software developer’s work in terms of enabling user goals rather than simply building features. This approach lays the foundation for a user-centred solution by giving the software developer ample flexibility to iterate on the monitoring system’s design while empowering DoH to advocate for changes based on the articulated goals.
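As a rough illustration of how such a user story might eventually translate into software, the sketch below models the Administrator’s flagging goal as a small data structure plus a role-based permission check. All class, role, and field names here are our own assumptions; the actual design was deliberately left open for the contracted developer.

```python
# Sketch of how the Administrator's user story could map onto a data model and
# a role-based permission check. All names are illustrative assumptions; the
# actual design was left to the contracted software developer.
from dataclasses import dataclass, field
from enum import Enum


class Role(Enum):
    DOH_ADMIN = "doh_monitoring_system_administrator"
    SITE_ADMIN = "site_monitoring_system_administrator"


@dataclass
class Flag:
    raised_by: str
    explanation: str  # "with explanations for flagging"
    resolved: bool = False


@dataclass
class DataSubmission:
    site_id: str
    values: dict
    flags: list = field(default_factory=list)


def flag_submission(role: Role, user: str, submission: DataSubmission, explanation: str) -> None:
    """Only the DoH Administrator may flag data; site administrators then resolve the flags."""
    if role is not Role.DOH_ADMIN:
        raise PermissionError("Only the DoH Monitoring System Administrator can flag data.")
    submission.flags.append(Flag(raised_by=user, explanation=explanation))


# A flagged submission carries the explanation a site administrator needs to correct it.
record = DataSubmission(site_id="city-07", values={"hcpn_primary_care_providers": 10})
flag_submission(Role.DOH_ADMIN, "doh_admin_01", record, "Provider count inconsistent with last quarter.")
print(record.flags)
```

The point of framing requirements this way is that any implementation satisfying the stated goals is acceptable; the data model above is only one of many ways a developer could meet them.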

What’s worked for you?

By applying the above principles, we were able to equip DoH with the tools to procure a high-quality, adaptable monitoring system for the UHC-IS program. Our hope is that continual iteration and close collaboration with a software developer will help ensure that the system is adapted to DoH’s future monitoring needs and that it serves as an effective decision-making tool for supporting UHC delivery across the country.

We would love to hear your ideas on how to integrate best practices from agile software development when designing data collection and monitoring systems, as well as examples of strategies that have worked for you. Please reach out to our team at eric.dodge[@]idinsight.org, meg.battle[@]idinsight.org and vera.lummis[@]idinsight.org with your thoughts and suggestions.

  1. Our monitoring system design process involved several stages, each designed to deepen our understanding of the priority areas for monitoring and the decision-making needs of different users of the monitoring system. We began by creating a Theory of Change to map out health system changes under UHC, and conducting site visits to several provinces and cities to understand local health systems’ plans for UHC delivery. We then developed indicators for the monitoring system, and worked closely with DoH to determine the system’s design and technical specifications. Finally, we produced a detailed Terms of Reference (ToR) for the firm that will be contracted to build the monitoring system software.
  2. The following are examples of such indicators that reflected shifting policy nuances: “Out of [X] Primary Care Providers in the Health Care Provider Network (HCPN), how many Providers are not licensed by DoH nor accredited by PhilHealth?”, and “Out of [X] LGUs, how many LGUs are not served by at least one primary health care provider in the Health Care Provider Network (HCPN)?”
  3. This process involves mapping out how monitoring system data will be sourced, processed, verified and shared via dashboards.