Integrative, Differential/Inferential, Instrument-based, and Coordinated Monitoring

The purpose of this blog post is to point out the intersections, similarities, and differences among integrative, differential/inferential, instrument-based, and coordinated monitoring as used in the monitoring of human service programs. Program monitoring has changed over the years: it has grown not only in the types of monitoring done (process, compliance, outcome monitoring, etc.) but also in the functional aspects of monitoring, as delineated by integrative, differential, and coordinated monitoring. Much has been written in the research literature about the types of monitoring, but far less about the functional aspects, probably because that work is much newer and has grown alongside the various types of monitoring being used in different contexts.

Coordinated monitoring deals with monitoring across similar service types; for example, in early care and education, monitoring would be done using similar standards across Head Start, child care, preschool, and related settings. This is an effective and efficient approach, as demonstrated through the creation and dissemination of Caring for Our Children Basics as a core set of standards for all these settings. The US Department of Health and Human Services has advocated this particular approach.

Differential monitoring focuses on the use of abbreviated or targeted inspections for programs that have a history of high regulatory compliance with specific rules or standards, while spending more time on a more comprehensive review of those programs having difficulty complying. The specific rules can be selected based upon risk assessment or their predictive value for overall compliance. This is a very efficient approach which has been demonstrated to save time in monitoring reviews. Many states in the USA and provinces in Canada use this approach, and the US Office of Head Start has experimented with it.
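The decision rule behind differential monitoring can be sketched in a few lines of code. This is a minimal illustration only: the thresholds, score ranges, and function names below are hypothetical assumptions for exposition, not taken from any state's or agency's actual system.

```python
# Illustrative sketch of a differential monitoring decision rule.
# All thresholds and inputs are hypothetical, chosen for exposition.

def select_review_type(compliance_history, risk_score,
                       compliance_threshold=0.98, risk_threshold=0.5):
    """Choose an abbreviated or comprehensive review for a program.

    compliance_history: fraction of rules met over recent inspections (0-1).
    risk_score: score from risk-assessment weighting of rules (0-1, higher = riskier).
    """
    if compliance_history >= compliance_threshold and risk_score < risk_threshold:
        # High compliers get a targeted inspection of key indicator rules.
        return "abbreviated"
    # Programs with compliance difficulties get a full review of all rules.
    return "comprehensive"
```

A program with a near-perfect compliance history and low risk would receive an abbreviated visit, while any program falling short on either dimension would receive the full review; in practice the thresholds would come from risk assessment or from the predictive value of key indicator rules.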

Instrument-based program monitoring utilizes instruments, tools, or checklists for recording all data when a review or inspection is completed. It is different from the case review or anecdotal type of record keeping. This approach started in the late 1970s and early 1980s, when it was introduced by the Children's Services Monitoring Transfer Consortium, a federally funded research project consisting of California, Michigan, West Virginia, Pennsylvania, and New York City. Its development occurred in parallel with the development of differential monitoring, but with particular emphasis on the metrics or measurement domain when it came to tool development. The Child Development Program Evaluation Scale was a major tool developed from this initiative.

Integrative monitoring is a relatively new approach to monitoring in which the emphasis is on integrating regulatory compliance rules with quality programming standards. Note that the emphasis is on the rules and standards themselves, not on whom they are applied to nor how they are applied. However, combining integrative monitoring with differential monitoring is an interesting research focus which could be a very effective and efficient way of joining these two perspectives. In the past, licensing and quality programming have generally been in their own silos when it comes to program monitoring. Integrative monitoring removes them from these silos and suggests building a continuous metric that starts with the health and safety aspects of rules and adds the quality pieces on top of those rules. Presently, quality initiatives such as Quality Rating and Improvement Systems, Accreditation, and Professional Development systems are examples of standards that could be used to build upon health and safety licensing rules.

There appears to be interest in pursuing an integrative monitoring approach in several jurisdictions in the early care and education field, but the interest extends beyond that field, as suggested by a recent article: Freer & Fiene (2023), "Regulatory compliance and quality programming: Constraints and opportunities for integration," Journal of Regulatory Science, 11(1), 1-10. The interested reader may want to take a look at the article; it provides a unique model for pursuing integrative monitoring. One may also be interested in Fiene's eHandBook, Licensing Measurement and Monitoring Systems: Regulatory Science Applied to Human Services Regulatory Administration, which provides the basics of licensing measurement and program monitoring metrics.

Here is a graphic that has been used to describe a logic model for ECPQIM (Early Childhood Program Quality Improvement and Indicator Model) and the Differential Monitoring Logic Model and Algorithm (DMLMA), which overlays the monitoring approaches (Coordinated, Instrument-based, Differential/Inferential, and Integrative) on the logic model.

About Dr Fiene

Dr. Rick Fiene has spent his professional career improving the quality of child care in various states, nationally, and internationally. He has done extensive research and publishing on the key components in improving child care quality through an early childhood program quality indicator model of training, technical assistance, quality rating & improvement systems, professional development, mentoring, licensing, risk assessment, differential program monitoring, and accreditation. Dr. Fiene is a retired professor of human development & psychology (Penn State University), where he was department head and director of the Capital Area Early Childhood Research and Training Institute.
This entry was posted in RIKInstitute.
