Community Video for Nutrition Guide

Component 3: Diffusion

MIYCN Elements: MIYCN Refresher Trainings and Disseminations | Verifying Adoptions | Improving the Program Based on Data and Feedback

After you have produced the videos and completed the approval process, you can load them onto pico projectors (via USB or micro SD memory cards) for dissemination. The diffusion process that follows, described in this section, includes dissemination, data entry into COCO, verification of the adoption of the desired behavior changes, and feedback analysis to better align the project with local community needs and interests.


Mediators play a critical role in disseminating videos to village groups (e.g., women’s SHGs), following up with home visits to support community members in negotiating behavior adoption, and monitoring project outcomes. Mediators are trained to use videos as a catalyst to engage groups in dialogue, pausing and rewinding the video to ensure that the audience fully grasps the concepts. After the screenings, mediators visit homes to verify and record data on adoptions. You can analyze this data to progressively improve the project in an iterative manner.

Figure 3 below details the steps in the diffusion process with bolded emphasis on the elements specific to the MIYCN approach detailed in this guide. The other steps are only briefly summarized below and are further explained in the DG SOP.

Figure 3: Diffusion


Dissemination
 
  1. Plan the dissemination schedule.
  2. Hold dissemination preparation meetings as MIYCN refresher trainings.
  3. Conduct disseminations with groups in a participatory manner.
Monitoring Adoptions
 
  1. Record participant details, including their interests, questions, and willingness to adopt or promote a particular practice.
  2. Observe and validate adoptions and promotions with home visits.
  3. Verify whether all critical points of practices have been followed or any innovations have been made.
Reporting and Learning
 
  1. Enter monitoring data into COCO for analysis.
  2. Regularly cross-verify a random sample of adoptions to ensure verification quality.
  3. Review participants’ feedback and adoption data to improve the video production and dissemination processes and identify content for new videos.

3.1 Preparing for Dissemination

Dissemination is the process of screening videos to small groups in an interactive and engaging manner to encourage viewers to adopt or promote the practices featured in the videos. Strive to assemble groups of 10 to 20 individuals: large enough for peer-to-peer dynamics, yet small enough for lively and personal discussion. Ideally, the mediator should be from, or live in, the village where the videos are being disseminated and should be trained to facilitate the screenings in a way that fully engages participants. Before initiating disseminations, you will need to secure the required equipment and confirm that it works. You should also develop a dissemination schedule.

Project Experience: Adoption vs. Promotion

In the standard Digital Green model for agriculture, adoption of agriculture-specific practices is typically relevant to all participants. With nutrition, however, many practices are associated with very specific target groups or timeframes. For example, information about exclusive breastfeeding may only be applicable to women with babies under six months of age, a subgroup who may constitute only a small percentage of an SHG’s participants. However, the information is likely to eventually become personally relevant to many other women in the group, as well as to their sisters, mothers, and mothers-in-law. These individuals are critical influencers, with the power to enable a woman to practice a given recommended behavior or to impede her ability or willingness to do so. For this reason, it is just as important to reach this influencing audience, encouraging not only adoption of the featured practices but also their promotion among others.

Planning Dissemination Schedules

The local partner organization should schedule screenings weekly or biweekly, depending on community interest, the number of videos to be shown, and the availability of the dissemination teams. Additionally, it is important to set a regular dissemination schedule with each group to achieve a continuous flow of information. When scheduling screenings, consider frequency as well as locations (and related issues such as their seasonal accessibility). Planning should incorporate:

  • The number of screenings per month that each group has committed to attend.
  • Seasonal topics of interest for each group.
  • Content to promote.
  • Integration with community-based interventions by an extension or local partner.

Mediation Skills

Tips for Mediators

  • Pause the video at the correct times to engage the audience and ask questions to stimulate discussion.
  • Anticipate the various types of questions (about the topic and content) that may arise, and use them to spur discussion and reflection.
  • Actively listen and respond to the group.
  • Appropriately use and maintain the pico projector and other dissemination equipment.

The credibility of the dissemination depends largely on the mediator’s skill in facilitating the video screenings. Viewers may lose interest, for instance, if a mediator is unprepared, does not engage the participants to foster a discussion, is unable to follow up, does not know where to procure physical commodities (e.g., iron–folic acid tablets), or does not escalate and resolve issues that arise. It is therefore very important that the mediator be well trained in all aspects of dissemination, from handling pico projectors, to posing and responding to questions, to engaging audiences in reflective dialogue, to recording community feedback and observations.

Refer to the DG SOP and the training manual for dissemination for more details about the mediators’ role.

Dissemination Preparation Meetings

For each nutrition behavior (or video), the adoption verification job aids should reference the non-negotiable points for that behavior and promote observation to verify whether practices are adopted correctly, incorrectly, partially, or innovatively. The job aids (Appendix 5) should align with the key concepts in the training materials and the specifics of the POP used to create the storyboards and videos. To avoid confusion and to keep from burdening mediators with too many papers and too much information, distribute only the job aids for the upcoming videos to mediators during the regular dissemination preparation meetings.

The dissemination preparation meeting also serves as a refresher training for the mediator on the video subject matter, supplementing the short MIYCN sensitization training conducted at the beginning of the project. Before dissemination, the mediator must be very familiar with the selected video and should fully understand its learning objectives. He or she should rehearse responses to questions that the group is likely to raise. During the dissemination preparation meeting, screen each video in the order it will be screened in the actual disseminations. During this meeting, also review the adoption verification job aid and the POP to ensure the accuracy of the technical information on the featured video topic.

Pilot Experience: Dissemination Preparation

Regular dissemination preparation meetings with mediators and subject matter experts were a key part of successful disseminations. These meetings became a platform to share experiences, and a forum for mediators to ask and answer questions from one another. Mediators found the feedback and suggestions gathered and discussed during these meetings particularly helpful in improving the quality of the disseminations themselves. Because of their perceived utility, the dissemination preparation meetings became the primary platform for cross-learning among mediators, resulting in a knowledgeable, confident, and empowered cadre of mediators.

The technical partner or advisor must be present at the dissemination preparation meetings to further detail the specific practices featured in the videos and to provide additional technical information or guidance to supplement the nutrition training the mediators received at the beginning of the project. Interacting with these technical experts helps mediators prepare for the disseminations. Any feedback received or questions posed during the pretest, as well as any questions asked in previous video screenings, should be presented and discussed so that mediators know what to expect in their own disseminations and can answer questions accurately. Appropriate answers to these possible questions should be shared and reinforced. The value of this preparation further underscores the importance of pretesting.

Project Experience: Gender Considerations

Gender dynamics within communities can have a major impact on the acceptability of community-led video. In the SPRING/DG project, having women on the video production team was essential to maintain the comfort level of the actors, who were mostly female. Likewise, facilitators reported that female participants felt uncomfortable having men present during disseminations, especially when video topics such as exclusive breastfeeding were being discussed. Keeping this in mind, the local partner and mediator discouraged male attendance at group meetings but encouraged men and women to discuss the video content together at home.

Objectives of Dissemination Preparation Meetings

  • Review video(s), adoption verification job aid(s), and POP.
  • Discuss technical content of new videos to clarify any questions.
  • Discuss feedback on pretesting as well as questions that arose at that time.
  • Review effective mediation techniques:
    • Ask open-ended questions.
    • Ensure there is enough light to stimulate discussion among group members.
    • Repeat the video if requested or if necessary to help audience members understand the content.
    • Do not proceed until participants answer the questions posed in the annotations at each pause.

Hold dissemination preparation meetings in a central location about once a month or once every two or three videos, depending on the dissemination schedule, mediators’ availability, and other circumstances. During these meetings, give mediators the next batch of videos to be uploaded onto their pico projectors. Be mindful that mediators will likely not be able to fully remember all of the technical information for more than two or three videos at a time.

Conducting Dissemination

Tips for Disseminations: Getting Ready

  • Confirm the dissemination time and location with the group.
  • Review the video and POP before the screening.
  • Brainstorm potential questions and prepare answers.
  • Check equipment and accessories.
  • Pack dissemination equipment, accessories, and attendance/screening forms.
  • Arrive at least 30 minutes before the screening for set-up. Ensure the venue is dark; put up a screen or cloth if necessary.
  • Check the sound and projection before the meeting.
  • Prepare reporting forms.

When preparing for disseminations, remember that although the videos may be the main attraction, they are also tools to facilitate face-to-face discussion. Ensure that each mediator has access to functional and charged speakers, a pico projector, and a battery-operated lantern that can be used to illuminate the darkened room during the discussion portion of the disseminations. Mediators should review the POP and adoption verification job aid before starting the meeting to ensure comprehension of the topic to be discussed.

Disseminations can be conducted with existing village groups, including SHGs, mother-to-mother support groups, or other common interest groups comprising 10 to 20 individuals. Aim for each group to host two to four video disseminations each month, depending on the goals, budget, and resources of your project and the availability of target group members. The entire dissemination process, including the video screening and follow-up discussion, should be kept to one hour. During the dissemination, the mediator should:

  • Welcome group members and ensure that the environment is comfortable for the group.
  • Invite feedback on the last dissemination and behaviors that some may have adopted or promoted.
  • Record attendance and the names of individuals who self-report as having adopted or promoted a practice covered by a previous video. This information will be used to assess who should receive a follow-up home visit to verify the adoption or promotion. You can also consider doing this at the end of the screening.
  • Introduce the video’s subject and explain its applicability to the group.
  • Play the video on the pico projector, pausing at annotations and engaging participants in a conversation at these points.
  • After the screening ends, initiate discussion with the group on all the topics covered by the video.
  • During the course of the discussion, ask a group representative to summarize the main points of the video and to share her own experiences with the practice.
  • Record who expresses an interest in adopting the practice as well as any participant questions. Take note of any questions that cannot be answered and follow up with an answer during the next dissemination.
  • Thank participants, and confirm the time and location of the next dissemination.

Tips for Disseminations: Facilitating

Videos are a tool to facilitate conversation and should not be screened without the active engagement of a mediator. Disseminations should be participatory, interactive forums in which the mediator actively promotes participant engagement and adjusts techniques to maintain group interest and participation. Make use of the annotations embedded in the video to facilitate discussion and reinforce critical points.

Quality of Disseminations

There is great value in periodically assessing the quality of disseminations to inform improvements in video content, identify needs for refresher trainings, and provide feedback to mediators. The local partner should organize regular local team meetings to evaluate the quality of disseminations and how content and facilitation can be improved, and should include other partner organizations that wish to participate. In addition, ensure opportunities to periodically sit in on disseminations to observe mediators and give constructive feedback on their sessions. Deliver all feedback in a way that will improve the quality of disseminations and help mediators improve their performance. Feedback can also be helpful in designing refresher training sessions during dissemination preparation meetings. Refer to the DG SOPs for more details on evaluating disseminations and for a form that can be used to periodically evaluate mediators.


3.2 Monitoring Adoptions

Performance and Effectiveness Indicators

Monitoring the adoption and promotion of nutrition behaviors is integral to the SPRING/DG approach. The proposed performance monitoring and evaluation components are based on the simplified program impact pathway shown in Figure 4.

Figure 4. Community Video Impact Pathway


Based on this impact pathway, the following performance or process indicators should be tracked and reported on regularly throughout implementation:

Indicator | Data Collection Method
Number of people trained in priority topics, by gender and cadre/level/role | Routine monitoring
Number of disseminations conducted, by location and type of event/group | Routine monitoring
Percentage of disseminations with positive feedback, by location, type of event/group, and mediator | Periodic assessment
Number of people reached through disseminations, by location, type of event/group, video topic, and gender of participant | Routine monitoring
Number and percentage of participants who report an intention to adopt or promote a priority behavior, by location, type of event/group, video topic, and gender of participant | Routine monitoring
Number and percentage of participants who report promoting a priority behavior to others, by location, type of event/group, video topic, and gender of participant | Routine monitoring
Percentage of participants who understand/know the importance of a priority behavior, by location, type of event/group, video topic, and gender of participant | Evaluation
Number and percentage of participants who report practicing or adopting a priority behavior, by location, type of event/group, video topic, and gender of participant | Routine monitoring and evaluation
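
Once routine monitoring records are digitized, the indicators above can be tallied with a short script. The following is a minimal sketch in Python with pandas, assuming a hypothetical set of dissemination attendance records with illustrative column names; it is not part of COCO or any SPRING/DG tool.

```python
import pandas as pd

# Illustrative dissemination attendance records; the column names are
# assumptions for this example, not a COCO schema.
records = pd.DataFrame({
    "location": ["Village A", "Village A", "Village B", "Village B"],
    "group_type": ["SHG", "SHG", "Mother-to-mother group", "Mother-to-mother group"],
    "video_topic": ["Exclusive breastfeeding"] * 4,
    "gender": ["F", "F", "F", "M"],
    "intends_to_adopt_or_promote": [True, False, True, True],
})

disaggregation = ["location", "group_type", "video_topic", "gender"]

# Number of people reached through disseminations, disaggregated as in the table.
reached = records.groupby(disaggregation).size().rename("people_reached")

# Number and percentage of participants reporting an intention to adopt or promote.
intent = records.groupby(disaggregation)["intends_to_adopt_or_promote"].agg(
    number="sum",
    percentage=lambda s: 100 * s.mean(),
)

print(reached)
print(intent)
```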

Note that outcome and impact-level indicators, which can be measured through an evaluation, will vary depending on the behaviors promoted. However, they may include any of those found in USAID’s Nutrition Strategy:

  • Prevalence of stunting among children under five
  • Prevalence of wasting among children under five
  • Prevalence of overweight among children under five
  • Prevalence of anemia among children 6-59 months
  • Prevalence of healthy weight among women of reproductive age
  • Prevalence of anemia among women of reproductive age
  • Prevalence of low birth weight
  • Prevalence of exclusive breastfeeding of infants 0-5 months
  • Prevalence of minimum acceptable diet of children 6-23 months
  • Women’s dietary diversity score
  • Prevalence of moderate and severe hunger
  • Number of HIV-positive, clinically malnourished clients who received therapeutic and supplementary feeding

The primary data collection methods for monitoring project results are described in the sections that follow.

Adoption Support and Verification Home Visits

Tools
Appendix 11: Adoption Verification Form Template

Using the data on self-reported adoptions and promotions collected during previous disseminations, work with each mediator to follow up with a home visit to those reporting adoption or promotion at least a week (but no more than a month) after the screening. The exact timing may vary depending on the dissemination schedule, video content, and your specific project. At minimum, the mediator should visit every self-reported adopter’s home to verify the adoption. Ideally, however, the mediator should visit the home of everyone in their group to negotiate with participants on how they could adopt or adapt the behavior in a way that works for them, or how they could promote it within their family or to their community if the practice is not relevant to them (e.g., promotion of exclusive breastfeeding for an individual who does not have a child under age 2). In this way, home visits do not function simply as a verification tool but additionally become a support mechanism to build self-efficacy and assist individuals in adopting the behavior given their resources and influence with others in their household and community.

Tips: Adoption Verifications

  • Visit the place of adoption at a time when the practice is most likely to occur.
  • Through observation, verify that the person is practicing the behavior correctly and completely, according to the critical points listed on the adoption verification job aid.
  • For practices that cannot be easily observed, and for tracking promotion by those not in the target audience, interview the participant to assess whether the behavior is being practiced, or promoted, correctly and whether the individual has the correct knowledge about the behavior and its critical points.

Tracking Adoptions and Promotions

To measure adoptions and promotions of behaviors and the extent to which they are practiced correctly and as shown in the corresponding video, mediators should use the adoption verification job aid (Appendix 6) and the adoption verification form (Appendix 11) during home visits. If it is not possible to observe adoption (as is true for many nutrition- and health-related practices), mediators can use self-reported adoptions combined with knowledge recall as proxies for adoption of recommended behaviors. There are, however, important limitations to using this method in place of direct observation or 24-hour dietary recall: self-reported behavior cannot be verified, and it is well documented that knowledge alone does not always lead to behavior change. Despite these limitations, this method of recording and tracking adoptions is the least cost-prohibitive and ensures continued community ownership and participation throughout the process.

As noted earlier, the adoption verification job aid reiterates the key practices outlined in the POP and included in the videos and assists mediators in identifying whether practices are being executed correctly. The verification form then tracks the information on adoptions that corresponds to the details in the job aid for each video. Whenever an individual is not in the target audience for adopting a given practice, promotion of that behavior to friends, neighbors, and family members is reported in place of adoption, using knowledge recall as the primary verification method, which carries the same limitations listed above.

The mediators’ steps for tracking adoptions and promotions are as follows:

  • Record the names of group members who express an interest in adopting or promoting the practices shown in the video and discussed at the dissemination.
  • Ask participants whether they have any questions or can identify any barriers to adoption or promotion.
  • Visit the households of those who expressed interest and self-reported having adopted or promoted the behavior to provide support as appropriate and feasible. (If possible, visit all households to follow up and provide support to encourage adoption.)
  • During the home visits, provide guidance on further steps and/or corrections and how the individual might adapt the behavior, if necessary.
  • Record adoptions and promotions on the verification form, marking whether critical points have been followed and whether the practice has been adopted or promoted.
  • Record issues that individuals faced when adopting or promoting a behavior as well as improvements, enhancements, or mistakes that they have made, to inform future videos.

During subsequent dissemination preparation meetings, after mediators have captured data on disseminations, adoptions, and promotions during home visits, the local partner organization staff should collect the adoption verification forms and enter the data into a centralized reporting database such as COCO, DG’s data management tool (Section 3.3).
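
Where projects digitize the paper forms before entering them into COCO or another database, each home visit might be captured as a structured record along the lines of the sketch below; the field names are illustrative assumptions, not COCO’s actual data model.

```python
from dataclasses import dataclass, field, asdict
from datetime import date
from typing import List

# Hypothetical digital equivalent of a paper adoption verification form;
# field names are assumptions for illustration, not COCO's schema.
@dataclass
class AdoptionVerificationRecord:
    participant_name: str
    group_name: str
    village: str
    video_topic: str
    visit_date: date
    adopted: bool                 # behavior adopted (observed or self-reported)
    promoted: bool                # behavior promoted to others
    critical_points_met: List[str] = field(default_factory=list)
    issues_or_innovations: str = ""

record = AdoptionVerificationRecord(
    participant_name="Example participant",
    group_name="SHG-1",
    village="Example village",
    video_topic="Exclusive breastfeeding",
    visit_date=date(2016, 7, 15),
    adopted=True,
    promoted=False,
    critical_points_met=["Initiated breastfeeding within one hour of birth"],
)

# Records like this can then be batch-entered into COCO or exported for analysis.
print(asdict(record))
```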


3.3 Reporting and Learning

Data Management

As with any intervention, the success of the community-video approach relies on institutionalizing good data management and analytics processes to ensure that project goals and targets are met. Digital Green developed a management information system specifically for this model that can be adapted to monitor and evaluate the approach, to integrate with organizations’ existing MIS, and even to capture other types of data that groups might be interested in tracking (e.g., non-video-related interventions).

Digital Green’s COCO is an open-source management information system that provides insight on, and analytics about, the reach of video content and related adoption and promotion rates. Presented in easy-to-understand visual dashboards, COCO helps program managers assess what is going well and what might need further attention. This data management model draws on data collected by mediators and captures individual community members’ interactions with project partners as they appear in videos, attend video disseminations, express particular interest or ask questions, and adopt featured practices themselves (or promote those behaviors) over time. You can also use other data management systems if you prefer; however, COCO is unique in that it was developed specifically for monitoring this approach.

Data on adoptions and promotions can be aggregated and disaggregated down to the individual level, so that one person’s viewing history and adoptions can be tracked over time. Similarly, you can examine which videos were most popular or sparked the highest rate of adoption to see the types of videos and practices best received by communities. Such data can help you make mid-course corrections; identifying content that yielded low adoption rates helps you better understand barriers to adoption and promotion and how best to use this information to improve the videos. You can also use the questions recorded during disseminations to uncover any additional issues and to understand how to improve adoption and promotion rates for specific behaviors. The data can also help you track mediator performance and the need for additional capacity building or supportive supervision by flagging any groups with repeatedly low adoption and promotion rates.
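
As a hedged illustration of the kind of mid-course analysis described above, the sketch below flags groups whose adoption rates have stayed low across screenings; the data layout and the 30 percent threshold are assumptions chosen for the example, not project standards.

```python
import pandas as pd

# Illustrative adoption data by group and month; not COCO's actual export format.
adoptions = pd.DataFrame({
    "group": ["SHG-1", "SHG-1", "SHG-2", "SHG-2", "SHG-3", "SHG-3"],
    "month": ["2016-05", "2016-06", "2016-05", "2016-06", "2016-05", "2016-06"],
    "attended": [18, 17, 15, 16, 12, 14],
    "verified_adoptions": [9, 10, 3, 2, 6, 7],
})

adoptions["adoption_rate"] = adoptions["verified_adoptions"] / adoptions["attended"]

# Groups whose adoption rate never rose above 30 percent are candidates for
# supportive supervision or refresher training for their mediators.
low_groups = (
    adoptions.groupby("group")["adoption_rate"]
    .max()
    .loc[lambda rate: rate < 0.30]
)
print(low_groups)
```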

One critical feature of COCO is its ability to function on any Web-enabled device, including mobile phones, tablets, and computers, without an Internet connection. The offline mode permits users to access information and enter data when an Internet connection is unavailable; information entered offline is synchronized with the online global database whenever connectivity becomes available. This offline mode is an important function, given the limited Internet connectivity and sporadic power outages in the rural and remote areas where the community-video approach is typically implemented.

Cross Verifications and Quality Assurance

As your project expands, quality assurance becomes increasingly important. Regularly cross-verifying a random sample of adoptions is an effective quality assurance mechanism. Conduct these cross-verifications one to three months after dissemination, depending on the practice being promoted (some practices may become irrelevant for some individuals as time passes). The local partner should verify a random sample of approximately 10 percent of the total reported adoptions (the percentage that the implementing partner verifies will vary from project to project and can depend on the project scale and reporting requirements). Invite mediators to join the cross-verification process to strengthen ownership and help facilitate access to project communities and participants.
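
Drawing the random sample itself can be done with a few lines of code. The sketch below assumes a hypothetical list of reported-adoption identifiers and applies the roughly 10 percent fraction mentioned above; adjust the fraction and the seed handling to your project’s requirements.

```python
import random

# Hypothetical identifiers of reported adoptions eligible for cross-verification,
# e.g., compiled from verification forms or exported from COCO.
reported_adoptions = [f"adoption_{i:03d}" for i in range(1, 241)]

# Draw a simple random sample of roughly 10 percent for cross-verification.
sample_fraction = 0.10
sample_size = max(1, round(sample_fraction * len(reported_adoptions)))

random.seed(42)  # fixed seed so the sample can be reproduced and audited
cross_verification_sample = random.sample(reported_adoptions, sample_size)

print(f"Cross-verify {sample_size} of {len(reported_adoptions)} reported adoptions")
print(cross_verification_sample[:5])
```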

Digital Green’s quality assurance framework, further outlined in the DG SOPs, can help maintain quality of both the content and the process. A high-quality process ensures that the aspects of the approach will be institutionalized among local partners and communities, while high content quality ensures that the information is technically and scientifically correct and valuable to community members who use it.

For more information on cross-verification and other quality assurance protocols, please refer to the quality assurance framework and related details in the DG SOP.

Data Analysis and Use

Now that you have data and feedback, you can put them to use to develop new content, improve and expand future videos, and revise existing videos in an iterative cycle. Using community dissemination, adoption, and promotion data and feedback is critical to the approach’s feedback loop. Program managers should track and analyze this information regularly, as detailed below, to ensure that videos are understood, that topics are relevant to the community, and that behaviors are being adopted, and to inform new videos as noted above. As discussed previously, the adoption verification forms provide the indicators that will be monitored; these are directly based on the POPs created at the design and content development stage. Although the approach is primarily led and facilitated by local agents, the verification component permits an additional layer of monitoring and validation of adoptions across all partners.

Interactive, real-time reports (both data and graphical representations) available from COCO provide a macro view of the reach and effectiveness of any given set of videos, while also permitting you to retrieve the details of each video. Key statistics on the number of videos produced and screened and the practices adopted or promoted provide a valuable snapshot of project progress in each region. This overview can be broken down to the individual, group, or village level or aggregated to the state, district, and block levels.

The richness of the data allows for both aggregated and disaggregated analysis along various time-, geographic-, and partner-based dimensions. Using proximate metrics, such as the questions asked and the interest expressed during disseminations, the analytics provide insight into the quality of the disseminations and a greater understanding of which practices may need additional reinforcement in a particular location at a particular time. Effective data management using COCO or another system therefore provides an opportunity to review participants’ feedback and adoption data in order to improve the video production and dissemination processes, enhance comprehension of nutrition topics, more effectively address barriers and enablers to adoption and promotion, and identify additional content that communities have requested.
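
As one example of working with a proximate metric, the short sketch below tallies the questions mediators recorded, by video topic, to highlight practices that may need reinforcement; the data layout is an assumption for illustration only.

```python
from collections import Counter

# Illustrative (video_topic, question) pairs recorded by mediators during
# disseminations; the layout is an assumption, not COCO's export format.
recorded_questions = [
    ("Exclusive breastfeeding", "Can water be given in hot weather?"),
    ("Exclusive breastfeeding", "What if the mother works away from home?"),
    ("Handwashing", "Is ash an acceptable substitute for soap?"),
    ("Iron-folic acid tablets", "Where can the tablets be procured?"),
    ("Exclusive breastfeeding", "Can water be given in hot weather?"),
]

# Topics generating the most questions may need reinforcement through refresher
# trainings, revised videos, or additional content.
questions_per_topic = Counter(topic for topic, _ in recorded_questions)
for topic, count in questions_per_topic.most_common():
    print(f"{topic}: {count} question(s) recorded")
```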

Evaluating Impact

Because the approach relies on monitoring and verification activities conducted by program staff themselves, there are limitations to the data collected. If feasible, plan to conduct an evaluation focused on at least the priority behaviors promoted through the project. Truly evaluating the impact of the intervention would require, at a minimum, baseline and endline data collection to show that behaviors have changed, at least within the intervention area. Adding a midline evaluation would make it possible to assess initial versus sustained adoption, and collecting data in a control site would allow changes in behaviors and/or nutritional status to be better attributed to the intervention itself.