Key points in communicating governance data

Key Principles

A number of considerations need to be kept in mind when disseminating governance assessments, governance indicators and their underlying data.
  • Stakeholder dialogue – Stakeholder dialogues are a critically important dissemination medium. These nationwide public forums raise public awareness about what governance involves and stimulate public debate about the standards of performance people should expect from government. They can also contribute to public debate about ongoing reform and help identify priorities for a reform programme as part of a broader discussion of the governance assessment. Stakeholder dialogues further provide an opportunity to build partnerships amongst government, the private sector and civil society.
  • Reaching marginalized groups – The dissemination strategy should make a deliberate effort to reach marginalized groups in the country. Marginalized groups include those defined by socioeconomic factors but also by geography. This requires, for example, allocating resources for dissemination events in targeted geographical locations and the translation of the assessment into languages relevant to marginalized groups. 
  • Media as partners – Local, national, regional and international journalists from television, print, radio and Internet outlets are important "dissemination partners". To help them report accurately on governance issues, it can be useful to offer journalists specialized training on governance concepts, indicators and data, as well as on other composite indicators related to governance. Journalists should be seen as important stakeholders in guiding public debate and drawing attention to critical governance deficits shown in the governance assessment and indicators.
  • Timeliness and timing – Timeliness refers to the lapse of time between the end of a reference period (or a reference date) and dissemination of the data. It reflects many factors, including institutional arrangements such as the preparation of accompanying commentary and printing. Indicators should be disseminated within a reasonable amount of time to protect their relevance and usefulness. In addition, consider the timing of other reports on governance being prepared by the government, United Nations agencies and civil society, and of major political events such as elections, budget processes, donor consultations and national strategy preparations. The timing of these events and reports can bolster the relevance of the governance indicators.
  • Adapting to change – A mechanism should be established to ensure that the means of dissemination are continually adapted to meet increasingly sophisticated user needs and to take advantage of technological innovations.
  • Formulating a media plan and the press kit – A media plan guides the introduction of key messages and data to the public. One of its most important components is a press kit, prepared before the governance assessment is launched. A typical press kit includes a summary of the report, press releases targeting different audiences/stakeholder groups, FAQs, a fact sheet profiling data and findings, and/or a CD or DVD. The press kit should also highlight the results of innovative research and surveys, trend analyses, composite indices and their components, data discrepancies and disaggregated data reflecting those discrepancies.
  • Websites and databases – Some of the most useful outreach tools are interactive websites and databases. Not all people will have access to these, but they can reach thousands who would not otherwise obtain such information. Civil society organizations in particular can access this information on behalf of smaller community-based organizations and groups.
  • Monitoring the effectiveness of outreach – When governance assessments are expected to be undertaken regularly, it is important to monitor the effectiveness of each assessment so that outreach can be improved. In general, the governance teams involved in the assessment should establish impact targets early in its preparation, and an attempt should be made to measure the impact on every group targeted. Examples of indicators include: changes in the results of perceptions surveys over time; legislation proposed and/or adopted at any level; use of governance assessment findings in parliamentary and other public debates; shifts in resource allocations; emergence of new partners; and media coverage.
  • Translation to local languages – The report should be accessible in all official languages of the country, which will in most countries require multiple translations. In addition to a report, the governance indicators and underlying data should be made available and accessible to all national stakeholders, without bias, including individual members of the public. This may require translation into additional local languages.

Media and Strategies

The table below provides examples of ways to present and disseminate governance assessments. It has been adapted from a table developed in the 2nd edition of the IDEA State of Democracy Assessment Framework. The examples include developing presentations, producing papers, launching a website, investing in stakeholder participation, holding consultations, promoting analysis and use of the data, publicly launching results and other activities.


Format: Full report, hard copy
Content: Full assessment
Channel: Publication in English and in-country language/s
Audience: Government officials, politicians, media, academics, donors, political activists, international organizations and some members of civil society

Format: Full report, electronic copy
Content: Full assessment plus linkages and data archive
Channel: Webpage of the assessment or links to the main website from the relevant participants
Audience: Elite Internet users, international interested parties and opinion formers

Format: Executive summary/press release
Content: Aggregated executive summaries (all sections), with various individual indicators highlighted as warranted
Channel: Press conference
Audience: Government officials, academics, media, politicians, political parties, international organizations, members of civil society and donors

Format: Academic conference and conference documents
Content: Full assessment as background paper, with presentations and papers from participants
Channel: Conference pack and section on the project website
Audience: Academics, policymakers, journalists and students

Format: Extracts by section (specialist interest)
Content: Executive summaries and specific sections
Channel: Sector- and interest-specific journals and in-house magazines; specialist s
Audience: Interest-specific, such as educators, health workers, media, local government officials

Format: Extracts by section (popular, issues)
Content: Derivative popular texts around current affairs
Channel: Popular magazines and newspapers
Audience: Literate, educated

Format: Questionnaires, civic education summaries, classroom kits
Content: Cartoon, non-textual or basic language, video or audio
Channel: CBOs, churches, NGOs, schools, community centres, libraries (i.e., gatekeepers)
Audience: General, including illiterate or poor

Format: Interviews and features by radio and TV personnel
Content: Verbal and visual summaries
Channel: Radio and TV
Audience: General, including illiterate and poor with access to radio and TV

Communication Challenges

Communication challenges related to the presentation of governance assessment statistics and indicators include:
  • An audience with a low level of statistical literacy must be made aware of ways in which the indicators might be misinterpreted, shown the main results of the assessment as clearly as possible, and told where complete results of the assessment can be found. Users should have access to documentation on methodology, information regarding quality assurance practices and dissemination policies. The audience must be given the opportunity to see all results of the assessment so that they can draw their own conclusions about the bias and efficacy of the assessment approach. Where possible, the indicators and data should be made available via the Internet (see more on this below).
  • Too many numbers and statistical terms in the governance assessment can overwhelm even the most enthusiastic readers. To reach as many target audiences as possible, technical statistical terms should be avoided. Readers are unlikely to want to work through numerous references to "means" or "confidence intervals"; better options are plainer terms such as "average" and "probability."
  • Graphical presentation of data can significantly improve understanding and interpretation of governance indicators, if done well. Because graphics are not always well designed, they must be interpreted with a critical eye and with the following questions in mind: Are the visual and numerical scales in alignment? Does the scale change? Is the whole story presented? Does the graphic present the indicators as clearly and simply as possible? If the answer to all of these questions is yes, the graphic probably represents the data appropriately. Maps can be a particularly powerful information tool because they are easier to interpret than many statistical graphs and can contribute to transparency and greater access to data. In particular, maps vividly illustrate disparities; they can be combined with statistical graphs and explanatory texts.
  • The data must be representative; otherwise, it is incorrect to draw conclusions for sub-populations. Where possible, standard errors should be reported and assumptions and caveats explained. The national governance assessment teams should make sure that indicators and data are accurate, through peer review and editing. Incorrect data, even a small error, can discredit the overall messages of a report and destroy opportunities for advocacy.
  • Personal privacy and protection of data are important considerations in the presentation of governance indicators and underlying data. One obvious protective measure is to make the release of micro-data illegal, even amongst government institutions. In that way, data collected by one institution, such as a National Statistical Office, cannot be arbitrarily seized and used by another institution, such as the military. The problem with using laws to protect data from unethical use by governments, however, is that governments can easily change those laws. Even so, having a codified policy of data confidentiality creates one more hurdle for potential data abusers and more clearly defines the ethical issues involved.
  • Discrepancies between statistics and indicators from national sources and those from international sources represent another issue of concern. For example, let us say that an international agency estimated that 51 percent of the population in a country was living below the poverty line. In contrast, according to a published government source, a national survey found the figure for the same indicator to be only 20 percent. Such discrepancies may result when organizations collect their own data at country level, make adjustments to basic data provided by countries or make their own estimates based on certain models. While the need for adjusted international data series is widely recognized, confusion can arise for users when the distinction between adjusted data and underlying data is not clear. Furthermore, discrepancies in published data that are significant and yet remain unexplained undermine the credibility of national statistics. Agreements on international standards for the definition and measurement of indicators, clear labeling and data source notes can help.
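The guidance above on reporting standard errors can be made concrete with a short sketch. This is an illustrative calculation only: it borrows the 20 percent figure from the discrepancy example as a hypothetical survey result, and the sample size of 1,000 respondents is an assumption.

```python
import math

def proportion_interval(p, n, z=1.96):
    """Standard error and approximate 95% interval for a survey proportion.

    p: observed proportion (e.g. share of respondents below the poverty line)
    n: number of survey respondents
    z: normal critical value (1.96 corresponds to a 95% interval)
    """
    se = math.sqrt(p * (1 - p) / n)   # standard error of a sample proportion
    low = max(0.0, p - z * se)        # clamp the interval to the [0, 1] range
    high = min(1.0, p + z * se)
    return se, low, high

# Hypothetical example: a national survey of 1,000 respondents finds 20 percent
se, low, high = proportion_interval(0.20, 1000)
print(f"standard error: {se:.3f}")               # about 0.013
print(f"95% interval: {low:.1%} to {high:.1%}")  # roughly 17.5% to 22.5%
```

Publishing the interval alongside the point estimate signals that the 20 percent figure carries sampling uncertainty, which is useful context when users compare it with estimates from other sources.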