Wednesday, January 7, 2015

Improving implementation

How can the treaty bodies improve implementation of their decisions and recommendations?

There are several ways to look at this issue.

  • One is to compare the current reporting volume with the number of reports that would be received at full compliance. States should, after all, be reporting on how they are implementing the relevant human rights standards.
  • A second is to look at how often a state report actually responds to and explains how the government has implemented each of the Committee's prior recommendations.
  • A third is to evaluate the responses to the special follow-up mechanisms that several of the Committees have put in place.

This article takes a look at each of these approaches and offers some recommendations to the treaty bodies on procedures they might adopt to improve implementation in the future.

1. Comparing actual to theoretical numbers of state party reports as an indirect measure of timeliness of reporting

Currently, states are submitting approximately 100 reports per year. This represents approximately 36% of the number of reports that would be received if all states were submitting their reports on time.

Treaty body   Ratifications   Periodicity (years)   Expected reports per year
CCPR          168             4.5                   37
CERD          177             4                     44
CESC          162             5                     32
CAT           165             4                     39
CRC           194             5.5                   35
CEDAW         188             4                     47
CMW           47              5                     9
CRPD          151             5                     30
CED           44              6                     7
Total         1296                                  280

The periodicity figures above (e.g., 4.5 years between reports for CCPR) represent actual practice in cases where the treaty-prescribed interval has not been enforced for various reasons or where the treaty specifies no interval (CCPR has no specified time period; CERD has a 2-year reporting interval, which is too short to be workable; CRC has moved to a 6-year interval because of its workload). So approximately 100 reports are received each year, against the 280 that would be received if every state were filing on time. Present performance therefore represents about 36% of a full compliance rate of reporting.
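The compliance estimate is simple arithmetic on the table's figures; a minimal sketch, using the per-year values taken directly from the table above:

```python
# Expected annual reporting volume, using the per-year figures from the
# table above (ratifications divided by de facto periodicity, rounded).
expected_per_year = {
    "CCPR": 37, "CERD": 44, "CESC": 32, "CAT": 39, "CRC": 35,
    "CEDAW": 47, "CMW": 9, "CRPD": 30, "CED": 7,
}

total_expected = sum(expected_per_year.values())
actual_per_year = 100  # approximate current submission volume

print(total_expected)                             # 280
print(f"{actual_per_year / total_expected:.0%}")  # 36%
```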

This figure is broadly consistent with the specific data for the reports reviewed by the treaty bodies in 2014: my review of those 130 reports indicated an on-time submission rate of 22%.

2. Determining how many reports responded to and implemented each of the Concluding Observations from the prior reporting cycle

Each new periodic report should reply to all recommendations made by the treaty body from the prior report.  Of the 130 reports reviewed by the treaty bodies in 2014, 96 were periodic reports (the remainder were initial reports where no prior concluding recommendations were made). Of these 96 reports, by my count states replied to all of the prior recommendations in approximately 54 instances, representing 56% of the total. 

Treaty body   Reports   Responded   Percent
CESC          18        12          67%
CCPR          14        9           64%
CRC           17        8           47%
CERD          14        7           50%
CEDAW         20        13          65%
CAT           8         5           63%
CRPD          0         0           --
CED           0         0           --
CMW           5         0           0%
Total         96        54          56%
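The totals and the overall percentage follow directly from the per-Committee counts; a minimal sketch:

```python
# Response counts per treaty body, from the table above:
# (periodic reports reviewed in 2014, reports replying to all prior recommendations)
counts = {
    "CESC": (18, 12), "CCPR": (14, 9), "CRC": (17, 8), "CERD": (14, 7),
    "CEDAW": (20, 13), "CAT": (8, 5), "CMW": (5, 0),
}

total_reports = sum(reports for reports, _ in counts.values())
total_responded = sum(responded for _, responded in counts.values())

print(total_reports, total_responded)            # 96 54
print(f"{total_responded / total_reports:.0%}")  # 56%
```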

No reports are listed for CRPD or CED in the above table because neither treaty body has any periodic reports to review yet, only initial reports. SPT is not on this table because its procedure differs substantially from the others. For CAT, CCPR and CMW it is also not clear how best to evaluate reports submitted under the simplified procedure, the so-called LOIPR procedure -- if the state party fails to reply to all prior concluding recommendations but the list of issues prepared by the Committee does not ask it to do so, is it excused from further response?

It should also be noted that the above statistics reflect my subjective evaluation of each government's responsiveness. In some cases it was difficult to determine whether the government had responded to all of the prior concluding recommendations or only some of them; there was no separately delineated list, table, or labelling that identified each recommendation and the government's response to it. In some cases, notwithstanding this ambiguity, I assumed the government had responded to all of the recommendations; in others, the responses seemed so brief that I did not count them as a compliant response to all recommendations.

The above figures also reflect only whether a response of some type could be found in the report; I have not attempted to measure the quality or completeness of each particular response. So this is a very distant measure of actual implementation, but it is some indication of how far we are from a full implementation picture.

3. Follow-up mechanisms

Six treaty bodies now have a follow-up mechanism of some type, which requires states to respond within one year on 3 or 4 identified recommendations. A special rapporteur is then appointed to review these responses and report to the Committee. Four Committees -- CCPR, CAT, CEDAW and CERD -- have now been using this practice for several years. The information available on each Committee's website varies in completeness, so it is somewhat difficult to track performance.

I studied the CERD process this year. The Committee posts responses from each state party and sends letters commenting on the completeness and relevance of the responses. No NGO submissions are posted or summarized. An annual report comments briefly on the responses received, but does not provide any evaluative comments on their quality or completeness. I noted Committee letters sent to state parties thanking them for a "timely" response even though the responses were up to 9 months late.

For purposes of this analysis I looked at all of the responses posted to date for sessions 66 (March 2005) to 83 (August 2013), since responses are now overdue for all of these sessions. 
  • total number of state reports in this time period -- 162
  • total number of states who responded in some form -- 85 (52%)
  • total number of these states who responded on time -- 47 (29%)
As indicated earlier, it is not possible from this data to analyse how many states actually implemented the recommendations, but anecdotally it would seem to be a very low number, perhaps between 5 and 15%. The way information is posted and results are reported also makes implementation performance difficult to evaluate. But from the above data it is at least apparent that 77 of 162 governments failed to respond at all to the Committee's request to provide follow-up information on the implementation of its recommendations.
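The follow-up figures above reduce to a few divisions; a minimal sketch, using the totals cited for CERD sessions 66 to 83:

```python
# CERD follow-up responses, sessions 66 (March 2005) to 83 (August 2013),
# using the totals cited in the bullets above.
total_reports = 162  # state reports in the period
responded = 85       # responded in some form
on_time = 47         # responded on time

print(total_reports - responded)           # 77 states never responded
print(f"{responded / total_reports:.0%}")  # 52%
print(f"{on_time / total_reports:.0%}")    # 29%
```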


4. Recommendations

  1. Standing agenda item. Each Committee should have a standing agenda item at each session on the subject of implementation. The follow-up reports should be part of this item, along with general discussion, statistical summaries, and Committee decisions. NGOs and NHRIs should be invited to participate.
  2. Statistical summaries. Statistical summaries of implementation should be prepared and kept up to date. A discussion of these summaries should be included in the general agenda item on implementation.
  3. Table or index to responses in periodic reports. State parties should be encouraged to include a table or index in each periodic report, mapping each prior concluding recommendation to the location in the report where the government has provided a response on implementation. This change should be added to the latest reporting guidelines of the Committee. It should also be made an explicit request in every LOIPR. 
  4. Case decisions. Implementation of Views (individual case decisions) should also be more prominently reported on by each Committee. Each case in which a state party has been found in violation should be tracked and reported on. Government responses and petitioner responses should be posted or summarized (subject to petitioner consent). The relevant time period should also be clarified -- most decisions say that the government must implement the findings in 90 to 180 days, but it is not clear how this period is measured, especially where there has been a substantial delay between the official date of the decision and the date it first becomes public.
  5. Guidelines. A best-practice set of guidelines for implementation of Committee recommendations and judgements should be produced by each Committee.
  6. Training. More training opportunities, or an "implementation school", should be offered to state parties.
  7. Implementation teams. States should be encouraged to appoint their implementation teams before their report is heard by the Committee. Members of the implementation team should be offered training before they start and should be encouraged to attend the Committee hearing from which the recommendations they will implement originate.
  8. Official government websites. Governments should be expected to maintain official websites where their treaty responsibilities are disclosed, including reporting cycles and treaty body appearances. Each recommendation should be tracked, with implementation plans and activities reported.  A model list of website content should be provided (in this regard please see the suggestions I have made for these government websites in a prior article on this blog). 
  9. Civil society. NGOs and NHRIs should be given greater access and visibility in the follow-up processes. NGO submissions should be posted and reviewed. If a state party is late in submitting a response, the Committee should consider going forward with an analysis that includes the NGO report.
  10. Analytical summaries. More evaluative comments should be included by the Committee. It should not be enough that the state party has responded; the quality of the response should also be satisfactory. Has the state party implemented the recommendation? Is additional information needed? An analytical summary of these comments should be available at each session, updated to the latest comments. The Human Rights Committee has initiated a practice of assigning a "grade" to responses; something like this should be adopted by the other Committees, and it would be useful if the Human Rights Committee were to issue a statistical summary of these graded responses.
  11. Database management. The follow-up databases on each Committee's website should be regularly updated and include any NGO submissions received. Analytical materials from each session should be posted to the database within a prescribed time period, so that those looking for this information know when to check back. Alternatively, there should be a prominent notice on the database indicating when the latest update was made, and an opportunity for interested persons to sign up to be notified of updates.
  12. Press releases. Press releases should be issued from time to time, informing the news media of the follow-up process, providing statistical or other details, and linking to the relevant database information.
  13. Timeliness of state party responses. Please do not refer to responses filed up to 9 months late as "timely" in transmittal letters to the state party. This softens the deadlines to the point that they lose their credibility; it is hard to measure timeliness of submissions when responses 6 to 9 months late are called timely.
  14. Consider different modalities. The treaty bodies should consider a variety of modalities for reviewing implementation, including country visits and implementation hearings. One very interesting example is the country visit made by Mr. Morten Kjaerum of CERD at the invitation of the government of Ireland, in connection with the implementation of the Committee's concluding recommendations, in 2006.
  15. Drafting. Treaty bodies should attempt to improve the readability of concluding observations. See my article earlier this week on this subject. 
