The data
Measuring these kinds of results has only become possible relatively recently, as more data, including historical data, has been posted publicly by the UN human rights office. Eight of the ten treaty bodies now have some form of follow-up to concluding observations. Six of those post that information publicly on their websites, and four of the six now have a large enough data sample to support some useful conclusions.
Here is my analysis of the available information:
| Committee | Response is on time | Response is late | Response is overdue | NGO reports | Compliance rate | Aggregate number of report reviews / time span |
|---|---|---|---|---|---|---|
| CCPR | (33) 32% | (45) 43% | (24) 25% | (58) 43% | 12% | 102 due / 126 total / 8 years |
| CEDAW | (26) 23% | (52) 46% | (35) 31% | (19) 17% | 20% | 113 due / 160 total / 5 years |
| CAT | (58) 36% | (58) 36% | (45) 28% | (31) 19% | Not available | 161 due / 180 total / 12 years |
| CERD | (48) 28% | (40) 24% | (82) 48% | (2)** 1.2% | Not available | 170 due / 191 total / 10 years |
| CED | (3) 50% | (2) 33% | (1) 17% | (2) 33% | 33% | 6 due / 13 total / 2 years* |
| CRPD | (2) 50% | 0% | (2) 50% | (1) 25% | Not available | 4 due / 24 total / 3 years* |
| Total | 170/556 = 31% | 197/556 = 35% | 189/556 = 34% | 110/556 = 20% | 37/221 = 17% | 556 total responses due |
Footnotes:
1. *Both CED and CRPD have sample sizes that are too small for this data to be meaningful so far.
2. *CRPD has also adopted the practice of identifying recommendations for follow-up in this procedure only about 50% of the time.
3. CMW adopted the procedure this year; no data is yet posted (no responses are yet due).
4. CESCR has also adopted a pilot program this year, but it is too early to have any data.
5. Three of the Committees provide an evaluation or grading system for the state party's response. I have used the "largely satisfactory" or "[A]" grade in these evaluations to calculate the compliance rates in the table.
6. This table addresses only follow-up to Concluding Observations. When a treaty body issues Concluding Observations on a state report, it identifies 2 to 4 recommendations for follow-up within 1-2 years (the particular deadline is specified in the report). The table does not cover recommendations (Views) in individual cases -- that data is being reported less systematically so far, and I have not determined how best to measure the response rates.
7. **The 2 NGO reports listed for CERD actually appeared to be from NHRIs. Apparently no NGO reports were received by CERD under its follow-up procedure.
8. I have considered a state response late for these purposes if it was submitted 2 months or more after the recommended deadline. Most late responses are far more than 2 months late, so the 2-month filter was my way of separating the "barely late" responses from the "really late" ones.
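The classification rule in footnote 8 and the aggregation in the table's "Total" row can be sketched in a few lines. The per-committee counts come from the table above; the two-month cutoff is footnote 8's rule; the function and variable names are my own, and this is only an illustrative sketch, not the tooling actually used for the analysis.

```python
from datetime import date, timedelta
from typing import Optional

def classify(deadline: date, submitted: Optional[date]) -> str:
    """Footnote 8's rule (my sketch): no submission -> overdue;
    within ~2 months of the deadline -> on time; otherwise late."""
    if submitted is None:
        return "overdue"
    if submitted <= deadline + timedelta(days=61):  # ~2-month grace period
        return "on time"
    return "late"

# Per-committee (on time, late, overdue) counts, taken from the table above.
counts = {
    "CCPR":  (33, 45, 24),
    "CEDAW": (26, 52, 35),
    "CAT":   (58, 58, 45),
    "CERD":  (48, 40, 82),
    "CED":   (3, 2, 1),
    "CRPD":  (2, 0, 2),
}

total_due = sum(sum(row) for row in counts.values())  # 556 responses due
for i, label in enumerate(["on time", "late", "overdue"]):
    n = sum(row[i] for row in counts.values())
    print(f"{label}: {n}/{total_due} = {round(100 * n / total_due)}%")
```

Summing the columns this way reproduces the Total row: 170/556 on time (31%), 197/556 late (35%), and 189/556 overdue (34%).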
Viewed as a bar chart, here is a comparison of state performance by treaty body:
As you can see from the table and chart, CAT has the best on-time response rate (36%) and CCPR has the best overall response rate (32% + 43% = 75%). CERD has the worst overall response rate -- 48% of states fail to respond at all to the Committee's requests for implementation information.
As noted earlier, the CED and CRPD samples are still too small to support any generalisations.
It is also important to note that very few NGOs appear to be active in this follow-up process so far: NGO reports were submitted in only 20% of the total cases, and most of those are in the CCPR process (and most of those reports come from just two NGOs, the CCPR Centre and TRIAL -- kudos to both!). NGO involvement in this follow-up procedure is therefore still very low.
Is the glass half full or half empty?
On the bright side, 66% of states respond at some point to the Committees' requests to submit information on implementation -- 31% on time and 35% two months or more late.
On the more disappointing side, these figures indicate that states fail to respond at all in 34% of cases and, where the response is actually evaluated, only 17% submit a satisfactory response.
Conclusion
So it seems we have a long way to go before implementation rates in the human rights treaty system are acceptable. States need to submit responses on a more timely basis, and they need to implement treaty body recommendations more consistently. Admittedly, states sometimes disagree with a treaty body's recommendations and are unwilling to implement them for that reason -- this is a natural aspect of a constructive dialogue process. But states that believe a treaty body recommendation is wrong are free to submit written comments to that effect, something they rarely do at present.
My key takeaway points:
- states need to submit their responses in a more timely fashion; chasing late responses wastes time and resources in a system that cannot afford such inefficiencies
- states need to implement treaty body recommendations more consistently
- the data that makes it possible to measure these matters must remain open, timely, and transparent for those of us following the performance of the system
- NGOs should get more involved in submitting their own reports on state party compliance with treaty body recommendations; in reviewing treaty body comments during this research, it was clear to me that a well-reasoned, relevant NGO report is given significant weight by the treaty body in its evaluation