TRC Final Report
Volume 1, Chapter 6, Appendix, Subsection 2 (original page 160, paragraphs 8 to 21)

Sources for the design

8 The Commission drew on a variety of prior human rights data projects in order to design its database. These included the experience of the Haitian National Commission for Truth and Justice and the United Nations Commission for Truth in El Salvador - at the time, the only two truth commissions to have undertaken quantitative analysis of human rights violation data on the scale proposed by the South African Commission. Consultants from the Investigative Task Unit (a special unit established by the Minister of Safety and Security to investigate alleged hit squad activities in KwaZulu-Natal) and non-government organisations (NGOs) that had participated in the Human Rights Documentation Project also made suggestions on the information flow.

9 The instrument most extensively used by the Commission’s Database Development Group was developed by representatives of six human rights NGOs with experience in the design of human rights information systems.[12] Full evaluations of the Commission’s information flow were conducted in September 1996 and April 1997.[13] In addition, numerous periodic office-specific or stage-specific evaluations were conducted.

Theoretical basis for the information flow

10 The Commission based its work on the assumption that objective[14] knowledge about the social world in general, and about human rights violations in particular, is possible. Some analysts, in particular academic anthropologists, have questioned this assumption. Their criticism is directed primarily at the decontextualised nature of human rights reporting in anecdotal presentations or legal casework, but it is equally - possibly even more - relevant to quantitative analysis.
11 In brief, analysts such as Richard A Wilson are concerned that “violence, like any other social process, is expressed and interpreted according to sets of metaphors about the nature of power, gender relations, and human bodies.”[15] Any report of political violence must place the violence within the relevant web of social networks and contingent cultural meanings. However, Wilson does not conclude that objectified or universalised human rights analysis is somehow fundamentally meaningless; only that, on its own, legalistic or quantitative analysis is inadequate. He thus calls for a blend of methods at different levels to explain human rights violations.

12 Wilson’s call provides an anthropological parallel to the Act’s legal requirements. The Act demands methodological pluralism. As argued above, it required that the Commission gather information and analyse it rigorously. Beyond rigour even, it requires an analysis of “systematic patterns” and of “context, motives and perspectives which led to such violations” (sections 4(a)(i) and (ii)). The first requirement implies a quantitative treatment, and the second necessitates historical or ethnographic reflection.

13 In short, the Act echoes classical sociologist Max Weber’s definition of the sociological method, whereby “historical and social uniqueness results from specific combinations of general factors, which when isolated are quantifiable.”[16] Like the Commission, Weber is concerned that social analysis should be sufficient to draw general conclusions, but that it should simultaneously preserve and reflect on individual case details. Weber recommends that analysts identify general factors in the universe of examples by applying ideal types - “controlled and unambiguous conceptions” - which illuminate particular phenomena of study. However, the general factors must be understood in terms of the particularities of individual cases.
This definition of a set of ‘ideal types’ is then applied to a universe of narrative (or semi-structured) statements taken in interviews with deponents.

14 At the Commission, the data processing teams implemented these ‘ideal types’ using a controlled vocabulary and a coding frame. The teams coded deponents’ statements in standard forms before capturing the information on the database.

15 Weber was careful to note that this method is most useful as a comparative device. That is, the aggregation of examples of a particular ideal type with one set of characteristics provides a basis for evaluation of a second aggregation of examples of a similar but distinct ideal type with a different set of characteristics.[17] The comparison of patterns of violations - among regions, across time, between types of victims, and among groups of perpetrators - is the basis for the quantitative analysis presented in the report.[18]

12 Patrick Ball, Ricardo Cifuentes, Judith Dueck, Romilly Gregory, Daniel Salcedo, and Carlos Saldarriaga, ‘A Definition of Database Design Standards for Human Rights Agencies’, American Association for the Advancement of Science/HURIDOCS, November 1994.

13 See Patrick Ball, ‘Evaluation of Commission’s Information Flow and Database, with Recommendations’, Memorandum to the Commission, 9 September 1996, 34 pp; see also Ball, ‘Statistical Analysis and Other Research Using the Commission Database: Notes for Analysts’, Memorandum to the Commission, 11 April 1997.

14 Or at least knowledge that is inter-subjectively reliable; that is, knowledge which the involved actors can agree is held in common between them. This is a weaker assumption than objective knowledge, but it has the same practical effect.

15 Richard A Wilson, ‘Representing Human Rights Violations: Social Contexts and Subjectivities’, pp 134-160 in Human Rights, Culture, and Context, Richard A Wilson (ed), London: Pluto Press, 1997: p 148.

16 Hans Gerth and C.
Wright Mills, ‘Methods of Social Science’, in From Max Weber: Essays in Sociology, Oxford University Press, 1946: p 59. This paragraph and the following one follow Gerth and Mills’ description of Weber’s methodology.

17 See Gerth and Mills, pp 59-60.

18 See Richard Claude and Thomas Jabine, ‘Exploring Human Rights Issues with Statistics’, pp 5-34; Robert Goldstein, ‘The Limitations of Using Quantitative Data in Studying Human Rights Abuses’, pp 35-61; George Lopez and Michael Stohl, ‘Problems of Concept Measurement in the Study of Human Rights’, pp 216-234 in Human Rights and Statistics: Getting the Record Straight, edited by Thomas Jabine and Richard Claude, University of Pennsylvania, 1992. See also the pioneering work of Judith Dueck and Aida Maria Noval, HURIDOCS Standard Formats Supporting Documents, Geneva: HURIDOCS, 1993, and Ball, 1996.

Statistical limitations and sampling

16 Section 4(b) of the Act required that the Commission accept statements from all South Africans who wished to make them. Hence, the Commission did not carry out a ‘survey’ of violations in the sense of drawing a probabilistic sample of victims.[19] Those who chose to come forward defined the universe of people from whom the Commission received information.

17 Human rights data are almost never taken from probabilistic samples. Instead, people decide for themselves if they will make statements.
This ‘self-selection’ of the sample introduces a number of factors that must be taken into account when interpreting findings:[20]

a people who live in areas very far from where the data are being collected have less chance of being in the sample than those closer to the offices in which statements are taken, because of transport difficulties, for example, or the relative inaccessibility of rural areas;

b people who are energetic are more likely to give statements than those who are ill, injured, elderly, traumatised, or suffering profound depression;

c potential deponents who died before the Commission began work could not give statements; hence events that took place further in the past are under-reported;

d people with no access to the media (radio, newspapers or television) are less likely to approach the Commission;

e people from constituencies that are hostile to the Commission are less likely to make statements.

18 Since the Commission’s sample was not a probabilistic sample, it was not possible to use the data to calculate how many violations, in total, took place in South Africa. Without knowing what proportion of all potential victims actually came to the Commission, the overall total cannot be estimated. What is known is that there were at least 21 000 gross violations of human rights.

19 However, the data gathered from the human rights violations statements do permit the kinds of analyses to which they are subjected in the various chapters of this report. It is important to note that the Commission’s data were based on corroborated findings. This means that, at a minimum, these violations (if not many more) definitely happened in these places at these times. Furthermore, none of the conclusions in the Commission report is based on quantitative data alone; in each case, the quantitative data are linked to the accounts of contemporary journalists, histories of the various regions, and analyses of reported situations by NGO human rights groups.
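The logic of paragraph 18 can be sketched in a few lines of Python. The regional counts below are invented for illustration (only the 21 000 total is from the report), and the estimator function is a standard inverse-probability scaling, not the Commission’s method:

```python
# Hypothetical corroborated violation counts from self-selected deponents.
# The regional split is invented; only the overall minimum comes from the report.
corroborated = {"Region A": 9_000, "Region B": 7_000, "Region C": 5_000}

# The only safe claim from self-selected data is a lower bound:
minimum_total = sum(corroborated.values())
assert minimum_total == 21_000  # "at least 21 000 gross violations"


def projected_total(sample_count: int, inclusion_probability: float) -> float:
    """Scale a sampled count up by the (known) chance of being sampled.

    Valid only for a probabilistic sample, where every member of the
    population has a known, fixed chance of inclusion (footnote 19).
    """
    return sample_count / inclusion_probability


# If every victim had had a known 1-in-10 chance of being sampled,
# 2 100 sampled cases would project to 21 000 in total:
assert projected_total(2_100, 0.10) == 21_000.0

# Under self-selection, the inclusion probability is unknown and uneven
# (distance, health, age, media access, hostility), so no such projection
# is possible; the corroborated count remains a minimum, not an estimate.
```

The design point is the one the report makes: without a known inclusion probability, the data support “at least N” statements but never “N in total” statements.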
20 The quantitative results on which arguments in this report rest are not subtle. Only where there are great differences in relative rates, or very distinctive patterns that are stable across regions, does the report interpret the statistics as findings.

21 The Commission’s database represents an unequalled collection of data on a set of events that took place during a unique period of South Africa’s history. It may only have scratched the surface, but that surface has been scratched in unprecedented detail.

19 Statistical projection of findings and analysis from a sample to the society at large can only be made if a probabilistic sample is used - one that is drawn randomly from the population so that every member of the population has an equal or fixed chance of being included in the sample.

20 See, for example, Ignacio Cano, ‘Evaluating Human Rights Violations’, pp 221-233 in Evaluation for the 21st Century, edited by Eleanor Chelinsky and William Standish, Beverly Hills, CA: Sage Press, 1997.
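The comparative use of coded data described in paragraphs 15 and 20 - aggregating instances of an ideal type and comparing relative rates across regions - can be sketched as follows. The records, region names, and violation categories here are invented for illustration and do not come from the Commission’s database:

```python
from collections import Counter

# Hypothetical coded records: (region, violation type), as produced by
# coding narrative statements against a controlled vocabulary.
records = [
    ("Region A", "killing"), ("Region A", "killing"),
    ("Region A", "torture"),
    ("Region B", "killing"),
    ("Region B", "torture"), ("Region B", "torture"), ("Region B", "torture"),
]

counts = Counter(records)


def rate(region: str, violation_type: str) -> float:
    """Share of a region's coded violations that fall into one category."""
    region_total = sum(n for (r, _), n in counts.items() if r == region)
    return counts[(region, violation_type)] / region_total


# Relative rates, not raw counts, are what the comparison rests on:
assert rate("Region A", "killing") == 2 / 3   # killings dominate Region A
assert rate("Region B", "killing") == 1 / 4   # torture dominates Region B
```

A gap this large between relative rates is the kind of "great difference" paragraph 20 treats as interpretable; small differences in a self-selected sample would not be read as findings.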