7. Self-correctiveness with respect to data

Two sorts of "self-correction" are especially germane to our deliberations regarding the coherentist standard of acceptability: the one relating to the initial "data" (in our technical sense) that it makes use of, and the other to its mechanisms of plausibility-evaluation. Let us begin with the former, and consider the important idea of a retrospective reappraisal of data-sources.

For present purposes, the crucial structural feature of the coherence analysis is that it (1) begins with "raw" data, (2) refines these into revised and "cooked" (or duly processed) data, and then (3) deploys plausibility considerations in applying the coherence analysis to these processed data in the endeavor to extract from them those theses which, relative to these data, are qualified for acceptance-as-true.
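
Read procedurally, this three-stage structure can be pictured as a small pipeline. The sketch below (in Python) is purely illustrative: the sample reports, the trivial "cooking" step, and the numerical plausibility grades are hypothetical stand-ins rather than part of the analysis itself; what the sketch preserves is only the ordering insisted on above, namely raw data first, processed data second, plausibility-guided selection last.

```python
# Illustrative sketch of the three-stage structure; the sample reports,
# the "cooking" step, and the plausibility grades are hypothetical stand-ins.

RAW_DATA = [" The lamp was lit ", "The lamp was NOT lit", "The room was dark"]

# Assumed plausibility grading of each (processed) truth-candidate.
PLAUSIBILITY = {
    "the lamp was lit": 0.8,
    "the lamp was not lit": 0.3,
    "the room was dark": 0.7,
}

# Pairs of candidates that cannot coherently be accepted together.
CONFLICTS = [("the lamp was lit", "the lamp was not lit")]

def process(reports):
    """Stage (2): refine raw reports into 'cooked' data (here, trivial cleanup)."""
    return [r.strip().lower() for r in reports]

def coherence_screen(cooked, threshold=0.5):
    """Stage (3): drop the less plausible member of each conflicting pair,
    then accept whatever remains sufficiently plausible."""
    dropped = set()
    for a, b in CONFLICTS:
        if a in cooked and b in cooked:
            dropped.add(a if PLAUSIBILITY[a] < PLAUSIBILITY[b] else b)
    return [c for c in cooked
            if c not in dropped and PLAUSIBILITY.get(c, 0.5) >= threshold]

cooked = process(RAW_DATA)            # stage (2)
accepted = coherence_screen(cooked)   # stage (3)
print(accepted)   # ['the lamp was lit', 'the room was dark']
```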

Now it is clear that, as this process works its way along, one can discover that the basic resource of our initial policies in the setting-up of data was defective, in that certain sorts of "initially recognized truth-candidates" (= data) turn out in retrospect, in a systematic and regular way, to be found wanting. For example, if our "data" consist in the reports of various witnesses, we may well discover that the reports given by certain of these are so uniformly and regularly wrong that we can simply eliminate these "witnesses" as a source of usable data. On the other hand, the analysis may indeed (and, if all goes well, ought to) provide for the retrospective resubstantiation of our initial acceptance of data sources.
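
The witness example can be given a concrete, if deliberately simple-minded, rendering as a piece of bookkeeping: tally how often each source's reports end up among the accepted results and, on that retrospective record, either eliminate the source or resubstantiate it. The sketch below is such a hypothetical rendering only; the reports, the accepted set, and the 0.5 cutoff are all invented for illustration.

```python
# Hypothetical bookkeeping for the retrospective reappraisal of data sources.
# Each source is scored by how often its reports survive the coherence screening.

from collections import defaultdict

# Assumed inputs: reports tagged by source, and the set of theses that the
# coherence analysis ended up accepting (both are made-up examples).
reports = [("witness_A", "the car was red"),
           ("witness_A", "the driver fled on foot"),
           ("witness_B", "the car was blue"),
           ("witness_B", "the driver stayed"),
           ("witness_C", "the car was red")]
accepted = {"the car was red", "the driver fled on foot"}

def reappraise(reports, accepted, cutoff=0.5):
    """Split sources into 'retained' (resubstantiated) and 'eliminated',
    according to the retrospective success rate of their reports."""
    tally = defaultdict(lambda: [0, 0])          # source -> [confirmed, total]
    for source, claim in reports:
        tally[source][1] += 1
        if claim in accepted:
            tally[source][0] += 1
    retained, eliminated = [], []
    for source, (ok, total) in tally.items():
        (retained if ok / total >= cutoff else eliminated).append(source)
    return retained, eliminated

retained, eliminated = reappraise(reports, accepted)
print(retained)    # ['witness_A', 'witness_C']  -- initial acceptance resubstantiated
print(eliminated)  # ['witness_B']               -- systematically found wanting
```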

The relevant aspect of the structure of the coherence analysis may be portrayed from this standpoint as in Figure 1. This circular process clearly embodies an element of "self-correction" in applications of the coherence analysis, in making room for a revised and reformed view of the initial data that afford the very materials of the analysis, and in arriving at this result in the light of the workings of the analysis itself. There is a cyclic movement, a closing of the cycle that requires a suitable meshing: a meshing process that should eventually retrovalidate (retrospectively revalidate) the initial criteria of datahood with reference to the results to which they lead.
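
The closing of the cycle can likewise be pictured as an iteration to a fixed point: screen the data, reappraise the sources in the light of the outcome, rebuild the data pool from the sources that survive, and repeat until nothing further changes. The loop below sketches only this cyclic structure; the acceptance rule it uses (a claim is accepted when it outnumbers its contrary among the remaining reports) is an invented stand-in for the full coherence analysis, and the sources and claims are hypothetical.

```python
# Hedged sketch of the cyclic, self-corrective structure: screen, reappraise,
# rebuild the data pool, and repeat until sources and accepted theses stabilize.
from collections import Counter

reports = {  # hypothetical sources and their reports
    "witness_A": ["p", "q"],
    "witness_B": ["not-p", "not-q"],
    "witness_C": ["p", "q"],
    "witness_D": ["p", "not-q"],
}

def screen(pool):
    """Stand-in for the coherence analysis: accept a claim when it outnumbers
    its contrary among the reports currently in the pool."""
    counts = Counter(claim for claims in pool.values() for claim in claims)
    def contrary(c):
        return c[4:] if c.startswith("not-") else "not-" + c
    return {c for c in counts if counts[c] > counts.get(contrary(c), 0)}

sources = set(reports)
while True:
    accepted = screen({s: reports[s] for s in sources})
    # Retrospective reappraisal: keep sources whose reports are mostly confirmed.
    survivors = {s for s in sources
                 if sum(c in accepted for c in reports[s]) / len(reports[s]) >= 0.5}
    if survivors == sources:      # the cycle has closed: criteria and results mesh
        break
    sources = survivors           # revised and reformed view of the usable data

print(sorted(sources))   # ['witness_A', 'witness_C', 'witness_D']
print(sorted(accepted))  # ['p', 'q']
```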