debatelab-admin committed
Commit 536c1fc
1 Parent(s): faf2827
Files changed (1):
  1. data/corpus/betz_rau.txt +69 -69
data/corpus/betz_rau.txt CHANGED
@@ -24,7 +24,7 @@ making and drives public deliberation and debate.
24
  Unfortunately, we are not very good at getting practical arguments right. Intuitive
25
  practical reasoning risks suffering from various shortcomings and fallacies as
26
27
28
  soon as a decision problem becomes a bit more complex – for example in terms of
29
  predictive uncertainties, the variety of outcomes to consider, the temporal structure
30
  of the decision problem, or the variety of values that bear on the decision (see
@@ -69,7 +69,7 @@ extension of – the programme of an argumentative turn, which was so far mainly
69
  perspectives of political science and empirical discourse analysis.
70
  2 For examples, see Singer (1988:157–9).
71
72
73
  guided by the goal of making the given argumentation as clear as possible and by
74
  standards for evaluating arguments: premises can be right/true or wrong, arguments
75
  can be valid or invalid, strong or weak.
@@ -109,7 +109,7 @@ with reference to examples.
109
  3 We freely draw on our earlier work, specifically Brun (2014), Brun and Hirsch Hadorn (2014),
110
  Betz (2013), and Betz (2010).
111
112
113
  2.1 Tasks of Argument Analysis
114
  Argument analysis, understood in a wide sense, involves two basic activities:
115
  reconstruction and evaluation of argumentation and debates.
@@ -152,7 +152,7 @@ place of “multiple”. See Snoeck Henkemans (2001) for a survey on terminology and
152
  of complex argumentation.
153
  6 We use “inference” as a technical term for completely explicit and well-ordered arguments.
154
155
156
  requires taking decisions which need to be made with a perspective to the other
157
  reconstructive tasks. Another reason is that each subsequent step of reconstruction
158
  will identify additional structure, which may prompt us to revise or refine an
@@ -191,7 +191,7 @@ identify the structure of the argumentation
191
  Fig. 3.1 Interplay of reconstruction and evaluation in argument analysis (Adapted from Brun and
192
  Hirsch Hadorn 2014:209)
193
194
195
  Let us now turn from reconstruction to evaluation. A comprehensive evaluation
196
  of arguments and complex argumentation involves assessing a whole range of
197
  qualities. The following may be distinguished:
@@ -232,7 +232,7 @@ author’s argumentation to argument analysis which seeks to find the best argum
232
  that can be constructed following more or less closely the line of reasoning in
233
  some given argumentative text.
234
235
236
  The exegetical aspect implies that reconstructions must answer to hermeneutic
237
  principles, especially accuracy (sometimes called “loyalty”7) and charity. “Accuracy”
238
  means that a reconstruction must be defensible with respect to the argumentative
@@ -273,7 +273,7 @@ position, resolve whether to accept a controversial claim, reach consensus on so
273
  the context of argument analysis see Reinmuth (2014).
274
  8 On various aspects of clarification see also Morscher (2009:1–58) and Hansson (2000).
275
276
277
  issue, shake an opponent’s convictions or explore the consequences of adopting a
278
  certain position. Argument analysis by itself does not directly realize such aims,
279
  nor does it necessarily lead to better arguments. However, it may prove effective
@@ -314,7 +314,7 @@ interesting and well paid job in Chicago. But the catch is that, if you wanted t
314
  job, you would have to take a plane [. . .]. Therefore there would be a very small but
315
  positive probability that you might be killed in a plane accident. [. . .]
316
317
318
  [3.2] The maximin principle says that you must evaluate every policy available to you in
319
  terms of the worst possibility that can occur to you if you follow that particular policy. [. . .]
320
  [2.1] If you choose the New York job then the worst (and, indeed, the only) possible
@@ -357,7 +357,7 @@ whether some element of a text is part of an argument is functional. Being a
357
  premise or a conclusion is not a matter of the form or the content of a sentence,
358
  but a role a statement can play, just like being an answer. Identifying arguments in a
359
360
361
  text therefore presupposes at least a rough understanding of the structure of the text.
362
  A well-tested strategy is to start by sketching the main argument(s) in a passage in
363
  one’s own words and as succinctly as possible. For [Harsanyi] that could be
@@ -404,7 +404,7 @@ needs to be resolved, for example, different readings of scope (“Transportatio
404
  and industry contribute 20 % to the US greenhouse gas emissions.”). Thirdly,
405
  context-dependent, for example, indexical (“I”, “this”, “here”, “now”, . . .) and
406
407
408
  anaphoric (“Harsanyi quotes Rawls before he criticizes him.”), expressions must be
409
  replaced if there is a danger that their interpretation might not be clear in the
410
  resulting representation of the argument. In practice, the necessary reformulation
@@ -448,7 +448,7 @@ representation of inference
448
  experiment” and r to “we ought not to perform on the animal an experiment that would be
449
  considered outrageous if performed on one of us.”
450
451
452
  (e.g. “and” instead of “but”, “not acceptable” instead of “inacceptable”) and
453
  especially eliminating stylistic variations, for example, by replacing expressions
454
  which are synonymous in the context at hand by one and the same. In the examples
@@ -491,7 +491,7 @@ author and deal with “incomplete” arguments by revising the ascribed belief
491
  As Jacquette (1996) has pointed out, adding a premise is in some cases less charitable than
492
  strengthening a premise or weakening the conclusion.
493
494
495
  plausible. Exercising judgement rather than applying a formal procedure is needed
496
  for assessing the alternative suggestions and deciding which one to select.
497
  Both the notion of an enthymeme and the appeal to charity are linked to the
@@ -536,7 +536,7 @@ can convert the argument at hand into an equivalent deductive one with a
536
  13 Sentence S is logically stronger than sentence T (and T is logically weaker than S) just in case S
537
  implies T but not vice versa.
538
539
540
  weakened premise and investigate which additional premises are needed for such a
541
  conversion. For both strategies, argumentation schemes may be used as a
542
  heuristic tool.
@@ -578,7 +578,7 @@ but calls for investigation by, for example, perception, science or ethics. The
578
  exceptions are inconsistencies that can be detected by logical or semantical analysis
579
  which shows that the logical form or the meaning of a set of premises guarantees
580
581
582
  that they cannot all be true.14 Inferences involving an inconsistent set of premises
583
  are negatively evaluated since they cannot perform the core functions of arguments;
584
  they provide no reason in favour of the conclusion. However, arguments with an
@@ -624,7 +624,7 @@ and only derivatively to arguments. An argument can then be called “deductive”
624
  because it is meant or taken to be evaluated by deductive standards, or because it performs well
625
  with respect to deductive standards. (Skyrms 2000:ch. II.4).
626
627
628
  deductively valid is more ambitious insofar as referring to just one case will not
629
  do. We rather need a general argument which shows that there cannot be a case in
630
  which the premises are true and the conclusion false.
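
To make this evaluation criterion concrete: a propositional inference is deductively valid just in case no assignment of truth values makes all premises true and the conclusion false. The following minimal Python sketch (not part of the quoted chapter; the toy inference is invented) checks this by brute-force truth-table search.

```python
from itertools import product

def valid(premises, conclusion, atoms):
    """Deductively valid iff there is no case in which all premises are
    true and the conclusion is false (the criterion stated above)."""
    for values in product([True, False], repeat=len(atoms)):
        case = dict(zip(atoms, values))
        if all(p(case) for p in premises) and not conclusion(case):
            return False, case          # counterexample found
    return True, None

# Hypothetical toy inference: p, p -> q, therefore q (modus ponens).
atoms = ["p", "q"]
premises = [lambda c: c["p"], lambda c: (not c["p"]) or c["q"]]
conclusion = lambda c: c["q"]
print(valid(premises, conclusion, atoms))   # (True, None): no counterexample exists
```
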
@@ -669,7 +669,7 @@ validity: non-deductive strength is compatible with the conclusion being false e
669
  17 In this chapter, we use “validity” simpliciter as an abbreviation for “deductive validity”; in the
670
  literature it often also abbreviates “formal validity”.
671
672
673
  if all the premises are true, it comes in degrees, and it is nonmonotonic; that is,
674
  adding premises can yield a stronger or weaker argument. An immediate consequence
675
  is that even if a strong non-deductive argument supports some conclusion,
@@ -714,7 +714,7 @@ The second type is exemplified in problems of dialectical irrelevance such as
714
  18 Lumer (2011) explains how argumentation schemes can be exploited for deductivist
715
  reconstructions.
716
717
718
  arguments which do not support the thesis they are presented as supporting
719
  (ignoratio elenchi) or arguments which attack a position the opponent does not
720
  in fact defend (“straw-man”).19 In this way, Harsanyi’s undercut seems to miss
@@ -754,7 +754,7 @@ follows what we quoted as [Harsanyi].
754
  for example those which include evaluative terms (“good”, “better”). For a more precise and
755
  sophisticated discussion (using a different terminology), see Morscher (2013).
756
757
758
  explicit normative phrases in an argumentation relate to the same normative
759
  perspective.
760
  A second challenge for reconstructing practical arguments arises in connection
@@ -795,7 +795,7 @@ modality. The situation is much more complex if for practical arguments which in
795
  sentences; that is, sentences only part of which are in the scope of a deontic modality. See
796
  Morscher (2013) for an accessible discussion.
797
798
799
  4 Analysing Complex Argumentation
800
  4.1 Reconstructing Complex Argumentation
801
  as Argument Maps
@@ -836,7 +836,7 @@ Debater’s Handbook (Sather 1999:255–7); the items have only been shortened
836
  (as indicated) and re-labelled. The fact that many of the descriptive claims made
837
  are false (as of today) does not prevent the example from being instructive.
838
839
840
  Pro Con
841
  [Pro1.1] The world faces an energy crisis. Oil
842
  will be exhausted within 50 years, and coal will
@@ -900,7 +900,7 @@ individual arguments is guaranteed qua charitable reconstruction. Rather than us
900
  inference schemes for the reconstruction, we suggest adding corresponding general premises that
901
  can be criticized. Pollock’s undercut-relation hence effectively reduces to the attack relation.
902
903
904
  • An argument supports another argument if the conclusion of the supporting
905
  argument is identical with (or at least entails) a premise of the supported
906
  argument.
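
As an illustration of how such relations could be computed once individual arguments have been reconstructed as premise-conclusion structures, here is a minimal Python sketch. It is not the authors' tooling: the sentence labels are hypothetical, and entailment and contradiction are simplified to identity and an explicit "not-" prefix.

```python
def negate(s):
    return s[4:] if s.startswith("not-") else "not-" + s

def supports(a, b):
    """a supports b if a's conclusion is (here: identical with) a premise of b."""
    return a["conclusion"] in b["premises"]

def attacks(a, b):
    """a attacks b if a's conclusion is the negation of one of b's premises."""
    return negate(a["conclusion"]) in b["premises"]

# Hypothetical labels, loosely following the nuclear-energy example.
pro11 = {"premises": {"energy-crisis", "nuclear-only-alternative"},
         "conclusion": "expand-nuclear"}
con21 = {"premises": {"accident-risk-unacceptable"},
         "conclusion": "not-expand-nuclear"}
central = {"premises": {"expand-nuclear"}, "conclusion": "nuclear-thesis"}

print(supports(pro11, central))   # True: conclusion reappears as a premise
print(attacks(con21, central))    # True: conclusion negates a premise
```
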
@@ -938,7 +938,7 @@ arguments and theses
938
  (boxes) in the illustrative
939
  debate about nuclear power
940
941
942
  relations between the arguments, and theses). The map is basically a hypothesis
943
  about the debate’s dialectical structure, which has to be probed through detailed
944
  reconstructions of the individual arguments. At the same time, this hypothesis
@@ -982,7 +982,7 @@ entails non-T!]
982
  These two reconstructions corroborate the dialectic relations as presumed in the
983
  preliminary argument map (cf. their conclusions).
984
985
986
  4.2 Argument Maps as Reasoning Tools
987
  Let us now suppose that all arguments have been reconstructed like [Pro1.1] and
988
  [Con2.1] above, and that the dialectic relations as visualized in Fig. 3.4 do really
@@ -1026,7 +1026,7 @@ undefended arguments must not be accepted. According to the approach championed
1026
  chapter, in contrast, any argument can be reasonably accepted, as long as the proponent is willing
1027
  to give up sufficiently many beliefs (and other arguments).
1028
1029
1030
  • On the macro level, a complete (partial) position specifies for all (some) arguments
1031
  in the debate whether it is accepted or refuted. To accept an argument
1032
  means to consider all its premises as true. To refute an argument implies that at
@@ -1069,7 +1069,7 @@ and [Con2.1] is obviously not dialectically coherent; it directly violates one o
1069
  above constraints. A partial position according to which all premises of [Pro1.1]
1070
  and [Con2.1] are true is not dialectically coherent, either, because truth-values of
1071
1072
1073
  the remaining statements (i.e. conclusions) cannot be fixed without violating one of
1074
  the above constraints.
1075
  A micro or macro position which is not dialectically coherent violates basic
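
A rough Python sketch of such a coherence check (the constraints are paraphrased from the surrounding text; the mini-map is hypothetical, not [Pro1.1]/[Con2.1] themselves): a position that accepts all premises of two arguments with contradictory conclusions comes out as incoherent.

```python
def negate(s):
    return s[4:] if s.startswith("not-") else "not-" + s

def coherent(position, arguments):
    """Check a (partial) micro position, i.e. a truth-value assignment to
    sentence labels, against two paraphrased constraints: (i) a sentence and
    its negation are never both true; (ii) since reconstructed arguments are
    taken to be valid, if all premises are true the conclusion is not false."""
    for s, v in position.items():
        if v and position.get(negate(s)) is True:
            return False                                   # violates (i)
    for arg in arguments:
        if all(position.get(p) is True for p in arg["premises"]) \
                and position.get(arg["conclusion"]) is False:
            return False                                   # violates (ii)
    return True

args = [{"premises": {"p1", "p2"}, "conclusion": "t"},       # pro argument
        {"premises": {"q1"}, "conclusion": "not-t"}]         # con argument
print(coherent({"p1": True, "p2": True, "q1": True,
                "t": True, "not-t": True}, args))            # False
print(coherent({"p1": True, "p2": True, "q1": False,
                "t": True, "not-t": False}, args))           # True
```
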
@@ -1103,7 +1103,7 @@ Fig. 3.5 Two macro positions, visualized against the background of the nuclear e
1103
  argument map. “Checked” arguments are accepted, “crossed” arguments are refuted, “flashes”
1104
  indicate local violations of rationality criteria (see also text)
1105
1106
1107
  precise normative trade-offs involved when aggregating conflicting practical
1108
  arguments.26
1109
  Over and above coherence checking, argument maps can be valuable tools for
@@ -1149,7 +1149,7 @@ grandma, one should attempt to analyze the reasoning by means of the two princip
1149
  not lie to relatives” and “You must not lie to strangers”, which can then be balanced against each
1150
  other.
1151
1152
1153
  policy questions, further dissent concerning other arguments is then irrelevant
1154
  (regarding policy consensus formation).
1155
  Let us briefly return to our third criticism of pro/con lists: improper aggregation
@@ -1172,7 +1172,7 @@ illustrative argument map
1172
  Fig. 3.7 A simple, abstract
1173
  argument map
1174
1175
1176
  coherent micro position on this map and to determine whether one should accept the
1177
  central thesis, one may execute the decision tree shown in Fig. 3.8.27
1178
  We have started this section with the issue of aggregating conflicting reasons.
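
Figure 3.8 is not reproduced here; as a stand-in, the following sketch does the same job by brute force, enumerating all truth-value assignments over a hypothetical two-argument map and keeping the dialectically coherent ones (constraints paraphrased as in the sketch above).

```python
from itertools import product

arguments = [{"premises": ("p",), "conclusion": "t"},        # pro the central thesis t
             {"premises": ("q",), "conclusion": "not-t"}]    # con the central thesis
sentences = ("p", "q", "t", "not-t")

def coherent(pos):
    if pos["t"] and pos["not-t"]:
        return False                                         # contradictory pair accepted
    return all(not all(pos[p] for p in a["premises"]) or pos[a["conclusion"]]
               for a in arguments)                           # validity of each argument respected

positions = [dict(zip(sentences, values))
             for values in product([True, False], repeat=len(sentences))]
coherent_positions = [p for p in positions if coherent(p)]
print(len(coherent_positions), "of", len(positions), "candidate positions are coherent")
```
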
@@ -1210,7 +1210,7 @@ we do not distinguish between denying a statement and suspending judgement.
1210
  28 This section is adapted from http://www.argunet.org/2013/05/13/mapping-the-climate-engineer
1211
  ing-controversy-a-case-of-argument-analysis-driven-policy-advice/ [last accessed 16.03.2015].
1212
1213
1214
  to offset the effects of anthropogenic GHG emissions. CE includes methods which
1215
  shield the earth from incoming solar radiation (solar radiation management) and
1216
  methods which take carbon out of the atmosphere (carbon dioxide removal).29
@@ -1253,7 +1253,7 @@ that CE should be researched into so as to have these methods ready for deployme
1253
  29 On the ethics of climate engineering and the benefits of argumentative analysis in this field
1254
  compare Elliott (2016).
1255
1256
1257
  in time. They have then visualized the core position in the argument map and
1258
  calculated the logico-argumentative implications of the corresponding stance
1259
  (cf. Fig. 3.9). The enhanced map shows, accordingly, which arguments one is
@@ -1271,7 +1271,7 @@ Fig. 3.9 Illustrative core position (here: thumbs up) and its logico-argumentati
1271
  (here: thumbs down) in a detailed reconstruction of the moral controversy about so-called climate
1272
  engineering (Source: Betz and Cacean 2012:87)
1273
1274
1275
  interestingly, though, all the empirical chapters of the assessment report
1276
  (on physical and technical aspects, on sociological aspects, on governance aspects,
1277
  etc.) consistently refer to the argument map and make explicit to which arguments
@@ -1309,7 +1309,7 @@ argument.
1309
  30 Steele (2006) interprets the precautionary principle as a meta-principle for good decision-making
1310
  which articulates essentially these two requirements.
1311
1312
1313
  This choice boils down to the following question: should we allow for decision
1314
  principles which individually do not satisfy standards of good decision-making? –
1315
  Yes, we think so. The following simplified example is a case in point:
@@ -1347,7 +1347,7 @@ which do not correspond to any of these schemes. And schemes might have to be
1347
  adapted in order to take the original text or plausibility etc. into account. That is,
1348
  schemes are rather prototypes that will frequently provide a first version of an
1349
1350
1351
  argument reconstruction, which will be further improved in the reconstruction
1352
  process.
1353
  It is characteristic for practical arguments under uncertainty that their descriptive
@@ -1388,7 +1388,7 @@ when bringing about S] ought to be the case that S.
1388
  (3) There is no alternative to X for agent A that [will/is likely to/might] bring
1389
  about S and is more suitable than X.
1390
1391
1392
  (4) The certain, likely and possible side-effects of agent A doing X are collectively
1393
  negligible as compared to the [certain/likely/possible] realization
1394
  of S.
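
To illustrate how a scheme with such bracketed epistemic qualifiers can be instantiated, here is a small Python sketch; the premise wordings are abridged paraphrases and the example instantiation is invented, not taken from the chapter.

```python
# Abridged template of the practical argument scheme above; the qualifier
# slot stands for the bracketed variants [will / is likely to / might].
SCHEME = [
    "It ought to be the case that {goal}.",
    "{agent} doing {action} {modal} bring about {goal}.",
    "There is no alternative to {action} for {agent} that {modal} bring about {goal} and is more suitable.",
    "The side-effects of {agent} doing {action} are collectively negligible as compared to the realization of {goal}.",
]
CONCLUSION = "{agent} ought to do {action}."

def instantiate(agent, action, goal, modal="is likely to"):
    assert modal in {"will", "is likely to", "might"}
    premises = [p.format(agent=agent, action=action, goal=goal, modal=modal)
                for p in SCHEME]
    return premises, CONCLUSION.format(agent=agent, action=action)

# Hypothetical instantiation:
premises, conclusion = instantiate("the city", "retrofitting the dams", "flood safety", "might")
for i, p in enumerate(premises, 1):
    print(f"({i}) {p}")
print("Therefore:", conclusion)
```

The point of the template is only that the bracketed qualifier can be varied while the rest of the scheme stays fixed.
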
@@ -1424,7 +1424,7 @@ B in Betz (2016)).
1424
  [Principle of Prima Facie Rights Violation]
1425
  If
1426
1427
1428
  (1) Persons P possess the prima facie right to be in state R.
1429
  (2) Agent A doing X [certainly/likely/possibly] prevents persons P from being
1430
  in or achieving state R.
@@ -1463,7 +1463,7 @@ especially not of their worst possible consequences.
1463
  (4) There is no other available option whose worst possible consequence is
1464
  (weakly) preferable to the worst possible consequence of option o+.
1465
1466
1467
  then
1468
  (5) Option o+ ought to be carried out.
1469
  For various examples of worst case arguments compare Betz (2016:Sect. 3.1).
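
A minimal Python sketch of the worst-case (maximin) selection rule expressed by premises (1)-(5): an option is chosen whose worst possible consequence is not bettered by the worst possible consequence of any alternative. The options and numeric values below are invented for illustration and only loosely echo the construction-versus-habitat example discussed in Chap. 6.

```python
def worst(option, value):
    """Value of the worst possible consequence of an option."""
    return min(value(c) for c in option["possible_consequences"])

def maximin_choice(options, value):
    """Pick an option whose worst case is (weakly) best among all options."""
    return max(options, key=lambda o: worst(o, value))

# Illustrative, invented data:
options = [
    {"name": "permit construction",
     "possible_consequences": ["profit", "habitat destroyed"]},
    {"name": "prohibit construction",
     "possible_consequences": ["status quo"]},
]
value = {"profit": 2, "status quo": 0, "habitat destroyed": -10}.get

print(maximin_choice(options, value)["name"])   # "prohibit construction": worst case 0 beats -10
```
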
@@ -1499,7 +1499,7 @@ Resources Supporting Argument Analysis
1499
  Bowell, Tracy, and Gary Kemp. 2015. Critical Thinking. A Concise Guide. 4th ed.
1500
  London: Routledge.
1501
1502
1503
  Chapter 5 gives a very accessible yet reliable introduction to techniques of argument
1504
  reconstruction focusing on the analysis of individual arguments and complex
1505
  argumentation.
@@ -1548,7 +1548,7 @@ Hansson, S. O. (2016). Evaluating the uncertainties. In S. O. Hansson & G. Hirsc
1548
  The argumentative turn in policy analysis. Reasoning about uncertainty (pp. 79–104). Cham:
1549
  Springer. doi: 10.1007/978-3-319-30549-3_4.
1550
1551
1552
  Hansson, S. O., & Hirsch Hadorn, G. (2016). Introducing the argumentative turn in policy analysis.
1553
  In S. O. Hansson & G. Hirsch Hadorn (Eds.), The argumentative turn in policy analysis.
1554
  Reasoning about uncertainty (pp. 11–35). Cham: Springer. doi:10.1007/978-3-319-30549-3_2.
@@ -1603,7 +1603,7 @@ Press.
1603
  Walton, D. N., Reed, C. A., & Macagno, F. (2008). Argumentation schemes. Cambridge: Cambridge
1604
  University Press.
1605
1606
1607
  Chapter 6
1608
  Accounting for Possibilities in Decision
1609
  Making
@@ -1677,7 +1677,7 @@ with no background in argumentation theory, it is certainly profitable to study
1677
  4 For an up-to-date decision-theoretic review of decision making under deep uncertainty see Etner
1678
  et al. (2012).
1679
1680
1681
  In the remainder of this introductory section, I will briefly comment on the limits
1682
  of uncertainty quantification, the need for non-probabilistic decision methods and
1683
  the concept of possibility.
@@ -1723,7 +1723,7 @@ Shrader-Frechette (2016 p. 12) and Doorn (2016, beginning). Compare Gilboa et al
1723
  as Heal and Millner (2013) for a decision-theoretic defence.
1724
  9 See again Shrader-Frechette (2016).
1725
1726
1727
  probabilistic information, it would be irresponsible not to make use of it in
1728
  decision processes. In sum, this chapter construes reasoning about policy options
1729
  as a tricky balancing act: it must rely on no more and on no less than what one
@@ -1768,7 +1768,7 @@ defence of doing so.
1768
  12 For a state-of-the-art explication of the concept of real possibility, using branching-space-time
1769
  theory, see Müller (2012).
1770
1771
1772
  to a given body of knowledge13: a hypothesis is epistemically possible (relative to
1773
  background knowledge K) if and only if it is consistent with K.14
1774
  The following example may serve to illustrate the distinction. An expert team is
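
The definition just given invites a mechanical reading: a hypothesis is epistemically possible relative to K exactly if adding it to K yields a consistent set. Here is a minimal Python sketch (a brute-force propositional consistency check over an invented toy knowledge base, not the chapter's own formalism).

```python
from itertools import product

def consistent(sentences, atoms):
    """True iff some truth-value assignment satisfies every sentence."""
    return any(all(s(case) for s in sentences)
               for case in (dict(zip(atoms, values))
                            for values in product([True, False], repeat=len(atoms))))

def epistemically_possible(hypothesis, background_K, atoms):
    # Epistemically possible relative to K iff consistent with K.
    return consistent(background_K + [hypothesis], atoms)

# Hypothetical toy background knowledge K = {a, a -> b}:
atoms = ["a", "b"]
K = [lambda c: c["a"], lambda c: (not c["a"]) or c["b"]]
print(epistemically_possible(lambda c: c["b"], K, atoms))       # True
print(epistemically_possible(lambda c: not c["b"], K, atoms))   # False: inconsistent with K
```
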
@@ -1814,7 +1814,7 @@ background knowledge K) and that K is in a specific way “complete”, i.e. inc
1814
  can be known about S. Likewise, to assert that S does not represent a real possibility means that S
1815
  is no epistemic possibility (relative to background knowledge K) and that K is objectively correct.
1816
1817
1818
  beliefs? First of all, note that this is a general issue in policy assessment, no matter
1819
  whether we evaluate options in a possibilistic, probabilistic or deterministic mood.
1820
  My reading of the argumentative turn is that we don’t need general rules which
@@ -1854,7 +1854,7 @@ pollution and business as usual. These different states-of-the-world, which are
1854
  16 Brun and Betz (2016), this volume, which nicely complements this chapter, provides practical
1855
  guidance for analyzing and evaluating argumentation.
1856
1857
1858
  predicted in (1) and (2), are then normatively evaluated in premiss (3). The
1859
  normative evaluation of outcomes is based on, or partially expresses an underlying
1860
  (frequently implicit) “value theory,” a so-called axiology. Premiss (4) states a
@@ -1899,7 +1899,7 @@ or risk. As will become clear in the course of this chapter, these principles in
1899
  substantial normative commitments and reflect different risk attitudes (such as
1900
  levels of risk aversion) one may adopt.
1901
1902
1903
  Sound decision making under certainty requires one to consider all alternative
1904
  options and all their consequences (to the extent that they are articulated and
1905
  foreseen). Likewise, sound decision making under deep uncertainty requires one
@@ -1945,7 +1945,7 @@ A0 respectively. They are weak and preliminary in the sense that more elaborate
1945
  will make them obsolete. Maybe we can construe them as heuristic reasoning which serves the
1946
  piecemeal construction of more complex and robust practical arguments.
1947
1948
1949
  choice. That is certainly how its methods are frequently presented and applied.19
1950
  The argumentative turn is free from such hybris: Rational decision making
1951
  according to the argumentative turn consists primarily in rational deliberation, in
@@ -1982,7 +1982,7 @@ will expand less quickly than otherwise. The latter case is clearly preferable t
1982
  20 For a more detailed discussion of the implications of representation theorems see Briggs (2014:
1983
  especially Sect. 2.2) and the references therein.
1984
1985
1986
  first one. The local authority should err on the safe side and prohibit the
1987
  construction.
1988
  The environmentalists put forward a simple worst case argument, whose core can
@@ -2023,7 +2023,7 @@ recommendations, it rather does not justify any prescription at all.
2023
  21 Cf. Luce and Raiffa (1957:278), Resnik (1987:26).
2024
  22 E.g. Elliott (2010).
2025
2026
2027
  Example (Local Authority) Charged by their colleagues, the opponents of the new
2028
  complex refine their original argument. They concede that if the local authority
2029
  fails to clean up the mine, the habitat may be destroyed, too. But they say: We
@@ -2065,7 +2065,7 @@ have truly catastrophic consequences, (ii) the potential gains that may result
2065
  24 Moreover, the general premiss (2) can be understood as an implementation of Hansson’s
2066
  symmetry tests (cf. Hansson 2016).
2067
2068
2069
  from taking a risky option are negligible compared to the catastrophic effects that
2070
  may ensue.25
2071
  These prerequisites can be made explicit as antecedent conditions in the decision
@@ -2108,7 +2108,7 @@ view of both their corresponding best and worst case. In order to do so, best an
2108
  general strategy to identify specific conditions under which the various decision principles may be
2109
  applied is also favored by Resnik (1987:40).
2110
2111
2112
  worst cases have to be compounded for each option. Let’s refer to the joint
2113
  normative assessment of a pair of possible consequences (best and worst case) as
2114
  “beta-balance.”26 The relative weight which is given to the worst case in such a
@@ -2151,7 +2151,7 @@ in decision theory (Resnik 1987: 32, Luce and Raiffa 1957:282). Hansson (2001:10
2151
  investigates the formal properties of “extremal” preferences which only take best and worst
2152
  possible cases into account.
2153
2154
2155
  surely bring about consequences X, then option A is preferred to option B.27 Now, we
2156
  can also explain why the reasoning appears so plausible: Whatever the exact level of
2157
  risk aversion, the beta-balance of option A is greater than that of option B and hence
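
One simple way to operationalize such a beta-balance is a weighted sum of an option's best-case and worst-case values, with the weight on the worst case expressing the level of risk aversion. The linear form and the numbers below are assumptions made only for illustration, not the chapter's definition.

```python
def beta_balance(best_value, worst_value, beta):
    """beta in [0, 1] is the relative weight given to the worst case."""
    return beta * worst_value + (1 - beta) * best_value

option_a = {"best": 5, "worst": -1}    # invented values
option_b = {"best": 9, "worst": -8}

for beta in (0.3, 0.7):                # less vs. more risk-averse weighting
    a = beta_balance(option_a["best"], option_a["worst"], beta)
    b = beta_balance(option_b["best"], option_b["worst"], beta)
    print(f"beta={beta}: prefer {'A' if a > b else 'B'}")
```

With these invented numbers the less risk-averse weighting favours B (better best case) and the more risk-averse weighting favours A (better worst case), illustrating how the weight on the worst case drives the comparison.
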
@@ -2195,7 +2195,7 @@ decision-theoretic analyzes by Schmidt et al. (2011) and Neubersch et al. (2014)
2195
  shows that decision-making which seeks to minimize the probability of some harm runs into
2196
  problems as soon as various harmful outcomes with different disvalue are distinguished.
2197
2198
2199
  (2) The value of a possible consequence of erecting or not erecting the steel wall is
2200
  roughly proportional to the corresponding likelihood that the historic building is
2201
  not fully destroyed.
@@ -2237,7 +2237,7 @@ order to test whether preliminarily identified options are really robust.29
2237
  29 Robust decision analysis à la Lempert et al. is hence a systematic form of “hypothetical
2238
  retrospection” (see Hansson 2016, Sect. 6).
2239
2240
2241
  We will return to the epistemic challenge of deep uncertainty—namely the
2242
  problem of fully grasping the space of possibilities—in the second part of this
2243
  chapter. But deep uncertainty also poses a normative challenge for robust decision
@@ -2280,7 +2280,7 @@ will result in different arguments.30 For example, argument H:
2280
  adopts with regard to them can be understood as an operationalization of Hansson’s degrees of
2281
  unacceptability (cf. Hansson 2013:69–70).
2282
2283
2284
  (1) A possible outcome is acceptable if and only if no person is killed and the
2285
  operation has a total cost of less than 1 million €. [Normative guardrails]
2286
  (2) There is no possible consequence of defusing the bomb according to which a
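
The structure of argument H, checking that every possible consequence of an option stays within the normative guardrails, can be sketched as follows. Only the guardrails themselves (nobody killed, total cost below 1 million €) are taken from premise (1); the outcome data are invented.

```python
def acceptable(outcome):
    # Guardrails as quoted in premise (1): no person killed, cost below 1 million euros.
    return outcome["persons_killed"] == 0 and outcome["cost_eur"] < 1_000_000

def within_guardrails(possible_outcomes):
    """An option passes iff all of its possible outcomes are acceptable."""
    return all(acceptable(o) for o in possible_outcomes)

# Hypothetical possibility profiles of two options:
defuse_bomb = [{"persons_killed": 0, "cost_eur": 200_000},
               {"persons_killed": 0, "cost_eur": 900_000}]
detonate_on_site = [{"persons_killed": 0, "cost_eur": 300_000},
                    {"persons_killed": 2, "cost_eur": 5_000_000}]

print(within_guardrails(defuse_bomb))        # True
print(within_guardrails(detonate_on_site))   # False: one possible outcome violates the guardrails
```
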
@@ -2320,7 +2320,7 @@ affected by a measure to provide consent (e.g. future generations). The simple
2320
  31 For a detailed discussion of risk imposition and the problems standard moral theories face in
2321
  coping with risks see Hansson (2003).
2322
2323
2324
  principle of risk imposition is hence too strict. It must be limited to cases where
2325
  those potentially affected are in a position to provide consent, or it must state
2326
  alternative necessary conditions for permissibility. Another problem is that the
@@ -2360,7 +2360,7 @@ can be analyzed. See also Hansson (2013:97–101).
2360
  of the possible from the impossible involves as influential a choice as the selection of a decision
2361
  principle.
2362
2363
2364
  knowledge. (In particular, for any such statement H_{|x|→10} there exists a time t_rel such
2365
  that H_{|x|→10} can be derived from the Newtonian model of the pendulum and the
2366
  possibility that the pendulum has been released at t_rel.) On the other hand, every
@@ -2404,7 +2404,7 @@ is the claim that the hypothesis is consistent with background knowledge. Howeve
2404
  35 For this very reason, it is a non-trivial assumption that a dynamic model of a complex system
2405
  (e.g. a climate model) is adequate for verifying possibilities about that system (cf. Betz 2015).
2406
2407
2408
  ii. There might be some conceptual possibilities which actually are inconsistent
2409
  with the background knowledge, although we have not been able to show this
2410
  (failure to falsify).
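
The resulting three-fold classification, verified possibilities (shown to be consistent with the background knowledge), refuted ones (shown to be inconsistent with it), and merely articulated, non-falsified ones (neither has been shown), can be sketched as a small routine. The proof-status flags are placeholders standing in for actual consistency arguments, and the example labels merely echo illustrations used elsewhere in the chapter.

```python
def classify(possibility):
    if possibility["consistency_with_K_shown"]:
        return "verified possibility"
    if possibility["inconsistency_with_K_shown"]:
        return "refuted (not a serious possibility)"
    return "merely articulated (non-falsified) possibility"

scenarios = [
    {"label": "window panes break",
     "consistency_with_K_shown": True,  "inconsistency_with_K_shown": False},
    {"label": "sea level rises 10 m by 2100",
     "consistency_with_K_shown": False, "inconsistency_with_K_shown": True},
    {"label": "bomb trigger is still intact",
     "consistency_with_K_shown": False, "inconsistency_with_K_shown": False},
]
for s in scenarios:
    print(s["label"], "->", classify(s))
```
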
@@ -2445,7 +2445,7 @@ have more or less strong reasons to suspect that such possibilities exist, e.g.
2445
  we deal with a complex system which we have only poorly understood so far.36
2446
  36 See also the “epistemic defaults” discussed by Hansson (2016: Sect. 5).
2447
2448
2449
  Class 2. By summing up the maximum contribution of all potential sources of sea
2450
  level rise, climate scientists are in a position to robustly refute the conceptual
2451
  possibility that global mean sea level will rise by 10 m until 2100 with business
@@ -2490,7 +2490,7 @@ air conditioning system; the experts concede that they have not checked this yet
2490
  they can be robustly ruled out—which, according to the authors, is the case for the most extreme
2491
  ones (p. 24).
2492
2493
2494
  5 The Dynamics of Possibilistic Knowledge
2495
  Our possibilistic foreknowledge is highly fallible. That’s already true for the simple
2496
  notion of serious possibility in the sense of relative consistency with the background
@@ -2534,7 +2534,7 @@ had been verified by reference to other WW2 bombs recently found, whose trigger
2534
  was intact. But these bombs all dated from the last 2 years of the war. So the
2535
  argument from analogy does not really warrant anymore that the trigger of the
2536
2537
2538
  bomb to-be-defused may be intact, too. For the time being, the possibility that the
2539
  trigger is intact has to count as a merely articulated one. The experts had also
2540
  considered whether the dust cloud of a potential detonation may damage the
@@ -2579,7 +2579,7 @@ hence don’t verify that specific scenario (given the correct assumption). The
2579
  possibility that no window breaks becomes a merely articulated possibility (unless,
2580
  e.g., an accordingly modified simulation re-affirms the original finding). Also, the
2581
2582
2583
  team originally excluded the possibility that the cultural heritage site will be
2584
  damaged. But the argument which rules out this scenario, too, relied on a false
2585
  premiss. Given the novel estimate of the bomb’s size, that possibility cannot be
@@ -2624,7 +2624,7 @@ items, the smaller the potential for surprise. If there’s reason to think that
2624
  understanding of a system will change and improve quickly, however, one should
2625
  also expect the overhaul of one’s possibilistic outlook.
2626
2627
2628
  Of course, it’s impossible to predict what we will newly come to know in the
2629
  future.42 But it’s not impossible to estimate whether our knowledge will change,
2630
  and how much. So, in 1799 Humboldt had reason to expect that he would soon
@@ -2664,7 +2664,7 @@ progress (pp. 650–651).
2664
  43 See Rescher (1984, 2009) for a discussion of limits of science and their various (conceptual or
2665
  empirical) reasons.
2666
2667
2668
  (1) There is no available option whose worst non-falsified possible consequence is
2669
  preferable to the worst non-falsified possible consequence of not permitting the
2670
  construction.
@@ -2709,7 +2709,7 @@ results? It seems to me that the detonation plus small-scale evacuation is robus
2709
  vis-à-vis our original minimum standards and relative to all such verified
2710
  possibilities.”
2711
2712
2713
  So the team member explains that arguments H, I should be understood as
2714
  referring to non-falsified possibilities. In addition, she sets up a further argument
2715
  which only takes verified possibilities into account, argument L:
@@ -2752,7 +2752,7 @@ matter of that agent’s risk aversion. Likewise, an agent who seeks robust opti
2752
  44 Brun and Betz (2016: especially Sect. 4.2) explain how argument analysis, and especially
2753
  argument mapping techniques, help to balance conflicting normative reasons in general.
2754
2755
2756
  with respect to non-falsified possibilities is more risk averse than an agent who is
2757
  content with robustness with respect to verified possibilities. (b) The profile of
2758
  possibilistic predictions on which the decision is based. If, for example, there is a
@@ -2794,7 +2794,7 @@ Given a possibilistic outlook, a surprise has occurred just in case something
2794
  has happened which wasn’t considered possible (i.e. was not referred to in some
2795
  non-falsified possibility). Surprises may happen for different reasons. We may in
2796
2797
2798
  particular distinguish two sorts of surprise, to which we already alluded above:
2799
  (a) surprises that result from unknown unknowns; (b) surprises that result from the
2800
  fallibility of and the occasional need to rectify one’s background knowledge.45
@@ -2833,7 +2833,7 @@ detail:
2833
  45 Basili and Zappia (2009) discuss the role of surprise in modern decision theory and its
2834
  anticipation in the works of George L. S. Shackle.
2835
2836
2837
  • If, considering all relevant aspects except their potential for surprise (i.e., the
2838
  extent to which an option is associated with unknown unknowns), the options A
2839
  and B are normatively equally good, and if A has a significantly greater potential
@@ -2874,7 +2874,7 @@ option B is normatively better than (should be preferred to) option A.
2874
  • If option A has a significantly smaller potential for (undesirable) surprise (i.e.,
2875
  the relevant background knowledge is provisional and more likely to be
2876
2877
2878
  modified) than its alternatives and if carrying out option A doesn’t jeopardize a
2879
  more significant value (than surprise aversion), then option A should be
2880
  carried out.
@@ -2915,7 +2915,7 @@ recognizing the difference between conceptual possibilities that have been shown
2915
  to be consistent with background knowledge and ones that merely have not been
2916
  refuted. The conceptual framework also gives rise to a precise (possibilistic) notion
2917
2918
2919
  of surprise (e.g. unknown unknowns) and triggers an expansion of the arsenal of
2920
  standard argument patterns for reasoning under great uncertainty.
2921
  One major purpose of this chapter has been to refute the widely held prejudice
 
24
  Unfortunately, we are not very good at getting practical arguments right. Intuitive
25
  practical reasoning risks to suffer from various shortcomings and fallacies as
26
  39
27
+
28
  soon as a decision problem becomes a bit more complex – for example in terms of
29
  predictive uncertainties, the variety of outcomes to consider, the temporal structure
30
  of the decision problem, or the variety of values that bear on the decision (see
 
69
  perspectives of political science and empirical discourse analysis.
70
  2 For examples, see Singer (1988:157–9).
71
  40 G. Brun and G. Betz
72
+
73
  guided by the goal of making the given argumentation as clear as possible and by
74
  standards for evaluating arguments: premises can be right/true or wrong, arguments
75
  can be valid or invalid, strong or weak.
 
109
  3We freely draw on our earlier work, specifically Brun (2014), Brun and Hirsch Hadorn (2014),
110
  Betz (2013), and Betz (2010).
111
  3 Analysing Practical Argumentation 41
112
+
113
  2.1 Tasks of Argument Analysis
114
  Argument analysis, understood in a wide sense, involves two basic activities:
115
  reconstruction and evaluation of argumentation and debates.
 
152
  of complex argumentation.
153
  6We use “inference” as a technical term for completely explicit and well-ordered arguments.
154
  42 G. Brun and G. Betz
155
+
156
  requires taking decisions which need to be made with a perspective to the other
157
  reconstructive tasks. Another reason is that each subsequent step of reconstruction
158
  will identify additional structure, which may prompt us to revise or refine an
 
191
  Fig. 3.1 Interplay of reconstruction and evaluation in argument analysis (Adapted from Brun and
192
  Hirsch Hadorn 2014:209)
193
  3 Analysing Practical Argumentation 43
194
+
195
  Let us now turn from reconstruction to evaluation. A comprehensive evaluation
196
  of arguments and complex argumentation involves assessing a whole range of
197
  qualities. The following may be distinguished:
 
232
  that can be constructed following more or less closely the line of reasoning in
233
  some given argumentative text.
234
  44 G. Brun and G. Betz
235
+
236
  The exegetical aspect implies that reconstructions must answer to hermeneutic
237
  principles, especially accuracy (sometimes called “loyalty”7) and charity. “Accuracy”
238
  means that a reconstruction must be defensible with respect to the argumentative
 
273
  the context of argument analysis see Reinmuth (2014).
274
  8 On various aspects of clarification see also Morscher (2009:1–58) and Hansson (2000).
275
  3 Analysing Practical Argumentation 45
276
+
277
  issue, shake an opponent’s convictions or explore the consequences of adopting a
278
  certain position. Argument analysis by itself does not directly realize such aims,
279
  neither does it necessarily lead to better arguments. However, it may prove effective
 
314
  job, you would have to take a plane [. . .]. Therefore there would be a very small but
315
  positive probability that you might be killed in a plane accident. [. . .]
316
  46 G. Brun and G. Betz
317
+
318
  [3.2] The maximin principle says that you must evaluate every policy available to you in
319
  terms of the worst possibility that can occur to you if you follow that particular policy. [. . .]
320
  [2.1] If you choose the New York job then the worst (and, indeed, the only) possible
 
357
  premise or a conclusion is not a matter of the form or the content of a sentence,
358
  but a role a statement can play, just like being an answer. Identifying arguments in a
359
  3 Analysing Practical Argumentation 47
360
+
361
  text therefore presupposes at least a rough understanding of the structure of the text.
362
  A well-tested strategy is to start by sketching the main argument(s) in a passage in
363
  one’s own words and as succinctly as possible. For [Harsanyi] that could be
 
404
  and industry contribute 20 % to the US greenhouse gas emissions.”). Thirdly,
405
  context-dependent, for example, indexical (“I”, “this”, “here”, “now”, . . .) and
406
  48 G. Brun and G. Betz
407
+
408
  anaphoric (“Harsanyi quotes Rawls before he criticizes him.”), expressions, must be
409
  replaced if there is a danger that their interpretation might not be clear in the
410
  resulting representation of the argument. In practice, the necessary reformulation
 
448
  experiment” and r to “we ought not to perform on the animal an experiment that would be
449
  considered outrageous if performed on one of us.”
450
  3 Analysing Practical Argumentation 49
451
+
452
  (e.g. “and” instead of “but”, “not acceptable” instead of “inacceptable”) and
453
  especially eliminating stylistic variations, for example, by replacing expressions
454
  which are synonymous in the context at hand by one and the same. In the examples
 
491
  As Jacquette (1996) has pointed out, adding a premise is in some cases less charitable than
492
  strengthening a premise or weakening the conclusion.
493
  50 G. Brun and G. Betz
494
+
495
  plausible. Exercising judgement rather than applying a formal procedure is needed
496
  for assessing the alternative suggestions and deciding which one to select.
497
  Both, the notion of an enthymeme and the appeal to charity are linked to the
 
536
  13 Sentence S is logically stronger than sentence T (and T is logically weaker than S) just in case S
537
  implies T but not vice versa.
538
  3 Analysing Practical Argumentation 51
539
+
540
  weakened premise and investigate which additional premises are needed for such a
541
  conversion. For both strategies, argumentation schemes may be used as a
542
  heuristic tool.
 
578
  exceptions are inconsistencies that can be detected by logical or semantical analysis
579
  which shows that the logical form or the meaning of a set of premises guarantees
580
  52 G. Brun and G. Betz
581
+
582
  that they cannot all be true.14 Inferences involving an inconsistent set of premises
583
  are negatively evaluated since they cannot perform the core functions of arguments;
584
  they provide no reason in favour of the conclusion. However, arguments with an
 
624
  because it is meant or taken to be evaluated by deductive standards, or because it performs well
625
  with respect to deductive standards. (Skyrms 2000:ch. II.4).
626
  3 Analysing Practical Argumentation 53
627
+
628
  deductively valid is more ambitious insofar as referring to just one case will not
629
  do. We rather need a general argument which shows that there cannot be a case in
630
  which the premises are true and the conclusion false.
 
669
  17 In this chapter, we use “validity” simpliciter as an abbreviation for “deductive validity”; in the
670
  literature it often also abbreviates “formal validity”.
671
  54 G. Brun and G. Betz
672
+
673
  if all the premises are true, it comes in degrees, and it is nonmonotonic; that is,
674
  adding premises can yield a stronger or weaker argument. An immediate consequence
675
  is that even if a strong non-deductive argument supports some conclusion,
 
714
  18 Lumer (2011) explains how argumentation schemes can be exploited for deductivist
715
  reconstructions.
716
  3 Analysing Practical Argumentation 55
717
+
718
  arguments which do not support the thesis they are presented as supporting
719
  (ignoratio elenchi) or arguments which attack a position the opponent does not
720
  in fact defend (“straw-man”).19 In this way, Harsanyi’s undercut seems to miss
 
754
  for example those which include evaluative terms (“good”, “better”). For a more precise and
755
  sophisticated discussion (using a different terminology), see Morscher (2013).
756
  56 G. Brun and G. Betz
757
+
758
  explicit normative phrases in an argumentation relate to the same normative
759
  perspective.
760
  A second challenge for reconstructing practical arguments arises in connection
 
795
  sentences; that is, sentences only part of which are in the scope of a deontic modality. See
796
  Morscher (2013) for an accessible discussion.
797
  3 Analysing Practical Argumentation 57
798
+
799
  4 Analysing Complex Argumentation
800
  4.1 Reconstructing Complex Argumentation
801
  as Argument Maps
 
836
  (as indicated) and re-labelled. The fact that many of the descriptive claims made
837
  are false (as of today) does not prevent the example from being instructive.
838
  58 G. Brun and G. Betz
839
+
840
  Pro Con
841
  [Pro1.1] The world faces an energy crisis. Oil
842
  will be exhausted within 50 years, and coal will
 
900
  inference schemes for the reconstruction, we suggest to add corresponding general premises that
901
  can be criticized. Pollock’s undercut-relation hence effectively reduces to the attack relation.
902
  3 Analysing Practical Argumentation 59
903
+
904
  • An argument supports another argument if the conclusion of the supporting
905
  argument is identical with (or at least entails) a premise of the supported
906
  argument.
 
938
  (boxes) in the illustrative
939
  debate about nuclear power
940
  60 G. Brun and G. Betz
941
+
942
  relations between the arguments, and theses). The map is basically a hypothesis
943
  about the debate’s dialectical structure, which has to be probed through detailed
944
  reconstructions of the individual arguments. At the same time, this hypothesis
 
982
  These two reconstructions corroborate the dialectic relations as presumed in the
983
  preliminary argument map (cf. their conclusions).
984
  3 Analysing Practical Argumentation 61
985
+
986
  4.2 Argument Maps as Reasoning Tools
987
  Let us now suppose that all arguments have been reconstructed like [Pro1.1] and
988
  [Con2.1] above, and that the dialectic relations as visualized in Fig. 3.4 do really
 
1026
  chapter, in contrast, any argument can be reasonably accepted, as long as the proponent is willing
1027
  to give up sufficiently many beliefs (and other arguments).
1028
  62 G. Brun and G. Betz
1029
+
1030
  • On the macro level, a complete (partial) position specifies for all (some) arguments
1031
  in the debate whether it is accepted or refuted. To accept an argument
1032
  means to consider all its premises as true. To refute an argument implies that at
 
1069
  above constraints. A partial position according to which all premises of [Pro1.1]
1070
  and [Con2.1] are true is not dialectically coherent, either, because truth-values of
1071
  3 Analysing Practical Argumentation 63
1072
+
1073
  the remaining statements (i.e. conclusions) cannot be fixed without violating one of
1074
  the above constraints.
1075
  A micro or macro position which is not dialectically coherent violates basic
 
1103
  argument map. “Checked” arguments are accepted, “crossed” arguments are refuted, “flashes”
1104
  indicate local violations of rationality criteria (see also text)
1105
  64 G. Brun and G. Betz
1106
+
1107
  precise normative trade-offs involved when aggregating conflicting practical
1108
  arguments.26
1109
  Over and above coherence checking, argument maps can be valuable tools for
 
1149
  not lie to relatives” and “You must not lie to strangers”, which can then be balanced against each
1150
  other.
1151
  3 Analysing Practical Argumentation 65
1152
+
1153
  policy questions, further dissent concerning other arguments is then irrelevant
1154
  (regarding policy consensus formation).
1155
  Let us briefly return to our third criticism of pro/con lists: improper aggregation
 
1172
  Fig. 3.7 A simple, abstract
1173
  argument map
1174
  66 G. Brun and G. Betz
1175
+
1176
  coherent micro position on this map and to determine whether one should accept the
1177
  central thesis, one may execute the decision tree shown in Fig. 3.8.27
1178
  We have started this section with the issue of aggregating conflicting reasons.
 
1210
  28 This section is adapted from http://www.argunet.org/2013/05/13/mapping-the-climate-engineer
1211
  ing-controversy-a-case-of-argument-analysis-driven-policy-advice/ [last accessed 16.03.2015].
1212
  3 Analysing Practical Argumentation 67
1213
+
1214
  to offset the effects of anthropogenic GHG emissions. CE includes methods which
1215
  shield the earth from incoming solar radiation (solar radiation management) and
1216
  methods which take carbon out of the atmosphere (carbon dioxide removal).29
 
1253
  29 On the ethics of climate engineering and the benefits of argumentative analysis in this field
1254
  compare Elliott (2016).
1255
  68 G. Brun and G. Betz
1256
+
1257
  in time. They have then visualized the core position in the argument map and
1258
  calculated the logico-argumentative implications of the corresponding stance
1259
  (cf. Fig. 3.9). The enhanced map shows, accordingly, which arguments one is
 
1271
  (here: thumbs down) in a detailed reconstruction of the moral controversy about so-called climate
1272
  engineering (Source: Betz and Cacean 2012:87)
1273
  3 Analysing Practical Argumentation 69
1274
+
1275
  interestingly, though, all the empirical chapters of the assessment report
1276
  (on physical and technical aspects, on sociological aspects, on governance aspects,
1277
  etc.) consistently refer to the argument map and make explicit to which arguments
 
1309
  30 Steele (2006) interprets the precautionary principle as a meta-principle for good decisionmaking
1310
  which articulates essentially these two requirements.
1311
  70 G. Brun and G. Betz
1312
+
1313
  This choice boils down to the following question: should we allow for decision
1314
  principles which individually do not satisfy standards of good decision-making? –
1315
  Yes, we think so. The following simplified example is a case in point:
 
1347
  adapted in order to take the original text or plausibility etc. into account. That is,
1348
  schemes are rather prototypes that will frequently provide a first version of an
1349
  3 Analysing Practical Argumentation 71
1350
+
1351
  argument reconstruction, which will be further improved in the reconstruction
1352
  process.
1353
  It is characteristic for practical arguments under uncertainty that their descriptive
 
1388
  (3) There is no alternative to X for agent A that [will/is likely to/might] bring
1389
  about S and is more suitable than X.
1390
  72 G. Brun and G. Betz
1391
+
1392
  (4) The certain, likely and possible side-effects of agent A doing X are collectively
1393
  negligible as compared to the [certain/likely/possible] realization
1394
  of S.
 
1424
  [Principle of Prima Facie Rights Violation]
1425
  If
1426
  3 Analysing Practical Argumentation 73
1427
+
1428
  (1) Persons P possess the prima facie right to be in state R.
1429
  (2) Agent A doing X [certainly/likely/possibly] prevents persons P from being
1430
  in or achieving state R.
 
1463
  (4) There is no other available option whose worst possible consequence is
1464
  (weakly) preferable to the worst possible consequence of option oþ.
1465
  74 G. Brun and G. Betz
1466
+
1467
  then
1468
  (5) Option oþ ought to be carried out.
1469
  For various examples of worst case arguments compare Betz (2016:Sect. 3.1).
 
1499
  Bowell, Tracy, and Gary Kemp. 2015. Critical Thinking. A Concise Guide. 4th ed.
1500
  London: Routledge.
1501
  3 Analysing Practical Argumentation 75
1502
+
1503
  Chapter 5 gives a very accessible yet reliable introduction to techniques of argument
1504
  reconstruction focusing on the analysis of individual arguments and complex
1505
  argumentation.
 
1548
  The argumentative turn in policy analysis. Reasoning about uncertainty (pp. 79–104). Cham:
1549
  Springer. doi: 10.1007/978-3-319-30549-3_4.
1550
  76 G. Brun and G. Betz
1551
+
1552
  Hansson, S. O., & Hirsch Hadorn, G. (2016). Introducing the argumentative turn in policy analysis.
1553
  In S. O. Hansson & G. Hirsch Hadorn (Eds.), The argumentative turn in policy analysis.
1554
  Reasoning about uncertainty (pp. 11–35). Cham: Springer. doi:10.1007/978-3-319-30549-3_2.
 
1603
  Walton, D. N., Reed, C. A., & Macagno, F. (2008). Argumentation schemes. Cambridge: Cambridge
1604
  University Press.
1605
  3 Analysing Practical Argumentation 77
1606
+
1607
  Chapter 6
1608
  Accounting for Possibilities in Decision
1609
  Making
 
1677
  4 For an up-to-date decision-theoretic review of decision making under deep uncertainty see Etner
1678
  et al. (2012).
1679
  136 G. Betz
1680
+
1681
  In the remainder of this introductory section, I will briefly comment on the limits
1682
  of uncertainty quantification, the need for non-probabilistic decision methods and
1683
  the concept of possibility.
 
1723
  as Heal and Millner (2013) for a decision-theoretic defence.
1724
  9 See again Shrader-Frechette (2016).
1725
  6 Accounting for Possibilities in Decision Making 137
1726
+
1727
  probabilistic information, it would be irresponsible not to make use of it in
1728
  decision processes. In sum, this chapter construes reasoning about policy options
1729
  as a tricky balancing act: it must rely on no more and on no less than what one
 
1768
  12 For a state-of-the-art explication of the concept of real possibility, using branching-space-time
1769
  theory, see Mu¨ller (2012).
1770
  138 G. Betz
1771
+
1772
  to a given body of knowledge13: a hypothesis is epistemically possible (relative to
1773
  background knowledge K) if and only if it is consistent with K.14
1774
  The following example may serve to illustrate the distinction. An expert team is
 
1814
  can be known about S. Likewise, to assert that S does not represent a real possibility means that S
1815
  is no epistemic possibility (relative to background knowledge K) and that K is objectively correct.
1816
  6 Accounting for Possibilities in Decision Making 139
1817
+
1818
  beliefs? First of all, note that this is a general issue in policy assessment, no matter
1819
  whether we evaluate options in a possibilistic, probabilistic or deterministic mood.
1820
  My reading of the argumentative turn is that we don’t need general rules which
 
1854
  16 Brun and Betz (2016), this volume, which nicely complements this chapter, provides practical
1855
  guidance for analyzing and evaluating argumentation.
1856
  140 G. Betz
1857
+
1858
  predicted in (1) and (2), are then normatively evaluated in premiss (3). The
1859
  normative evaluation of outcomes is based on, or partially expresses an underlying
1860
  (frequently implicit) “value theory,” a so-called axiology. Premiss (4) states a
 
1899
  substantial normative commitments and reflect different risk attitudes (such as
1900
  levels of risk aversion) one may adopt.
1901
  6 Accounting for Possibilities in Decision Making 141
1902
+
1903
  Sound decision making under certainty requires one to consider all alternative
1904
  options and all their consequences (to the extent that they are articulated and
1905
  foreseen). Likewise, sound decision making under deep uncertainty requires one
 
1945
  will make them obsolete. Maybe we can construe them as heuristic reasoning which serves the
1946
  piecemeal construction of more complex and robust practical arguments.
1947
  142 G. Betz
1948
+
1949
  choice. That is certainly how its methods are frequently presented and applied.19
1950
  The argumentative turn is free from such hybris: Rational decision making
1951
  according to the argumentative turn consists primarily in rational deliberation, in
 
1982
  20 For a more detailed discussion of the implications of representation theorems see Briggs (2014:
1983
  especially Sect. 2.2) and the references therein.
1984
  6 Accounting for Possibilities in Decision Making 143
1985
+
1986
  first one. The local authority should err on the safe side and prohibit the
1987
  construction.
1988
  The environmentalists put forward a simple worst case argument, whose core can
 
2023
  21 Cf. Luce and Raiffa (1957:278), Resnik (1987:26).
2024
  22 E.g. Elliott (2010).
2025
  144 G. Betz
2026
+
2027
  Example (Local Authority) Charged by their colleagues, the opponents of the new
2028
  complex refine their original argument. They concede that if the local authority
2029
  fails to clean up the mine, the habitat may be destroyed, too. But they say: We
 
2065
  24 Moreover, the general premiss (2) can be understood as an implementation of Hansson’s
2066
  symmetry tests (cf. Hansson 2016).
2067
  6 Accounting for Possibilities in Decision Making 145
2068
+
2069
  from taking a risky option are negligible compared to the catastrophic effects that
2070
  may ensue.25
2071
  These prerequisites can be made explicit as antecedent conditions in the decision
 
2108
  general strategy to identify specific conditions under which the various decision principles may be
2109
  applied is also favored by Resnik (1987:40).
2110
  146 G. Betz
2111
+
2112
  worst cases have to be compounded for each option. Let’s refer to the joint
2113
  normative assessment of a pair of possible consequences (best and worst case) as
2114
  “beta-balance.”26 The relative weight which is given to the worst case in such a
 
investigates the formal properties of “extremal” preferences which only take best and worst
possible cases into account.
surely bring about consequences X, then option A is preferred to option B.27 Now, we
can also explain why the reasoning appears so plausible: Whatever the exact level of
  risk aversion, the beta-balance of option A is greater than that of option B and hence
 
shows that decision-making which seeks to minimize the probability of some harm runs into
problems as soon as various harmful outcomes with different disvalue are distinguished.
(2) The value of a possible consequence of erecting or not erecting the steel wall is
roughly proportional to the corresponding likelihood that the historic building is
  not fully destroyed.
 
29 Robust decision analysis à la Lempert et al. is hence a systematic form of “hypothetical
retrospection” (see Hansson 2016, Sect. 6).
We will return to the epistemic challenge of deep uncertainty—namely the
problem of fully grasping the space of possibilities—in the second part of this
  chapter. But deep uncertainty also poses a normative challenge for robust decision
 
adopts with regard to them can be understood as an operationalization of Hansson’s degrees of
unacceptability (cf. Hansson 2013:69–70).
(1) A possible outcome is acceptable if and only if no person is killed and the
operation has a total cost of less than 1 million €. [Normative guardrails]
  (2) There is no possible consequence of defusing the bomb according to which a
 
31 For a detailed discussion of risk imposition and the problems standard moral theories face in
coping with risks see Hansson (2003).
principle of risk imposition is hence too strict. It must be limited to cases where
those potentially affected are in a position to provide consent, or it must state
  alternative necessary conditions for permissibility. Another problem is that the
 
of the possible from the impossible involves as influential a choice as the selection of a decision
principle.
knowledge. (In particular, for any such statement H_{|x|>10} there exists a time t_rel such
that H_{|x|>10} can be derived from the Newtonian model of the pendulum and the
possibility that the pendulum has been released at t_rel.) On the other hand, every
 
35 For this very reason, it is a non-trivial assumption that a dynamic model of a complex system
(e.g. a climate model) is adequate for verifying possibilities about that system (cf. Betz 2015).
ii. There might be some conceptual possibilities which actually are inconsistent
with the background knowledge, although we have not been able to show this
  (failure to falsify).
 
we deal with a complex system which we have only poorly understood so far.36
36 See also the “epistemic defaults” discussed by Hansson (2016: Sect. 5).
Class 2. By summing up the maximum contribution of all potential sources of sea
level rise, climate scientists are in a position to robustly refute the conceptual
possibility that global mean sea level will rise by 10 m until 2100 with business
 
they can be robustly ruled out—which, according to the authors, is the case for the most extreme
ones (p. 24).
5 The Dynamics of Possibilistic Knowledge
Our possibilistic foreknowledge is highly fallible. That’s already true for the simple
  notion of serious possibility in the sense of relative consistency with the background
 
was intact. But these bombs all dated from the last 2 years of the war. So the
argument from analogy no longer really warrants that the trigger of the
bomb to be defused may be intact, too. For the time being, the possibility that the
trigger is intact has to count as a merely articulated one. The experts had also
  considered whether the dust cloud of a potential detonation may damage the
 
possibility that no window breaks becomes a merely articulated possibility (unless,
e.g., an accordingly modified simulation re-affirms the original finding). Also, the
team originally excluded the possibility that the cultural heritage site will be
damaged. But the argument which rules out this scenario, too, relied on a false
  premiss. Given the novel estimate of the bomb’s size, that possibility cannot be
 
understanding of a system will change and improve quickly, however, one should
  also expect the overhaul of one’s possibilistic outlook.
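Such an overhaul can be pictured with a small toy sketch: when the background knowledge is
revised, a statement may switch between being verified, merely non-falsified, and falsified.
The interval representation of the background knowledge and the notion of an explicitly
checked witness are illustrative assumptions, not the chapter’s formal apparatus.

# Toy classification of possibilistic statements about a quantity x relative
# to background knowledge K (modelled as an interval) and a set of explicitly
# checked cases ("witnesses", e.g. model runs). All details are illustrative.
def classify(claim_value, k_low, k_high, witnesses):
    if not (k_low <= claim_value <= k_high):
        return "falsified (inconsistent with K)"
    if claim_value in witnesses:
        return "verified (explicitly shown to be consistent with K)"
    return "merely non-falsified (not yet shown to be inconsistent)"

print(classify(9, k_low=0, k_high=10, witnesses={3, 9}))  # verified
print(classify(9, k_low=0, k_high=8, witnesses={3}))      # falsified after revising K
print(classify(7, k_low=0, k_high=8, witnesses={3}))      # merely non-falsified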
Of course, it’s impossible to predict what we will newly come to know in the
future.42 But it’s not impossible to estimate whether our knowledge will change,
  and how much. So, in 1799 Humboldt had reason to expect that he would soon
 
43 See Rescher (1984, 2009) for a discussion of limits of science and their various (conceptual or
empirical) reasons.
(1) There is no available option whose worst non-falsified possible consequence is
preferable to the worst non-falsified possible consequence of not permitting the
  construction.
 
vis-à-vis our original minimum standards and relative to all such verified
possibilities.”
So the team member explains that arguments H, I should be understood as
referring to non-falsified possibilities. In addition, she sets up a further argument
  which only takes verified possibilities into account, argument L:
 
44 Brun and Betz (2016: especially Sect. 4.2) explain how argument analysis, and especially
argument mapping techniques, help to balance conflicting normative reasons in general.
with respect to non-falsified possibilities is more risk averse than an agent who is
content with robustness with respect to verified possibilities. (b) The profile of
  possibilistic predictions on which the decision is based. If, for example, there is a
 
has happened which wasn’t considered possible (i.e. was not referred to in some
non-falsified possibility). Surprises may happen for different reasons. We may in
particular distinguish two sorts of surprise, to which we already alluded above:
(a) surprises that result from unknown unknowns; (b) surprises that result from the
fallibility of, and the occasional need to rectify, one’s background knowledge.45
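The distinction can be mirrored in a toy sketch; the sets and labels below are invented for
illustration and do not stem from the chapter.

# Toy possibilistic notion of surprise: an observed outcome is a surprise iff
# it is not covered by any non-falsified possibility. The two branches
# correspond to the sorts (a) and (b) distinguished above.
articulated = {"A", "B", "C"}   # possibilities that have been articulated at all
falsified = {"C"}               # articulated possibilities ruled out by background knowledge
non_falsified = articulated - falsified

def kind_of_surprise(observed):
    if observed in non_falsified:
        return "no surprise: a non-falsified possibility"
    if observed in falsified:
        return "surprise of sort (b): had been ruled out by fallible background knowledge"
    return "surprise of sort (a): an unknown unknown, never articulated"

print(kind_of_surprise("B"))  # no surprise
print(kind_of_surprise("C"))  # sort (b)
print(kind_of_surprise("D"))  # sort (a)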
 
45 Basili and Zappia (2009) discuss the role of surprise in modern decision theory and its
anticipation in the works of George L. S. Shackle.
• If, considering all relevant aspects except their potential for surprise (i.e., the
extent to which an option is associated with unknown unknowns), the options A
  and B are normatively equally good, and if A has a significantly greater potential
 
• If option A has a significantly smaller potential for (undesirable) surprise (i.e.,
the relevant background knowledge is provisional and more likely to be
modified) than its alternatives and if carrying out option A doesn’t jeopardize a
more significant value (than surprise aversion), then option A should be
  carried out.
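A crude rendering of this principle in code may help to fix ideas; the numeric surprise
scores, the margin and the value check are illustrative assumptions rather than the chapter’s
own formalization.

# Toy version of the surprise-aversion principle stated above: carry out A if
# its surprise potential is significantly smaller than that of every
# alternative and doing so does not jeopardize a more significant value.
def should_carry_out(option, alternatives, surprise_potential,
                     jeopardizes_greater_value, margin=0.2):
    significantly_smaller = all(
        surprise_potential[option] <= surprise_potential[alt] - margin
        for alt in alternatives
    )
    return significantly_smaller and not jeopardizes_greater_value(option)

surprise_potential = {"A": 0.1, "B": 0.5, "C": 0.4}
print(should_carry_out("A", ["B", "C"], surprise_potential,
                       jeopardizes_greater_value=lambda o: False))  # True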
 
to be consistent with background knowledge and ones that merely have not been
refuted. The conceptual framework also gives rise to a precise (possibilistic) notion
of surprise (e.g. unknown unknowns) and triggers an expansion of the arsenal of
standard argument patterns for reasoning under great uncertainty.
  One major purpose of this chapter has been to refute the widely held prejudice