Course taster

Considerations for using experts

There are also a number of fallacies and heuristics (i.e. mental rules of thumb; Eysenck & Keane, 2020) which we all share and which may bias jurors when they hear expert testimony. Select from the following headings to learn more:

The equiprobability bias

The tendency to think that all options are equally likely, even when faced with information that refutes this (Falk & Lann, 2008). This may cause jurors to believe that all expert testimony is equally trustworthy, regardless of the strength of the evidence provided.

The representativeness heuristic

We decide that an object or person belongs to a specific category because they appear typical or representative of that category – for example, we may assume that someone wearing a suit and tie works in a business role simply because of their attire (Harper & Bartels, 2016). This may cause us to trust experts based simply on how they dress and the authority they project.

The conjunction fallacy

The tendency to believe that the combination of two events is more likely than one event on its own (Dearden, 2018). For example, given two statements – A) Hayley is a dog groomer, and B) Hayley is a dog groomer and owns a dog – people asked which is more likely will often choose statement B, assuming that a dog groomer must own a dog herself. In fact, statement A is always at least as likely, because it relies on only one piece of information being correct, whereas statement B requires both to be correct. Experts who present multiple pieces of evidence together may therefore lead jurors to overestimate how likely the combined account is.
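The arithmetic behind the conjunction fallacy can be sketched in a few lines. The probability values below are purely illustrative assumptions, not figures from the example above:

```python
# The conjunction rule: P(A and B) = P(A) * P(B | A) <= P(A).
# All probability values here are illustrative assumptions.

p_a = 0.02         # P(A): Hayley is a dog groomer (assumed)
p_b_given_a = 0.7  # P(B | A): she owns a dog, given that she grooms (assumed)

p_a_and_b = p_a * p_b_given_a  # P(A and B)

print(f"P(A)       = {p_a:.3f}")        # 0.020
print(f"P(A and B) = {p_a_and_b:.3f}")  # 0.014

# The conjunction can never be more likely than either event alone,
# because multiplying by a probability (<= 1) cannot increase the value.
assert p_a_and_b <= p_a
```

Whatever values are substituted, the conjunction never exceeds the single event, which is exactly the rule intuition violates.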

The availability heuristic

The tendency to judge probabilities and likelihoods by how easily events come to mind. For example, if you have recently heard about a plane crash on the news, you may think such events are more common than they are and even become reluctant to fly – when, in fact, air travel is statistically among the safest means of travel (Harper & Hogue, 2016). If experts fail to present base-rate information for an event, jurors may fall back on this heuristic.

The affect heuristic

A tendency to rely on our emotions when making decisions (Pachur et al., 2012), with more positive states associated with greater risk-taking and negative states with lower risk-taking (Kahneman & Tversky, 1979, 1984). If an expert presents testimony in an emotionally evocative way, it may seem more plausible to jurors.

Preference for simplicity

A tendency to prefer information that is concise and simple to understand, often overlooking complex information. When too many options are presented, people also experience choice overload, greater difficulty in decision-making and lower satisfaction with their decisions (Iyengar & Lepper, 2000). Experts who present more concisely may be seen as more trustworthy.

Preference for natural frequencies

Many people are not good at understanding fractions, percentages and statistical information, and prefer natural frequencies (e.g. '8 out of 1,000') instead (Hoffrage et al., 2015); this is one reason base-rate information is often ignored. Experts who present information as fractions and percentages may therefore be less effective.
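One reason frequency formats help is that they make base rates visible. A minimal sketch, using entirely hypothetical screening-test numbers, shows that the frequency framing and the standard probability calculation (Bayes' theorem) give the same answer:

```python
# Hypothetical numbers (assumptions for illustration only):
# 1% base rate, 80% hit rate, 9.6% false-alarm rate.
base_rate = 0.01
hit_rate = 0.80
false_alarm_rate = 0.096

# Probability format: Bayes' theorem.
p_positive = base_rate * hit_rate + (1 - base_rate) * false_alarm_rate
p_true_given_positive = (base_rate * hit_rate) / p_positive

# Natural-frequency format: imagine 10,000 people.
n = 10_000
true_positives = n * base_rate * hit_rate                  # 80 people
false_positives = n * (1 - base_rate) * false_alarm_rate   # ~950 people
frequency_answer = true_positives / (true_positives + false_positives)

print(f"Bayes:       {p_true_given_positive:.3f}")
print(f"Frequencies: {frequency_answer:.3f}")
# Both come out at roughly 0.078 - far lower than most people's intuition.
```

The two computations are algebraically identical; the frequency version simply keeps the rarity of the condition (100 people out of 10,000) in plain sight.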

The framing effect

The way information is presented can change people's perceptions and thereby influence problem-solving and decision-making. For example, describing offenders in emotionally evocative rather than neutral language, or noting whether an offender has undergone a re-entry programme, leads people to feel more negatively (in the former case) or more positively (in the latter) towards offenders (see Harris & Socia, 2016; Snider & Reysen, 2014). The way an expert frames their evidence may therefore influence how jurors receive it.

The status quo bias

A tendency to prefer the current state of affairs over changing our decisions (Nicolle et al., 2011). People also tend to believe that things that are long-standing, old or popular must be good or correct, which further strengthens the status quo (Eidelman & Crandall, 2012). This may cause jurors to accept testimony that conforms to the status quo.

Confirmation bias

A tendency to seek evidence and information that confirms our own beliefs and hypotheses, rather than looking for information that contradicts them. This can be particularly problematic in police interviews, where an officer may assume a suspect's guilt and ignore information that goes against this belief (Hill et al., 2010). It may also cause jurors to accept testimony that confirms their existing views.

Focus on surprising findings

A tendency to focus on findings that are unusual or unexpected and use them to guide future decisions (Kulkarni & Simon, 1988). This can be problematic in statistics, where interesting but flawed findings may form the basis for future research or societal changes. Novel findings presented by experts may therefore seem more believable to jurors.

The straw man fallacy

Misrepresenting someone's views by weakening or distorting them. If an expert undermines other people's views or opinions in this way, it may make their own opinions seem more plausible.

Slippery slope arguments

A tendency to believe that a small first step will set off a chain of events culminating in a highly undesirable outcome. A common slippery slope argument concerns drug legalisation: legalising one minor drug, it is claimed, will lead to more harmful drugs being legalised and ultimately to an epidemic of drug addiction (Cummings, 2020). Experts who use slippery slope arguments may seem more believable to jurors.

Due to these limitations and issues with expert witnesses, there are several considerations to weigh when deciding whether to hire an expert witness (Davies & Beech, 2017):