Blake W. Killingsworth and Alexander J. Izbiky
Whitman College, 345 Boyer Avenue, Walla Walla, WA 99362, USA
ABSTRACT
The anchoring effect is the phenomenon by which prior exposure to a value affects future estimates by “anchoring” individuals’ estimates to the originally presented value. While a large body of research exists demonstrating the existence of the anchoring effect in the lab, only a small minority of procedures are operationalized in a way that reflects the everyday lives of participants. It is necessary to add to this growing body of ecologically valid anchoring research. This study considers how the magnitude of the anchoring effect may be impacted by variations in time delay and processing depth. To test the hypotheses, participants were taken through a modified version of the traditional anchoring procedure, in which they were presented with anchor values based on glacial melting rates and made subsequent numerical estimates. This research was carried out through a remote survey, which generated a sample size of 176 participants. A significant anchoring effect was detected. Results across conditions of delay and depth of processing did not achieve significance. The current study was successful in replicating the anchoring effect with material that was relevant to participants’ lives. Due to the lack of significance of depth of processing and delay, the present research cannot contribute to understanding the intricacies of why anchoring occurs. However, replications of the current study should guide future work that continues to probe the validity of such mechanisms – namely, the attitude change model.
AUTHOR SUMMARY
The anchoring effect is the phenomenon by which prior exposure to a value affects future estimates by “anchoring” individuals’ estimates to the originally presented value. In our study we wanted to explore how a time delay and variation in processing depth might impact the magnitude of the anchoring effect in the context of interpreting data on glacial melting rates. The experiment resulted in a significant anchoring effect, but no significant differences across conditions of delay or processing depth. We believe our study continues the work of past applied anchoring research, asking important questions regarding the influence of falsified data, and how the magnitude of anchoring might differ between groups with different established social beliefs.
INTRODUCTION
Anchoring is the tendency for decision makers to make numerical estimates that are biased towards an initial suggested value. In the seminal study on the anchoring effect, participants spun a wheel of fortune that was secretly rigged to land on either the number 10 or 65. Participants were then asked if the percentage of African countries in the United Nations was higher or lower than 10 (the low anchor group) or 65 (the high anchor group) and were subsequently asked for an estimate of the true value. Participants in the low anchor condition estimated significantly lower values than those in the high anchor condition. This study established the idea of anchoring as a heuristic – a subconscious mental shortcut that reduces complex tasks like predicting values to less effortful operations (Tversky & Kahneman, 1974). While heuristics can be a useful form of automatic processing, they can also result in cognitive biases, such as the tendency to rely too much on suggested values when making an estimate. The study also popularized the standard anchoring paradigm – the method of inducing the anchoring effect via sequential comparative (“is the true value higher or lower than the anchor?”) and absolute (“what is the true value?”) judgments (Englich, 2008).
Possible Mechanisms for Anchoring
The robustness of the anchoring effect has persisted despite a lack of definitive knowledge of its underlying mechanism. Among contemporary models, the selective accessibility model is widely accepted. The model proposes that when participants are asked to make a comparative estimate, they test the hypothesis that the anchor is equal to the true value, and search for relevant information from their lives that would confirm that hypothesis. Hypothesis-consistent knowledge is made more accessible, and is more likely to be used in making the absolute estimate (Mussweiler & Strack, 1999). While the selective accessibility model is widely supported, it assumes that anchoring requires a rather effortful process, which might not always be the case (Blankenship et al., 2008). The attitude change model proposes that anchoring can occur in a non-thoughtful manner, in which the anchor is interpreted as a hint at what the true value may be, or in a thoughtful manner, in which the selective accessibility model still applies (Blankenship et al., 2008). More research on the anchoring effect and potential moderators could provide more definitive evidence to support one of these particular mechanisms, or lead to the articulation of a new mechanism.
Application of Anchoring to Society
While early research showed that the anchoring effect was robust, one critique of these pioneering studies was that they were too experimentally confined and might not apply to ecologically valid scenarios — those that people would consider relevant to their lives. This could be because the anchoring effect was first proposed as a cognitive heuristic that occurs under conditions of extreme uncertainty (Tversky & Kahneman, 1974). As a result, many studies have asked participants for their estimates of obscure topics, such as the gestation period of an African elephant (Epley & Gilovich, 2001). A new wave of studies has considered the application of the anchoring effect to more socially relevant scenarios.
Anchoring in the Context of Expert Knowledge and Socially Divisive Topics
As the literature on applied anchoring develops, it has become clear that the anchoring effect is robust even when decision makers have expert knowledge or strong opinions on the topic. For example, in a study on criminal sentencing, legal professionals were anchored by sentencing demands that were determined at random by tossing a pair of dice (Englich et al., 2006). In another study, Lalot et al. (2019) informed Swiss university students of a proposed high-anchor or low-anchor number of immigrants to allow into Switzerland, and utilized the standard anchoring paradigm. One might suspect that more liberal participants would advocate for lenient immigration policies and thus propose a larger number of immigrants as the best policy, but the anchoring effect was consistent regardless of political leaning.
Delay as a Moderator
While research on the anchoring effect in criminal sentencing and immigration suggests a high level of real-world applicability, it is also important to consider the persistence of the anchoring effect as time passes. The standard anchoring paradigm requires an estimate to be made by the participant immediately following the presentation of the anchor. Outside of the laboratory, individuals do not always face such immediate follow-up questions about data they have observed. While the literature on delay as a moderator of the anchoring effect is limited, there is some preliminary evidence suggesting it may be important. Blankenship et al. (2008) considered the interaction of the anchoring effect, cognitive load, and a one-week delay. The study found that the anchoring effect was larger for immediate judgments than for judgments made one week later, suggesting a diminishing anchoring effect over time. Mussweiler (2001) produced contrasting results with a similar design, finding that the magnitude of the anchoring effect for participants in the delay condition was significant and undiminished relative to participants in the no-delay condition.
A potential influence of delay on the robustness of the anchoring effect may not be surprising, as the passage of time affects other cognitive processes such as memory, problem solving, and decision making (Cuthbert & Standage, 2018). Beyond delay, there are other factors that have similarly wide-reaching effects on cognition, and might be useful candidates for anchoring research. One such factor is levels of processing.
Levels of Processing
The levels of processing theory of memory was originally conceptualized by Craik and Lockhart (1972), who proposed that memory trace persistence is a function of the depth of analysis, or “processing,” of a stimulus. The theory also states that the effectiveness of rehearsal depends on whether the rehearsal facilitates deeper processing, rather than on the duration of the rehearsal (Craik & Lockhart, 1972).
Delay as a Moderator of Levels of Processing
As with all theories of memory, time delay has been an important concern in the levels of processing framework. Comparisons between immediate recall, one-week delays, and two-week delays demonstrate that deeper processing results in more durable memory traces (Burton et al., 1981). This research also shows a pattern opposite to that observed for anchoring (Blankenship et al., 2008): while Blankenship et al. found a degraded anchoring effect over time, the levels of processing effect strengthened over time; the advantage in recall for participants who deeply processed the study material increased between recall at one week and recall at two weeks.
Processing Levels
Craik and Lockhart’s levels of processing framework leaves considerable room for researchers to operationalize processing levels. Slamecka and Graf (1978) investigated processing levels through a word pair completion study, and found that participants who self-generated responses had significant increases in memory performance relative to those who did not. Studies involving self-generation with numeric stimuli have produced similar results. This suggests that the levels of processing framework is flexible enough to apply to various areas of research, and that self-generation may be a particularly effective way of increasing depth.
Past research on anchoring has alluded to some of the concepts involved in levels of processing, but without naming the theory. These studies have recognized the power of self-generated material to lead to a more robust anchoring effect, potentially as a result of stronger memory traces (Mussweiler, 2001). Current research aims to tackle these questions directly and use the framework of processing levels to explore the varied durability of anchors.
Current Research and Hypotheses
While laboratory research is effective at isolating and analyzing individual cognitive processes, people are often placed in complex real-world situations that engage many processes at once. As such, understanding how these processes interact is critical to having an externally valid understanding of human behavior. For example, an individual decision might be simultaneously affected by the anchoring effect, levels of processing, and delay. In the present study, participants estimated glacial melting rates after either calculating an average melting rate themselves (deep processing) or being presented with an average melting rate (shallow processing). Across conditions, those estimates were made either immediately after anchor presentation, or following a delay.
Under these conditions, we hypothesized that: (1) the estimates of participants presented with a low anchor would be lower than those of participants presented with a high anchor, demonstrating a traditional anchoring effect; (2) the deep processing (calculated) conditions would generate a stronger anchoring effect than the shallow processing (presented) conditions; (3) the immediate conditions would generate a stronger anchoring effect than the delay conditions; and (4) there would be a significant interaction between depth of processing and delay, such that the depth of processing effect would be larger in the delay condition than in the immediate condition.
METHODS
Participants
The participants were 176 students and community members associated with Whitman College in Walla Walla, Washington, ranging in age from 18 to 82 (M = 29.66, SD = 15.98). Fifty-seven participants identified as male, 109 identified as female, six identified as non-binary, and four elected not to respond. One hundred forty-eight participants identified as White/Caucasian, eight identified as Asian, two identified as Hispanic/Latinx, 13 identified as mixed race, and five elected not to respond. An additional 122 participants started the survey but did not generate usable data due to incomplete responses.
Materials and Design
Anchoring Data
The data used to create anchoring stimuli came from a study measuring the melting rate of alpine glaciers in Austria (Paul, 2002). The study included data on historic melting rates of six glaciers. From these six glaciers, a mean overall melting rate and standard deviation were calculated. Data for the high-anchor and low-anchor conditions were then created as follows. The high-anchor values for each glacier were three standard deviations greater than the true values (M + 3 SD); similarly, the low-anchor values were three standard deviations lower than the true values (M - 3 SD). All values presented to participants, along with the original values, are compiled in Table 1. In shallow conditions, participants were presented with the data table, including the already calculated mean melting rate (bottom row of Table 1). In deep conditions, participants were presented with the same data table without the mean melting rate, along with instructions on how to calculate the mean melting rate themselves. Thus, data were presented to participants in one of four tables, depending on whether the high or low anchor values were present and whether or not the average was included: shallow high, shallow low, deep high, or deep low.
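As a minimal illustration of this construction (not the study materials themselves), the anchor tables can be derived as follows. The melting rates below are placeholders, since the actual values from Paul (2002) appear only in Table 1.

```python
import numpy as np

# Placeholder melting rates for the six glaciers; the values actually used in
# the study are those reported in Table 1 and are not reproduced here.
true_rates = np.array([12.0, 15.0, 18.0, 22.0, 25.0, 31.0])

mean_rate = true_rates.mean()
sd_rate = true_rates.std(ddof=1)  # sample standard deviation across the six glaciers

# Shift every glacier's value by three standard deviations to build the stimuli.
high_anchor_rates = true_rates + 3 * sd_rate  # high-anchor table (true value + 3 SD)
low_anchor_rates = true_rates - 3 * sd_rate   # low-anchor table (true value - 3 SD)

# Shallow conditions also display the pre-computed column mean (bottom row of the
# table); deep conditions omit it and ask participants to compute it themselves.
high_anchor_mean = high_anchor_rates.mean()   # equals mean_rate + 3 * sd_rate
low_anchor_mean = low_anchor_rates.mean()     # equals mean_rate - 3 * sd_rate
```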
Filled Delay
In order to observe whether the robustness of the anchoring effect changes over time, a delay was incorporated into the design of the experiment. The delay took the form of a three-minute, one-second Saturday Night Live (SNL) YouTube video (Michaels, 2019) that was entirely irrelevant to the topic of numerical data and glacial melting rates. Irrelevant media was chosen to prevent participants from referencing previous knowledge of the topic when answering questions about the data.
Measures
Absolute Estimate
The absolute estimate was the melting rate participants reported they would have estimated before viewing the data; mean absolute estimates were calculated for all eight groups.
Anchor Magnitude
Anchor magnitude was a measure of the strength of the anchoring effect, calculated as the absolute value of the difference between the absolute estimate and the true value.
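Stated as a computation, for a participant with absolute estimate e and true mean melting rate t, anchor magnitude is |e - t|. A minimal sketch follows; the numbers are hypothetical.

```python
def anchor_magnitude(absolute_estimate: float, true_value: float) -> float:
    """Strength of anchoring for one participant: the absolute distance
    between their estimate and the true mean melting rate."""
    return abs(absolute_estimate - true_value)

# Hypothetical example: with a true value of 19.0, estimates of 29.0 and 9.0
# both yield a magnitude of 10.0, which is what later allows high- and
# low-anchor participants to be pooled when comparing depth and delay conditions.
print(anchor_magnitude(29.0, 19.0))  # 10.0
print(anchor_magnitude(9.0, 19.0))   # 10.0
```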
Climate Change Belief
In order to consider whether the anchoring effect was moderated by participants’ belief in climate change, participants completed one final question in which they responded to the statement “I believe the climate crisis is one of the most grave threats to society today” using a 7-point Likert-type scale ranging from strongly disagree to strongly agree.
Procedure
Participants voluntarily and remotely completed a Qualtrics survey sent out via the Whitman College student listserv. Upon electronically signing a consent form, participants were presented with a table containing six data points from a study on glacial melting rates. Participants in the shallow processing conditions were presented with the mean glacial melting rate at the bottom of the table, were asked to consider all melting rates and the average value, and were required to enter the average value to ensure that they had read and processed it. Participants in the deep processing conditions were presented with the same representation of the data but without the stated mean. Instead, they were asked to calculate the average value using a cellphone or computer calculator and to enter the value they calculated.
Following the presentation/calculation of an anchor, participants were asked to make comparative and absolute judgments about that value. In the immediate condition, participants made these judgments right away, responding to the following questions: a) “Would you have estimated a value that is higher or lower than the average presented in the table/average you determined?” and b) “What average melting rate would you have estimated?” After entering an estimate, participants were asked to enter the average value they calculated/were presented with earlier, as a manipulation check ensuring the validity of the experimental manipulation.
Participants in the delay condition calculated/were presented with the mean, then were instructed to watch a three-minute, one-second SNL YouTube video as a filled delay. Following the video, participants answered questions a) and b), referred to above. They then underwent the same manipulation check that asked them to recall the average value. One participant in the delay condition was excluded for failing the manipulation check.
Following completion of the anchoring procedure, participants were asked the climate change belief question, referred to above. Finally, participants were asked to report their age, gender, and race/ethnicity before being debriefed and released from the procedure. Participants were not compensated for their participation.
Proposed Analyses
Four statistical tests were used to analyze the results once data were collected. A two-condition one-way ANOVA was run to consider the main effect of anchor condition on the absolute estimate value, and a 2x2 factorial ANOVA was run to evaluate the effects of depth and delay on anchor magnitude. For the exploratory analyses, two independent samples t-tests were used to determine whether there was a significant difference between low and high anchors for each climate change belief group. A final 2x2 factorial ANOVA was used to identify whether the effect of a low or high anchor on anchor magnitude depended on one’s climate change belief classification.
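The paper does not specify the analysis software, so the following is only a hedged sketch of how these four tests could be run, assuming a hypothetical long-format data frame df with one row per participant and columns anchor, depth, delay, belief, absolute_estimate, and anchor_magnitude.

```python
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def run_planned_analyses(df: pd.DataFrame) -> None:
    # (1) Two-condition one-way ANOVA: anchor condition -> absolute estimate.
    low = df.loc[df["anchor"] == "low", "absolute_estimate"]
    high = df.loc[df["anchor"] == "high", "absolute_estimate"]
    print(stats.f_oneway(low, high))

    # (2) 2x2 factorial ANOVA: depth, delay, and their interaction -> anchor magnitude.
    depth_delay = smf.ols("anchor_magnitude ~ C(depth) * C(delay)", data=df).fit()
    print(anova_lm(depth_delay, typ=2))

    # (3) Exploratory independent-samples t-tests: low vs. high anchor,
    #     run separately within each climate change belief group.
    for group, sub in df.groupby("belief"):
        result = stats.ttest_ind(
            sub.loc[sub["anchor"] == "low", "anchor_magnitude"],
            sub.loc[sub["anchor"] == "high", "anchor_magnitude"],
        )
        print(group, result)

    # (4) 2x2 factorial ANOVA: anchor condition x belief group -> anchor magnitude.
    anchor_belief = smf.ols("anchor_magnitude ~ C(anchor) * C(belief)", data=df).fit()
    print(anova_lm(anchor_belief, typ=2))
```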
RESULTS
Hypothesized Effects
This study was a 2 (anchor condition: low, high) x 2 (delay condition: immediate, delayed) x 2 (processing condition: shallow, deep) between-groups design, yielding eight conditions. Mean absolute estimates were calculated individually for all eight conditions (see Table 2 and Figure 1). To assess the effects of delay and depth of processing, these estimates were converted into anchor magnitudes, equal to the absolute value of the difference between an estimate and the true value. Given that each anchor was equidistant from the true value, high and low conditions could be grouped together to calculate four anchor magnitude means, one for each combination of delay and processing conditions (see Table 3 and Figure 2).
Hypothesis 1: Anchoring Effect
Using the absolute estimate means, a two-condition one-way ANOVA was run to consider the main effect of anchor condition on the absolute estimate value. A significant effect was found, F(1, 174) = 259.363, p < .001, η² = .598. The measure of effect size, η², is the proportion of variance in the dependent variable explained by the independent variable (the between-groups sum of squares divided by the total sum of squares). This finding supports the first hypothesis that high (M = 29.18, SD = 10.81) and low (M = 9.10, SD = 4.34) anchor estimates would significantly differ, demonstrating a traditional anchoring effect.
Hypotheses 2 - 4: Depth, Delay, and the Interaction
Separately, anchor magnitude means were analyzed with a 2x2 factorial ANOVA to detect possible effects of depth or delay on anchor magnitude, as well as a possible interaction between the two. The effect of processing depth on anchor magnitude was not significant, F(1, 172) = 1.787, p = .183, η² = .010. Second, the effect of delay on anchor magnitude was not significant, F(1, 172) = 1.247, p = .266, η² = .007. Finally, the interaction between depth and delay was not significant, F(1, 172) = 0.428, p = .514, η² = .002, so the null hypothesis was retained for all three.
Power Analyses
Given an effect size of η² = .010 and α = .05 for the processing depth conditions, G*Power (Faul et al., 2007) recommends a sample size of 1,302 participants to achieve power of .95. For the delay conditions, given an effect size of η² = .007 and α = .05, a post hoc power analysis showed that a sample size of 1,844 participants would have been needed to achieve power of .95.
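These figures can be approximated in code. G*Power's ANOVA routines take Cohen's f rather than η², with the conversion f = sqrt(η² / (1 - η²)); the following sketch uses statsmodels and should land close to, though not necessarily exactly on, the G*Power numbers.

```python
import math
from statsmodels.stats.power import FTestAnovaPower

def eta_sq_to_cohens_f(eta_sq: float) -> float:
    """Convert eta-squared to Cohen's f, the effect-size metric used for ANOVA power."""
    return math.sqrt(eta_sq / (1.0 - eta_sq))

solver = FTestAnovaPower()
for eta_sq in (0.010, 0.007):
    f = eta_sq_to_cohens_f(eta_sq)
    # Total sample size needed for a two-group comparison at alpha = .05, power = .95.
    n_total = solver.solve_power(effect_size=f, alpha=0.05, power=0.95, k_groups=2)
    print(f"eta^2 = {eta_sq:.3f} -> f = {f:.3f}, total N ~ {math.ceil(n_total)}")
# The totals fall in the neighborhood of the 1,302 and 1,844 reported above.
```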
Exploratory Analyses
Following the planned analyses, exploratory analyses were conducted to look for a possible influence of climate change belief on anchoring. Participants were separated into two groups depending on their Likert-scale response to the climate change belief question. Participants who agreed or strongly agreed with the “climate crisis” statement (n = 151) were labeled as having “high climate change belief” and participants who answered “somewhat agree” or any response less affirmative (n = 22) were labeled as having “low climate change belief.” This excludes three participants who did not provide climate change belief responses. Anchor magnitudes in high and low anchor conditions were then compared for each of the two groups using separate independent samples t-tests.
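As an illustrative sketch of this recoding step (the response labels below are assumed to match standard 7-point Likert wording and are not taken from the survey itself):

```python
from typing import Optional

import pandas as pd

# Responses treated as "high climate change belief"; "somewhat agree" and
# anything less affirmative is coded as "low".
HIGH_BELIEF_RESPONSES = {"Agree", "Strongly agree"}

def belief_group(response: Optional[str]) -> Optional[str]:
    """Return 'high', 'low', or None for participants with no response."""
    if response is None or (isinstance(response, float) and pd.isna(response)):
        return None
    return "high" if response in HIGH_BELIEF_RESPONSES else "low"

# Hypothetical usage: df["belief"] = df["climate_item"].map(belief_group)
# Participants whose belief comes back None are dropped before the t-tests.
```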
Among “high climate change belief” participants, a significant difference between low and high anchor conditions was found, t(149) = -2.108, p = .037. Specifically, mean anchor magnitude was smaller when high climate change belief participants were presented with a low anchor (M = 9.99, SD = 3.66) than when they were presented with a belief-consistent high anchor (M = 12.33, SD = 8.82). In contrast, no significant difference in anchor magnitude was found between low and high anchor conditions in the “low climate change belief” group, t(20) = 0.906, p = .376.
To further explore this asymmetric finding, a 2 (climate change belief: low, high) x 2 (anchor: low, high) factorial ANOVA was utilized. The interaction between anchor condition and belief group was not significant, F(1, 172) = 0.002, p = .968, η² = .000. Thus, results do not indicate that the effect of a high or low anchor on anchor magnitude depends on one’s belief in climate change.
DISCUSSION
Support of Hypotheses
Out of the four hypotheses in the present research, one was supported by the results. There was a significant effect of anchor condition on the absolute estimate value. The main effects of processing depth and delay on anchor magnitude, as well as the interaction between depth and delay, were not significant.
Traditional Anchoring Effect
The results of the current study indicate that a significant anchoring effect was created by the manipulation, with high anchors producing significantly higher absolute estimates than low anchors. This finding is not unexpected given how robust and well replicated the anchoring effect has been across a number of studies and applications.
One major deviation from the expected results of a traditional anchoring effect is that across all conditions, high anchors were stronger than low anchors, which differs from the expected symmetrical anchoring effect. One plausible explanation is the social pressure that participants may have perceived when making specific judgments about a socially charged issue like climate change. The skewed distribution of responses to the climate belief Likert-scale measure supports this possibility. The participant pool overall had a strong belief in climate change, with 87% of participants falling into the “high climate change belief” group. This overrepresentation of climate change believers could have boosted absolute estimates in both high and low anchor conditions. Such an overall upward shift, especially within the “high climate change belief” group, would logically make the magnitude of the anchoring effect greater for high anchor participants, whose estimates would move away from the true value, whereas the estimates of low anchor participants would move toward it.
Depth Effect
Although a significant effect of depth of processing on anchor magnitude was not found, the data provide an important starting point, given that there is almost no other research on how processing depth might influence anchoring. Because the sample size of 1,302 indicated by the post hoc power analysis is unrealistic for most researchers to obtain, one of the most important considerations for future research will be to create manipulations that generate larger effect sizes.
In the present study, the depth manipulation asked participants either to observe a mean overall melting rate that had been calculated for them, or to calculate it themselves. This relatively simple manipulation created two distinctly different conditions, with one group making multiple calculations along the way – an established method for operationalizing levels of processing (Benton et al., 1983). However, the method yielded a relatively small effect size (η² = .010). Future research should consider using different depth of processing manipulations that may prove to be more effective. For example, greater depth could be achieved by asking participants to consider more qualitative attributes of a stimulus, such as by identifying categories or placing stimuli in order among other similar stimuli (Treisman & Tuxworth, 1974).
Cognitive processing is especially relevant to the present research, since the subject matter, the melting of Austrian glaciers, is relatively esoteric. While many participants are likely informed about the existence of glacial melt due to global climate change, they may not feel equipped to make precise numerical estimates on the subject. Adding more weight to the topic and/or giving participants more background could have boosted statistical power. For example, participants could have been provided with a passage with more information about the glaciers, or visual presentations of melting glaciers. While additional manipulations may introduce other control concerns, generating a stronger manipulation of depth of processing is crucial to demonstrating a possible influence of this factor.
Delay Effect
There was no significant effect of time delay on anchor magnitude, but it is worth considering the implications for future anchoring research that explores this variable. First, the recommended sample size of 1,844 produced by the post hoc power analysis highlights that the measured effect size (η² = .007) is small. Had the means been significantly different, the present study would have been consistent with the results of Blankenship et al. (2008), which found that the anchoring effect is an immediately powerful cognitive heuristic but loses strength over time.
It is worth noting that the nonsignificant effect in the present study occurred over a relatively short delay. While previous memory research has used delays as short as 20 seconds (Treisman & Tuxworth, 1974), the three-minute, one-second delay is novel in the context of anchoring. There is no identified threshold for when the diminishing effect of a time delay sets in, and this study begins the process of defining the time constraints on anchoring. Perhaps the difference in anchoring magnitude between immediate and delayed conditions is not significant for a three-minute, one-second delay, but would be significant for a 10-minute delay. Furthermore, certain topics might be more memorable for a group of participants. If so, the effect of delay may not be constant. Therefore, future anchoring studies should acknowledge the topic and context in which a diminishing effect occurs, and test more than one delay.
Interaction Between Processing Depth and Delay
A significant interaction between depth and delay was not found, so the hypothesis that a delay increases the effect of depth of processing is not supported. Given the novelty of the present study in exploring the interaction of depth and delay, future research should keep the directionality of these results and potential mechanisms in mind. Notably, the difference in anchor magnitude between shallow and deep processing conditions was (nonsignificantly) smaller in the immediate conditions (deep processing resulted in an anchor magnitude increase of 0.70) than in the delayed conditions (deep processing resulted in an anchor magnitude increase of 2.04). These results, if significant, would make sense within the framework of the attitude change model. The relatively small reduction in anchor magnitude across the delay following a deeply processed anchor would suggest that the hypothesis-testing process produces a more durable anchoring effect than the “hint” anchoring process. This explanation is consistent with research showing that deeper processing results in more durable memory traces, both in the context of long delays (one week in Burton et al., 1981) and short delays (20 seconds in Treisman & Tuxworth, 1974).
Future research into delay and depth manipulations should continue to recognize the attitude change model as a potential mechanism, but we must acknowledge that the interaction was not significant and that other mechanisms might ultimately explain these results. Keeping in mind that the attitude change model originated from research that did not explicitly consider the interaction between depth and delay, it is recommended that replications of the present research be conducted so that a more inclusive and tailored mechanism can be adopted.
Exploratory Analysis of Climate Change Belief Measure
It is reasonable to assume that participants viewed glacial melting rates as linked with climate change. The survey established this by stating “The melting of the world’s glaciers has long been recognized as one of the clearest signs of global warming.” One might hypothesize that those with high climate change belief would be more anchored by a belief-consistent high anchor and those with low climate change belief would be more anchored by a low anchor. The results suggest that those with high climate change belief were indeed more susceptible to high anchors than low anchors, but those with low climate change belief exhibited no such difference. High climate change belief participants were significantly more influenced by a high anchor (M = 12.33, SD = 8.82) than by a low anchor (M = 9.99, SD = 3.66). In comparing the low and high climate change belief groups to each other, results of the 2x2 ANOVA indicate that there was no significant interaction between anchor condition and belief group. Thus, no definitive statements can be made regarding how anchors affect belief groups differently.
There are a few potential explanations for why the low anchor did not have a comparable deflationary effect for the low climate change belief group. First, the sample was skewed towards high climate change belief participants, leaving a relatively small low climate change belief group (n = 22) that limited the power of the analysis. Additionally, to run the analysis and differentiate the groups, the low climate change belief group was necessarily more heterogeneous; it included participants ranging from “strongly disagree” to “somewhat agree” on the climate change belief measure. Therefore, the result that the low climate change belief group was not more susceptible to one anchor over the other may not be surprising. It is possible that the anchoring effect was enhanced by high anchors for some members of the group and by low anchors for others, ultimately obscuring any effect. A replication of the current study and other future related research should ideally seek a more diverse sample in terms of climate change belief. This would allow for more high-powered analyses between groups as well as more definitive claims about how anchoring might differentially impact the groups.
Finally, this exploratory analysis should be considered with respect to a similar applied anchoring study by Lalot et al. (2019) on immigration data. Both Lalot et al. (2019) and the present study found the more liberal group to exhibit a larger belief-consistent anchor magnitude (assuming climate crisis belief can be grouped with liberalism). Future research on more ecologically valid anchoring scenarios should consider how low and high anchors differentially affect cohorts with polarized views on social issues.
Moving Forward
Reflecting on the general findings of this study, it is important to place them in a real world context, and think broadly about how they may apply to human behavior outside the laboratory. Building on a massive base of past research, the anchoring effect continues to demonstrate its place as a particularly reliable and robust psychological construct.
It is hoped that the present research can inform a greater understanding of how data are consumed. It can be assumed that the power of numbers to convey a certain narrative will be harnessed for both positive and negative purposes, so directed research should explore the potential for anchoring as a tool to understand or propagate misinformation. The moderators of delay and belief in the relevant topic are therefore important to consider in future research.
The direction of anchoring research to more ecologically valid scenarios is an important and necessary trend, and anchoring in the context of socially divisive topics should continue to be explored with diverse and representative samples. To this end, anchoring research should be cognizant of the growing awareness in psychological science of the need to engage populations outside of the “Western, Educated, Industrialized, Rich, and Democratic” participant archetype that is often oversampled and falsely assumed to be generalizable (Henrich et al., 2010). The role of data and statistics in society and the everyday lives of people can only be expected to grow in years and decades to come. As long as this is true, the increased understanding of consistent human error in processing through heuristics like the anchoring effect will continue to be paramount.
REFERENCES
Benton, S. L., Glover, J. A., & Bruning, R. H. (1983). Levels of processing: Effect of number of decisions on prose recall. Journal of Educational Psychology, 75(3), 382-390.
Blankenship, K. L., Wegener, D. T., Petty, R. E., Detweiler-Bedell, B., & Macy, C. L. (2008). Elaboration and consequences of anchored estimates: An attitudinal perspective on numerical anchoring. Journal of Experimental Social Psychology, 44(6), 1465-1476. doi: 10.1016/j.jesp.2008.07.005
Burton, J. K., Niles, J. A., & Wildman, T. M. (1981). Levels of processing effects on the immediate and delayed recall of prose. Journal of Reading Behavior, 13(2), 157-164.
Craik, F. I., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11(6), 671-684.
Cuthbert, B., & Standage, D. (2018). On the short-lived nature of working memory: Drift and decay in a population-coding model. The Journal of Neuroscience, 38(48), 10241-10243. Retrieved from https://www.jneurosci.org/content/jneuro/38/48/10241.full.pdf
Englich, B., Mussweiler, T., & Strack, F. (2006). Playing dice with criminal sentences: The influence of irrelevant anchors on experts’ judicial decision making. Personality and Social Psychology Bulletin, 32(2), 188-200. doi: 10.1177/0146167205282152
Englich, B. (2008). When knowledge matters - differential effects of available knowledge in standard and basic anchoring tasks. European Journal of Social Psychology, 38(5), 896-904. doi: 10.1002/ejsp.479
Epley, N., & Gilovich, T. (2001). Putting adjustment back into the anchoring and adjustment heuristic: Differential processing of self-generated and experimenter-provided anchors. Psychological Science, 12, 391-396. doi: 10.1111/1467-9280.00372
Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175-191.
Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The weirdest people in the world? Behavioral and Brain Sciences, 33(2-3), 61-83. doi: 10.1017/S0140525X0999152X
Lalot, F., Quiamzade, A., & Falomir‐Pichastor, J. M. (2019). How many migrants are people willing to welcome into their country? The effect of numerical anchoring on migrants' acceptance. Journal of Applied Social Psychology, 49, 361-371. doi: 10.1111/jasp.12588
Michaels, L. [Saturday Night Live]. (2019, September 28). Weekend Update: David Ortiz on Getting Shot [Video]. Retrieved from https://www.youtube.com/watch?v=QNtf6qRgxfU
Mussweiler, T., & Strack, F. (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm: A selective accessibility model. Journal of Experimental Social Psychology, 35(2), 136-164.
Mussweiler, T. (2001). The durability of anchoring effects. European Journal of Social Psychology, 31(4), 431-442. doi: 10.1002/ejsp.52
Paul, F. (2002). Changes in glacier area in Tyrol, Austria, between 1969 and 1992 derived from Landsat 5 Thematic Mapper and Austrian Glacier Inventory data. International Journal of Remote Sensing, 23(4), 787-799. doi: 10.1080/01431160110070708
Slamecka, N. J., & Graf, P. (1978). The generation effect: Delineation of a phenomenon. Journal of Experimental Psychology: Human Learning and Memory, 4(6), 592-604.
Treisman, A., & Tuxworth, J. (1974). Immediate and delayed recall of sentences after perceptual processing at different levels. Journal of Verbal Learning and Verbal Behavior, 13(1), 38-44.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.