API Endpoint for journals.

GET /api/articles/29482/?format=api
HTTP 200 OK
Allow: GET
Content-Type: application/json
Vary: Accept

{
    "pk": 29482,
    "title": "Bootstrap Hell: Perceptual Racial Biases in a Predictive Processing Framework",
    "subtitle": null,
    "abstract": "Predictive processing, or predictive coding, is transforming our knowledge of perception (Knill & Richards, 1996; Rao & Ballard, 1999), the brain (Friston, 2018; Hohwy, 2013; Knill & Pouget, 2004), and embodied cognition (Allen & Friston, 2018; Clark, 2016; Gallagher & Allen, 2018; Seth, 2015). Predictive processing is a hierarchical implementation of empirical Bayes, wherein the cognitive system creates generative models of the world and tests its hypotheses against incoming data. It is hierarchical insofar as the predictions at one level are tested against incoming signals from the lower level. The resulting prediction error, the difference between the expectation and the incoming data, is used to recalibrate the model in a process of prediction error minimization. Predictions may be mediated by pyramidal cells across the neocortex (Bastos et al., 2012; Hawkins & Ahmad, 2016; Shipp et al., 2013). Andy Clark has characterized predictive processing as creating a “bootstrap heaven” (2016, p. 19), enabling the brain to develop complex models of the world from limited data. This enables us to extract patterns from ambiguous signals and establish hypotheses about how the world works.\nThe training signals that we get from the world are, however, biased in all the same unsightly ways that our societies are biased: by race, gender, socioeconomic status, nationality, and sexual orientation. The problem is more than a mere sampling bias. Our societies are replete with prejudicial biases that shape the ways we think, act, and perceive. Indeed, a similar problem arises in machine learning applications when they are inadvertently trained on socially biased data (Avery, 2019; N. T. Lee, 2018). The basic principle in operation here is “garbage in, garbage out”: a predictive system that is trained on socially biased data will be systematically biased in those same ways. Unfortunately, we are unwittingly trained on this prejudiced data from our earliest years. As predictive systems, we bootstrap upwards into more complex cognitive processes while being fed prejudiced data, spiraling us into a “bootstrap hell.” This has repercussions for everything from higher-order cognitive processes down to basic perceptual processes. Perceptual racial biases include perceiving greater diversity and nuance in racial ingroup faces (the cross-race effect; Malpass & Kravitz, 1969), misperceiving actions of racial outgroup members as hostile (Pietraszewski et al., 2014), and empathetically perceiving emotions in racial ingroup (but not outgroup) faces (Xu et al., 2009), among other phenomena. They are particularly worrying due to their recalcitrance to conscious control or implicit bias training. We may be able to veto a prejudiced thought (but see Kelly & Roedder, 2008), but we cannot simply modify our perceptual experience at will. Recalcitrant predictions such as these are “hyperpriors” and are unamenable to rapid, conscious adjustment.\nI begin with an overview of predictive processing. I explain that the same principles that allow us to bootstrap our way into full cognition also allow for biases to develop. These biases include perceptual racial biases, which are visual and affective rather than cognitive. I explain how sampling biases in infancy and emotion perception contribute to perceptual racial biases (although many other factors certainly play a role). Finally, I hypothesize that traditional implicit bias training may not be enough to disentangle the web of hypotheses that contribute to perceptual racial bias.",
    "language": "eng",
    "license": {
        "name": "",
        "short_name": "",
        "text": null,
        "url": ""
    },
    "keywords": [
        {
            "word": "philosophy of cognitive science; predictive coding; predictive processing; racial bias"
        }
    ],
    "section": "Biases",
    "is_remote": true,
    "remote_url": "https://escholarship.org/uc/item/67t7m6m6",
    "frozenauthors": [
        {
            "first_name": "Zachariah",
            "middle_name": "A.",
            "last_name": "Neemeh",
            "name_suffix": "",
            "institution": "University of Memphis",
            "department": ""
        }
    ],
    "date_submitted": null,
    "date_accepted": null,
    "date_published": "2020-01-01T18:00:00Z",
    "render_galley": null,
    "galleys": [
        {
            "label": "PDF",
            "type": "pdf",
            "path": "https://journalpub.escholarship.org/cognitivesciencesociety/article/29482/galley/19342/download/"
        }
    ]
}
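
A minimal sketch of consuming this response in Python. The helper names (`author_names`, `pdf_galleys`) are my own, not part of the API; the record below is a trimmed copy of the JSON above (abstract and other fields omitted for brevity), parsed locally rather than fetched over the network.

```python
import json

# Trimmed copy of the response body shown above.
record = json.loads("""
{
    "pk": 29482,
    "title": "Bootstrap Hell: Perceptual Racial Biases in a Predictive Processing Framework",
    "frozenauthors": [
        {"first_name": "Zachariah", "middle_name": "A.", "last_name": "Neemeh",
         "name_suffix": "", "institution": "University of Memphis", "department": ""}
    ],
    "galleys": [
        {"label": "PDF", "type": "pdf",
         "path": "https://journalpub.escholarship.org/cognitivesciencesociety/article/29482/galley/19342/download/"}
    ]
}
""")

def author_names(rec):
    """Join the name parts of each frozen author, skipping empty fields."""
    names = []
    for a in rec["frozenauthors"]:
        parts = [a["first_name"], a["middle_name"], a["last_name"], a["name_suffix"]]
        names.append(" ".join(p for p in parts if p))
    return names

def pdf_galleys(rec):
    """Return download URLs for galleys whose type is 'pdf'."""
    return [g["path"] for g in rec["galleys"] if g["type"] == "pdf"]

print(author_names(record))      # ['Zachariah A. Neemeh']
print(pdf_galleys(record)[0])
```

Since `name_suffix` and `department` are empty strings in this record, filtering falsy parts before joining keeps the formatted name free of stray spaces.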