Computer scientist Latanya Sweeney turns from her highly influential work on data privacy to investigate patterns in Google AdSense. Specifically, she “asks whether”:https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2208240 the ads that you see when you search for someone by name vary depending on whether you search for a characteristically black or characteristically white name.
bq. First names, previously identified by others as being assigned at birth to more black or white babies, are found predictive of race (88% black, 96% white), and those assigned primarily to black babies, such as DeShawn, Darnell and Jermaine, generated ads suggestive of an arrest in 81 to 86 percent of name searches on one website and 92 to 95 percent on the other, while those assigned at birth primarily to whites, such as Geoffrey, Jill and Emma, generated more neutral copy: the word “arrest” appeared in 23 to 29 percent of name searches on one site and 0 to 60 percent on the other. On the more ad trafficked website, a black-identifying name was 25% more likely to get an ad suggestive of an arrest record. A few names did not follow these patterns: Dustin, a name predominantly given to white babies, generated an ad suggestive of arrest 81 and 100 percent of the time.
It isn’t clear, however, whether this pattern results from racially biased expectations on the part of the advertiser or on the part of the people who click on the ads.
bq. Google understands that an advertiser may not know which ad copy will work best, so an advertiser may give multiple templates for the same search string and the “Google algorithm” learns over time which ad text gets the most clicks from viewers of the ad. It does this by assigning weights (or probabilities) based on the click history of each ad copy. At first all possible ad copies are weighted the same, they are all equally likely to produce a click. Over time, as people tend to click one version of ad text over others, the weights change, so the ad text getting the most clicks eventually displays more frequently. … Did Instant Checkmate provide ad templates suggestive of arrest disproportionately to black-identifying names? Or, did Instant Checkmate provide roughly the same templates evenly across racially associated names but society clicked ads suggestive of arrest more often for black identifying names? Google uses cloud-caching strategies to deliver ads quickly, might these strategies bias ad delivery towards ad templates previously loaded in the cloud cache? Is there a combinatorial effect?
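The rotation scheme Sweeney describes — equal initial weights, then display frequency shifting toward whichever copy gets clicked — can be sketched roughly as follows. This is only an illustration of the mechanism as the quote explains it, not Google’s actual implementation; the class name, template strings, and click rates are all invented for the example.

```python
import random

class AdRotator:
    """Sketch of click-weighted ad-copy rotation as described in the quote.

    All details (weighting scheme, update rule) are assumptions for
    illustration, not Google's actual algorithm.
    """

    def __init__(self, templates):
        # At first all ad copies are weighted the same:
        # each is equally likely to be displayed.
        self.weights = {t: 1.0 for t in templates}

    def choose(self):
        # Display probability is proportional to accumulated weight.
        templates = list(self.weights)
        return random.choices(
            templates, weights=[self.weights[t] for t in templates]
        )[0]

    def record_click(self, template):
        # A click raises that copy's weight, so the copy that gets
        # the most clicks eventually displays more frequently.
        self.weights[template] += 1.0

# Hypothetical simulation: if viewers click the "arrest" copy more
# often, its weight -- and hence its display frequency -- grows.
rotator = AdRotator(["neutral copy", "arrest copy"])
for _ in range(1000):
    shown = rotator.choose()
    if shown == "arrest copy" and random.random() < 0.3:
        rotator.record_click(shown)
    elif shown == "neutral copy" and random.random() < 0.1:
        rotator.record_click(shown)
```

The point of the sketch is that the feedback loop alone can produce Sweeney’s observed skew: even if the advertiser supplies identical templates for every name, viewers clicking “arrest” copy more often for black-identifying names would be enough to make that copy dominate for those names.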