The Edmonton Police Service (EPS) shared a computer-generated image of a suspect produced through DNA phenotyping, a technique the service used for the first time in hopes of identifying a suspect in a 2019 sexual assault case. Using DNA evidence from the case, a company called Parabon NanoLabs created the image of a young Black man. The composite image did not account for the suspect’s age, BMI, or environmental factors, such as facial hair, tattoos, and scars. The EPS then released the image to the public, both on its website and on social media platforms including Twitter, calling the move “a last resort after all investigative avenues have been exhausted.”
According to privacy experts, the EPS’s decision to produce and share this image is extremely harmful: it raises questions about racial bias in DNA phenotyping for forensic investigations and about the privacy violations inherent in the DNA databases that investigators are able to search.
In response to the EPS’s tweet of the image, many privacy and criminal justice experts responded with indignation at the department’s irresponsibility. Callie Schroeder, the Global Privacy Counsel at the Electronic Privacy Information Center, retweeted it, questioning the image’s usefulness: “Even if it is a new piece of information, what are you going to do with this? Question every approximately 5’4″ black man you see? …that is not a suggestion, absolutely do not do that.”
“People should know that if they send their DNA to a consumer-facing company, their genetic information may fall into the hands of law enforcement to be used in criminal investigations against them or their genetic relatives. None of this data is covered by federal health privacy rules in the United States,” Lynch said. “While 23andMe and Ancestry generally require warrants and limit the disclosure of their users’ data to law enforcement, other consumer genetic genealogy companies like GEDmatch and FamilyTreeDNA provide near-wholesale law enforcement access to their databases.”