Researchers produce images of people’s faces from their genomes

The Economist:

Craig Venter, a biologist and boss of Human Longevity, a San Diego-based company that is building the world’s largest genomic database, is something of a rebel. In the late 1990s he declared that the international, publicly funded project to sequence the human genome was going about it the wrong way, and he developed a cheaper and quicker method of his own. His latest ruffling of feathers comes from work that predicts what a person will look like from their genetic data.

Human Longevity has assembled 45,000 genomes, mostly from patients who have been in clinical trials, and data on their associated physical attributes. The company uses machine-learning tools to analyse these data and then make predictions about how genetic sequences are tied to physical features. These efforts have improved to the point where the company is able to generate photo-like pictures of people without ever clapping eyes on them.

In a paper this week in Proceedings of the National Academy of Sciences, Dr Venter and his colleagues describe the process, which they call “phenotype-based genomic identification”. The team took an ethnically diverse group of 1,061 people of different ages and sequenced their genomes. They also took high-resolution, three-dimensional images of their faces, and measured their eye and skin colour, age, height and weight. This information was used as a “training set” to develop an algorithm capable of working out what people would look like on the basis of their genes.
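
To give a flavour of the training step described above, here is a minimal, hypothetical sketch of supervised genotype-to-phenotype prediction: SNP genotypes encoded as allele counts, a simulated continuous trait standing in for the real facial and body measurements, and a ridge regression fit with scikit-learn. This is an illustration of the general idea only, not the algorithm in the paper; apart from the 1,061-person cohort size, every number and variable name below is invented.

```python
# Toy sketch of genotype-to-phenotype prediction (not Human Longevity's method).
# Assumes genotypes are encoded as minor-allele counts (0/1/2) per SNP and that
# the phenotype is a single continuous trait such as height.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_people, n_snps = 1061, 5000  # cohort size mirrors the paper; SNP count is arbitrary
genotypes = rng.integers(0, 3, size=(n_people, n_snps)).astype(float)

# Simulate a trait influenced by a small subset of SNPs plus noise.
causal_weights = np.zeros(n_snps)
causal_weights[rng.choice(n_snps, 50, replace=False)] = rng.normal(0, 1, 50)
height_cm = 170 + genotypes @ causal_weights + rng.normal(0, 5, n_people)

X_train, X_test, y_train, y_test = train_test_split(
    genotypes, height_cm, test_size=0.2, random_state=0
)

# Regularisation keeps the (many SNPs, few people) problem well-posed.
model = Ridge(alpha=10.0).fit(X_train, y_train)
print(f"Held-out R^2: {model.score(X_test, y_test):.2f}")
```

The published work, of course, predicts much richer phenotypes than a single number, reconstructing 3D face geometry along with eye and skin colour, age, height and weight from whole-genome data, which takes considerably more machinery than this toy model.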