Figure 1: In this example, a neural network is trained to dream, or ‘hallucinate’, the features of arches onto a given depth rendering, imposing novel geometric structure on the input; a style transfer is then performed between a texture image rendered from a student’s 3D model and the result of the dreaming process. Works by Hannah Daugherty, Mariana Moreira de Carvalho, Imman Suleiman.
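The two-step process named in the caption can be sketched at toy scale. The following is an illustrative NumPy miniature, not the network used for the student works: “dreaming” is reduced to gradient ascent on the input image to amplify a fixed filter’s squared response (the core DeepDream move), and “style transfer” is reduced to imposing the texture image’s global mean and variance on the dreamed image (the statistic AdaIN-style transfer methods match). The `arch_filter` kernel and all array sizes here are hypothetical stand-ins.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def dream(img, kernel, steps=25, lr=0.1):
    """Gradient ascent on 0.5 * sum(response**2): each step scatters
    response[i, j] * kernel back over the window it came from, which is
    exactly the gradient of that objective w.r.t. the input image."""
    img = img.copy()
    kh, kw = kernel.shape
    for _ in range(steps):
        resp = conv2d(img, kernel)
        grad = np.zeros_like(img)
        for i in range(resp.shape[0]):
            for j in range(resp.shape[1]):
                grad[i:i + kh, j:j + kw] += resp[i, j] * kernel
        img += lr * grad / (np.abs(grad).max() + 1e-8)
    return img

def transfer_style(content, style):
    """Crude 'style transfer': impose the style image's global mean and
    standard deviation on the content image."""
    c = (content - content.mean()) / (content.std() + 1e-8)
    return c * style.std() + style.mean()

rng = np.random.default_rng(0)
depth = rng.random((16, 16))                  # stand-in for a depth rendering
arch_filter = np.array([[0., 1., 0.],         # toy edge-like "arch" kernel
                        [1., -4., 1.],
                        [0., 1., 0.]])
dreamed = dream(depth, arch_filter)           # structure imposed on the input
texture = rng.random((16, 16)) * 0.5 + 0.25   # stand-in for a rendered texture
result = transfer_style(dreamed, texture)     # texture statistics imposed
```

In a full pipeline the hand-written filter would be a layer of a pretrained convolutional network and the moment matching would be replaced by Gram-matrix optimization; the control flow, however, is the same: amplify features first, restyle the result second.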

Towards Hallucinating Machines – Designing with Computational Vision

The main aim of this paper is to demonstrate and…

The painting Portrait of Edmond Belamy, by the Paris-based art collective Obvious,14 was produced with a Generative Adversarial Network (GAN) trained on a set of 15,000 portrait examples spanning the 14th to the 19th century.
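The adversarial setup behind such a model can be shown in miniature. This is a hedged sketch, not the Obvious model: a one-parameter-per-weight linear generator learns to mimic a 1-D Gaussian “dataset” (standing in for the portrait images), a logistic discriminator scores real versus generated samples, and both are updated with hand-derived gradients using the non-saturating generator loss.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def real_batch(n):
    # "Real" data stands in for the portrait set: samples from N(3, 1).
    return rng.normal(3.0, 1.0, n)

w, c = 1.0, 0.0   # generator G(z) = w*z + c
a, b = 0.1, 0.0   # discriminator D(x) = sigmoid(a*x + b)
lr, batch = 0.05, 64

for step in range(2000):
    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    x = real_batch(batch)
    z = rng.normal(0.0, 1.0, batch)
    g = w * z + c
    d_real, d_fake = sigmoid(a * x + b), sigmoid(a * g + b)
    a += lr * (np.mean((1 - d_real) * x) - np.mean(d_fake * g))
    b += lr * (np.mean(1 - d_real) - np.mean(d_fake))
    # Generator step: ascend log D(fake) (non-saturating loss).
    z = rng.normal(0.0, 1.0, batch)
    g = w * z + c
    d_fake = sigmoid(a * g + b)
    dg = (1 - d_fake) * a          # d log D(g) / dg
    w += lr * np.mean(dg * z)
    c += lr * np.mean(dg)

# After training, generated samples should cluster near the data mean.
gen_mean = float(np.mean(w * rng.normal(0.0, 1.0, 10000) + c))
```

Replacing the scalars with convolutional networks and the 1-D Gaussian with the 15,000 portrait images gives, in outline, the training regime the caption describes; the opposing gradient updates are the “adversarial” part.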

A Question of Style – Style, Artificial Intelligence and Architecture

The complexity of the term Style lies in the unusual…

The Church of AI, Mariana Sanche, Leetee Wang, PennDesign 2018.

Machine Hallucinations


Results of 2D-to-2D style transfers based on plans: in these results, aspects of estrangement and defamiliarization speak profoundly to a design ecology in a posthuman era.

Imaginary Plans
