Hidden Worlds uses Artificial Intelligence and Augmented Reality to examine gender through the lens of computer vision. These works begin with a computer trained on Greek and Roman statuary that generates statues of its own, which I then interpret in my own way. Another AI writes descriptive content for each work, and generated music completes a multimedia interactive installation.
Can computers see gender? Without being trained in traditional binary notions of gender, what can they produce? And how do we interpret the results? J. Rosenbaum presents a paper on their project Hidden Worlds, an exhibition of computer-generated Artificial Intelligence artworks and mobile Augmented Reality technologies that sees gender through the lens of computer vision. Rosenbaum's previous works used AI to interpret their creations; this time the computer creates the art and Rosenbaum interprets the output. A Generative Adversarial Network trained on thousands of images of Greek and Roman statuary worked for weeks to create its own. Rosenbaum then explored the output to find the truth inside the computer-generated work and reveal it to the viewer. A second neural network looked at the works and wrote poetry based on what it saw through an image classifier; this poetry is incorporated into a soundscape inside the app. Viewers see light boxes and watch them come to life inside the app as the computer-generated work is transformed and reinterpreted by human eyes and hands. The language is alien and computer-driven, showing a collaborative effort between human and machine. This highly experimental work invites questions about computers creating art, and about how machines see humans, gender, and idealized beauty.
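For readers unfamiliar with the adversarial setup mentioned above, the core idea can be sketched at toy scale in plain Python: a "generator" proposes samples and a "discriminator" scores them, each nudging the other. This is a one-dimensional illustration of the GAN concept only, not the exhibition's actual model; every variable and number here is invented for the sketch.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# "Real" data: a 1-D Gaussian standing in for the training images.
def real_sample():
    return random.gauss(4.0, 0.5)

# Generator: x = w_g * z + b_g, with noise z ~ N(0, 1).
w_g, b_g = 1.0, 0.0
# Discriminator: D(x) = sigmoid(w_d * x + b_d), its guess that x is real.
w_d, b_d = 0.1, 0.0

lr = 0.02
for step in range(3000):
    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    xr = real_sample()
    z = random.gauss(0.0, 1.0)
    xf = w_g * z + b_g
    dr = sigmoid(w_d * xr + b_d)
    df = sigmoid(w_d * xf + b_d)
    # Hand-derived gradients of -log D(xr) - log(1 - D(xf)).
    w_d -= lr * (-(1 - dr) * xr + df * xf)
    b_d -= lr * (-(1 - dr) + df)

    # Generator update: push D(fake) toward 1, i.e. fool the discriminator.
    z = random.gauss(0.0, 1.0)
    xf = w_g * z + b_g
    df = sigmoid(w_d * xf + b_d)
    grad_x = -(1 - df) * w_d          # d(-log D(xf)) / d xf
    w_g -= lr * grad_x * z
    b_g -= lr * grad_x

# Average of 1000 generated samples after training.
fake_mean = sum(w_g * random.gauss(0, 1) + b_g for _ in range(1000)) / 1000
```

In the full-scale version Rosenbaum describes, the same tug-of-war runs over images rather than single numbers, which is why training took weeks rather than seconds.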
Hidden Worlds was at Testing Grounds from September 12 to 22 as part of Critical Mass for Melbourne Fringe Festival.