Fall 2024 • Gabriel Barcia-Colombo

Hypercinema

Week 08: Synthetic Media

As an example of synthetic media, I looked into the work of artist Björn Karmann, specifically his "Paragraphica" camera. This camera doesn't take photos through a lens; instead, it uses geolocation data captured when the shutter is pressed to generate an image with a text-to-image AI model.

The camera's design takes the shape of the star-nosed mole, an animal that perceives the world through the touch-sensitive appendages on its snout instead of through its eyes.
The knobs let users alter some parameters of the photo to be generated; other parameters come directly from the date, time, and location where the "photo" is being taken.
The camera's display shows the location where the picture was taken, the prompt used, and the generated image.

Why do you consider this synthetic media: This piece qualifies as synthetic media because the images are not captured but generated with a text-to-image AI model.

What can you find about how this was made: To generate the images, this work uses the Stable Diffusion API. The model was trained on images from the LAION-5B dataset, which was assembled by crawling websites and storing each image together with its alt-text value. That is how the dataset pairs descriptive text with each image: alt-text is written so that visually impaired users can understand what an image portrays.
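As I understand the pipeline, the key step is turning location metadata into a text prompt before anything is sent to the image model. A minimal sketch of that step (the `build_prompt` helper, its fields, and the template are my own assumptions for illustration, not Karmann's actual code):

```python
# Hypothetical sketch of Paragraphica-style prompt building.
# Field names and the prompt template are assumptions, not the artist's code.

def build_prompt(address: str, time_of_day: str, weather: str, nearby: list[str]) -> str:
    """Compose a text-to-image prompt from geolocation metadata."""
    places = ", ".join(nearby)
    return (
        f"A photo taken at {address} in the {time_of_day}, "
        f"{weather} weather, near {places}."
    )

prompt = build_prompt(
    address="370 Jay St, Brooklyn",
    time_of_day="afternoon",
    weather="overcast",
    nearby=["a subway entrance", "a coffee shop"],
)
print(prompt)
# The resulting prompt would then be sent to the Stable Diffusion API,
# and the returned image shown on the camera's display as the "photo".
```
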

What are the ethical ramifications of this specific example: I don't believe this particular piece raises ethical issues of its own, beyond those inherited from the Stable Diffusion model, such as the lack of consent from the owners of the training images. With regards to what it communicates, it is interesting to think that this piece will only get close to a picture of the actual location if images of that location exist online. Because roughly half of the web is in English, it is not unrealistic to think that locations where English is not spoken (or where the internet is not very prevalent) will be poorly represented in the images this camera generates. In a way, it is a window into the "eyes" of the web.

Week 03

This week we focused on making a short stop-motion animation in the format of a gif. Working with Audrey, we decided we wanted to make the animation with our bodies instead of puppets or props.

I remembered Morry Kolman's Traffic Cam Photo Booth, which shows a real-time feed of NYC traffic cameras so you can snap pictures of yourself from your phone. We decided to make our stop motion using those cameras instead of our own.

We switched to the official NYC DOT website since its interface was more comfortable to record. Then we chose two cameras in different parts of Brooklyn that had the same perspective:

Screenshot of a website showing two photos side by side, each of a different city crosswalk. One reads '7 Ave @ Union St' and the other '3rd Ave @ Atlantic Ave'.
The two locations selected on the NYC DOT website.
Photo of a city street, with a very low resolution and in black and white, except for some pixels with color.
We were also very intrigued by this camera: it seemed more pixelated and appeared to have a sort of color-isolation filter on. We were not sure whether this was a malfunction or the intended image the camera should be producing.

We planned out a simple animation that would connect these two places as if they were side by side and then went to each place to record.

Photo of a traffic camera on top of a traffic light.
The first camera, at the intersection of 3rd Ave and Atlantic Ave.
Low resolution photo of a crosswalk.
We took a selfie before getting to work.
Person sitting down with a laptop in a bus stop.
Audrey directing at our first location.
Photo of a traffic camera on top of a traffic light.
The second camera, at the intersection of 7th Ave and Union St.

Once we finished recording, we headed back to campus and edited the frames, aligning the two takes on top of the original interface from the NYC DOT website to make it look as if everything were happening in real time:

Animated gif showing one person passing an umbrella to another person across two separate frames.

Week 02

Our week 2 assignment consisted of making a physical installation that mimics a particular sound of our choosing. We started with a trip to the hardware store, testing the sounds of some objects.

We found that this brushing sound, especially the clicking that occurs when the plastic strands catch on the skin and then let go, was somewhat like a fireplace with its crackling embers.

We then attempted to design and build a structure to reproduce this sound on its own.

Sketches on a piece of paper.
We made some sketches and tested out a mount with a heavy-duty metal wire.
Hands holding metal wire wrapped around a dish sponge.
A small DC motor with an eraser stuck to its axis, and a small cork touching the eraser on one side and a blue plastic dish sponge on the other.
Small makeshift structure with a wooden base, a plastic dish sponge facing upwards, and a motor held right over it with a metal wire. A cork hangs from the motor over the sponge.
This first iteration of the machine made us realize that the DC motor didn't have enough torque to brush against the sponge.

For the second iteration, we swapped the motor for a geared DC motor whose gearbox increased its torque.
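As a rough rule of thumb, a gearbox multiplies a motor's torque by its gear ratio (minus friction losses) while dividing its speed by the same ratio. The numbers below are made-up stand-ins for illustration, since we never measured our motors' actual specs:

```python
# Illustrative only: all values are assumed, not measurements of our motors.
motor_torque_mNm = 0.5   # bare DC motor torque in millinewton-meters (assumed)
gear_ratio = 48          # gearbox reduction ratio (assumed)
efficiency = 0.7         # fraction surviving gear friction (assumed)

# Output torque scales with the ratio; output speed drops by the same factor.
output_torque_mNm = motor_torque_mNm * gear_ratio * efficiency
print(f"{output_torque_mNm:.1f} mNm")  # prints "16.8 mNm"
```

Even with generous losses, the geared motor ends up with over an order of magnitude more torque, which matches what we saw: it could actually drag the cork across the sponge.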

DC gearbox motor in the forefront of the image.
This 4.5V gearbox motor would have a more suitable torque for this application.
Small structure with a wooden base, a plastic dish sponge facing upwards, and a motor held right over it with a metal wire.
We built the mount again with the metal wire.
Hand made sketch with scribbled measurements.
To take advantage of the torque, we needed a very solid mount for the cork onto the motor, so we took some measurements with a calliper.
3D model.
Made a 3D model based on the measurements.
Small plastic object in a T shape.
3D print of the model in PLA plastic.
Small structure with a wooden base, a plastic dish sponge facing upwards, and a motor held right over it with a metal wire. A small plastic T-shaped object mounts a cork to the motor spindle and hangs over the sponge.
The second iteration of the machine.

This new iteration worked much better! However, due to the friction between the gears, the motor also made a much louder noise that obscured the sound we wanted to make.

Therefore we made a recording and used a 30-band equalizer to amplify the frequencies of the sponge and muffle those of the gearbox, in order to get closer to the sound we wanted.
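The idea behind that EQ pass can be sketched as a simple FFT-based filter: scale down the band where the gear whine lives and scale up the band where the crackle lives. The band edges, gains, and test frequencies below are arbitrary stand-ins, since I no longer have the exact slider settings we used:

```python
import numpy as np

def eq_band(signal: np.ndarray, sample_rate: int,
            low_hz: float, high_hz: float, gain: float) -> np.ndarray:
    """Scale the magnitude of one frequency band, like one slider of a graphic EQ."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    band = (freqs >= low_hz) & (freqs < high_hz)
    spectrum[band] *= gain
    return np.fft.irfft(spectrum, n=len(signal))

# Synthetic stand-in for our recording: a low "gear whine" plus a high "crackle".
sr = 44100
t = np.arange(sr) / sr
audio = np.sin(2 * np.pi * 200 * t) + 0.3 * np.sin(2 * np.pi * 8000 * t)

audio = eq_band(audio, sr, 100, 400, gain=0.1)     # muffle the gearbox band
audio = eq_band(audio, sr, 6000, 10000, gain=2.0)  # amplify the sponge crackle
```

A real 30-band EQ applies smooth filter shapes per band rather than hard spectral cuts, but the principle of reweighting frequency bands is the same.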

Recording of the final machine. The audio was edited to amplify the desired frequencies.

Week 01

For the first week, our assignment consisted of recording a series of sounds representing different concepts. Together with Jinnie Shim & Ray Wang, we went out on a scavenger hunt and recorded the following audio clips.

We also got started on our project for next week, which will be documented on the week 2 blog post.