Fall 2024 • Daniel Rozin

Physical Computing

Week 11

As we refined the idea further, we grew wary of time. We decided to focus initially on the listening device and push the recording device down in priority.

Before designing the form of the listening device, we needed to experiment with the material to better understand how we could fabricate our own inflatables.

Mylar sealed with a hair iron.
Using a large roll of mylar available at the shop, we first tested sealing it with a hair iron, which gave pretty good results.
Mylar sealed with a Clover Mini Iron.
We then tested sealing it with a Clover Mini Iron from the shop, which is on its last legs. It also did a pretty good job and was easy to maneuver.
Mylar bag.
Using the Clover Mini, we made the seals on three sides.
Mylar bag inside out.
And then turned it inside out so the edges were not so prominent. At this point we realized that only the shiny side of the mylar bonds when ironed; the other side does not stick.
We then tested the pressure sensing by placing a barometer inside. It worked surprisingly well. With some smoothing, the readings will be even better.
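The smoothing could be as simple as an exponential moving average over the raw barometer readings. This is just a sketch of the idea; `makeSmoother` and the alpha value are illustrative, not part of our project code:

```javascript
// Exponential moving average to smooth noisy barometer readings.
// alpha near 0 = heavy smoothing, alpha near 1 = almost raw.
function makeSmoother(alpha = 0.2) {
  let smoothed = null;
  return function smooth(raw) {
    // First reading seeds the filter; after that, blend new and old
    smoothed = (smoothed === null) ? raw : alpha * raw + (1 - alpha) * smoothed;
    return smoothed;
  };
}

// Example: feed in a noisy pressure series (hPa)
const smooth = makeSmoother(0.2);
[1013.2, 1013.9, 1012.8, 1013.4].forEach(p => smooth(p));
```

A smaller alpha smooths more aggressively, at the cost of the reading lagging behind the actual pressure.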
Mylar bag sealed with an impulse sealer.
Test using an impulse sealer. This was by far the fastest and most reliable method.
Mylar balloon with valve.
Lastly, we improvised a valve using a jack connector with a nut. It worked pretty well despite not having an O-ring. Now we have to find some better alternatives for this.
Also, when testing sealing a string onto the mylar balloon to aid in turning it inside out, we saw this very interesting motion when pulling the string.

Apart from these tests, we also found some good reference information regarding how to make inflatables easily:

Week 10

For the final, Niko and I paired up to continue exploring some concepts he began in his midterm: air as a storage medium for sound. This speculative idea explores how interacting with air can be translated to the medium of sound, like squeezing a container of air and having sound come out more quickly to simulate pressure.

For our proposal, we looked into designing both a device to capture the sound/air as well as an output device that could provide a listening experience through gestural interactions with air receptacles:

A sketch of an inflatable with speakers and pockets of air.
Loose sketch of what the listening device could look like. The idea is to make a large inflatable with pockets of air and speakers on each pocket. As people lie on it and press on the pockets, the pressure builds up inside them and is detected with a barometer, in turn modifying the sound output by that pocket's speaker.
A sketch of bellows filling up a balloon.
A rough sketch of a concept for the recording device. More than actually recording/inhaling, this device must look like it is doing that, as it would really only be seen in a video accompanying the listening device, a kind of making-of video that would show where the sound and air come from.
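As a sketch of how the squeezing interaction could map onto sound, pressure above ambient might be translated linearly into playback rate. The function name, ambient value, and ranges below are all assumptions for illustration, not final design decisions:

```javascript
// Hypothetical mapping from pocket pressure to playback rate:
// pressing a pocket raises pressure above ambient, which speeds up
// the sound coming from that pocket's speaker.
function pressureToRate(pressure, ambient = 1013, maxDelta = 50) {
  // Clamp the pressure rise to [0, maxDelta] hPa above ambient
  const delta = Math.min(Math.max(pressure - ambient, 0), maxDelta);
  // Map linearly onto a playback rate between 1x and 3x
  return 1 + (delta / maxDelta) * 2;
}
```

With p5.sound, for example, the result could be fed to something like `song.rate(pressureToRate(latestReading))`.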

We had a series of inspirations stemming from the following references:

Week 06, 07 & 08

Midterm & P5 Serial Lab

These weeks were mainly focused on working on the midterm, for which I did the Arduino-to-p5.js serial communication lab.

Before continuing with my project, I met with Jeff to discuss the technical questions I outlined in my previous post. Although we couldn't arrive at definitive answers, he did show me how to use an oscilloscope to measure the frequency of the wave going to the ultrasound array, which I must always maintain at 40kHz.

An oscilloscope reading a wave with a 40kHz frequency.
I tested it out with Noah's speaker and measured 40.32kHz.

The day after, my components arrived. Sadly, I had mistaken the number of transducers in the order: I thought I would receive 100 but got only 1. This meant the project would be considerably more expensive, so I decided to use Noah's speaker instead and make a prototype of the whole system I envisioned.

The idea was to make a directional speaker that could track people in space and follow them so that only they can hear the sounds. This was part of a bigger goal of encouraging people to stand and remain close together, to hear the sounds with each other.

A NEMA 17 stepper motor.
I soldered my NEMA 17 stepper to a cable terminal for easier handling.
A 4 pin terminal.
This terminal, which I found in the junk bin, was perfect for the job as it had 0.1in spacing.
Once I had connected it to my H-bridge module, following the steps in the stepper lab, I tested it out with an example sketch. I noticed that after leaving it on for a while it would heat up a lot. Therefore, I decided to connect the STANDBY pin to a digital pin (through a current-limiting resistor) and pull it LOW when no movement is expected. This kept the motor nice and cool to the touch.

Once the motor was working, I started working on the connection to p5.js, using the information in the lab.

After some tests, I got it to move with an HTML slider.
Once that was working, I added the ml5.js bodyPose model and got the motor to follow my nose.

The general gist of the code (posted at the end of this documentation post) is that the Arduino receives a byte between 1 and 200. A value from 1 to 100 means moving that many steps in one direction; a value from 101 to 200 means moving (value - 100) steps in the opposite direction. This logic was built in order to send each message as a single byte and keep it speedy. When a message like this arrives, the Arduino moves the motor and estimates how long the move will take. During this time, it does not read any messages. Once it thinks it's done, it sends a message back saying it is ready for a new instruction.
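That byte protocol boils down to a pair of helper functions. Which range means clockwise is arbitrary (in the actual sketches the Arduino applies the sign when calling myStepper.step()); what matters is that encoder and decoder agree, so these helpers are just a self-consistent illustration:

```javascript
// Encode a signed step count (-100..100) into a single byte:
// 0..100 -> that many steps in one direction,
// 101..200 -> (value - 100) steps in the other direction.
function encodeSteps(steps) {
  return steps >= 0 ? steps : Math.abs(steps) + 100;
}

// Decode on the receiving side back into a signed count
function decodeSteps(byte) {
  return byte <= 100 ? byte : -(byte - 100);
}
```

Keeping every message a single byte avoids any framing or parsing on the Arduino side, which is what keeps the loop speedy.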

Now, in order to keep everything small, I considered running everything off of a Raspberry Pi.

A Raspberry Pi with a camera, connected to a laptop through VNC.
I borrowed a Raspberry Pi and camera from the shop and set it up to be used over VNC from my laptop. However, when trying to run the sketch, I realized that the Pi Camera is not automatically recognized as a webcam. To be used by p5.js, I would need to set up a server on the Pi and broadcast the camera on that server. This was too much overhead for this project, especially considering that the Pi takes a lot of current to power.
Small USB webcam in front of a laptop.
I decided to use this tiny USB webcam instead and run everything off of my own laptop for the time being.

With the main technical decisions, code, and electronics out of the way, I proceeded with some fabrication.

A piece of laser cut acrylic.
I did an initial test with a scrap piece of acrylic. This shape was meant to accommodate the speaker and be folded on the top and bottom to mount onto a threaded rod.
A bent piece of laser cut acrylic.
I tried the bend; although the piece broke, I noticed that bending it caused the large hole to flare out.
A piece of laser cut acrylic.
I laser cut it again from the final acrylic, taking into account what I learned from the first prototype.
A bent piece of laser cut acrylic.
This one worked much better when bent, as not having the large hole made it more structurally sound.
Parametric speaker and webcam mounted on a piece of bent acrylic.
Now I tried mounting the speaker together with the webcam. Because I did not want to glue anything on this prototype, I held the webcam in place with a large washer and a nut that kept pressure on the back of the webcam PCB. I made sure to add two layers of non-conducting tape to the back to avoid a short circuit with this setup.
3D model of slip ring mount.
With that part out of the way, I wanted to figure out how to use a slip ring that would allow the speaker to rotate continuously. I 3D modeled this piece, which screws onto the threaded rod at the bottom and has heat-set inserts on top into which the slip ring can be screwed. The cables then come out of both the bottom and the top of the 3D print. Before printing, I removed the modeled threads and instead made the bottom hole equal to the minor diameter of the rod, because the threads were too fine for the printer.
3D printed part on top of a threaded rod.
The piece came out great! I screwed the rod into it and it stayed on tightly.
Slip ring being screwed onto something else.
I put in the heat-set inserts and screwed the slip ring to the top.
Female USB-A cable soldered onto the slip ring.
I decided to solder a female USB-A cable to the slip ring on the speaker side, so I would not need to cut the webcam's original cable and could reuse it later.
Parametric speaker on a custom mount
I tested the mount and clamped the motor to the table.
Then I tested out the program again with the mount running. At this point I realized that if I kept pulling the STANDBY pin LOW when the motor was not in use, the inertia from the speaker would keep the motor moving past the step I was aiming for. Therefore, I had to keep STANDBY HIGH at all times.

At this point, I needed to figure out how to get the mount to stand at eye level. Talking with Niko, we thought it would be a good idea to put it on top of a light stand, screwing it in with the traditional tripod screw. I found just the right tripod-sized nut in the junk bin and built it up with a light stand from the ER.

Motor mounted onto a light stand
Niko found this piece of metal, which fit the motor perfectly. I then added the tripod-sized nut and screwed it onto the light stand. I also added washers and spring washers to fasten the acrylic more strongly to the rod and keep it from loosening through vibration.
Motor mounted onto a light stand
All the cables that needed to come down to the arduino and the laptop were held out of the way using some metal wire that fell in a curve, allowing the speaker to move without tangling.
Covered up cables with knitted covers.
I used some knitted cable cover to tidy up the cable management mess, also a suggestion from Niko, who did the same with his project. Here you can see one of the things I overlooked: the coupler I ordered was flexible, which made my speaker wobble quite a bit rather than stand straight.
This was a short test of the whole system, using some music from my laptop. You can tell how the sound can only be heard when the speaker is pointing directly at the camera.

With this, the fabrication part was done. I now wanted to experiment more with the concept behind the piece and the interactions it would allow. However, time was running out :(

I built a sketch that would detect people. If it only saw one person, the motor would move away, avoiding them. If it saw more than one person, and they were close together, the motor would instead follow them. The idea being tested out was that you can only listen to the music when being together with somebody else.

Because I arrived at this late at night, there was nobody else to test it with and iterate on the interaction. Upon trying it in class, it became apparent that the interaction was too complex, and users would not really understand what was going on. Danny suggested to keep looking into it: maybe the speaker following single people would be necessary at the beginning, and this behaviour could then shift as more people come into the scene.
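Danny's suggested progression could be captured as a small state function: follow lone visitors until a crowd has been seen, then require people to be close together. Everything here (the names, the seenCrowd flag) is hypothetical, just to sketch the idea for a future iteration:

```javascript
// Hypothetical behavior selector: reward a lone visitor by following
// them at first, but once a crowd has formed, only follow people
// who are standing close together.
function chooseBehavior(centers, proximity, seenCrowd) {
  if (centers.length === 0) return 'idle';
  if (centers.length === 1) {
    // Before a crowd has formed, the single visitor gets followed
    return seenCrowd ? 'avoid' : 'follow';
  }
  // With several people, follow only if some pair is close together
  const sorted = centers.slice().sort((a, b) => a - b);
  for (let i = 1; i < sorted.length; i++) {
    if (sorted[i] - sorted[i - 1] < proximity) return 'follow';
  }
  return 'avoid';
}
```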

Parametric speaker on a light stand.
This was a picture I took after presenting in class. In the end I was a bit disappointed with the experience (or lack thereof) that the prototype created. It is evident that a lot of user testing is still needed to build an engaging and meaningful experience with this technical concept, which is something I would like to explore more in the future.
Arduino code:

#include <Stepper.h>

//Motor
const int standbyPin = 7; //Set low when motor is idle to lower temperature. May affect holding torque
const int stepsPerRevolution = 200; //Defined by the motor
const int stepperRPM = 50;
const float stepsPerMs = stepsPerRevolution * stepperRPM / 60000.0; //Steps per millisecond; float to avoid integer division truncating to 0
bool busy = false; //Is the motor currently doing an instruction it received
unsigned long jobStart = 0; //When did the current job start (ms, from millis())
int jobLength = 0; //How many ms will the current job take

Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);

void setup() {
    // set the speed at 50 rpm:
    myStepper.setSpeed(stepperRPM);

    //Initialize the standby pin
    pinMode(standbyPin, OUTPUT); //Must be an output so we can drive the STANDBY line
    digitalWrite(standbyPin, HIGH);

    // initialize the serial port:
    Serial.begin(9600);
}

void loop() {
    //When a message from p5 is received
    if (Serial.available() > 0 && !busy) {
        // read the incoming byte:
        int inByte = Serial.read();

        //If it is a non-zero movement message
        if (inByte <= 200 && inByte != 0) {
            updatePosition(inByte); //Move motor to new position
        } else {
            informState(); //Inform that the motor is available
        }
    }

    checkCurrentState(); //Check if the motor is doing a job and update variable 'busy'
}

void updatePosition(int steps) { //Move motor
    if (steps <= 100) {
        //From 1 to 100, the steps are counterclockwise
        steps *= -1;
    } else {
        //From 101 to 200, the steps are (value - 100) clockwise
        steps -= 100;
    }

    //Connect the motor
    //digitalWrite(standbyPin, HIGH);
    //Take the instructed steps (note: Stepper.step() blocks until the move completes)
    myStepper.step(steps);
    busy = true;
    jobStart = millis();
    //Still need to figure out how to calculate this correctly
    //jobLength = ceil(abs(steps)/stepsPerMs);
    jobLength = 10;
}

void informState() {
    //Send the current state to p5
    Serial.write(busy);
}

void checkCurrentState() {
    if (busy) {
        //If the estimated job duration has passed
        if (millis() - jobStart > jobLength) {
            //Set as not busy
            busy = false;

            //Disconnect motor
            //digitalWrite(standbyPin, LOW);

            informState();
        }
    }
}
p5.js code:

const serial = new p5.WebSerial();

let inData = 0;
let portButton;

//ml5 setup
let video;
let bodyPose;
let poses = [];
let centers = [];
// let connections;

let captureAvailable = false;

const globalConfidence = 0.1;

let song;

function preload() {
    // Load the bodyPose model
    bodyPose = ml5.bodyPose( {flipped: true});
    song = loadSound('assets/merry-go-round.mp3');
}


function setup() {
    createCanvas(640,480);
    // check to see if serial is available:
    if (!navigator.serial) {
    alert("WebSerial is not supported in this browser. Try Chrome or MS Edge.");
    noLoop();
    return;
    }
    startSerial();

    navigator.mediaDevices.enumerateDevices().then(function(devices) {
    devices.forEach(function(device) {
        if (device.label.includes('USB 2.0 PC Cam')) {
        const constraints = {
            video: {
                deviceId: device.deviceId
            },
            audio: false,
            flipped: true,
            };
            
            video = createCapture(constraints);
            video.elt.addEventListener('ended', function(){
            console.log("Video stream ended, attempting to reconnect...");
            });
            video.size(640, 480);
            video.hide();
            bodyPose.detectStart(video, gotPoses);
            // Get the skeleton connection information
        //   connections = bodyPose.getSkeleton();
            captureAvailable = true;
        }
    });
    });
}

function draw() {
    background(255);
    if (!song.isPlaying()) {
        song.play();
    }
    // Draw the webcam video
    if(captureAvailable) {
        image(video, 0, 0, width, height);

        // Draw all centers;
        if(centers.length > 0) {
            centers.forEach(center => {
                noFill();
                stroke(0,255,0);
                line(center, 0, center, height);
            });
        }

        //Draw the distance between each person
        let orderedCenters = centers.slice();
        orderedCenters.sort((a, b) => a - b);
        for (let i = 0; i < orderedCenters.length - 1; i++) {
            const closestDistance = orderedCenters[i + 1] - orderedCenters[i];
            fill(0, 255, 0);
            stroke(0, 255, 0);
            if (closestDistance < proximity) {
                fill(0, 0, 255);
                stroke(0, 0, 255);
            }
            textAlign(CENTER);
            text(closestDistance, orderedCenters[i] + closestDistance / 2, height / 2 - 20);
            line(orderedCenters[i], height / 2, orderedCenters[i + 1], height / 2);
        }
    }else {
        textAlign(CENTER);
        text('External camera unavailable or loading :(', width/2, height/2);
    }
        
    //Draw audible area
    fill(220, 40);
    noStroke();
    rect( (width - audioRange)/2, 0, audioRange, height );
}

function gotPoses(results) {
    // Save the output to the poses variable
    poses = results;

    centers = [];
    //Run through poses and get the centers
    for (let i = 0; i < poses.length; i++) {
        let person = poses[i];
        let center = false;
        if(person.nose.confidence > globalConfidence){
            center = person.nose.x;
        }else if(person.right_eye.confidence > globalConfidence && person.left_eye.confidence > globalConfidence){
            const person_span = person.right_eye.x - person.left_eye.x;
            center = person.left_eye.x + person_span/2;
        }else if(person.right_shoulder.confidence > globalConfidence && person.left_shoulder.confidence > globalConfidence){
            const person_span = person.right_shoulder.x - person.left_shoulder.x;
            center = person.left_shoulder.x + person_span/2;
        }else {
            let minX = Infinity;
            let maxX = -Infinity;
            person.keypoints.forEach(keypoint => {
                if(keypoint.confidence > globalConfidence) {
                    minX = min(minX, keypoint.x);
                    maxX = max(maxX, keypoint.x);
                }
            });
            const person_span = maxX - minX;
            center = minX + person_span/2;
        }

        if(center !== false ){
            centers.push(center);
        }
    }


    if( centers.length > 0 ){
        // targetPos = map(width/2 - centers[0], -width/2, width/2, 3, -3, true); //distance from center of screen to center of person
        // if(Math.abs(Math.floor(targetPos)) == 1) {
        //     targetPos = 0;
        // }

        if(centers.length >= 2){
            const closestPair = findClosestPair(centers);
            if( Math.abs(closestPair[0] - closestPair[1]) < proximity ) {
                follow(closestPair[0]);
            }
        }else {
            avoid();
        }
    }
}

let audioRange = 140;

function avoid() {
    for (const person of centers) {
        if(person > (width - audioRange)/2 && person < (width + audioRange)/2) {
            targetPos = random(-20, 20);
            console.log('avoided');
            break;
        }
    }
}

let proximity = 70;

function follow( coord ){
    targetPos = map(width/2 - coord, -width/2, width/2, 3, -3, true); //distance from center of screen to center of person
    console.log('followed');
}

function findClosestPair(arr) {
    if (arr.length < 2) {
        return null; // Not enough numbers to compare
    }
    
    // Step 1: Sort the array in ascending order
    arr.sort((a, b) => a - b);
    
    // Step 2: Initialize variables to track the closest pair and the smallest difference
    let minDiff = Infinity;
    let closestPair = [];
    
    // Step 3: Iterate through the sorted array to find the smallest difference
    for (let i = 1; i < arr.length; i++) {
        let diff = Math.abs(arr[i] - arr[i - 1]);
    
        if (diff < minDiff) {
        minDiff = diff;
        closestPair = [arr[i - 1], arr[i]];
        }
    }
    
    return closestPair;
}

function startSerial() {
    // if serial is available, add connect/disconnect listeners:
    navigator.serial.addEventListener("connect", portConnect);
    navigator.serial.addEventListener("disconnect", portDisconnect);
    // check for any ports that are available:
    serial.getPorts();
    // if there's no port chosen, choose one:
    serial.on("noport", makePortButton);
    // open whatever port is available:
    serial.on("portavailable", openPort);
    // handle serial errors:
    serial.on("requesterror", portError);
    // handle any incoming serial data:
    serial.on("data", serialEvent);
    serial.on("close", makePortButton);
}

/* SERIAL CALLBACK FUNCTIONS */

// if there's no port selected, 
// make a port select button appear:
function makePortButton() {
    // create and position a port chooser button:
    portButton = createButton("choose port");
    portButton.position(10, 10);
    // give the port button a mousepressed handler:
    portButton.mousePressed(choosePort);
}

// make the port selector window appear:
function choosePort() {
    if (portButton) portButton.show();
    serial.requestPort();
}

// open the selected port, and make the port 
// button invisible:
function openPort() {
    // wait for the serial.open promise to return,
    // then call the initiateSerial function
    serial.open().then(initiateSerial);

    // once the port opens, let the user know:
    function initiateSerial() {
        console.log("port open");
        serial.print(201);
    }
    // hide the port button once a port is chosen:
    if (portButton) portButton.hide();
}

// pop up an alert if there's a port error:
function portError(err) {
    alert("Serial port error: " + err);
}
// read any incoming data as a string
// (assumes a newline at the end of it):
let targetPos = 0;
let timeoutIndex = null;

function serialEvent() {
    inData = Number(serial.read());
    //console.log(inData);

    //Available for moving motor
    if(inData == 0 ) {
        clearTimeout(timeoutIndex);

        timeoutIndex = setTimeout( function(){
            let dataToSend = Math.floor(targetPos);
            if( dataToSend < 0 ){
                dataToSend = abs(dataToSend);
                dataToSend += 100;
            }
            if(dataToSend != 0 ){
                //console.log(dataToSend);
            }
    
            //console.log(dataToSend);
    
            serial.print(dataToSend);
            targetPos = 0;
        }, 100);
        
    }//else, it is busy, wait for the 0.
}

// try to connect if a new serial port 
// gets added (i.e. plugged in via USB):
function portConnect() {
    console.log("port connected");
    serial.getPorts();
}

// if a port is disconnected:
function portDisconnect() {
    serial.close();
    console.log("port disconnected");
}

function closePort() {
    serial.close();
}

Week 05

Midterm

This week I continued researching parametric speakers, trying to best understand their limitations and possibilities. I started testing them out with the one Noah lent me:

Array of ultrasound transducers.
A parametric speaker that Noah lent me for a few days; it really helped me understand some of the basic aspects of these speakers.
Test of directionality with the speaker.

I also studied the technical aspects of this technology and drew a few important conclusions:

  • They seem to work best from a distance, so mounting them somewhere unreachable is best. A ceiling mount might be the ideal location.
  • These speakers (with the circuits I have found) cannot transmit very high or very low frequencies (about 400Hz to 5kHz), which is very important to take into account when deciding the sounds to be played. Maybe they could be accompanied by a non-directional bass, so only the melody is projected directionally.
  • The ideal frequency to modulate the ultrasound at is 60kHz; however, the only affordable transducers I could find are 40kHz ones. It seems that the main industrial manufacturers like Holosonics use 60kHz.
  • More transducers per array translate to more directionality (a narrower beam angle).
  • They are very reflective, so it might be a good idea to use them in a sound-isolated space that can absorb the waves.
  • There are some worries about the safety of this technology. The general consensus is that ultrasound applications should not exceed 110-115dB SPL (sound pressure level) to remain safe. This is orders of magnitude above what these hand-built transducers can do, but it is good to keep in mind.
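To make the modulation idea concrete, here is a small numerical sketch of amplitude-modulating an audible tone onto a 40kHz carrier, which is what the driver circuit does in hardware. The sample rate and modulation depth are arbitrary illustration values, not taken from the circuit:

```javascript
// Minimal sketch of the principle behind parametric speakers:
// amplitude-modulating an audible tone onto an ultrasonic carrier.
function amModulate(audioHz, carrierHz, sampleRate, numSamples, depth = 1) {
  const out = new Array(numSamples);
  for (let n = 0; n < numSamples; n++) {
    const t = n / sampleRate;
    const audio = Math.sin(2 * Math.PI * audioHz * t);     // audible signal
    const carrier = Math.sin(2 * Math.PI * carrierHz * t); // ultrasonic carrier
    // Classic AM: carrier scaled by (1 + depth * audio) / 2
    out[n] = ((1 + depth * audio) / 2) * carrier;
  }
  return out;
}

// e.g. a 1kHz tone on a 40kHz carrier, sampled well above Nyquist
const samples = amModulate(1000, 40000, 192000, 192);
```

The air itself then demodulates this signal non-linearly along the beam, which is why the audible sound appears to come from a narrow column.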

A lot of this information came from a translated version of the speaker's documentation, other Japanese documentation, the first comment on an Instructables video, and the Wikipedia article about generating sound from ultrasound.

Circuit diagram.
This is the circuit diagram I will use to build my first prototype. It was taken from the video I cited last week, and is the simplest option I have found, which is a good starting point.

I prepared a BOM and ordered the required components from DigiKey:

Bill of materials.
Bill of materials for the prototype, minus the components I have access to on the floor.

Next steps will be building the prototype and meeting with some faculty to go over some of the technical questions I have.

Labs

This week's labs focused on controlling high loads using transistors and relays. This is something I have worked with before, but I can never seem to memorize what each leg of the MOSFET does. So I decided to review it by making a simple analog circuit.

For my intro to fabrication, I wanted to make a flashlight that was dimmable, so I bought a premade flashlight and modified its circuit using a mosfet:

Flashlight pieces on a cutting mat.
I bought a cheap flashlight and tore it apart into its constituent parts.
Small flashlight pcb.
I analyzed the circuit to understand how it worked, since the flashlight had 5 different modes. It basically consisted of the LED and an IC that controlled those modes. Sadly I couldn't find a datasheet or any documentation, which made me think it was a custom proprietary chip.
Front view of a small circuit board.
I desoldered the button and the IC, as well as the secondary LED to leave just the flashlight LED and the battery holder.
I then built a simple circuit using a MOSFET to control the dimming of the flashlight. The beauty of it is that the MOSFET takes negligible current to control, so when the resistance of the potentiometer is very high, basically no current is being used, saving battery power.
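A quick back-of-envelope check of that claim, with assumed values (a 3V supply and a 100kΩ potentiometer; the actual flashlight's values may differ):

```javascript
// Why the MOSFET dimmer saves battery: the gate draws essentially no
// current, so the only standby drain is through the potentiometer.
// Ohm's law, converted to milliamps. Values are assumed for illustration.
function standbyCurrentMa(batteryVolts, potOhms) {
  return (batteryVolts / potOhms) * 1000;
}

// e.g. 3 V across an assumed 100 kOhm potentiometer
const drain = standbyCurrentMa(3, 100000); // a few hundredths of a mA
```

Compare that to the tens or hundreds of milliamps the LED itself draws when on, and the control circuit's drain is negligible.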
Makeshift circuit with a transistor.
I then soldered the circuit trying to keep the smallest footprint possible.
Small electronic terminals.
I decided to use these terminals for the connection to the potentiometer, since I wanted it to be easy to disconnect if the need arose, so that this piece can fully come out of the flashlight.

Week 04

This week was mainly focused on finding a concept for the midterm, apart from the course readings. After looking into the midterm project description, I found the idea of doing a research project the most appealing, exploring technologies I have not worked with before for a future (possibly final project) piece.

For a long time, I have been interested in directional speakers: sound transducers that project sound in a narrow beam, so only those in the trajectory of said beam can hear it. Normally, these speakers, like the ones from Focusonics or SRAY, are very expensive.

However, recently I have come across a low-cost way to build these speakers:

Short video on how to build a directional speaker that works with ultrasonic transducers.

I have worked with these transducers on their own before, making a kind of ultrasonic radio to send private messages over the air and sense them through haptics. In retrospect it was very ill-considered. It was an exploration of haptic communication, but the directionality of ultrasound did not really help: the two devices had to be pointed directly at each other. The mount design and material were not ideal either.

Two identical rectangular ceramic objects with protruding electronics, both on a white plinth over a black background.
Photo from 04/28/19. The two communicators were encased in a ceramic wearable. Users could send tactile presses by tapping the FSR. The other user would receive those presses in real time and sense them haptically, with the small cell-shaped vibration motor.
A small white ceramic object mounted on the upper portion of a man's arm, with two protruding ultrasonic transducers.
Photo from 04/27/19. The way the arm-mounted wearable was conceptualized.
Arduino and other electronics on a breadboard.
Photo from 04/27/19. The electronics inside.

So I would really enjoy revisiting this directionality of ultrasound through a new lens, which is why I am looking to focus my midterm on building and researching directional "spotlight" speakers.

A longer term goal is to implement these speakers in an automated system, that can actually point and project audio at specific people around it. To do this, I would also like to explore Servo Pan/Tilt Brackets and stepper motor traveling nut linear actuators, as potential ways to control my system.

Some art pieces that explore/exploit these phenomena:

Week 03

This week's work was focused on analog signals, specifically working with some audio and motors. Since I have experience working with motors, but very little with audio, I decided to focus more on that.

I was intrigued by the Mozzi sound synthesis library mentioned in the lab readings. At first I really couldn't believe something like that existed without the need for a shield or an extra breakout board. So I decided to try it out.

A breadboard with an Arduino Nano 33 IoT, a potentiometer, a PNP transistor, and an 8-ohm speaker.
I made a basic circuit running a speaker at 5V through a PNP transistor. I placed a potentiometer in series with the speaker for some basic volume control. The Arduino output controlling the transistor was pin A0, which is the pin the Mozzi library uses on this type of board.

Once I had tested that out with a simple tone, I went ahead and uploaded an example from the library.

I tested out the example Vibrato sketch.

I then started moving the code around and making some changes, trying to understand how it all works. Since I don't have much experience with sound, some large concepts were not very clear to me. I spent a good amount of time reading and fiddling with the Mozzi online learning guides.

Making some changes, I figured out how I could change values as the sketch looped, using the updateControl() function. This sketch just continually increases the pitch on each iteration of the loop.

After understanding how to modify values in real time, I ventured to add a sensor for control:

I added an FSR that controlled the frequency of an oscillator, which also had a vibrato active.

At this point, I started spending more time on the code side, trying to understand how to layer multiple oscillators or what could be done to create more interesting outputs that have more depth. I think a deeper knowledge of how sound synthesis works would be a lot of help. In the end I decided to make one more interactive sketch:

Sketch with 4 different oscillators, each with vibrato. Both their frequencies and the depth of vibrato are controlled by potentiometers.
Code:

#define MOZZI_CONTROL_RATE 128
#include <Mozzi.h>
#include <Oscil.h>
#include <tables/sin2048_int8.h>

//Oscillator for first frequency
Oscil <2048, MOZZI_AUDIO_RATE> aSin(SIN2048_DATA);

//Oscillator for second frequency
Oscil <2048, MOZZI_AUDIO_RATE> aSin2(SIN2048_DATA);

//Oscillator for third frequency
Oscil <2048, MOZZI_AUDIO_RATE> aSin3(SIN2048_DATA);

//Oscillator for fourth frequency
Oscil <2048, MOZZI_AUDIO_RATE> aSin4(SIN2048_DATA);

//Oscillator for vibrato
Oscil <2048, MOZZI_CONTROL_RATE> kVib(SIN2048_DATA);

int freq = 900;
float vibrato = 0;
int combinedOutput = freq;

void setup() {
    startMozzi();
    aSin.setFreq(400);
    aSin2.setFreq(400);
    aSin3.setFreq(400);
    aSin4.setFreq(400);
    kVib.setFreq(6.5f);
}

void updateControl() {
    int trimmerL = mozziAnalogRead<10>(2);
    int trimmerR = mozziAnalogRead<10>(3);

    vibrato = map(constrain(trimmerR, 0 , 1024), 0, 1024, 2, 60) * kVib.next();
    freq = map(constrain(trimmerL, 0 , 1024), 0, 1024, 300, 10000);
    aSin.setFreq(freq + vibrato / 2);
    aSin2.setFreq(freq + 100 + vibrato);
    aSin3.setFreq(freq + 200 + vibrato);
    aSin4.setFreq(freq + 300 + vibrato);
}

AudioOutput updateAudio() {
    combinedOutput = aSin.next() + aSin2.next() + aSin3.next() + aSin4.next(); // Mix the audio signals
    return MonoOutput::from8Bit((combinedOutput / 4)); // Normalize output
}

void loop() {
    audioHook();
}

Week 02

This week we delved into inputs and outputs as well as serial communication. During class, Prof. Rozin recommended that students with previous knowledge in pcomp work beyond the labs to continue their learning.

Therefore, I decided to develop a small project based on the USB capabilities of the Nano to simulate a mouse and keyboard. I wanted to create a grid-like interface to control the mouse, setting its x & y position with sliders. This would let me practice using analog sensors and learn more about the Mouse library functions.

Two slider potentiometers on top of a table.
Una lent me these two sliders she got off of Amazon. She explained that they didn't really have a data sheet, and they had more pins than usual.
I set the multimeter to measure resistance and through trial and error managed to identify the three main pins of the slider.
Three breadboards side by side with an arduino nano, one slider potentiometer set horizontally and one vertically.
I planned out the interface on my breadboards.
Three breadboards side by side with an arduino nano, one slider potentiometer set horizontally and one switch on the top right.
I wired up the connections for one slider along with a switch, so I could turn off control of the mouse if it all went sideways.

At this point, I realized I had fundamentally misunderstood the Mouse.move() function. I thought it would allow me to move the mouse to specific coordinates on the screen. Instead, it can only move the mouse relative to its previous position.

This makes a lot of sense given how computer mice work. However, for me it was a big problem.

With a lot of frustration, I figured out I needed to calm down. So I changed my small project to something that could calm me down: A slow breathing visualizer.

The idea was to make a visual representation of my breath, guiding me through slow breathing exercises and relaxation:

Sound sensor and arduino on a breadboard.
I connected a cheap sound sensor from years ago to use as the main sensor for my breath.

Because the sound sensor was such a cheap, low-quality component, the signal it gave was really noisy:

Apart from the noise, I also noticed that when it detected sound, the reading would not only rise but also dip below its resting value, oscillating like the sound wave itself.

In order to clean it up and process it, I remembered a running average filter function I once used, created by researcher Mads Hobye and documented in his doctoral thesis. I applied it to the sensor readings (along with a few other lines to remove the "wave" behavior from the values).

The function worked very well to clean up the noise. The downside is that it added some lag to the changes in value, but because this was for a breathing visualizer, the lag made everything look much smoother, which turned out to be a great advantage.

With the sensing out of the way, I needed to work on the actuation. I connected an old LED matrix to visualize my breathing, controlling it with the MAX7219 library.

The library's example code running, showing that all connections had been made correctly.
A circuit on a breadboard consisting of an arduino nano, a sound sensor, an LED 64 pixel screen and a small push button.
Knowing it was working, I organized the circuit on the breadboard and added a pushbutton that allows me to reset the value the sound sensor recognizes as 'silence'.
A screenshot of a web browser on a website with an image of an LED matrix.
To design the animation on the LED matrix, I used this website to draw the lights and preview what they would look like.

After finishing the code, my visualizer was done:

And this was my final code. A little bit messy but it worked well for a prototype:


/*
    September 15-16 2024
    Nasif Rincón

    Relaxing breath visualizer
    A simple device that senses breath as sound and
    visualizes the depth of the breath by filling out
    an LED matrix screen.
*/

#include <MAX7219.h>
//Library: https://github.com/michaelkubina/Arduino_MAX7219_LED_Matrix_Library

//LED Matrix Variables
#define DATAPIN     18
#define CLOCKPIN    19
#define CSPIN       11

MAX7219 Matrix(1, DATAPIN, CLOCKPIN, CSPIN);

//Breath sensor variables
int raw = 0;
float filtered = 0;
int baseline = 520;

void setup() {
    Serial.begin(9600);
    pinMode(A3, INPUT);
    pinMode(5, INPUT_PULLUP);
    Matrix.clearDisplay(1);

    Matrix.setLed(1, 3, 3, true);
    Matrix.setLed(1, 4, 3, true);
    Matrix.setLed(1, 3, 4, true);
    Matrix.setLed(1, 4, 4, true);
}

void loop() {

    if (digitalRead(5) == LOW) { // button pressed: recalibrate the silence baseline
        baseline = analogRead(A3);
        return;
    }

    raw = abs( baseline - analogRead(A3) );
    //Filtering based on the work of Mads Hobye http://www.diva-portal.org/smash/get/diva2:1404339/FULLTEXT01.pdf#page=195
    filtered = filtered * 0.999 + raw * 0.001;

    //Animation
    Matrix.clearDisplay(1);

    if ( filtered > 3 ) {
      levelFive();
    } else if ( filtered > 2.5 ) {
      levelFour();
    } else if ( filtered > 2 ) {
      levelThree();
    } else if ( filtered > 1.5 ) {
      levelTwo();
    } else if ( filtered > 1 ) {
      levelOne();
    } else {
      levelZero();
    }

    //Intensity
    Matrix.setIntensity(1, constrain(filtered, 1, 15) * 2 );
}

void levelZero() {
    Matrix.setLed(1, 3, 3, true);
    Matrix.setLed(1, 4, 3, true);
    Matrix.setLed(1, 3, 4, true);
    Matrix.setLed(1, 4, 4, true);
}

void levelOne() {
    for ( int row = 2; row < 6; row++ ) {
        for ( int col = 2; col < 6; col++ ) {
            if ( (col == 2 && row == 2) || (col == 5 && row == 2) || (col == 5 && row == 5) || (col == 2 && row == 5) ) {
                continue;
            }
            Matrix.setLed(1, row, col, true);
        }
    }
}

void levelTwo() {
    for ( int row = 1; row < 7; row++ ) {
        for ( int col = 1; col < 7; col++ ) {
            if ( (col == 1 && row == 1) || (col == 2 && row == 1) || (col == 5 && row == 1) || (col == 6 && row == 1) ||
                (col == 1 && row == 6) || (col == 2 && row == 6) || (col == 5 && row == 6) || (col == 6 && row == 6) ||
                (col == 1 && row == 2) || (col == 6 && row == 2) ||
                (col == 1 && row == 5) || (col == 6 && row == 5) ) {
                continue;
            }
            Matrix.setLed(1, row, col, true);
        }
    }
}


void levelThree() {
    for ( int row = 0; row < 8; row++ ) {
        for ( int col = 0; col < 8; col++ ) {
            if ( (col == 0 && row == 0) || (col == 1 && row == 0) || (col == 2 && row == 0) || (col == 5 && row == 0) || (col == 6 && row == 0) || (col == 7 && row == 0) ||
                (col == 0 && row == 1) || (col == 1 && row == 1) || (col == 6 && row == 1) || (col == 7 && row == 1) ||
                (col == 0 && row == 2) || (col == 7 && row == 2) ||
                (col == 0 && row == 5) || (col == 7 && row == 5) ||
                (col == 0 && row == 6) || (col == 1 && row == 6) || (col == 6 && row == 6) || (col == 7 && row == 6) ||
                (col == 0 && row == 7) || (col == 1 && row == 7) || (col == 2 && row == 7) || (col == 5 && row == 7) || (col == 6 && row == 7) || (col == 7 && row == 7) ) {
                continue;
            }
            Matrix.setLed(1, row, col, true);
        }
    }
}

void levelFour() {
    for ( int row = 0; row < 8; row++ ) {
        for ( int col = 0; col < 8; col++ ) {
            if ( (col == 0 && row == 0) || (col == 1 && row == 0) || (col == 6 && row == 0) || (col == 7 && row == 0) ||
                (col == 0 && row == 1) || (col == 7 && row == 1) ||
                (col == 0 && row == 6) || (col == 7 && row == 6) ||
                (col == 0 && row == 7) || (col == 1 && row == 7) || (col == 6 && row == 7) || (col == 7 && row == 7) ) {
                continue;
            }
            Matrix.setLed(1, row, col, true);
        }
    }
}

void levelFive() {
    for ( int row = 0; row < 8; row++ ) {
        for ( int col = 0; col < 8; col++ ) {
            Matrix.setLed(1, row, col, true);
        }
    }
}

Week 01

This first week was a refresher on the basic knowledge of electronics, through the development of some short practical labs and technical texts.

Variable power supply reads 6.8V, 0.02A. The leads are connected to a multimeter in voltage reading mode. It reads 9.0V.
I measured the variable supply before connecting it to the Arduino. I was a bit surprised that the reading on the multimeter was so different from the supply's display.
Arduino Nano IoT 33 on the edge of a breadboard. Connected with red and blue cables to a variable power supply that reads 6.8V.
With the measured voltage, I connected the Arduino through the VIN pin. The built-in LED flashed with the "blink" sketch, indicating it was running.
Arduino Nano IoT 33 on the edge of a breadboard. Hands use a multimeter to measure two points on the breadboard, connected to vcc and ground. It reads 4.43V
The board's VCC measured 4.43V, a bit high considering it is supposed to be 3.3V. Since I had never worked with a variable supply before, there might be something I'm overlooking.

Apart from those labs, I also measured the output voltage from an LM7805 when supplied power from the variable supply. The reading was once again very strange, which corroborated my suspicions that I must be misunderstanding something about the variable supply. I decided to switch to a standard supply for the next lab.

I wired my blue LED as in the provided diagram but decided to add a 4700μF capacitor in parallel with the LED. I thought this would make it fade both in and out, but it only faded out. I realized then that a fade-in would be more complicated in an analog circuit like this, without PWM.

This fade out reminded me of Maywa Denki's machines, some of which have lights that power on instantly with the beats of his music, and then fade away. I decided then to make my switch music-related.

The idea was to make a simple switch that turns on the light to the rhythm of your feet tapping, an action usually done semi-subconsciously when you connect with the music.

A copper plate with a wire on top of it, connected with a small strip of copper tape.
The first layer of my soft switch sandwich.
The foam piece of a headphone set on top of the copper plate.
I used the foam from a headphone set as the spacer for my soft switch.
Copper plate with a wire connected with copper tape. All on top of the foam piece.
Next I added another layer of copper.
Soft switch built with two copper plates and a headphone piece foam in between.
A little gaffer tape to hold it together.

After testing it out, I noticed it was very hard to create a connection when tapping, so I iterated:

Soft switch built with two copper plates and a headphone piece foam in between. It is opened up, showing a crumpled up piece of copper inside the middle hole of the foam piece.
To fix that issue and make it easier to close the circuit, I added a crumpled piece of copper inside.

Then the final switch was ready:

It could definitely benefit from a debounce algorithm on the switch press detection, but I was satisfied with the result considering it was an analog circuit :).