ALL YOUR BASE ARE BELONG TO WASS

  • But if it wasn't for the Bronx...

    For our Applications class, we have to make an invitation for an unknown classmate to go on an adventure. This adventure can be a place, an activity, a person, whatever. Knowing that a significant percentage of our class moved here from somewhere else, I wanted the adventure to be especially "New York". But also knowing how easily we slip into our routines and radii, I wanted to make it a place that was beyond the easy epicenter of downtown.

    Of all NY's contributions to world history and culture, Hip Hop is one of its most significant. That's why I decided to make my invitation a Soundwalk of the South Bronx, narrated by DJ Jazzy Jay, one of the earliest pioneers of Hip Hop.

As seems to be my way, my original concept for my invitation was a little overambitious. I had envisioned some sort of diorama with cut-out 2D images of the people, places and cultural icons of the first era of Hip Hop. Beat Street, Wild Style, Krush Groove, Busy Bee, KRS-ONE, tagged trains, blacktop MC battles, tenement house parties, etc. And I also wanted to mod one of those audio greeting cards to blast some classic track when the invitation was read in class. Well, time and economy conspired against me. The diorama turned into a card. The card into a box. And the audio greeting cards are hackable, but not re-recordable.

    Finally, the day before the invite was due, Ithai suggested I seek out the 9V Recording Module from Radio Shack. Brilliant! Not sure if I had more than one shot, I gambled on KRS-ONE's Step Into A World. It was either that or South Bronx, by Boogie Down Productions. But in a 20 second burst, neither had the right punch. Tanya convinced me that even though they were dope songs, and important classics, they weren't perfect for the application.

    Then came the fabrication. I found a box with no lid. I found a solid picture of a boom box online and printed it out. The idea was to cut the center out of one speaker, and nest my module speaker there. And then cut out a hole near the cassette buttons and shimmy the module play button in there. All well and good. But the cardboard lid I used was too thick for the module button and kept triggering it on and off. And taping the whole module to the back of the lid didn't allow for enough pressure when I wanted it. ALSO, the rubber button wouldn't adhere to anything I was gluing it to. Then, with the help of Ryan Viglizzo, we made a bigger hole, and affixed a bigger button to the original button. I learned about This To That, one of the reasons the internet is great. And with some needle injected epoxy, we got it to stick.

    I was getting ready to run out the door and go to class (only a little late) when Manuela Lamas suggested I hot glue the module to the lid leaving just the right amount of space and being close and firm enough to push the play button consistently. This worked perfectly. And as I removed all the gaff tape from the extraneous tendrils (mic and record button) I thought, maybe I should remove those, so that they don't accidentally record over my song.

    Here's where it gets good. I snipped those pieces off, and in doing so, disconnected my closed circuit. The sound wouldn't come out, and was in fact, deleted. Sweet! But here's where Manuela proved to truly be a guardian angel. She helped me sort out which wires went back where, twisted them together and then glued them back to the board. This story retells quickly, but was stressful in the 11:45th hour.

    Eventually, I got my shit together, retested the final product, and though it lacked a certain polish, proudly walked out with the boom box boomin'. Here's what it looked like.

    But here's where it got good.


  • Applications Presentation: Carter Emmart/Unirooney montage

    Our Applications class has a different speaker come every week. Our group project is to create a response to the speaker we are assigned. Fortunately, my group got Carter Emmart.

    Carter is the Director of Astrovisualization at the American Museum of Natural History. Carter had a lifelong dream of getting to Mars, and pursued it in one fashion or another for most of his life. As the Director of Astrovisualization, he was able to conceive of an experience that takes us well beyond Mars, and in fact has reframed his existential view of the Universe and space travel's necessity.

Carter is a unique and dynamic man whose thoughts and stories tumble and speed forward almost indiscriminately. But for someone who has dealt with science, space and institutions for all of his professional life, he is remarkably kind, thoughtful, fun-loving and spiritual. He said many things to our class, and many more when we had the opportunity to get a privately guided tour of the universe from him at the Planetarium, but he said two things that stuck out to me. The first was that the astronauts he'd met had shared with him that one of the most awe-inspiring events in space is actually looking back at how beautiful the Earth is. And the second was that after years of trying to be an astronaut and get to Mars in person, he would now rather stay on Earth. He realized that technology had gotten to a place where it seemed unnecessary and wasteful to spend billions of dollars getting humans to touch a destination in outer space when there was so much work and aid to be done for the lifeforms we already know exist.

That perspective resonated with me; it carried a fair amount of empathy and logic.

    So as a response, my group (John Capogna, GJ Lee and Xeudi Chen) decided to make an expressionistic montage that touched on the beauty, thoughtfulness, smallness and bigness of both Carter's perspective and Carter's personality.

    This is our piece.

    Unirooney from Jon Wasserman on Vimeo.


  • Moholy

    This was one of my early Intro to Computational Media projects. I referenced a Laszlo Moholy-Nagy painting, and put a hidden slider in it. See if you can find it in this sketch.


  • Clutterometer

    The Clutterometer, f.k.a. The Conscientious Sink was my Intro to Physical Computing mid-term project. My collaborators were Karl Ward and Andrew Cerrito. Two more clever and twisted brains you'd have trouble finding. And for me that was perfect.

There is a common kitchen area on the floor at ITP, and somehow, through selfish and neglectful behavior, it frequently and rapidly becomes cluttered with dirty dishes. We wanted to change that.

The Clutterometer detects when you are at the sink, engaging in dish activity, and then rates your contribution to the community based on what is in the sink after you leave. If you leave your dishes there, the needle goes to Red; if you do only your dishes, it hovers at Yellow; and if you've done more than your dishes, it rewards you with a Green status.

Once the user steps on the welcome mat, force sensors detect that someone is in fact at the sink, and they trigger some suggestive audio tracks. Then pixel data from an IP camera mounted above the sink begins to average any changes and jiggle the needle. As the user leaves the sink/mat, the average of 10 random captures of pixel data (including the status of an empty sink, the status when the user arrived at the sink and the status upon leaving) gets mapped to the Clutterometer and records the degree to which the user has contributed to the sink. If the rating comes back Yellow or Red, the "audio barb" is triggered. It is the sound of water running and someone else washing dishes. It was meant to inspire someone to come back and finish the job, or at least think they left the faucet on, and thus have to go back and engage the sink.
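The rating logic boils down to comparing averaged pixel snapshots against the empty-sink baseline. Here's a rough, hypothetical sketch of that mapping in plain Java; the class name, method names and threshold values are my own inventions for illustration, not the code we actually ran:

```java
// Hypothetical sketch of the Clutterometer rating logic.
// Names and thresholds are illustrative, not from the real build.
public class Clutterometer {
    public enum Status { GREEN, YELLOW, RED }

    // Average the per-pixel brightness of a captured frame (8-bit gray assumed).
    static double averageBrightness(int[] pixels) {
        long sum = 0;
        for (int p : pixels) sum += p & 0xFF;
        return (double) sum / pixels.length;
    }

    // Compare the sink after the user leaves against the empty-sink baseline.
    // More clutter than when they arrived -> RED; about the same -> YELLOW;
    // less -> GREEN.
    static Status rate(double emptyAvg, double arrivalAvg, double departureAvg) {
        double before = Math.abs(arrivalAvg - emptyAvg);   // clutter on arrival
        double after  = Math.abs(departureAvg - emptyAvg); // clutter on departure
        double tolerance = 2.0; // small allowance for camera jitter
        if (after > before + tolerance) return Status.RED;
        if (after < before - tolerance) return Status.GREEN;
        return Status.YELLOW;
    }
}
```

In the installation, the averaging ran over the 10 random captures mentioned above rather than a single frame, but the Red/Yellow/Green comparison worked along these lines.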

    Here is a little sizzle reel of the project.

    Clutterfilm Final from Jon Wasserman on Vimeo.


• Response to Jonathan Lethem's "The Ecstasy of Influence: A Plagiarism"

For Applications class, I read Jonathan Lethem's "The Ecstasy of Influence: A Plagiarism". I had enjoyed his writing before, and the subject was of particular interest to me. In the most personal ways, it addressed my love of hip hop and my experience trying to sell and protect my photographic intellectual property in a time where technology and culture conspired against traditional modes of capitalism.

    The article was frequently commented on and comments were responded to by my classmates. The entire thread is here.

     

The Ecstasy of Influence: A Plagiarism, by Jonathan Lethem
    http://harpers.org/archive/2007/02/0081387

     

    This article is about a few things. It’s about appropriation, simulacra, auteurism, economics and the gift of art.

Lethem talks about the beauty in the tradition of taking, copying and redefining ideas. He also talks about the illegitimacy of trying to own a cultural experience, even if you made it. Somewhat compartmentalizing the entitlement to valuable compensation for the immediate thing you created, he discusses the importance and benefit of permitting access to intellectual property for the sake of the Commons. It’s a challenge to the status quo of Capitalism, an indictment of SOPA and a vote of confidence for Open Source. But it boils down very simply to this expression: We should make things for the service and enjoyment of society, and restriction of ideas only keeps the best of our potential from us.

    This argument sits somewhat outside the important discussion of our relationship to taking resources and outputting pollutants into the environment. I mean this both literally and metaphorically. But it not only espouses the creativity of sampling, mashing up and reimagining, it suggests that many intellectual property restrictions only benefit the corporate ownership related to that thing. The terms of the rules are defined by the corporate interest, and they benefit the corporate food web (the distributors over the creators) more than it truly serves the populace. He illustrates the derivation of nourishment from music and literature and film. He cites that Einstein takes credit for his work, but does not maintain ownership over everyone else’s opportunity to explore, use and remake it.

    Lethem says, “Honoring the commons is not a matter of moral exhortation. It is a practical necessity. We in Western society are going through a period of intensifying belief in private ownership, to the detriment of the public good. We have to remain constantly vigilant to prevent raids by those who would selfishly exploit our common heritage for their private gain. Such raids on our natural resources are not examples of enterprise and initiative. They are attempts to take from all the people just for the benefit of a few.”

I’d say this is a compelling case for Open Source. The question is: How can Open Source play with Capitalism and not get sullied or destroyed? There is evidence supporting the idea that Capitalism is not so much broken as it was not a sustainable or “good” paradigm to begin with. These criticisms don’t usually promote a solution other than “tearing it all down and starting fresh”. But that fix seems oversimplified and excludes strategy. I don’t see Open Source as a polar opposite to Capitalism, but rather an entirely alternative ethos, whose Achilles’ heel is the retort, “how will I earn money, own things and therefore provide for my family?” But if you can release from that mindset, the ways that the greater good and individuals benefit from it are more evident.

     


  • Rage Guy

    We were learning about PImage and I was having a little trouble with some of my other rudiments. This felt like a natural extension of the recurring theme of just trying to make something work, before I make it artful.

    BEHOLD


  • Pete Repeater

Pete Repeater is an audio looper with a wooden block interface. It was my final project for Intro to Physical Computing, and my partner was Karl Ward. When we paired up, Karl was really driven to create a looper that you played with more than just used. He explained that most loopers are used by guitarists who don't have any hands free and step on them with their feet. I could tell he was missing the experience of handling the thing itself for the tactile and intimate control of the recording, looping and direction of play.

We felt it was less important to imitate or improve upon commercial loopers, and more worthwhile to make something that was fun, had presence and was easy to use. We also decided to home in on a specific audience to help guide our decision making. We wanted it to be for children ages 6 and up. The idea was that a thoughtful 6-year-old who was interested not only in making sounds, but making decisions and controlling the sound would have fun, get it right away, and then hack the simple tool for his or her own experimentation. And semi-secretly, we wanted to make a kids' toy that adults and adult musicians would want to appropriate. I'm positive Karl had his own vision of inspiration, but I kept thinking of a scene from Funky Monks, when Flea takes his daughter's mini toy piano and plays it, and its sound seems to be what they were looking for.
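At its core, a looper like this is just a circular buffer that records samples and replays them on a loop, with "direction of play" amounting to a sign flip on the read index. A toy sketch of that idea in plain Java; the class and method names are my own, not Pete's actual code:

```java
// Toy sketch of a looper core: record into a buffer, then loop playback
// forward or backward. Purely illustrative; not Pete Repeater's real code.
public class Looper {
    private final float[] buffer;
    private int length = 0;      // how many samples were recorded
    private int playHead = 0;    // current read position
    private boolean reversed = false;

    public Looper(int maxSamples) { buffer = new float[maxSamples]; }

    // Append one incoming sample while recording (extra samples are dropped).
    public void record(float sample) {
        if (length < buffer.length) buffer[length++] = sample;
    }

    // Flip the direction of play, like turning the block over.
    public void reverse() { reversed = !reversed; }

    // Read the next sample, wrapping around so the loop plays forever.
    public float next() {
        if (length == 0) return 0f;
        float out = buffer[playHead];
        playHead = reversed
            ? (playHead - 1 + length) % length
            : (playHead + 1) % length;
        return out;
    }
}
```

The tactile interface then just maps gestures on the block to `record`, `reverse` and playback, instead of footswitches.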

    I could go on about this, but truly, the best descriptions of this project and its conception are on Karl's blog post. And the more technical wizardry here. Please to enjoy.

    But here is some documentation of how it works. Sadly, we didn't record the freestyle duet Karl and I did recording Texasisms. You might could just imagine it on yrown.

    PETE REPEATER from Jon Wasserman on Vimeo.


  • Hobbs

    I have a friend named John Hobbs. Every year for his birthday, I make him something special. Like this. So it seemed appropriate that this year I flex some of my new skillz and make him this:

    hobbs vid from Jon Wasserman on Vimeo.

    I don't know. Is that weird?


  • Paula Deen Riding Me - ICM Final Project

    For my ICM Final I wanted to create a sketch in Processing that utilized the Kinect. I decided to riff off the meme "Paula Deen Riding Things" because memes are fun, physical play that comes with the Kinect is fun and because there would be a wedge to insert some social commentary and attention to health and food issues.

"Paula Deen Riding Me" is a humorous way to vilify the cultural enablement of unhealthy decisions. Using the Kinect, the Player has to knock little, devilish "Deens" off his arms with fists of broccoli lest he turn into the type of evil that proclaims, "Life is too short to wonder where you hid your waffle maker."

I was passionate about food justice, personal health and wellness, and social/pop-cultural critique. I was intrigued by Processing and the Kinect and wanted to craft a simple, fun game with a sense of humor that sneaks in a social agenda as a second read.

    Paula Deen is a Food Celebrity and TV Personality. Her brand relies on making Southern Cooking that is delicious specifically because it uses unhealthy staples of Southern Cuisine in excessive quantities. She promotes hyperbolic inclusion of ingredients known to cause diseases that make up the most visible epidemics in American culture.

    In 2011 the meme "Paula Deen Riding Things"(pauladeenridingthings.com) launched on tumblr.com. It featured .png files of Paula Deen excerpted from a photo stunt where she rode another celebrity chef like a horse. Immediately after the images were posted, a site was created that photoshopped the silhouetted shots of Deen onto other environments.

Thanks to Vitor Freire, Caroline Sinders, Wyna Liu, Surya Mattu and David Rios for play-testing. Thanks to Craig Proetzel and Mimi Yin for helping me work through the logic of how to use Skeleton Tracking for the Kinect. Thanks to Maria Paula Saba dos Reis for sharing her code and expertise on background selection and conversion from infrared to RGB through the Kinect. And a special thank you to Lia Martinez for helping me take a fun idea and make it into a working, playable, consistent sketch.

    The core of this code uses Greg Borenstein's Skeleton Anatomy Example from his book Making Things See.

     

    Here is my Processing code for this sketch. (It will be added to GitHub soon)

    import SimpleOpenNI.*;
    SimpleOpenNI kinect;

    import ddf.minim.*;

    Minim minim;
    AudioPlayer paulaSound;

    //**MPK is my note for Maria Paula Kinect sketch attributes

    //LIA is my note for Lia Martinez sketch attributes

    //Left siderz
    PVector shoulderL;
    PVector handL;
    PVector elbowL;
    //Right siderz
    PVector shoulderR;
    PVector handR;
    PVector elbowR;

    //LIA
    //some booleans for control
    boolean elbowUp;
    boolean timerOn;
    //LIA
    //for the timer
    int timer;
    int duration;

    //Declare Paula Images
    PImage pdL;
    PImage pdR;
    PImage broc;

    Paula p1;
    Paula p2;
    Paula p3;
    Paula p4;
    Paula p5;
    Paula p6;
    Paula p7;
    Paula p8;

    boolean droppedPaula = false;

    //MPK
    PImage backgroundImage;

    int[] userMap;
    color trans = color(0, 0, 0, 0);
    PImage resultImage;

    void setup() {
    size(640, 480);
    kinect = new SimpleOpenNI(this);
    kinect.enableDepth();
    kinect.setMirror(true);

    kinect.enableRGB();
    kinect.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);
    kinect.alternativeViewPointDepthToImage();

    minim = new Minim(this);
    paulaSound = minim.loadFile("creamy.wav"); //load paula sound

    //backgroundImage = loadImage("melting_butter2.png");
    backgroundImage = loadImage("chickenfrier150.jpg");
    resultImage = new PImage(640, 480, ARGB);

    //Before you intialize the Paula objects, load the Paula images
    pdL = loadImage("deenerLt2.png");
    pdR = loadImage("deenerRt2.png");
    broc = loadImage("broc.png");

    //These variables don't seem to change size...
    p1 = new Paula (width/2, height/2, pdL);
    p2 = new Paula (width/3, height/3, pdL);
    p3 = new Paula (width/3, height/3, pdL);
    p4 = new Paula (width/3, height/3, pdL);
    //
    p5 = new Paula (width/2, height/2, pdR);
    p6 = new Paula (width/3, height/3, pdR);
    p7 = new Paula (width/3, height/3, pdR);
    p8 = new Paula (width/3, height/3, pdR);
    }

    void draw() {
    imageMode(CORNER);
    kinect.update();
    //place bg image here
    image(backgroundImage, 0, 0);
    //image(kinect.rgbImage(), 0, 0);

    //remove background sketch goes here

    IntVector userList = new IntVector();
    kinect.getUsers(userList);

    if (userList.size() > 0) {
    int userId = userList.get(0);

    if ( kinect.isTrackingSkeleton(userId)) {
    //Below code existed pre-MPK
    //      println ("DRAW NOW " + userId);

    //MPK
    //cleaning background
    userMap = kinect.getUsersPixels(SimpleOpenNI.USERS_ALL);
    for (int i =0; i < userMap.length; i++) {
    // if the pixel is part of the user
    if (userMap[i] != 0) {
    // set the pixel to the color pixel
    resultImage.pixels[i] = kinect.rgbImage().pixels[i];
    }

    else {
    //set it to the background
    resultImage.pixels[i] = trans;
    }
    }
    //MPK
    resultImage.updatePixels();
    image(resultImage, 0, 0, 640, 480);
    drawSkeleton(userId);

    //My PVector variables
    shoulderL = new PVector(-1000, -1000);
    shoulderR = new PVector(1000, 1000);

    elbowL = new PVector(-1000, -1000);
    elbowR = new PVector(1000, 1000);

    handL = new PVector(1000, 1000);
    handR = new PVector(1000, 1000);
    }
    }
    }

    void drawSkeleton(int userId) {
    stroke(0);
    //strokeWeight(50);

    //kinect.drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_LEFT_SHOULDER);
    //kinect.drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_LEFT_ELBOW);

    //noStroke();
    //Make the joint nodes RED
    //fill(255,0,0);
    //drawJoint(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER);
    // drawJoint(userId, SimpleOpenNI.SKEL_LEFT_ELBOW);

    stroke(0);
    // strokeWeight(30);

    shoulderL = getJoint(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER);
    elbowL = getJoint (userId, SimpleOpenNI.SKEL_LEFT_ELBOW);
    handL =  getJoint (userId, SimpleOpenNI.SKEL_LEFT_HAND);

    shoulderR = getJoint(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
    elbowR = getJoint (userId, SimpleOpenNI.SKEL_RIGHT_ELBOW);
    handR =  getJoint (userId, SimpleOpenNI.SKEL_RIGHT_HAND);
    //
    //these were test lines and ellipses that I don't need
    //  line(shoulderL.x, shoulderL.y, elbowL.x, elbowL.y);
    //  fill (0, 255, 255);
    //  ellipse (shoulderL.x, shoulderL.y, 30, 30);
    //    fill (255, 0, 255);
    // // ellipse (elbowL.x, elbowL.y, 30, 30);

    //WRITING THE deenerL PImages
    image (broc, (handR.x - 50), (handR.y - 50));
    image (broc, (handL.x - 50), (handL.y - 50));

    //p1p1p1p1p1p1p1p1p1p1p1p1p1p1p1p1
    //Paula on the camera left ELBOW using shoulderL
    // paulaSound = minim.loadFile("creamy.wav");
    float armAngle = atan2(shoulderL.y - elbowL.y, shoulderL.x - elbowL.x) + PI;
    float radius = shoulderL.dist(elbowL)*-.1;
    p1.px = shoulderL.x+radius*cos(armAngle);

    //********SET Y POSITION OF PAULA for p1
    if (p1.isDroppedPaula) {
    p1.dropPaula();
    paulaSound.play();
    //  paulaSound = minim.loadFile("creamy.wav");
    // println("PAULA'S Y POSITION: " + p1.py);
    }
    else {
    if (p1.handIsCloseToPaula(handR.x, handR.y)) {
    //  println("CLOSE TO PAULA!!!!! DROP HER!!!!!!!!!!!!!!");
    p1.isDroppedPaula = true;
    }
    else {
    p1.py = (shoulderL.y+radius*sin(armAngle)-30);
    // println("NOPE");
    }
    }
    //******************************

    p1.display();

    //p2p2p2p2p2p2p2p2p2p2p2p2p2p2p2p2
    //Paula on the camera left BICEP using shoulderL
    float armAngle2 = atan2(shoulderL.y - elbowL.y, shoulderL.x - elbowL.x) + PI;
    radius = shoulderL.dist(elbowL)*.3;
    p2.px = shoulderL.x+radius*cos(armAngle2);

    //********SET Y POSITION OF PAULA for p1
    if (p2.isDroppedPaula) {
    paulaSound = minim.loadFile("yall2.wav"); //load paula sound
    p2.dropPaula();
    paulaSound.play();
    // paulaSound = minim.loadFile("yall2.wav");
    //paulaSound.stop();
    //paulaSound.pause();
    //println("PAULA'S Y POSITION: " + p2.py);
    }
    else {
    if (p2.handIsCloseToPaula(handR.x, handR.y)) {
    // println("CLOSE TO PAULA!!!!! DROP HER!!!!!!!!!!!!!!");
    p2.isDroppedPaula = true;
    }
    else {
    p2.py = (shoulderL.y+radius*sin(armAngle2)-25);
    // println("NOPE");
    }
    }
    //******************************

    p2.display();

    //p3p3p3p3p3p3p3p3p3p3p3p3p3p3p3p3
    //  //Paula on the camera left FOREARM using shoulderL
    float armAngle3 = atan2(shoulderL.y - elbowL.y, shoulderL.x - elbowL.x) + PI;
    radius = shoulderL.dist(elbowL)*.7;
    p3.px = shoulderL.x+radius*cos(armAngle3);

    //  //********SET Y POSITION OF PAULA for p1
    if (p3.isDroppedPaula) {
    p3.dropPaula();
    paulaSound.play();
    //  paulaSound = minim.loadFile("creamy.wav");
    // println("PAULA'S Y POSITION: " + p3.py);
    }
    else {
    if (p3.handIsCloseToPaula(handR.x, handR.y)) {
    // println("CLOSE TO PAULA!!!!! DROP HER!!!!!!!!!!!!!!");
    p3.isDroppedPaula = true;
    }
    else {
    p3.py = (shoulderL.y+radius*sin(armAngle3)-30);
    // println("NOPE");
    }
    }
    //******************************

    p3.display();

    //p4p4p4p4p4p4p4p4p4p4p4p4p4p4p4p4p4p4p4p4p4p4
    //Paula on the camera left WRIST using shoulderL
    float armAngle4 = atan2(shoulderL.y - elbowL.y, shoulderL.x - elbowL.x) + PI;
    radius = shoulderL.dist(elbowL)*1.1;
    p4.px = shoulderL.x+radius*cos(armAngle4);

    //********SET Y POSITION OF PAULA for p1
    if (p4.isDroppedPaula) {
    p4.dropPaula();
    paulaSound.play();
    // paulaSound = minim.loadFile("butter.wav");
    // println("PAULA'S Y POSITION: " + p4.py);
    }
    else {
    if (p4.handIsCloseToPaula(handR.x, handR.y)) {
    //// println("CLOSE TO PAULA!!!!! DROP HER!!!!!!!!!!!!!!");
    p4.isDroppedPaula = true;
    }
    else {
p4.py = (shoulderL.y+radius*sin(armAngle4)-30); //was armAngle2, a copy-paste slip
    // println("NOPE");
    }
    }
    //******************************

    p4.display();

    //p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5p5
    //Paula on the camera right ELBOW using shoulderR
    float armAngle5 = atan2(shoulderR.y - elbowR.y, shoulderR.x - elbowR.x) + PI;
    radius = shoulderR.dist(elbowR)*-.1;
    p5.px = shoulderR.x+radius*cos(armAngle5);

    //********SET Y POSITION OF PAULA for p1
    if (p5.isDroppedPaula) {
    p5.dropPaula();
    paulaSound.play();
    //  paulaSound = minim.loadFile("creamy.wav");
    //println("PAULA'S Y POISTION: " + p5.py);
    }
    else {
    if (p5.handIsCloseToPaula(handR.x, handR.y)) {
    //println("CLOSE TO PAULA!!!!! DROP HER!!!!!!!!!!!!!!");
    p5.isDroppedPaula = true;
    }
    else {
    p5.py = (shoulderR.y+radius*sin(armAngle5)-30);
    // println("NOPE");
    }
    }
    //******************************

    p5.display();

    //p6p6p6p6p6p6p6p6p6p6p6p6p6p6p6p6p6p6p6p6p6p6
//Paula on the camera right BICEP using shoulderR
    float armAngle6 = atan2(shoulderR.y - elbowR.y, shoulderR.x - elbowR.x) + PI;
    radius = shoulderR.dist(elbowR)*.5;
    p6.px = shoulderR.x+radius*cos(armAngle6);

    //********SET Y POSITION OF PAULA for p1
    if (p6.isDroppedPaula) {
    p6.dropPaula();
    paulaSound.play();
    // println("PAULA'S Y POISTION: " + p6.py);
    }
    else {
    if (p6.handIsCloseToPaula(handL.x, handL.y)) {
    // println("CLOSE TO PAULA!!!!! DROP HER!!!!!!!!!!!!!!");
    p6.isDroppedPaula = true;
    }
    else {
    p6.py = (shoulderR.y+radius*sin(armAngle6)-30);
    //  println("NOPE");
    }
    }
    //******************************

    p6.display();


    //p7p7p7p7p7p7p7p7p7p7p7p7p7p7p7p7p7p7p7p7p7p7p7
    //Paula on the camera right FOREARM using shoulderR
    float armAngle7 = atan2(elbowR.y - handR.y, elbowR.x - handR.x) + PI;
    radius = elbowR.dist(handR)*.3;
    p7.px = elbowR.x+radius*cos(armAngle7);

    //********SET Y POSITION OF PAULA for p1
    if (p7.isDroppedPaula) {
    p7.dropPaula();
    paulaSound.play();
    // paulaSound = minim.loadFile("creamy.wav");
    // println("PAULA'S Y POSITION: " + p7.py);
    }
    else {
    if (p7.handIsCloseToPaula(handL.x, handL.y)) {
    //   println("CLOSE TO PAULA!!!!! DROP HER!!!!!!!!!!!!!!");
    p7.isDroppedPaula = true;
    }
    else {
    p7.py = (shoulderR.y+radius*sin(armAngle7)+40);
    //  println("NOPE");
    }
    }
    //******************************

    p7.display();


    //p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8p8
    //Paula on the camera right ???? using shoulderR
    float armAngle8 = atan2(shoulderR.y - elbowR.y, shoulderR.x - elbowR.x) + PI;
    radius = shoulderR.dist(elbowR)*1.5;
    p8.px = shoulderR.x+radius*cos(armAngle8);

    //********SET Y POSITION OF PAULA for p8
if (p8.isDroppedPaula) {
paulaSound = minim.loadFile("butter1.wav"); //load paula sound
p8.dropPaula();
paulaSound.play();
    // paulaSound = minim.loadFile("butter1.wav");
    //  println("PAULA'S Y POISTION: " + p8.py);
    }
    else {
    if (p8.handIsCloseToPaula(handL.x, handL.y)) {
    //    println("CLOSE TO PAULA!!!!! DROP HER!!!!!!!!!!!!!!");
    p8.isDroppedPaula = true;
    }
    else {
    p8.py = (shoulderR.y+radius*sin(armAngle8)-30);
    //   println("NOPE");
    }
    }
    //******************************

    p8.display();
    }

    //Kinect Stuff

    PVector getJoint(int userId, int jointID) {
    PVector joint = new PVector();
    float confidence = kinect.getJointPositionSkeleton(userId, jointID, joint);
    if (confidence < 0.5) {
    return joint;
    }
    //This ellipse is where the joint nodes are. To plot the location of other images, set coordinates based on dist from convertedJoin.x & .y
    //E.G. convertedJoint.x + 10 etc.? No, it shifts
    PVector convertedJoint = new PVector();
    kinect.convertRealWorldToProjective(joint, convertedJoint);

    return convertedJoint;
    }

    void drawJoint(int userId, int jointID) {
    PVector joint = new PVector();
    float confidence = kinect.getJointPositionSkeleton(userId, jointID, joint);
    if (confidence < 0.5) {
    return;
    }
    //This ellipse is where the joint nodes are. To plot the location of other images, set coordinates based on dist from convertedJoin.x & .y
    //E.G. convertedJoint.x + 10 etc.? No, it shifts
    PVector convertedJoint = new PVector();
    kinect.convertRealWorldToProjective(joint, convertedJoint);
    ellipse(convertedJoint.x, convertedJoint.y, 5, 5);
    // ellipse(convertedJoint.x + 50, convertedJoint.y + 50, 50,50);
    }

    // user-tracking callbacks!
    void onNewUser(int userId) {
    println("start pose detection");
    kinect.startPoseDetection("Psi", userId);
    }

    void onEndCalibration(int userId, boolean successful) {
    if (successful) {
    println(" User calibrated !!!");
    kinect.startTrackingSkeleton(userId);
    }
    else {
    println(" Failed to calibrate user !!!");
    kinect.startPoseDetection("Psi", userId);
    }
    }

    void onStartPose(String pose, int userId) {
    println("Started pose for user");
    kinect.stopPoseDetection(userId);
    kinect.requestCalibrationSkeleton(userId, true);
    }

    void stop()
    {
    paulaSound.close();
    minim.stop();

    super.stop();
    }


  • ICM Week 5

    RectPusher

This week we dealt with Objects and Classes. But it marks an important milestone for me. This is the week that I went from wax on/wax off to up the fence/down the fence. In ICM, that translates into releasing myself from two obstructions: 1. the obligation of memorizing syntax, and 2. my own standard for boundary-pushing creativity and aesthetics.

    Both were keeping me from taking in and processing information, and from relaxing my brain sphincters enough to believe that I understood the concepts.

    Conceptually, I am on board with objects, nesting, even arrays. Whipping out the appropriate code for that is not currently easy or quick, but I now know where to seek code and ideas when necessary. At least a little bit.

    I still have grandiose goals for aesthetic and reason, but I'm now biting off what I can chew.

    RectPusher


  • PComp Reading: Visual Intelligence

    This discussion of phantom limb mapping was very interesting. I had known about the residual pain and telescoping effects, but I didn't know the sensation mapping was so accurate and consistent, or that it was represented on more than one region of the remaining limb as well as on the face. In art school I took a brain class, in which my partner and I created a mirror box. At the time, we had read that a way to alleviate lingering phantom limb pain was to put the limbs in the mirror box and then, once the patient was comfortable with the image of two working, moving appendages, quickly remove both, snapping the perception of both pain AND the telescoping phantom limb back to the current real state. I wasn't able to find any links supporting that, but a better illustration is rehabbing with mirror techniques.

    The homunculus is a dynamic map/visualization. There are numerous accounts of trauma that tweak the way touch perception sorts itself out, sometimes overlapping, sometimes overcompensating. I wonder what benefits and dangers messing with this order might reveal. That the genitals rank low on the sensitivity spectrum seems to be a matter of biological pragmatism; we wouldn't get anything done otherwise! But what if there were a device (or software) that stimulated the region of the brain dedicated to the genitals? No longer simply about triggering nerve impulses, the real manipulation would be in the perception and volume. I don't know about eating from the Tree of Knowledge on that one…

    Diane Ackerman's descriptions of sensations are so vivid I'd almost call them deeply poetic in their authenticity. Screw AR; they make a compelling case for Virtual Tactility. I remember when the rumble pack first came out for the N64. Playing GoldenEye, running through maps shooting at each other, it became noticeably more difficult when the controller shook. Now the haptic feedback on my phone helps me stay connected to the feeling of using a living tool instead of an inanimate object. And what about dildonics? Admittedly I'm not up to speed on the current state of development, but reading this article makes me think there is even more room for growth in that field/culture than I had imagined. THIS is all I could find.


  • PComp Get Creative - Servo

    Here is the first creative project I've made for PComp. I modified the Servomotor Control Lab to make a No Kill Fly Swatter. The genesis of the project was rooted simply in wanting to use my headband, and then in wanting to use the servo more (and better).

    The idea is that you can shoo flies without having to kill them, or disturb your arms in a state of relaxation. I'm always thinking of hammocks.

    2012-09-26 15.05.28 from Jon Wasserman on Vimeo.


  • PComp Week 3 Labs

    Servo

    I found that using the lab's settings made the servo arm wiggle a little, but consistently. I started playing with the maximum degrees because with the initial settings the arm moved but didn't sweep. By doubling the number I was able to increase the total angle of movement. Question: Why does setting the max degree to 1080 yield a 180° turn?

    fsr_servo_lab
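    On the 1080 question, one plausible explanation (a guess at the mechanism, not the lab's actual code) is that the analog reading gets mapped linearly onto 0–1080, but the servo command saturates at the servo's physical ~180° limit, so everything above 180 just pins the arm. A sketch of that arithmetic in Java, with an Arduino-style map():

    ```java
    // Why mapping a sensor onto 0-1080 still yields at most ~180 degrees:
    // the linear map happily produces large angles, but the servo command
    // is clamped to the physical 0-180 range, so large values saturate.
    public class ServoMapDemo {
        // Arduino-style map(): linearly rescale x from one range to another.
        static long map(long x, long inMin, long inMax, long outMin, long outMax) {
            return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
        }

        // A hobby servo only travels ~180 degrees; commands beyond that saturate.
        static long clampToServo(long angle) {
            return Math.max(0, Math.min(180, angle));
        }

        public static void main(String[] args) {
            // Mapping the full 0-1023 sensor range onto 0-1080:
            long low  = clampToServo(map(100, 0, 1023, 0, 1080)); // within range: 105
            long high = clampToServo(map(900, 0, 1023, 0, 1080)); // saturates at 180
            System.out.println(low + " " + high);
        }
    }
    ```

    A side effect of this scheme: the arm reaches its 180° limit when the sensor reads only ~170, so if the sensor never spans its full 0–1023 range (an FSR often won't), mapping to a larger out_max makes the usable part of the sensor range cover more of the sweep. That may be why doubling the number increased the total angle of movement.
    
    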

    Tone Output

    tone_output_lab

    I got a sensor reading of around 260, more or less. It put out a low tone, and I couldn't figure out how to make the tone stronger. Mike Allison suggested adding another resistor. I tried that, but it didn't make an identifiable impact. I understand the function of resistors, but I don't think I've fully worked out, conceptually, how to employ them.
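    On the resistor question: the FSR sits in a voltage divider with a fixed resistor, and the fixed resistor's value sets where the analog readings land. A sketch of the divider arithmetic in Java; the 5V supply, 10-bit ADC, and ~30k FSR resistance are assumed values for illustration, not necessarily the lab's parts:

    ```java
    // Voltage-divider arithmetic for an FSR plus a fixed resistor, showing how
    // the fixed resistor's value shifts the analog reading range.
    // Assumed: 5V supply, 10-bit ADC (0-1023), FSR around 30k at light pressure.
    public class DividerDemo {
        // Vout = Vin * rFixed / (rSensor + rFixed), read at the junction.
        static double vOut(double vIn, double rSensor, double rFixed) {
            return vIn * rFixed / (rSensor + rFixed);
        }

        // What a 10-bit ADC would report for that voltage.
        static int adc(double v, double vIn) {
            return (int) (v / vIn * 1023.0);
        }

        public static void main(String[] args) {
            double vIn = 5.0;
            double rFsr = 30000.0; // assumed FSR resistance at light pressure
            System.out.println(adc(vOut(vIn, rFsr, 10000.0), vIn)); // 10k fixed: ~255
            System.out.println(adc(vOut(vIn, rFsr, 47000.0), vIn)); // 47k fixed: ~624
        }
    }
    ```

    With these assumed numbers, a 10k fixed resistor puts the reading right around the ~260 I saw, and swapping in a larger resistor would move the whole range up. Note that the sensor reading only sets the tone's pitch; loudness is a separate (amplitude) question, which may be why adding a resistor made no identifiable difference.
    
    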


  • PComp Week 2 Labs

    One step forward, two steps back and twenty steps in place. Three spins counter-clockwise.

    pot_lab

    series_digi_out

    digi_out_switch

    fsr_input

    Apologies for the clunkiness of my posting, video upload and stream of consciousness. Obsessing over making things perfect has kept me from making things at all. I'm working on that. So for now, the quality of early blog posts will have to serve as a foil for the impressive improvement of future posts.



  • PComp Week 1 Labs

    Week 2 was the beginning of what I'm calling "The Catch Up Era." I don't know when the "caught up" era ended, but comprehension, time management and time estimation conspired against me. As of this post, I haven't righted the ship, but there is palpable (incremental) evidence of progress. Even as I post these videos late, I look back and understand how things are working better than when I did them.

    COMPONENTS/ELECTRONICS/SWITCHES

    Basic LED - https://vimeo.com/49782329

    Component Series - https://vimeo.com/49782326

    Component Parallel - https://vimeo.com/49782328



  • PComp Wk 1 Labs

    So... Week 1 labs were a bit of a hero's chore for me. I'm still having difficulty with the basic concepts of electricity and the logic and reasons behind doing things, though two things that have helped A LOT have been working with other people and physically touching stuff. I think I'll get there soon enough. Big ups to Andy Sigler for helping me solder my first DC adapter, and to the residents, who patiently led me through my first luminous LED.



  • PComp HW Wk 2 - Observations

    Observation. Pick a piece of interactive technology in public, used by multiple people. Write down your assumptions as to how it's used, and describe the context in which it's being used. Watch people use it, preferably without them knowing they're being observed. Take notes on how they use it, what they do differently, what appear to be the difficulties, what appear to be the easiest parts. Record what takes the longest, what takes the least amount of time, and how long the whole transaction takes. Consider how the readings from Norman and Crawford reflect on what you see.


    The elevator is an often overlooked interactive technology that many people engage with every day. Forget the ones with flatscreens peddling news. Forget the Hearst Building elevator that only takes you to the floor you’ve requested at the front desk, sorting passengers into different cars based on destination, keeping it energy efficient.

    The majority of elevators today have a few traits in common. They have buttons for the floor you want to go to. They have a stop button. They have an emergency call button or phone. They have a “door open” button and a “door close” button. And they have a motion sensor at the doorway to stop the doors from closing if they are obstructed.

    The motion sensor is the most expected interactive technology. It responds, essentially like a button, to an interruption. But it has two movements: the abrupt jolt and the smoother recession. And it wants to close. And takes nearly every opportunity to do so. It is also preset to make choices. No, that’s not correct. It is set with fixed zones where human interruption would likely occur: mid-calf and chest height. But it will respond each time you cross the plane.

    The buttons all have some low level interactivity in that they wait for human interaction and then respond. But this kind of passive dynamic is sort of what Crawford rejects as being not a full or robust conversation between actors. That being said, if you pull from what Jonathan Lethem said about the Surrealists and Heidegger “revealing the ‘thingness’ of objects,” it is easier to accept that we silently anthropomorphize the elevator as a thinking service provider. When it is fast, we’re thankful. When it is slow, we blame it as if it is responsible.

    But most importantly, we are complicit in our use of the door open/close buttons. In this way, the technology is the conduit for the conversation between other passengers and me. When I offer to hold the door for someone, pressing the button is similar to instructing a party guest to hold the door open for another guest who is right behind him. And in the same way, when the elevator is full or we are late or some inconsiderate passenger is trying to stretch an existing conversation with someone outside the elevator, we don’t close the door by ourselves like a hinge door. The elevator and I close it on that person together. It’s a poetic fistbump of agreement toward an action.

    Question: I’ve heard that elevator “door close” buttons are set to respond slowly, basically at the same rate that an uninterrupted elevator door would close on its own. This is a safety precaution and also protects the door from some margin of human abuse. If this is the case, what is the interactive relationship between an actor that thinks the conversation is advancing a result and the other actor “lying”?
