Category: itp

itp work and progress

  • Editing as pathfinding: How to Edit a Film


    (In response to Walter Murch’s In the Blink of an Eye: A Perspective on Film Editing.)

    So you make yourself comfortable in the well-equipped editing room, whether in the physical sense or whatever computing setup you use as your virtual editing room. Beside you lies a hundred feet or so of film, or alternatively, your computer screen flickers with hundreds of freshly-DSLR’d shots of close-ups, tight shots, and context-establishing sequences.

    You feel the weight of the director’s original vision, whatever it might be (and which is often, in itself, a subjective take on the essence of a script and/or original source material), and of the potential enjoyment, or lack thereof, of all prospective viewers.

    To prepare yourself for the elusiveness of the high-stakes process of film editing, you summon a “cheatsheet” of sorts that supposedly distills the process of film editing. Its contents, if they are of any merit, will most likely run along the following lines.

    1. Remind yourself that the process of editing is a process of pathfinding. You start with no inherent guarantees of what the end product will be like, but you do know that it will be what it is supposed to be. Shots, in their rawest form, are structureless, and what you do is find or synthesise a path towards a coherently fulfilled vision.

    2. Know that the process of cutting is not just that; it is an extensive, iterative process of deciding what constitutes a suitable cut, and what can be put aside as a “shadow” cut, regardless of all the effort that went into it. Editing is a process that is measured in quality, not quantity.

    3. Be aware of the reason why (almost) all films need to be cut: the logistical impossibility of having all objects, all actors, and all suitable conditions in the same place to construct a complete film in a single shot. In a way, by overcoming this limitation, filmmakers are freed from the impossibility of controlling every gesture and condition simultaneously to get a perfectly coherent film.

    4. Recall that editing is not simply “taking out the bad bits,” but rather giving a film its unique nature through the order you bestow upon it. The shots are the DNA and the editing is the sequencing of it.

    5. Get your priorities straight: you should edit for emotion, then narrative, then rhythm. Lower in importance are the viewer’s eye-trace, the stage-line, and the 3D space continuity of the shot. Minimize the compromise you make for the first (tightly-bound) three as much as possible.

    6. Think like the viewer, and direct, or misdirect, him or her towards your narrative through editing. Minute details are, most of the time, indiscernible to the viewer. (To help you keep that in mind at all times, you may use a cutout stick figure as a visual representation of the proportion of the cinema-goer against the towering cinema screen.)

    7. Isolate yourself from the conditions of shooting. All effort, time, cost, and agony that went into a shot are irrelevant if they do not perfect the emotional impact, progress the story, or fit into the rhythm of the film.

    8. Engage in editing as a pair or more: partly to avoid a locked POV on what should or should not be there, and partly to meet the often unforgiving timelines. One of you can be the “dreamer,” while the other challenges the validity of that dream.

    9. Visually distill each shot into a single “decisive moment,” preferably as a physically printed still, that can both illuminate the emotion of the shot and serve as a linguistic representation of all the details it encompasses.

    10. Conduct test screenings with an audience, but don’t take their explicit feedback to heart. Instead, focus on how it feels to show your edited film to an audience. If you must collect explicit feedback, be sure to do so a good while after the screening, so as to get a fully formed impression, not a reaction, or a “referred pain,” where the reaction is driven by a single scene when the issue lies in what leads to or follows from it.

    11. Think of cuts as a blink of an eye amidst one’s continuously perceived reality: it happens when one thought gives way to another, when one emotion overtakes the observer in a way that changes his perception of the moment. Alternatively, think of an edited film as a dream where cuts happen to compress time and space in a meaningful, not necessarily realistic way.

    12. For any scene, there is a finite number of possible good cuts, which results in a specific branching schema that the editor has to bring to his awareness. This means that not every moment is a potential cut; if, say, you want your audience to perceive a speaker as a liar, there is only one appropriate cut for that. An earlier or later cut will misdirect the audience towards another conclusion.

     

  • Arduino Diaries IV: Sensor-controlled Dot


    Building on the serial communication capabilities of the Arduino board, I used an accelerometer to control a dot in a Processing sketch. The x- and y-values captured from the accelerometer were mapped to equivalent values on the screen.
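    The mapping step boils down to a linear re-map of each axis, which Processing provides as map(). Below is a minimal sketch of the same logic in C++; the 0–1023 input range (a 10-bit ADC) and the 800×600 screen size are illustrative assumptions, not values from the original sketch.

    ```cpp
    // Linear re-mapping, equivalent to Processing's map() function.
    float remap(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * (v - inLo) / (inHi - inLo);
    }

    struct Dot { float x, y; };

    // Map raw accelerometer readings to a screen coordinate. The 0..1023
    // range and the 800x600 window are assumptions for illustration.
    Dot dotFromAccel(int rawX, int rawY) {
        Dot d;
        d.x = remap((float)rawX, 0.0f, 1023.0f, 0.0f, 800.0f);
        d.y = remap((float)rawY, 0.0f, 1023.0f, 0.0f, 600.0f);
        return d;
    }
    ```

    In the Processing sketch itself, this is one map() call per axis inside draw(), applied to each value read off the serial port.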

  • Arduino Diaries III: Sensing Visualized


    Serial sensor data can be sent from an Arduino board to a computer, where it can be further processed in Arduino’s own Wiring-based programming environment, or in alternative software environments such as Processing.

    Below are the results of such a process for two different sensors. In both cases, the binary data received from the sensor was mapped to an appropriate range in Processing (0 to the height of the window), and graphed as a time-series.
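    The map-and-graph loop can be sketched as a small scrolling buffer. This is not the original Processing code; the 0–255 byte range and all class and method names here are assumptions for illustration.

    ```cpp
    #include <vector>

    // A minimal time-series buffer for graphing serial sensor data. Each
    // incoming byte (0..255, an assumed serial protocol) is mapped onto the
    // window height and appended; once the window width is reached, the
    // oldest sample is dropped so the graph scrolls left.
    class SensorGraph {
    public:
        SensorGraph(int width, int height) : width_(width), height_(height) {}

        void addSample(int raw) {
            int y = raw * height_ / 255;  // map 0..255 onto 0..height
            samples_.push_back(y);
            if (static_cast<int>(samples_.size()) > width_)
                samples_.erase(samples_.begin());  // drop the oldest sample
        }

        const std::vector<int>& samples() const { return samples_; }

    private:
        int width_, height_;
        std::vector<int> samples_;
    };
    ```

    In Processing the same idea is usually written as an array of ints that draw() walks through, plotting one vertical line per sample.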

    The first sensor is a force sensor whose returned values had a nice, expansive range due to its nuanced responsiveness to applied force.


    Compare this to the x-axis readings of an accelerometer, below.
     

    The results are not as varied as those of the force sensor, mainly because the accelerometer’s readings tend to span a broad set of ranges that are difficult to map appropriately to the Processing sketch’s screen.
     

  • Seeker/Avoider Autonomous Agents


    This is my first dabbling in autonomous agents. An autonomous agent, or “vehicle” in Valentino Braitenberg’s terminology, is an agent capable of selecting an action, and consequently steering itself towards achieving it.


    The particle system I created has three types of particles: target particles (visually represented as pink squares), seeker particles that pursue a target particle (represented as the large colorful ellipses), and avoider-seeker particles that are simultaneously moving towards one target and avoiding another (they are the small, ghostly hollow ellipses in the screenshots/videos). The particles follow an inheritance hierarchy: both the Seeker and Target classes inherit from a base Particle class. The AvoiderSeeker class is a grandchild of the Seeker class, as there is an unused class in between, the Avoider class.

    The third particle type clearly exhibits the most complex behavior, as its steering is subject to two potentially contradicting innate forces: one pulling it towards a particle, and the other pushing it away from another. The sought-after/avoided particles are not necessarily the square pink ones; an avoider-seeker particle may go after a seeker particle that is in turn moving towards other particles. This results in moments of analysis-paralysis for said avoider-seeker particles, visible in the above video.

    Another distinctive characteristic of an autonomous agent is that it has limited knowledge of its environment. The particles here achieve that by actively knowing which direction the target lies in, but not necessarily where it is located exactly.
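    The steering described above can be sketched as plain vector arithmetic. This is not the original Processing code; the names and the fixed force magnitude are illustrative. Note how the avoider-seeker's force is the sum of an attraction and a repulsion, which is exactly what produces the analysis-paralysis when the two nearly cancel.

    ```cpp
    #include <cmath>

    struct Vec2 {
        float x, y;
        Vec2 operator+(Vec2 o) const { return {x + o.x, y + o.y}; }
        Vec2 operator-(Vec2 o) const { return {x - o.x, y - o.y}; }
        Vec2 operator*(float s) const { return {x * s, y * s}; }
    };

    // Unit vector pointing from `from` toward `to` (zero if coincident).
    Vec2 direction(Vec2 from, Vec2 to) {
        Vec2 d = to - from;
        float len = std::sqrt(d.x * d.x + d.y * d.y);
        return len > 0 ? d * (1.0f / len) : Vec2{0, 0};
    }

    // Seeker: steer straight at the target with a fixed force magnitude.
    Vec2 seekForce(Vec2 pos, Vec2 target, float strength) {
        return direction(pos, target) * strength;
    }

    // Avoider-seeker: attraction to one particle plus repulsion from another.
    // When target and threat lie in the same direction, the forces cancel.
    Vec2 avoiderSeekerForce(Vec2 pos, Vec2 target, Vec2 threat, float strength) {
        return seekForce(pos, target, strength) + direction(pos, threat) * -strength;
    }
    ```

    Each frame, the resulting force would be added to the particle's velocity (capped at some maximum speed), which is the standard steering-behavior loop.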

  • Observation Study: Office Chair


    What is it?
    The standard, stock office chair with a cushioned seating surface, wheels at the bottom (usually), a low-height backrest, and conveniently placed armrests.

    Prior assumptions
    It is hardly interacted with. The user can interact with it through a single operation only: sitting.

    What is the context in which it is typically used?
    Working on a computer; working with non-fixed tools (e.g. notebooks, hand tools) on an adjacent table or desk; having a meal; or being in a communal setting in which people interact (meetings, meals, etc.).

    How do people actually use it?
    The following are typical operations done to a chair: Move the chair in a general direction. Rotate the seating surface towards another person, a desk, etc. Hang a sweater, jacket, or other clothing on the back. Adjust the height. Hang a bag on the sides or armrests.

    How is it being used differently by different people?
    Some people use it as a simple seating “device,” others as a surface to place things, hang stuff, or hide luggage underneath. When it is wheel-less, some people use it as a step to reach higher parts of the room, to pick up things from shelves. (In, er, rare cases, they use it recreationally for play.)

    What appears to be the most difficult?
    Getting things that have fallen to the ground; it is a challenge to steer a moving chair properly to retrieve a pencil or the like from underneath. Getting it into the right orientation and height. Trying to move the seating surface while the wheels move about by themselves.

    What appears to be the easiest?
    Sitting on it, unsurprisingly. Leaning forwards and backwards.

    What takes the most time?
    Finding the right position and/or orientation. If the chair is wheel-less, this is even more difficult. Hanging stuff takes more time than it should.

    What takes the least time?
    Again, leaning backwards and forwards.

    How long does the whole transaction take?
    To sit: 1 to 2 seconds. Hanging and placing stuff: 2 seconds. Adjusting position and orientation: 1 to 5 seconds. Leaning forward or back: less than 1 second. Standing up: less than 1 second.

    Critique, following Crawford and Norman
    Norman’s main thesis is that a design should show the answers to the problem it is attempting to resolve, and a conventionally designed chair certainly does that for the main problem of sitting: the seating surface visibly affords having a person sit on it. However, it does not afford that exclusively, nor do the other parts of the chair (its backrest, for instance, affords the placement of clothing). This is not a bad thing, but it calls for a revision of the chair’s essential design philosophy: emergent usage patterns require that proper mapping be established between the new interaction operations, their controls, and their outcomes. For instance, the backrest should no longer accommodate leaning backwards exclusively; it should also be visibly mapped to the process of hanging things on it.

     

    Photo credit: http://www.toddusa.com/products.aspx?id=54

  • Object-Oriented Shape-making



    For this project, I decided to create two classes: one that draws a shape, and another that uses transformations (rotation, etc.) and iteration to create a composite arrangement from the original shape.


    The Shape class draws a simple shape from basic shape directives (e.g. rect(), ellipse(), etc.). My partner Xinyi and I worked on creating different variations of this class to produce interesting results.


    The Arranger class can change the sizes of the Shape instances it receives, as well as the radius and angle used in the rotation.


    Multiple runs with varying parameters (e.g. Shape class characteristics, radius, size, angle, and so on) yielded different designs.
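    The two-class idea can be sketched as geometry alone, leaving the drawing out. Shape holds a size, and Arranger places scaled copies around a circle using a radius and a per-step rotation angle, the way the Processing sketch iterates with rotate()/translate(). All names and parameters here are illustrative, not the originals.

    ```cpp
    #include <cmath>
    #include <vector>

    struct Shape {
        float size;  // the base shape's characteristic dimension
    };

    struct Placement {
        float x, y, angle, size;  // where and how one copy is drawn
    };

    class Arranger {
    public:
        Arranger(float radius, float stepAngle, float scale)
            : radius_(radius), stepAngle_(stepAngle), scale_(scale) {}

        // Place `count` scaled copies of `s` around the origin, one
        // rotation step apart, mimicking a rotate()/translate() loop.
        std::vector<Placement> arrange(const Shape& s, int count) const {
            std::vector<Placement> out;
            for (int i = 0; i < count; ++i) {
                float a = stepAngle_ * i;
                out.push_back({radius_ * std::cos(a), radius_ * std::sin(a),
                               a, s.size * scale_});
            }
            return out;
        }

    private:
        float radius_, stepAngle_, scale_;
    };
    ```

    Varying the constructor parameters (radius, step angle, scale) is what yields the different composite designs described above.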


  • Why People Don’t Trust The Dog, an audio documentary

    Why Dead People are Buried, a Nigerian folktale

    In the beginning of the world when the Creator had made men and women and the animals, they all lived together in the creation land. The Creator was a big chief, past all men, and being very kind-hearted, was very sorry whenever any one died. So one day he sent for the dog, who was his head messenger, and told him to go out into the world and give his word to all people that for the future whenever any one died the body was to be placed in the compound, and wood ashes were to be thrown over it; that the dead body was to be left on the ground, and in twenty-four hours it would become alive again.

    When the dog had travelled for half a day he began to get tired; so as he was near an old woman’s house he looked in, and seeing a bone with some meat on it he made a meal off it, and then went to sleep, entirely forgetting the message which had been given him to deliver.

    After a time, when the dog did not return, the Creator called for a sheep, and sent him out with the same message. But the sheep was a very foolish one, and being hungry, began eating the sweet grasses by the wayside. After a time, however, he remembered that he had a message to deliver, but forgot what it was exactly; so as he went about among the people he told them that the message the Creator had given him to tell the people, was that whenever any one died they should be buried underneath the ground.

    A little time afterwards the dog remembered his message, so he ran into the town and told the people that they were to place wood ashes on the dead bodies and leave them in the compound, and that they would come to life again after twenty-four hours. But the people would not believe him, and said, “We have already received the word from the Creator by the sheep, that all dead bodies should be buried.” In consequence of this the dead bodies are now always buried, and the dog is much disliked and not trusted as a messenger, as if he had not found the bone in the old woman’s house and forgotten his message, the dead people might still be alive.

     

    Using Logic Pro X, Isi Azu and I set out to record the above chillingly existential folktale. As most folktales do, it evokes a communal sense of shared experience, and the way a culture attempts to make sense of the unknown. We felt that this could be best captured by having each phrase recorded by a different individual from a different culture. The results are below.

     

    (Another, more stripped-down version is below.)

     

  • “Man is the measure”


    E.M. Forster, in his short story The Machine Stops, predicts, among many things, the prevalent paradigm of technology consumption: as a substitute for experience (or “first-hand ideas”). Technology, he argues, can stand between man and his life, his “naked” humanity, through its persistent administration of convenient comfort.

    In one memorable passage, Kuno, the story’s sort-of protagonist, describes the epiphany that helped him see through the veil of technology:

     

    [Screenshot of the quoted passage from The Machine Stops]

     

    In the dystopian world in which the story is set, the technologists advocate vicarious experience as, they argue, the most purified and filtered kind. Vicarious experiences as in lectures, recorded music, telecommunications. As the zealot of a lecturer feverishly declared, the second-hand experience that one generation passes to the next would one day produce a “generation seraphically free from any taint of personality.”

     

    To save ourselves from such a reality of saccharine death, we can’t help but follow in Kuno’s footsteps, to use ourselves as a measure of the world. Distance can only be understood by traversing it; and we only own things by holding them.

     

    Kuno found his way out by searching for darkness amidst the artificial light, for it is the only real exception among all the artificiality. Discomfort, at times, is the only real thing there is in one’s own context.

  • How to make human-friendly everyday things, a very short introduction


    [In response to Donald Norman’s The Design of Everyday Things]

    Say you were asked to make or remake a piece of furniture, cutlery, an appliance, or an electronic apparatus.

    You are perhaps a visual designer: you start aesthetically, exploring the palettes, textures, and patterns that interact with the visual systems of the users (viewers, to you).

    Or maybe you are an engineer, so you naturally concern yourself with the completeness of the physical and logical systems that comprise the product in question. You ensure, as a result, that the components run effectively and efficiently, both as a whole and individually.

    Or, if you are a marketer, you have a business plan to follow, and a portfolio of products to maintain to maximise profits and minimise costs. Those pretty, fancy tactile patterns that will take up the whole budget and then some? They’ll have to go. And so does the super-fast, deathly quiet gear system that an engineer laboured over with complete disregard for the budget at hand.

    All of the above are ways to make things: things that are profitable, efficient, pretty. Those are not mutually exclusive parameters. But they are not the only ones; they disregard usability, the manifestation of the user’s point of view, whether explicit or implicit, as opposed to that of the aesthete, the engineer, and the merchant. This does not mean that the user is not interested in the other three: a user might, and will, look for things that are pretty, reliable, and cheap. But not only that.

    ****

    So if we go back to you and your desire to make the aforementioned product, tool, etc.: you want to do it differently from the aesthete, techie, and marketer above, in that you want your design to be of maximum usability and human-centricity. How do you go about it?

    1. You start with the intent of having your design speak of the problem it is solving, without the need for instructions, symbols, etc. How can you do that?

    2. You do that by designing for and with affordance. Affordance refers to the actual and perceived properties of an object that determine how it can be used. For instance, a table affords the placement of things on its surface, but does not afford, say, being used as a moving object.

    3. Some affordance properties are inherent to the physical reality of an object, e.g. the material it was made from. Soft and straight plywood affords carrying things of a specific weight range. Paper’s porousness affords words to be written on it.

    4. Other affordance properties are created by the designer: a slot in the design affords insertion, whereas a handle affords being pulled in a specific direction.

    5. Affordance is not the end of the story. The next step is to design with visibility in mind.

    6. Visibility of all the possible interactions with the object.

    7. Visibility of the outcome of said interactions i.e. the object should provide proper feedback to all possible operations by the user.

    8. Incorrect feedback mechanisms result in false causality, which breeds superstition. For instance, a system failing right after the click of a button leads the user to believe that said operation caused the failure, even if that is incorrect.

    9. Affordance and visibility combine to provide mapping, i.e. the relationship between what can be seen and what can be done. Or, more concretely, the relationship between the controls, the operations, and their outcomes.

    10. The best possible mapping is natural mapping, which relies on the conventions (e.g. an arrow indicates direction) and the physical properties of things.

    11. If you keep the above in mind, you should be able to ease the user into a proper conceptual model of the product, where the user can predict the outcomes of his actions towards the object.

    12. A good user conceptual model is achieved when the designer’s own model is communicated simply and properly through the physical model of the object itself.

    All of the above is a gross simplification, but what isn’t? It is a starting point, though.

    [photo credit: Mental model of how a car works, by davegray: http://www.flickr.com/photos/davegray/236316672/]

  • Arduino diaries II

    The Arduino microcontroller receives a digital input from a switch, and accordingly outputs a digital signal that lights up an LED depending on the state of the switch.
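    The sketch's logic is a single rule: the LED output mirrors the switch input on every pass through loop(). Below is that rule as host-testable C++, with digitalRead()/digitalWrite() stood in by minimal stubs so it can run off the board (on real hardware they come from the Arduino core); the pin numbers are assumptions about the wiring.

    ```cpp
    const int SWITCH_PIN = 2;   // assumed: switch wired to digital pin 2
    const int LED_PIN = 13;     // assumed: the on-board LED pin

    // Host-side stand-ins for the Arduino core's pin I/O functions.
    int pinLevels[14] = {0};
    int digitalRead(int pin) { return pinLevels[pin]; }
    void digitalWrite(int pin, int level) { pinLevels[pin] = level; }

    // One pass of the Arduino loop(): mirror the switch onto the LED.
    void loopOnce() {
        digitalWrite(LED_PIN, digitalRead(SWITCH_PIN));
    }
    ```

    On the board itself, the same two calls sit inside loop(), after pinMode() configures the switch pin as INPUT and the LED pin as OUTPUT in setup().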