|HUMA: Manual under Construction - The Input Patchbay: Data flow in HUMA|
To understand the concepts behind the input patchbay, you need to be aware of how data traffic is handled in HUMA. Keep in mind that HUMA originates in projects that used a single-axis infrared distance sensor as their interface, and that contactless control by means of more or less potent sensor systems is still a main focus. This results in an interfacing concept that puts more emphasis on dynamic data than on clicks and the like (it is really hard to deliver accurate clicks, drags etc. without a haptic device). While HUMA has integrated more and more click-like events resulting from devices like buttons (see Events & Actions), dynamic data is still very prominent.
The input patchbay is used to distribute dynamic data streams, that is - in opposition to two-state events like button clicks - continuously changing data such as the output of a motion tracking system that tracks a user's position in the room.
The patchbay holds data sources on the left-hand side and destinations on the right-hand side. You make data flow from a source to a destination at runtime by connecting them here.
When you create a new scene, the patchbay will be initialized to default settings. You can set your own defaults, save & recall presets etc. through the context menu on the patch area.
Some of the sources and destinations have additional options, accessible via the arrow buttons. Some allow multiple connections, some just a single one. Be aware that you can, in theory, construct senseless setups here that may even crash your system. On the other hand, intelligent use of the patchbay lets you create behaviors that would otherwise require very complex template scripts (HUMAmodes). Some examples are listed below.
All sources deliver normalized data in a range from 0 to 65535 (16-bit). That means that the maximal Y position of the mouse, for example, results in the same value as a fully turned MIDI controller, even though the controller's "real" range is 0-127 while the mouse may deliver 768 or more distinct values. Likewise, the Movie source delivers the time position in the current scene as a 16-bit value, regardless of whether the scene has just one frame or lasts for two hours. This introduces a little inaccuracy into the system, of course, but if one recalls that MIDI has managed for some twenty years now with a 7-bit range, 16 bits should offer a reasonable resolution for most situations.
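As a minimal sketch of this normalization (the function name and signature are assumptions for illustration, not HUMA code), any input range can be rescaled to the common 16-bit range like this:

```python
# Illustrative sketch, NOT HUMA's actual implementation: rescaling an
# arbitrary input range to HUMA's common 16-bit range 0..65535.

def normalize(value, in_max, in_min=0):
    """Scale a raw input value to the 16-bit range 0..65535."""
    span = in_max - in_min
    return round((value - in_min) / span * 65535)

# A fully turned MIDI controller (0..127) and a mouse at the bottom of
# a 768-pixel screen both arrive at the same normalized maximum:
midi_max = normalize(127, in_max=127)
mouse_max = normalize(767, in_max=767)
```

Both `midi_max` and `mouse_max` end up at 65535, which is exactly why a destination never needs to know which device feeds it.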
Sensor: The sensor device selected for the scene. HUMA can handle scene 1 by motion tracking, scene 2 by mouse, scene 3 by MIDI, etc. This defaults to "main", referring to the sensor you set as the main sensor in the Runtime Configuration. Change it if you want to use a different device for a scene.
The Sensor source has 7 outlets. Not all sensors will deliver data on all of them, and what arrives where again depends on the configuration, but the basics should be common among sensors. For example, the "mouse sensor" will deliver the Y position on outlet 1, the X position on outlet 2 and the scroll wheel on outlet 3. All sensors should also deliver data on outlet 6 (averaged amplitude of inputs) and outlet 7 (timewise intensity of the incoming signals).
Movie: This source delivers the current movie time in the current scene and the movie's pitch (if it is set to be dynamically changeable).
Time: Delivers the scene's timeout and exit counters as continuous streams. These counters will run regardless of whether you have any links or actions assigned to Timeout and Exit. What you do need to set, though, is a time value that serves as the normalizing base.
Wavegenerator: Three LFO waveforms. Frequency, speed and offset are adjustable, but not yet controllable at runtime.
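To make the idea concrete, here is a conceptual sketch of such an LFO (names, waveform choice and parameters are assumptions mirroring the adjustable settings mentioned above, not HUMA internals):

```python
import math

# Conceptual sketch, NOT HUMA code: an LFO emitting three classic
# waveforms in the 16-bit output range used by all patchbay sources.

def lfo(t, freq=1.0, offset=0.0, shape="sine"):
    """Return a 16-bit sample of the chosen waveform at time t (seconds)."""
    phase = (t * freq + offset) % 1.0           # normalized phase 0..1
    if shape == "sine":
        level = (math.sin(2 * math.pi * phase) + 1) / 2
    elif shape == "triangle":
        level = 1 - abs(2 * phase - 1)          # rises 0..1, falls 1..0
    else:                                       # sawtooth ramp
        level = phase
    return round(level * 65535)
```

Routed into a destination, such a generator drives it exactly like any sensor would, just periodically instead of interactively.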
Network: No matter what sensor a scene uses, dynamic data coming from other runtimes on the network is always available here.
ActionScript Return: When you build an ActionScript layer using the templates, you can send dynamic data back to HUMA from your scripts. This is where it arrives.
FFT: Audio Input that delivers overall, bass, mid and treble levels.
Click: HUMA defines a click as a rapid movement, that is: a certain amount of the input range is crossed in a short time. The patchbay's Click destination allows you to generate the resulting click trigger from any of the available sources.
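The rule can be sketched as follows (a hypothetical illustration: the function name, span and window values are assumptions, not HUMA's actual detector):

```python
# Hypothetical sketch of the click rule described above: a click fires
# when the input crosses a chosen fraction of the 16-bit range within a
# short window of consecutive ticks. Thresholds here are invented.

def detect_clicks(samples, span=20000, window=3):
    """samples: time-ordered 16-bit input values, one per tick.
    Returns the tick indices at which a click trigger would fire."""
    clicks = []
    for i in range(len(samples)):
        start = max(0, i - window)
        # rapid movement: a large jump inside the window counts as a click
        if abs(samples[i] - samples[start]) >= span:
            clicks.append(i)
    return clicks
```

Slow drifts never trip the detector; only a fast sweep through the range does, which is what makes clicks deliverable even by contactless sensors.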
HUMA: This is the entry point to the template behaviors, called HUMAmodes. Not all of them will make use of four dynamic streams, but you can again route any of the sources in here to, for example, make a movie follow the sine wave output of the wavegenerator.
Exit: Exit is defined as the absence of input. Usually this is used to check whether someone is active in the sensorfield. As soon as no more activity is detected, a counter starts running (see the Time source above) and executes the exit trigger when the adjusted time has been reached. Detected activity will stop and reset this counter. The Exit destination here allows you to monitor not only the sensorfield but, for example, an audio signal (the counter will start when the signal drops below a threshold). This destination allows multiple inputs.
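The counter logic above can be sketched like this (class and parameter names are assumptions for illustration; the silence threshold is invented):

```python
# Sketch of the exit-counter behavior described above, NOT HUMA code:
# sustained silence beyond the configured time fires the exit trigger,
# and any detected activity resets the counter.

class ExitCounter:
    def __init__(self, exit_time, threshold=1000):
        self.exit_time = exit_time    # seconds of silence before Exit fires
        self.threshold = threshold    # input below this counts as "no input"
        self.silent_for = 0.0

    def update(self, level, dt):
        """Feed one input sample; returns True when Exit should fire."""
        if level < self.threshold:
            self.silent_for += dt     # silence: keep counting
        else:
            self.silent_for = 0.0     # activity resets the counter
        return self.silent_for >= self.exit_time
```

Feeding an audio level instead of a sensor stream gives exactly the "signal drops below threshold" variant mentioned above.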
Grid: Grids define virtual switches in the sensorfield (see -> Grids). The ability to feed selectable input sources to these grids offers vast possibilities: for instance, one could connect the sensor's y-axis to the grid's y-input and any kind of timer to its x-input. As time passes, the grid can then execute different triggers depending on both the elapsed time and how close the user is to the screen. This destination also lets you select the grid to use for the scene.
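The mapping from two dynamic inputs to one grid cell can be sketched as follows (an assumed illustration, not HUMA's implementation):

```python
# Illustrative sketch, NOT HUMA code: a grid maps two normalized 16-bit
# inputs to one cell, and each cell can carry its own trigger.

def grid_cell(x, y, cols, rows):
    """Map two 16-bit inputs (0..65535) to a (col, row) cell index."""
    col = min(cols - 1, x * cols // 65536)
    row = min(rows - 1, y * rows // 65536)
    return col, row

# With a timer on x and the sensor's y-axis on y, the active cell (and
# thus the fired trigger) changes with both time and user position.
```

This is the essence of the timer-plus-distance example above: the same y position selects different triggers at different times.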
Mouse Hijacker: This does what the name says: it allows you to take over the mouse pointer and attach it to any input routed here. Besides being really funny to see, this is useful mainly for sprite actions and animations, as most of the QuickTime-defined events that sprites can use are mouse-based (like mouseover, mouseout etc.). When you use the mouse itself as a sensor, this does nothing.
ActionScript Cursor: Sends connected data to Flash layers built along the supplied templates. See -> Cursors for details.
Sound: Four streams that are routed to HUMA's audio engine. On the scene builder's sound page you can attach actions like dynamic fades etc. to these streams.
Network & Outputs: HUMA may send dynamic data to the network and/or two physical output ports when you feed it in here. Expand the destination's display and select which output to use (outputs need to be set up in the Runtime Configuration). This, again, allows multiple connections. For network sends you can set the ID of the receiving runtime by selecting a patchline and typing the ID into the text field.
Media: Some parameters of the media present in a scene can be directly modulated via this destination. Click the edit button to set up the details. Accessible parameters include opacity, zoom, clipping, volume, pan etc.
Besides the destinations mentioned, more may appear here depending on other settings. For example, if you create a controllable embedded movie, its control input will appear here. By right-clicking the destinations' panel below the last port you can also set up ports that feed dynamic data to variables.
Some simple examples:
To make a movie play when the user is inactive, use a pitch or position HUMAmode and connect the Time source's exit output to the main HUMA input.
To voice-control scene changes, feed the audio level into Click and adjust the click thresholds.
Use the wavegenerator to animate sprites (either by taking over the mouse and attaching mouse actions to sprites, or by making them follow variables). Then let the sensor control another sprite and set up collision actions for that one. There is your basic video game.