Hydra workshop materials
Editor
There are a few ways to use the hydra environment:
- Open the online editor in your browser (the easiest, recommended to start with)
- A plugin for your favourite text editor (atom/pulsar/vscode)
- Launch your own server/Embed into a VR canvas/generative NFT/the possibilities are endless…
Working with the editor
Launching code:
Alt + Enter: Launch a block of code (separated by lines of whitespace)
Ctrl + Enter: Launch a line of code
Ctrl + Shift + Enter: Launch all the code on screen
How I learned hydra: the algorithm for self teaching
- Open the editor
- Click on the shuffle button (top-right corner) until you find something interesting or beautiful
- Change the numbers in the parameters, or add/remove functions in the code
- Launch the modified sketch, see what changed, and try to figure out the logic behind how it works by reading about the functions in the resources
- Go to step 3
The structure of a sketch
A simple sketch:
osc(20, 0.1, 1) // Texture
.rotate(0, 0.3) // Transformation
.repeat(6, 6) // Transformation
.kaleid() // Transformation
.out(o0) // Sending the result to the output labelled o0
A texture is the base object we start a sketch with. Anything you can display on the screen in hydra is a texture. Next, we apply transformations sequentially to the texture, chained with dots. The result is then sent to one of hydra’s outputs through the .out() function; the default output is labelled o0.
Textures
osc(frequency, sync, offset): Oscillator, a one-dimensional wave displayed in two dimensions
noise(scale, offset): Random noise
shape(sides, radius, smoothing): Geometric shape, a regular polygon
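For example, each of the following lines can be launched on its own (Ctrl + Enter) to see the three texture types; the parameter values are arbitrary starting points to tweak:
osc(20, 0.1, 0.8).out()    // striped, slowly scrolling oscillator
noise(3, 0.2).out()        // smooth animated noise
shape(5, 0.4, 0.01).out()  // a pentagon with hard edges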
Transformations
Geometry
rotate(angle, speed)
pixelate(pixelX, pixelY)
kaleid(nSides)
scale(factor)
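A sketch chaining these geometry transforms onto an oscillator (the parameter values are just a starting point to experiment with):
osc(10, 0.1, 1)
  .rotate(0.5, 0.1)   // rotate by 0.5 radians, drifting over time
  .kaleid(4)          // mirror into 4 kaleidoscope segments
  .pixelate(40, 40)   // sample down to a 40x40 grid
  .scale(1.5)         // zoom in
  .out(o0)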
Color
hue(hue): Hue shift
thresh(threshold): Thresholding by brightness
colorama(shift): HSV shift
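And a sketch combining the colour transforms (again, arbitrary values to play with):
noise(3)
  .thresh(0.4)      // keep only areas brighter than 0.4, as black and white
  .colorama(0.3)    // cycle the result through HSV space
  .hue(0.1)         // small additional hue shift
  .out(o0)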
Outputs
Hydra has four output channels (o0, o1, o2, o3), which we can use separately or mix together. The render() function controls which channel is displayed on the screen: calling render() without any arguments displays all four channels, and render(o0) displays only o0.
osc().out(o0)
noise().out(o1)
src(o0).diff(o1).out(o2)
src(o2).invert().out(o3)
render()
Additional inputs
Additional inputs first need to be initialized, and can then be accessed as textures with the src() function.
Webcam
s0.initCam()
src(s0).out()
Screen capture (full screen, a window, or a browser tab)
s0.initScreen()
src(s0).out()
Operators: mixing textures
So far we’ve looked at functions that transform a single texture. Let’s now consider binary operators, which combine and mix two textures. Example:
osc(69, 0.1, 1)
.rotate(0, 0.3)
.kaleid(6)
.out(o0)
noise(4).out(o1)
src(o0).diff(o1).out(o2)
src(o1).blend(o0, 0.3).out(o3)
render()
Modulation
texture1.modulate(texture2, amount): Renders texture1 shifted based on the brightness of texture2 in the same area. The second parameter controls the strength of the shift.
Let’s launch this example:
osc(69, 0.1, 1)
.kaleid()
.out(o0)
shape(4).out(o1)
src(o0).modulate(o1).out(o2)
src(o1).modulate(o0).out(o3)
render()
Dynamically changing parameters
In all the code we’ve written so far, the parameters in the hydra functions have been static numbers. We can instead define these parameters dynamically, as mathematical functions of time, random numbers, MIDI inputs, audio inputs, and anything else we can load into javascript. There’s a special syntax for defining dynamic parameters:
// Full syntax
osc(function(){return 100 * Math.sin(time * 0.1)}).out()
// Shortened
osc(() => (100 * Math.sin(time * 0.1))).out()
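For instance, assuming the online editor (where time, the mouse object with mouse.x and mouse.y, and plain JavaScript such as Math.random() are all available), parameters can follow the mouse or change randomly; launch each line separately with Ctrl + Enter:
// Oscillator frequency follows the horizontal mouse position
osc(() => 10 + mouse.x * 0.1, 0.1, 1).out()
// A new random rotation angle is picked every frame
noise(3).rotate(() => Math.random() * 6.28).out()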
Working with audio
Now we’re ready to make our sketches audio-reactive! In hydra we work with audio through the a object:
// Parameters
a.show() // Show the loudness of our frequency bins
a.setBins(8) // Set the number of frequency bins
a.setSmooth(0.5) // Set the strength of smoothing in the interval [0,1]
osc(31, 0.1)
.modulate(noise(5))
.thresh(()=>0.8-3*a.fft[1]) // The threshold parameter depends on frequency bin 1
.out()
Feedback
When we use the output of the sketch as an input for some other part of the sketch, we have a feedback process. Stated roughly, the feedback algorithm is:
next_frame = some_transformation(previous_frame)
The transformation and the way in which we include content from the previous frame defines the visual characteristics of our feedback process.
An example of feedback using the entire output of the screen:
s0.initScreen()
src(s0).scale(()=>1+0.1*Math.sin(time)).out()
A more subtle example where we ‘anchor’ the feedback process by blending it with a texture:
src(o0)
  .scale([1.02, 0.99, 1])
  .rotate(() => 0.1 * Math.sin(time))
  .blend(
    osc(20, 0.1, 1)
      .mask(shape(3)
        .scale(1, 0.7)
        .modulate(noise(2), 0.05)),
    0.32
  )
  .out()
My favourite trick in hydra: feedback where new layers are added through a threshold-based mask.
s0.initCam()
src(o0)
  .scale(1.01)
  .layer(
    src(s0).mask(src(s0).thresh().invert())
  )
  .out()