<?xml version="1.0" encoding="utf-8" ?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:wfw="http://wellformedweb.org/CommentAPI/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>satyarth.me</title><atom:link href="http://satyarth.me/feed.xml" rel="self" type="application/rss+xml"></atom:link><link>http://satyarth.me</link><description>-</description><pubDate>Sat, 22 Nov 2025 01:00:00 +0100</pubDate><generator>Wintersmith - https://github.com/jnordberg/wintersmith</generator><language>en</language><item><title>Random Sound Walks with P5.js</title><link>http://satyarth.me/articles/random-walks/</link><pubDate>Sat, 22 Nov 2025 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/random-walks/</guid><author></author><description>&lt;p&gt;&lt;strong&gt;Random Walks&lt;/strong&gt; is an &lt;a href=&quot;https://gitlab.com/satyarth/random-walks&quot;&gt;open source&lt;/a&gt; generative audio-visual installation that tries to capture the vibe of a sound walk in an area by combining field recordings with OpenStreetMap data. It was originally developed for Moscow’s &lt;a href=&quot;https://cryptography-museum.ru/&quot;&gt;Museum of Cryptography&lt;/a&gt; as part of the &lt;a href=&quot;https://gorodgovorit.moscow/&quot;&gt;«Город говорит»&lt;/a&gt; («The City Speaks») project. Given geo-tagged field recordings, an abstract observer randomly traverses the city’s streets and triggers sounds. Here’s how it looks and sounds in action:&lt;/p&gt;
&lt;video width=&quot;640&quot; height=&quot;360&quot; controls&gt;
  &lt;source src=&quot;/articles/random-walks/zvukozavr.webm&quot; type=&quot;video/webm&quot;&gt;
Your browser does not support the video tag.
&lt;/video&gt;

&lt;p&gt;A library of over 150 sounds was assembled during our field recording workshop, using regular recorders as well as piezoelectric mics, ultrasound mics, geophones, hydrophones, and EMF pickups. Here’s a small-scale browser-based &lt;a href=&quot;/articles/random-walks/random-walk.html&quot;&gt;demo&lt;/a&gt; using placeholder sounds.&lt;/p&gt;
&lt;p&gt;Here’s the &lt;a href=&quot;https://gitlab.com/satyarth/random-walks&quot;&gt;source code&lt;/a&gt; for the installation. If you end up repurposing it for your own needs or need help tweaking it, do &lt;a href=&quot;http://satyarth.me/#contact&quot;&gt;reach out&lt;/a&gt;; I’d love to hear from you!&lt;/p&gt;
&lt;h2 id=&quot;original-description&quot;&gt;Original description&lt;/h2&gt;
&lt;blockquote&gt;
&lt;p&gt;Random Walks&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Satyarth Mishra Sharma, Moscow Noise Manufactory&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Meandering through the city soundscape. The work consists of field recordings by the participants of the laboratory &lt;em&gt;‘Symphony of the big city’&lt;/em&gt;, combined with the p5.js library and OpenStreetMap data. A ‘soundosaur’ drifts around the Marfino district of Moscow with a weighted random walk, triggering sounds as it passes locations where they were recorded. Thus, the ‘soundosaur’ assembles a unique, generative audio collage every time. This soundscape is hyperreal - it exists beyond the boundaries of existing locations and includes sounds outside of the audible range - ultrasound, infrasound, electromagnetic oscillations, vibrations of water, soil, concrete. &lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Сатьярт Мишра Шарма, Московская Шумовая Мануфактура&lt;/p&gt;
&lt;/blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Дрейф по звуковому ландшафту города. Работа создана из звуков, записанных участниками лаборатории «Симфония большого города» с использованием библиотеки p5.js и данных с Open Street Maps, и представляет собой карту района Марфино, по которой путешествует «звукозавр». Он строит случайный маршрут по городу с помощью алгоритма weighted random walk и запускает воспроизведение звуков, проходя мимо точек на карте, где они были записаны. Таким образом, «звукозавр» каждый раз собирает уникальный звуковой коллаж. Это пространство гиперреально: оно выходит за пределы существующих мест и включает в ландшафт звуки вне зоны слышимости — ультразвуковые, инфразвуковые, электромагнитные колебания, вибрации почвы, бетона и арматуры.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2 id=&quot;under-the-hood&quot;&gt;Under the hood&lt;/h2&gt;
&lt;p&gt;Everything you see, including audio playback/processing, is done in P5.js. Map data is loaded from &lt;a href=&quot;https://www.openstreetmap.org/&quot;&gt;OpenStreetMap&lt;/a&gt; via the &lt;a href=&quot;https://overpass-turbo.eu/&quot;&gt;overpass turbo API&lt;/a&gt;. We provide a set of waypoints that define the map’s boundaries and mark areas of interest dense with sounds, and the observer follows this algorithm:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Walk to the end of the path I’m on&lt;/li&gt;
&lt;li&gt;With probability &lt;code&gt;p&lt;/code&gt;, choose the path whose end is closest to my next waypoint. Otherwise, choose a random path that starts within some radius of where I am.&lt;/li&gt;
&lt;li&gt;If I get close to the waypoint I was heading towards, randomly choose another waypoint.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;There’s no explicit graph of the map paths being built or traversed; the algorithm is greedy and chaotic.&lt;/p&gt;
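&lt;p&gt;A minimal sketch of the waypoint-biased step choice (the names and data layout here are illustrative, not taken from the installation’s source):&lt;/p&gt;

```javascript
// Hypothetical sketch of the waypoint-biased path choice described above.
// A 'path' is {start: [x, y], end: [x, y]}; names are illustrative only.
const dist = (a, b) => Math.hypot(a[0] - b[0], a[1] - b[1]);

function nextPath(paths, pos, waypoint, p, radius) {
  // Candidate paths start within `radius` of the observer's position.
  const near = paths.filter((path) => radius >= dist(path.start, pos));
  if (near.length === 0) return null;
  if (Math.random() > p) {
    // With probability 1 - p: wander onto a random nearby path.
    return near[Math.floor(Math.random() * near.length)];
  }
  // With probability p: greedily pick the path ending closest to the waypoint.
  return near.reduce((best, path) =>
    dist(path.end, waypoint) > dist(best.end, waypoint) ? best : path);
}
```

&lt;p&gt;Stepping this function repeatedly, and re-rolling the waypoint whenever the observer gets close to it, gives the drifting-but-directed motion seen in the video.&lt;/p&gt;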
</description></item><item><title>Noise Web</title><link>http://satyarth.me/articles/noise-web/</link><pubDate>Fri, 22  Aug 2025 02:00:00 +0200</pubDate><guid isPermaLink="true">http://satyarth.me/articles/noise-web/</guid><author></author><description>&lt;p&gt;Inspired by the incredible rope webs of &lt;a href=&quot;https://www.youtube.com/@Charlieswebs&quot;&gt;Charlie’s Webs&lt;/a&gt;, my boi Lyosha and I decided to try making our own rope web for a tiny festival on the outskirts of Moscow. After an afternoon of work, we had a decent five-pointed web that could hold 10 people. Since we had the hardware from my &lt;a href=&quot;https://satyarth.me/articles/cadence-clock-cyfest/&quot;&gt;bike synthesizer&lt;/a&gt; project with us, we decided to hook up its gyroscope to the web in order to sonify people’s movements and make it an interactive installation. So the web movements were picked up by the gyroscope, streamed as OSC messages over WiFi with an ESP32, and then used to trigger and modulate various sounds in VCV Rack. Here’s the result:&lt;/p&gt;
&lt;iframe width=&quot;566&quot; height=&quot;1006&quot; src=&quot;https://www.youtube.com/embed/NC10tPILaKk&quot; title=&quot;Noise Web (rope web + VCV Rack + gyroscope)&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; referrerpolicy=&quot;strict-origin-when-cross-origin&quot; allowfullscreen&gt;&lt;/iframe&gt;

&lt;p&gt;We also added projections above the web that react to movement for a maximally immersive festival viewing experience for the flies that we trapped in our web.&lt;/p&gt;
&lt;video width=&quot;640&quot; height=&quot;360&quot; controls&gt;
  &lt;source src=&quot;/articles/noise-web/web_projections.webm&quot; type=&quot;video/webm&quot;&gt;
Your browser does not support the video tag.
&lt;/video&gt; </description></item><item><title>Crack Plant Energy</title><link>http://satyarth.me/articles/crack-plant-energy/</link><pubDate>Sun, 09 Mar 2025 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/crack-plant-energy/</guid><author></author><description>&lt;p&gt;&lt;em&gt;This post is a work in progress&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Recommended music to listen to while reading this: &lt;a href=&quot;https://youtu.be/5pQVY_4rXSM?t=1398&quot;&gt;Skagos - Anarchic&lt;/a&gt; (especially Movement IV)&lt;/em&gt;  &lt;/p&gt;
&lt;p&gt;Every time I see a plant growing out of a crack in the ground, rocks, asphalt, or the side of a building - somewhere it has no place being - it fills me with a sense of delight and joy. I’m sure this appreciation is widespread - but where does it come from, and what does it mean? I call this vibe I like so much &lt;em&gt;&lt;strong&gt;‘crack plant energy’,&lt;/strong&gt;&lt;/em&gt; and in this love letter to plants growing out of cracks I’ll attempt to distill what I find so captivating about it. We’ll touch on themes of evolutionary biology, a thermodynamic view of ecology, solarpunk, the human condition, satanism, and black metal.  &lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/articles/crack-plant-energy/gce_berlin.png&quot; alt=&quot;me.png&quot;&gt;&lt;/p&gt;
&lt;p&gt;The short version - crack plant energy is:  &lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Resilience and adaptability - thriving in the face of hostile conditions, with creative approaches in this pursuit.  &lt;/li&gt;
&lt;li&gt;Recognition of how things form - their history, how it shapes them, and their unique qualities.  &lt;/li&gt;
&lt;li&gt;A reminder of the fragility of our concrete wastelands, and a glimpse of rebirth from their ashes.  &lt;/li&gt;
&lt;li&gt;Something about entropy  &lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;h2 id=&quot;origin-stories-a-link-to-the-past&quot;&gt;Origin stories: a link to the past&lt;/h2&gt;
&lt;p&gt;“Being comfortable growing out of a crack” is an &lt;strong&gt;&lt;em&gt;ecological niche&lt;/em&gt;&lt;/strong&gt;, a place in the ecosystem that a species occupies through adaptations which enable it to thrive there. Niches are shaped by adaptation to limited resources and environmental stressors, along with competitive and symbiotic relationships within their ecosystems, and the species that occupy them evolve guided by these factors.  &lt;/p&gt;
&lt;p&gt;Some adaptations plants evolve to be comfy growing out of cracks include:  &lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Specialized root systems to anchor themselves in crevices and extract moisture and nutrients from them.  &lt;/li&gt;
&lt;li&gt;Extreme temperature, moisture or drought tolerance to survive in periods of hardship and harsh conditions.  &lt;/li&gt;
&lt;li&gt;Seed dispersal mechanisms that allow them to find and lodge in cracks, environments where they may outcompete generalist plants and propagate themselves.  &lt;/li&gt;
&lt;li&gt;Symbiotic relationships with pioneer species like mosses or lichens which stabilize and enrich their environment with respect to the resources they need.  &lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;These adaptations don’t arrive all at once - plants obtain them through a long, gradual process of co-evolution with their ecosystems. Their evolutionary history can be traced to simpler, more widespread species that occur in more forgiving conditions, gradually adapting to conquer previously unclaimed territory and solidify their claim on an ecological niche.  &lt;/p&gt;
&lt;p&gt;TODO: another example, more cracky&lt;/p&gt;
&lt;p&gt;While not strictly crack plants, check out these boreal forests in Karelia. Trees, ferns and mosses living their best lives on the side of a cliff overlooking Lake Onega:  &lt;/p&gt;
&lt;img src='gce_karelia_2.jpg' width='666'&gt;

&lt;!-- ![me.png](gce_karelia_2.jpg) --&gt;

&lt;!-- Another cliff over a smaller lake: --&gt;
&lt;!-- ![me.png](gce_karelia_3.jpg) --&gt;

&lt;p&gt;A whole pine forest on the rocky banks of a river near Girvas:  &lt;/p&gt;
&lt;img src='gce_karelia_0.jpg' width='666'&gt;
&lt;img src='gce_karelia_1.jpg' width='666'&gt;

&lt;p&gt;The ground is mostly bare rock with the thinnest layer of accumulated detritus and soil, as you can see from the fallen tree, and it’s covered in snow for most of the year. Yet the short summer burst of sunlight and productivity allows the system as a whole to survive and burst forth with life come spring, hosting large animals like bears and moose. Without mosses and lichens to kickstart the process of accumulating organic matter, these spots would be barren wastelands. Each species has its own place within the system, and the diversity of species and their interactions with each other make the system as a whole greater than the sum of its parts.  &lt;/p&gt;
&lt;p&gt;We’ve seen a few examples of how evolutionary processes guide adaptation and specialization to ecological niches. Evolutionary biologists have uncovered these laws through an empirical approach - historically by observing and dissecting organisms (including fossilized ones), and figuring out common threads that link their evolutionary histories. More recently, the structure of the evolutionary tree of life has been modelled more explicitly with phylogenetic studies that look at the overlaps in species’ genetic code to figure out points of divergence. So we have a pretty good idea of how the &lt;a href=&quot;https://www.onezoom.org/&quot;&gt;tree of life&lt;/a&gt; on our planet came to be the way we see it today (except for the very beginning) - how it matured from simple organisms to more complex ones, how it adapted to changing conditions on our planet with branches being chopped off by extinction events and new ones sprouting to occupy previously unreachable niches. This is very cool, but it feels incomplete - like getting a peek into the inner workings of the mechanism, but being oblivious to the principles that went into its design. And for a more fundamental view, I’m afraid we’ll have to turn to physics.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id=&quot;entropic-ecology-guided-by-chaos&quot;&gt;Entropic ecology: guided by chaos&lt;/h2&gt;
&lt;p&gt;The question of what life is has long fascinated physicists. Notably, Erwin Schrödinger popularized the idea of looking at life through the lens of thermodynamics and entropy in his 1944 book &lt;a href=&quot;https://archive.org/details/whatislifeothers00schr/&quot;&gt;‘What is Life’&lt;/a&gt; &lt;em&gt;(not sure I’d be thinking about the fundamental nature of life if I were around in 1944, but hey, you go Erwin!)&lt;/em&gt;.&lt;/p&gt;
&lt;h3 id=&quot;thermodynamic-entropy&quot;&gt;Thermodynamic entropy&lt;/h3&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;Entropy&lt;/em&gt;&lt;/strong&gt; is a measure of disorder within a system. High entropy = chaotic, low entropy = ordered. For example, take water. In ice form it has a neat crystal structure (low entropy), in liquid form its molecules move around within the volume of the liquid (higher entropy), and when it evaporates to steam its molecules bounce around even more chaotically (even higher entropy). These transitions are driven by an energy input in the form of heat. The &lt;a href=&quot;https://en.wikipedia.org/wiki/Second_law_of_thermodynamics&quot;&gt;Second Law of Thermodynamics&lt;/a&gt; roughly states that the entropy of a closed system never decreases. Any local decrease in entropy is only made possible by a larger increase of entropy elsewhere. For example, we can freeze water into ice, lowering its entropy, but freezers (and any cooling mechanisms) consume energy in their operation and give off heat, which leads to an overall increase in entropy.&lt;/p&gt;
&lt;p&gt;From a thermodynamic point of view, living organisms can be seen as &lt;strong&gt;&lt;em&gt;self-replicating dissipative structures&lt;/em&gt;&lt;/strong&gt;, consuming free energy through their metabolism and dissipating it to their environment, increasing entropy. Plants capture sunlight (low-entropy energy) and convert it into glucose (low-entropy molecules storing chemical energy) at the cost of increasing overall entropy by dissipating heat and water vapor. Herbivores get their energy by consuming and metabolizing these plants, and are in turn a food source for carnivores. At every stage in this food chain, around 90% of the energy is lost as a waste byproduct of metabolic processes. This is why animal-based foods are generally more energy-dense than plant-based ones - the animals have done the work of concentrating energy for us. To sustain large and complex organisms, an ecosystem needs to be efficient at capturing energy (and generating entropy as a byproduct). Trophic pyramid from &lt;a href=&quot;https://pressbooks.openeducationalberta.ca/planetearth/chapter/ecology-and-biogeogrphy/&quot;&gt;Open Education Alberta&lt;/a&gt;:&lt;/p&gt;
&lt;img src='trophic_pyramid.png' width='700'&gt;
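&lt;p&gt;As a toy illustration of how this loss compounds (the numbers are made up - only the factor-of-ten rule matters), the energy available at each trophic level falls off geometrically:&lt;/p&gt;

```javascript
// Toy illustration of the roughly 90% energy loss per trophic level:
// each step up the food chain keeps about a tenth of the energy below it.
function trophicEnergy(producerEnergy, levels) {
  // Level i receives producerEnergy / 10^i units of energy.
  return Array.from({length: levels}, (_, i) => producerEnergy / Math.pow(10, i));
}

console.log(trophicEnergy(10000, 4)); // [10000, 1000, 100, 10]
```

&lt;p&gt;Four levels in, only a thousandth of the original energy remains - which is why apex predators are always rare.&lt;/p&gt;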

&lt;p&gt;Quoting from &lt;a href=&quot;https://www.mdpi.com/1099-4300/25/3/405&quot;&gt;Entropy, Ecology and Evolution: Toward a Unified Philosophy of Biology&lt;/a&gt;:  &lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;A dissipative structure is something that builds and maintains order by consuming free energy and therefore creating more disorder in its environment. Dissipative structures arise as a consequence of dispersal and degradation of energy and associated increase in entropy and disorder. The structure is sustained by the flow of energy and as soon as that ceases the structure decays. When the process producing the dissipative structure occurs, the rate of generation of entropy in the universe is increased because energy is being dissipated more rapidly by the dissipative structure than it would in its absence. In the case of organisms, the entropy of the universe is increasing more rapidly as a result of photosynthesis and the biochemical pathways of metabolism and tissue growth than it would if the photons had fallen on inanimate earth. As soon as the source of energy is removed (photons from the sun), the dissipative structures of organisms will die and decompose. They are thermodynamic instabilities driven by the flow of energy and the transduction and degradation of that energy.  &lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Ecosystems are networks of organisms, and this principle applies at ecosystem scale as well. The characteristics of a species or ecosystem emerge through (co)evolution acting as an optimization method with the maximization of free energy dissipation (and the increase of entropy) being the optimization target - the system is constantly looking for configurations that enable energy to flow more freely. Evolving species that grow in inhospitable environments like cracks is one such configuration that maximizes harnessed energy.&lt;/p&gt;
&lt;h3 id=&quot;information-entropy&quot;&gt;Information entropy&lt;/h3&gt;
&lt;p&gt;Let’s now shift from considering thermodynamic entropy to information entropy. Though it’s hard to quantify the ‘information’ an ecosystem contains, we can think of it as being encoded in several layers:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;&lt;em&gt;Species abundance and diversity:&lt;/em&gt;&lt;/strong&gt; The number of species in an ecosystem, the abundance of their members, and their distributions in time and space encode a significant amount of information about the system. This layer is perhaps the simplest to investigate, as there’s a &lt;a href=&quot;https://www.coastalwiki.org/wiki/Measurements_of_biodiversity#Shannon-Wiener_diversity_index&quot;&gt;well-defined mathematical foundation&lt;/a&gt; for quantifying it and it can be measured by observing an ecosystem and counting the occurrences of different species. &lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;em&gt;Genetic:&lt;/em&gt;&lt;/strong&gt; The genetic code of the organisms that comprise the ecosystem. For a singular species, changes in their DNA that make them more specialized represent a stricter ordering of genetic information, locally decreasing information entropy. However, these adaptations usually co-occur in a rich and diverse ecosystem, leading to greater variation (and higher entropy) in the system as a whole.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;em&gt;Cultural and behavioural knowledge:&lt;/em&gt;&lt;/strong&gt; This layer has to do with behaviour, memory, and culture. For example, a herbivore’s favourite foraging spots, &lt;a href=&quot;https://www.cell.com/current-biology/fulltext/S0960-9822(11)00291-0&quot;&gt;whale songs in different dialects&lt;/a&gt;, and the locations of breeding grounds in migratory animals can be considered a form of culture.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;&lt;em&gt;Inter-species relationships&lt;/em&gt;&lt;/strong&gt;: (eg. mycorrhizal networks)&lt;/li&gt;
&lt;/ul&gt;
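&lt;p&gt;For instance, the Shannon-Wiener index from the first layer above is straightforward to compute from raw species counts - a quick sketch (the function name is mine):&lt;/p&gt;

```javascript
// Sketch of the Shannon-Wiener diversity index mentioned above:
// H = -sum(p_i * ln(p_i)) over the observed species proportions p_i.
function shannonIndex(counts) {
  const total = counts.reduce((a, b) => a + b, 0);
  return -counts
    .filter((c) => c > 0) // empty categories contribute nothing
    .reduce((h, c) => h + (c / total) * Math.log(c / total), 0);
}

// Four equally abundant species: maximal diversity for 4 species, ln(4).
console.log(shannonIndex([25, 25, 25, 25])); // approximately 1.386
```

&lt;p&gt;A single dominant species drives the index toward zero; evenly spread abundances maximize it - higher information entropy for a richer, more even ecosystem.&lt;/p&gt;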
&lt;p&gt;This &lt;strong&gt;&lt;em&gt;maximum entropy (MaxEnt)&lt;/em&gt;&lt;/strong&gt; approach applies not only from the perspective of thermodynamics but also that of &lt;a href=&quot;https://pmc.ncbi.nlm.nih.gov/articles/PMC7515227/&quot;&gt;information theory&lt;/a&gt;, and is often applied in empirical ecological studies. By maximizing information entropy - a measure of uncertainty in probability distributions - MaxEnt derives theoretical predictions about the scaling relationships between species abundance, metabolic rates, spatial distributions, and energy allocation without relying on mechanistic assumptions about species interactions.&lt;/p&gt;
&lt;p&gt;For a treatment of how fractal spatio-temporal patterns emerge in nature from energetic constraints by accounting for how quickly sessile organisms grow and die mediated by competition for fluctuating resources, check out the paper &lt;a href=&quot;https://www.pnas.org/doi/10.1073/pnas.2020424118&quot;&gt;Growth, death, and resource competition in sessile organisms&lt;/a&gt;.&lt;/p&gt;
&lt;hr&gt;
&lt;h2 id=&quot;anthropogenic-cracks-visions-of-a-solarpunk-future&quot;&gt;Anthropogenic cracks: visions of a solarpunk future&lt;/h2&gt;
&lt;p&gt;While cracks in the ground or rocks can certainly be hostile environments, they’re no match for many of our barren urban landscapes covered with asphalt and concrete. And to me, plants reclaiming anthropogenic (man-made) cracks for nature are the embodiment of the solarpunk ethos.&lt;/p&gt;
&lt;p&gt; Check out this radar complex from an abandoned military site on the outskirts of Moscow (incidentally, also the location for the abstract film &lt;a href=&quot;https://www.youtube.com/watch?v=TPXfV02N6uc&amp;amp;t=1864s&quot;&gt;ШАРО-ФОМИНСК&lt;/a&gt;). It’s a ~5-storey building capped with a concrete dome that would armor the radar equipment it housed.&lt;/p&gt;
&lt;img src='gce_shary_2.jpg' width='666'&gt;
&lt;img src='gce_shary_4.jpg' width='666'&gt;

&lt;p&gt;The roof of the building, at the height of the surrounding forest’s canopy, is sprouting its own mini-forest. Close to the dome we find areas that already support trees like birch, while decaying planks of wood criss-crossing its tarred surface act as nucleation points for moss to lay the foundations of future colonization.&lt;/p&gt;
&lt;img src='gce_shary_0.jpg' width='666'&gt;
&lt;img src='gce_shary_1.jpg' width='666'&gt;
&lt;img src='gce_shary_3.jpg' width='666'&gt;

&lt;p&gt;A discussion of the implications of MaxEnt for ecologically regenerative urban design: &lt;a href=&quot;https://www.mdpi.com/2073-445X/13/9/1375&quot;&gt;Ecologically Regenerative Building Systems through Exergy Efficiency: Designing for Structural Order and Ecosystem Services&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://www.reddit.com/r/reclaimedbynature/top/?sort=top&amp;amp;t=all&quot;&gt;r/reclaimedbynature&lt;/a&gt;  &lt;/p&gt;
&lt;!-- Dave Ackley stuff: propagating code   --&gt;

&lt;hr&gt;
&lt;h2 id=&quot;the-human-connection&quot;&gt;The human connection&lt;/h2&gt;
&lt;p&gt;What is nature telling us as humans? What does it mean for a person to embody crack plant energy?  &lt;/p&gt;
&lt;hr&gt;
&lt;h2 id=&quot;tangents-conclusions&quot;&gt;Tangents, conclusions&lt;/h2&gt;
&lt;p&gt;Crack plant energy is satanic. Quoting from &lt;a href=&quot;https://churchofsatan.com/satanism-the-feared-religion/&quot;&gt;this essay&lt;/a&gt; without elaboration:  &lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The reality behind Satan is simply the dark evolutionary force of entropy that permeates all of nature and provides the drive for survival and propagation inherent in all living things. Satan is not a conscious entity to be worshipped, rather a reservoir of power inside each human to be tapped at will.  &lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;And now I’ll leave you with a quote from the black metal album linked above:  &lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The soil is the seed’s universe. It is oblivious, as it longs and strains and reaches to sprout from the ground, to what lies beyond what it has always known. But it is born to strive upward, to whatever grief or joy is beyond. As I rise above the world, as I hurtle through the sky, as I expand in every direction, I do not know what is beyond these stars or the vast and aching blackness they pierce. But I must strive, I must rise. I must go beyond. It does not matter whether it is the boundary or myself that is destroyed. I am the transgressor! May my body break these bonds or may these bonds break my body. I am the fate of the earth. All the light will come to live within me, and I must shine with it or it will die within me, and existence will cease. I am the momentum of life hurtling ever forward. I am all my brothers and sisters of every kind – all silent standing trees and mottled owls and speckled fish gliding through the light as it shimmers in the water – they are all within me and I am within them. We are a circle and we must rise!  &lt;/p&gt;
&lt;/blockquote&gt;
</description></item><item><title>Livecoding Patterns</title><link>http://satyarth.me/articles/livecoding-patterns/</link><pubDate>Mon, 06 Jan 2025 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/livecoding-patterns/</guid><author></author><description>&lt;p&gt;I’ve been livecoding in &lt;a href=&quot;https://hydra.ojack.xyz/&quot;&gt;hydra&lt;/a&gt; and &lt;a href=&quot;https://tidalcycles.org/&quot;&gt;Tidal Cycles&lt;/a&gt; since about 2020, and want to share some patterns that I never get tired of.&lt;/p&gt;
&lt;h2 id=&quot;visuals&quot;&gt;Visuals&lt;/h2&gt;
&lt;p&gt;Feedback is my bread and butter in hydra. When VJing I like to set up a patch with feedback and audioreactivity, a source video/camera feed, and just walk away and chill out while my patch does its thing and lives its own life. This is a technique visual artists have been using for ages, but I picked up how to do it in hydra from watching &lt;a href=&quot;https://www.youtube.com/@flordefuega/videos&quot;&gt;Flor de Fuega&lt;/a&gt;‘s videos.&lt;/p&gt;
&lt;p&gt;The basic premise is that we grab the video buffer’s last frame, apply some transformations, and slap some new content on top of it. We want the new content to act as a new layer, with the feedback happening under it. To do the layering, we use hydra’s &lt;code&gt;layer&lt;/code&gt;, &lt;code&gt;mask&lt;/code&gt;, and &lt;code&gt;thresh&lt;/code&gt; functions to mask the input buffer based on thresholding its intensity like so:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;src(o0)
    .modulateScale(src(s0), ()=&amp;gt;a.fft[1]*0.1)   //
    .modulateRotate(src(s0), ()=&amp;gt;a.fft[3]*0.1)  //  Transformations go here
    .hue(0.01)                                  //  Unleash your imagination!
    .layer(
        src(s0).mask(src(s0).thresh(()=&amp;gt;0.3+a.fft[2]) // s0 is the buffer containing our &amp;#39;new content&amp;#39;
        )
    )
  .out()&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;&lt;a href=&quot;https://hydra.ojack.xyz/?sketch_id=sElrmXUQeMcCp30D&quot;&gt;Here’s&lt;/a&gt; the example patch in the hydra editor - open it up, give it access to your webcam/microphone, and make some sounds at your computer!&lt;/p&gt;
&lt;h2 id=&quot;sounds&quot;&gt;Sounds&lt;/h2&gt;
&lt;p&gt;Since my main instrument is the &lt;a href=&quot;http://satyarth.me/trophallaxis&quot;&gt;tabla&lt;/a&gt;, most of my sounds are written to accompany a live tabla performance. In Indian classical music, a tabla is typically accompanied by a ‘lehra’, a repeating melody with a constant tempo, played on an instrument like the harmonium or sarangi that acts as a scaffolding for the tabla player’s rhythmic explorations. This melody starts at a root note, then ascends and descends before arriving back at the root. The lehra doesn’t change, except for its expression being modulated by the accompanying artist in the context of the performance.&lt;/p&gt;
&lt;p&gt;Livecoding works great for generative lehras! I like to pre-define a ‘root sequence’ based on the time cycle I’m playing in, and have different instruments playing that sequence and adding variations on top of it - so we get the ascending and descending structure with some improvisation to spice things up. I mostly use Tidal Cycles for generating midi sequences that I send to my DAW or VCV rack, but I’ll use &lt;a href=&quot;https://strudel.cc/&quot;&gt;strudel&lt;/a&gt; to illustrate my point here.&lt;/p&gt;
&lt;iframe
  src=&quot;https://strudel.cc/#bGV0IHJvb3RTZXEgPSAiPDAgLTEgMiA0IDcgNiAzIDE%2BIgoKc3RhY2soCiAgbihyb290U2VxCiAgICAgIC5hZGQub3V0KGNhdCgiMCIsICJbMCAyIDAgLTJdfFszIDIgMSAwXXxbNCAyIDBdfFstMiAtMV18WzAgLTcgMCAtNF0iKS5mYXN0KDIpCiAgICAgICAgICAubGVnYXRvKCIzIDEuNSIpLmRlZ3JhZGVCeSgiMCAwLjQiKS5yb29tKDEpKSkKICAgIC5zY2FsZSgiQyM6bWlub3IiKQogICAgLnNvdW5kKCJwaWFubyIpLAogIG4oIjAvOCIpLnNjYWxlKCJDIzptaW5vciIpLnJvb20oMS41KS5zb3VuZCgiaGgiKSwKICBuKHJvb3RTZXEuYWRkLm91dCgiMCBbfiBbMiAwXXwgfiBbNCAyXXwgfiBbMyAyIDFdXXwgfiBbLTIgLTJdIHwgfiB%2BIFstMiAyIDAgM10gfiIuc2xvdygyKSkpCiAgICAuZGVncmFkZUJ5KCIwLjEgMC4zIikuc2NhbGUoIkMjOm1pbm9yIikubGVnYXRvKDAuNCkucm9vbSgyKS5zb3VuZCgidmlicmFwaG9uZSIpCikuY3BtKDYwKQ%3D%3D&quot;
  width=&quot;100%&quot;
  height=&quot;420&quot;
&gt;&lt;/iframe&gt;

&lt;p&gt;In this example I’ve hardcoded the variations which get picked randomly, but in Tidal Cycles I like to use Markov chains to try and make the sequences feel more alive.&lt;/p&gt;
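&lt;p&gt;Before diving into Tidal, here’s a rough JavaScript sketch (illustrative only - not Tidal’s implementation) of the two things a Markov step needs: a row of transition weights for the current state, and a way to sample the next state from that row:&lt;/p&gt;

```javascript
// Illustrative sketch of one Markov chain step: pick the next state by
// sampling from the current state's row of the transition matrix.
// Rows don't need to sum to 1; we normalize implicitly while sampling.
const tpm = [[0.3, 0.4, 0.3],
             [0.3, 0.5, 0.2],
             [0.2, 0.3, 0.5]];

function step(matrix, state) {
  const row = matrix[state];
  const total = row.reduce((a, b) => a + b, 0);
  let r = Math.random() * total;
  for (let j = 0; j !== row.length; j += 1) {
    r -= row[j];
    if (r > 0) continue;
    return j;
  }
  return row.length - 1; // guard against floating-point leftovers
}

// Ride the Markov train for 8 steps starting from state 0.
const chain = [0];
while (chain.length !== 8) chain.push(step(tpm, chain[chain.length - 1]));
```

&lt;p&gt;Tidal’s &lt;code&gt;markovPat&lt;/code&gt;, below, produces this kind of chain as a pattern.&lt;/p&gt;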
&lt;h3 id=&quot;markov-chains-in-tidal-cycles&quot;&gt;Markov chains in Tidal Cycles&lt;/h3&gt;
&lt;p&gt;&lt;a href=&quot;https://en.wikipedia.org/wiki/Markov_chain&quot;&gt;Markov chains&lt;/a&gt; are a great way to get some structured randomness into your sounds. Tidal Cycles has an implementation of Markov chains built in, &lt;a href=&quot;https://blog.tidalcycles.org/index.html%3Fp=73.html&quot;&gt;here’s&lt;/a&gt; a post on the Tidal blog. I’ll start from the example at the end of the post and share some hacky tricks. Let’s start by defining a &lt;strong&gt;transition probability matrix&lt;/strong&gt; for a 3-state Markov chain:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;let tpm = [[0.3, 0.4, 0.3]
          ,[0.3, 0.5, 0.2]
          ,[0.2, 0.3, 0.5]]&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;&lt;code&gt;tpm[i,j]&lt;/code&gt; represents the probability of transitioning from state &lt;code&gt;i&lt;/code&gt; to &lt;code&gt;j&lt;/code&gt;. The rows don’t have to add up to 1, Tidal normalizes them for us. We’ll then feed &lt;code&gt;tpm&lt;/code&gt; to &lt;code&gt;markovPat&lt;/code&gt; to start getting some sounds:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;d1 
  $ n  (fromIntegral &amp;lt;$&amp;gt; (markovPat 8 0 tpm))
  # s &amp;quot;arpy&amp;quot;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;The first argument of &lt;code&gt;markovPat&lt;/code&gt; is the length of the chain, the second one is the initial state, and the final one is the transition probability matrix. So in this case, &lt;code&gt;markovPat&lt;/code&gt; is spitting out chains that start at state 0 (note 0), and ride the Markov train for 8 steps. What if we want to define our own notes for each state? We can index into a list of notes using &lt;code&gt;fmap&lt;/code&gt; like so:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;d1 
  $ n (fmap ([0,3,5]!!) $ (markovPat 8 0 tpm))
  # s &amp;quot;arpy&amp;quot;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;We can also use the Markov patterns to index into samples or slices:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;d1 $ loopAt 4 $ splice 4 (markovPat 8 0 tpm) $ s &amp;quot;breaks152&amp;quot;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Okay, now the fuckery begins - Tidal/Haskell purists, turn back now (or feel free to &lt;a href=&quot;http://satyarth.me/#contact&quot;&gt;tell me&lt;/a&gt; more elegant ways to achieve this). I wanted to have the states of the Markov chain be patterns instead of notes or indices, but I couldn’t figure out how to do this with a &lt;code&gt;markovPat&lt;/code&gt;, so I resorted to using &lt;code&gt;runMarkov&lt;/code&gt; to generate a list of indices (long enough to not notice repetition), using them to index into a list of patterns, gluing it all into one big string with &lt;code&gt;foldr (++) &amp;quot;&amp;quot;&lt;/code&gt;, and evaluating it with &lt;code&gt;parseBP_E&lt;/code&gt;. I know evaluating strings is bad programming practice; forgive me, for I have sinned. But hey, this is what we end up with when we run two chains in parallel:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;list1 = (fmap ([&amp;quot;[-2 0]&amp;quot;,
            &amp;quot;[0 2 0 -2]&amp;quot;,
            &amp;quot;[3 2 1 0]&amp;quot;]!!) (runMarkov 128 tpm 0 2))
list2 = (fmap ([&amp;quot;[0 2]&amp;quot;,
            &amp;quot;[0 1 0 2]&amp;quot;,
            &amp;quot;[0 1 2]&amp;quot;]!!) (runMarkov 128 tpm 1 0))
markovSeq1 = (slow 32 (parseBP_E (foldr (++) &amp;quot;&amp;quot; list1)))
markovSeq2 = (slow 32 (parseBP_E (foldr (++) &amp;quot;&amp;quot; list2)))

do
  d1 $ n (scale &amp;quot;minor&amp;quot; markovSeq1) # s &amp;quot;arpy&amp;quot; # pan 0.25
  d2 $ n (scale &amp;quot;minor&amp;quot; markovSeq2) # s &amp;quot;arpy&amp;quot; # pan 0.75&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Combining this with the earlier technique of a common ‘root sequence’, with the Markov chains providing variation:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;let rootSeq = (slow 4 &amp;quot;0 -1 1 2 4 3 2 1&amp;quot;)

do
  d1 $ degradeBy 0.3 $ n (scale &amp;quot;minor&amp;quot; (rootSeq +| markovSeq1)) # s &amp;quot;supersaw&amp;quot; # pan 0.25
  d2 $ degradeBy 0.3 $ n (scale &amp;quot;minor&amp;quot; (rootSeq +| markovSeq2)) # s &amp;quot;supersaw&amp;quot; # pan 0.75
  d3 $ n (scale &amp;quot;minor&amp;quot; (rootSeq +| &amp;quot;0!16&amp;quot;)) # s &amp;quot;supersaw&amp;quot; # amp 0.3&lt;/code&gt;&lt;/pre&gt;</description></item><item><title>Cadence Clock: A Technical Overview</title><link>http://satyarth.me/articles/cadence-clock-cyfest/</link><pubDate>Thu, 17 Oct 2024 02:00:00 +0200</pubDate><guid isPermaLink="true">http://satyarth.me/articles/cadence-clock-cyfest/</guid><author></author><description>&lt;p&gt;Building on the &lt;a href=&quot;http://satyarth.me/articles/bikesynth&quot;&gt;humble beginnings&lt;/a&gt; of my bike synthesizer project, I got the chance to develop it into a live performance for &lt;a href=&quot;https://www.cyfest.art/16-sound-performance&quot;&gt;CYFEST 16&lt;/a&gt; in Yerevan with my boi &lt;a href=&quot;https://www.instagram.com/crossroads_imcs/&quot;&gt;Rob&lt;/a&gt;. We called our project &lt;a href=&quot;https://www.instagram.com/cadenceclock/&quot;&gt;Cadence Clock: Rhythm of the Streets&lt;/a&gt;. In this post I’ll give an overview of our technical setup and decisions, and what went right and wrong.&lt;/p&gt;
&lt;p&gt;Poster by &lt;a href=&quot;https://www.instagram.com/bogdamn.it/&quot;&gt;Bogdan Boichuk&lt;/a&gt;:&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/articles/cadence-clock-cyfest/poster.jpg&quot; alt=&quot;poster&quot;&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Cadence Clock: Rhythm of the Streets is an immersive audio-visual experience navigating city streets on a fixed-gear bicycle. By equipping an otherwise minimalistic track bike with sensors, we attempt to augment the viewer’s senses to translate the visceral, connected, and synchronized flow of fixed gear cycling through city traffic into perceivable sights and sounds, archiving these feelings. The rider’s movements, navigation style and decision making — modulated by the city’s traffic patterns, geography, and infrastructure — are emphasized through audio-visual synthesis and woven into a story exploring the rhythm of our city streets, who they are for, and how they can be reclaimed.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Promo video:&lt;/p&gt;
&lt;video width=&quot;444&quot; height=&quot;789&quot; controls&gt;
  &lt;source src=&quot;/articles/cadence-clock-cyfest/promo.webm&quot; type=&quot;video/webm&quot;&gt;
Your browser does not support the video tag.
&lt;/video&gt; 

&lt;h2 id=&quot;the-setup&quot;&gt;The Setup&lt;/h2&gt;
&lt;p&gt;Our goal was a live audio-visual performance involving riding a track bike through the city, with sounds generated and livestreamed video modulated by the bike’s movement. Our extremely overcomplicated setup looked like this:&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/articles/cadence-clock-cyfest/scheme.png&quot; alt=&quot;scheme&quot;&gt;&lt;/p&gt;
&lt;p&gt;On the bike, we had an ESP32 microcontroller reading from two sensors - the cadence module and the accelerometer module. The cadence module is a Hall-effect sensor detecting the rotation of eight magnets glued to the chainring, acting as a clock for generating sounds (cadence clock, get it?). The accelerometer module is an MPU6050 attached to the handlebars that captures their movement. Two video feeds (a PoV stream from Rob, and a third-person view from our friend &lt;a href=&quot;https://savvamihaescu.com/&quot;&gt;Savva&lt;/a&gt; riding behind) were streamed using GoPros.&lt;/p&gt;
&lt;p&gt;I initially really wanted to minimize latency and possible disruptions from the internet connection dropping, and tried to stick to analog signals for both the sensor data and the video. For the sensor data, I ended up encoding it into audio as a sum of gated sine waves for the discrete cadence signal and frequency-shifted sines for the accelerometer signal, which was then decoded into MIDI with a python script at the receiving end. This encoding was done by essentially using the ESP32 as a polyphonic synthesizer, with the audio output transmitted over a UHF radio walkie-talkie. &lt;/p&gt;
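The receiving-end decoding can be sketched in a few lines of python: a Goertzel filter measures the signal power at each carrier frequency, and thresholding those powers recovers the on/off gates that get turned into MIDI events. This is a simplified illustration rather than the actual decoder from the repo; the sample rate, carrier frequencies, and threshold below are made up.

```python
import math

def goertzel_power(samples, sample_rate, freq):
    # Goertzel algorithm: power of the DFT bin nearest to `freq`
    n = len(samples)
    k = int(0.5 + n * freq / sample_rate)
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s_prev2, s_prev = s_prev, x + coeff * s_prev - s_prev2
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def decode_gates(samples, sample_rate, freqs, threshold):
    # One boolean gate per carrier frequency (gate on -> MIDI note on)
    return [goertzel_power(samples, sample_rate, f) > threshold for f in freqs]
```

Goertzel is cheaper than a full FFT when you only care about a handful of known frequencies, which is exactly the situation with a fixed set of carriers.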
&lt;p&gt;&lt;img src=&quot;/articles/cadence-clock-cyfest/helmet.jpg&quot; alt=&quot;scheme&quot;&gt;&lt;/p&gt;
&lt;p&gt;For transmitting an analog video signal, we first experimented with &lt;a href=&quot;https://shop.siyi.biz/products/siyi-hm30&quot;&gt;a system&lt;/a&gt; designed for flying drones and model aircraft. While the latency and video quality were great, we unfortunately didn’t account for how strongly this radio signal would be attenuated by obstacles in an urban environment when both the transmitter and receiver are at ground level. We then decided to scrap the analog video idea and stream directly from GoPros over mobile internet. The RTMP streams from the GoPros were decent, but latency was all over the place - we got better results setting up a &lt;a href=&quot;https://github.com/AlexxIT/go2rtc&quot;&gt;go2rtc&lt;/a&gt; server as an intermediary and sending WebRTC video to the endpoints with &amp;lt; 2 second latency.&lt;/p&gt;
&lt;p&gt;On the receiving end at the performance venue, once the sensor signals were converted into MIDI, these were fed into generative VCV rack patches coordinated by a DAW for sounds and used to modulate the video streams with &lt;a href=&quot;https://hydra.ojack.xyz/&quot;&gt;hydra&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id=&quot;how-d-it-go-&quot;&gt;How’d it go?&lt;/h2&gt;
&lt;p&gt;Not quite according to plan… everything went great during the start of the performance, but then Murphy’s law struck and things started falling apart.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;A couple of days before the performance, we realized the RF signals from the walkie-talkie transmission were causing strong interference on the ESP32, messing up the analog readings from the Hall sensor and I2C communication with the accelerometer module. With not much time to reliably figure out and test electromagnetic shielding, we ditched the walkie-talkies and decided to transmit audio over a VoIP call, which worked pretty well. However, during the performance we ended up using the VoIP client on a different (untested) OS, which resulted in wildly fluctuating audio levels (presumably due to some noise suppression feature) that our decoder couldn’t handle well. This made our MIDI signals very unstable.&lt;/li&gt;
&lt;li&gt;We needed a stable bandwidth of ~8 Mbit/s of upload for our video streams, which the mobile internet handled well during our tests. Sadly, the internet gods weren’t on our side on the day of the performance and the video streams kept dropping. The lag from dropped frames would also keep accumulating, leading to synchronization issues.&lt;/li&gt;
&lt;li&gt;The individual parts of our setup worked great in isolation, but we lacked the diligent integration testing required to work out the kinks in such a complex interconnected system.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Excerpt from move2armenia’s &lt;a href=&quot;https://www.instagram.com/p/DC4h28jMB1g/&quot;&gt;video&lt;/a&gt; from CYFEST 16:&lt;/p&gt;
&lt;video width=&quot;540&quot; height=&quot;960&quot; controls&gt;
  &lt;source src=&quot;/articles/cadence-clock-cyfest/cc.webm&quot; type=&quot;video/webm&quot;&gt;
Your browser does not support the video tag.
&lt;/video&gt; 

&lt;h2 id=&quot;what-did-we-learn-&quot;&gt;What did we learn?&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Don’t overcomplicate things.&lt;/li&gt;
&lt;li&gt;Our aversion to sending our sensor signals over the internet was misplaced - a day after the performance I managed to send them from the ESP32 via its inbuilt WiFi modem as OSC messages and it worked perfectly. I regret not starting with that.&lt;/li&gt;
&lt;li&gt;While the image quality and stabilization of the GoPros were amazing, it would have made more sense to just use phones and a video call app specifically optimized for low latency and robustness.&lt;/li&gt;
&lt;/ul&gt;
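For the curious, OSC's wire format is simple enough to hand-roll, which is part of why it's such a good fit for microcontrollers. A sketch of encoding a single-float message in python (the /cadence address is illustrative; on the ESP32 itself you'd reach for an OSC library):

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated, then padded to a multiple of 4 bytes
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    # Single-float OSC message: padded address, ",f" typetag, big-endian float32
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)
```

Each message is just a padded address string, a typetag string, and big-endian arguments, so it fits comfortably in a single UDP packet.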
&lt;p&gt;Despite the setbacks, performing at CYFEST was an amazing experience overall. It was a huge challenge and I had to learn a lot from scratch as I went - from soldering and hardware prototyping to working with low-level audio. A few months ago it would have been hard to imagine we’d come as far as we did, and we’re very grateful to our curators Sergei Komarov and Lidiia Griaznova for their faith in us, and the team at &lt;a href=&quot;https://www.instagram.com/sec_yvn/&quot;&gt;sound enthusiastic community&lt;/a&gt; for their technical support during the performance.&lt;/p&gt;
&lt;p&gt;This was the birth of Cadence Clock, and we’re very excited to see where it takes us next! Stay tuned.&lt;/p&gt;
&lt;h2 id=&quot;code&quot;&gt;Code&lt;/h2&gt;
&lt;p&gt;A messy repo with the code for the ESP32 bike unit, python decoder, and visualizations can be found &lt;a href=&quot;https://gitlab.com/satyarth/bikesynth/&quot;&gt;here&lt;/a&gt;.&lt;/p&gt;
</description></item><item><title>Cool People Whose Stuff I Like</title><link>http://satyarth.me/articles/cool-people/</link><pubDate>Sat, 28 Sep 2024 02:00:00 +0200</pubDate><guid isPermaLink="true">http://satyarth.me/articles/cool-people/</guid><author></author><description>&lt;p&gt;Wow, it’s been five years since my last attempt at using my website to &lt;a href=&quot;http://satyarth.me/articles/fav-videos/&quot;&gt;bare my soul to the whole wide internet&lt;/a&gt;! Let’s give it another shot with some cool people on the internet who inspire me and who I think more people should know about. In no particular order,&lt;/p&gt;
&lt;hr&gt;
&lt;h3 id=&quot;tony-santoro&quot;&gt;&lt;a href=&quot;https://www.crimepaysbutbotanydoesnt.com/&quot;&gt;Tony Santoro&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Best known for his YouTube channel &lt;a href=&quot;https://www.youtube.com/@CrimePaysButBotanyDoesnt&quot;&gt;Crime Pays but Botany Doesn’t&lt;/a&gt;, Tony’s a passionate botanist who doesn’t hide his disdain for man’s arrogance towards nature. Quoting from his &lt;a href=&quot;https://www.crimepaysbutbotanydoesnt.com/about&quot;&gt;website&lt;/a&gt;: &lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The study and dedication to Earth’s plant life has provided my gruff, misanthropic ass a lens through which to view my own place in the world, along with a sense of peace and humility that can be hard to obtain through other means. Plants - when viewed through the “bigger picture” of ecology and evolution rather than what they can “do” for us (as if holding up the biosphere isn’t enough) - can provide us not only with an awareness and context for our part in the intricate web of life here on Planet Earth, but also with a philosophical underpinning that will enable us to weather and withstand some of the dark elements coming our way.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;His videos are a healthy combination of knowledge bombs about botany, ecology, and geology mixed in with the occasional rant. He’ll also often remind you to kill your lawn:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Join our goddamned cult. Kill Your Lawn. Create habitat and eradicate the bland. create a native plant or a vegetable garden in your front yard. Killing your lawn and growing native plants is the best way to learn to identify the plants that USED to grow where you lived before the commercial automobile slum cesspool (trademark) was built.  But most importantly, kill the lawn within yourself.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;em&gt;Suggested entry points&lt;/em&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=35qF2hEefXg&quot;&gt;The Plant Ecology of Concrete, Garbage and Urine - Botanizing A Toilet&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=xYdLfkJcfok&quot;&gt;Somewhat Verbally Abusive “Kill Your Lawn” Instructional Video&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;…literally any video from his &lt;a href=&quot;https://www.youtube.com/@CrimePaysButBotanyDoesnt&quot;&gt;channel&lt;/a&gt; where he’s walking around a beautiful ecosystem and taking you along for the ride.&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;h3 id=&quot;hundred-rabbits&quot;&gt;&lt;a href=&quot;https://100r.co/site/home.html&quot;&gt;Hundred Rabbits&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Hundred Rabbits consists of &lt;a href=&quot;https://wiki.xxiivv.com/site/home.html&quot;&gt;Devine&lt;/a&gt; and &lt;a href=&quot;https://kokorobot.ca/&quot;&gt;Rek&lt;/a&gt;, who sail around the world making somewhat esoteric &lt;a href=&quot;https://100r.co/site/projects.html&quot;&gt;software and art&lt;/a&gt; and dropping &lt;a href=&quot;https://100r.co/site/knowledge.html&quot;&gt;knowledge bombs&lt;/a&gt;. They’re also founders of the &lt;a href=&quot;https://wiki.xxiivv.com/site/merveilles.html&quot;&gt;Merveilles&lt;/a&gt; community. As they describe themselves:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Hundred Rabbits is an artist collective that documents low-tech solutions with the hope of building a more resilient future. We live and work aboard a 10 m sailboat named Pino in remote parts of the world to learn more about how technology degrades beyond the shores of the western world.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;They have much to say about off-grid living, the fragility of contemporary software practices, and doing a lot with a little.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Suggested entry points&lt;/em&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Talk on &lt;a href=&quot;https://100r.co/site/computing_and_sustainability.html&quot;&gt;Computing and Sustainability (permacomputing)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Talk on &lt;a href=&quot;https://www.youtube.com/watch?v=BW32yUEymvU&quot;&gt;their lifestyle and ethos&lt;/a&gt; (video)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://100r.co/site/orca.html&quot;&gt;Orca&lt;/a&gt;: Cellular automaton-like sequencer&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;h3 id=&quot;lu-wilson&quot;&gt;&lt;a href=&quot;https://www.todepond.com/&quot;&gt;Lu Wilson&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Lu (aka todepond) makes slightly-surreal &lt;a href=&quot;https://youtube.com/@TodePond&quot;&gt;videos&lt;/a&gt; and codes &lt;a href=&quot;https://www.todepond.com/explore/&quot;&gt;cool toys and tools&lt;/a&gt; that are all ultimately different flavours of simulating falling sand through cellular automata (not really). She has a cool &lt;a href=&quot;https://www.todepond.com/wikiblogarden/&quot;&gt;wikiblogarden&lt;/a&gt; and I’m particularly inspired by her call to &lt;a href=&quot;https://www.todepond.com/wikiblogarden/scrappy-fiddles/sharing/normalising/live/&quot;&gt;normalize sharing scrappy fiddles&lt;/a&gt;, which I’m trying to heed.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Suggested entry points&lt;/em&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=gv40Z9tVjAI&quot;&gt;Cells in Cells in Cells&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=xvlsJ3FqNYU&quot;&gt;Spellular Automata&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Talk: &lt;a href=&quot;https://www.youtube.com/watch?v=MJzV0CX0q8o&quot;&gt;What it means to be open&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;h3 id=&quot;dave-ackley&quot;&gt;&lt;a href=&quot;https://www.cs.unm.edu/~ackley/&quot;&gt;Dave Ackley&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Dave says:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;I do research, development, and advocacy of robust-first and best-effort computing on indefinitely scalable computer architectures. Prior work has involved neural networks and machine learning, evolutionary algorithms and artificial life, and biological approaches to security, architecture, and models of computation.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Dave is a researcher probing some very interesting approaches to distributed and self-organizing computing he calls ‘living computation’. One instance of this is his &lt;a href=&quot;https://t2tile.com/&quot;&gt;T2 Tile Project&lt;/a&gt;:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;The T2 Tile project is an attempt to build the world’s first indefinitely scalable computational stack. First, we suspend the idea that we must be bound to an architecture based on correct and efficient deterministic hardware and software. Instead, much like the physical world around us, we look to robustness as a foundational requirement, building living systems as vessels for digital computation that is firstly robust, then as correct as possible, and finally, as efficient as necessary. &lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;His &lt;a href=&quot;https://www.youtube.com/@T2TileProject/videos&quot;&gt;YouTube channel&lt;/a&gt; has regular update videos on the T2 Tile Project. He is also co-host of the &lt;a href=&quot;https://computingup.com/&quot;&gt;Computing Up&lt;/a&gt; podcast, whose content and guests I enjoy a lot!&lt;/p&gt;
&lt;p&gt;&lt;em&gt;Suggested entry points&lt;/em&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=euFgci7Y318&quot;&gt;An Introduction to the Living Computation Theory of Everything&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=O8kOkLPwNNw&quot;&gt;Artificial Life Creation T-0 and Launching&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;hr&gt;
&lt;h3 id=&quot;kris-de-decker&quot;&gt;&lt;a href=&quot;https://www.krisdedecker.org/&quot;&gt;Kris De Decker&lt;/a&gt;&lt;/h3&gt;
&lt;p&gt;Creator of &lt;a href=&quot;https://solar.lowtechmagazine.com/&quot;&gt;Low-tech magazine&lt;/a&gt;, a publication running on a solar-powered server &lt;a href=&quot;https://solar.lowtechmagazine.com/power/&quot;&gt;(which might not always be up)&lt;/a&gt; about high-tech problems and low-tech solutions.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Technology has become the idol of our society, but technological progress is—more often than not—aimed at solving problems caused by earlier technical inventions. There is a lot of potential in past and often forgotten knowledge and technologies when it comes to designing a sustainable society. Interesting possibilities arise when we combine old technology with new knowledge and new materials, or when we apply old concepts and traditional knowledge to modern technology.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Their &lt;a href=&quot;https://solar.lowtechmagazine.com/offline-reading/&quot;&gt;archives&lt;/a&gt; will make a great addition to your post-apocalyptic info stash.&lt;/p&gt;
</description></item><item><title>Sonification of Chaotic Traffic</title><link>http://satyarth.me/articles/chaotic-traffic/</link><pubDate>Wed, 21 Feb 2024 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/chaotic-traffic/</guid><author></author><description>&lt;p&gt;Remember that time two years ago when Nitin Gadkari said he was planning to introduce a law that mandates replacing horns with the sound of Indian classical instruments? (&lt;a href=&quot;https://timesofindia.indiatimes.com/auto/news/indian-musical-instrument-sounds-for-vehicle-horns-soon-nitin-gadkari/articleshow/86773005.cms&quot;&gt;news article&lt;/a&gt; if you don’t.) Some might call it unrealistic, but it came up in a conversation and I thought the man was ahead of his time, so I wrote a toy agent-based simulation of chaotic traffic and hooked it up to VCV rack to see what happens.&lt;/p&gt;
&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/gt9RqCXUWx0?si=WvG43SwdOIud4iWa&quot; title=&quot;YouTube video player&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&gt;&lt;/iframe&gt;

&lt;p&gt;The simulation, written in python with &lt;a href=&quot;https://github.com/projectmesa/mesa/&quot;&gt;Mesa&lt;/a&gt;, initializes vehicles as agents following simple rules (similar to &lt;a href=&quot;https://eater.net/boids&quot;&gt;boid flocking&lt;/a&gt;). The rules are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;If I’m feeling cramped (other agents just ahead of me that I could collide with), decelerate. If I’m not feeling cramped, accelerate until I reach my top speed.&lt;/li&gt;
&lt;li&gt;Steer away from neighbors just ahead of me so I can try to overtake or go around them. Also steer away from the side of the road and try to align my direction with the general flow of traffic.&lt;/li&gt;
&lt;li&gt;If I’m being forced to decelerate, honk! And honk again if I haven’t honked in a while! Honks send out MIDI triggers to be synthesized in VCV Rack.&lt;/li&gt;
&lt;/ul&gt;
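Stripped of the Mesa scaffolding, the steering rule, and the MIDI output, the accelerate/decelerate/honk logic looks something like this (the gap and speed constants are arbitrary, not the values from the actual simulation):

```python
class Vehicle:
    def __init__(self, pos, top_speed):
        self.pos, self.speed, self.top_speed = pos, 0.0, top_speed
        self.honking = False

    def step(self, others, gap=5.0):
        # Rule 1: decelerate when cramped, otherwise accelerate to top speed
        cramped = any(0 < o.pos - self.pos < gap for o in others)
        if cramped:
            self.speed = max(0.0, self.speed - 1.0)
            self.honking = True  # Rule 3: honk when forced to decelerate
        else:
            self.speed = min(self.top_speed, self.speed + 0.5)
            self.honking = False
        self.pos += self.speed
```

In the real thing, a honk fires a MIDI trigger instead of just flipping a flag.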
&lt;p&gt;Running a simulation with these rules, we start to observe emergent behaviour like &lt;a href=&quot;https://en.wikipedia.org/wiki/Three-phase_traffic_theory&quot;&gt;three-phase traffic&lt;/a&gt;.&lt;/p&gt;
</description></item><item><title>Bikesatanist Manifesto</title><link>http://satyarth.me/articles/bikesatanist-manifesto/</link><pubDate>Thu, 25 Jan 2024 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/bikesatanist-manifesto/</guid><author></author><description>&lt;img src=&quot;bikesatanist_manifesto.png&quot; alt=&quot;bikesatanist manifesto&quot; width=&quot;666&quot;/&gt;

&lt;p&gt;The bikesatanist manifesto, as published in the &lt;a href=&quot;https://t.me/rodraws/263&quot;&gt;3rd annual КППХ New Year’s zine&lt;/a&gt; (in Russian). Art by &lt;a href=&quot;https://t.me/rodraws&quot;&gt;interpassive&lt;/a&gt;. Text below.&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;Why is cycling satanic? It is:    &lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Rejection of the established car-centric order&lt;/li&gt;
&lt;li&gt;Acceptance of chaos on the road and in our lives, and self-determination in the ways we choose to navigate it&lt;/li&gt;
&lt;li&gt;Reclamation of autonomy in transport through self-reliance and an intimate connection with one’s body&lt;/li&gt;
&lt;li&gt;More empathy for and awareness of our surroundings by breaking the metal cage separating us&lt;/li&gt;
&lt;li&gt;Respect for our community and environment through the minimization of pollution&lt;/li&gt;
&lt;li&gt;Faith in the innate good in our fellow humans not to murder us with giant metal beasts&lt;/li&gt;
&lt;li&gt;Fun as fuck, feeding our base instincts and carnal desires&lt;/li&gt;
&lt;li&gt;Sexy&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;So uncage yourself, get out there, tear up these motherfucking streets and keep it satanic. Stay safe and follow your local traffic laws, or don’t, I’m not going to tell you what to do.  &lt;/p&gt;
</description></item><item><title>Bike Synthesizer</title><link>http://satyarth.me/articles/bikesynth/</link><pubDate>Mon, 09 Oct 2023 02:00:00 +0200</pubDate><guid isPermaLink="true">http://satyarth.me/articles/bikesynth/</guid><author></author><description>&lt;p&gt;Back in 2020, I was cycling home when I noticed something in my drivetrain was squeaking, and I loved the way the sound synchronized with the movement of my legs. It feels great to cycle, even better to cycle to music, and when the music’s synced with you - pure perfection. That’s when I decided a bike synthesizer needs to exist. So I made an Arduino-based device with a Hall-effect sensor and glued magnets to my chainring. The Arduino sends MIDI signals over USB that can then be used however you like to make music.&lt;/p&gt;
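Turning magnet passes into a tempo is the easy part; a sketch of the idea (the helper and default magnet count here are illustrative - the later Cadence Clock build used eight magnets):

```python
def cadence_rpm(pulse_times, magnets=8):
    # Estimate crank RPM from the last two Hall-sensor pulse timestamps (seconds)
    if len(pulse_times) < 2:
        return 0.0
    interval = pulse_times[-1] - pulse_times[-2]  # seconds per magnet pass
    return 60.0 / (interval * magnets)
```

On the device this runs per interrupt, and each pulse can also fire a MIDI clock tick directly, which is what keeps the music locked to your legs.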
&lt;p&gt;I later found out it’s been done before by &lt;a href=&quot;https://www.youtube.com/watch?v=LWqImNULtao&quot;&gt;Look Mum No Computer&lt;/a&gt;, but I wanted to take it further and see what it’d be like on a fixed-gear track bike where the music can be completely synchronized to a rider’s legs, and, of course, perform it while tearing up these mf streets through traffic.&lt;/p&gt;
&lt;p&gt;The first (static) test from 2022, giving an idea of how it works:&lt;/p&gt;
&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/LO39Ge_ICn4?si=lUct8tR1ThUG9ZQZ&quot; title=&quot;YouTube video player&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&gt;&lt;/iframe&gt;

&lt;p&gt;A recording of a live audiovisual performance at &lt;a href=&quot;https://www.instagram.com/p/CwKt_ohrNKd/&quot;&gt;Fjords&lt;/a&gt; in September 2023 in Petrozavodsk. Footage recorded in Yerevan in June 2023 with my bois &lt;a href=&quot;https://www.instagram.com/crossroads_life_of_streets/&quot;&gt;Rob&lt;/a&gt; riding and &lt;a href=&quot;https://www.youtube.com/@SavvaMihaescu/&quot;&gt;Savva&lt;/a&gt; shooting.&lt;/p&gt;
&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/knS9l0L2J6Q?si=u0_SBunWTBUNzKnZ&quot; title=&quot;YouTube video player&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&gt;&lt;/iframe&gt;

&lt;p&gt;It’s been years since the initial idea but I still haven’t realized its full potential - stay tuned!&lt;/p&gt;
&lt;p&gt;Edit: see &lt;a href=&quot;http://satyarth.me/articles/cadence-clock-cyfest&quot;&gt;Cadence Clock @ CYFEST 16 Yerevan&lt;/a&gt; for the next chapter in this story.&lt;/p&gt;
</description></item><item><title>Important YouTube Videos</title><link>http://satyarth.me/articles/fav-videos/</link><pubDate>Tue, 12 Mar 2019 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/fav-videos/</guid><author></author><description>&lt;p&gt;To make my website a bit more personal, and help visitors get to know ‘the real me’, here are my top 5 favourite YouTube videos. Presented without comment (for now at least), do with this information what you may.&lt;/p&gt;
&lt;h3 id=&quot;-5&quot;&gt;#5&lt;/h3&gt;
&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/KxGRhd_iWuE&quot; title=&quot;YouTube video player&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&gt;&lt;/iframe&gt;

&lt;h3 id=&quot;-4&quot;&gt;#4&lt;/h3&gt;
&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/iCErJUzJA5g&quot; title=&quot;YouTube video player&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&gt;&lt;/iframe&gt;

&lt;h3 id=&quot;-3&quot;&gt;#3&lt;/h3&gt;
&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/Nsqjk9eqxAw&quot; title=&quot;YouTube video player&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&gt;&lt;/iframe&gt;

&lt;h3 id=&quot;-2&quot;&gt;#2&lt;/h3&gt;
&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/gER7xe11nz8&quot; title=&quot;YouTube video player&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&gt;&lt;/iframe&gt;

&lt;h3 id=&quot;-1&quot;&gt;#1&lt;/h3&gt;
&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/etsyCc6DKWs&quot; title=&quot;YouTube video player&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; allowfullscreen&gt;&lt;/iframe&gt;
</description></item><item><title>Supa Bot 🔥Fire🔥: I Spit That</title><link>http://satyarth.me/articles/supa-bot-fire/</link><pubDate>Sat, 30 May 2015 02:00:00 +0200</pubDate><guid isPermaLink="true">http://satyarth.me/articles/supa-bot-fire/</guid><author></author><description>&lt;p&gt;I wrote a bot that replies to people’s tweets in the style of &lt;a href=&quot;https://www.youtube.com/watch?v=-ChppfnazzE/&quot;&gt;Supa Hot Fire&lt;/a&gt;. While the primary goal was dank memes, it also served as a nice introduction to natural language processing and the Twitter API.&lt;/p&gt;
&lt;blockquote class=&quot;twitter-tweet&quot; lang=&quot;en&quot;&gt;&lt;p lang=&quot;en&quot; dir=&quot;ltr&quot;&gt;nothing: I regret that. &lt;a href=&quot;https://twitter.com/AubriasV&quot;&gt;@AubriasV&lt;/a&gt;&lt;/p&gt;&amp;mdash; Supa Bot Fire (@supabotfire) &lt;a href=&quot;https://twitter.com/supabotfire/status/606235211337232384&quot;&gt;June 3, 2015&lt;/a&gt;&lt;/blockquote&gt;
&lt;script async src=&quot;//platform.twitter.com/widgets.js&quot; charset=&quot;utf-8&quot;&gt;&lt;/script&gt;

&lt;p&gt;&lt;em&gt;Update&lt;/em&gt;: The bot got banned. RIP in peace.&lt;/p&gt;
&lt;h2 id=&quot;how-it-works&quot;&gt;How it works&lt;/h2&gt;
&lt;p&gt;When running, the bot gets a stream of tweets containing ‘I’ or ‘we’ via Twitter’s &lt;a href=&quot;https://dev.twitter.com/streaming/reference/post/statuses/filter&quot;&gt;streaming API&lt;/a&gt;. It looks for tweets in the format of a personal pronoun followed by a verb (for example &lt;em&gt;&amp;quot;I like turtles&amp;quot;&lt;/em&gt;). When it gets a hit, it restructures the sentence (&lt;em&gt;&amp;quot;Turtles: I like that&amp;quot;&lt;/em&gt;) and tweets it back at the tweeter. Then it chills out for up to an hour.&lt;/p&gt;
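The restructuring itself is a tiny transformation. A toy sketch of the idea (the real bot relies on NLTK's chunking and tagging rather than this naive whitespace split):

```python
def supa_reply(tweet):
    # "<pronoun> <verb> <rest>" -> "<Rest>: <Pronoun> <verb> that."
    words = tweet.rstrip(".!").split()
    if len(words) < 3 or words[0].lower() not in ("i", "we"):
        return None  # not in the pattern the bot looks for
    pronoun, verb, rest = words[0], words[1], " ".join(words[2:])
    return f"{rest.capitalize()}: {pronoun.capitalize()} {verb} that."
```

Proper POS tagging matters because the naive split happily treats non-verbs as verbs, which is where most of the bot's unintentional comedy would come from.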
&lt;p&gt;It uses &lt;a href=&quot;http://www.tweepy.org/&quot;&gt;Tweepy&lt;/a&gt; as a Twitter API wrapper and &lt;a href=&quot;http://www.nltk.org/&quot;&gt;NLTK&lt;/a&gt; to chunk and tag tweets.&lt;/p&gt;
&lt;h2 id=&quot;links&quot;&gt;Links&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://twitter.com/supabotfire&quot;&gt;Twitter&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/satyarth/supa-bot-fire&quot;&gt;Github&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
</description></item><item><title>Shortest Tour of Cambridge Colleges</title><link>http://satyarth.me/articles/college-tour/</link><pubDate>Thu, 26 Mar 2015 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/college-tour/</guid><author></author><description>&lt;p&gt;I wondered what the most efficient way to visit every college in Cambridge in a single tour was. I investigated. I found the answer.&lt;/p&gt;
&lt;h2 id=&quot;tl-dr&quot;&gt;TL;DR&lt;/h2&gt;
&lt;p&gt;The shortest tour is:&lt;/p&gt;
&lt;p&gt;Selwyn → Clare Hall → Robinson → Churchill → Girton → Fitzwilliam → Murray Edwards → St Edmund’s → Lucy Cavendish → Magdalene → St John’s → Sidney Sussex → Jesus → Christ’s → Emmanuel → Downing → Hughes Hall → Homerton → Peterhouse → Pembroke → St Catharine’s → Corpus Christi → King’s → Gonville and Caius → Trinity Hall → Trinity → Clare → Queens’ → Darwin → Newnham → Wolfson → Selwyn&lt;/p&gt;
&lt;div class='main-width'&gt;&lt;iframe src=&quot;map.html&quot; width=&quot;100%&quot; height=&quot;500&quot; marginwidth=&quot;0&quot; marginheight=&quot;0&quot; scrolling=&quot;no&quot; frameborder=&quot;0&quot;&gt;&lt;/iframe&gt;
&lt;/div&gt;

&lt;h2 id=&quot;notes&quot;&gt;Notes&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;As fun as it would have been to write my own TSP solver, I used &lt;a href=&quot;http://www.math.uwaterloo.ca/tsp/concorde/index.html&quot;&gt;Concorde&lt;/a&gt;. Concorde’s last release was all the way back in 2003, but it was apparently still “state of the art” as recently as 2007. To set the problem up for Concorde, I had to get the coordinates of the colleges, then calculate the distance between each pair of colleges.&lt;/li&gt;
&lt;li&gt;I fetched the coordinates for colleges via the Google Maps API, but Google resolves &lt;code&gt;Clare College&lt;/code&gt; to Memorial Court, and &lt;code&gt;Trinity College&lt;/code&gt; to somewhere in the North Paddock, which isn’t entirely accurate.&lt;/li&gt;
&lt;li&gt;I calculated distances as the geodesic distance between pairs of coordinates, leading to a relatively short tour length of about 15.83 km. This also assumes you can fly, which is &lt;em&gt;probably&lt;/em&gt; not the case. It’s possible to fix this by calculating distances with the &lt;a href=&quot;https://developers.google.com/maps/documentation/distancematrix/&quot;&gt;Google Distance Matrix API&lt;/a&gt;, which would give me the distance when walking along the shortest route, but… meh.&lt;/li&gt;
&lt;/ul&gt;
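&lt;p&gt;For the curious, the geodesic distances in the notes above come from the standard haversine formula. A quick sketch – &lt;code&gt;haversine_km&lt;/code&gt; and &lt;code&gt;distance_matrix&lt;/code&gt; are my own helper names, and the resulting matrix would still need to be written out in TSPLIB format before Concorde can read it:&lt;/p&gt;

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance in km between (lat, lon) points a and b,
    assuming a spherical Earth of radius 6371 km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))

def distance_matrix(coords):
    """Pairwise geodesic distances for a list of (lat, lon) tuples."""
    return [[haversine_km(a, b) for b in coords] for a in coords]
```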
</description></item><item><title>Pixel Sorting</title><link>http://satyarth.me/articles/pixel-sorting/</link><pubDate>Tue, 10 Feb 2015 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/pixel-sorting/</guid><author></author><description>&lt;p&gt; Pixel sorting is an interesting, glitchy effect which selectively orders the pixels in the rows/columns of an image. It was popularized (possibly invented) by artist &lt;a href=&quot;http://kimasendorf.com/&quot;&gt;Kim Asendorf&lt;/a&gt; (processing source code &lt;a href=&quot;https://github.com/kimasendorf/ASDFPixelSort&quot;&gt;here&lt;/a&gt;). The processing script was cryptic and not very hackable, and I felt like something more lightweight was needed, so I wrote my own version in python – more info on &lt;a href=&quot;https://github.com/satyarth/pixelsort/&quot;&gt;GitHub&lt;/a&gt;.&lt;span class='marginnote'&gt;The earliest reference  to pixel sorting I could find on the internet is &lt;a href=&quot;http://satyarth.me/articles/pixel-sorting/www.isprs.org/proceedings/XXVII/congress/part3/31_XXVII-part3.pdf&quot;&gt;this paper&lt;/a&gt; (PDF) by scientists from Iraq’s Scientific Research Council. They propose using it as a technique for unsupervised classification, to be applied to images from remote sensing satellites. Neat!&lt;/span&gt;&lt;/p&gt;
&lt;p&gt;Here’s what it looks like in action:&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/articles/pixel-sorting/1.png&quot; alt=&quot;&quot;&gt;&lt;/p&gt;
&lt;span class='marginnote'&gt;
&lt;img src=&quot;orig.jpg&quot; alt=&quot;&quot;&gt;
&lt;em&gt;The original image&lt;/em&gt;, via &lt;a href=&quot;https://www.pexels.com/photo/water-waterfall-forest-woods-24222/&quot;&gt;pexels&lt;/a&gt;
&lt;/span&gt;

&lt;h2 id=&quot;how-does-it-work-&quot;&gt;How does it work?&lt;/h2&gt;
&lt;p&gt;There are two steps involved:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The rows/columns of the image are split into ‘intervals’ (more on this later).&lt;/li&gt;
&lt;li&gt;The pixels in each interval are rearranged so as to sort them with respect to some property, say, lightness.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Kim Asendorf’s code simply applies this technique twice: first vertically, then horizontally.&lt;/p&gt;
&lt;p&gt;The intervals are defined by regions of the image that are too light or too dark – the edges of these regions define the edges of the intervals.&lt;/p&gt;
&lt;p&gt;For example, consider:&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;/articles/pixel-sorting/example.png&quot; alt=&quot;1&quot;&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The top left is the original image.&lt;/li&gt;
&lt;li&gt;The top right is the image with pixels outside the lightness threshold replaced with black, and the rest filled in with white. Notice that the foam – too light – and the shadows – too dark – are outside the threshold. &lt;em&gt;(Yes, I just discovered en-dashes. Deal with it.)&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;On the bottom left is the image with the intervals filled in with random colors – each color represents a different interval.&lt;/li&gt;
&lt;li&gt;On the bottom right is the image with sorted intervals.&lt;/li&gt;
&lt;/ul&gt;
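&lt;p&gt;The two steps above fit in a few lines of Python. Here’s a minimal sketch of sorting a single row by lightness – the threshold values and helper names are mine for illustration, not taken from either script:&lt;/p&gt;

```python
def lightness(px):
    """Approximate lightness of an (r, g, b) pixel."""
    return sum(px) / 3

def sort_row(row, lo=60, hi=230):
    """Sort each run of in-threshold pixels by lightness. Pixels that are
    too dark or too light break the row into intervals and stay put."""
    out, interval = [], []
    for px in row:
        if lo <= lightness(px) <= hi:
            interval.append(px)       # still inside the current interval
        else:
            out += sorted(interval, key=lightness)  # close the interval
            out.append(px)            # out-of-threshold pixel stays in place
            interval = []
    return out + sorted(interval, key=lightness)
```

&lt;p&gt;Applying &lt;code&gt;sort_row&lt;/code&gt; to every row, then to every column, reproduces the vertical-then-horizontal double pass described above.&lt;/p&gt;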
&lt;h2 id=&quot;how-can-i-pixelsort-&quot;&gt;How can I pixelsort?&lt;/h2&gt;
&lt;p&gt;All the images in this article were generated using a pixel sorting script I wrote in Python – &lt;a href=&quot;https://github.com/satyarth/pixelsort/&quot;&gt;source here&lt;/a&gt;. It shouldn’t be too hard to use; details are in &lt;code&gt;README.md&lt;/code&gt;. If you do anything with it, I’d love to see it!&lt;/p&gt;
&lt;h2 id=&quot;stuff-people-made-with-the-script&quot;&gt;Stuff people made with the script&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href=&quot;https://www.youtube.com/watch?v=bV2Br6e_bd8&quot;&gt;Troxum - Ediacarana&lt;/a&gt; (music video)&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://x.com/CrookedCosmos&quot;&gt;CrookedCosmos&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://gmic.eu/reference/pixelsort.html&quot;&gt;G’MIC plugin&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&quot;https://github.com/Akascape/Pixelort&quot;&gt;Akascape/Pixelort&lt;/a&gt; - GUI pixelsorting tool&lt;/li&gt;
&lt;/ul&gt;
</description></item><item><title>Unplify</title><link>http://satyarth.me/articles/unplify/</link><pubDate>Tue, 10 Feb 2015 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/unplify/</guid><author></author><description>&lt;p&gt;Udank?&lt;/p&gt;
</description></item><item><title>Under the Hood</title><link>http://satyarth.me/articles/under-the-hood/</link><pubDate>Mon, 09 Feb 2015 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/under-the-hood/</guid><author></author><description>&lt;p&gt;This website is generated with &lt;a href=&quot;http://wintersmith.io/&quot;&gt;wintersmith&lt;/a&gt;, a static website generator written on top of Node.js. The source for the website is publicly available on &lt;a href=&quot;https://github.com/satyarth/satyarth.me&quot;&gt;GitHub&lt;/a&gt;. It was first hosted on &lt;a href=&quot;https://pages.github.com/&quot;&gt;Github pages&lt;/a&gt;, then &lt;a href=&quot;http://divshot.io/&quot;&gt;divshot&lt;/a&gt;, and now lives on a &lt;a href=&quot;https://www.digitalocean.com/&quot;&gt;DigitalOcean&lt;/a&gt; VPS and is served via &lt;a href=&quot;http://www.lighttpd.net/&quot;&gt;lighttpd&lt;/a&gt;. The domain is registered with &lt;a href=&quot;https://www.namecheap.com/&quot;&gt;namecheap&lt;/a&gt; and DNS services are provided by &lt;a href=&quot;https://www.cloudflare.com/&quot;&gt;CloudFlare&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;I’ve tried to make it simple, lightweight, and sexy. Except for web fonts (and MathJax on certain pages), no external resources are required. The style for the articles is heavily influenced by &lt;a href=&quot;https://edwardtufte.github.io/tufte-css/&quot;&gt;Tufte CSS&lt;/a&gt;. If you have suggestions, I’d love to &lt;a href=&quot;http://satyarth.me/#contact&quot;&gt;hear them&lt;/a&gt;!&lt;/p&gt;
</description></item><item><title>Lorem Ipsum</title><link>http://satyarth.me/articles/lorem-ipsum/</link><pubDate>Wed, 22 Jan 2014 01:00:00 +0100</pubDate><guid isPermaLink="true">http://satyarth.me/articles/lorem-ipsum/</guid><author></author><description>&lt;h2 id=&quot;penates-taedae-maior-solet-vitare-et&quot;&gt;Penates taedae maior solet vitare et&lt;/h2&gt;
&lt;p&gt;Lorem markdownum! Aras aut, sanctaque, &lt;span class='marginnote'&gt;&lt;img src=&quot;http://satyarth.me/articles/lorem-ipsum/side.png&quot; alt=&quot;test&quot;&gt;&lt;em&gt;Caption&lt;/em&gt;&lt;/span&gt;quinquennem tamen ac mille dum nostri,
ero perfusam amantem sine, et non. Ferrum adversos!&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Silva e &lt;strong&gt;meruisse heros&lt;/strong&gt;. Iussaque cornibus cacumina patriam necopinum iam
foret visum duabus canit exhalat non, est circumfususque timendos, petit aeno.
Avita cessataque motis, diversa et est nescit lacrimae &lt;strong&gt;neque saxa&lt;/strong&gt; tamquam
finierat ardentem.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2 id=&quot;mea-sole&quot;&gt;Mea sole&lt;/h2&gt;
&lt;p&gt;Neque iuvene, in nostram e pondus tacto. Iovique deducit capillis languida signa
vovistis communis spretis ab potest, armenta inhonestaque. Deus senserit illis
in iniceret, carina at &lt;a href=&quot;http://kimjongunlookingatthings.tumblr.com/&quot;&gt;vultus&lt;/a&gt;
cadentem simul at. Odium nunc qui locorum data, femineusque recentes quantum.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;http://satyarth.me/articles/lorem-ipsum/img.png&quot; alt=&quot;test&quot;&gt;&lt;/p&gt;
&lt;p&gt;Neque iuvene, in nostram e pondus tacto. Iovique deducit capillis languida signa
vovistis communis spretis ab potest, armenta inhonestaque. Deus senserit illis
in iniceret, carina at &lt;a href=&quot;http://kimjongunlookingatthings.tumblr.com/&quot;&gt;vultus&lt;/a&gt;
cadentem simul at. Odium nunc qui locorum data, femineusque recentes quantum.&lt;/p&gt;
&lt;p&gt;\[e^{i \pi} + 1 = 0 \]&lt;/p&gt;
&lt;pre&gt;
&lt;code class=&quot;python&quot;&gt;
def sort_rows(pixels, intervals, args):
    # Relies on sort_interval() and a random, args already in scope.
    sorted_pixels = []
    for y in range(len(pixels)):
        row = []
        x_min = 0
        for x_max in intervals[y]:
            # collect the pixels in this interval
            interval = [pixels[y][x] for x in range(x_min, x_max)]
            # sort this interval unless a random roll says to leave it alone
            if random.randint(0, 100) &gt;= args.randomness:
                row += sort_interval(interval)
            else:
                row += interval
            x_min = x_max
        sorted_pixels.append(row)
    return sorted_pixels
&lt;/code&gt;
&lt;/pre&gt;

&lt;h2 id=&quot;freta-ensem-petis&quot;&gt;Freta ensem petis&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Morphea avellere&lt;/strong&gt;, nec vultus origo &lt;a href=&quot;http://www.wedrinkwater.com/&quot;&gt;patefecit
modo&lt;/a&gt;. Mergit prohibete o altos, &lt;strong&gt;semimarem
diversaeque&lt;/strong&gt; locorum scelerataque &lt;em&gt;Tereus oscula&lt;/em&gt;, tua opem vos!&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Senex saepius regna virum sororis&lt;/li&gt;
&lt;li&gt;Coniuge mille curat&lt;/li&gt;
&lt;li&gt;Mora non turba lacertos nimbi effigies effundite&lt;/li&gt;
&lt;li&gt;Lapsasque aequor dolores&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&quot;barbam-arma-parte-intravimus-quaesitus-fera-excutior&quot;&gt;Barbam arma parte intravimus quaesitus fera excutior&lt;/h2&gt;
&lt;p&gt;Longa faciem laniaverat: fugit sola credit. Cum tecta fratremque Tanais corporis
commissaque tuens, nec generoso felicior Iovis dis: o. Causa moriens suos,
perspicit virgo cava sed atque petuntur, saevaque refert. Poplite ibat flammas:
per posses, cecidisti sororum; matri. Fuit mortale, est manes aethera: illa amat
esset!&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;dcim -= nanometerCyclePoint;
if (45) {
    ccd_drive.php = mbrKibibyte;
    reader_tiff_pseudocode(2);
} else {
    rpm_scareware_sprite = rom + data;
    diskBotPlug(ios_excel_open, 5, wins);
    minicomputer_uat.vaporware(1 + ics);
}
website.zone_artificial_jsp += modelColumnWindows;
if (20 + httpStreaming &amp;lt; andAclCodec) {
    barOfUsername.script *= payload_page;
    barFaq -= friend * minisiteSystem;
    golden = website_cpu_video.android_duplex(compiler, batchTcpComputing *
            driveGoogle);
}&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Tenues &lt;strong&gt;postquam pharetras vestros&lt;/strong&gt; mentis magnumque nomine refert orbus ut
sunt talibus, numeri et tamen sidera, cum. Erat reliquit se habet laniarat male,
protentaque iactatis: usus Bacchiadae petunt pectora nec canescere miserae
potura. Vero illo &lt;strong&gt;meritis&lt;/strong&gt; montibus aut frugum Syenites vocabere quod
iactura. Dis ante positoque &lt;strong&gt;turba&lt;/strong&gt;, iam mater aequorei, somnus penetralia
abiit habebat: Graias. A et non ille quodque tu insonat trucis et enim cornuaque
&lt;em&gt;gens&lt;/em&gt;.&lt;/p&gt;
</description></item></channel></rss>