coreybobco/markovmutagen

Introduction

"Unfortunately human effort, which always varies the arrangement of existing elements, cannot be applied to producing a single new element. A landscape in which nothing terrestrial figures is beyond the scope of our imagination." --Andre Breton, "Max Ernst" (1921)

"The true literature machine will be one that itself feels the need to produce disorder, as a reaction against its preceding production of order: a machine that will produce avant-garde work to free its circuits when they are choked by too long a procession of classicism." -Italo Calvino, Cybernetics and Ghosts (1967)

"today's writer resembles more a programmer than a tortured genius, brilliantly conceptualizing, constructing, executing, and maintaining a writing machine." -Kenneth Goldsmith, Uncreative Writing (2011)```

A Markov chain is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. For text generation, these probabilities are determined by the word order of an input, which can be broken down into n-grams (units) of 1, 2, or 3 words. In the order-1 model, Markov text generation begins by selecting a random word that begins a sentence, then repeatedly selects a random word that followed the previous word in the input, weighted by frequency of appearance. For instance, if "The" is followed by "rhombus" thrice and "skeleton" once in the input, there is a 75% chance the algorithm will select "rhombus" next and a 25% chance it will select "skeleton". If "skeleton" is selected, the single word that follows "skeleton" in the original text must appear next, since "skeleton" appeared only once; if "rhombus" is selected, a random word that follows "rhombus" somewhere in the original passage will appear. The order-2 variant proceeds similarly, except it starts by randomly selecting "The rhombus" or "The skeleton" and then, if "The rhombus" was selected, randomly selects a word that followed "The rhombus."
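To make the mechanics concrete, here is a minimal order-n sketch in Python. It is an illustration only, not Markov Mutagen's actual implementation: the function names and sample text are invented for the example. Because repeated followers are stored repeatedly, random.choice is automatically frequency-weighted, which yields exactly the 75%/25% split described above.

```python
import random
from collections import defaultdict

def build_model(text, order=1):
    """Map each order-length tuple of words to the list of words that follow it.
    Repeats are kept, so random.choice is frequency-weighted for free."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        model[key].append(words[i + order])
    return model

def generate(model, seed, length=15):
    """Walk the chain from a seed tuple, drawing each next word at random
    from the followers of the most recent n-gram."""
    output = list(seed)
    for _ in range(length):
        followers = model.get(tuple(output[-len(seed):]))
        if not followers:  # dead end: this n-gram never appears mid-text
            break
        output.append(random.choice(followers))
    return " ".join(output)

text = ("The rhombus dreams. The rhombus sings. The rhombus waits. "
        "The skeleton dances beside the rhombus.")
model = build_model(text, order=1)
print(generate(model, ("The",)))  # "rhombus" follows 75% of the time, "skeleton" 25%
```

Raising the order to 2 trades variety for coherence: longer n-grams match fewer continuations, so the output hews closer to the source text.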

By running multiple texts through a Markov chain at the same time, it becomes possible to collage bits of text in novel, often nonsensical ways that rebel against syntax and invert idiomatic constructions. (Software, after all, has no subconscious to veto awkward constructions before they rise to conscious thought.) There are several precedents for this technique. The cut-up technique, pioneered by the Dadaists and further developed by William S. Burroughs and Brion Gysin, rearranged blocks of text cut from multiple pages to invent a new body of text. In the 1980s, programmers realized the parodic potential of Markov chain algorithms when applied to writing and released Dissociated Press, a command for the Emacs text editor. Following in their footsteps, Jamie Zawinski released DadaDodo, a C program that generates text from an input file.

With Markov Mutagen, you can easily combine inputs and run them through a Markov text generator or a simulation of the cut-up technique (sketched below). You are encouraged to combine, edit, and collage the outputs, or even recycle an output as an input. Cybernetic writing is an open-ended game whose procedures are still being developed and combined in new ways. Try sampling text from a novel, news article, encyclopedia, conspiracy theory, theological treatise, or whatever else strikes your fancy. Try replacing a word (noun, verb, adjective) in one input with a word from another to manipulate the probabilities in the Markov model. The outputs of neural networks like TalkToTransformer also serve well as inputs. And enjoy the fruits of your discordant prosody.
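One plausible way to simulate the cut-up technique in code is to slice each source into fixed-size word blocks and shuffle all the blocks together. The sketch below follows that reading; the cut_up name and the four-word chunk size are assumptions for illustration, not Markov Mutagen's actual API.

```python
import random

def cut_up(texts, chunk_size=4):
    """Slice each source text into fixed-size word blocks, then
    shuffle all the blocks together into one new text.
    NOTE: function name and chunk_size are illustrative assumptions."""
    chunks = []
    for text in texts:
        words = text.split()
        chunks.extend(" ".join(words[i:i + chunk_size])
                      for i in range(0, len(words), chunk_size))
    random.shuffle(chunks)
    return " ".join(chunks)

sources = [
    "In the beginning was the word and the word was with the machine.",
    "Probability distributes itself across every sentence we abandon.",
]
print(cut_up(sources))
```

Unlike the Markov approach, this preserves short runs of the original syntax intact while destroying the larger argument, which is closer in spirit to Burroughs and Gysin's scissors.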

Running the server

Because the generativepoetry module this project depends on has cross-platform issues, I recommend using Docker to start and run the app:

docker-compose up

About

Word processing toolkit using Markov chains.
