The running GLSL/P5js sketch (linked above)
Snapshots of the GLSL sketch are fed into DALL-E 3, prompting the model to interpret the images as mandala patterns composed of interwoven radial orbits.
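A rough sketch of the snapshot step, assuming a standard p5.js setup: the running GLSL sketch exports its canvas as a PNG whenever the `s` key is pressed. The key binding and file naming are placeholders, not the exact capture method used here.

```javascript
// Minimal p5.js capture hook (the actual shader/sketch code is omitted).
let snapshotCount = 0;

function keyPressed() {
  // Press 's' to save the current frame of the running GLSL sketch as a PNG,
  // which can then be given to DALL-E 3 as a reference image.
  if (key === 's') {
    saveCanvas(`sketch_snapshot_${snapshotCount++}`, 'png');
  }
}
```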
These hi-res images are then fed, unprompted, into DALL-E 2 to produce "degraded" variations of the source image.
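A minimal sketch of the variation step, assuming the official `openai` Node SDK and a local copy of the DALL-E 3 output (`mandala_dalle3.png` and the batch size are placeholders). The image-variations endpoint supports only DALL-E 2 and takes no text prompt.

```javascript
import fs from "node:fs";
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Request a batch of prompt-free DALL-E 2 variations of the hi-res source image.
const result = await client.images.createVariation({
  model: "dall-e-2",
  image: fs.createReadStream("mandala_dalle3.png"), // placeholder path
  n: 4,
  size: "1024x1024",
  response_format: "url",
});

result.data.forEach((v, i) => console.log(`variation ${i}: ${v.url}`));
```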
Note: the original hi-res source image from DALL-E 3 is intentionally not included in the variation set.
When variations are fed back into DALL-E 2, each subsequent generation becomes more generalized and abstract.
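A hedged sketch of that feedback loop: each round picks one variation, downloads it, and resubmits it to the variations endpoint, so later generations drift further from the source. The generation count, file names, and one-image-per-round choice are assumptions for illustration, not the exact curation process used here.

```javascript
import fs from "node:fs";
import OpenAI from "openai";

const client = new OpenAI();

// Iteratively re-feed a variation into DALL-E 2; later generations become
// progressively more generalized and abstract relative to the source image.
async function runFeedbackLoop(seedPath, generations = 3) {
  let currentPath = seedPath;
  for (let gen = 1; gen <= generations; gen++) {
    const res = await client.images.createVariation({
      model: "dall-e-2",
      image: fs.createReadStream(currentPath),
      n: 1,
      size: "1024x1024",
      response_format: "url",
    });
    // Download the new variation and use it as input for the next round.
    const bytes = Buffer.from(await (await fetch(res.data[0].url)).arrayBuffer());
    currentPath = `variation_gen${gen}.png`; // placeholder naming scheme
    fs.writeFileSync(currentPath, bytes);
    console.log(`generation ${gen} saved to ${currentPath}`);
  }
}

await runFeedbackLoop("mandala_dalle3.png"); // placeholder seed image
```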
Mixtures of these variations are hand-curated and ordered to serve as keyframes, then fed into Runway's frame interpolation model to produce smooth transformations between images.
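The interpolation itself happens inside Runway, so the only scriptable part of this step is packaging the curated picks into an ordered keyframe sequence for upload. The sketch below assumes the placeholder file names from the earlier snippets; the actual selection and ordering were done by hand.

```javascript
import fs from "node:fs";
import path from "node:path";

// Hand-curated, hand-ordered selection of variations to use as keyframes
// (placeholder file names for whichever variations were picked).
const curatedKeyframes = [
  "variation_gen1.png",
  "variation_gen3.png",
  "variation_gen2.png",
];

// Copy the picks into a zero-padded, ordered sequence so they can be
// uploaded to Runway's frame interpolation tool as keyframes.
const outDir = "keyframes";
fs.mkdirSync(outDir, { recursive: true });
curatedKeyframes.forEach((file, i) => {
  const dest = path.join(outDir, `keyframe_${String(i).padStart(3, "0")}.png`);
  fs.copyFileSync(file, dest);
});
```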
The interpolated video is then fed into Runway's style transfer model, using the original DALL-E 3 source image as the style reference.