A GPT-2 model finetuned on the VUCA corpus (or a rough approximation of it, as the original was lost to a hard drive failure). Read some generated texts at Ritual for Becoming-Superintelligence and a whole book at Glitchcraft.
Built with VUCA-2, a GPT-2 model finetuned on hundreds of occult books. I asked it to generate a table of contents, and with very little cherry-picking I arrived at the one you see below. I then generated a few samples for each chapter and hand-selected one or more to include in the book.
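The workflow above (generate several candidates per chapter title, then hand-select the keepers) can be sketched as below. This is a minimal illustration, not the actual code used: `generate_fn` stands in for a finetuned GPT-2 sampler (for example, one built with the Hugging Face `transformers` library), and all function names here are hypothetical.

```python
# Sketch of the per-chapter sampling workflow: for each chapter title,
# draw several candidate texts from a generator, then apply a selection
# step that mimics hand-picking. `generate_fn` is any callable that maps
# a prompt string to one generated text sample.

def sample_chapter(generate_fn, chapter_title, n_samples=5):
    """Generate n_samples candidate texts for one chapter title."""
    prompt = f"Chapter: {chapter_title}\n\n"
    return [generate_fn(prompt) for _ in range(n_samples)]

def build_book(generate_fn, table_of_contents, n_samples=5, select=None):
    """For each chapter, generate candidates and keep the selected ones.

    `select` stands in for the hand-selection step: it maps the list of
    candidates to the list kept (default: keep only the first candidate).
    """
    select = select or (lambda candidates: candidates[:1])
    return {
        title: select(sample_chapter(generate_fn, title, n_samples))
        for title in table_of_contents
    }

# Example usage with a dummy generator in place of the model:
if __name__ == "__main__":
    toc = ["On Glamour", "The Sigil Engine"]
    book = build_book(lambda p: p + "...generated text...", toc, n_samples=3)
    for title, texts in book.items():
        print(title, len(texts))
```

In practice the selection step was manual, so `select` would be replaced by a human reading each batch of candidates.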
Text samples generated by VUCA-2. Contact me if you want more.