Linked References

  • glitchcraft

    Built with VUCA-2, a GPT-2 model finetuned on hundreds of occult books. I asked it to generate a table of contents, and with very little cherry-picking I came up with the one you see below. I then generated a few samples for each chapter and hand-selected one or more to include in the book.

  • technologies-for-the-automation-of-language
    • GPT-2 - heavyweight neural-net generation of text, high quality. GPT-2 can be finetuned to create domain-specific text, or work from its own general knowledge
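
The generate-and-curate workflow described for glitchcraft (sample several continuations per chapter, then hand-select) can be sketched as below. This is a minimal illustration with a hypothetical stub generator standing in for the finetuned VUCA-2 model; the function and variable names are my own, and a real setup would call an actual GPT-2 generation API (e.g. Hugging Face transformers' text-generation pipeline) in place of the stub.

```python
import random

def sample_candidates(generate_fn, prompt, n_samples=5, seed=None):
    """Generate several candidate continuations for a prompt so a human
    can hand-select the best ones -- the cherry-picking step described
    in the glitchcraft note. `generate_fn` is any callable that maps a
    prompt string to a generated string."""
    if seed is not None:
        random.seed(seed)
    return [generate_fn(prompt) for _ in range(n_samples)]

# Hypothetical stub standing in for a finetuned GPT-2; a real workflow
# would replace this with calls into the model itself.
def stub_generate(prompt):
    endings = ["of Shadows", "of Mirrors", "of Static"]
    return prompt + " " + random.choice(endings)

# Produce three candidates for one chapter title, then pick by hand.
candidates = sample_candidates(stub_generate, "Chapter 1: The Book",
                               n_samples=3, seed=42)
for c in candidates:
    print(c)
```

The human stays in the loop: the code only widens the pool of options, and the selection itself remains manual, as in the original process.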