unclear if the walls inside the walls also have hallways inside them. if any oneironauts want to work on this lmk

πŸ—¨οΈ 0 β™Ί 0 🀍 5


today's the day. gonna build the biggest dataset of information about books

πŸ—¨οΈ 4 β™Ί 0 🀍 29


this was not meant to start a pissing contest on the size of book dataset. but if there were such a contest, I would win, because I have no scruples or academic integrity to maintain πŸ˜‡

πŸ—¨οΈ 0 β™Ί 1 🀍 11


you cannot prosecute me in any jurisdiction that matters

πŸ—¨οΈ 2 β™Ί 1 🀍 19


Hot eggs say it back

πŸ—¨οΈ 9 β™Ί 1 🀍 14


recipe calling for three eggs. two tablespoons oil. salt and pepper to taste. 1) Make eggs real hot 2) eat eggs. Hot sauce optional

πŸ—¨οΈ 2 β™Ί 0 🀍 6


new crypto idea: proof-of-virginity

πŸ—¨οΈ 3 β™Ί 0 🀍 13


why is there no Chad emoji or virgin emoji

πŸ—¨οΈ 4 β™Ί 1 🀍 17


information for sale. individually or by the foot

πŸ—¨οΈ 1 β™Ί 1 🀍 9


academics love to pretend like their weird arbitrary bureaucratic norms are accepted by the rest of the world

πŸ—¨οΈ 1 β™Ί 0 🀍 8


type of guy who simps for the Demiurge smh

πŸ—¨οΈ 1 β™Ί 0 🀍 11


a star shines, a dream shatters love is all that really matters love me now before we have to go awaaaay https://t.co/dQIU5CvJ4p

πŸ—¨οΈ 0 β™Ί 0 🀍 1


the cdc says dudes who are vaccinated can now rock

πŸ—¨οΈ 1 β™Ί 2 🀍 23


how much can i bench press? oh, i don't know, i can usually drink two or three presses before getting too jittery

πŸ—¨οΈ 0 β™Ί 0 🀍 5


are deep learning models actually better for recommender systems than traditional shallow ML approaches? or is RL showing gainz? i feel like everything claims it's the "state of the art" and then in practice it's "we noticed you just bought a Pixel 4, perhaps u want Pixel 5?"

πŸ—¨οΈ 4 β™Ί 0 🀍 16


make your Twitter profile your browser homepage

πŸ—¨οΈ 2 β™Ί 0 🀍 12


this is not a question it's a command

πŸ—¨οΈ 2 β™Ί 0 🀍 6


gc has agreed that twitter notifications are a howling whirlwind of tormented souls and that any notifs you receive from this hellsite are your cross to bear

πŸ—¨οΈ 1 β™Ί 0 🀍 5


okay we got this. next i need the wojaks from the midwit meme https://t.co/4xltpiJ6W8

πŸ—¨οΈ 0 β™Ί 0 🀍 5


the virgin the chad meme templater emoji user πŸšΆβ€β™‚οΈ vs. πŸ•Ί

πŸ—¨οΈ 2 β™Ί 7 🀍 39


poast screen time cowards

πŸ—¨οΈ 3 β™Ί 0 🀍 3


this fucked me up https://t.co/0NqVfkcFma

πŸ—¨οΈ 0 β™Ί 0 🀍 1



image from twitter

image from twitter

πŸ—¨οΈ 1 β™Ί 0 🀍 4


it still takes too long (over a minute to search 57,000 images) but it is very fun to play with. also it seems to be showing some world knowledge as well as OCR πŸ€” https://t.co/lzeMAL9Mxb

image from twitter

πŸ—¨οΈ 1 β™Ί 0 🀍 3


i figured out why memery was taking so long to search over pre-indexed images. i had written a list comprehension to unpack the archive in the proper ranked order. but it was actually walking over a giant nested dictionary once for each item in the ranked list. rookie maneuver

πŸ—¨οΈ 2 β™Ί 0 🀍 11


i have modified the archive dictionary so the keys are positive integers instead of unique string IDs. this allows me to pull the ranked indices by key, instead of walking the dictionary looking for an "index" value. all because Annoy only takes integers as indexes πŸ˜…

πŸ—¨οΈ 1 β™Ί 0 🀍 4


not sure if i somehow fcked up the indexer now, or if it's just taking a long time for this dataset because it's on an external hard drive. it's USB 3.0 it shouldn't be a file transfer bottleneck right?

πŸ—¨οΈ 1 β™Ί 0 🀍 3


guys which 5" shorts should i get? i feel like the "baggies" have that particular swim trunks texture and i just can't stand it

πŸ—¨οΈ 4 β™Ί 0 🀍 7


i fixed the ranker!! it's still taking a long time to index (need to optimize for hardware) but searching seems to be a lot better. now only takes .05 seconds for 57,000 images instead of 2-4 MINUTES like before!! https://t.co/oYYUN5m9Rj

πŸ—¨οΈ 1 β™Ί 0 🀍 5


well, 0.05 to 1 second. not bad though for all these sweet kilt books https://t.co/c77l6y9Rhe

image from twitter

πŸ—¨οΈ 1 β™Ί 0 🀍 7


i pushed these changes, if anybody wants to try it on a dataset hit me up, I'd love to hear how it works out

πŸ—¨οΈ 1 β™Ί 0 🀍 4


how does functional programming always make me feel like the 🀯 emoji?? https://t.co/QEmts1QYoa

πŸ—¨οΈ 2 β™Ί 0 🀍 7


I think the reason I'm obsessed with CLIP is because it's hard evidence for unified meme theory

πŸ—¨οΈ 5 β™Ί 16 🀍 101


yeah all that lets get it

πŸ—¨οΈ 0 β™Ί 0 🀍 2


unified meme theory states that the "meme", defined as the transmissible unit of human thought, and the "meme", defined as a picture with some words on it that gets copied on the internet, are not different. https://t.co/tkLy9gSydI

πŸ—¨οΈ 3 β™Ί 5 🀍 41


a meme is picture+words because it represents a gradient in semantic space

πŸ—¨οΈ 3 β™Ί 2 🀍 33


semantic space is a much-theorized "embedding" for language concepts. Like if you think about a chair, and then you think about a chaise, and then you think about a silla... there's a region in semantic space that activates from all of those. in your mind and mine

πŸ—¨οΈ 3 β™Ί 1 🀍 35


the image provides a field, an area in semantic space. the words provide a vector. Your attention moves to that area of semantic space, and then updates in the direction that the vector points. a gradient in semantic space.

πŸ—¨οΈ 2 β™Ί 1 🀍 31


depending on how far the words send you, and how well developed your map of that semantic space, either it lands or it doesn't. if it works on you, you save it to send it to someone else later. showed this one to gf, she made me send it to her so she can send it to her dad https://t.co/FP0UycuNmF

image from twitter

πŸ—¨οΈ 2 β™Ί 1 🀍 31


but it has to be the right person: you have to simulate their whole mental state to understand whether this gradient will work on them. so memes follow social topologies as well. this makes sense, though: semantic space is a product of social animals using mimetic calls

πŸ—¨οΈ 2 β™Ί 1 🀍 25


CLIP encodes visual image and text about that image into the same embedding space. You've probably seen this image, maybe with a caption that says "AI is too dumb to tell the difference between an apple and an iPod lol" but this is actually amazing https://t.co/upCTu9OQJV

image from twitter

πŸ—¨οΈ 1 β™Ί 1 🀍 25


The fact that CLIP can recognize letters in photographs is a sign that they're encoded in the brain the same way as other visual data. they're just a bunch of weird squiggles, but they cause the meaning of the image to change. in predictable ways!

πŸ—¨οΈ 1 β™Ί 1 🀍 27


This neural net learned to read. It learned to read handwriting! it's not very good at it, but it wasn't trained to do that. it was trained to match flashcards of images to text captions. Is this not mostly a picture that says iPod? with a hint of apple and a dash of wood fence

πŸ—¨οΈ 1 β™Ί 1 🀍 35


there's a high dimensional space representing all these concepts and how they interact. Your brain evolved to think about how other monkeys think about other other monkeys. CLIP is trained on a contrastive pretraining objective. we are kinda the same https://t.co/u7nB0l0gYz

πŸ—¨οΈ 2 β™Ί 1 🀍 22


This is a great thread showing how CLIP vectors encode world knowledge and how you can add and subtract them https://t.co/lkIjQjxgSx

πŸ—¨οΈ 3 β™Ί 1 🀍 23