@sun@shitposter.world @SuperDicq@minidisc.tokyo Yes indeedie, I think we all know that OpenAI or Claude on a great GPU are just dandy... but the local models are "good enough" to build horizontal product already.
Also, many robotic AI tasks can be done at home with fallback to the cloud. I mean it's pretty obvious.
A lot of systems are going to REQUIRE fallback to local models, so please think about that when you're designing AI products for home use in the near term.
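The local-first-with-cloud-fallback pattern these posts describe can be sketched roughly as follows. This is an illustrative sketch, not any particular library's API: `run_local` and `run_cloud` are hypothetical stand-ins for whatever local runtime and cloud endpoint a real product would wire in.

```python
# Sketch: prefer the on-device model, fall back to the cloud on failure.
# Both backends are injected as callables so the dispatch logic stays generic;
# the same shape works for the reverse (cloud-first, local fallback).

def generate(prompt, run_local, run_cloud):
    """Try the local model first; fall back to the cloud backend on any error."""
    try:
        return ("local", run_local(prompt))
    except Exception:
        # The cloud path is the fallback here, not the default.
        return ("cloud", run_cloud(prompt))

# Usage with stub backends (purely for demonstration):
def flaky_local(prompt):
    raise RuntimeError("local model not loaded")

def cloud_stub(prompt):
    return f"cloud answer to: {prompt}"

source, answer = generate("hello", flaky_local, cloud_stub)
```

The point of injecting the backends is that the product code never hard-codes which side answers; a home device can swap in a bigger local model later without touching the dispatch.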
enclature
@nom@mk.spook.social
1776 > 1976 > 1984 > 2025
#NANC
I no longer make hard right turns on the 405 fwy doing 70mph at 3am
@sun@shitposter.world @SuperDicq@minidisc.tokyo Not true, really, for those of us in the US who can extrapolate / have some modicum of imagination.
Definitely true if you've become a slave to the big AI models / are trapped in lockstep greater Asia.
@sun@shitposter.world @SuperDicq@minidisc.tokyo Well, I'm guessing you can install Ollama on Android via Termux and bypass OpenAI... you can run the pieces wherever you like. So it can be "distributed" however it suits you.
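For anyone curious how the "run the pieces wherever you like" part works: Ollama serves a small HTTP API (on port 11434 by default), so a client on the phone, or anywhere on the network, just points at whichever host runs the model. A minimal sketch of building such a request, where the host and model name are placeholders, not recommendations:

```python
import json
import urllib.request

def build_generate_request(prompt, model="llama3", host="http://localhost:11434"):
    """Build (but don't send) a request for Ollama's /api/generate endpoint.

    `model` and `host` are placeholders; point `host` at wherever Ollama runs.
    """
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("why is the sky blue?")
# Sending it would be: urllib.request.urlopen(req)
```

Swapping `host` from localhost to a beefier box on the LAN (or a cloud endpoint) is exactly the "distributed however it suits you" knob.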
@vitalis@dirtyknight.life That'd make a fine album cover.
@lain@lain.com @sun@shitposter.world You might also like... Solaris/SunOS!
@lain@lain.com Uh-oh I'll never get hired at nVide-o-o now!
@sun@shitposter.world Almost invariably after screening they had a meeting! I love meetings. My guess:
Bob: "I like it but in that shooting scene it doesn't seem futuristic enough"
Jebediah: "It's ok we'll just add some pink laser stuff on top of the cells, update the audio, reshoot those frames on film and it'll beeeee fine."
Bob: "Sounds great!"