discord_bridge[asciilifeform]: (awtho) Been messing around with AI. Managed to get a prototype working w/o editing a single line of code - end result is a python web app. All the power of a team of code monkeys in Ukraine at a fraction of the cost.
asciilifeform: awt: what's a galley fulla ukrs cost these days, anyway?
discord_bridge[asciilifeform]: (awtho) No idea? $40/hr? Honestly no clue
asciilifeform: gpu farm also costs sumthing neh
asciilifeform experimented, like errybody else, with gptism at the tail end of his commercial work days, but it was a uniquely terrible fit for the line of work he was in
asciilifeform: i.e. moar or less the exact opposite of 'generate 100kB of boilerplate for www app'
discord_bridge[asciilifeform]: (awtho) I'm using Cline in Visual Studio. First plugin I've found that will work at the project level rather than just the file level.
discord_bridge[asciilifeform]: (awtho) Makes sense. I'm delving deep into platform-based liquishit here.
asciilifeform: the days of 'get $$ for www boilerplate' are prolly numbered. sorta like how starving immigrants could get paid to wash dishes in restaurant, but prolly nowhere nao, they use machines
asciilifeform: (while the days of 'get $$ to fix what gpt users subtly fucked' not yet arrived, nor obvious that they will, likely things will simply stay subtly fucked indefinitely)
discord_bridge[asciilifeform]: (awtho) Scratching my own itch here. I frequently get emails from customers with the same issues. Trying to put something in front that does some triage before I have to deal with it.
asciilifeform: a, the 'q&a chatbot' a la lulazon et al support ?
discord_bridge[asciilifeform]: (awtho) Over email, but yes.
asciilifeform would expect it'd be infuriating to folx w/ 'over roomtemp iq', like lulazon's, but not knows who is the customer base, so can't say for sure
discord_bridge[asciilifeform]: (awtho) Not over roomtemp iq
asciilifeform would rather read an ordinary text faq than fight with a bot, errytime
asciilifeform: a then wainot.
discord_bridge[asciilifeform]: (awtho) Yes specifically happens with the customers that are at roomtemp IQ.
asciilifeform: there's a species of humanoid that wants to 'feel served' 1st, and wants actually to resolve a question only a distant 2nd
asciilifeform: for these, 'eliza'-level 'ai' would have prolly sufficed, not sure wai the bots had to wait for gptism etc
discord_bridge[asciilifeform]: (awtho) It's just regular people who don't understand image resolution and transparency and how that translates to the quality of a finished product.
billymg[asciilifeform]: http://logs.bitdash.io/pest/2024-12-27#1034622 << neat, have you also tried Continue? if so, how do they compare?
billymg[asciilifeform]: i've been using Continue in VS Code backed by a locally running model, "DeepSeek-Coder V2 Lite". it's been great for certain things, also handles spanish localization pretty well
billymg[asciilifeform]: i've not tried having it write a full app though, mostly just use it to generate boilerplate snippets
billymg[asciilifeform]: i think what i like most about it is that it makes writing webshit a lot less tedious. it acts as a sort of motivator, by removing the "ugh" hurdle to sitting down and making progress
billymg[asciilifeform]: but it's possible that wears off over time
discord_bridge[asciilifeform]: (awtho) I haven’t tried Continue. I’ll check it out.
discord_bridge[asciilifeform]: (awtho) Yeah Cline made a bunch of boring work disappear: oauth, email threads, writing tests…
billymg[asciilifeform]: are you using an API for the GPTism or running something locally?
discord_bridge[asciilifeform]: (awtho) I’m using Claude via api.
discord_bridge[asciilifeform]: (awtho) Running into the token limits is slowing me down on tasks with a lot of context.
billymg[asciilifeform]: how much vram you got?
discord_bridge[asciilifeform]: (awtho) On my MacBook pro? Lol
discord_bridge[asciilifeform]: (awtho) I have a very heavy duty box at home but don’t know the stats.
billymg[asciilifeform]: https://huggingface.co/bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF << probably won't match the Claude API but the whole thing fits on a 24gb consumer card
discord_bridge[asciilifeform]: (awtho) I’m gonna try it.
billymg[asciilifeform]: i'm using "Q8_0" quant type. since i've not tried any of the API models i'd be curious to hear how you think it compares in your use case
billymg[asciilifeform]: i serve the model locally with: https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#llama-server, which the Continue plugin talks to
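(A minimal sketch of that setup from the client side, in python since that's the language of awt's app. It assumes llama-server is already running with the Q8_0 .gguf loaded and listening on its default 127.0.0.1:8080 OpenAI-compatible endpoint; adjust the url if your build differs. The Continue plugin talks to this same server.)

    # Minimal sketch: query a locally running llama-server (llama.cpp's
    # built-in HTTP server) via its OpenAI-compatible chat endpoint.
    # Assumes the server was started with the Q8_0 .gguf already loaded
    # and is on the default host/port; change URL if yours differ.
    import json
    import urllib.request

    URL = "http://127.0.0.1:8080/v1/chat/completions"

    def ask(prompt: str, max_tokens: int = 256) -> str:
        """Send one chat-completion request and return the reply text."""
        payload = {
            # llama-server serves whatever .gguf it loaded; the "model"
            # field here is just a placeholder for the OpenAI-style schema.
            "model": "local",
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": max_tokens,
            "temperature": 0.2,
        }
        req = urllib.request.Request(
            URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            body = json.loads(resp.read().decode("utf-8"))
        return body["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        print(ask("write a python function that reverses a linked list"))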
asciilifeform: awt, billymg : any speshul sauce req'd to try these on crapple? ( asciilifeform atm has an 'm2 ultra' w/ 192GB ram that's mostly idle, lol )
billymg[asciilifeform]: asciilifeform: can build the llama.cpp thing for metal https://github.com/ggerganov/llama.cpp/blob/master/docs/build.md#metal-build
asciilifeform: billymg: ty!
asciilifeform wonders how performance would compare to the moar typical pcie gpu
billymg[asciilifeform]: from there can play around with any .gguf format model you find on huggingface
asciilifeform has the box simply on acct of it having been, at the time bought, 'fastest gcc box $ could buy', but not yet even tried gpuism of any kind on subj
billymg[asciilifeform]: asciilifeform: not sure, i think if you compare to top of the line nvidia cards it's a lot slower, but that's assuming the person with the nvidia card has enough vram to fit the whole model. usually they don't, whereas on your mac you've got the 192gb of 'unified' ram that the gpu cores can use
asciilifeform: sadly haven't a recent nvidia card to compare to, but oughta suffice for simple experiments
billymg[asciilifeform]: nvidia intentionally gimps the vram on consumer cards for this reason, so if you want more than 24gb at the moment you gotta pay ~4x what a 4090 costs for the A6000 (ada model)
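(Rough sizing arithmetic behind the '24gb consumer card' vs '192gb unified ram' point, weights only, ignoring kv cache and runtime overhead. The ~16B parameter count for DeepSeek-Coder-V2-Lite and the bits-per-weight figures are approximations, so treat the output as ballpark:)

    # Back-of-envelope model sizing: weights only, no kv cache or overhead.
    # Parameter counts and bits-per-weight are approximate; rough lower
    # bound, not a guarantee that a given model fits on a given card.
    QUANT_BITS = {"F16": 16, "Q8_0": 8.5, "Q4_K_M": 4.8}  # approx bits/weight

    MODELS = {
        "DeepSeek-Coder-V2-Lite (~16B)": 16e9,  # approximate param count
        "70B-class model": 70e9,
    }

    def weight_gb(params: float, bits_per_weight: float) -> float:
        return params * bits_per_weight / 8 / 1e9

    for name, params in MODELS.items():
        for quant, bits in QUANT_BITS.items():
            print(f"{name:34s} {quant:7s} ~{weight_gb(params, bits):6.1f} GB")

    # ~16B at Q8_0 comes to roughly 17 GB of weights, hence "fits on a 24gb
    # consumer card"; a 70B-class model at Q8_0 (~74 GB) wants something
    # like the m2 ultra's 192GB of unified ram.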
asciilifeform: if the gpttrons could do, for the sorta things asciilifeform does, what they do for 'make me http server boilerplate' folx, the cost wouldn't seem excessive to asciilifeform
asciilifeform: but afaik atm lolno
billymg[asciilifeform]: afaict it really is just an advanced autocomplete, so only works for things that have been written before
asciilifeform: not necessarily useless on complex problems simply from this fact. theoretically, could be the proverbial 'sufficiently smart compiler'(tm)
asciilifeform suspects that there are plenty of 'low hanging fruit' simple patterns in all kindsa domains that haven't been usefully, systematically mechanically detected yet.
asciilifeform: afaik the irons req'd to actually train gptistic models de novo, tho, cost $maxint
asciilifeform if not for this, would've tried already the mega-obvious application: to sniff for interestingly weak rngs...
asciilifeform: all the coinz sitting in addrs genned on winxp etc., potentially, waiting to be excavated...
asciilifeform: 'hypertrophied autocomplete' in principle could make short work of keys that came outta gimped rng. (considering that it aint the least bit difficult to bring up a 2009-style ms victim's box and get as many samples as could ever want)
asciilifeform: analogously to e.g. petro prospecting, just about any degree of success would handily pay for whatever irons...
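(A toy illustration of the 'hypertrophied autocomplete vs. gimped rng' idea, with a trivial frequency predictor standing in for the trained model. The weak_lcg below is an invented example generator, not any actual winxp rng, and the sketch only measures predictability of the output stream; it recovers no keys:)

    # Toy sketch: how predictable is an rng's byte stream to a simple
    # learned predictor? A real attempt would train a proper sequence model
    # on samples harvested from a period-correct victim box; here an
    # order-2 frequency table stands in for the model.
    import os
    from collections import Counter, defaultdict

    def weak_lcg(n: int, seed: int = 12345) -> bytes:
        """Deliberately weak generator: low byte of a power-of-two-modulus
        LCG. The low 8 bits of such an LCG are themselves a tiny LCG mod
        256, so each byte is a deterministic function of the previous one."""
        out, x = bytearray(), seed
        for _ in range(n):
            x = (1103515245 * x + 12345) % (1 << 31)
            out.append(x & 0xFF)
        return bytes(out)

    def predictability(stream: bytes, order: int = 2) -> float:
        """Fraction of bytes guessed correctly by an order-`order` frequency
        model trained on the first half of the stream, tested on the second."""
        half = len(stream) // 2
        train, test = stream[:half], stream[half:]
        table = defaultdict(Counter)
        for i in range(order, len(train)):
            table[train[i - order:i]][train[i]] += 1
        hits = total = 0
        for i in range(order, len(test)):
            ctx = test[i - order:i]
            if table[ctx]:
                guess = table[ctx].most_common(1)[0][0]
                hits += (guess == test[i])
            total += 1
        return hits / total if total else 0.0

    if __name__ == "__main__":
        n = 200_000
        print("weak lcg   :", round(predictability(weak_lcg(n)), 3))    # close to 1.0
        print("os.urandom :", round(predictability(os.urandom(n)), 3))  # close to 1/256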