

← 2023-02-07 | 2023-02-09 →
signpost[asciilifeform]: 100%. this thing is going to devour NPC "jobs" for that reason.
signpost[asciilifeform]: and as far as being useless for a real person, I disagree. is full-text search over your personal notes also useless?
signpost[asciilifeform]: how about same over scientific papers, assuming for the moment that a corpus of real ones could be had (the lack of which isn't the fault of this particular tech)
signpost[asciilifeform] keen to try training such a thing on the logs and a few hops out of linked material
signpost[asciilifeform]: might actually have hope of organizing them into topics.
signpost[asciilifeform]: thing has no place substituting for a real human's output, but "hay boss, I saw some similar symbol chains at positions x, y, z in this textpile" mighty useful.
phf: i wrote an ungodly implementation of myers using infix code, http://paste.deedbot.org/?id=zZir
phf: signpost, i don't have full text search for my notes, i'm pretty disorganized
phf: my work focus at the moment is doing some gnarly refactoring that'll drive the project forward, but basically there's nobody else who can do it. once i'm back to day-to-day management functions i'm sure i'll be able to find a use for chatgpt. the other c-level is raving about it by putting it to all kinds of "write me a letter to district" use
signpost[asciilifeform]: I find I do way better if I can prompt myself with reference material.
signpost[asciilifeform] is a better indexer than key-value store.
signpost[phf]: yep, that makes sense. thing does feel about like having a stable of 22yr old dorks.
phf[asciilifeform]: why not grab some useful chatgpt interaction when you are particularly pleased with it, and write an article about it, or even a paste
signpost[asciilifeform] working solo atm too. farting out some stack-level code that generalizes the last few stacks I've had to put together for webshits.
phf: the other c-level is the main chatgpt champion in my life, but every time he sits down to actually demonstrate it, it goes back to "chatgpt tell me a meaning of life?" "...." isn't this amazing!
signpost[asciilifeform]: I should. I had a pretty damned interesting "thread" with the thing applying a jungian lens to Revelation.
signpost[asciilifeform]: all the direction in the conversation came from me, but the thing did well tossing in detail where I wanted it, and which I could go verify myself. comparable archetypical myths, etc
signpost[phf] struggled a lot in reading that book throughout life.
phf[asciilifeform]: i wonder if it'll do better than sokal on a sokal prank!
signpost[asciilifeform]: ah this was the journal-punking event. yeah, I bet it would.
signpost[asciilifeform]: and really, I might be an idiot, but with a dash of humility one might be able to see something like gptism as a stage in the language pipeline. probably running massively parallel, vying to become output.
signpost[asciilifeform]: vetted against a reality simulator, say. I know my own thoughts are something like a narrator speaking a play into being.
phf: that could be a frankenstein fallacy though, like "man is like a loom!" "man is like a steam engine!" "man is like a computer!" etc.
signpost[asciilifeform]: could be. another interpretation is we're getting a better idea as we look at our effect on our surroundings.
signpost[asciilifeform]: enough of "us" is out there to infer more detail
signpost[asciilifeform]: http://logs.bitdash.io/pest/2023-02-07#1022525 << might be that this is blasphemous because god's already said it all in reality - or *as* reality - and nothing outside what's real may be said.
bitbot[asciilifeform]: Logged on 2023-02-07 18:04:09 phf[deedbot]: friend of mine once told me that his highest goal is to produce a thought that's a result of an authentic thought process (or something similar, we were both drunk and smoked too much hookah), but "not being a chatgpt" might become the highest aspiration of life reflected in a po
asciilifeform: http://logs.bitdash.io/pest/2023-02-08#1022542 << consider that most 'npc work' would not even need elaborate ai bot to automate. it exists not because needs doing, but from arbeit macht frei (i.e. they gotta park'em somewhere, give hamster wheel)
signpost[asciilifeform]: dunno what the authentically produced thought would be other than the consequence of the symbol state of his head and the evaluator running forward.
bitbot[asciilifeform]: Logged on 2023-02-08 09:57:47 signpost: 100%. this thing is going to devour NPC "jobs" for that reason.
phf: signpost, as you keep talking the frankenstein fallacy starts peeking through more and more :>
phf: or, not to call it a fallacy, some kind of material realist reductionism. i dunno, should ask chatgpt what to call it
signpost[asciilifeform]: yes, but you're predisposed to think that certain men are special, so we're both starting from beloved axioms, right?
phf: no, i'm platonist, so i think all men are imperfect approximations of a Man, but most can come pretty close
phf: if you take the worst, most debased, most broken of humans, and then put him next to a system with a large corpus, you can easily say "look, the machine has surpassed man", but my only answer to that is not even to debate, but to go "ok? anyway"
signpost[phf]: mhm, but we agree on that.
signpost[phf]: will only replace humans who shouldn't exist in the first place.
phf[asciilifeform]: i don't know if i like that thought either, it feels like desperately trying to pull a ladder from a thing that's flying on its own
phf: like, i get it, there's a certain mirth in the fact that shaquinda from hr will become useless overnight, but shaquinda in hr is in hr not because her job was irreplaceable to begin with. she's there for political reasons
signpost[asciilifeform]: if we're lifting the thread to the political, there's also a "SF couldn't possibly have built something useful" that I smell.
signpost[asciilifeform]: (and for that matter, they didn't. they threw money at what was already public knowledge)
phf: nah, that's not a factor in my motivation at all
signpost[asciilifeform]: but I think the thing I said that provoked balking was the suggestion that a statistical symbol-chainer might live at some level of the pipeline of human language.
signpost[asciilifeform]: necessary but insufficient is all I'd say, and just speculating.
phf: well, no, that wasn't what "provoked balking"; that's what moved the conversation from casual riffing into a more serious consideration
signpost[asciilifeform]: please assume I'm just a brash texan picking words with that priming.
signpost[asciilifeform] not grokking clearly.
phf: so my thought on the subject of chatgpt is that perhaps it solved the kind of problems that one shouldn't have to begin with, but if one does, then it's a good solution, and nothing more.
phf: from the "considerations of the human nature" perspective, i've suggested before that "npcs" do a kind of markov chain generation in their head, and LLM is just closer to being a more useful and practical metaphor
phf: but the state of npc is not an exalted state, it's a deliberate debasement of human nature by the human himself and by external powers
asciilifeform: historically, this kinda 'solution' simply leads to where problem grows to eat up avail. cycles. ( recall '70s 'computers will abolish bureaucracy' lulz )
phf: signpost, that's why i said "can you post an example of a useful interaction", and when you said "well, we had a good conversation about heady philosophical point" my retort was "i'm sure a bullshit generator is good at generating bullshit"
signpost[asciilifeform]: well, that's your level of respect for me as well as the other thing, lol
phf: and i said "can you post" not as a challenge, but as an attempt to get a confirmation that indeed, this thing is pretty good at doing useful junior assistant functions. i believe that might be the case, but not having trusted the system enough myself, i don't have any way to test it. i'm just curious about examples
signpost[asciilifeform]: and I've heard it passed the goog l3 interview
phf: signpost, that's your "didn't get my phd" american chip on the shoulder! i'm suggesting that one can have a very productive philosophical reflection by doing i-ching or a tarot card layout or even just opening a random book one doesn't quite understand and reading random passages. i've done all of those before. it's not "hur dur dum signpost has heady conversations with a bullshit machine", it's "there are all kinds of methods to extract meaning from the meaningless, and human brains are really good at that"
signpost[asciilifeform]: ^ example of a class of requests for which I've found it working well. "bolt system A into system B by some well-known protocol"
signpost[asciilifeform]: sure, that's why I said that all the thoughts worth having came from me, but lemme post the bulleted summary it produced from the conversation.
signpost[asciilifeform]: if I were preparing to write on this subject, this would be mighty useful
signpost[asciilifeform]: I'd prefer the thing hyperlink to sources, and hyperlink to its own logs to reference previously "discussed" material.
signpost[asciilifeform]: plenty of it is "yes, that is the reading list of a first year philosophy / history student"
signpost[asciilifeform]: but that's exactly why I'm talking about the thing as a search tool.
phf[asciilifeform]: (we were listening to rome on the drive to horses yesterday, so when i read "kali yuga" in that list the song that was already stuck in my head popped up again https://www.youtube.com/watch?v=VkVmDQ1_JDk )
phf: http://logs.bitdash.io/pest/2023-02-08#1022602 << i don't know how to interact with this output. what i mean by this is that it's wrong in many places, in code and in text, and doesn't satisfy prompt requirements either.
bitbot[phf]: Logged on 2023-02-08 10:46:03 signpost: http://paste.deedbot.org/?id=jXML
phf[asciilifeform]: so we're kind of back to "reviewing pull requests from 22 year old developmentally challenged junes"
phf: it particularly well captures the "smug certainty combined with in your face wrong answer" element of that experience :D
signpost[asciilifeform]: sounds like a problem of degree though. but I do not have a strong argument for saying so.
phf: what do you mean?
signpost[asciilifeform]: well, like it hallucinated a socket package. is it not trained enough on CL code? did I insufficiently prompt the thing to not pretend other code exists
signpost[asciilifeform]: I often "hallucinate" the existence of other code, write as though it exists, then go implement the meat behind the interface
awt[asciilifeform]: ^ This has been my experience. ChatGPT hallucinated an extremely useful public method that was actually private, and so not so useful.
phf: http://logs.bitdash.io/pest/2023-02-08#1022555 << that's why i asked what i asked, rather than "give me an example"
bitbot[asciilifeform]: Logged on 2023-02-08 10:10:35 phf[deedbot]: why not grab some useful chatgpt interaction when you are particularly pleased with it, and write an article about it, or even a paste
phf: because i can come up with usecases for chatgpt also, but i'm curious about something that came in flow
signpost[asciilifeform] will grab the next example.
phf: because if you start looking at imperfect examples of chatgpt output, well, the result here is ~factually~ invalid, so where do we go from here? if the objective of the original exercise was to get a couple of practical sb-bsd-sockets examples, that obviously didn't work. is the imagined interface good enough? it's a riff on usocket, so i guess it's not bad
signpost[asciilifeform]: been deleting because their UI is shitty for threads
signpost[asciilifeform]: lemme see if I can coach it like the autistic junior dev. sec
phf: but where chatgpt "worked on its own" it's junk. e.g. send-message could've taken format-like arguments, and then one wouldn't need to write (format nil "" ...) in every single send-message invocation
signpost[asciilifeform]: lemme give it your prompt too.
phf: lol, it just replaced socket with sb-bsd-sockets, but sbcl's sb-bsd-sockets interface is different
signpost[asciilifeform]: it is probably abuse to ask it to write in a language that isn't Our Democracy approved.
phf: well, i'm more amused that the experience of "trying to get a june to do the right thing" has been captured ~exactly~ :D
phf: i suspect it doesn't know anything about sb-bsd-sockets, because it can't get those calls right
phf: it's like a fractal of errors, i only just noticed the \r\n in the format string
phf: by the way, since one rarely sees that, and there's no way chatgpt knows anything about it, the way you write this without the extra format is (defun foo (fmt &rest args) (format nil "~?~c~c" fmt args #\return #\newline))
phf: i'm pretty sure sbcl has it wrong, because it's implementation dependent, but the correct way of handling newline/return is at the stream configuration level. you say something like :external-format (utf-8 crlf), and then you can do a simple (format stream "~%") and get both characters spooled
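
A minimal sketch of the ~? trick phf describes, under the assumption that one wants a send-message-style wrapper that takes format arguments directly; the wrapper name and the call site below are illustrative, not taken from the paste:

;; Hypothetical wrapper in the spirit of phf's suggestion: callers pass a
;; format control and arguments, and CRLF is appended in exactly one place,
;; so no call site ever needs its own (format nil ...).
(defun format-crlf (fmt &rest args)
  "Render FMT with ARGS via ~? and append a carriage-return/linefeed pair."
  (format nil "~?~c~c" fmt args #\return #\newline))

;; Illustrative call site:
;;   (write-string (format-crlf "USER ~a 0 * :~a" login realname) stream)
;; The stream-level alternative phf mentions (declaring CRLF line endings in
;; the stream's external format) is implementation dependent and not shown.
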
asciilifeform: http://logs.bitdash.io/pest/2023-02-08#1022619 << seems likely that the training set included a snippet which referred to sumbody's actually existing 'sockets' pkg
bitbot[asciilifeform]: Logged on 2023-02-08 11:07:54 signpost: well, like it hallucinated a socket package. is it not trained enough on CL code? did I insufficiently prompt the thing to not pretend other code exists
asciilifeform: http://logs.bitdash.io/pest/2023-02-08#1022615 << the 1 obv diff is that typically even most 'chromosomally-challenged' indian dev (at least occasionally) ~runs~ the coad
bitbot[asciilifeform]: Logged on 2023-02-08 10:59:41 phf[awt|deedbot]: so we're kind of back to "reviewing pull requests from 22 year old developmentally challenged junes"
phf: i think it's usocket
phf: which i suspect is the most widely used compatibility layer, drakma and hunchentoot both use it, etc.
signpost[asciilifeform]: http://logs.bitdash.io/pest/2023-02-08#1022647 << yup. ben pointed out to me that interaction with a compiler is easy to model as a conversation. would be interesting to see how it deals with that output.
bitbot[asciilifeform]: Logged on 2023-02-08 12:08:25 asciilifeform[jonsykkel|deedbot|awt]: http://logs.bitdash.io/pest/2023-02-08#1022615 << the 1 obv diff is that typically even most 'chromosomally-challenged' indian dev (at least occasionally) ~runs~ the coad
signpost[asciilifeform]: and yeah, runtime errors. wouldn't be hard to bolt an interpreter to this thing when they open the API.
signpost[asciilifeform]: http://paste.deedbot.org/?id=9YTG << example of something that's close enough to right that it saves me time.
signpost[asciilifeform]: can pick nits in intellij for 5min rather than farting around figuring out how JavaPoet expresses what I want it to do.
signpost[asciilifeform]: and yeah, probably made this salad from querydsl, jooq, etc.
phf: what's the underlying sql? it's querying tables like SCHEMATA and TABLES, and i don't know what those are. can it do the same but with postgresql's schema table conventions?
phf[asciilifeform]: i guess the meat relevant to your interests is in generateJavaClasses, and the open question is whether or not the use of TypeSpec and FieldSpec are correct. they probably are, but..
phf: you can't trick me, wizard, by throwing a wall of java boilerplate! just look at generateJavaClasses and what sort of structure it's actually generating
phf: it's utter utter jank, it's worse than common lisp
signpost[cgra]: this was a reproduction of code I ended up using in a current project.
signpost[cgra]: wanted a rigid representation of the schema without a giant ORM dependency.
signpost[cgra]: and yeah, it got the use of JavaPoet mostly right
signpost[cgra]: http://paste.deedbot.org/?id=wYE2 << it misunderstood, because I wanted it to use the postgres-specific catalog, pg_*, but I didn't say specifically either.
phf: i guess you asked it to give you something like FooSchema.FooTable.FooColumn.NAME kind of stuff?
signpost[cgra]: aside, this ROME band is great
signpost[cgra]: http://paste.deedbot.org/?id=OG7P << not gonna make it do the whole thing, but the queries look right
phf: curiously the one place where we have a chatgpt produced code in our codebase (as a kind of "first!" novelty) is also dealing with sql/orm generating bunch of methods/properties from schema
asciilifeform supposes that the time when anyone gaveashit re: fact that microshit (or whoever operates the bot) gets yer entire src, is long past
signpost[cgra]: yep, MSFT is the major investor in OpenAI, and owns shithub
asciilifeform: using the bot is effective same thing, afaik, as uploading yer entire proggy to shithub
signpost[cgra]: yup. item I'm feeding it is BSD licensed code though.
phf: asciilifeform, it's well aligned with the idea that nobody's going to use chatgpt to build a republic. writing java for postgresql schema, writing "thank you very much for talking to me earlier" emails, etc.
phf: which goes back to my point of http://logs.bitdash.io/pest/2023-02-08#1022593
bitbot[asciilifeform|cgra]: Logged on 2023-02-08 10:39:25 phf[awt|deedbot]: so my thought on the subject of chatgpt is that perhaps it solved the kind of problems that one shouldn't have to begin with, but if one does, then it's a good solution, and nothing more.
signpost[cgra]: right. what political failing made the database speak SQL but the program speak... kind of problem
phf: a 22 year old june is the perfect analogy, because you're not going to involve him in the inner workings either
signpost[cgra]: and since this political failing occurred, then need a projection of the one to the other.
signpost[cgra]: still smells too dismissive though, but I don't have a good counterargument to provide yet.
signpost[cgra]: will we ever have one representation system that is so orderly and so reduced that this kind of redundant structure does not occur?
phf[asciilifeform]: well, are we then just reducing it to team trump team biden? because it's evidently Technology, with all kinds of interesting applications, but at present early stage, hampered by various political and social problems
signpost[cgra]: not at all.
signpost[cgra]: meanwhile found some folks that claim to have an open source comparable model. going to fart around with it, the logs, and the nvidia card
phf[cgra]: yeah, there's a bunch of /g/ posts on subj, i want to look into eventually. they mostly use it to write porn fiction :D
signpost[cgra]: who reads porn. bunch of girls
signpost[cgra]: https://pornpen.ai/feed << this thing produces some bizarre shit
phf: well, it's the weird science dream of an e-girlfriend. 'uuuge foking tits, and none of the head cockroaches
phf: i'm more interested in the training part. like how much of a corpus do you need for it to start producing interesting results? how does it behave with a small corpus? etc.
phf: if you just "train it on your notes", are you going to get uncanny valley of own thoughts regurgitated, or it's just going to be jibberish, because not enough corpus
phf: does thing ~only~ work if trained on gigabytes of textual data, and therefore requires extensive amount of 3rd world pajeet in it?
asciilifeform strongly suspects the latter
phf: i guess the first l in llm strongly implies as much
phf: i have more questions, can you train it without it producing the reams of english 101 expositions. "this answer is correct, because i correctly answered the question that you have asked previously." etc
phf: can you distill the information to some kind of schematic reductionist thing, where the "wires" are seen, or is it only ever manifest as extensive fluff because it's inherent to the way the information is generated
phf: why is nobody asking or answering these questions. why is every single article chatgpt-generated "i asked the oracle and it told me correctly! the ai is upon us! told me wrongly! the ai is bullshit"
asciilifeform: cuz it's an imponator.
phf[asciilifeform]: and to the point of the l part of llm, is the process even controllable? like wolfram suggested that chatgpt is fundamentally flawed unless you can shape the output by something like alpha, that is, a curated certainty source.
phf[cgra]: chatgpt troons are doing something to nerf e.g. "write a poem about god emperor trump being the best", which is easily subverted by prompt fuckery
bitbot[cgra]: Logged on 2023-02-08 13:40:06 asciilifeform[jonsykkel|awt]: cuz it's an imponator.
phf[cgra]: so it's obvious that the people who ostensibly designed the system have only a very limited ability to shape its output/input behavior
asciilifeform: reportedly, the 'nerf' is where most of their effort goes
phf: which makes one wonder to what extent the thing can be "analyzed from parts". my current impression is that you either get an amorphous vector space of meaning that collapses into anything at all, OR null
phf: because the nerf is obviously very crude, slightly better than keyword filter kind of stuff
signpost[cgra]: http://logs.bitdash.io/pest/2023-02-08#1022692 << good experiment to run. there are gigabytes of non-pajeet to be had
bitbot[cgra]: Logged on 2023-02-08 13:36:58 phf[awt|deedbot]: does thing ~only~ work if trained on gigabytes of textual data, and therefore requires extensive amount of 3rd world pajeet in it?
phf: signpost, didn't one of ours do an archive of gutenberg txt and the whole thing is something like 500mb?
asciilifeform: signpost: where?
signpost[cgra]: dunno, but I have my own snag of that
asciilifeform: dun recall how much it weighed unzipped, but possible that only coupla GB, summed, of 'non-pajeet', even exist...
asciilifeform: (and with good % of it mixed, like uranium in ocean, with pajeet)
signpost[cgra] resigns, defeated, lol
signpost[cgra]: I'll feed their existing pajeet-trained thing books from prior to 1900, say, for a while, and see what happens.
asciilifeform: doesn't mean it aint worth a try feeding gutenberg, logs, etc. to a bot
asciilifeform tried the former with old-fashioned shannonizers at one pt
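
For readers who haven't met the term, a shannonizer is just a low-order Markov text generator of the kind phf's "markov chain generation" remark alludes to. A minimal word-level sketch, with illustrative names (not anyone's actual code):

;; Build a table mapping each word to the words that followed it in the
;; corpus, then walk the table picking successors at random.
(defun build-chain (words)
  (let ((table (make-hash-table :test #'equal)))
    (loop for (a b) on words while b
          do (push b (gethash a table)))
    table))

(defun babble (table start n)
  "Emit N words starting from START, restarting when a word has no successor."
  (loop repeat n
        for word = start then (let ((nexts (gethash word table)))
                                (if nexts
                                    (nth (random (length nexts)) nexts)
                                    start))
        collect word))

;; e.g. (babble (build-chain (list "the" "cat" "sat" "and" "the" "dog" "ran")) "the" 10)
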
asciilifeform: imho there's nuffin wrong with playing with chatbots, so long as one has realistic expectations, and realizes that the current crop is built 'to impress' with coherent text output, rather than 'cognition' (however else defined)
asciilifeform: folx are readily tricked with 'clean-looking text' given as in humans it correlates with 'someone is home in the crankcase'
asciilifeform: ( and in most cases 'impress' takes very little. recall weizenbaum's story re eliza & his secretary )
signpost[cgra]: something even a few hairs better than full text search over a selectable corpus would be useful.
asciilifeform historically suffered many moar headaches with 'smart' tools, e.g. non-greplike search, than with 'dumb' fwiw
asciilifeform: see also kreinin's 'computers you can't program' piece re subj
phf: of course meanwhile signpost will make his hundreds of thousands with chatgpt assist, retire in tx as a proper upper middle class, little trinkets running all over, etc.
asciilifeform: possibly
asciilifeform: (if microshit doesn't simply robotize his co's service using his own src, lol)
phf: in totally unrelated news, i wired the gui to the kahn packet sorter; first thing it did was stuff an accidentally unconnected recent message at the beginning of the log..
asciilifeform: phf: what's a kahn sorter ?
phf: asciilifeform, packets that come in get placed in a netchain/selfchain linked graph, which is then kahn sorted, and the sort is compared against the previous sort using myers. it's all not very optimal, but good enough for now
phf: myers gives you an update script, so you can go replace parts of output without full redraw
phf: i mean it's overkill for the way the packets behave, but theoretically at least it accommodates arbitrary injections of missing packet subgraphs
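
A hedged sketch of consuming such an update script; the op shapes (:keep n), (:delete n), (:insert items) are an assumption for illustration, not phf's actual representation. The :keep runs are the spans a GUI can leave undrawn.

;; Rebuild the new line list from the old one plus a Myers-style edit script.
;; In a display, the :keep spans need no redraw; only :insert/:delete regions do.
(defun apply-edit-script (script old)
  (let ((pos 0)
        (out '()))
    (dolist (op script (nreverse out))
      (ecase (first op)
        (:keep   (loop repeat (second op)
                       do (push (nth pos old) out)
                          (incf pos)))
        (:delete (incf pos (second op)))       ; skip over removed old lines
        (:insert (dolist (item (second op))    ; splice in the new lines
                   (push item out)))))))
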
phf: so i finally brought it up, and the immediate result is that the kahn sort doesn't do the right thing
phf: it needs to somehow incorporate timestamps in order resolution. that is, when everything else is equal, it should place packets with higher timestamps closer to the end of the list, but i don't quite know how to do that yet
phf[asciilifeform]: and to tie the two threads together, i've noticed past few days (since i came back to this problem), every time i search top results are filled with total crap articles
phf: from sites like programmersought.com or geeksforgeeks.org. i've not seen those before, and i'm pretty sure that's the first case of chatgpt generated junk
billymg[asciilifeform]: phf: i've noticed that as well, for at least the last few months, perhaps closer to a year for certain domains, and becoming more and more common
billymg[asciilifeform]: all with the same style table of contents at the top of the page
billymg[asciilifeform]: and formatted sort of like an FAQ page
phf: i'm also searching for some very computer science 101 kind of term combinations, so the results are particularly abysmal
asciilifeform: phf: noticed over9000 of these when looking for various cppisms
asciilifeform thought they were simply the usual stackoverflow rips, but not dug in deeply
phf: woop, replacing the stack with a priority queue in the kahn sort fixes the problem
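
A minimal sketch of that fix, assuming accessor functions for a packet's parents and timestamp are passed in (the names and the hash-table graph layout are illustrative, not phf's code); the ready set is kept ordered by timestamp so that, dependencies aside, older packets surface first and higher timestamps drift toward the end:

;; Kahn's topological sort with the ready set ordered by timestamp: among
;; packets whose parents have all been placed, the oldest goes out first.
(defun kahn-sort-by-time (packets parents-fn timestamp-fn)
  (let ((in-degree (make-hash-table :test #'eq))
        (children  (make-hash-table :test #'eq))
        (ready     '())
        (result    '()))
    (dolist (p packets)
      (setf (gethash p in-degree) 0))
    (dolist (p packets)
      (dolist (parent (funcall parents-fn p))
        (when (nth-value 1 (gethash parent in-degree)) ; ignore parents we don't have
          (incf (gethash p in-degree))
          (push p (gethash parent children)))))
    (dolist (p packets)
      (when (zerop (gethash p in-degree))
        (push p ready)))
    (setf ready (sort ready #'< :key timestamp-fn))   ; poor man's priority queue
    (loop while ready
          do (let ((p (pop ready)))
               (push p result)
               (dolist (c (gethash p children))
                 (when (zerop (decf (gethash c in-degree)))
                   (setf ready (merge 'list (list c) ready #'< :key timestamp-fn))))))
    (nreverse result)))
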
phf: !!help
phf: hmm, test
phf: !!help
phf: so now incremental update is broken, but i can still run with a full refresh, which with 250 packets takes .5s
phf: !!help
phf: !!help
phf: apologies for spam
phf: this myers implementation with parts done using infix notation is so ugly, i love it http://paste.deedbot.org/?id=QPOZ
phf: this is probably not what those "can't we have lisp, but with an algol syntax?" people meant, huh
asciilifeform: phf: what are the odd uniturds in there ? (show up as 'diamonds' on asciilifeform's box)
phf: that's a symbol i hung the dispatch macro from! for extra alien weirdness
asciilifeform was starting to see
asciilifeform: was that macro in an earlier paste ? wat's it look like
phf: no, i started writing one myself, but then remembered there was infix.cl in the cmu archive
phf: i patched it to support some extra pythonic syntax (x//y and x[y:z], though the latter i'm still working on, so the paste above still uses (subseq ...) rather than the new syntax)
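
For context, the stock infix.cl (Mark Kantrowitz, CMU AI repository) installs an #I(...) dispatch macro; the expansion shown below is approximate, and phf's copy differs in hanging the dispatch off a non-ASCII character and adding the pythonisms he lists.

;; With infix.cl loaded, an expression such as
;;   #I( v[i] + d*k - 1 )
;; reads as roughly
;;   (- (+ (aref v i) (* d k)) 1)
;; i.e. array indexing becomes AREF and the usual precedence rules apply.
;; The same expression written out by hand, for comparison:
(defun example-index (v i d k)
  (- (+ (aref v i) (* d k)) 1))
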
asciilifeform: linked src says there was a similar bolixism, but asciilifeform not recalls seeing infixism in any of the leaked srcs there (maybe missed)
phf: i've not ever seen the infix hack in official source code, but the various home directories i have contain hacks that use the infix notation here and there. mostly in the style of the code above: when you need to write an unpleasant mathematical formula
phf: test
phf: !!help
phf: now we're cooking with fire
← 2023-02-07 | 2023-02-09 →