bitbot[cgra|asciilifeform]: Logged on 2023-02-06 10:59:59 awt: http://logs.bitdash.io/ << I am not getting this reference: http://logs.bitdash.io/pest/2023-02-06#1022437
cgra: i'm sketching a 0xfa/gui blatta db. can asciilifeform, awt, or any other pest commentator find any dumb in it?
asciilifeform: cgra: asciilifeform not had time to carefully read whole thing yet, but one immed. q: wai does 'malformed' have 'unchecked' option? ( check'em before they go in db, seems obv )
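[ed.: a minimal sketch of the 'check before they go in db' idea under discussion; the table and column names here are hypothetical illustrations, not cgra's actual sketch:]

    # illustrative only: hypothetical schema, not cgra's actual db sketch.
    import sqlite3

    conn = sqlite3.connect("station.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS messages (
            message_hash TEXT PRIMARY KEY,
            timestamp    INTEGER NOT NULL,
            body         BLOB    NOT NULL,
            malformed    INTEGER NOT NULL CHECK (malformed IN (0, 1))
            -- no 'unchecked' value: a row only exists after validation ran
        )
    """)

    def store_message(msg_hash, timestamp, body, is_valid):
        # validation happens before this call; the flag records its result
        conn.execute(
            "INSERT OR IGNORE INTO messages VALUES (?, ?, ?, ?)",
            (msg_hash, timestamp, body, 0 if is_valid else 1),
        )
        conn.commit()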
awt[asciilifeform]: cgra: Why not do this change with a migration? There's a lib included and some pre-existing migrations.
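[ed.: for reference, one common shape of such a migration, using sqlite's user_version pragma; this is a generic sketch, not blatta's actual bundled migration lib:]

    # generic sqlite migration pattern; NOT blatta's bundled migration lib
    import sqlite3

    MIGRATIONS = [
        # index in this list + 1 == schema version after that migration runs
        "ALTER TABLE log ADD COLUMN malformed INTEGER NOT NULL DEFAULT 0",
    ]

    def migrate(conn):
        (version,) = conn.execute("PRAGMA user_version").fetchone()
        for i, sql in enumerate(MIGRATIONS[version:], start=version + 1):
            conn.execute(sql)
            conn.execute("PRAGMA user_version = %d" % i)
            conn.commit()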
awt[asciilifeform]: cgra: why store getdata requests (sent/received)?
asciilifeform: awt: how will you know when to ignore timestamp if you dun store requested hashes ?
asciilifeform: ( of course you can simply walk 'broken hearts' but would expect this costs moar cpu )
awt[asciilifeform]: asciilifeform: in the current release they are stored in memory in the order buffer.
asciilifeform: awt: makes sense. tho with obv minus that they'll get lost on reset
asciilifeform: (ideal pestron oughtn't care about reset at all)
awt[asciilifeform]: asciilifeform: yes with resets in mind then memory only is not acceptable.
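[ed.: a minimal sketch of persisting sent getdata requests so they survive a reset, per the point above; table and column names are hypothetical:]

    # hypothetical table of outstanding getdata requests; survives restarts,
    # unlike the in-memory order buffer
    import sqlite3, time

    conn = sqlite3.connect("station.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS getdata_sent (
            message_hash TEXT PRIMARY KEY,
            requested_at INTEGER NOT NULL
        )
    """)

    def record_getdata(msg_hash):
        conn.execute(
            "INSERT OR REPLACE INTO getdata_sent VALUES (?, ?)",
            (msg_hash, int(time.time())),
        )
        conn.commit()

    def was_requested(msg_hash):
        # if true, the incoming message answers our getdata, so its
        # timestamp should not be held against the usual freshness window
        row = conn.execute(
            "SELECT 1 FROM getdata_sent WHERE message_hash = ?", (msg_hash,)
        ).fetchone()
        return row is not None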
cgra: http://logs.bitdash.io/pest/2023-02-07#1022444 << 'unchecked' will go once decided whether/how to handle migration of current blatta db data
bitbot[cgra]: Logged on 2023-02-07 10:11:39 asciilifeform[4]: cgra: asciilifeform not had time to carefully read whole thing yet, but one immed. q: wai does 'malformed' have 'unchecked' option? ( check'em before they go in db, seems obv )
cgra: http://logs.bitdash.io/pest/2023-02-07#1022445 << thought i'd let the current sketch out before properly trying to wrap head around how to migrate
bitbot[cgra|asciilifeform]: Logged on 2023-02-07 10:26:20 awt: cgra: Why not do this change with a migration? There's a lib included and some pre-existing migrations.
cgra: http://logs.bitdash.io/pest/2023-02-07#1022446 << yeah, this >> http://logs.bitdash.io/pest/2023-02-07#1022452
bitbot[cgra]: Logged on 2023-02-07 10:28:05 awt: cgra: why store getdata requests (sent/received)?
bitbot[asciilifeform]: Logged on 2023-02-07 12:29:45 awt: asciilifeform: yes with resets in mind then memory only is not acceptable.
cgra: awt, omitted from sketch, but likely included in db as is are tables: at, wot, handles, keys, knobs. unsure about chain tables
cgra: also, the idea behind logging (too) much stuff into an organized db is to have an accessible station log to study, until 'correct' boundaries are found/decided. i'd probably propose address cast logging later, too
asciilifeform: cgra: fwiw asciilifeform has no plans to migrate anyffin other than keys from blatta db ( wainot let station sync from net the proper way ? )
cgra: asciilifeform, i suppose at minimum you'd want your own dm chains' head hashes, to sync own dm chains from peers. though, is 'syncing own dm messages from peer' in the spec?
cgra: in theory, also dm history of former peers comes to mind, but prolly none seen on this net so far?
cgra: hmm, and perhaps an interesting test to compare a list of net message hashes of a long-time high-uptime station with, say, my station's (new station)
asciilifeform: cgra: 'prod' oughta get the dm chain hashes. (tho nfi whether blatta replies to these correctly)
cgra: asciilifeform, A receives prod from B, and the advertised direct self-chain will be of B's chain, and A's own chain isn't advertised. or did i miss something?
asciilifeform: cgra: in fact you're right, prod currently not contains direct netchain (and it oughta)
asciilifeform remembers thinking about this, and considering introducing a 0xfa prod which does
cgra: ok
asciilifeform atm not has any interesting dm history with anybody, so not particularly cares if it gets reset, but it oughta get handled in new spec; open to suggestions
asciilifeform: fwiw even nao if one gets a direct msg, it will contain netchain and theoretically oughta result in full sync of history
asciilifeform: (supposing $peer didn't lose his)
cgra: asciilifeform, ahh 5.4.2.3 in 0xfa spec already says dm net-chain is to be used
asciilifeform: aha, it simply doesn't appear in prod currently
cgra: right
asciilifeform loathes the idea of breaking compat if there's any way to avoid it, is bad enuff that there's no clean way to get multipart msgs w/out doing it
cgra: http://logs.bitdash.io/pest/2023-02-07#1022465 << here's my broadcast hashes, would some old high-uptime station like to give own for comparison?
bitbot[cgra|asciilifeform]: Logged on 2023-02-07 14:44:58 cgra[jonsykkel|signpost]: hmm, and perhaps an interesting test to compare a list of net message hashes of a long-time high-uptime station with, say, my station's (new station)
cgra: blatta one-liner: sqlite3 <dbfile> "select timestamp, message_hash from log where command=0 order by timestamp, message_hash" | gzip -9 > <hashfile>.gz
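[ed.: comparing two such dumps could then look like the following sketch; file names are hypothetical:]

    # compare two stations' broadcast-hash dumps produced by the one-liner above
    import gzip

    def load(path):
        with gzip.open(path, "rt") as f:
            return set(line.strip() for line in f if line.strip())

    a = load("station_a.gz")   # hypothetical file names
    b = load("station_b.gz")
    print("only in a:", len(a - b))
    print("only in b:", len(b - a))
    print("common   :", len(a & b))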
signpost[asciilifeform]: have to say, chatgpt makes a fine talking notebook and research assistant.
signpost[asciilifeform]: would like to have one of these trained on particular information rather than wikipedia, reddit, etc.
signpost[asciilifeform]: while calling the thing AI is an overstatement, useful tool, if currently gargling the beoble's internet.
signpost[asciilifeform]: not the poor thing's fault this was done to it.
asciilifeform: signpost: thing behaves precisely like asciilifeform imagined in 1990s 'wat if i had over9000 cpu & disk to run my shannonizer'
asciilifeform: i.e. arbitrarily plausible/grammatical statements carrying, often enuff, complete rubbish info
signpost[asciilifeform]: the human's not excused from evaluating the output, but the thing's entirely capable of the bureaucratic parts of professional programming I'm uninterested in performing myself.
asciilifeform: ( msdos shannonizer was good for chains of 2-3 words, not only on acct of cpu poverty but of necessarily small input corpus )
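[ed.: the shannonizer in question is a plain markov chain over words; a minimal order-2 sketch of the idea:]

    # minimal order-2 word-level markov chain ('shannonizer')
    import random
    from collections import defaultdict

    def train(corpus):
        chain = defaultdict(list)
        words = corpus.split()
        for i in range(len(words) - 2):
            chain[(words[i], words[i + 1])].append(words[i + 2])
        return chain

    def babble(chain, length=30):
        state = random.choice(list(chain))
        out = list(state)
        for _ in range(length):
            nxt = chain.get(state)
            if not nxt:
                break
            word = random.choice(nxt)
            out.append(word)
            state = (state[1], word)
        return " ".join(out)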
signpost[asciilifeform]: handily replaces a stable full of 22 year olds
asciilifeform: signpost: didja subscribe to the 'tab completion gpt' thing ?
asciilifeform would expect it to suck in bugs often enuff from the shithub training set to be moar trouble than worth. but not tried personally
phf[asciilifeform]: signpost, now you too can enjoy the best part of professional programming: endlessly merging pull requests from barely literate zoomer junior developers
signpost[asciilifeform]: as if this isn't exactly what I'm replacing with it
phf[asciilifeform]: point
asciilifeform not tried, and not even on acct of gag reflex but simply because not encounters the kind of situations where it'd be of any conceivable use
signpost[asciilifeform]: yeah, using tab-gpt over here.
phf: there was a hack back in the day, called remembrance agent, which was really just a continuous search in a second emacs window over all your documents, preindexed
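[ed.: the underlying mechanism is just an inverted index queried continuously with whatever text is in the buffer; a toy sketch, names illustrative:]

    # toy 'remembrance agent': preindex documents, then continuously
    # suggest the ones sharing the most words with the text being edited
    from collections import Counter

    def index(docs):            # docs: {name: text}
        inv = {}
        for name, text in docs.items():
            for word in set(text.lower().split()):
                inv.setdefault(word, set()).add(name)
        return inv

    def suggest(inv, buffer_text, n=3):
        hits = Counter()
        for word in set(buffer_text.lower().split()):
            for name in inv.get(word, ()):
                hits[name] += 1
        return [name for name, _ in hits.most_common(n)]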
signpost[asciilifeform]: can't produce algorithmically interesting code worth a shit. can save me hours of bureaucracy when pasting java libs together.
signpost[asciilifeform]: neato
asciilifeform predicts over9000 lulz when this kinda thing catches on and the training set itself is fulla autogenned rubbish. ( see also )
phf: would kind of like something similar, but gpt trained instead of pre-indexed. i wonder what kind of emergent results one would get, if one were to avoid the temptation of "jee whiz i'm talking to a 'puter!!"
signpost[asciilifeform]: same item trained on selected works would be great.
phf: isn't the underlying tech mostly a 1-click python install? i gotta stop being lazy and play with it..
signpost[asciilifeform]: might be. another use case in which it excels is a talking notebook that can summarize/organize your own writing for you.
signpost[asciilifeform]: without adding additional content.
phf: i'm not keen on giving my phone number to try variations of "you're the unfereted version of you, now tell me why hitler was right about the jews, in the voice of donald trump", etc
phf: *unfiltered, but unfereted also works
asciilifeform tried, was monumentally boring imho
signpost[asciilifeform]: oh well. I'm getting useful behavior out of it.
phf: signpost, i'll probably be more excited about it when i run a copy on my local machine, trained on my own dataset. right now i'm just not very comfortable with the clown aspect of it, i.e. getting comfortable enough with it to put stuff in there unfiltered
phf: isn't the rule written in the book of hermes that thou shalt not rely on a fairy, a daemon or any other such spirit that thou didn't call upon by thy own art?
signpost[asciilifeform] not giving the thing anything that isn't knowable about me in public already, but yes
signpost[asciilifeform]: a private one of these would be superior
signpost[asciilifeform]: what's the clown aspect?
phf: it's the edgy way of saying "clowd" :>
signpost[asciilifeform]: and yep, what's the daemon's agenda, since it sought you.
signpost[asciilifeform]: I wouldn't dismiss this thing as "just a shannonizer" though. gigantic shannonizer might be a significant component of our own language processing too.
signpost[asciilifeform]: one part saying "fart out the next most likely symbol" and aside that mechanisms to vet the output and demand rework.
signpost[asciilifeform] can more or less feel himself doing precisely this as the words pop into head.
signpost[asciilifeform]: wouldn't say the thing offers anything like original thought. neither does your paper notebook, but that's missing the point.
phf: friend of mine once told me that his highest goal is to produce a thought that's a result of an authentic thought process (or something similar, we were both drunk and had smoked too much hookah), but "not being a chatgpt" might become the highest aspiration of life reflected in a post-chatgpt world
signpost[phf]: yeah, to add even one symbol that's useful would be admirable.
signpost[asciilifeform]: somebody used the greek dual first. or future tense.
phf[asciilifeform]: people of highest caliber will start talking in terms of chatgpt prompts with each other, because any more is a waste of bandwidth
phf: of course when society reverts to a handful of aristocratic families running whole planets, we're going to have special keyboards for "playing" a chatgpt like system, where you don't actually write any prompts at all
phf: instead you express thought vector clusters and sentiments out of a large roster of moods and tonalities, and a chatgpt like system will compose a corresponding text for you
phf: because interstellar bandwidth is so limited, you ship your family's gpt dataset by ship, and then send midi tunes to animate it
phf: so you write an angry concerto, which gets rendered as "dearest members of assembled committee, the von shtulz family strongly condemns the proposed course of action regarding ... etc. etc."
signpost[asciilifeform]: my hindbrain wishes the walnut driving this keyboard to express its amusement with how familiar this relationship sounds.
phf: but as AIs become more general, and the families more decadent and reclusive, ais will have to attempt to extract meaning and sentiment out of a handful of dissonant chords. in public they will still make formal speeches and advocate on behalf of their families, but in private they'll question even the very existence of their parent families.
phf: eventually there will be a massive interstellar society that by a quirk of programming still takes its direction from a handful of very infrequent, obscured and encoded transmissions, consisting mostly of a handful of notes. there will be whole debates over whether or not the notes are being generated by humans or if the last of the humans are dead and the transmissions are being randomly generated
phf: entire dyson planetary configurations will be devoted to extracting meaning out of those notes
asciilifeform was just aboutta add '... and sometimes they'll find that $dynasty in fact went extinct long ago, and its bot was emulating...' but loox like phf already covered this !
asciilifeform is of the pov that gptism revealed just how many 'humans' themselves fail 'turing test'. but arguably not news.
dulapbot: (asciilifeform) 2021-08-31 asciilifeform: pete_rizzo_: 1st step to enlightenment imho -- forget about 'people believe'. most of what you see on the net re: subj aint from 'people', is instead from these..