
Results 1 ... 44 found in trilema for 'dataflow'

mircea_popescu: there's also a II) dataflow control failure. these are all the "add" kids (so agitated they're saying something, the WHAT they're saying gets lost in dribbles). these are the manic-depressives, people so concerned with their "status" they fail to actually produce anything, like a robot that's 99% sensors and 0% servos.
asciilifeform: verilog etc have approx as much to do with general-purpose dataflow as arm-flapping has to do with aviation.
phf: i think the thread half-heartedly mentioned dataflow (via verilog/vhdl), but by the time i saw it my attention waned
asciilifeform: spoiler for readers : linked piece mentions dataflow 0 times , instead proposes the usual idiocy of 'immutability' and 'very large numbers of hardware-scheduled threads' etc
esthlos: somewhat on topic, as someone still reading logz: is eventual plan tmsr dataflow lispm revival? heathen iron has to be abandoned, no?
asciilifeform: dataflow fabric can express e.g. carry-save-adders as well as anything else.
asciilifeform: on the other hand, pipeline idea per se was a mistake; same kind of failure to invent dataflowism as dma
ben_vulpes: asciilifeform: i was hoping for e.g. trinary circuits, dataflow fabric, but was quickly disappointed.
asciilifeform: the correct world is 'dataflow' (see logz) where every op signals output-ready, and no op proceeds until all inputs signal same.
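The firing rule described in the line above can be sketched in plain Python. This is a hypothetical illustration, not anything from the log: `Node`, `receive`, and the example graph are invented names, and real dataflow hardware would of course do this in parallel silicon rather than recursive function calls. A node fires only once every input slot holds a token, then signals each successor.

```python
# Minimal dataflow firing-rule sketch (illustrative only): a node "fires"
# only when all input slots hold tokens, then pushes its result token to
# each successor -- no program counter, no scheduler.

class Node:
    def __init__(self, op, n_inputs):
        self.op = op                    # function applied when the node fires
        self.slots = [None] * n_inputs  # pending input tokens
        self.successors = []            # (node, slot_index) pairs to signal

    def connect(self, succ, slot):
        self.successors.append((succ, slot))

    def receive(self, slot, token):
        self.slots[slot] = token
        if all(s is not None for s in self.slots):   # all inputs ready?
            result = self.op(*self.slots)            # fire
            self.slots = [None] * len(self.slots)    # consume tokens
            for succ, i in self.successors:
                succ.receive(i, result)              # signal output-ready

# (a + b) * c as a small graph, with a sink node collecting the result
results = []
sink = Node(lambda v: results.append(v) or v, 1)
mul = Node(lambda x, y: x * y, 2)
mul.connect(sink, 0)
add = Node(lambda x, y: x + y, 2)
add.connect(mul, 0)

add.receive(0, 2)      # tokens may arrive in any order;
mul.receive(1, 10)     # nothing fires until all inputs are present
add.receive(1, 3)      # add fires -> 5, then mul fires -> 50
```

Note that order of arrival does not matter: `mul` idles with one operand until `add` fires and supplies the other, which is exactly the "no op proceeds until all inputs signal ready" discipline.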
asciilifeform: << in other olds. dataflow crackpotteries.
asciilifeform: josephsmithing of dataflow, lol
phf: asciilifeform: i'm just trying to establish the dataflow here, for my own curiosity
phf: welcome to dataflow country
asciilifeform: (incidentally plugboards survived in analogue computers, which were manufactured ALMOST UNTIL '80s because ~actually easier to dataflow program on~)
gabriel_laddel: ;; later tell adlai I was referring to the dataflow paradigm.
adlai: << somebody shares alf's dataflow dreams
ben_vulpes: asciilifeform: hey boss, i keep meaning to ask you about the details of dataflow programming (vis a vis turning ascii files into computation), but i'm having trouble booking the like 3 hours of sit-down time it'd take to respond in a timely fashion
trinque: put me back in the stasis pod til you finish dataflowputer
asciilifeform: in 'dataflow' concept, your computer is actually a barrel full of very small computers. most of them perhaps idling at any given time.
asciilifeform: << must dig out this thread, because this is when i point out that the generalization of this principle - 'if it really has to happen all the time, it ought to have dedicated silicon' - is called 'dataflow computation'
asciilifeform: << you just described the dataflow box, aha
gabriel_laddel: stas: dataflow never caught on because you absolutely need dedicated hardware. if you want real efficiency, you can’t even use standard RAM. so it gets dismissed as nuttery, on the rare occasions it comes up.
asciilifeform: trinque: code on dataflow processor controls switching matrix.
gabriel_laddel: so, for instance, think of the terminal you are now typing into. on a true dataflow box, it would simply be the end of a circuit (screen memory) directly linked to a circuit with the keyboard matrix decoder on the other end of it. you thus also lose the concept of ‘interrupts’ or ‘processes’. a dataflow box doesn’t need a scheduler, or an interrupt
asciilifeform: but i seem to be the only one who (publicly) realized that it is the very thing for building dataflow comp.
asciilifeform: adlai: if you're curious, i quit the emulator when i realized that even text output, absolutely trivial on a dataflow machine, cannot be rendered in real time on a von neumann box (simply attach a dependency graph to each rectangle of video buffer ram)
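The parenthetical above — a dependency graph attached to each rectangle of video buffer RAM — can be sketched as dirty-region propagation. All names here (`bind`, `signal`, the status-bar/editor example) are invented for illustration; the point is only that when an input changes, exactly the rectangles depending on it re-render, with no scheduler deciding anything.

```python
# Hypothetical sketch: a dependency graph over screen rectangles, so a
# changed input re-renders only the rectangles that depend on it.

from collections import defaultdict

deps = defaultdict(list)   # input name -> rectangles watching it
framebuffer = {}           # rectangle id -> currently rendered content

def bind(rect, inputs, render):
    # declare that `rect` is a function of the named inputs
    for name in inputs:
        deps[name].append((rect, render))

def signal(name, state):
    # an input changed: re-render exactly the dependent rectangles
    redrawn = []
    for rect, render in deps[name]:
        framebuffer[rect] = render(state)
        redrawn.append(rect)
    return redrawn

state = {"clock": "12:00", "cursor": (0, 0)}
bind("statusbar", ["clock"], lambda s: f"[{s['clock']}]")
bind("editor", ["cursor"], lambda s: f"cursor at {s['cursor']}")

state["clock"] = "12:01"
print(signal("clock", state))   # only the status bar is redrawn
```

On a von neumann box this graph walk competes with everything else for the one CPU; the quoted point is that on a dataflow machine each rectangle would simply *be* the tail of its own circuit.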
asciilifeform: i'll add that neither 'mark and sweep' (stop-the-world in its basic form) nor 'reference counting' are necessary on a dataflow machine
trinque: -or- when the hell do I get a dataflowputer
BingoBoingo: Problem is not lack of basic examples; it is that a decade- or two-long wait for a dataflow cadr CPU... seems ok...
decimation: asciilifeform: what language would you use to model 'dataflow' processing?
BingoBoingo: <asciilifeform> BingoBoingo: i rejected three entirely different designs (different as in the diff b/w dataflow and von neumann) that i did not even bother describing. perhaps archaeologists (or the vultures) can get something from my notes. << Now you are touching my view of qntra realized. Not as "news" but as cabinet of microfiche
asciilifeform: BingoBoingo: i rejected three entirely different designs (different as in the diff b/w dataflow and von neumann) that i did not even bother describing. perhaps archaeologists (or the vultures) can get something from my notes.
asciilifeform: mircea_popescu: the pipe - is everything. (hence my crackpot pushing of dataflow as replacement for von neumann computation, etc)
asciilifeform: likewise, for dataflow cpu fabric (the only truly reasonable way to build a computer) - existing logic synthesis methods are unsuitable.
asciilifeform: moriarty: in point of fact, given correct hardware design ('dataflow' machine) neither compilers nor 'parallelism' as conventionally understood - are necessary concepts at all.
asciilifeform: decimation: everything 'wants' to be dataflow arch.
decimation: asciilifeform it seems to me that a router is exactly the kind of device that 'wants' to be a dataflow architecture.
mircea_popescu: "Go with a straight-dataflow paradigm, where all operations are part of a dependency graph (and if your chip is large enough, exist at all times as physical objects which wait for their inputs to become available, and signal their successors within picoseconds of their output becoming ready.)" << tbh, this is not only grand in theory
decimation: ascii I imagine your dataflow machine as a collection of various calculating state machines, called into existence for various purposes when needed
decimation: asciilifeform: yeah I forgot about ECL. The point is, discrete logic need not be slow. However, at high speeds propagation delay is going to be a serious issue, which implies distributed clockless dataflow design.
decimation: re: dataflow: I've been browsing the academic papers on "dataflow" computers. Most of them are crazy "C machine" and "Dataflow sidecar" hybrids. Makes as much sense as putting a massive aircraft engine in a car:
asciilifeform: these folks are crackpots, quite possibly inspired by my ravings re: dataflow archs. so i am to blame.
benkay: first i'd ever encountered the notion of dataflow stuff
benkay: asciilifeform: i dreamt about a dataflow soup (using biocomputing elements) last night and then read your Alert Reader post this morning.