Data Flow Systems are quite powerful and are often used in hard real-time problems. The basic conceptual elements of data flow are the process, the file, and the data flow itself. Sources and sinks are privileged processes at the external environmental boundaries of the system. The bible to me on this approach is TomDeMarco's book on SASS (Structured Analysis and System Specification), published around 1978. Great book!
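A minimal sketch of those elements in Python, with generators standing in for the data flows; the names source, double, and sink are illustrative assumptions, not DeMarco's notation:

 # Source: a privileged process at the system boundary that only produces data.
 def source():
     for n in range(5):
         yield n

 # Process: consumes one data flow and yields another.
 def double(flow):
     for item in flow:
         yield item * 2

 # Sink: a privileged process at the boundary that only consumes data.
 def sink(flow):
     for item in flow:
         print(item)

 sink(double(source()))  # prints 0, 2, 4, 6, 8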
I have always liked an architecture I call IPO-Net. An IPO-Net is just a data flow architecture; I used to program this way before there were disk files, when everything had to go from and to tape, so an IPO was an input tape, a process, and an output tape. This is a pipes-and-filters kind of thing in which processing can be interrupted and archived at any stage. To me it is a neat way to think about computational architecture (maybe I'm just showing my age) because it is applicable across such a broad range of hardware architectures and can be made so forgiving.
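To make the forgiveness concrete, here is a sketch of one IPO stage in Python, with disk files standing in for the tapes; because every stage's output is a complete file, a run can be interrupted after any stage and resumed from the archived intermediates. The file names and transforms are made up for illustration:

 import os

 # One IPO stage: Input file -> Process -> Output file.
 def ipo_stage(infile, outfile, process):
     if os.path.exists(outfile):  # output already archived; skip on resume
         return
     with open(infile) as inp, open(outfile, "w") as out:
         for line in inp:
             out.write(process(line))

 # A two-stage pipeline; each intermediate file is a restart point.
 with open("raw.txt", "w") as f:
     f.write("alpha\nbeta\n")
 ipo_stage("raw.txt", "upper.txt", str.upper)
 ipo_stage("upper.txt", "final.txt", lambda line: "> " + line)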
I used it to create a toolset for The Flyover System, which analyzed airborne radio-leakage data collected over cable television systems to verify compliance with FCC leakage regulations. It was a simple and forgiving system that was easy to use and easy to extend. --RaySchneider
From my point of view, DataFlowSystems are the same as PipesAndFilters.
I think that DataFlowSystems may be more general than PipesAndFilters. (Consider whether the data flow is "pushed", "pulled", "clocked", or some hybrid.) --DafyddRees
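To make that distinction concrete, here are pull-style and push-style versions of the same one-step flow in Python (illustrative sketches only); a clocked flow would instead have a scheduler advance every process once per tick:

 # Pull: the downstream end demands each item (generator protocol).
 def pull_source():
     yield from [1, 2, 3]

 for x in pull_source():  # the consumer drives the flow
     print(x)

 # Push: the upstream end drives each item into a callback.
 def push_source(consume):
     for x in [1, 2, 3]:
         consume(x)  # the producer drives the flow

 push_source(print)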
See also FlowBasedProgramming. --PaulMorrison