There turns out to be a lot of new stuff now using FBP-like concepts under the name Pipelines, and I’ve done a lot more reading. Some of my notes are attached. Some thoughts:
CMSP (John Hartmann's CMS Pipelines - see CMSTSOPipelines) has moderately powerful pipelines/networks, and passes amorphous records along them. The records are typically fixed-length text or binary, corresponding roughly to COBOL or PL/I record definitions. I think we're beyond that now.
Windows PowerShell has simple pipelines and passes objects, which have properties. I feel this has some powerful advantages, but considerably weakens the pipeline/network concept: there are complex rules determining which kinds of object each kind of action can operate on. I think objects are a move too far.
XML pipelines are simple and straight, but rely on very rich processing of very complicated data structures. I really don’t want to be doing this.
I’m quite strongly inclined to focus on (a) a rich pipeline/network structure and (b) passing SQL-like record sets along them. By that I mean that a record consists of fields, each field has a scalar type (usually string, decimal or date) and each field has a name. Quite likely arrays (vectors) as well as scalars, but not objects and not complex types.
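To make (b) concrete, here is a minimal sketch of such a record in Java: named fields, each with a scalar type. Every class and method name here is invented for illustration; this is not an existing framework's API.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of an SQL-like record: a field has a name, a scalar type, and a
// value. No objects, no complex types - just the three scalar kinds named
// in the notes (string, decimal, date).
public class Record {
    public enum FieldType { STRING, DECIMAL, DATE }

    private final Map<String, Object> values = new LinkedHashMap<>();
    private final Map<String, FieldType> types = new LinkedHashMap<>();

    public Record set(String name, FieldType type, Object value) {
        types.put(name, type);
        values.put(name, value);
        return this;
    }

    public Object get(String name)        { return values.get(name); }
    public FieldType typeOf(String name)  { return types.get(name); }

    public static void main(String[] args) {
        Record r = new Record()
            .set("customer", FieldType.STRING, "ACME")
            .set("amount", FieldType.DECIMAL, new java.math.BigDecimal("19.95"));
        System.out.println(r.get("customer") + " owes " + r.get("amount"));
    }
}
```

Because parameters would take the same form, a component could build one of these and hand it to another component as its parameter packet.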
In that case it should be possible to have parameters in the same form as packets, so one component can easily create them for another. However, that still leaves two important issues.
a) The language used to define the network needs to be much smarter. I like your earlier macros and CMSP much more than the current compiled Java/C# code with a mixture of bits of string and bits of objects. That means some kind of a network-specific shell or compiler. Possibly the graphical tool is enough, but then the tool has to be closely integrated.
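As a purely hypothetical illustration of (a), a network-specific notation might look something like the following, loosely in the spirit of CMSP's stage-separator syntax. Every stage name and parameter here is invented:

```
read-records FILE=orders.dat
  | select WHERE="status = 'OPEN'"
  | format TEMPLATE=report.tpl
  | write-report FILE=open-orders.txt
```

The point is that the whole network is one declarative text, rather than bits of string and bits of objects scattered through compiled code.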
b) I think the components need to “see” parameter packets during an initial “syntax” phase. (1) To reject bad parameters. (2) For non-loopers, to parse the parameter packet once only, and then be reactivated multiple times for data packets.
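The two-phase idea in (b) can be sketched as a component interface with a syntax phase called once and a process phase called per data packet. All names below are invented for illustration:

```java
// Sketch of a non-looper component lifecycle: syntax() sees the parameter
// packet once (and may reject it); process() is then reactivated for each
// data packet without re-parsing parameters.
public class TwoPhaseDemo {
    interface Component {
        void syntax(String params);     // phase 1: validate and parse once
        String process(String packet);  // phase 2: called per data packet
    }

    // Example component: prefixes each packet with a tag parsed from params.
    static class Tagger implements Component {
        private String tag;
        public void syntax(String params) {
            if (params == null || params.isEmpty())
                throw new IllegalArgumentException("TAG parameter required");
            tag = params;
        }
        public String process(String packet) { return tag + ": " + packet; }
    }

    public static void main(String[] args) {
        Component c = new Tagger();
        c.syntax("ORDER");                  // rejected here if malformed
        System.out.println(c.process("a"));
        System.out.println(c.process("b"));
    }
}
```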
In software engineering, a pipeline consists of a chain of processing elements (processes, threads, coroutines, etc.), arranged so that the output of each element is the input of the next. Usually some amount of buffering is provided between consecutive elements. The information that flows in these pipelines is often a stream of records, bytes or bits.
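The definition above can be shown in a few lines. Here each processing element is just a function, and the buffering between elements is elided; a real pipeline would run the stages concurrently:

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

// Toy pipeline: the output of each element is the input of the next.
public class PipelineDemo {
    // Two trivial stages composed in sequence.
    static final Function<String, String> pipeline =
        ((Function<String, String>) String::trim).andThen(String::toUpperCase);

    static List<String> run(List<String> input) {
        return input.stream().map(pipeline).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(run(List.of(" alpha ", " beta "))); // [ALPHA, BETA]
    }
}
```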
An XML Pipeline is formed when XML (Extensible Markup Language) processes, sometimes called XML transformations, are connected together.
Pipeline overview for business app processing.
Matchers are used to match user requests such as URLs or cookies against wildcard or regular expression patterns. Each user request is sent through the pipeline until a match is made. It is from here that a particular request is processed.
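A matcher table can be sketched with the JDK's regex support: the first pattern that matches the request URL selects a handler. The table entries below are invented for illustration:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Pattern;

// Sketch of a matcher: each user request is tried against the patterns in
// order until one matches, and that entry names the pipeline part to use.
public class MatcherDemo {
    public static String route(Map<Pattern, String> table, String url) {
        for (Map.Entry<Pattern, String> e : table.entrySet())
            if (e.getKey().matcher(url).matches())
                return e.getValue();
        return "no-match";
    }

    public static void main(String[] args) {
        Map<Pattern, String> table = new LinkedHashMap<>();
        table.put(Pattern.compile("/images/.*"), "reader");
        table.put(Pattern.compile("/reports/.*\\.pdf"), "pdf-pipeline");
        System.out.println(route(table, "/reports/q1.pdf")); // pdf-pipeline
    }
}
```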
Generators create a stream of data for further processing. This stream can be generated from an existing XML document, or created from scratch by generators that represent something on the server as XML, such as a directory structure or image data.
Transformers take a stream of data and change it in some way. The most common transformations are performed with XSLT, to change one XML format into another, but there are also transformers that take other forms of data (SQL, for example).
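The XSLT case can be demonstrated with the JDK's built-in javax.xml.transform API. The one-template stylesheet below, which renames <name> elements to <customer>, is invented for illustration:

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerException;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.StringReader;
import java.io.StringWriter;

// Minimal transformer stage: apply an XSLT stylesheet to an XML stream.
public class XsltDemo {
    static final String XSL =
        "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>" +
        "<xsl:output method='xml' omit-xml-declaration='yes'/>" +
        "<xsl:template match='name'><customer><xsl:value-of select='.'/></customer></xsl:template>" +
        "</xsl:stylesheet>";

    public static String transform(String xml) {
        try {
            Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(XSL)));
            StringWriter out = new StringWriter();
            t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
            return out.toString();
        } catch (TransformerException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(transform("<name>ACME</name>"));
    }
}
```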
A serializer takes a data stream, makes any required changes, and sends it to the client. There are serializers for many different output formats, including HTML, XHTML, PDF, RTF, SVG, WML and plain text.
Selectors offer the same capabilities as a switch statement. They are able to select particular elements of a request and choose the correct pipeline part to use.
Views are mainly used for testing. A view is an exit point in a pipeline: it writes out the XML stream produced up to that point, so you can see whether the application is working correctly.
Readers publish content without parsing it (no XML processing); used for raw resources such as images.
Actions are Java classes that execute some business logic or manage new content production.
PowerShell: a shell and admin tool that uses simple pipelines to pass objects between cmdlets.