As I mentioned last week, yesterday was the webinar on Reactive Streams. Here's a quick recap.
Overall, the webinar was useful food for thought, and is highly recommended to all systems architects or programmers working on complex dataflows. The project is seeking to define a standard paradigm (with implementations for various platforms and languages) for the common problem of managing data that is flowing across "asynchronous boundaries" -- streams of data that are going from node to node, process to process, thread to thread or just actor to actor. The core problem being solved here seems minor but is a fairly universal bugaboo: what do you do when the downstream layers start to overflow? To cope with this consistently and well, the authors advocate a common publish/subscribe mechanism that builds the notion of back-pressure (the downstream parts keeping the upstream parts apprised of how much they can handle) deep into the protocol.
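To make the back-pressure idea concrete, here's a minimal sketch using `java.util.concurrent.Flow`, the JDK's built-in copy of the Reactive Streams interfaces. The class and method names (`BackPressureDemo`, `OneAtATimeSubscriber`, `run`) are my own for illustration; the key point is that the subscriber's `request(1)` calls are the back-pressure channel flowing upstream -- the publisher may never send more than what's been requested.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class BackPressureDemo {

    // A subscriber that signals demand one element at a time; the request(1)
    // calls are the back-pressure signal telling upstream how it's going.
    static class OneAtATimeSubscriber implements Flow.Subscriber<Integer> {
        private Flow.Subscription subscription;
        final CountDownLatch done = new CountDownLatch(1);
        volatile int received = 0;

        @Override public void onSubscribe(Flow.Subscription s) {
            subscription = s;
            s.request(1);            // ask for exactly one element
        }
        @Override public void onNext(Integer item) {
            received++;
            subscription.request(1); // grant permission for one more
        }
        @Override public void onError(Throwable t) { done.countDown(); }
        @Override public void onComplete()          { done.countDown(); }
    }

    // Publish n integers and return how many the subscriber actually saw.
    static int run(int n) throws InterruptedException {
        OneAtATimeSubscriber sub = new OneAtATimeSubscriber();
        try (SubmissionPublisher<Integer> pub = new SubmissionPublisher<>()) {
            pub.subscribe(sub);
            for (int i = 0; i < n; i++) {
                pub.submit(i); // submit() blocks when the subscriber's demand runs out
            }
        } // closing the publisher sends onComplete
        sub.done.await();
        return sub.received;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("received " + run(5) + " of 5");
    }
}
```

Because demand is granted one element at a time, the publisher can never overflow the subscriber -- the overflow question from the paragraph above is answered by simply not sending until asked.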
While the webinar is run by an Akka guy, it is mostly platform-neutral, only getting into Akka details in the last ten minutes. Indeed, the first half simply establishes the problem statement -- describing precisely what is meant by "a stream of data", and the common problems it introduces. It then describes the universal API, and only at the end gets into the high-level Akka "Flow" abstraction that they are developing on top of that API.
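That universal API is deliberately tiny: four interfaces, which I've reproduced below with the names used in `org.reactivestreams`. The `just` helper and `demo` method are my own toy additions (a synchronous single-element publisher, not part of the spec) just to show the handshake order: subscribe, then onSubscribe, then request, and only then onNext.

```java
public class ReactiveStreamsApi {

    // The entire Reactive Streams contract is these four interfaces.
    interface Publisher<T> { void subscribe(Subscriber<? super T> s); }

    interface Subscriber<T> {
        void onSubscribe(Subscription s); // handshake: here is your demand channel
        void onNext(T item);              // one element; never more than requested
        void onError(Throwable t);        // terminal failure
        void onComplete();                // terminal success
    }

    interface Subscription {
        void request(long n); // back-pressure: allow the publisher n more elements
        void cancel();
    }

    interface Processor<T, R> extends Subscriber<T>, Publisher<R> {}

    // Toy publisher: emits one fixed value, but only after demand arrives.
    static <T> Publisher<T> just(T value) {
        return sub -> sub.onSubscribe(new Subscription() {
            boolean sent = false;
            @Override public void request(long n) {
                if (!sent && n > 0) {
                    sent = true;
                    sub.onNext(value);
                    sub.onComplete();
                }
            }
            @Override public void cancel() { sent = true; }
        });
    }

    // Run the handshake and record the callbacks in order.
    static String demo() {
        StringBuilder log = new StringBuilder();
        just("hello").subscribe(new Subscriber<String>() {
            @Override public void onSubscribe(Subscription s) { s.request(1); }
            @Override public void onNext(String item) { log.append("got ").append(item); }
            @Override public void onError(Throwable t) { log.append("error"); }
            @Override public void onComplete() { log.append(", done"); }
        });
        return log.toString();
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

Akka's "Flow" layer sits on top of exactly this contract, which is what lets implementations from different vendors interoperate.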
Like I said, recommended for anyone playing at the high level. The recording of the webinar can be found on YouTube, and the slides are on Slideshare. Ignore the first minute or so, which is just the host's introduction; the actual content begins around 1:30.