Special Issue on Stream Processing, Call for Papers

Another call for papers: the Journal of Web Semantics invites submissions to a special issue on Stream Processing to be edited by Monika Solanki and Jean-Paul Calbimonte. Submissions are due by 1st July 2015.

(I am on the Programme Committee.)

Important Dates and Submission Guidelines

From the call:

Streaming System Benchmarks

Streaming systems are complex: beyond correct functionality (which may differ between implementations and vendors), many non-functional aspects can be benchmarked, such as memory consumption, latency, and throughput. Several benchmarks exist for RDF Stream Processing, shown as follows. From data stream management there are older benchmarks which are not specific to RDF data but might be adapted. Some are listed below.
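The shape of such a non-functional benchmark can be sketched in a few lines. This is a minimal, hypothetical harness (the `process` operator and the metric names are my own stand-ins, not part of any existing benchmark suite): it times a toy stream operator per event and overall, yielding latency and throughput figures.

```python
import time

def process(event):
    # Stand-in for a real stream operator (e.g. a filter or a join);
    # a real benchmark would run actual continuous queries here.
    return event * 2

def benchmark(events):
    """Measure per-event latency and overall throughput.

    A minimal sketch: real streaming benchmarks also vary input
    rates, query complexity, and memory pressure.
    """
    latencies = []
    start = time.perf_counter()
    for e in events:
        t0 = time.perf_counter()
        process(e)
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start
    return {
        "throughput_eps": len(events) / elapsed,  # events per second
        "max_latency_s": max(latencies),
    }

stats = benchmark(range(10_000))
```

Memory consumption, the third aspect mentioned above, would need a separate instrument (e.g. sampling the process's resident set size during the run).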

An RDF Model for Events

Why do we need an event model? Many of the RDF streaming systems discussed have little or no model for the real-time data they ingest. These systems make the lowest common assumption about the structure of the data, i.e. that the data consist of a stream of RDF triples. Thus, each piece of real-time data (event) is a single triple. One triple, however, cannot hold much information. For example: flexibility in timestamping (one vs. two timestamps or application time vs.
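The contrast can be illustrated with a small sketch. All identifiers below (`:event42`, `:applicationTime`, `:systemTime`, etc.) are hypothetical; the point is only that a one-triple event has nowhere to put metadata, whereas an event modelled as a group of triples can carry both an application timestamp and a system timestamp.

```python
from collections import namedtuple

# A single RDF triple: (subject, predicate, object).
Triple = namedtuple("Triple", "s p o")

# Lowest-common-denominator model: one event == one triple.
# There is no room for timestamps or other event metadata.
bare_event = Triple(":sensor1", ":hasReading", "21.5")

# A richer (hypothetical) model: the event is a set of triples
# grouped under an event identifier, leaving room for both an
# application timestamp (when the reading occurred) and a system
# timestamp (when the stream engine received it).
rich_event = [
    Triple(":event42", "rdf:type", ":ObservationEvent"),
    Triple(":event42", ":hasReading", "21.5"),
    Triple(":event42", ":applicationTime", "2015-05-01T12:00:00Z"),
    Triple(":event42", ":systemTime", "2015-05-01T12:00:02Z"),
]
```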

Immutability and Event Derivation in RDF

"In many event processing systems [...] events are immutable"1. This stems from the definition of what an event is: "An event is an occurrence within a particular system or domain; it is something that has happened, or is contemplated as having happened [...]"2. So events cannot be made to unhappen.