The data streaming system is being built as a requirement for HRPD-X and possibly SANDALS-II, separate from (and complementary to) the `MNeuData` project. HRPD-X, SANDALS-II and other future instruments will not have a traditional DAE2/DAE3, as these are now obsolete. The system is architecturally similar to the one the ESS uses to take data (neutron events, sample environment, and anything else that can be put into a streaming platform) and write it to file. ISIS previously contributed to the ESS's streaming pipeline as part of an in-kind project. The system will replace the ICP at ISIS.
In general this works by producing both neutron events and histograms, sample environment data, and other diagnostic data into a [Kafka](https://kafka.apache.org/) cluster, and having clients (consumers, in Kafka terminology) that either view the data live and act on it, or write it to a NeXus file. Additional information can be found [here](http://accelconf.web.cern.ch/AccelConf/icalepcs2017/papers/tupha029.pdf) and [here](https://iopscience.iop.org/article/10.1088/1742-6596/1021/1/012013).
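As a concrete illustration of the consumer side, the sketch below tails a topic and prints a one-line summary per message. This is only a sketch: it assumes the third-party `kafka-python` package, and the topic name `INSTRUMENT_events` and broker address are placeholders, not real ISIS values.

```python
def describe(topic: str, payload: bytes) -> str:
    """Summarise one received record (pure helper, testable offline)."""
    return f"{topic}: {len(payload)} byte message"


def tail_events(topic: str = "INSTRUMENT_events",
                broker: str = "localhost:9092") -> None:
    """Print a summary of every message on `topic` as it arrives.

    Placeholder topic/broker; requires a reachable Kafka-compatible cluster.
    """
    from kafka import KafkaConsumer  # pip install kafka-python

    consumer = KafkaConsumer(
        topic,
        bootstrap_servers=broker,
        auto_offset_reset="latest",  # only new messages, not history
    )
    for record in consumer:  # blocks, yielding ConsumerRecord objects
        print(describe(record.topic, record.value))
```

A file-writing consumer follows the same shape, but deserialises each payload and appends it to a NeXus file instead of printing.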
All data is serialised into [Flatbuffers](https://flatbuffers.dev/) blobs using [these schemas](https://github.com/ess-dmsc/streaming-data-types). We have a tool called [saluki](https://github.com/ISISComputingGroup/saluki) which can deserialise these and make them human-readable after they have been put into Kafka.
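Each streaming-data-types message carries its four-character schema name as the flatbuffers file identifier, which sits in bytes 4–8 of the buffer (the first four bytes are the root-table offset). That is how a consumer decides which schema to deserialise with before touching the payload. A minimal sketch; the `fake_message` bytes are synthetic, not a real message:

```python
def schema_id(buffer: bytes) -> str:
    """Return the four-character flatbuffers file identifier.

    In a flatbuffer, bytes 4..8 hold the file identifier; the
    streaming-data-types schemas use it as the schema name (e.g. b"ev44").
    """
    if len(buffer) < 8:
        raise ValueError("buffer too short to contain a file identifier")
    return buffer[4:8].decode("ascii")


# A real message would come from Kafka; here we fake the first 8 bytes.
fake_message = b"\x0c\x00\x00\x00ev44"
print(schema_id(fake_message))  # → ev44
```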
The overall architecture is as follows:

This comprises a few different consumers and producers:
- [`azawakh`](https://github.com/ISISComputingGroup/azawakh) - This is a soft IOC which provides `areaDetector` views, spectra plots and so on by consuming events from the cluster and displaying them over EPICS CA/PVA.
- [`borzoi`](https://github.com/ISISComputingGroup/borzoi) - This is also a soft IOC, and is more or less a drop-in replacement for the ISISDAE. It provides an interface that several clients (e.g. [genie](https://github.com/ISISComputingGroup/genie), [ibex_bluesky_core](https://github.com/ISISComputingGroup/ibex_bluesky_core), [ibex_gui](https://github.com/ISISComputingGroup/ibex_gui)) talk to in order to start/stop runs and configure the streaming electronics. `borzoi` sends UDP packets to the streaming electronics to configure them.
- [`BSTOKAFKA`](https://github.com/ISISComputingGroup/BSKAFKA) - This configures the `forwarder` with the blocks in an instrument's current configuration, as well as other PVs which will either be written to file or archived, e.g. for the log plotter.
- `forwarder` - See [Forwarding Sample Environment](datastreaming/Datastreaming---Sample-Environment)
- `filewriter` - See [File writing](datastreaming/Datastreaming---File-writing)
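The configuration messages `borzoi` sends to the streaming electronics are plain UDP datagrams. The actual packet layout is not documented here, so the payload, address and port below are placeholders; this is just a generic sketch of sending one datagram with the standard library:

```python
import socket


def send_config(host: str, port: int, payload: bytes) -> int:
    """Send one UDP datagram to the given address.

    Returns the number of bytes sent. The payload format used by the
    real streaming electronics is defined by borzoi and not shown here.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        return sock.sendto(payload, (host, port))


# Hypothetical usage - address, port and payload are placeholders:
# send_config("10.0.0.1", 50000, b"\x01CONFIG")
```

UDP is fire-and-forget: there is no delivery guarantee, so any acknowledgement has to come back from the electronics themselves.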
{#kafkacluster}
## The Kafka Cluster
There is a (non-production!) [Redpanda](https://www.redpanda.com/) Kafka cluster at `livedata.isis.cclrc.ac.uk:31092`.
A web interface is available [here](https://reduce.isis.cclrc.ac.uk/redpanda-console/overview).
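The same topic listing the web console shows can be fetched programmatically. A sketch, assuming the third-party `kafka-python` package and network access to the broker; the helper's instrument-name prefix convention and the `MERLIN` example are assumptions, not confirmed here:

```python
def instrument_topics(topics, instrument: str):
    """Filter a collection of topic names down to one instrument's topics,
    assuming topic names are prefixed with the instrument name."""
    return sorted(t for t in topics if t.startswith(instrument + "_"))


def list_topics(broker: str = "livedata.isis.cclrc.ac.uk:31092"):
    """Fetch all topic names on the cluster (requires network access)."""
    from kafka import KafkaConsumer  # pip install kafka-python

    return KafkaConsumer(bootstrap_servers=broker).topics()


# Hypothetical usage (needs a reachable broker):
# print(instrument_topics(list_topics(), "MERLIN"))
```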
:::{important}
This cluster is supported by the Automation team. See `\\isis\shares\ISIS_Experiment_Controls\On Call\autoreducti` for
support information.
:::
## How to/FAQs
See {ref}`datastreaminghowto`