# Redis Streams

The Redis Streams event source listens for messages on Redis streams and helps sensors trigger workloads.

Messages are read from the stream through a Redis consumer group. The main reason for using a consumer group is to resume from the last read upon pod restarts. A common consumer group (defaulting to `argo-events-cg`) is created on all specified streams, if it does not already exist. When using a consumer group, each read through the group is effectively a write operation, because Redis must update the last-delivered message ID and the pending entries list (PEL) of that specific consumer in the group. Consequently, the event source can only work with the master Redis instance, not replicas (<https://redis.io/topics/streams-intro>).

The Redis Streams event source expects all the specified streams to be present on the Redis server; it only starts pulling messages once all of them exist. On the initial setup, the consumer group is created on each specified stream so that reading starts from the latest message (not necessarily the beginning of the stream). On subsequent setups (when the consumer group already exists on the streams) or after pod restarts, messages are pulled starting from the last unacknowledged message in the stream.

The consumer group is never deleted automatically. If you want a completely fresh setup, you must delete the consumer group from the streams.
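As a sketch of that fresh-setup step, the consumer group can be inspected and removed with `redis-cli`; the stream name `FOO` is used here for illustration, and `argo-events-cg` is the default group name mentioned above:

```shell
# List the consumer groups attached to the stream FOO
XINFO GROUPS FOO

# Delete the default consumer group; the event source recreates it on
# its next (re)start and resumes reading from the latest message onward
XGROUP DESTROY FOO argo-events-cg
```

Repeat the `XGROUP DESTROY` for every stream the event source is configured with.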
## Event Structure

The structure of an event dispatched by the event-source over the eventbus looks like the following:

```json
{
  "context": {
    "id": "unique_event_id",
    "source": "name_of_the_event_source",
    "specversion": "cloud_events_version",
    "type": "type_of_event_source",
    "datacontenttype": "type_of_data",
    "subject": "name_of_the_configuration_within_event_source",
    "time": "event_time"
  },
  "data": {
    "stream": "Name of the Redis stream",
    "message_id": "Message Id",
    "values": "message body"
  }
}
```

Example:

```json
{
  "context": {
    "id": "64313638396337352d623565612d343639302d383262362d306630333562333437363637",
    "source": "redis-stream",
    "specversion": "1.0",
    "type": "redisStream",
    "datacontenttype": "application/json",
    "subject": "example",
    "time": "2022-03-17T04:47:42Z"
  },
  "data": {
    "stream": "FOO",
    "message_id": "1647495121754-0",
    "values": {"key-1": "val-1", "key-2": "val-2"}
  }
}
```

## Specification

The Redis Streams event-source specification is available [here](https://github.com/argoproj/argo-events/blob/master/api/event-source.md#argoproj.io/v1alpha1.RedisStreamEventSource).

## Setup

1. Follow the [documentation](https://kubernetes.io/docs/tutorials/configuration/configure-redis-using-configmap/#real-world-example-configuring-redis-using-a-configmap) to set up a Redis database.

1. Create the event source by running the following command.

    ```sh
    kubectl apply -n argo-events -f https://raw.githubusercontent.com/argoproj/argo-events/stable/examples/event-sources/redis-streams.yaml
    ```

1. Create the sensor by running the following command.

    ```sh
    kubectl apply -n argo-events -f https://raw.githubusercontent.com/argoproj/argo-events/stable/examples/sensors/redis-streams.yaml
    ```

1. Log into the Redis pod using `kubectl`.

    ```sh
    kubectl -n argo-events exec -it <redis-pod-name> -c <redis-container-name> -- /bin/bash
    ```

1. Run `redis-cli` and publish a message on the stream `FOO`.

    ```sh
    XADD FOO * message hello
    ```

1. Once a message is published, an Argo workflow will be triggered. Run `argo list` to find the workflow.

## Troubleshoot

The Redis Streams event source expects all the specified streams to be present on the Redis server. It only starts pulling messages from the streams once all of them exist.

Please read the [FAQ](https://argoproj.github.io/argo-events/FAQ/).
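If the event source appears stuck, one way to check the precondition above is to verify the streams and consumer group state from `redis-cli` inside the Redis pod; the stream name `FOO` below is illustrative, and `argo-events-cg` is the default consumer group name:

```shell
# Returns 1 if the stream exists, 0 if the event source is still waiting for it
EXISTS FOO

# Show each consumer group's last-delivered ID and number of pending entries
XINFO GROUPS FOO

# Summarize unacknowledged (pending) messages for the default group
XPENDING FOO argo-events-cg
```

A missing stream (an `EXISTS` result of 0) explains why no messages are being pulled, while a growing pending count in `XPENDING` suggests messages are being delivered but not acknowledged.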