US 12,105,740 B2
Low-latency streaming analytics
Alexander William Cruise, Vancouver (CA); Byron Jason Shelden, Coquitlam (CA); and Claire Alexandria Tanner Semple, Vancouver (CA)
Assigned to Splunk Inc., San Francisco, CA (US)
Filed by Splunk Inc., San Francisco, CA (US)
Filed on Jun. 28, 2023, as Appl. No. 18/343,420.
Application 18/343,420 is a continuation of application No. 17/811,849, filed on Jul. 11, 2022, granted, now 11,727,039.
Application 17/811,849 is a continuation of application No. 17/114,283, filed on Dec. 7, 2020, granted, now 11,386,127, issued on Jul. 12, 2022.
Application 17/114,283 is a continuation of application No. 15/715,077, filed on Sep. 25, 2017, granted, now 10,860,618, issued on Dec. 8, 2020.
Prior Publication US 2023/0342380 A1, Oct. 26, 2023
This patent is subject to a terminal disclaimer.
Int. Cl. G06F 16/2455 (2019.01); G06F 9/54 (2006.01); G06F 11/30 (2006.01); G06F 16/28 (2019.01); G06Q 10/10 (2023.01)
CPC G06F 16/285 (2019.01) [G06F 9/542 (2013.01); G06F 11/30 (2013.01); G06F 16/24568 (2019.01); G06F 16/288 (2019.01); G06Q 10/10 (2013.01)] 20 Claims
OG exemplary drawing
 
1. A method comprising:
iteratively processing a message through a data stream processing system, wherein the data stream processing system implements at least a first and second stage of processing at least partly in parallel, wherein implementing the first stage of processing includes:
providing a modified message from the data stream processing system to the data stream processing system for continued processing at the second stage of processing, wherein the modified message is generated by modifying content of the message, and
wherein implementing the second stage of processing includes:
providing an output based at least in part on evaluating the modified message according to a set of rules maintained by the data stream processing system.
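The claim describes a two-stage streaming pipeline in which the stages run at least partly in parallel: the first stage modifies a message's content and hands the modified message back to the system, and the second stage evaluates that modified message against a maintained rule set to produce an output. The following minimal sketch illustrates that flow under stated assumptions; the names (stage_one, stage_two, RULES, the queues) and the thread-and-queue structure are hypothetical illustrations, not the patented implementation.

# Illustrative sketch only: a two-stage pipeline loosely modeled on the claim
# language. All identifiers are hypothetical and not drawn from the patent.
import queue
import threading

inbound = queue.Queue()   # messages entering the data stream processing system
staged = queue.Queue()    # modified messages returned for the second stage

# Hypothetical rule set maintained by the system
RULES = [lambda m: "error" in m.get("text", "")]

def stage_one():
    # First stage: modify the message content, then provide the modified
    # message back to the system for continued processing at stage two.
    while True:
        msg = inbound.get()
        if msg is None:                      # shutdown sentinel
            staged.put(None)
            break
        modified = dict(msg, text=msg["text"].strip().lower())
        staged.put(modified)

def stage_two():
    # Second stage: evaluate the modified message against the rule set and
    # provide an output based on that evaluation.
    while True:
        msg = staged.get()
        if msg is None:
            break
        if any(rule(msg) for rule in RULES):
            print("alert:", msg)

threads = [threading.Thread(target=stage_one), threading.Thread(target=stage_two)]
for t in threads:
    t.start()                                # stages run at least partly in parallel

inbound.put({"text": "  Disk ERROR on host-7 "})
inbound.put({"text": "routine heartbeat"})
inbound.put(None)                            # stop both stages
for t in threads:
    t.join()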