With so many variables affecting the outcome of business events, the ability to gauge the likelihood of particular outcomes would be an extremely valuable asset for organizations. Complex event processing (CEP) is emerging as an effective way to provide those kinds of insights into events.
CEP is relatively new, and even the definition of the term is still evolving. At its core, CEP involves the use of software to predict events that are likely to result from specific sets of factors. The technology analyzes cause-and-effect relationships among various events as they occur, letting users proactively respond to different scenarios in the most effective ways possible. Beyond its predictive elements, CEP also lets organizations analyze information on the fly, making better sense of the data streaming in from a variety of applications.
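To make the idea concrete, the sketch below (in Python, with made-up event names rather than any vendor's API) shows the kind of windowed cause-and-effect rule a CEP engine evaluates continuously: when one type of event is followed by another within a short time window, a new, higher-level "complex event" is derived.

```python
# Minimal sketch, not any specific CEP product: a hand-rolled rule that
# correlates two event types within a time window and derives a "complex event",
# illustrating the cause-and-effect matching CEP engines automate.
from collections import deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)
recent_price_drops = deque()  # timestamps of recent "price_drop" events

def on_event(event):
    """Evaluate each incoming event against a simple correlation rule."""
    now = event["ts"]
    # Expire events that have fallen out of the correlation window.
    while recent_price_drops and now - recent_price_drops[0] > WINDOW:
        recent_price_drops.popleft()

    if event["type"] == "price_drop":
        recent_price_drops.append(now)
    elif event["type"] == "order_spike" and recent_price_drops:
        # Cause-and-effect pattern: an order spike shortly after a price drop.
        return {"type": "demand_surge", "ts": now, "causes": len(recent_price_drops)}
    return None

# Example: feed a small stream of base events through the rule.
stream = [
    {"type": "price_drop", "ts": datetime(2009, 6, 22, 9, 0)},
    {"type": "order_spike", "ts": datetime(2009, 6, 22, 9, 3)},
]
for e in stream:
    complex_event = on_event(e)
    if complex_event:
        print("complex event detected:", complex_event)
```

A commercial CEP engine expresses rules like this declaratively and evaluates many of them against high-volume streams; the point here is only the shape of the logic.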
CEP can play a role in several areas of IT and business processes, including risk management, customer relationship management (CRM), business process management (BPM), business activity monitoring (BAM) and stock trading.
“The main driver for CEP is the explosion in various forms of operational intelligence applications — often provided through online dashboards — that give near real-time visibility into the state of a company and its environment,” says Roy Schulte, vice president and distinguished analyst at Gartner Inc., Stamford, Conn.
Schulte says Gartner tracking shows that there are eight “pure-play” CEP engine vendors, seven vendors of CEP-enabled application platforms, and more than 70 other vendors that offer substantial CEP capabilities to complement other software functions.
CEP brings value to organizations in four key ways, Schulte says. One is improved quality of decisions. “Computers can extract the information value from dozens, hundreds or thousands of base events per second in real-world applications — as long as the events are simple,” he says. “By contrast, a person can assimilate only a few events per second, thus cannot consider nearly as many factors when making a decision.”
Another benefit is faster response. “CEP systems can respond faster than people,” Schulte says. “For example, CEP-based financial-trading systems use rules to make fully automated buy-and-sell decisions in 20 milliseconds without human involvement. A person cannot type a character on a keyboard in that short a time.”
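As an illustration only, the sketch below shows what such a fully automated rule might look like in miniature; the fields and thresholds are invented, and production trading systems are vastly more sophisticated.

```python
# Purely illustrative sketch: a rule that turns each market-data tick into an
# automated buy/sell/hold decision with no human in the loop.
def decide(tick, fair_value=100.0, edge=0.25):
    """Return an order the moment a quote strays far enough from a fair-value estimate."""
    if tick["ask"] < fair_value - edge:
        return {"side": "buy", "price": tick["ask"], "qty": 100}
    if tick["bid"] > fair_value + edge:
        return {"side": "sell", "price": tick["bid"], "qty": 100}
    return None  # hold

for tick in ({"bid": 99.9, "ask": 100.1}, {"bid": 100.4, "ask": 100.6}):
    order = decide(tick)
    if order:
        print("auto order:", order)
```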
Yet another potential gain is preventing data overload. CEP systems reduce the volume of unwanted, unnecessary data presented to people, Schulte says. “A CEP system may run for hours or days, turning millions of base events into thousands of complex events before detecting a complex event that must be brought to the attention of a person,” he says.
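A simplified sketch of that screening idea, with hypothetical event fields: large numbers of base events are aggregated continuously, and a person is notified only when a derived complex event crosses a threshold.

```python
# Minimal sketch with made-up event fields: base events are screened and
# aggregated continuously; only a rare derived event is escalated to a person,
# illustrating the volume reduction Schulte describes.
import random

def screen_events(base_events, window_size=1000, error_threshold=0.05):
    """Yield an alert only when the error rate in a batch of base events looks abnormal."""
    window = []
    for event in base_events:
        window.append(event)
        if len(window) == window_size:
            error_rate = sum(1 for e in window if e.get("status") == "error") / window_size
            if error_rate > error_threshold:
                # The only thing a person ever sees: one complex event per bad batch.
                yield {"type": "error_rate_spike", "rate": error_rate}
            window.clear()

# Under normal conditions (about 1% errors) nothing is escalated at all;
# a genuine error burst would surface as a single complex event.
events = ({"status": "error" if random.random() < 0.01 else "ok"} for _ in range(100_000))
for alert in screen_events(events):
    print("escalate to operator:", alert)
```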
Finally, CEP can lead to reduced costs. “CEP systems offload the drudgery of repetitive calculations and pattern detection comparisons from people to computers,” Schulte says. A CEP system can run continuously, evaluating incoming notifications throughout the day and performing at least the initial event screening and computation, he says. This reduces the amount of human labor needed to analyze data.
The technology offers “a set of building blocks to create an underlying business intelligence” about events, says Kevin McPartland, senior analyst at TABB Group, a Westborough, Mass., research and strategic advisory firm. “You could do this before, but building [the infrastructure] from scratch was complicated and very time consuming,” he says. “CEP providers are offering tool kits and libraries that allow you to build the logic based on multiple data streams very quickly.”
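The sketch below illustrates, in plain Python rather than any vendor's toolkit, one small piece of that plumbing: interleaving multiple time-ordered data streams so a single set of rules can evaluate them together.

```python
# Minimal sketch (no particular vendor's library): merging two event streams by
# timestamp so downstream rules can reason over both at once, the kind of
# plumbing CEP toolkits provide out of the box.
import heapq

def merged(*streams):
    """Interleave several time-ordered event streams into one, ordered by timestamp."""
    return heapq.merge(*streams, key=lambda e: e["ts"])

quotes = [{"ts": 1, "src": "market", "price": 101.2}, {"ts": 3, "src": "market", "price": 100.7}]
orders = [{"ts": 2, "src": "orders", "qty": 500}]

for event in merged(quotes, orders):
    print(event)  # downstream rules see one unified, ordered stream
```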
While the discipline is still evolving, experts say businesses in data-intensive industries such as financial services are already exploring the use of CEP in different areas. For example, over the past two years financial firms have begun using CEP in high-speed electronic trading to help users make quick decisions based on fast-arriving market data, McPartland says.
More recently, he adds, financial services firms have applied CEP to risk management efforts, examining data from various sources across the enterprise to determine whether their organizations are meeting regulatory compliance and security requirements.
“One interesting [use of CEP] is to monitor the overall server infrastructure,” McPartland says. So, for example, a company could see precisely how much capacity it has on its servers at any point and calculate future performance and availability based on the applications currently running. CEP can also be used for low-latency monitoring, to predict whether network delays will occur that could have a huge impact on IT service delivery or the availability of information.
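As a rough illustration, with invented metric names and limits, the following sketch shows a monitoring rule that watches a stream of latency samples and raises a predictive warning before a service-level limit is actually breached.

```python
# Minimal sketch, assumed metric names and limits: a rule over a stream of
# latency samples that flags when the moving average trends toward a
# service-level limit, in the spirit of the low-latency monitoring described above.
from collections import deque

SLA_MS = 50.0                 # assumed service-level limit, milliseconds
samples = deque(maxlen=100)   # sliding window of recent latency measurements

def on_latency_sample(ms):
    samples.append(ms)
    avg = sum(samples) / len(samples)
    # Emit a predictive warning before the limit is actually breached.
    if avg > 0.8 * SLA_MS:
        return {"type": "latency_warning", "avg_ms": round(avg, 1), "limit_ms": SLA_MS}
    return None

for sample in (41.0, 43.5, 47.2, 49.9, 52.3):
    warning = on_latency_sample(sample)
    if warning:
        print("predictive alert:", warning)
```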
Like any other software-based technology, CEP is only as useful as the quality of the information that’s being analyzed, McPartland says. Also, organizations need to have an adequate IT infrastructure in place to fully benefit from CEP.
“A lot of vendors’ products get faster by throwing hardware at them,” McPartland says. “If the infrastructure is not up to snuff, then you’re not going to see the kind of speed or response times you hope to.”
Complex Event Processing Provides a New Level of Business Insight
Source: informationagenda.techweb.com
Posted: Monday, June 22, 2009