
Slides from OOP 2014 Online: Next-Generation BPM – How to create intelligent Business Processes thanks to Big Data

Just a short blog post with my slides from OOP 2014: Next-Generation BPM – How to create intelligent Business Processes thanks to Big Data.

Content

Business processes are often executed without access to relevant data because of the technical challenges of integrating large volumes of data from different sources into the BPM engine. Companies miss a huge opportunity here! This session shows how to build intelligent business processes that improve performance and outcomes by integrating big data – using only open source tooling.

Target Audience: Architects, Developers, Project Leaders, Managers, Decision Makers
Prerequisites: Basic knowledge of a programming language, databases, and BPM concepts
Level: Introductory

You will learn:
1) How to create intelligent business processes
2) Why to combine BPM and Big Data
3) How to combine BPM and Big Data

Extended abstract:
BPM is established, the tools are stable, and many companies use it successfully. However, today’s business processes are based on “dumb” data from relational databases or web services, and humans make decisions based on this information. Companies also use business intelligence and other tools to analyze their data. Yet business processes are executed without access to this important information because of the technical challenges of integrating large volumes of data from many different sources into the BPM engine. In addition, poor data quality caused by duplication, incompleteness, and inconsistency prevents humans from making good decisions. That is the status quo. Companies miss a huge opportunity here!
This session explains how to achieve intelligent business processes that use big data to improve performance and outcomes. A live demo – based on open source frameworks such as Apache Hadoop – shows how easily big data can be integrated into business processes. In the end, the audience will understand why big data needs BPM to improve data quality, and why BPM needs big data to achieve intelligent business processes.
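To make the idea more concrete, here is a minimal sketch (my illustration, not the demo from the talk) of how a BPM service task could pull an analytics result into a running process. It assumes an Activiti-style JavaDelegate and a hypothetical CustomerScoreService facade over scores batch-computed with Hadoop (exposed, for example, via Hive, HBase, or a REST service).

```java
// Hedged sketch: a service task delegate that enriches a process instance with a
// score precomputed on Hadoop. CustomerScoreService and the variable names
// (customerId, churnRisk) are illustrative assumptions, not framework APIs.
import org.activiti.engine.delegate.DelegateExecution;
import org.activiti.engine.delegate.JavaDelegate;

public class CustomerScoreDelegate implements JavaDelegate {

    /** Hypothetical facade over analytics results batch-computed with Hadoop. */
    public interface CustomerScoreService {
        double churnRiskFor(String customerId);
    }

    private final CustomerScoreService scoreService;

    public CustomerScoreDelegate(CustomerScoreService scoreService) {
        this.scoreService = scoreService;
    }

    @Override
    public void execute(DelegateExecution execution) {
        // Read the business key the process instance already carries.
        String customerId = (String) execution.getVariable("customerId");

        // Look up the precomputed score instead of relying only on the
        // "dumb" relational data available to the process.
        double churnRisk = scoreService.churnRiskFor(customerId);

        // Expose the score so downstream gateways and human tasks can route
        // the process or support the human decision maker.
        execution.setVariable("churnRisk", churnRisk);
    }
}
```

Wired into a process (for example via a delegate expression on a service task), a gateway right after this step could branch on an expression like ${churnRisk > 0.8} and escalate to a specialist instead of following the standard path.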

Slides

(Slide deck embedded from www.slideshare.net.)
