Categories: BPM

Slides online: How to create intelligent Business Processes thanks to Big Data (BPM, Apache Hadoop, Talend, Open Source)

I gave a talk at the EAM / BPM conference in London about “How to create intelligent Business Processes thanks to Big Data”. I just want to share the slides with you.

Content

BPM is established: tools are stable, and many companies use it successfully. However, today’s business processes are based on data from relational databases or web services, and humans make decisions based on this information. Companies also use business intelligence and other tools to analyze their data. Yet business processes are executed without access to this important information, because technical challenges arise when integrating large volumes of data from many different sources into the BPM engine. Additionally, bad data quality due to duplication, incompleteness, and inconsistency prevents humans from making good decisions. That is the status quo. Companies miss a huge opportunity here!

This session explains how to achieve intelligent business processes that use big data to improve performance and outcomes. A live demo shows how big data can be integrated into business processes easily, using only open source tooling. By the end, the audience will understand why BPM needs big data to achieve intelligent business processes.
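To make the idea concrete, here is a minimal, hypothetical sketch of what an "intelligent" process step can look like: a service task routes a claim automatically by comparing it against an aggregate computed from historical data. In the demo's spirit, that aggregate would be produced from Hadoop via Talend; here it is faked with an in-memory list. All class and method names are illustrative, not taken from the talk or any specific BPM engine.

```java
import java.util.List;

// Hypothetical service-task logic: branch a process on an aggregate
// derived from historical (big) data instead of asking a human.
public class ClaimRiskTask {

    // In a real setup this average would be precomputed on Hadoop
    // and loaded into the process via an integration job (e.g. Talend).
    static double averageClaim(List<Double> historicalAmounts) {
        return historicalAmounts.stream()
                .mapToDouble(Double::doubleValue)
                .average()
                .orElse(0.0);
    }

    // A BPMN gateway can branch on this decision string.
    static String route(double claimAmount, List<Double> historicalAmounts) {
        double avg = averageClaim(historicalAmounts);
        // Claims far above the historical average go to a human reviewer.
        return claimAmount > 2 * avg ? "manual-review" : "auto-approve";
    }

    public static void main(String[] args) {
        List<Double> history = List.of(100.0, 120.0, 80.0, 110.0);
        System.out.println(route(500.0, history)); // well above average -> manual-review
        System.out.println(route(90.0, history));  // typical -> auto-approve
    }
}
```

The point is not the trivial threshold rule but the wiring: once aggregated data is accessible from inside the process engine, decisions that previously required a human task can be automated or at least pre-scored.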

Slides

The slide deck is embedded from www.slideshare.net.

As always, I appreciate any kind of feedback via comment, email, Twitter, or social networks (LinkedIn, Xing). Thanks.

Best regards,

Kai Wähner (Twitter: @KaiWaehner)

