Systems Integration in the NoSQL Era with Apache Camel and Talend (MongoDB, Neo4j, HBase, AWS S3, Hazelcast, CouchDB)

In February 2013, I was at ApacheCon NA 2013 in Portland, Oregon, USA. It was a small but great conference. I met so many awesome Apache experts and learned a lot about several Apache projects.

Besides all of the Hadoop-related projects, I was especially interested in Apache Syncope, an open source system for managing digital identities in enterprise environments, and Apache Streams, a new Incubator project that aims to develop a scalable server for the publication, aggregation, filtering, and re-exposure of enterprise social activities.

My session was named “Systems Integration in the NoSQL Era with Apache Camel”. I showed how to integrate several different NoSQL databases such as MongoDB (document), Neo4j (graph), HBase (column), AWS S3 (key-value), and Hazelcast (in-memory). I used Apache Camel with a plain text editor and IDE. In addition, I showed some open source tooling on top of Camel with Talend ESB. With Talend, you can use a graphical user interface, and all Camel code is generated; you just have to configure your routes.
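To give an impression of how little code such an integration requires, here is a minimal sketch of a Camel route in the Java DSL that picks up JSON files from a directory and inserts them into MongoDB via the camel-mongodb component. The bean name mongoBean, the database and collection names, and the input directory are illustrative assumptions, not the exact code from my session:

import com.mongodb.MongoClient;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.camel.impl.SimpleRegistry;

public class FileToMongoRoute {
    public static void main(String[] args) throws Exception {
        // Register the MongoDB connection under the name referenced in the endpoint URI
        // (bean name, host, database, and collection below are illustrative assumptions)
        SimpleRegistry registry = new SimpleRegistry();
        registry.put("mongoBean", new MongoClient("localhost", 27017));

        DefaultCamelContext context = new DefaultCamelContext(registry);
        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // Poll a directory for JSON files and insert each one as a document into MongoDB
                from("file:data/inbox?noop=true")
                    .convertBodyTo(String.class)
                    .to("mongodb:mongoBean?database=conference&collection=sessions&operation=insert");
            }
        });

        context.start();
        Thread.sleep(10000); // let the route process files for a few seconds, then shut down
        context.stop();
    }
}

Swapping the target store is then mostly a matter of changing the endpoint URI, for example to the hbase, aws-s3, or hazelcast components; the routing logic itself stays the same.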

Here are the slides from my talk:


If you have any further questions, feel free to write a comment or contact me via Twitter, email or social networks (LinkedIn, Xing).

 

Best regards,

Kai Wähner (Twitter: @KaiWaehner)
