Confluent’s Connect with Confluent program streamlines real-time data integration on Confluent’s cloud-native Apache Kafka® platform, which supports hybrid, multi-cloud, and on-premises environments. With Onibex’s One Connect Databricks Sink Connector, businesses can stream data from SAP systems into Databricks for advanced analytics, machine learning, and operational intelligence, all without middleware.
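For context, Kafka Connect sink connectors are typically registered by submitting a short JSON configuration to the Connect REST API. The sketch below is illustrative only: the connector class name and the databricks.* property names are placeholders, not the connector’s documented settings, and it assumes a Connect worker is reachable on the default port 8083.

```python
import json
import requests  # assumes the requests library is installed

# Hypothetical registration of a Databricks sink connector via the
# Kafka Connect REST API. The connector class and all databricks.*
# properties are placeholders, not documented One Connect settings.
connector_config = {
    "name": "sap-to-databricks-sink",
    "config": {
        "connector.class": "com.onibex.connect.databricks.OneConnectDatabricksSink",  # placeholder
        "topics": "sap_sales_orders,sap_vendors",  # Kafka topics fed from SAP
        "databricks.workspace.url": "https://<workspace>.cloud.databricks.com",  # placeholder
        "databricks.token": "${secrets:databricks/token}",  # placeholder
        "tasks.max": "2",
    },
}

# POST the configuration to the Connect worker's REST endpoint.
response = requests.post(
    "http://localhost:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector_config),
)
response.raise_for_status()
print(response.json())
```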
The program also equips Onibex with invaluable engineering, sales, and marketing resources from Confluent to ensure customer success every step of the way, from deployment to ongoing support.
The One Connect Databricks Sink Connector unlocks game-changing capabilities for real-time and batch SAP data integration into Databricks. Here’s what makes it stand out:
Low/No-Code Drag-and-Drop Graphic Data Modeler: Map and integrate SAP tables and CDS views into entities with drag-and-drop functionality, minimizing coding and accelerating implementation.
Prepackaged SAP Entity Mapping: Upload 170+ pre-mapped SAP entities, from Sales Orders to Vendors, with a one-time setup that minimizes manual effort and streamlines deployment.
Real-Time & Batch Data Transmission: Start with batch data uploads, then activate real-time delta updates that capture changes as they happen. Apply filters at the application layer to select which data leaves SAP before it is delivered to Snowflake or Databricks via Kafka.
Easy Field and Column Definitions: Add fields and columns with a double-click, with no ABAP or complex coding required. The connector automatically creates missing tables, derives table names from Kafka topics, and maps SAP data to predefined structures (see the query sketch after this list).
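Once the connector has materialized a topic as a table, the data can be queried directly in Databricks. The snippet below is a minimal PySpark sketch assuming a table named sap_sales_orders was auto-created from a Kafka topic of the same name; the table and column names are illustrative, not part of the prepackaged mappings.

```python
from pyspark.sql import SparkSession

# On Databricks a `spark` session already exists; building one here keeps
# the sketch self-contained.
spark = SparkSession.builder.appName("sap-sink-check").getOrCreate()

# Hypothetical table auto-created by the sink connector, with its name
# derived from the Kafka topic "sap_sales_orders".
orders = spark.table("sap_sales_orders")

# Example aggregation over the streamed SAP data; column names are illustrative.
orders.groupBy("sales_org").count().show()
```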