Data and Analytics
SAP BTP Data and Analytics empowers organizations to harness the full potential of their data by integrating, managing, and analyzing it within an open and multi-cloud architecture. This week at TechEd, SAP introduces enhanced data lake capabilities for SAP Datasphere, a new knowledge graph engine in SAP HANA Cloud, and a real-time risk analysis feature in SAP Analytics Cloud, ensuring businesses can uncover hidden insights, optimize planning, and drive informed decision-making.
SAP Enhances Data Lake Capabilities for SAP Datasphere, powered by SAP HANA Cloud
SAP announces new embedded data lake capabilities for SAP Datasphere, its cloud data management solution that allows organizations to access, integrate, and manage data across various environments while preserving business context. Available by the end of Q4 2024, these enhancements – built on the generally available SAP HANA Cloud SQL on files and data lake file service capabilities – expand the business data fabric architecture with a data lake option that complements existing storage solutions. This helps businesses access, manage, and analyze data across cloud and hybrid environments while preserving its context and logic, supporting better decision-making, greater operational efficiency, and AI-driven innovation.
A business data fabric is an architectural framework that helps customers integrate, manage, and analyze data more effectively. This new data lake option will further enhance data analysis capabilities by allowing businesses to store raw data in its original context – unlike traditional approaches that often emphasize pre-processed, organized data.
SAP’s new embedded data lake capabilities will include:
An integrated object store, which provides a simpler way to store large amounts of data, making it easier for businesses to expand their storage as needed.
Spark compute, which facilitates efficient data transformation and processing based on existing SAP Datasphere data integration capabilities.
SQL on files functionality, which lets developers access data in the integrated object store without needing to physically copy it. File-based data then appears in SAP Datasphere data models just like any other persisted data (that is, data that is permanently stored and readily accessible), streamlining data integration, reducing storage redundancy, and ensuring data consistency; a generic sketch of this pattern follows the list.
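For illustration only, the following Python sketch shows the general SQL-on-files pattern – querying raw files in place rather than loading them into database tables first – using the open-source DuckDB engine as a stand-in rather than SAP’s own syntax; the file path and column names are hypothetical.

    # Generic illustration of the SQL-on-files pattern: query raw files where they
    # sit instead of copying them into database-managed storage first.
    # DuckDB is used here only as a stand-in engine; paths and columns are invented.
    import duckdb

    # In a real deployment this path would point at files in an object store;
    # a local directory keeps the sketch self-contained.
    result = duckdb.sql("""
        SELECT region, SUM(revenue) AS total_revenue
        FROM read_parquet('sales_data/*.parquet')
        GROUP BY region
        ORDER BY total_revenue DESC
    """).df()

    print(result)

The key point of the pattern is that the files remain the single copy of the data: the engine reads them on demand, which is what allows file-based data to behave like any other modeled, persisted data.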
These capabilities let users import and integrate data at scale through a consistent user experience, bringing in data from both SAP sources (including SAP S/4HANA and SAP Business Warehouse) and non-SAP sources while preserving its original structure and context. Coupled with SAP Datasphere’s recently unveiled ability to scale compute and processing memory based on SAP HANA Cloud’s elastic compute nodes, these capabilities signal SAP’s continued investment in enabling organizations of all sizes to implement a business data fabric in a cost-efficient manner.
SAP Introduces Knowledge Graph Capabilities to SAP HANA Cloud
SAP announces the addition of a knowledge graph engine to SAP HANA Cloud, its database-as-a-service for modern applications and analytics.
Generally available in the first half of 2025, the knowledge graph engine will let users identify and understand complex relationships in their data that are not otherwise discoverable with traditional data modeling tools. These capabilities will help users better integrate and draw meaningful inferences from diverse data, as well as provide context-aware insights for AI use cases.
Unlike traditional relational databases, which must express relationships through complex joins across tables, the knowledge graph engine can quickly surface meaningful insights across various data points without the usual performance and design challenges. It can achieve this because of the unique way a knowledge graph stores data: each piece of information is stored as three parts – the subject of the data, the object to which it is related, and the nature of the relationship between the two. This approach organizes data into a web of interconnected facts, making it easier to see how different pieces of information relate to one another. As a result, users can navigate these connections to derive actionable insights with greater speed and accuracy.
The knowledge graph engine will be based on the industry-standard Resource Description Framework (RDF), the W3C data model for representing information as a graph of subject-predicate-object triples. It will also support SPARQL, the industry-standard semantic query language that lets users interact with and extract useful information from a knowledge graph.
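As a rough, non-SAP illustration of the triple model and SPARQL described above, the following Python sketch uses the open-source rdflib library with an invented supplier example; the namespace and facts are hypothetical.

    # Minimal sketch of an RDF knowledge graph: every fact is a
    # (subject, predicate, object) triple. Uses the open-source rdflib library;
    # the namespace and facts below are invented for illustration.
    from rdflib import Graph, Namespace, Literal

    EX = Namespace("http://example.org/")
    g = Graph()

    g.add((EX.AcmeCorp, EX.supplies, EX.WidgetA))           # subject, predicate, object
    g.add((EX.AcmeCorp, EX.locatedIn, Literal("Germany")))
    g.add((EX.WidgetA, EX.usedIn, EX.ProductX))

    # SPARQL traverses the web of triples: which suppliers feed into ProductX?
    query = """
        PREFIX ex: <http://example.org/>
        SELECT ?supplier WHERE {
            ?supplier ex:supplies ?part .
            ?part ex:usedIn ex:ProductX .
        }
    """
    for row in g.query(query):
        print(row.supplier)   # -> http://example.org/AcmeCorp

The two-hop pattern in the WHERE clause (a supplier provides a part, and that part is used in ProductX) is exactly the kind of relationship that is awkward to express through table joins but natural to state and query in a graph.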
New SAP Analytics Cloud Capability Simplifies Real-Time Risk Analysis
SAP announces general availability of the new SAP Analytics Cloud compass feature in Q1 2025. Business users can use SAP Analytics Cloud compass to model complex risk scenarios and simulate a broad range of probable outcomes. For example, companies might use this feature to analyze how uncertainty in commodity prices, labor costs, or production volumes could impact operational expenses and revenue.
SAP Analytics Cloud compass employs the Monte Carlo simulation method, a computational technique that estimates the probability of different outcomes by running a large number of simulations with random input variables. The feature pairs this technique with a business-friendly interface that lets non-technical users perform real-time risk analysis without prior setup or specialized knowledge.
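As a simplified, non-SAP sketch of the Monte Carlo technique itself, the following Python example simulates operating profit under uncertain commodity prices and production volumes; all distributions and figures are invented.

    # Toy Monte Carlo simulation of operating profit under uncertainty.
    # The distributions, means, and spreads below are purely illustrative.
    import numpy as np

    rng = np.random.default_rng(seed=42)
    n_runs = 100_000

    # Uncertain inputs drawn from assumed distributions.
    commodity_price = rng.normal(loc=50.0, scale=5.0, size=n_runs)        # cost per unit
    production_volume = rng.normal(loc=10_000, scale=1_500, size=n_runs)  # units sold
    selling_price = 65.0                                                  # fixed price per unit

    profit = (selling_price - commodity_price) * production_volume

    # Summarize the simulated outcome distribution.
    p5, p50, p95 = np.percentile(profit, [5, 50, 95])
    print(f"5th percentile (pessimistic):  {p5:,.0f}")
    print(f"Median (most likely):          {p50:,.0f}")
    print(f"95th percentile (optimistic):  {p95:,.0f}")

Repeating the calculation across many randomized scenarios is what turns single-point estimates into the probability distributions and best-, worst-, and most-realistic-case boundaries described below.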
The new SAP Analytics Cloud compass:
Performs simulations directly where the data resides, reducing time and effort compared with manual analysis.
Visualizes best-case, worst-case, and most-realistic scenarios by displaying probability distributions and their corresponding boundaries.
Lets business users change assumptions on the fly, dynamically updating scenario models to reflect key uncertainties.
These capabilities help customers better understand key drivers of business performance and refine their simulations for improved decision-making.