digna Introduces Major Platform Update Enhancing Enterprise Data Quality and Observability

By: Newsfile

digna's new release adds global database connections, multiple source connections per project, logical datasources, anomaly relevance conditions, module-level notification settings, and CSV exports. It also expands data validation with multi-column uniqueness and referential integrity checks.

Vienna, Austria--(Newsfile Corp. - March 2, 2026) - digna has published its latest platform update, continuing development of its data quality and observability platform and reinforcing its architectural focus on adaptive anomaly detection in enterprise data environments. The release improves datasource modeling, connection management, and inspection usability, adds flexibility across modules, and expands data quality and validation coverage, including new capabilities in digna Data Validation.

Image caption: digna Introduces Major Platform Update Enhancing Enterprise Data Quality and Observability

To view an enhanced version of this graphic, please visit:
https://images.newsfilecorp.com/files/8552/285937_50403220f46e7b7a_001full.jpg

"We now make data quality operations easier to manage while strengthening the rules teams can enforce," said Marcin Chudeusz, CEO of digna. "By improving how connections and datasources are modeled, teams can build inspections that stay maintainable as projects and environments evolve."

Modernized connection management and reuse

The release introduces global database connections, which are configured once at the platform level and can be reused across all projects. This is designed to simplify configuration and maintenance, reduce operational overhead, and support consistent connectivity across environments.
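The reuse pattern the release describes can be sketched in plain Python. This is an illustrative sketch only, not digna's API: the registry name, connection fields, and project structure below are all assumptions made for the example.

```python
# Illustrative sketch (not digna's API): a global connection registry that
# projects reference by name instead of duplicating connection details.
GLOBAL_CONNECTIONS = {
    "warehouse": {"driver": "postgresql", "host": "dwh.example.com", "database": "analytics"},
    "staging":   {"driver": "postgresql", "host": "stg.example.com", "database": "raw"},
}

def resolve_connection(name: str) -> dict:
    """Look up a globally defined connection so every project reuses one definition."""
    if name not in GLOBAL_CONNECTIONS:
        raise KeyError(f"Unknown global connection: {name}")
    return GLOBAL_CONNECTIONS[name]

# Two projects referencing the same shared connection definition:
project_a = {"name": "sales_dq",   "connections": ["warehouse"]}
project_b = {"name": "finance_dq", "connections": ["warehouse", "staging"]}

for project in (project_a, project_b):
    for conn_name in project["connections"]:
        cfg = resolve_connection(conn_name)
        print(project["name"], "->", conn_name, cfg["host"])
```

A change to the `warehouse` entry then propagates to every project that references it, which is the maintenance benefit the release highlights.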

Projects can also now reference multiple source connection configurations. This enables more flexible setups for complex project data landscapes and supports heterogeneous data sources within realistic enterprise architectures.

Logical datasources and more context-aware anomaly evaluation

Datasources now represent a logical layer within a project. Each datasource can be backed by a database table, a database view, or a custom SQL statement. By separating the logical datasource layer from physical storage, the release decouples inspections and data quality rules from the underlying tables and improves reuse, clarity, and inspection modeling across modules.
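The table/view/custom-SQL backing described above can be illustrated with an in-memory SQLite database. This is a generic sketch of the pattern, not digna's implementation; the table names and the `datasources` mapping are assumptions for the example.

```python
import sqlite3

# Illustrative sketch (not digna's implementation): a "logical datasource"
# can be backed by a table, a view, or a custom SQL statement, so an
# inspection targets the logical name rather than physical storage.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders_raw (id INTEGER, amount REAL, status TEXT)")
conn.executemany("INSERT INTO orders_raw VALUES (?, ?, ?)",
                 [(1, 10.0, "open"), (2, 25.5, "closed"), (3, 7.25, "open")])
conn.execute("CREATE VIEW orders_view AS SELECT id, amount FROM orders_raw")

# Three backings for logical datasources in one project:
datasources = {
    "orders_table": "SELECT * FROM orders_raw",    # backed by a table
    "orders_view":  "SELECT * FROM orders_view",   # backed by a view
    "open_orders":  "SELECT id, amount FROM orders_raw WHERE status = 'open'",  # custom SQL
}

def row_count(name: str) -> int:
    """An inspection that only knows the logical name, not the storage behind it."""
    return conn.execute(f"SELECT COUNT(*) FROM ({datasources[name]})").fetchone()[0]

print(row_count("orders_table"), row_count("open_orders"))  # 3 2
```

Swapping `orders_raw` for a different physical table only requires updating the mapping, not the inspection, which is the decoupling the release describes.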

The release also adds an Anomaly Relevance Condition, defined at the dataset level, that controls whether anomaly status is evaluated. Statistics are calculated regardless of whether the condition is set or met; if the condition is not met, digna Data Anomalies does not assign an anomaly status.
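The evaluation behavior can be sketched as a predicate gating status assignment while statistics are always computed. This is an assumed simplification for illustration (the `inspect` function, its threshold, and the condition shape are not digna's API):

```python
# Illustrative sketch (not digna's API): statistics are always calculated,
# but an anomaly status is only assigned when the relevance condition holds.
def inspect(rows, relevance_condition=None, threshold=100):
    stats = {"row_count": len(rows)}  # always calculated
    relevant = relevance_condition is None or relevance_condition(stats)
    if not relevant:
        stats["anomaly_status"] = None  # condition not met: no status assigned
    else:
        stats["anomaly_status"] = "anomalous" if stats["row_count"] < threshold else "ok"
    return stats

# Only evaluate anomaly status for datasets with at least 10 rows:
condition = lambda s: s["row_count"] >= 10
print(inspect(range(5), relevance_condition=condition))    # status is None, stats still present
print(inspect(range(500), relevance_condition=condition))  # status is "ok"
```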

"Teams don't just need anomaly detection, they need anomaly evaluation that reflects real operating context," said Danijel Kivaranovic, PhD, CTO of digna. "The relevance condition helps ensure status is assigned only when the dataset meets the criteria that matter for that business scenario."

Notification control and CSV export for inspection results

Notifications can now be configured per module directly in digna, allowing independent control of alerting behavior for digna Data Anomalies, digna Data Timeliness, digna Data Validation, and other modules. This enables more precise alerting strategies aligned with team responsibilities and criticality.
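Per-module alerting of this kind amounts to a routing table keyed by module. The sketch below is hypothetical (the settings dict, channel names, and `route_alert` helper are assumptions, not digna's configuration format):

```python
# Illustrative sketch (not digna's configuration format): per-module
# notification settings that mute or route alerts independently.
NOTIFICATION_SETTINGS = {
    "Data Anomalies":  {"enabled": True,  "channel": "pagerduty"},
    "Data Timeliness": {"enabled": True,  "channel": "email"},
    "Data Validation": {"enabled": False, "channel": "email"},  # muted for this team
}

def route_alert(module: str, message: str):
    """Return a routed alert string, or None if the module's alerts are muted."""
    cfg = NOTIFICATION_SETTINGS.get(module)
    if cfg is None or not cfg["enabled"]:
        return None  # module muted or unknown: no alert sent
    return f"[{cfg['channel']}] {module}: {message}"

print(route_alert("Data Anomalies", "row count dropped 40%"))
print(route_alert("Data Validation", "uniqueness check failed"))  # None (muted)
```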

In addition, users can now download inspection results as CSV files, supporting offline analysis and integration with external tools. According to the release, the CSV export also simplifies audits, reporting, and downstream data quality analysis.

Expanded digna Data Validation coverage

With the new release, digna Data Validation supports a broader set of data quality rules, including row-level validation rules, multi-column uniqueness checks, and referential integrity validation across datasources. Together, these checks let teams enforce structural and relational data quality rules across complex data landscapes.

New Uniqueness Checks for Multiple Columns support validation of compound keys and business-level uniqueness constraints, helping detect duplicates that cannot be identified with single-column checks.
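A compound-key uniqueness check of this kind can be illustrated in a few lines. This is a generic sketch of the technique, not digna's implementation; the column names and sample rows are invented for the example.

```python
from collections import Counter

# Illustrative sketch of a multi-column uniqueness check: each
# (customer_id, order_date) pair must be unique, even though either
# column alone may legitimately repeat.
rows = [
    {"customer_id": 1, "order_date": "2026-03-01", "amount": 10.0},
    {"customer_id": 1, "order_date": "2026-03-02", "amount": 12.0},  # same customer, new date: ok
    {"customer_id": 2, "order_date": "2026-03-01", "amount": 99.0},  # same date, new customer: ok
    {"customer_id": 1, "order_date": "2026-03-01", "amount": 10.0},  # duplicate compound key
]

def duplicate_keys(rows, key_columns):
    """Return compound-key values that appear more than once."""
    counts = Counter(tuple(row[c] for c in key_columns) for row in rows)
    return [key for key, n in counts.items() if n > 1]

print(duplicate_keys(rows, ["customer_id", "order_date"]))  # [(1, '2026-03-01')]
```

A single-column check on either `customer_id` or `order_date` would flag the legitimate rows as duplicates, which is exactly the gap compound-key checks close.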

New Referential Integrity Checks validate relationships between datasources by ensuring foreign key values in a source datasource exist in a referenced target datasource. These checks support validation across different tables or views, different schemas, and different database connections within the same project, and are designed to work with logical datasources, including views and custom SQL. The release notes that they help detect orphaned records, broken relationships, and data consistency issues early; use cases listed include data warehouse integrity, regulatory reporting, master data consistency, and reliable downstream analytics.
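The orphan detection at the heart of such a check can be sketched generically. This is an illustrative example of the technique, not digna's implementation; the `orders`/`customers` data and the helper name are assumptions.

```python
# Illustrative sketch of a referential integrity check: every foreign key
# in the source datasource must exist in the referenced target datasource.
orders = [  # source datasource
    {"order_id": 1, "customer_id": 10},
    {"order_id": 2, "customer_id": 11},
    {"order_id": 3, "customer_id": 99},  # orphaned: no such customer
]
customers = [  # referenced target (could live in another schema or connection)
    {"customer_id": 10, "name": "Acme"},
    {"customer_id": 11, "name": "Globex"},
]

def orphaned_records(source, fk, target, pk):
    """Return source rows whose foreign key has no match in the target."""
    known = {row[pk] for row in target}
    return [row for row in source if row[fk] not in known]

print(orphaned_records(orders, "customer_id", customers, "customer_id"))
# [{'order_id': 3, 'customer_id': 99}]
```

Because the check only needs two row sets and two key names, it applies equally whether the datasources are tables, views, or custom SQL results, matching the logical-datasource design described earlier in the release.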

Contact:
Mayowa Ajakaiye
mayowa.ajakaiye@digna.ai

To view the source version of this press release, please visit https://www.newsfilecorp.com/release/285937
