Solutions

Issue Management

Create and manage issues as first-class elements within the platform, linking them to the data elements that are either the subject of or impacted by the issue:

  • One or multiple issues can be linked to other related issues.
  • The author and current owner of each issue are recorded.
  • One or multiple resolutions can be proposed for an issue.
    • Each resolution has both a target date and a completion date.

A workflow is available to automate the steps in the issue management process. This capability also allows anyone viewing a physical, logical, or business data element to easily see all the issues that refer to it, to better understand their current status.

Manage Data Sharing / Usage Agreements

Data sharing/usage agreements can be established to define the conditions of data sharing and usage, including access privileges, data usage limitations, data ownership, data expiration dates, and the conflict resolution process. Data consumers gain instant visibility into these agreements, ensuring they fully understand the data they can access and the conditions of access, while data providers remain aware of their commitments and obligations.

Manage Critical Data Elements

Data elements can be classified as critical, and other significant classifications, such as confidentiality, data retention, and privacy, can also be applied. This allows key data management processes like data governance and data quality to focus their efforts on the most important data elements. Search and query results can easily be filtered by a classification such as critical, helping users select the most important elements to work on. Diagrams generated to visualize data relationships can also be highlighted or filtered to focus on elements with a critical (or any other) classification.

Access and Analyze Operational Data in the Context of Its Metadata

The graph database of the Adaptive platform allows customers to record not only metadata but also operational data itself. This unique and powerful capability enables the analysis of operational data within the context of the metadata, which provides both business and technical meaning to the data.
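
As an illustration of this idea, here is a minimal in-memory sketch in which metadata nodes and operational rows live in one structure, so operational values can be queried by the business term they implement; the node layout is hypothetical, not Adaptive's graph schema:

```python
# One "graph" holding both a metadata layer and an operational layer.
nodes = {
    # metadata layer: a business term and a column that implements it
    "term:CustomerEmail": {"kind": "business_term"},
    "col:crm.contacts.email": {"kind": "column", "implements": "term:CustomerEmail"},
    # operational layer: rows point at the column metadata they instantiate
    "row:1": {"kind": "row", "column": "col:crm.contacts.email", "value": "a@x.com"},
    "row:2": {"kind": "row", "column": "col:crm.contacts.email", "value": "b@y.org"},
}

def values_for_term(term: str) -> list:
    """Return operational values for every column implementing a business term."""
    cols = {name for name, data in nodes.items()
            if data["kind"] == "column" and data.get("implements") == term}
    return [data["value"] for data in nodes.values()
            if data["kind"] == "row" and data["column"] in cols]
```

A query such as `values_for_term("term:CustomerEmail")` reaches operational values through their business meaning, which is the essence of analyzing data in the context of its metadata.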

Metadata Management

Metadata describing data in various systems, databases, data warehouses, data lakes, and cloud environments can be captured. Developers and business analysts can utilize the search, browse, and query functionality to find reusable data elements with ease. The search features encompass free text, wildcards, classifications, and creation/modification dates, which expedites new development, minimizes redundancy, and enhances reuse.

Data Lineage

Adaptive bridges can automatically harvest the metadata of data transformations from ETL tools and other transformation technologies and languages. This enables lineage analysis to be performed and visualized automatically to include interactive diagrams that support drilldown. Lineage analysis helps users comprehend the origin of data, as well as the sequence of transformations and movements made to it until it is incorporated into a report or analytic.
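
The traversal behind such lineage analysis can be sketched as a walk back over harvested transformation edges; the edge map below is illustrative, not harvested output:

```python
# Harvested transformation edges: target element -> its direct sources.
sources = {
    "report.revenue": ["mart.sales_summary"],
    "mart.sales_summary": ["warehouse.orders", "warehouse.fx_rates"],
    "warehouse.orders": ["staging.orders_raw"],
}

def lineage(element: str) -> set:
    """All upstream elements feeding `element`, found by walking edges back."""
    seen = set()
    stack = [element]
    while stack:
        for src in sources.get(stack.pop(), []):
            if src not in seen:
                seen.add(src)
                stack.append(src)
    return seen
```

Drilling down in an interactive lineage diagram corresponds to expanding one level of this walk at a time, from the report back to its raw staging data.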

Impact Analysis

Metadata is automatically captured for data elements across the enterprise in databases, data warehouses, data marts, data lakes, and the cloud. The transformations and consumption of data through stored procedures, ETL, BI reports, and other analytics are also automatically captured. When a change is proposed to a specific data element, transformation, or BI report, the direct and indirect impacts of the change can be easily determined. This allows management to assess the true cost of the change and properly anticipate all the actions required to accommodate it.
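
Conceptually, determining direct and indirect impacts is a downstream walk over captured consumption edges; the sketch below uses a hypothetical edge map and is not Adaptive's implementation:

```python
from collections import deque

# Consumption edges: element -> the elements that consume it.
consumers = {
    "db.customers.email": ["etl.load_contacts"],
    "etl.load_contacts": ["mart.contacts", "report.churn"],
    "mart.contacts": ["report.campaigns"],
}

def impacts(changed: str) -> dict:
    """Breadth-first walk downstream; depth 1 is direct, anything deeper is indirect."""
    direct, indirect = set(), set()
    queue = deque([(changed, 0)])
    seen = {changed}
    while queue:
        node, depth = queue.popleft()
        for nxt in consumers.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                (direct if depth == 0 else indirect).add(nxt)
                queue.append((nxt, depth + 1))
    return {"direct": direct, "indirect": indirect}
```

Separating direct from indirect impacts is what lets management weigh the immediate work against the longer tail of downstream changes.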

Data Quality Management

Data quality rules and metrics can be defined at the level of business terms, eliminating the need to redundantly define them for every physical data element. Utilizing traceability relationships, Adaptive’s data quality rule execution capability can then automate the execution of these rules against all operational data where those business terms are implemented.
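
The mechanism can be sketched as follows: a rule defined once at the business-term level is executed, via traceability, against every physical column mapped to that term. All names and the data-access stand-in are illustrative:

```python
# Traceability: business term -> physical columns implementing it.
traceability = {
    "Customer Email": ["crm.contacts.email", "billing.accounts.email_addr"],
}

# A quality rule defined once, at the term level, rather than per column.
rules = {
    "Customer Email": lambda v: isinstance(v, str) and "@" in v,
}

# Stand-in for access to operational data.
data = {
    "crm.contacts.email": ["a@x.com", "bad-value"],
    "billing.accounts.email_addr": ["b@y.org"],
}

def run_rule(term: str) -> dict:
    """Apply the term-level rule to every traced column; return failure counts."""
    rule = rules[term]
    return {col: sum(not rule(v) for v in data[col])
            for col in traceability[term]}
```

Adding a new physical column only requires extending the traceability mapping; the rule itself never has to be redefined.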

Data Privacy

  • Provide a comprehensive record of all Personally Identifiable Information (PII) data, detailing where it is stored, how it is moved, and which systems and applications utilize it.
  • Relate data design to goals and processes, to ensure that the data is minimally sufficient.
  • Define each relevant category of Personal Data as a data classification, following definitions from internal and external authorities. Privacy data classifications are made at the business level and automatically inferred at the logical and physical levels. This can be done using a bottom-up approach (e.g., identifying fields whose values resemble a Social Security number) or a top-down approach (e.g., fields named “SSN” or “Social”).
  • Capture essential metadata for each data element, to manage privacy policies:
    • Data subject
    • Date collected
    • Authorized purpose(s)
    • Origin (person directly, purchased database, calculated, etc.)
    • Date updated and origin of update
    • Date checked/confirmed and how (e.g., checked against an external database)
    • Next review date
    • Target disposal date
    • Data owner, steward, consumer, and other relevant roles
  • Browsable privacy regulations (GDPR or other relevant policies) can be traced to business terms.
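
The bottom-up and top-down classification approaches mentioned above can be sketched as a value-pattern check and a field-name check; the pattern and name hints are illustrative:

```python
import re

# Bottom-up: does the value shape look like a Social Security number?
SSN_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")
# Top-down: does the field name hint at SSN content?
NAME_HINTS = ("ssn", "social")

def classify_field(name: str, sample_values: list) -> bool:
    """Flag a field as likely SSN via name hints or value patterns."""
    top_down = any(hint in name.lower() for hint in NAME_HINTS)
    bottom_up = any(SSN_PATTERN.match(str(v)) for v in sample_values)
    return top_down or bottom_up
```

In practice both signals would feed the business-level classification, which is then inferred down to the logical and physical levels.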

Machine Learning / Artificial Intelligence Inference Engine

Adaptive harnesses AI/machine learning algorithms to analyze real-time data and user behavior for customized data insights, implementing and governing models that continuously refine and enhance predictions about behavior based on client-centric analytics.

  • Store, manage, and track models and experiments.
  • Monitor model performance through real-time visualization that automatically generates charts, graphs, and other visual aids for seamless collaboration within your team.
  • Leverage data/model driven capabilities.
  • Store models, metadata, parameters, code version, metrics, and artifacts.
  • Automatically record code and parameters while tracking changes, for effortless data science replication.
  • Auto-link terms in unstructured documents to relevant terms in Adaptive, connecting unstructured and structured data sources.
  • Manage data sets through tagging, automated versioning, and querying capabilities.

Seamlessly connect your data and quickly employ machine learning and data science techniques to enrich data sets with advanced insights, hidden patterns, and more. Utilize data science to identify emerging patterns in your data sources, guided by specific use cases.
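
The experiment-tracking bullets above (store runs with their parameters, metrics, and code version; compare them later for replication) can be sketched as follows; everything here is illustrative, not Adaptive's API:

```python
# A run log: each entry records what was trained, how, and with what result.
runs = []

def log_run(model: str, params: dict, metrics: dict, code_version: str) -> dict:
    """Record one experiment so it can be replicated and compared later."""
    run = {"model": model, "params": dict(params),
           "metrics": dict(metrics), "code_version": code_version}
    runs.append(run)
    return run

log_run("churn_rf", {"n_trees": 100}, {"auc": 0.91}, "abc123")
log_run("churn_rf", {"n_trees": 300}, {"auc": 0.93}, "abc124")

# Comparing tracked runs: pick the best by a recorded metric.
best = max(runs, key=lambda r: r["metrics"]["auc"])
```

Because the code version is stored with each run, any result can be traced back to the exact code and parameters that produced it.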

Risk Management

Utilize SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats) to establish a cohesive integrated risk framework, encompassing data, business processes, technology, and organization/people. This approach will facilitate centralized, efficient, and consistent risk reporting.

Report Rationalization

Today’s organizations struggle with managing their extensive collection of business, operational, and technical reports. Frequently, reports are generated without considering existing resources, resulting in report proliferation, insufficient governance, and persistent issues with redundancy, inconsistency, and maintenance. Adaptive tackles these challenges by utilizing the business semantics associated with these reports. It provides efficient capabilities to collect report metadata, link it to business semantics, analyze and identify redundancies, and rationalize these redundant reports.

Manage Third-Party Data Subscriptions

Effective management of purchased data feeds involves identifying the necessary information, understanding its usage, and detecting redundancies in data sources. This approach helps eliminate costly redundancies in third-party data subscriptions and highlights unused subscriptions that can be cancelled.

Reference Data Management

  • Manage a comprehensive list of reference data items, including designated responsibilities and usage locations.
  • Establish and maintain new codes and code sets to ensure consistent versions are used throughout the system (golden source capability).
  • Function as the operational source for real-time reference data access by other operational applications.
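
As a sketch of the golden-source idea, a code set might be published in immutable versions, with operational systems always reading the latest one; the class and names are illustrative:

```python
class CodeSet:
    """A versioned reference-data code set acting as a golden source."""

    def __init__(self, name: str):
        self.name = name
        self.versions = []   # each entry is one published snapshot of the codes

    def publish(self, codes: dict) -> int:
        """Publish a new immutable version; returns the 1-based version number."""
        self.versions.append(dict(codes))
        return len(self.versions)

    def current(self) -> dict:
        """Latest published version: the one operational systems should consume."""
        return self.versions[-1]

# Example: two published versions of a country-code set.
country = CodeSet("country_codes")
country.publish({"US": "United States", "FR": "France"})
country.publish({"US": "United States", "FR": "France", "DE": "Germany"})
```

Keeping every published version makes it possible to pin a consumer to a specific version while the golden source continues to evolve.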

Compliance

Enable internal and external auditors to trace the lineage of each data item on a regulatory report, ensuring proper management and auditability of all data transformations and storage. Examples include our Bank-in-a-Box (BiaB) capability offerings, which address compliance needs such as Anti-Money Laundering and Data Privacy.

Business Glossary

  • Manage and store business terms and definitions in their respective business glossaries, and apply classifications to understand their specific purposes, usage, and characteristics.
  • Map each term to an enterprise glossary, industry ontology, or business taxonomy to compare it with standardized definitions.
  • Trace each term to the logical and physical data elements that implement it, enabling business users to quickly and easily discover all data storage locations that implement the corresponding term.
  • Provide immediate visibility to the business term(s) defining the meaning and purpose of a data element for technical users such as software developers, DBAs, and BI report writers.

Data Governance

  • Automate the change approval process to ensure that any proposed modification to a business, logical, or physical data element undergoes the appropriate sequence of review and approval. Each reviewer is notified of a pending review via automated email subscriptions.
  • Document all relationships (owner, steward, consumer, stakeholder, etc.) for each key data element, ensuring that everyone is aware of their responsibilities and roles in managing the data.

Data Catalog

  • The data catalog functionality allows users to search, comment on, rate, recommend, and subscribe to data sets. Users are automatically notified of any changes and other activities relating to the data sources and data sets they have subscribed to.
  • Data profiling and usage statistics can be provided for data sets.
  • The data catalog enhances the experience for BI analysts by helping them discover and utilize internal “approved” data sets that meet their requirements. It also facilitates collaboration in the overall data set onboarding process.
  • Power data users can leverage the catalog’s capabilities to manipulate arbitrary combinations of data across multiple data sets. This enables them to view personalized information while the data catalog provides essential characteristics of the data at a glance, such as certification level, quality level, ownership/stewardship, and content. This knowledge empowers users to combine diverse data sets for running specific and more sophisticated BI analyses.
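
The subscribe-and-notify behavior described above can be sketched as a simple observer pattern; the class and names are illustrative, not Adaptive's API:

```python
class Catalog:
    """Minimal catalog: users subscribe to data sets and get notified on changes."""

    def __init__(self):
        self.subscribers = {}   # data set -> set of subscribed users
        self.outbox = []        # (user, message) notifications awaiting delivery

    def subscribe(self, user: str, dataset: str):
        self.subscribers.setdefault(dataset, set()).add(user)

    def record_change(self, dataset: str, change: str):
        """Notify every subscriber of the changed data set."""
        for user in self.subscribers.get(dataset, ()):
            self.outbox.append((user, f"{dataset}: {change}"))

# Example: an analyst subscribes, then a change to the data set is recorded.
catalog = Catalog()
catalog.subscribe("analyst1", "sales_2024")
catalog.record_change("sales_2024", "schema updated")
```

In a real deployment the outbox would feed email or in-app notifications rather than an in-memory list.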

Enterprise Architecture

  • Provide a comprehensive overview of business capabilities, processes, systems, and data.
  • Evaluate the adaptability of an enterprise’s architecture in supporting business initiatives and estimate the end-to-end cost of required changes for a cost/benefit analysis.
  • Align IT capabilities with business processes and goals to achieve synergy and maximize efficiency.
  • Utilize As-is vs. To-be analysis to compare new business or IT architecture to its original state, assessing the impact of changes and evaluating their benefits.
  • Establish and manage business and IT standards, policies, and definitions, as well as their influence on data elements, processes, systems, applications, and services.
  • Define and link business rules and key performance indicators (KPI) to the data and process elements they measure, ensuring that architects making changes are immediately aware of the business rules and KPIs impacted by their modifications.

Parts Logistics

  • Capture and associate parts and components, along with their relationships in relevant packages and offerings, to maintain an accurate inventory.
  • Enforce quality and reporting requirements, documenting and communicating them to all relevant stakeholders to ensure compliance.
  • Integrate with infrastructure communication protocols and integration buses, ensuring that operational systems have a single source of truth for packaging and delivery data.