Data Integration Tools   

METL / Metadata ETL

Semantic rules deliver capabilities and efficiency beyond what conventional ETL tools offer

We believe semantic ETL will have such a profound impact on data management that it will become the dominant ETL methodology within a decade

Use Cases Where METL Excels

  • Large Schemas: Manage "Fat Data" with minimal staffing. Ideal for financial services, healthcare and government.
  • Regulatory Compliance: Track and report on data lineage, the data dictionary, personally identifiable information (PII), obfuscation of PII, SDLC, data errors and mitigation.
  • Schemas with Frequent Changes: Automatic detection of source schema changes and rules-based ETL are ideal for any application development with data integration requirements. App developers can add fields to an OLTP database (SQL or NoSQL), and the schema changes are automatically propagated to the reporting database (see the schema-propagation sketch following this list). This allows agile app development without delays caused by ETL development.
  • Runtime/Realtime Schema Management: Applications with dynamic forms, semi-structured data or dynamic data movement. Using rules, changes can automatically be mapped to existing data integration touchpoints or used to create new ones on the fly.
  • Numerous or Changing Flat Files: Sweep an FTP directory, ingest multiple files, read column headers, sample data, detect schema changes, then create or update our data dictionary and, optionally, the destination schema (see the flat-file sweep sketch following this list).
  • NoSQL ETL: Native JSON processing keeps METL unopinionated about SQL versus NoSQL sources and destinations.
  • Hybrid SQL and JSON or Name/Value Pair Data Models: Seamlessly handle SQL and semi-structured data. Pivot/unpivot and normalize JSON automatically (see the JSON-flattening sketch following this list).
  • Minimize or Eliminate ETL Programmer Staffing: ETL programmers are expensive and difficult to find and retain, and turnover is costly because programmers accumulate institutional knowledge. Legacy ETL tools magnify this by relying on manual documentation that is rarely kept up to date. METL's metadata and lexicons are self-documenting, and anyone with database skills can understand our data-dictionary-style interface. After the initial configuration to implement your data architecture, METL requires minimal ongoing programming: rules resolve most changes automatically, and the remainder can be handled through our web portal by a DBA or anyone with data experience.
  • Automated Change/Error Management: The rules engine can resolve most changes and errors automatically. METL Manager includes its own optional BPM service that automatically creates a workflow for unhandled schema changes or errors requiring human intervention. The workflow can include approvals, QA tasks, etc., and then update the metadata and schema directly. Alternatively, METL's microservices architecture allows integration with your existing SDLC or workflow/BPM applications.
  • Multi-Vendor or Customer Data Integration: Automatically detect and resolve changes when your vendors alter their schemas or file formats. Centrally manage multiple ODS, flat-file or REST data feeds. Allow your customers to define their own data feeds via the web portal.
  • Elasticsearch Integration: Streaming or batch. Combines partial documents from a normalized SQL data model (see the Elasticsearch sketch following this list).
  • Scalability: METL is built on Kafka's persisted message queue. Parallelize and partition data flows at the source and destination independently, and scale out across a server farm (see the Kafka sketch following this list).
  • Data Architecture: As an unopinionated platform with extensible rules and metadata, METL can implement any data architecture. Rules implement your specific design of data warehouse, data lake, ODS, search engine, flat files or REST endpoints.
  • Embed METL in Your Application: Microservices interface allows easy integration. Add real-time/codeless data integration to your app. OEM licensing available.
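
The schema-change propagation described under "Schemas with Frequent Changes" can be pictured with a minimal sketch. It is illustrative only: two SQLite databases stand in for the OLTP and reporting stores, and a hand-written ALTER statement stands in for METL's rules engine and metadata.

```python
# Minimal sketch of schema-change propagation: any column added to the
# source (OLTP) table that is missing from the reporting table is created
# there with the same declared type. Table names and the sqlite3 backend
# are assumptions for illustration, not METL's actual implementation.
import sqlite3

def table_columns(conn, table):
    """Return {column_name: declared_type} using SQLite's table_info pragma."""
    return {row[1]: row[2] for row in conn.execute(f"PRAGMA table_info({table})")}

def propagate_new_columns(source, reporting, table):
    src_cols = table_columns(source, table)
    dst_cols = table_columns(reporting, table)
    for name, coltype in src_cols.items():
        if name not in dst_cols:
            # In METL this step would be driven by rules and metadata.
            reporting.execute(f"ALTER TABLE {table} ADD COLUMN {name} {coltype}")
            print(f"propagated new column: {table}.{name} ({coltype})")
    reporting.commit()

if __name__ == "__main__":
    src = sqlite3.connect(":memory:")
    dst = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE orders (id INTEGER, amount REAL, promo_code TEXT)")
    dst.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    propagate_new_columns(src, dst, "orders")  # adds promo_code to the reporting DB
```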
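
The flat-file sweep can be sketched in the same spirit. Here a local directory stands in for the FTP drop zone and a plain Python dict stands in for the data dictionary; both are assumptions for illustration, not METL's implementation.

```python
# Sketch of a flat-file sweep: scan a directory of CSVs, read each file's
# header row, and diff it against a previously recorded data dictionary.
import csv
import glob
import os

def read_header(path):
    """Return the column names from the first row of a CSV file."""
    with open(path, newline="") as fh:
        return next(csv.reader(fh), [])

def sweep(directory, data_dictionary):
    """Return {filename: {'added': [...], 'removed': [...]}} for changed feeds."""
    changes = {}
    for path in glob.glob(os.path.join(directory, "*.csv")):
        feed = os.path.basename(path)
        header = read_header(path)
        known = data_dictionary.get(feed, [])
        added = [c for c in header if c not in known]
        removed = [c for c in known if c not in header]
        if added or removed:
            changes[feed] = {"added": added, "removed": removed}
            data_dictionary[feed] = header  # update the dictionary in place
    return changes
```

From here, a real pipeline would also sample the data to infer types and, optionally, alter the destination schema, which is the part METL's rules automate.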
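
For the hybrid SQL/JSON case, the sketch below shows one common way to normalize a nested JSON document into flat rows: nested objects are flattened to dotted key paths and each element of a nested array becomes its own row. In METL this kind of transformation is driven by rules rather than hand-written code; the field names here are hypothetical.

```python
# Sketch of normalizing semi-structured JSON into flat, SQL-friendly rows.
def flatten(doc, prefix=""):
    """Flatten nested dicts into a single dict with dotted key paths."""
    row = {}
    for key, value in doc.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, path + "."))
        else:
            row[path] = value
    return row

def normalize(doc, array_field):
    """Emit one flat row per element of `array_field`, repeating parent fields."""
    parent = {k: v for k, v in doc.items() if k != array_field}
    for item in doc.get(array_field, [{}]):
        yield {**flatten(parent), **flatten(item, array_field + ".")}

order = {
    "order_id": 42,
    "customer": {"id": 7, "name": "Acme"},
    "lines": [{"sku": "A1", "qty": 2}, {"sku": "B9", "qty": 1}],
}
for row in normalize(order, "lines"):
    print(row)
# {'order_id': 42, 'customer.id': 7, 'customer.name': 'Acme', 'lines.sku': 'A1', 'lines.qty': 2} ...
```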
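
The Elasticsearch use case, combining partial documents from a normalized SQL model, can be approximated with the bulk update (doc_as_upsert) pattern shown below. The index name, grouping key, and the use of the elasticsearch-py client are assumptions for illustration.

```python
# Sketch of folding rows from a normalized SQL model into partial
# Elasticsearch documents: each source table contributes its fragment of
# the final document, merged by the business key via doc_as_upsert.
from elasticsearch import Elasticsearch, helpers

def partial_doc_actions(rows, index, key_field):
    """Turn SQL rows into bulk 'update' actions keyed by the business key."""
    for row in rows:
        yield {
            "_op_type": "update",
            "_index": index,
            "_id": row[key_field],
            "doc": {k: v for k, v in row.items() if k != key_field},
            "doc_as_upsert": True,  # create the document if it does not exist yet
        }

client = Elasticsearch("http://localhost:9200")  # hypothetical local cluster
customer_rows = [{"customer_id": 7, "name": "Acme"}]
address_rows = [{"customer_id": 7, "city": "Austin", "state": "TX"}]
helpers.bulk(client, partial_doc_actions(customer_rows, "customers", "customer_id"))
helpers.bulk(client, partial_doc_actions(address_rows, "customers", "customer_id"))
```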
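
The scale-out pattern behind the Scalability bullet looks roughly like this: messages keyed by source object land on Kafka partitions by key hash, and any number of consumers in one consumer group split those partitions between them. The topic name, broker address, and kafka-python client are assumptions; METL's own services are not shown.

```python
# Sketch of partitioned, parallel data flow on Kafka.
import json
from kafka import KafkaProducer, KafkaConsumer

TOPIC = "metl.changed-rows"   # hypothetical topic name
BROKERS = "localhost:9092"

producer = KafkaProducer(
    bootstrap_servers=BROKERS,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
# Keying by table keeps each table's rows ordered within a single partition.
producer.send(TOPIC, key=b"orders", value={"id": 1, "amount": 9.99})
producer.flush()

# Run several copies of this consumer; Kafka assigns each a share of the partitions.
consumer = KafkaConsumer(
    TOPIC,
    group_id="metl-loaders",
    bootstrap_servers=BROKERS,
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.partition, message.value)
```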

Talk to a Data Architect

Our POC Challenge:
We Can Implement Major Aspects of Your ETL Within Days in a Proof of Concept.

Copyright © 2021 Data Integration Tools LLC