# Data Type Mappings
How the platform maps data types between different database systems and handles type conversions during transformations.
## Overview
When transforming data between databases, the platform automatically maps source data types to equivalent destination types. This ensures data compatibility while preserving semantics and precision.
## Type Mapping Strategy

The platform uses these rules for type mapping:

- **Preserve semantics**: ensure the meaning of the data is maintained
- **Preserve precision**: don't lose decimal places or string length
- **Use native types**: prefer the destination database's native types
- **Handle edge cases**: NULL values, special characters, timezone data
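As an illustration only, these rules might be applied like the minimal sketch below; `mapType` and its fallback behavior are hypothetical, not the platform's actual API:

```javascript
// Hypothetical sketch of the mapping rules above; mapType and its
// fallbacks are assumptions for illustration, not the platform's API.
function mapType(sourceType, dest) {
  // Preserve precision: carry a VARCHAR length through unchanged.
  const varchar = sourceType.match(/^VARCHAR\((\d+)\)$/);
  if (varchar) return `VARCHAR(${varchar[1]})`;
  // Use native types: CLOB has no direct MySQL equivalent, so LONGTEXT is used.
  if (sourceType === 'CLOB') return dest === 'mysql' ? 'LONGTEXT' : 'TEXT';
  // Fall through for types that map one-to-one.
  return sourceType;
}

console.log(mapType('VARCHAR(255)', 'postgresql')); // VARCHAR(255)
```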
## Common Type Mappings

### String Types

| Source Type | PostgreSQL | MySQL |
|---|---|---|
| VARCHAR(N) | VARCHAR(N) | VARCHAR(N) |
| CHAR(N) | CHAR(N) | CHAR(N) |
| TEXT | TEXT | TEXT / LONGTEXT |
| CLOB | TEXT | LONGTEXT |
### Numeric Types

| Source | PostgreSQL | MySQL | Oracle |
|---|---|---|---|
| INTEGER | INTEGER | INT | NUMBER(10) |
| BIGINT | BIGINT | BIGINT | NUMBER(19) |
| DECIMAL(P,S) | NUMERIC(P,S) | DECIMAL(P,S) | NUMBER(P,S) |
| FLOAT | REAL | FLOAT | BINARY_FLOAT |
| DOUBLE | DOUBLE PRECISION | DOUBLE | BINARY_DOUBLE |
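The DECIMAL(P,S) row can be sketched as a simple lookup, as in the hypothetical helper below (the names are illustrative, not platform APIs):

```javascript
// Sketch of the DECIMAL(P,S) mappings above as a lookup table;
// DECIMAL_MAPPINGS and mapDecimal are illustrative names, not platform APIs.
const DECIMAL_MAPPINGS = {
  postgresql: (p, s) => `NUMERIC(${p},${s})`,
  mysql: (p, s) => `DECIMAL(${p},${s})`,
  oracle: (p, s) => `NUMBER(${p},${s})`,
};

function mapDecimal(dest, precision, scale) {
  const build = DECIMAL_MAPPINGS[dest];
  if (!build) throw new Error(`Unsupported destination: ${dest}`);
  return build(precision, scale); // precision and scale are preserved
}

console.log(mapDecimal('oracle', 19, 4)); // NUMBER(19,4)
```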
## Database-Specific Types

### PostgreSQL Specific

**Arrays:**

```
INTEGER[] → supported in PostgreSQL only
          → converted to JSON in other databases
```

**JSON/JSONB:**

```
JSONB → supported in PostgreSQL
      → JSON in MySQL
      → CLOB in Oracle
```

**UUID:**

```
UUID → UUID in PostgreSQL
     → CHAR(36) in MySQL/Oracle
```

See: Type Handling
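The array fallback amounts to serializing the array as JSON text, roughly as in this sketch (assumed behavior, for illustration only):

```javascript
// Sketch: serialize an INTEGER[] value to a JSON string for destinations
// without native array support (assumed behavior, not the platform's code).
function arrayToJson(values) {
  if (values === null) return null; // preserve NULLs
  return JSON.stringify(values);
}

console.log(arrayToJson([1, 2, 3])); // [1,2,3]
```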
### Oracle Specific

**NUMBER:**

```
NUMBER → NUMERIC in PostgreSQL
       → DECIMAL in MySQL
```

**CLOB/BLOB:**

```
CLOB → TEXT in PostgreSQL
     → LONGTEXT in MySQL
```
### MySQL Specific

**TINYINT:**

```
TINYINT(1) → BOOLEAN in PostgreSQL
           → NUMBER(1) in Oracle
```

See: MySQL TINYINT

**ENUM:**

```
ENUM('A','B','C') → VARCHAR with CHECK constraint
```
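The TINYINT(1) → BOOLEAN conversion can be sketched as below, assuming the usual MySQL convention that 0 is false and any nonzero value is true:

```javascript
// Sketch of TINYINT(1) → BOOLEAN coercion; assumes MySQL's convention
// that 0 means false and any nonzero value means true.
function tinyintToBoolean(value) {
  if (value === null) return null; // preserve NULLs
  return value !== 0;
}

console.log(tinyintToBoolean(1)); // true
```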
## Type Conversion During Transformation

The platform handles type conversions automatically.

### Implicit Conversions

**String → Number:**

```
# Original column: VARCHAR containing "12345"
# Destination: INTEGER
# Platform converts: "12345" → 12345
```

**Number → String:**

```
# Original: INTEGER 12345
# Destination: VARCHAR
# Platform converts: 12345 → "12345"
```
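In isolation, the string → number case might look like this hypothetical helper (the platform's internal conversion may differ, e.g. in how failures are reported):

```javascript
// Minimal sketch of implicit VARCHAR → INTEGER coercion with validation.
// This is a hypothetical helper, not the platform's internal implementation.
function toInteger(value) {
  if (value === null) return null; // preserve NULLs
  const n = Number(value);
  if (!Number.isInteger(n)) {
    throw new Error(`Cannot convert "${value}" to INTEGER`);
  }
  return n;
}

console.log(toInteger("12345")); // 12345
```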
### Explicit Conversions

Use Value Mapping for custom conversions:

```yaml
- columns: ["status_code"]
  type: ValueMapping
  params:
    mappings:
      "1": "ACTIVE"
      "0": "INACTIVE"
```

See: Value Mapping
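Conceptually, a value mapping is a per-value lookup, roughly as sketched below (the helper name and the fall-back-to-original behavior are assumptions, not the documented semantics):

```javascript
// Sketch of how a value mapping might be applied per record; the
// fallback-to-original behavior is an assumption for illustration.
const mappings = { "1": "ACTIVE", "0": "INACTIVE" };

function applyValueMapping(value, table) {
  // Fall back to the original value when no mapping matches.
  return Object.prototype.hasOwnProperty.call(table, value) ? table[value] : value;
}

console.log(applyValueMapping("1", mappings)); // ACTIVE
```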
## Custom Type Handling

For unsupported types, use the Scripting Transformer:

```yaml
- columns: ["custom_type_column"]
  type: Scripting
  params:
    mode: MASKING
    code: |
      (ctx, originalRecord) => {
        const value = originalRecord.get('custom_type_column');
        // Custom conversion logic
        return convertCustomType(value);
      }
```
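The callback can be exercised outside the platform with a stubbed record; `convertCustomType` below is a placeholder implementation (the original config leaves it undefined), and the Map stands in for the platform's record object:

```javascript
// Runnable sketch of the Scripting callback with a stubbed record.
// convertCustomType is a placeholder; real logic depends on the custom type.
function convertCustomType(value) {
  return value === null ? null : String(value).toUpperCase();
}

// A Map stands in for the platform's record object, which exposes get().
const originalRecord = new Map([['custom_type_column', 'point(1 2)']]);

const transform = (ctx, record) => {
  const value = record.get('custom_type_column');
  return convertCustomType(value);
};

console.log(transform({}, originalRecord)); // POINT(1 2)
```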
## Best Practices

- Test type mappings on small datasets first
- Check precision for DECIMAL types (don't lose decimal places)
- Validate NULL handling for NOT NULL constraints
- Use explicit mappings for critical business data
- Document custom conversions in the workflow config