DDL (Data Definition Language) Import allows you to paste SQL CREATE TABLE statements and have ChartDB automatically parse and visualize your database schema. This is the most flexible import method, supporting multiple SQL dialects.

Overview

The DDL import system uses sophisticated SQL parsing to extract:
  • Table definitions
  • Column types and constraints
  • Primary keys and foreign keys
  • Indexes (including composite indexes)
  • Check constraints
  • Default values
  • Comments
  • Views
  • Enum types
  • Alter table statements

Supported SQL Dialects

ChartDB includes specialized parsers for different SQL dialects:

PostgreSQL

Full PostgreSQL syntax including arrays, custom types, and extensions

MySQL

MySQL and MariaDB with engine specifications and charset

SQL Server

T-SQL syntax with SQL Server-specific features

Oracle

Oracle PL/SQL DDL statements

SQLite

SQLite syntax with virtual tables and triggers

Generic SQL

Standard SQL for other databases

PostgreSQL DDL Import

ChartDB’s PostgreSQL parser is the most comprehensive, supporting advanced PostgreSQL features.

Basic Table Import

CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    email VARCHAR(255) NOT NULL UNIQUE,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

Advanced PostgreSQL Features

PostgreSQL array types are fully supported:
CREATE TABLE posts (
    id SERIAL PRIMARY KEY,
    tags TEXT[],
    categories VARCHAR(50)[],
    metadata JSONB
);
Arrays are imported with the isArray: true attribute.
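As a rough illustration, a parsed array column might come out looking like this. The interface and field names are assumptions for illustration only, not ChartDB's actual internal model:

```typescript
// Hypothetical shape of a parsed column; the field names here are
// illustrative assumptions, not ChartDB's real internal types.
interface ParsedField {
    name: string;
    type: string;        // base type, e.g. "text" for TEXT[]
    isArray?: boolean;   // set to true for PostgreSQL array columns
    nullable: boolean;
    primaryKey: boolean;
}

// How the `tags TEXT[]` column above could be represented after parsing:
const tags: ParsedField = {
    name: 'tags',
    type: 'text',
    isArray: true,
    nullable: true,
    primaryKey: false,
};
```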

ALTER TABLE Support

PostgreSQL ALTER TABLE statements are fully parsed:
ALTER TABLE users 
ADD COLUMN phone VARCHAR(20),
ADD COLUMN address TEXT;

MySQL DDL Import

MySQL syntax is fully supported with dialect-specific features:
CREATE TABLE `users` (
  `id` INT AUTO_INCREMENT PRIMARY KEY,
  `email` VARCHAR(255) NOT NULL UNIQUE,
  `status` ENUM('active', 'inactive', 'banned') DEFAULT 'active',
  `created_at` TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  INDEX `idx_email` (`email`),
  CONSTRAINT `chk_email` CHECK (`email` LIKE '%@%')
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;

MySQL-Specific Features

Engine specifications are recognized:
ENGINE=InnoDB
ENGINE=MyISAM
ENGINE=MEMORY
Charset and collation settings:
DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci
AUTO_INCREMENT columns are marked with increment: true:
id BIGINT AUTO_INCREMENT PRIMARY KEY
MySQL backtick identifiers are properly handled:
CREATE TABLE `order-items` (
  `order-id` INT,
  `product-id` INT
);
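Handling quoted identifiers across dialects (backticks in MySQL, brackets in SQL Server, double quotes in standard SQL) amounts to stripping the matching delimiter pair. A minimal sketch, with an illustrative function name rather than ChartDB's actual API:

```typescript
// Strip the quoting delimiters from an identifier: `backticks` (MySQL),
// [brackets] (SQL Server), or "double quotes" (standard SQL).
// Function name is illustrative, not ChartDB's real API.
function unquoteIdentifier(raw: string): string {
    const trimmed = raw.trim();
    if (
        (trimmed.startsWith('`') && trimmed.endsWith('`')) ||
        (trimmed.startsWith('[') && trimmed.endsWith(']')) ||
        (trimmed.startsWith('"') && trimmed.endsWith('"'))
    ) {
        return trimmed.slice(1, -1);
    }
    return trimmed;
}
```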

SQL Server DDL Import

T-SQL syntax with SQL Server-specific features:
CREATE TABLE [dbo].[Users] (
    [Id] INT IDENTITY(1,1) PRIMARY KEY,
    [Email] NVARCHAR(255) NOT NULL,
    [CreatedAt] DATETIME2 DEFAULT GETDATE(),
    [Status] VARCHAR(20) CHECK ([Status] IN ('Active', 'Inactive')),
    CONSTRAINT [UQ_Email] UNIQUE ([Email])
);

CREATE NONCLUSTERED INDEX [IX_Users_Email]
ON [dbo].[Users] ([Email]);

T-SQL Features

  • IDENTITY(seed, increment) columns
  • NVARCHAR and NCHAR Unicode types
  • Bracketed identifiers [table_name]
  • GETDATE(), NEWID() functions
  • DATETIME2, DATETIMEOFFSET types

Oracle DDL Import

Oracle PL/SQL DDL statements:
CREATE TABLE employees (
    employee_id NUMBER PRIMARY KEY,
    email VARCHAR2(255) NOT NULL,
    hire_date DATE DEFAULT SYSDATE,
    salary NUMBER(10,2) CHECK (salary > 0),
    department_id NUMBER,
    CONSTRAINT fk_dept FOREIGN KEY (department_id) 
        REFERENCES departments(department_id)
);

CREATE SEQUENCE emp_seq START WITH 1000 INCREMENT BY 1;

CREATE INDEX idx_emp_email ON employees(email);

Oracle-Specific Features

  • VARCHAR2 and NUMBER types
  • SEQUENCES for auto-increment
  • SYSDATE and other Oracle functions
  • PL/SQL block syntax handling

Preprocessing and Cleanup

The DDL importer includes sophisticated preprocessing:
1. Comment Removal

Multi-line /* */ and single-line -- comments are stripped:
// Remove multi-line comments
cleanedSQL = cleanedSQL.replace(/\/\*[\s\S]*?\*\//g, '');

// Remove single-line comments (simplified; the real preprocessor
// protects -- sequences that appear inside string literals)
cleanedSQL = cleanedSQL.replace(/--[^\n]*/g, '');
2. Statement Splitting

SQL is split into individual statements by semicolons, accounting for:
  • Quoted strings
  • Dollar-quoted function bodies ($$ … $$, $BODY$ … $BODY$)
  • Nested blocks
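The splitting logic above can be sketched as a small scanner. This simplified version handles only single-quoted strings and untagged $$ dollar quotes; a real implementation must also cover double quotes, backticks, tagged dollar quotes, and nested blocks:

```typescript
// Split SQL into statements at semicolons, skipping semicolons that
// appear inside '...' string literals or $$...$$ dollar-quoted bodies.
// Simplified sketch; not ChartDB's actual implementation.
function splitStatements(sql: string): string[] {
    const statements: string[] = [];
    let current = '';
    let inString = false; // inside a '...' literal
    let inDollar = false; // inside a $$...$$ body
    for (let i = 0; i < sql.length; i++) {
        const ch = sql[i];
        if (!inString && ch === '$' && sql[i + 1] === '$') {
            inDollar = !inDollar;
            current += '$$';
            i++; // consume both dollar signs
            continue;
        }
        if (!inDollar && ch === "'") {
            inString = !inString;
        }
        if (ch === ';' && !inString && !inDollar) {
            if (current.trim()) statements.push(current.trim());
            current = '';
        } else {
            current += ch;
        }
    }
    if (current.trim()) statements.push(current.trim());
    return statements;
}
```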
3. Statement Categorization

Each statement is categorized:
type StatementType = 
    | 'table'
    | 'view'
    | 'index'
    | 'alter'
    | 'function'
    | 'type'
    | 'comment';
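A keyword-based dispatcher over this union might look as follows. This is a simplified sketch; ChartDB's actual categorization logic may differ:

```typescript
type StatementType =
    | 'table'
    | 'view'
    | 'index'
    | 'alter'
    | 'function'
    | 'type'
    | 'comment';

// Categorize a statement by its leading keywords (simplified sketch).
function categorizeStatement(sql: string): StatementType | 'unknown' {
    const s = sql.trim().toUpperCase();
    if (s.startsWith('CREATE TABLE')) return 'table';
    if (s.startsWith('CREATE VIEW') || s.startsWith('CREATE MATERIALIZED VIEW')) return 'view';
    if (/^CREATE\s+(UNIQUE\s+)?(NONCLUSTERED\s+)?INDEX/.test(s)) return 'index';
    if (s.startsWith('ALTER TABLE')) return 'alter';
    if (s.startsWith('CREATE FUNCTION') || s.startsWith('CREATE OR REPLACE FUNCTION')) return 'function';
    if (s.startsWith('CREATE TYPE')) return 'type';
    if (s.startsWith('COMMENT ON')) return 'comment';
    return 'unknown';
}
```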
4. Parser Selection

The appropriate dialect parser is selected based on database type.

Foreign Key Parsing

Foreign keys are extracted from multiple sources: inline REFERENCES clauses on a column, table-level FOREIGN KEY constraints, and ALTER TABLE ... ADD CONSTRAINT statements. The inline form looks like this:
CREATE TABLE posts (
    id SERIAL PRIMARY KEY,
    user_id INT REFERENCES users(id)
);
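Extracting the inline form can be sketched with a regex over the CREATE TABLE body. A real parser works on the SQL AST rather than regexes, and the interface and function names here are illustrative assumptions:

```typescript
// Sketch: find inline `col TYPE REFERENCES table(col)` foreign keys.
// Illustrative only; a production parser uses the SQL AST instead.
interface InlineForeignKey {
    column: string;
    referencedTable: string;
    referencedColumn: string;
}

function findInlineForeignKeys(createTableBody: string): InlineForeignKey[] {
    // column name, type (with optional size), then REFERENCES table(col)
    const re = /(\w+)\s+\w+(?:\(\d+(?:,\d+)?\))?[^,]*?\bREFERENCES\s+(\w+)\s*\((\w+)\)/gi;
    const fks: InlineForeignKey[] = [];
    let m: RegExpExecArray | null;
    while ((m = re.exec(createTableBody)) !== null) {
        // Skip table-level `FOREIGN KEY (...) REFERENCES ...` constraints,
        // which this inline-only sketch would otherwise misread.
        if (m[1].toUpperCase() === 'FOREIGN') continue;
        fks.push({ column: m[1], referencedTable: m[2], referencedColumn: m[3] });
    }
    return fks;
}
```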

View Support

Views are imported as special tables:
CREATE VIEW active_users AS
SELECT id, email, created_at
FROM users
WHERE status = 'active';

-- Materialized views (PostgreSQL)
CREATE MATERIALIZED VIEW user_stats AS
SELECT 
    u.id,
    COUNT(o.id) as order_count,
    SUM(o.total) as total_spent
FROM users u
LEFT JOIN orders o ON o.user_id = u.id
GROUP BY u.id;
Views are marked with isView: true and positioned separately on the canvas.

Complex Import Example

-- Create enum type
CREATE TYPE user_role AS ENUM ('admin', 'user', 'guest');

-- Create main tables
CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    email VARCHAR(255) NOT NULL UNIQUE,
    role user_role DEFAULT 'user',
    tags TEXT[],
    settings JSONB,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT valid_email CHECK (email ~* '^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Z|a-z]{2,}$')
);

CREATE TABLE posts (
    id SERIAL PRIMARY KEY,
    user_id INT NOT NULL,
    title VARCHAR(255) NOT NULL,
    content TEXT,
    published BOOLEAN DEFAULT false,
    view_count INT DEFAULT 0 CHECK (view_count >= 0),
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);

CREATE TABLE comments (
    id SERIAL PRIMARY KEY,
    post_id INT NOT NULL,
    user_id INT NOT NULL,
    content TEXT NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    FOREIGN KEY (post_id) REFERENCES posts(id) ON DELETE CASCADE,
    FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);

-- Create indexes
CREATE INDEX idx_posts_user_id ON posts(user_id);
CREATE INDEX idx_posts_published ON posts(published) WHERE published = true;
CREATE INDEX idx_comments_post ON comments(post_id);
CREATE INDEX idx_users_tags ON users USING GIN(tags);

-- Create view
CREATE VIEW recent_posts AS
SELECT p.id, p.title, u.email as author_email, p.created_at
FROM posts p
JOIN users u ON u.id = p.user_id
WHERE p.created_at > NOW() - INTERVAL '7 days';

Error Handling and Validation

The DDL parser includes robust error handling:
  • Unsupported statements (triggers, functions, policies) are skipped with warnings and do not break the import process.
  • If a statement fails to parse, the error is logged and the remaining statements continue processing.
  • Unknown data types are preserved as-is and can be edited in the diagram editor.
  • Check constraint expressions are validated before import:
const validationResult = validateCheckConstraintWithDetails(expression);
if (!validationResult.isValid) {
    throw new DBMLValidationError(validationResult.error);
}
Foreign key cycles are detected and handled gracefully.
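The skip-and-continue strategy described above can be sketched as a per-statement try/catch. The parseStatement callback and the result shape are assumptions for illustration:

```typescript
// Sketch of "skip and continue" error handling: each statement is parsed
// independently so one failure doesn't abort the whole import.
// The result shape and callback are illustrative assumptions.
interface ImportResult {
    tables: unknown[];
    warnings: string[];
}

function importDDL(
    statements: string[],
    parseStatement: (sql: string) => unknown
): ImportResult {
    const result: ImportResult = { tables: [], warnings: [] };
    for (const sql of statements) {
        try {
            result.tables.push(parseStatement(sql));
        } catch (err) {
            // Log the failure and keep going; other statements still import.
            result.warnings.push(
                `Skipped statement: ${err instanceof Error ? err.message : String(err)}`
            );
        }
    }
    return result;
}
```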

Best Practices

For best import results:
  1. Include schema definitions (CREATE SCHEMA) before table definitions
  2. Define referenced tables before tables with foreign keys
  3. Use consistent quoting (all quoted or all unquoted identifiers)
  4. Include constraint names for better diagram labels
  5. Add SQL comments for table and column descriptions

Import Performance

The DDL parser is optimized for large schemas:
Schema Size    | Parse Time | Notes
< 10 tables    | < 100ms    | Instant
10-50 tables   | 100-500ms  | Fast
50-200 tables  | 0.5-2s     | Good
200+ tables    | 2-5s       | Acceptable

Troubleshooting

  • Ensure referenced tables are defined before the referencing table, or add foreign keys with ALTER TABLE statements after all tables are created.
  • Make sure CREATE INDEX statements come after the table definitions.
  • Define custom types (CREATE TYPE) before they are used in table definitions.
  • Check for syntax errors in your DDL; try importing smaller sections to isolate problematic statements.

Next Steps

  • DBML Import: learn about the DBML format
  • Export Options: export your diagram to different formats