Quantum Minds Table Operators
Introduction
Table operators in Quantum Minds provide powerful capabilities for manipulating, analyzing, and visualizing tabular data. These operators enable you to transform raw data into actionable insights through summarization, visualization, and advanced analysis.
Available Table Operators
| Operator | Description | Common Use Cases |
|---|---|---|
| TableToTextSummary | Generates text summaries from tabular data | Data reports, insights generation, executive summaries |
| TableToGraph | Creates visual chart representations | Data visualization, trend analysis, pattern identification |
| TableToGraphV2 | Enhanced visualization with improved formatting | Advanced charts, custom visualizations |
| TableToGraphV3 | Advanced visualization with interactive elements | Interactive dashboards, complex data stories |
| PandasAi | AI-powered dataframe analysis | Data exploration, pattern discovery, quick insights |
| PandasAiDf | Dataframe-to-dataframe transformations | Data preparation, feature engineering, transformation |
| SyntheticTableGenerator | Creates synthetic tabular data | Testing, demos, data augmentation |
| TableMerge | Combines multiple tables | Data integration, relationship analysis |
| TableDeduplicate | Removes duplicate entries | Data cleaning, entity resolution |
| TableFilter | Filters rows based on conditions | Data subsetting, focused analysis |
| TableGroupBy | Groups and aggregates data | Summary statistics, dimensional analysis |
| TableValidate | Validates data against rules | Data quality, compliance checking |
| TableSort | Sorts table data | Ranking, ordered analysis |
| TableDateFormat | Standardizes date formats | Data normalization, time-based analysis |
| TableHash | Creates hashes from data | Anonymization, identity protection |
| TableRowProcessor | Processes individual rows with LLMs | Record enrichment, classification, entity extraction |
| TableMerger | Alternative implementation for combining tables | Complex data integration, multi-source analysis |
| MDToTable | Converts markdown tables to dataframes | Document data extraction, structural parsing |
| TextToGraph | Generates graphs from text descriptions | Quick visualizations, concept illustrations |
TableToTextSummary
The TableToTextSummary operator analyzes tabular data and generates natural language summaries highlighting key insights.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| prompt | string | Yes | Instructions for the type of summary needed |
| dataframe | object | Yes | Tabular data to summarize |
| trigger | string | No | Optional control signal |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Text summary with insights |
Example Usage
Prompt: "Summarize the key trends in monthly sales by product category"
Dataframe: [Sales data with product categories and monthly figures]
Output:
"Analysis of monthly sales by product category reveals several key trends:
1. Electronics showed the strongest overall performance with 18% YoY growth
2. Apparel experienced seasonal fluctuations with peaks in December and July
3. Home goods demonstrated consistent growth throughout Q3
4. Beauty products declined gradually until September before rebounding in Q4
The most significant correlation appears between electronics and home goods sales, suggesting complementary purchasing patterns."
Best Practices
- Provide specific instructions about what aspects to focus on
- Include context about the data's purpose or business relevance
- Request specific types of insights (trends, outliers, comparisons)
- Consider the audience when specifying the technical depth
TableToGraph Series (V1, V2, V3)
The TableToGraph operators convert tabular data into visual representations through various chart types.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| prompt | string | Yes | Instructions for visualization type and focus |
| dataframe | string | Yes | Data to visualize |
| trigger | string | No | Optional control signal |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (recharts) |
| content | lgraph | Visualization data structure |
Versions Comparison
| Feature | TableToGraph | TableToGraphV2 | TableToGraphV3 |
|---|---|---|---|
| Chart types | Basic (bar, line, pie) | Extended (scatter, area, radar) | Comprehensive (heat maps, treemaps, network) |
| Customization | Limited | Moderate | Extensive |
| Interactivity | Static | Basic tooltips | Full interactive elements |
| Multi-series | Basic support | Enhanced support | Advanced multi-dimensional |
| Annotations | None | Basic | Rich annotations |
| Responsive | Limited | Improved | Fully responsive |
Example Usage
Prompt: "Create a stacked bar chart showing quarterly revenue by product category, with a trend line showing overall growth"
Dataframe: [Quarterly sales data by product category]
Output: [Interactive visualization with stacked bars and trend line]
Best Practices
- Be specific about chart type and data dimensions
- Mention color schemes if important
- Specify axes labels and titles
- Request annotations for important data points
- Consider using V3 for dashboards and external presentations
PandasAi and PandasAiDf
These operators leverage AI to analyze and transform dataframes with Pandas-like operations.
PandasAi
Generates Python code for data analysis and returns both analysis and transformed data.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| prompt | string | Yes | Instructions for data analysis |
| dataframe | string | Yes | Data to analyze |
| trigger | string | No | Optional control signal |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Analysis results and explanations |
PandasAiDf
Focuses on dataframe-to-dataframe transformations for data preparation workflows.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| prompt | string | Yes | Instructions for data transformation |
| dataframe | string | No | Primary data to transform |
| dataset | string | No | Alternative data source |
| trigger | string | No | Optional control signal |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Transformed dataframe |
Example Usage
# PandasAi
Prompt: "Analyze the relationship between customer age and purchase amount, identify any correlations and outliers"
Dataframe: [Customer purchase data]
Output:
- Statistical analysis of age vs. purchase relationship
- Correlation coefficients
- Outlier identification
- Visualizations of the relationship
- Insights and recommendations
# PandasAiDf
Prompt: "Create a pivot table showing average purchase amount by age group (18-25, 26-35, 36-45, 46+) and product category"
Dataframe: [Customer purchase data]
Output: Transformed dataframe with the pivot table structure
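For reference, a rough pandas equivalent of the PandasAiDf pivot example above is sketched below. The column names (`age`, `purchase_amount`, `product_category`) and sample values are illustrative assumptions, not part of the operator's contract.

```python
import pandas as pd

# Illustrative data; real input would come from the upstream operator
df = pd.DataFrame({
    "age": [22, 31, 44, 52, 29, 61],
    "purchase_amount": [120.0, 80.5, 300.0, 45.0, 210.0, 95.0],
    "product_category": ["Electronics", "Apparel", "Home", "Beauty", "Electronics", "Home"],
})

# Bucket ages into the groups named in the prompt (18-25, 26-35, 36-45, 46+)
df["age_group"] = pd.cut(
    df["age"],
    bins=[17, 25, 35, 45, 200],
    labels=["18-25", "26-35", "36-45", "46+"],
)

# Pivot: average purchase amount by age group and product category
pivot = df.pivot_table(
    values="purchase_amount",
    index="age_group",
    columns="product_category",
    aggfunc="mean",
)
print(pivot)
```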
Best Practices
- Be specific about the analysis or transformation needed
- For complex operations, break down into sequential steps
- Include details about grouping, filtering, or aggregation methods
- Mention expected output format or structure
- Use PandasAi for exploratory analysis and PandasAiDf for data pipeline transformations
SyntheticTableGenerator
Creates synthetic tabular data based on specifications, useful for testing and demonstrations.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| prompt | string | Yes | Description of the table structure and data characteristics |
| rows | string | Yes | Number of rows to generate |
| trigger | string | No | Optional control signal |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format |
| content | markdown | Generated synthetic data |
Example Usage
Prompt: "Generate synthetic sales data with columns for customer_id, purchase_date, product_category (Electronics, Apparel, Home, Beauty), quantity, and price. Include seasonal patterns with higher sales in December."
Rows: "100"
Output: Synthetic dataset matching the specifications with realistic patterns
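Outside the platform, a comparable dataset could be sketched with numpy and pandas as below. The column names and the December weighting follow the example prompt; the distributions and value ranges are arbitrary assumptions.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)
n_rows = 100

# Weight December days more heavily to mimic the seasonal peak in the prompt
days = pd.date_range("2024-01-01", "2024-12-31", freq="D")
weights = np.where(days.month == 12, 3.0, 1.0)
weights /= weights.sum()

df = pd.DataFrame({
    "customer_id": rng.integers(1000, 2000, n_rows),
    "purchase_date": rng.choice(days, size=n_rows, p=weights),
    "product_category": rng.choice(["Electronics", "Apparel", "Home", "Beauty"], n_rows),
    "quantity": rng.integers(1, 6, n_rows),
    "price": rng.uniform(5, 500, n_rows).round(2),
})
print(df.head())
```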
Best Practices
- Describe the desired schema with column names and types
- Specify any relationships or patterns to include
- Mention value ranges and distributions when important
- Include business rules or constraints the data should follow
- Consider realistic data characteristics for testing scenarios
Data Cleaning Operators
Several Table operators focus on data cleaning operations:
TableDeduplicate
Removes duplicate entries from tables based on specified columns.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| dataframe | string | Yes | Data to deduplicate |
| columns | string | Yes | Columns to use for identifying duplicates |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Deduplicated dataframe |
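Conceptually this maps onto pandas `drop_duplicates`. A minimal sketch, with made-up sample columns and the assumption that the first occurrence is kept:

```python
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3],
    "email": ["a@x.com", "a@x.com", "b@x.com", "c@x.com", "c@x.com"],
    "amount": [10, 10, 25, 40, 40],
})

# Rows repeating the same values in the key columns are dropped,
# keeping the first occurrence (an assumption about the operator's default)
deduped = df.drop_duplicates(subset=["customer_id", "email"], keep="first")
print(deduped)
```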
TableFilter
Filters rows in a table based on specified conditions.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| dataframe | string | Yes | Data to filter |
| condition | string | Yes | Filter condition (e.g., "age > 30 and region == 'North'") |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Filtered dataframe |
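The condition string maps naturally onto pandas query syntax. A minimal sketch with assumed columns:

```python
import pandas as pd

df = pd.DataFrame({
    "name": ["Ann", "Bo", "Cal", "Dee"],
    "age": [25, 41, 33, 52],
    "region": ["North", "North", "South", "North"],
})

# The example condition from the inputs table, applied via DataFrame.query
filtered = df.query("age > 30 and region == 'North'")
print(filtered)
```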
TableGroupBy
Groups table data and performs aggregations.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| dataframe | string | Yes | Data to group |
| groupby | string | Yes | Columns to group by |
| aggregations | string | Yes | Aggregation operations (e.g., "mean", "sum", "count") |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Grouped and aggregated dataframe |
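A rough pandas equivalent of grouping and aggregating, with illustrative columns and aggregation names:

```python
import pandas as pd

df = pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "category": ["A", "B", "A", "B"],
    "revenue": [100, 150, 80, 120],
})

# Group by the chosen columns and apply the requested aggregations
grouped = df.groupby(["region", "category"], as_index=False).agg(
    total_revenue=("revenue", "sum"),
    avg_revenue=("revenue", "mean"),
    row_count=("revenue", "count"),
)
print(grouped)
```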
TableValidate
Validates table data against specified rules.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| dataframe | string | Yes | Data to validate |
| not_null_columns | string | No | Columns that should not contain null values |
| regex_columns | string | No | Columns to validate with regex patterns |
| regex_pattern | string | No | Regex patterns for validation |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Validation results |
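A plain-pandas sketch of the two documented checks (not-null columns and regex columns); the email pattern and sample data are assumptions:

```python
import pandas as pd

df = pd.DataFrame({
    "email": ["a@x.com", None, "not-an-email"],
    "amount": [10.0, 25.0, None],
})

not_null_columns = ["email", "amount"]
regex_columns = ["email"]
regex_pattern = r"^[^@\s]+@[^@\s]+\.[^@\s]+$"

results = {}
for col in not_null_columns:
    results[f"{col}_null_count"] = int(df[col].isna().sum())
for col in regex_columns:
    # Count non-null values that fail the pattern
    results[f"{col}_regex_failures"] = int((~df[col].dropna().str.match(regex_pattern)).sum())
print(results)
```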
TableSort
Sorts table data based on specified columns.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| dataframe | string | Yes | Data to sort |
| columns | string | Yes | Columns to sort by |
| ascending | boolean | Yes | Sort direction (true for ascending, false for descending) |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Sorted dataframe |
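This maps directly onto pandas `sort_values`; a brief sketch with assumed data:

```python
import pandas as pd

df = pd.DataFrame({"product": ["A", "B", "C"], "sales": [300, 120, 450]})

# ascending=False mirrors the boolean input for a descending (ranked) sort
ranked = df.sort_values(by=["sales"], ascending=False)
print(ranked)
```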
TableDateFormat
Standardizes date formats in a table.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| dataframe | string | Yes | Data containing dates |
| column | string | Yes | Column containing dates to format |
| output_format | string | Yes | Target date format |
| custom_format | string | No | Custom format string if needed |
| timezone | string | No | Timezone for conversion |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Dataframe with standardized dates |
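In pandas terms the operation is roughly parse, convert timezone, reformat; the column name, timezone, and target format below are illustrative only:

```python
import pandas as pd

df = pd.DataFrame({"order_ts": ["2024-03-14 08:30", "2024-04-01 17:45"]})

# Parse, attach a timezone, convert it, then render a single target format
parsed = (
    pd.to_datetime(df["order_ts"])
    .dt.tz_localize("UTC")
    .dt.tz_convert("US/Eastern")
)
df["order_ts"] = parsed.dt.strftime("%Y-%m-%d %H:%M %Z")
print(df)
```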
TableHash
Creates hash values from table content, useful for anonymization.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| dataframe | string | Yes | Data to hash |
| columns | string | Yes | Columns to apply hashing to |
| algorithm | string | Yes | Hashing algorithm (e.g., "md5", "sha256") |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Dataframe with hashed values |
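A minimal sketch of column hashing with Python's hashlib, assuming SHA-256 and an unsalted hash (a real anonymization setup would normally add a salt):

```python
import hashlib
import pandas as pd

df = pd.DataFrame({"email": ["a@x.com", "b@x.com"], "amount": [10, 25]})

def sha256_of(value: object) -> str:
    # Hash the string form of the value; add a salt in real deployments
    return hashlib.sha256(str(value).encode("utf-8")).hexdigest()

# Replace the sensitive column in place with its hash
df["email"] = df["email"].map(sha256_of)
print(df)
```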
Advanced Table Operations
TableRowProcessor
Processes individual rows from a dataframe using LLMs for enrichment or transformation.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| prompt | string | Yes | Instructions for row processing |
| dataframe | string | Yes | Data containing rows to process |
| trigger | string | No | Optional control signal |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Processed data |
Example Usage
Prompt: "For each customer record, analyze the purchase history and generate a personalized product recommendation with reasoning"
Dataframe: [Customer purchase history data]
Output: Customer data enriched with personalized recommendations and reasoning
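Conceptually the operator makes one model call per row and appends the result. The sketch below uses a stubbed `call_llm` function and made-up columns, since the platform's internal model call is not documented here.

```python
import pandas as pd

def call_llm(prompt: str) -> str:
    # Stub standing in for the platform's internal model call;
    # swap in a real client to run this end to end.
    return "Recommended: wireless earbuds (frequent electronics purchases)"

df = pd.DataFrame({
    "customer_id": [1, 2],
    "purchase_history": ["laptop, phone case", "yoga mat, water bottle"],
})

instruction = (
    "Analyze the purchase history and generate a personalized "
    "product recommendation with reasoning."
)

# One call per row, written back as a new enrichment column
df["recommendation"] = df["purchase_history"].map(
    lambda history: call_llm(f"{instruction}\n\nPurchase history: {history}")
)
print(df)
```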
TableMerger / TableMerge
Both operators combine multiple tables but with different implementations and capabilities.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| dataframe | string | Yes | Primary dataframe |
| other_dataframe | string | Yes | Secondary dataframe to merge with |
| how | string | Yes | Join type (inner, outer, left, right) |
| on | string | Yes | Join key(s) |
| concat | string | Yes | Whether to use concatenation instead of join |
| axis | string | Yes | Axis for concatenation (0 for rows, 1 for columns) |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format |
| content | string | Merged dataframe |
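The inputs map onto the usual pandas merge/concat choices. A sketch with assumed sample tables; the `concat` flag switches between a keyed join and simple concatenation:

```python
import pandas as pd

customers = pd.DataFrame({"customer_id": [1, 2, 3], "name": ["Ann", "Bo", "Cal"]})
orders = pd.DataFrame({"customer_id": [1, 1, 3], "amount": [20, 35, 50]})

use_concat = False  # mirrors the `concat` input
if use_concat:
    merged = pd.concat([customers, orders], axis=0)  # axis=1 would stack columns side by side
else:
    merged = customers.merge(orders, how="left", on="customer_id")
print(merged)
```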
MDToTable
Converts markdown tables to structured dataframes.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| markdown | string | Yes | Markdown text containing tables |
| trigger | string | No | Optional control signal |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (markdown) |
| content | string | Structured dataframe |
Example Usage
Markdown: "
| Name | Age | City |
|------|-----|------|
| John | 32 | New York |
| Alice | 28 | Boston |
| Bob | 45 | Chicago |
"
Output: Structured dataframe with proper columns and data types
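A small sketch of the parsing idea, splitting the markdown rows on pipes and skipping the separator line; a production implementation would also need to infer data types and handle edge cases:

```python
import pandas as pd

markdown = """\
| Name | Age | City |
|------|-----|------|
| John | 32 | New York |
| Alice | 28 | Boston |
| Bob | 45 | Chicago |
"""

rows = [
    [cell.strip() for cell in line.strip().strip("|").split("|")]
    for line in markdown.strip().splitlines()
    if not set(line) <= set("|- ")  # skip the header separator row
]
df = pd.DataFrame(rows[1:], columns=rows[0]).astype({"Age": int})
print(df)
```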
TextToGraph
Creates graphs directly from text descriptions without requiring a dataframe.
Inputs
| Parameter | Type | Required | Description |
|---|---|---|---|
| prompt | string | Yes | Description of the graph to create |
| trigger | string | No | Optional control signal |
Outputs
| Parameter | Type | Description |
|---|---|---|
| type | string | Output format (lgraph) |
| content | string | Graph visualization |
Example Usage
Prompt: "Create a pie chart showing market share distribution: Google 35%, Apple 25%, Microsoft 20%, Amazon 15%, Others 5%"
Output: Pie chart visualization based on the described data
Common Table Operation Patterns
Data Cleaning Pipeline
SQLExecution → TableDeduplicate → TableFilter → TableDateFormat → PandasAiDf
Analysis and Visualization
PandasAi → [TableToTextSummary, TableToGraphV3]
Data Transformation
ExcelToTable → TableRowProcessor → TableGroupBy → TableSort → CreateDataset
Report Generation
PandasAiDf → TableToTextSummary → TextToGraph → CardGenerator
Best Practices for Table Operators
Data Quality Considerations
- Always validate input data before complex processing
- Handle missing values appropriately (remove, impute, or flag)
- Check for data type consistency
- Be aware of outliers that might skew analysis
Performance Optimization
- Filter early to reduce data size
- Use appropriate indexing when available
- Consider chunking for very large datasets
- Optimize memory usage by dropping unnecessary columns
Effective Visualization
- Choose appropriate chart types for your data
- Limit the number of dimensions in a single visualization
- Use clear labels and titles
- Consider color choice for accessibility
Creating Reusable Patterns
- Group common table operations into reusable mind fragments
- Document data transformation steps
- Create template patterns for common analysis tasks
- Use variables to make table operations configurable
Integration with Other Operator Categories
Combining with SQL Operators
Create complete data pipelines from database to analysis:
TextToSQL → SQLExecution → TableDeduplicate → PandasAi → TableToGraph
Combining with Document Operators
Extract and process tabular data from documents:
RAGSummarize → MDToTable → TableToTextSummary → CardGenerator
Combining with Flow Operators
Create conditional data processing workflows:
TableValidate → Flow.Condition → [TableFilter, TableTransform] → TableToGraph
Combining with Code Operators
Implement custom logic for complex data manipulation:
PandasAiDf → Code.Python.Execute → TableToGraph → CardGenerator
Real-World Examples
Customer Segmentation Analysis
```
// Extract customer data
{
  "operator": "TextToSQL",
  "input": {
    "prompt": "Get all customer transactions with demographic information for the past year",
    "dataset": "customer_database",
    "trigger": "data_extraction"
  }
}
↓
{
  "operator": "SQLExecution",
  "input": {
    "sql": "$TextToSQL_001.output.content",
    "dataset": "customer_database",
    "trigger": "query_execution"
  }
}
↓
// Clean and prepare data
{
  "operator": "TableFilter",
  "input": {
    "dataframe": "$SQLExecution_001.output.content",
    "condition": "transaction_amount > 0 and customer_id is not null",
    "trigger": "data_filtering"
  }
}
↓
{
  "operator": "PandasAiDf",
  "input": {
    "prompt": "Segment customers based on purchase frequency, recency, and monetary value (RFM analysis)",
    "dataframe": "$TableFilter_001.output.content",
    "trigger": "customer_segmentation"
  }
}
↓
// Analyze and visualize segments
{
  "operator": "TableToGraph",
  "input": {
    "prompt": "Create a scatter plot of customer segments showing monetary value vs frequency, with color representing recency and bubble size representing customer count",
    "dataframe": "$PandasAiDf_001.output.content",
    "trigger": "visualization_creation"
  }
}
↓
{
  "operator": "TableToTextSummary",
  "input": {
    "prompt": "Provide a detailed analysis of each customer segment including key characteristics, purchase patterns, and recommendations for targeted marketing",
    "dataframe": "$PandasAiDf_001.output.content",
    "trigger": "segment_analysis"
  }
}
```
Sales Forecasting Dashboard
```
// Get historical sales data
{
  "operator": "TextToSQL",
  "input": {
    "prompt": "Get monthly sales by product category for the past 3 years",
    "dataset": "sales_database",
    "trigger": "historical_data"
  }
}
↓
{
  "operator": "SQLExecution",
  "input": {
    "sql": "$TextToSQL_001.output.content",
    "dataset": "sales_database",
    "trigger": "data_retrieval"
  }
}
↓
// Prepare data for forecasting
{
  "operator": "TableDateFormat",
  "input": {
    "dataframe": "$SQLExecution_001.output.content",
    "column": "sales_month",
    "output_format": "yyyy-MM-dd",
    "trigger": "date_standardization"
  }
}
↓
// Forecast future sales
{
  "operator": "PandasAi",
  "input": {
    "prompt": "Forecast sales for each product category for the next 6 months using time series forecasting. Account for seasonality and trends. Include prediction intervals.",
    "dataframe": "$TableDateFormat_001.output.content",
    "trigger": "forecasting"
  }
}
↓
// Create forecast visualization
{
  "operator": "TableToGraphV3",
  "input": {
    "prompt": "Create a line chart showing historical sales and forecasted sales by category. Include prediction intervals as shaded areas. Use different colors for each category. Add a vertical line to separate historical from forecast data.",
    "dataframe": "$PandasAi_001.output.content",
    "trigger": "visualization"
  }
}
↓
// Generate insights summary
{
  "operator": "TableToTextSummary",
  "input": {
    "prompt": "Analyze the sales forecast and provide key insights, growth opportunities, and potential risks. Highlight categories with the strongest growth potential and those requiring attention.",
    "dataframe": "$PandasAi_001.output.content",
    "trigger": "insight_generation"
  }
}
```
Next Steps
Explore how Table Operators can be combined with Document Operators for comprehensive data analysis that includes unstructured content.
Overview | Operator Categories | SQL Operators | Document Operators