Natural Language Queries

OstrichDB’s Natural Language Processing (NLP) feature allows you to interact with your database using plain English commands. This powerful AI-driven interface makes database operations accessible to users of all technical levels.

The NLP feature transforms database operations into conversational interactions:

  • Plain English Commands: No need to learn complex query syntax
  • AI Understanding: Advanced language processing interprets your intent
  • Safe Operations: Review and confirm before execution
  • Batch Operations: Perform multiple operations in a single request
  • Learning System: Improves understanding over time

Availability by subscription tier:

Free Tier:

  • ❌ Not available

Pro Tier ($25/month):

  • ✅ 50 requests per day
  • ✅ Basic NLP operations
  • ✅ Standard response time

Enterprise Tier:

  • ✅ Unlimited requests
  • ✅ Advanced NLP operations
  • ✅ Priority processing
  • ✅ Custom training options
To access the NLP interface:

  1. Navigate to any project in your dashboard
  2. Click “Natural Language Query Processor” in the sidebar, or use the direct NLP link in project navigation
  3. Select your target project if prompted

The NLP interface includes:

  • Query Input Box: Where you type your natural language request
  • Context Display: Shows current project/collection context
  • AI Response Area: Displays the AI’s interpretation of your request
  • Confirmation Panel: Review and approve operations before execution
  • History Panel: View previous NLP interactions
  • Usage Counter: Track your daily request limit (Pro tier)

NLP queries can be conversational and flexible:

Simple Operations:

  • “Create a new collection called ‘user-data’”
  • “Delete the ‘temp’ cluster”
  • “Show me all records in the users collection”
  • “Find records where status is active”

Complex Operations:

  • “Create a user profile collection with name, email, and registration date fields”
  • “Delete all records in the temporary cluster that are older than 30 days”
  • “Show me users created this month sorted by registration date”

Creating Data:

"Create a new collection called 'products'"
"Add a cluster named 'electronics' to the products collection"
"Create a record for a laptop with price 999.99"
"Make a new project for my blog application"

Reading Data:

"Show me all collections in this project"
"List records in the users cluster"
"Find all products with price greater than 100"
"Get the total count of active users"

Updating Data:

"Update the price of the laptop to 899.99"
"Change the status of user john_doe to inactive"
"Rename the 'temp' collection to 'archived'"
"Move all draft posts to published status"

Deleting Data:

"Delete the old_data collection"
"Remove all records from the temp cluster"
"Delete users who haven't logged in for 6 months"
"Clear the cache cluster"

Batch Operations:

"Create collections for users, products, and orders"
"Delete all temporary data from last month"
"Update all draft posts to published and set today's date"
"Create user records for Alice, Bob, and Charlie with default settings"

Query Processing Workflow:

  1. Enter Your Request: Type your command in natural language
  2. AI Processing: The system interprets your intent
  3. Review Interpretation: Check the AI’s understanding
  4. Modify if Needed: Clarify or correct the interpretation
  5. Confirm Execution: Click “Confirm & Execute” to proceed
  6. View Results: See the outcome of your operation
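The confirm-before-execute flow above can be sketched as a small function. This is an illustrative sketch only: `interpret`, `confirm`, and `execute` are hypothetical callables standing in for the AI processing, the confirmation panel, and the database operation; none of these names are part of OstrichDB's actual API.

```python
# Hedged sketch of the confirm-before-execute workflow described above.
# interpret/confirm/execute are hypothetical stand-ins, not OstrichDB API calls.

def run_nlp_request(query, interpret, confirm, execute):
    """Interpret a natural-language query, surface the plan for review,
    and only execute it after explicit confirmation."""
    plan = interpret(query)      # step 2: AI processing
    if not confirm(plan):        # steps 3-5: review, modify, or cancel
        return "cancelled"
    return execute(plan)         # step 6: view results
```

The key design point is that nothing touches the database until `confirm` returns true, which is what makes destructive natural-language commands safe to offer.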

User Input:

"Create a new collection for storing customer information with fields for name, email, phone, and registration date"

AI Interpretation:

I understand you want to:
1. Create a new collection named "customer_information"
2. Create a cluster within this collection (default name: "main")
3. Create the following record templates:
   - name (type: STRING)
   - email (type: STRING)
   - phone (type: STRING)
   - registration_date (type: DATE)
Is this correct?

User Confirmation:

✅ Confirm & Execute
❌ Cancel
🔄 Modify Request
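Behind a confirmation panel like the one above, the AI's interpretation is typically a structured plan rather than free text. The sketch below shows one plausible representation of the example interpretation; the field names are illustrative and are not OstrichDB's actual wire format.

```python
# Hypothetical structured form of the AI interpretation shown above.
# Field names ("operations", "record_templates", etc.) are illustrative only.
interpretation = {
    "operations": [
        {"action": "create_collection", "name": "customer_information"},
        {"action": "create_cluster",
         "collection": "customer_information", "name": "main"},
    ],
    "record_templates": [
        {"name": "name", "type": "STRING"},
        {"name": "email", "type": "STRING"},
        {"name": "phone", "type": "STRING"},
        {"name": "registration_date", "type": "DATE"},
    ],
}
```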

The NLP system maintains context throughout your session:

  • Current Project: Remembers which project you’re working in
  • Recent Operations: References previous operations in the conversation
  • Data Relationships: Understands connections between your data structures
  • User Preferences: Learns your naming conventions and patterns
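A minimal sketch of the session context described above might look like the class below. This is not OstrichDB's implementation; it only illustrates the idea of remembering the current project and recent operations so that follow-up phrases like "it" can be resolved.

```python
# Illustrative sketch (not OstrichDB's implementation) of per-session
# context: current project, recent operations, learned naming patterns.
from collections import deque

class SessionContext:
    def __init__(self, history_size: int = 10):
        self.current_project = None                           # active project
        self.recent_operations = deque(maxlen=history_size)   # conversation memory
        self.naming_patterns = {}                             # learned conventions

    def record(self, operation: str) -> None:
        """Remember an operation so later requests can reference it."""
        self.recent_operations.append(operation)

    def resolve(self, phrase: str) -> str:
        """Resolve vague references ("it", "that") to the last operation."""
        if phrase in ("it", "that") and self.recent_operations:
            return self.recent_operations[-1]
        return phrase
```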

The AI provides intelligent suggestions:

  • Data Type Recommendations: Suggests appropriate types for your data
  • Naming Conventions: Follows patterns from your existing data
  • Relationship Detection: Identifies potential data relationships
  • Optimization Hints: Suggests performance improvements

When the AI is uncertain:

  • Clarification Questions: Asks for more specific information
  • Multiple Options: Presents different interpretations for you to choose
  • Safe Defaults: Uses conservative defaults when unsure
  • Rollback Options: Offers to undo operations if results aren’t expected

Be Specific:

  • ❌ “Delete some old data”
  • ✅ “Delete records from the temp cluster older than 30 days”

Use Clear Names:

  • ❌ “Create a thing for users”
  • ✅ “Create a collection called ‘user_profiles’”

Specify Data Types:

  • ❌ “Add a field for age”
  • ✅ “Add an age field as an integer”

Include Context:

  • ❌ “Update the status”
  • ✅ “Update the status field in the users collection to ‘active’”

Review Before Execution:

  • Always check the AI’s interpretation
  • Verify the scope of operations (especially deletions)
  • Confirm data types and field names
  • Check for unintended side effects

Start Small:

  • Test operations on small datasets first
  • Use specific rather than broad commands
  • Create backups before major changes
  • Practice with non-critical data

Monitor Results:

  • Check operation outcomes immediately
  • Verify data integrity after changes
  • Monitor performance impact of bulk operations
  • Keep track of your daily usage limits

Common Issues:

“I don’t understand that request”

  • Solution: Be more specific and use clearer language
  • Example: Instead of “fix the data,” try “update all null values in the name field to ‘Unknown’”

“Ambiguous operation detected”

  • Solution: Provide more context about which data you’re referring to
  • Example: Specify collection and cluster names explicitly

“Operation would affect too much data”

  • Solution: Break down large operations into smaller chunks
  • Example: Instead of “delete old data,” specify “delete records from July 2023 in the logs cluster”
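Breaking a large operation into chunks usually means splitting on a natural boundary such as a date range. The sketch below generates month-sized windows you could feed to separate, smaller delete requests; the deletion call itself is left out because it depends on your client.

```python
# Hedged sketch: split one large date-range operation into month-sized
# chunks. Only the chunking logic is shown; issue one smaller NLP or
# manual-query request per (start, end) window.
from datetime import date, timedelta

def month_chunks(start: date, end: date):
    """Yield (first_day, last_day) pairs covering [start, end] month by month."""
    cur = start
    while cur <= end:
        # First day of the following month, via a 32-day jump.
        nxt = (cur.replace(day=1) + timedelta(days=32)).replace(day=1)
        yield cur, min(nxt - timedelta(days=1), end)
        cur = nxt
```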

“Daily limit reached”

  • Solution: Wait for the daily reset or upgrade to Enterprise
  • Alternative: Use the manual query editor for additional operations

Learn from Examples:

  • Review successful queries in your history
  • Note which phrasings work best for your use cases
  • Build a personal library of effective commands

Use Consistent Terminology:

  • Stick to the same names for similar concepts
  • Use your established naming conventions
  • Be consistent with data type references

Provide Feedback:

  • Confirm when the AI gets it right
  • Correct misunderstandings clearly
  • Help improve the system’s understanding of your needs

Current Limitations:

Complex Queries:

  • Very complex multi-step operations may need to be broken down
  • Advanced filtering requires specific syntax
  • Some edge cases may need manual query writing

Data Analysis:

  • Statistical operations are limited
  • Complex aggregations may not be supported
  • Advanced reporting requires API or manual queries

Bulk Operations:

  • Large datasets may have processing time limits
  • Memory constraints for very large operations
  • Some bulk operations may need to be batched

Consider the manual query editor for:

  • Performance-critical operations
  • Complex conditional logic
  • Advanced filtering and sorting
  • Bulk operations on large datasets
  • Operations requiring precise control

Pro Tier Monitoring:

  • Daily Counter: Shows how many of your 50 daily requests you’ve used
  • Reset Time: Displays when your limit resets (midnight UTC)
  • Usage History: Track patterns in your NLP usage
  • Efficiency Metrics: See average operations per request
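The Pro-tier counter behavior above (a fixed daily limit that resets at midnight UTC) can be sketched as follows. This is a client-side illustration of the rollover logic, not how OstrichDB tracks usage server-side; the limit of 50 comes from the tier table earlier in this page.

```python
# Minimal sketch of a daily request counter that resets at midnight UTC.
# Illustrative only; OstrichDB enforces the real limit server-side.
from datetime import datetime, timezone

class DailyCounter:
    def __init__(self, limit: int = 50):
        self.limit = limit
        self.count = 0
        self.day = None          # UTC date the current count applies to

    def try_request(self, now: datetime) -> bool:
        """Return True and count the request, or False if over the limit."""
        today = now.astimezone(timezone.utc).date()
        if today != self.day:    # midnight UTC rollover: reset the counter
            self.day, self.count = today, 0
        if self.count >= self.limit:
            return False         # corresponds to "Daily limit reached"
        self.count += 1
        return True
```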

Enterprise Tier Monitoring:

  • Unlimited Usage: No daily limits
  • Performance Metrics: Response times and processing statistics
  • Usage Analytics: Detailed reports on NLP utilization
  • Custom Limits: Set organizational limits if desired

To get the most out of NLP:

  1. Practice with Examples: Try the sample queries provided
  2. Explore Your Data: Use NLP to understand your existing data structure
  3. Combine with Manual Queries: Use both interfaces for different needs
  4. Learn from History: Review your successful queries for patterns
  5. Consider Upgrading: Get unlimited access with Enterprise tier

For more advanced database operations: