Natural Language Queries
OstrichDB’s Natural Language Processing (NLP) feature allows you to interact with your database using plain English commands. This powerful AI-driven interface makes database operations accessible to users of all technical levels.
Overview
What is NLP in OstrichDB?
The NLP feature transforms database operations into conversational interactions:
- Plain English Commands: No need to learn complex query syntax
- AI Understanding: Advanced language processing interprets your intent
- Safe Operations: Review and confirm before execution
- Batch Operations: Perform multiple operations in a single request
- Learning System: Improves understanding over time
Availability by Plan
Free Tier:
- ❌ Not available
Pro Tier ($25/month):
- ✅ 50 requests per day
- ✅ Basic NLP operations
- ✅ Standard response time
Enterprise Tier:
- ✅ Unlimited requests
- ✅ Advanced NLP operations
- ✅ Priority processing
- ✅ Custom training options
Accessing NLP
Getting to the NLP Interface
- Navigate to any project in your dashboard
- Click “Natural Language Query Processor” in the sidebar
- Or use the direct NLP link in project navigation
- Select your target project if prompted
Interface Layout
The NLP interface includes:
- Query Input Box: Where you type your natural language request
- Context Display: Shows current project/collection context
- AI Response Area: Displays the AI’s interpretation of your request
- Confirmation Panel: Review and approve operations before execution
- History Panel: View previous NLP interactions
- Usage Counter: Track your daily request limit (Pro tier)
Using Natural Language Queries
Basic Query Structure
NLP queries can be conversational and flexible:
Simple Operations:
- “Create a new collection called ‘user-data’”
- “Delete the ‘temp’ cluster”
- “Show me all records in the users collection”
- “Find records where status is active”
Complex Operations:
- “Create a user profile collection with name, email, and registration date fields”
- “Delete all records in the temporary cluster that are older than 30 days”
- “Show me users created this month sorted by registration date”
Supported Operations
Data Creation
"Create a new collection called 'products'"
"Add a cluster named 'electronics' to the products collection"
"Create a record for a laptop with price 999.99"
"Make a new project for my blog application"
Data Retrieval
"Show me all collections in this project"
"List records in the users cluster"
"Find all products with price greater than 100"
"Get the total count of active users"
Data Modification
"Update the price of the laptop to 899.99"
"Change the status of user john_doe to inactive"
"Rename the 'temp' collection to 'archived'"
"Move all draft posts to published status"
Data Deletion
"Delete the old_data collection"
"Remove all records from the temp cluster"
"Delete users who haven't logged in for 6 months"
"Clear the cache cluster"
Batch Operations
"Create collections for users, products, and orders"
"Delete all temporary data from last month"
"Update all draft posts to published and set today's date"
"Create user records for Alice, Bob, and Charlie with default settings"
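If you later drive the same feature programmatically (see the API Reference linked at the end of this page), a request might look roughly like the sketch below. The endpoint URL, payload shape, and OSTRICH_API_KEY placeholder are assumptions made purely for illustration, not OstrichDB’s documented API.

```typescript
// Illustrative only: the endpoint path, payload shape, and API-key handling are
// assumptions for this sketch, not OstrichDB's documented interface.
const OSTRICH_API_KEY = "<your-api-key>"; // placeholder

async function submitNlpRequest(projectId: string, query: string): Promise<unknown> {
  const response = await fetch(
    `https://api.example.com/projects/${projectId}/nlp/query`, // hypothetical URL
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${OSTRICH_API_KEY}`,
      },
      // The natural-language phrase is sent as-is; interpretation happens server-side.
      body: JSON.stringify({ query }),
    }
  );
  if (!response.ok) {
    throw new Error(`NLP request failed: ${response.status}`);
  }
  return response.json(); // The interpretation to review before confirming execution
}

// Example: one batch-style phrase covering three collections.
submitNlpRequest("my-project", "Create collections for users, products, and orders")
  .then(console.log)
  .catch(console.error);
```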
NLP Workflow
Step-by-Step Process
- Enter Your Request: Type your command in natural language
- AI Processing: The system interprets your intent
- Review Interpretation: Check the AI’s understanding
- Modify if Needed: Clarify or correct the interpretation
- Confirm Execution: Click “Confirm & Execute” to proceed
- View Results: See the outcome of your operation
Example Workflow
User Input:
"Create a new collection for storing customer information with fields for name, email, phone, and registration date"
AI Interpretation:
I understand you want to:
1. Create a new collection named "customer_information"
2. Create a cluster within this collection (default name: "main")
3. Create the following record templates:
   - name (type: STRING)
   - email (type: STRING)
   - phone (type: STRING)
   - registration_date (type: DATE)
Is this correct?
User Confirmation:
✅ Confirm & Execute
❌ Cancel
🔄 Modify Request
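The confirm-before-execute pattern above can be summarized in code. The TypeScript sketch below is purely illustrative: the Interpretation and PlannedOperation types and the function names are hypothetical, but they mirror the review, confirm, and execute steps described in this workflow.

```typescript
// Hypothetical sketch of the review-then-execute pattern the NLP workflow follows.
// The types and function names are illustrative, not OstrichDB's actual client API.

interface PlannedOperation {
  action: "create" | "update" | "delete" | "fetch";
  target: string; // e.g. "collection customer_information"
  details?: Record<string, string>;
}

interface Interpretation {
  summary: string;
  operations: PlannedOperation[];
}

// Steps 2-3: the AI's interpretation is shown to the user before anything runs.
function reviewInterpretation(interp: Interpretation): void {
  console.log(interp.summary);
  interp.operations.forEach((op, i) => console.log(`${i + 1}. ${op.action} ${op.target}`));
}

// Step 5: only a confirmed plan is executed; a cancelled plan changes nothing.
function confirmAndExecute(interp: Interpretation, confirmed: boolean): void {
  if (!confirmed) {
    console.log("Cancelled - no operations were run.");
    return;
  }
  for (const op of interp.operations) {
    // In the real dashboard this is where OstrichDB applies the operation.
    console.log(`Executing: ${op.action} ${op.target}`);
  }
}

// Example mirroring the customer_information request above.
const plan: Interpretation = {
  summary: "Create collection 'customer_information' with a 'main' cluster and four record templates.",
  operations: [
    { action: "create", target: "collection customer_information" },
    { action: "create", target: "cluster main" },
    {
      action: "create",
      target: "record templates",
      details: { name: "STRING", email: "STRING", phone: "STRING", registration_date: "DATE" },
    },
  ],
};

reviewInterpretation(plan);
confirmAndExecute(plan, true);
```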
Advanced NLP Features
Context Awareness
The NLP system maintains context throughout your session:
- Current Project: Remembers which project you’re working in
- Recent Operations: References previous operations in the conversation
- Data Relationships: Understands connections between your data structures
- User Preferences: Learns your naming conventions and patterns
Smart Suggestions
The AI provides intelligent suggestions:
- Data Type Recommendations: Suggests appropriate types for your data
- Naming Conventions: Follows patterns from your existing data
- Relationship Detection: Identifies potential data relationships
- Optimization Hints: Suggests performance improvements
Error Handling and Clarification
When the AI is uncertain:
- Clarification Questions: Asks for more specific information
- Multiple Options: Presents different interpretations for you to choose
- Safe Defaults: Uses conservative defaults when unsure
- Rollback Options: Offers to undo operations if the results aren’t what you expected
Best Practices
Writing Effective Queries
Be Specific:
- ❌ “Delete some old data”
- ✅ “Delete records from the temp cluster older than 30 days”
Use Clear Names:
- ❌ “Create a thing for users”
- ✅ “Create a collection called ‘user_profiles’”
Specify Data Types:
- ❌ “Add a field for age”
- ✅ “Add an age field as an integer”
Include Context:
- ❌ “Update the status”
- ✅ “Update the status field in the users collection to ‘active’”
Safety Practices
Review Before Execution:
- Always check the AI’s interpretation
- Verify the scope of operations (especially deletions)
- Confirm data types and field names
- Check for unintended side effects
Start Small:
- Test operations on small datasets first
- Use specific rather than broad commands
- Create backups before major changes
- Practice with non-critical data
Monitor Results:
- Check operation outcomes immediately
- Verify data integrity after changes
- Monitor performance impact of bulk operations
- Keep track of your daily usage limits
Troubleshooting NLP
Common Issues
“I don’t understand that request”
- Solution: Be more specific and use clearer language
- Example: Instead of “fix the data,” try “update all null values in the name field to ‘Unknown’”
“Ambiguous operation detected”
- Solution: Provide more context about which data you’re referring to
- Example: Specify collection and cluster names explicitly
“Operation would affect too much data”
- Solution: Break down large operations into smaller chunks
- Example: Instead of “delete old data,” specify “delete records from July 2023 in the logs cluster”
“Daily limit reached”
- Solution: Wait for the daily reset or upgrade to Enterprise
- Alternative: Use the manual query editor for additional operations
Getting Better Results
Learn from Examples:
- Review successful queries in your history
- Note which phrasings work best for your use cases
- Build a personal library of effective commands
Use Consistent Terminology:
- Stick to the same names for similar concepts
- Use your established naming conventions
- Be consistent with data type references
Provide Feedback:
- Confirm when the AI gets it right
- Correct misunderstandings clearly
- Help improve the system’s understanding of your needs
NLP Limitations
Current Limitations
Complex Queries:
- Very complex multi-step operations may need to be broken down
- Advanced filtering requires specific syntax
- Some edge cases may need manual query writing
Data Analysis:
- Statistical operations are limited
- Complex aggregations may not be supported
- Advanced reporting requires API or manual queries
Bulk Operations:
- Large datasets may have processing time limits
- Memory constraints for very large operations
- Some bulk operations may need to be batched
When to Use Manual Queries
Consider the manual query editor for:
- Performance-critical operations
- Complex conditional logic
- Advanced filtering and sorting
- Bulk operations on large datasets
- Operations requiring precise control
Usage Monitoring
Tracking Your Usage
Pro Tier Monitoring:
- Daily Counter: Shows requests used out of your 50-request daily limit
- Reset Time: Displays when your limit resets (midnight UTC)
- Usage History: Track patterns in your NLP usage
- Efficiency Metrics: See average operations per request
Enterprise Tier Monitoring:
- Unlimited Usage: No daily limits
- Performance Metrics: Response times and processing statistics
- Usage Analytics: Detailed reports on NLP utilization
- Custom Limits: Set organizational limits if desired
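As a rough aid for Pro-tier planning, the sketch below computes remaining requests against the 50-per-day limit and the time until the midnight UTC reset described above. The helper names are hypothetical; only the limit and reset time come from this page.

```typescript
// Hypothetical client-side helper for tracking Pro-tier NLP usage.
// The 50-request limit and midnight-UTC reset come from the plan description above;
// the function and variable names are illustrative, not part of OstrichDB's API.

const PRO_DAILY_LIMIT = 50;

// Requests remaining today, given how many you have already used.
function remainingRequests(usedToday: number): number {
  return Math.max(PRO_DAILY_LIMIT - usedToday, 0);
}

// Milliseconds until the daily counter resets at midnight UTC.
function msUntilReset(now: Date = new Date()): number {
  const nextMidnightUtc = Date.UTC(
    now.getUTCFullYear(),
    now.getUTCMonth(),
    now.getUTCDate() + 1 // Date.UTC normalizes the rollover into the next month/year
  );
  return nextMidnightUtc - now.getTime();
}

// Example: 37 requests used so far today.
console.log(`Remaining: ${remainingRequests(37)}`);
console.log(`Reset in ~${Math.round(msUntilReset() / 3_600_000)} hours`);
```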
Next Steps
To get the most out of NLP:
- Practice with Examples: Try the sample queries provided
- Explore Your Data: Use NLP to understand your existing data structure
- Combine with Manual Queries: Use both interfaces for different needs
- Learn from History: Review your successful queries for patterns
- Consider Upgrading: Get unlimited access with Enterprise tier
For more advanced database operations:
- Manual Query Editor - Direct command interface
- Cluster Editor - Visual data management
- API Reference - Programmatic access