The Problem
On Monday you tested the 3 prompts in ChatGPT and saw how the chain works: image analysis → feature extraction → description generation. But here's the reality: you can't ask your agents to copy-paste 200 times per day. One agent spending 3 hours manually creating listings costs $90/day in labor (at $30/hr). Across a busy brokerage handling 50 listings/day, you're looking at $135,000/year just on listing admin, plus the inconsistent copy that hurts SEO and the delayed time-to-market that costs you leads.
See It Work
Watch the 3 prompts chain together automatically. This is what you'll build.
The Code
Four levels: start simple, add reliability, then scale to production. Pick where you are.
When to Level Up
Level 1: Simple API Calls (0-100 properties/day)
- Sequential processing
- Basic error handling
- Manual MLS updates
- Single-threaded execution
Level 2: With Retries & Integration (100-1,000 properties/day)
- Retry logic with exponential backoff
- MLS API integration
- Automated publishing
- Comprehensive logging
- Error tracking
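The retry logic at this level can be sketched as a small helper. This is a minimal illustration, not the course's implementation; the attempt count and delays are assumptions you should tune to your MLS's rate limits.

```python
import random
import time

def with_retries(fn, max_attempts: int = 5, base_delay: float = 1.0):
    """Call fn(), retrying on failure with exponential backoff plus jitter."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error to the caller
            # Wait base, 2x base, 4x base, ... with jitter proportional to
            # the base delay so parallel workers don't retry in lockstep
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

Wrap each external call, e.g. `with_retries(lambda: publish_to_mls(listing))`, so transient API failures don't drop a listing.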
Level 3: Queue & Batch Processing (1,000-5,000 properties/day)
- Redis queue management
- Batch processing (50-100 per batch)
- Background workers
- Rate limiting
- Monitoring dashboard
- Auto-scaling workers
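The batching step at this level can be sketched as follows, assuming jobs arrive from the queue as dicts. The 50-per-batch default comes from the range above; everything else here is illustrative.

```python
from typing import Dict, Iterable, Iterator, List

def batches(jobs: Iterable[Dict], size: int = 50) -> Iterator[List[Dict]]:
    """Group queued listing jobs into fixed-size batches for a worker."""
    batch: List[Dict] = []
    for job in jobs:
        batch.append(job)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch
```

A background worker pulls jobs off Redis, feeds them through `batches()`, and processes one batch per API round-trip, which is where most of the throughput gain at this tier comes from.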
Level 4: Multi-Agent System (5,000+ properties/day)
- Distributed agent architecture
- Load balancing across regions
- Multi-model routing (GPT-4 + Claude)
- Real-time monitoring
- Auto-failover
- Custom model fine-tuning
- Dedicated infrastructure
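Multi-model routing can start as a simple policy function. Only the two model names come from the list above; the thresholds below are invented for illustration.

```python
from typing import Dict

def route_model(listing: Dict) -> str:
    """Pick a model per listing job (hypothetical routing policy)."""
    # Assumption: expensive or photo-heavy listings justify the costlier
    # model; everything else goes to the cheaper/faster one
    if listing.get('price', 0) > 2_000_000 or len(listing.get('photos', [])) > 30:
        return 'gpt-4'
    return 'claude'
```

In production this policy usually also considers current model latency and error rates, which is what the real-time monitoring and auto-failover items above feed into.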
Real Estate Gotchas
Industry-specific challenges you'll hit in production. Here's how to handle them.
MLS Compliance & Data Standards
Build a normalization layer that maps your internal format to each MLS's requirements. Use RESO Data Dictionary as baseline.
# MLS normalization layer
from typing import Dict

class MLSAdapter:
    def __init__(self, mls_type: str):
        self.mls_type = mls_type
        # Internal field -> MLS-specific field (RESO Data Dictionary baseline)
        self.field_mappings = self.load_field_mappings()

    def normalize_listing(self, internal_data: Dict) -> Dict:
        """Convert internal format to MLS-specific format"""
        return {mls_field: internal_data.get(our_field)
                for our_field, mls_field in self.field_mappings.items()}

Image Processing & Storage
Compress images on upload, use a CDN for delivery, and cache AI analysis results. Don't re-analyze the same image twice.
# Image processing pipeline
import hashlib
import io
from PIL import Image

class ImageProcessor:
    def __init__(self, cache_backend, cdn_url):
        self.cache = cache_backend
        self.cdn_url = cdn_url

    def cache_key(self, image_bytes: bytes) -> str:
        # Hash the raw bytes so an identical image is never analyzed twice
        return hashlib.sha256(image_bytes).hexdigest()

Duplicate Listing Detection
Use address + city + zip as the primary key. Check for existing listings before creating a new one, and handle price changes as updates rather than new records.
# Duplicate detection
from typing import Dict, Optional
import Levenshtein

class DuplicateDetector:
    def __init__(self, db_connection):
        self.db = db_connection

    async def find_duplicate(self, new_listing: Dict) -> Optional[str]:
        # Look up by address + city + zip first; fall back to fuzzy
        # address matching (Levenshtein distance) for near-miss entries
        ...

Market-Specific Terminology
Maintain region-specific terminology dictionaries. Include market context in prompts. Use local examples for few-shot learning.
# Region-specific terminology
class RegionalTerminology:
    TERMS = {
        'US': {
            'property_types': ['Single Family', 'Condo', 'Townhouse', 'Co-op'],
            'features': ['Hardwood floors', 'Central AC', 'Finished basement'],
            'rooms': ['Master bedroom', 'Family room', 'Mud room']
        },
        # Additional regions follow the same shape
    }

Fair Housing Compliance
Post-process all descriptions through compliance filter. Flag and remove prohibited terms. Log all modifications for audit trail.
# Fair Housing compliance filter
import re
from typing import List, Tuple

class FairHousingFilter:
    # Prohibited terms under the Fair Housing Act
    PROHIBITED_TERMS = [
        # Family status implications
        ...
    ]

Adjust Your Numbers
Plug in your own listing volume and labor rates to compare the manual process against the AI-automated workflow and see what you save.
2026 Randeep Bhatia. All Rights Reserved.
No part of this content may be reproduced, distributed, or transmitted in any form without prior written permission.