CoreLogic's Hedonic Index: What Agents Should Understand About Property Data
“CoreLogic says the property is worth…” It’s a phrase heard in vendor conversations daily. But few agents understand how CoreLogic actually calculates property values or what their data represents.
That understanding matters. When you know how the numbers are generated, you can explain them to vendors more effectively—and identify when they’re likely wrong.
How the Hedonic Index Works
CoreLogic’s headline property price index uses a methodology called hedonic regression. Without diving into statistical complexity, here’s the practical explanation:
Traditional price indices track identical items over time, such as the price of a fixed basket of groceries. But properties aren’t identical: each transaction involves a unique asset.
Hedonic indices solve this by decomposing property prices into constituent characteristics: location, bedrooms, bathrooms, land size, property type, and dozens of other factors. The model calculates how much each characteristic contributes to price.
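For agents who want to see the mechanics, here is a minimal sketch of the idea in Python. Everything in it is illustrative: the sale prices, the handful of characteristics, and the use of an ordinary least squares regression on log price are simplifying assumptions, and CoreLogic’s production model works with far more variables and vastly more transactions.

```python
# Illustrative sketch of a hedonic regression; not CoreLogic's actual model.
# All figures and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy sample of sales: each row is one transaction and its characteristics.
sales = pd.DataFrame({
    "price":     [910_000, 1_240_000, 780_000, 1_050_000, 1_430_000, 690_000, 980_000, 1_120_000],
    "bedrooms":  [3, 4, 2, 3, 4, 2, 3, 4],
    "bathrooms": [1, 2, 1, 2, 3, 1, 2, 2],
    "land_sqm":  [450, 600, 300, 520, 700, 250, 480, 550],
    "is_house":  [1, 1, 0, 1, 1, 0, 1, 1],   # 1 = house, 0 = unit
})

# Regress log price on the characteristics. Each coefficient estimates how much
# that characteristic contributes to price, holding the others constant.
X = sm.add_constant(sales[["bedrooms", "bathrooms", "land_sqm", "is_house"]])
model = sm.OLS(np.log(sales["price"]), X).fit()
print(model.params)  # implied contribution of each characteristic (log scale)
```

The fitted coefficients are, loosely, the implied price contribution of each characteristic, which is what lets the index compare unlike properties on a like-for-like basis.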
This allows the index to track “constant quality” price movements. If properties this quarter are selling for more than last quarter, the index distinguishes between:
- Price increases for equivalent properties (genuine appreciation)
- Changes in what’s selling (composition effects)
A quarter where only expensive properties sell might show raw price increases, but the hedonic index adjusts for that compositional shift.
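A hypothetical arithmetic example makes the distinction concrete. The numbers below are invented purely for illustration; the point is that a raw median can jump because a different mix of stock sold, even when equivalent properties changed hands for equivalent prices.

```python
# Hypothetical composition effect: the same property types fetch the same prices
# in both quarters, but the mix of what sold shifts toward expensive houses.
import statistics

q1_sales = [650_000, 660_000, 670_000, 1_200_000]      # mostly units sold
q2_sales = [660_000, 1_190_000, 1_200_000, 1_210_000]  # mostly houses sold

print(statistics.median(q1_sales))  # 665,000
print(statistics.median(q2_sales))  # 1,195,000 -- the raw median nearly doubles

# A constant-quality (hedonic) index compares like with like, so it would show
# roughly flat prices here despite the jump in the raw median.
```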
What This Means for Agents
Understanding hedonic indices affects how you interpret and communicate data:
Market Commentary
When CoreLogic reports “Sydney prices rose 1.2% this month,” they’re reporting hedonic-adjusted figures. This is more meaningful than raw median prices, which fluctuate with sales mix.
But the index is still an estimate. It assumes the model correctly captures all relevant price factors. When those assumptions don’t hold, the index may mislead.
Individual Property Valuations
CoreLogic’s automated valuation models (AVMs) apply the hedonic model to specific properties. They estimate what a property with those characteristics should sell for, based on recent comparable sales.
This works well for standard properties in high-transaction markets. It works poorly for:
- Unusual properties (characteristics not well captured)
- Low-transaction areas (insufficient data for accurate estimation)
- Properties with unmeasured features (renovations, views, condition)
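In the same illustrative spirit, the sketch below shows how an automated estimate for a single property might be produced from a toy hedonic model, and why it is really a range rather than a point figure. The comparable sales, the two characteristics used, and the 80% prediction interval are assumptions made for the example, not CoreLogic’s actual AVM.

```python
# Illustrative only: scoring one property with a toy hedonic model and
# reporting a range. All figures are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy comparable sales.
comps = pd.DataFrame({
    "price":    [910_000, 1_240_000, 780_000, 1_050_000, 1_430_000, 690_000, 980_000, 1_120_000],
    "bedrooms": [3, 4, 2, 3, 4, 2, 3, 4],
    "land_sqm": [450, 600, 300, 520, 700, 250, 480, 550],
})

# Fit a simple hedonic model on log price.
X = sm.add_constant(comps[["bedrooms", "land_sqm"]])
model = sm.OLS(np.log(comps["price"]), X).fit()

# Score the subject property (a hypothetical 4-bedroom home on 580 sqm) and
# report an 80% prediction interval rather than a single number.
subject = pd.DataFrame({"const": [1.0], "bedrooms": [4], "land_sqm": [580]})
pred = model.get_prediction(subject).summary_frame(alpha=0.20)
low, mid, high = np.exp(pred[["obs_ci_lower", "mean", "obs_ci_upper"]].iloc[0])
print(f"estimate ${mid:,.0f}, likely range ${low:,.0f} to ${high:,.0f}")
```

The width of that range is the part vendors rarely see: the headline figure is simply the midpoint of an interval, and the interval widens wherever comparable sales are scarce or the property is unusual.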
When vendors arrive with CoreLogic estimates, knowing these limitations helps you contextualise the numbers.
Understanding Data Lag
CoreLogic data relies on settlement information, which lags contracts by weeks. Index updates reflect activity from 6-8 weeks prior. In fast-moving markets, current conditions may differ significantly from what the data shows.
Agents observing real-time auction results and contract activity often have more current market intelligence than published indices.
Practical Applications
Vendor Conversations
“CoreLogic values my property at $1.3 million” deserves a thoughtful response:
“CoreLogic estimates are useful starting points, but let me explain what they capture and what they miss.
The model looks at property characteristics—bedrooms, land size, location—and estimates based on similar properties that have sold. It’s quite accurate for typical properties in areas with lots of sales.
But your property has [specific features] that the model doesn’t fully account for. Your renovation quality, the outlook, the presentation—these affect value but aren’t in the algorithm.
Let me show you specific comparable sales and explain how your property compares to each. That gives us a more accurate range than the automated estimate alone.”
This response respects the data while establishing your expertise in interpreting it.
Market Analysis
When preparing market reports or listing presentations:
Use multiple sources: CoreLogic, PropTrack, and other providers use different methodologies. Triangulating across sources provides more robust insights than relying on any single dataset.
Check recent activity: Indices lag reality. Supplement index data with recent auction results and contract information you’ve observed firsthand.
Understand local variation: City-wide indices obscure suburb-level dynamics. Use granular data where available.
Acknowledge uncertainty: Data provides evidence, not certainty. Presenting ranges with confidence levels is more honest than false precision.
Days on Market Data
CoreLogic publishes days on market statistics that vendors and buyers often reference. Understanding the methodology matters:
- DOM calculations vary between providers
- Withdrawn and relisted properties complicate measurement
- Seasonal patterns affect comparability
- Property type and price point significantly influence typical DOM
Contextualise these statistics rather than accepting them uncritically; as the sketch below shows, the chosen definition alone can shift the figure substantially.
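Here is a hypothetical example of that effect: the same sale yields two very different days-on-market figures depending on how a withdrawn-and-relisted campaign is treated. The 60-day threshold for merging campaigns is an assumption made up for illustration, not any provider’s published rule.

```python
# Hypothetical illustration of why days-on-market figures differ by methodology.
from datetime import date

listings = [  # (listed, withdrawn-or-sold) dates for one property
    (date(2024, 2, 1), date(2024, 3, 10)),   # first campaign, withdrawn
    (date(2024, 4, 15), date(2024, 5, 20)),  # relisted, then sold
]

# Method A: count only the final campaign.
dom_final = (listings[-1][1] - listings[-1][0]).days

# Method B: merge campaigns separated by less than 60 days (an assumed
# threshold) and count from the original listing date to the eventual sale.
gap = (listings[1][0] - listings[0][1]).days
dom_merged = (listings[-1][1] - listings[0][0]).days if gap < 60 else dom_final

print(dom_final, dom_merged)  # 35 vs 109 days for the same sale
```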
Data Quality Issues
All data providers face quality challenges:
Coverage Gaps
Not all sales are captured. Off-market transactions, some auction results, and late-settling sales may be missing or slow to appear. Low-transaction markets carry higher uncertainty.
Characteristic Data
Property characteristics are sourced from council records, historical listings, and other databases. Errors and outdated information are common. A property listed as 3-bedroom might actually have 4 after renovation.
Time Lags
Settlement data flows through various channels before reaching data providers. The delay between contract and data availability limits how current the published figures can be.
Model Limitations
Hedonic models are statistical estimates with inherent uncertainty. Confidence intervals around any estimate are wider than headline numbers suggest.
These limitations don’t invalidate the data—it remains valuable. But treating it as gospel rather than useful-but-imperfect evidence leads to poor decisions.
Explaining Data to Clients
Sophisticated data explanation builds credibility:
Don’t dismiss: “Those numbers are wrong” sounds defensive. “Let me explain what those numbers represent and what they might miss” sounds expert.
Show your work: Walk vendors through comparable sales analysis. Demonstrate that your valuation has a foundation more robust than automated estimates.
Acknowledge when data agrees: If CoreLogic aligns with your assessment, say so. It’s not about being contrarian—it’s about being accurate.
Use data to support, not replace, judgment: “The data suggests a range of $1.25-1.35 million. Based on my inspection and market knowledge, I expect you to achieve the upper end because of [specific factors].”
The best agents are fluent in data without being slaves to it.
Looking Ahead
Property data continues evolving:
More sources: Buyer behaviour data, social signals, and alternative datasets will supplement traditional sources.
Faster updates: Real-time transaction data will reduce lag issues.
Better models: Machine learning approaches may capture patterns that traditional hedonic models miss.
Greater accessibility: Data that was previously professional-only is increasingly available to consumers, changing vendor conversations.
Agents who understand data foundations—not just surface numbers—will maintain advantage as information democratises.
Understanding how CoreLogic and other providers generate their numbers isn’t just technical knowledge. It’s the foundation for credibility in an increasingly data-driven industry.
Linda Powers consults with real estate agencies on market analysis and data utilisation. Her 25-year career has tracked the evolution from intuition-based to data-informed property markets.