Is the science different from the old routine – “support your opinion”?

It kinda sounds like the same thing.  Of course.  No real difference.  But is it possible there is a difference in the thinking process?  Would a scientist think about the problem differently?

First, let’s be clear on one thing.  We get paid for four things:

  • Identify the problem correctly;
  • Gather the data;
  • Analyze the data;
  • Communicate the results (or our opinion).

At first glance, it appears that the scientific method parallels the “valuation process.”

The appraisal process says (The Appraisal of Real Estate, 14th ed., p. 36):

  • Step 1: Identify the problem and determine the scope of work.
  • Step 2: Investigate trends (i.e., the market analysis).
  • Step 3: Study the property (the approaches to value).
  • Step 4: Integrate the above, reconciling to a point value or range.

The valuation science process is outlined similarly:

  • Step 1: Identify the problem, define the data frame and probable solution path.
  • Step 2: Enhance, clean and transform (wrangle) the data for analyses.
  • Step 3: Analyze to predict value(s), forecasts, and reliability/risk measures.
  • Step 4: Transmit results, reproducibility elements, and interface constructs.
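To make the four steps concrete, here is a minimal sketch of them as a pipeline of functions.  Python is used for illustration only, and every name and the toy sales data are hypothetical – this is what the shape of the process could look like, not anyone's actual model.

```python
# Hypothetical sketch of the four valuation-science steps as a pipeline.

def identify_problem(request):
    # Step 1: identify the problem, define the data frame and solution path.
    return {"segment": request["segment"], "method": "price_per_sqft"}

def wrangle(raw_sales, spec):
    # Step 2: clean and transform -- keep the segment, drop bad records.
    return [s for s in raw_sales
            if s["segment"] == spec["segment"] and s["price"] > 0 and s["sqft"] > 0]

def analyze(sales, subject_sqft):
    # Step 3: predict a value plus a simple reliability measure (a range).
    rates = [s["price"] / s["sqft"] for s in sales]
    mid = sum(rates) / len(rates)
    return {"value": round(mid * subject_sqft),
            "low": round(min(rates) * subject_sqft),
            "high": round(max(rates) * subject_sqft)}

def transmit(result):
    # Step 4: communicate the result together with its reliability range.
    return (f"Predicted value {result['value']:,} "
            f"(range {result['low']:,}-{result['high']:,})")

raw = [
    {"segment": "A", "price": 400_000, "sqft": 2_000},
    {"segment": "A", "price": 330_000, "sqft": 1_500},
    {"segment": "B", "price": 900_000, "sqft": 3_000},  # wrong segment
    {"segment": "A", "price": 0,       "sqft": 1_800},  # bad record
]

spec = identify_problem({"segment": "A"})
report = transmit(analyze(wrangle(raw, spec), subject_sqft=1_750))
print(report)
```

Note that each step feeds the next: the problem definition drives the wrangling, and the analysis carries a reliability measure all the way through to the communication step.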

So, what’s the difference?  The difference is – three out of four things have changed!

Most importantly, data science is specifically iterative.  Steps 1 and 2 recycle to identify the ideal data set.  Exploratory data analysis is a critical part of data science.  It is explicitly different from the traditional “Trust me, I know a good comp when I see it!”

In the past, we used the term “collect” the data.  When I became an appraiser, the main part of my job was “data collection” (market area, subject, and comps).  Once I had four or five comps (confirmed), I could take a deep breath, and turn my work over to the typist for a final edit.  Today, I plug in a few parameters:  the area, time back, and perhaps property size.  Three seconds later, it is on my screen, downloaded into my data templates, into R “tibbles”, ready for scrubbing and predictive analysis.
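The “plug in a few parameters” pull above can be sketched in a few lines.  This is a hypothetical, self-contained illustration (in Python rather than R, with made-up sales records and a made-up `pull_comps` helper) of filtering by area, time back, and size:

```python
# Hypothetical parameterized data pull: area, months back, and size band.
from datetime import date, timedelta

sales = [
    {"area": "92103", "sale_date": date(2024, 11, 2), "sqft": 1_600, "price": 820_000},
    {"area": "92103", "sale_date": date(2023, 1, 15), "sqft": 1_550, "price": 700_000},
    {"area": "92104", "sale_date": date(2024, 10, 9), "sqft": 1_580, "price": 640_000},
]

def pull_comps(data, area, months_back, min_sqft, max_sqft, today=date(2025, 1, 1)):
    """Return sales matching the area, date window, and size band."""
    cutoff = today - timedelta(days=30 * months_back)
    return [s for s in data
            if s["area"] == area
            and s["sale_date"] >= cutoff
            and min_sqft <= s["sqft"] <= max_sqft]

comps = pull_comps(sales, area="92103", months_back=12,
                   min_sqft=1_400, max_sqft=1_800)
print(len(comps))  # only the recent 92103 sale inside the size band survives
```

The same three parameters (area, time back, size) do in milliseconds what used to take days of fieldwork, leaving the data ready for scrubbing and analysis.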

Data was scarce, rough, and slow to collect.  Today, data is plentiful, usually complete, and instant.

In the vintage method, I anticipated which approaches to value to use.  Each approach gave different results, and the approaches were interrelated, which caused “inbreeding.”  (Each approach is built on comparison anyway.)  Reconciliation was based on subjective reasoning/weighting to reach a final opinion of a point value.  Evidence Based Valuation© fully integrates the income approach data and the cost approach data into the process.  “Reconciliation and weighting” are cohesively unified.  Whatever reconciliation does take place does so in a more detailed, precise, and measurable pattern – within the process, not separately.
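What could “measurable” weighting look like, as opposed to subjective weighting?  One standard statistical device – offered purely as an illustration, not as a description of Evidence Based Valuation© itself – is to weight each approach’s estimate by the inverse of its variance, so the estimate with the least scatter gets the most weight.  The numbers below are invented:

```python
# Inverse-variance weighting: a measurable alternative to subjective
# reconciliation weights.  Illustrative only; all figures are made up.

def inverse_variance_blend(estimates):
    """estimates: list of (value, std_error) pairs -> blended value."""
    weights = [1 / (se ** 2) for _, se in estimates]
    total = sum(weights)
    return sum(v * w for (v, _), w in zip(estimates, weights)) / total

approaches = [
    (500_000, 10_000),  # sales-comparison estimate, tightest scatter
    (520_000, 40_000),  # income estimate, looser
    (540_000, 80_000),  # cost estimate, loosest
]
blended = inverse_variance_blend(approaches)
print(round(blended))
```

The weights here are computed from the data’s own scatter, not chosen after the fact – which is the point: the “weighting” happens inside the process, measurably.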

In the past, the data (the comps) were selected first – then the appraiser did a market analysis.  In the “new valuation modeling paradigm,”¹ the CMS (Competitive Market Segment) ©, the market analysis comes first – then (and only then) the data evolves into useful information.

Finally, and in short – data science focuses the data.  If you get the data right, a good solution follows.  Remember George Dell’s Rule #1 (the Impossibility Theorem): “You can’t get objective output from subjective input!”

¹Marvin L. Wolverton, letter to the editor, The Appraisal Journal (Spring 2014): 175.