Evidence Based Valuation© (EBV) exemplifies the best of what has been called “appraisal modernization.”  Do current standards help our progress, or do they stand in the way of the best future?

Part I is an overview.  Part II is on education.  Standards intertwine with education.  Here in Part III we begin a multi-part look at USPAP, examining which elements promote optimal methods and which elements impede technology.

In starting to write this, I reviewed key points in the USPAP integrity standards and the performance standards.  Surprisingly, 18 distinct points aroused interest.  These comprise six areas:  scope of work, the goal of credibility, bias issues, competency, ethics, and clarity of reporting in the age of the data stream.  It appears Part III itself will span several installments.  Our intent is to clarify where standards help or obstruct the move to Evidence Based Valuation© (EBV).

As we start, recall that an adequate job is measured in three components:

  • Is the right question asked? (The hypothesis)
  • Does the result tend toward the right answer? (Trueness/accuracy)
  • Is the result as certain as possible? (Sureness/precision)
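The difference between the second and third components can be made concrete with a small simulation.  A minimal sketch, with fabricated numbers: estimator A tends toward the right answer but is not very sure, while estimator B is very sure but systematically low.

```python
import random

random.seed(42)
true_value = 400_000  # the "right answer": the expected selling price

# Two hypothetical estimators, each run on 1,000 simulated assignments:
# A tends toward the right answer but is not very sure (wide spread);
# B is very sure (tight spread) but systematically $30,000 low.
est_a = [true_value + random.gauss(0, 40_000) for _ in range(1000)]
est_b = [true_value - 30_000 + random.gauss(0, 4_000) for _ in range(1000)]

def trueness(errors):
    """Mean error: does the result tend toward the right answer?"""
    return sum(errors) / len(errors)

def precision(errors):
    """Spread of the errors: how sure (repeatable) is the result?"""
    m = trueness(errors)
    return (sum((e - m) ** 2 for e in errors) / len(errors)) ** 0.5

err_a = [e - true_value for e in est_a]
err_b = [e - true_value for e in est_b]
print(f"A: bias {trueness(err_a):+,.0f}  spread {precision(err_a):,.0f}")
print(f"B: bias {trueness(err_b):+,.0f}  spread {precision(err_b):,.0f}")
```

Neither estimator alone does an adequate job: A fails on sureness, B fails on trueness.  An adequate method must attend to both.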

SCOPE OF WORK

Appraisal ‘procedures’ and the USPAP ‘Rule’ define this well:

USPAP defines scope of work as “the type and extent of research and analysis…”.  (“DEFINITIONS”)

TARE (The Appraisal of Real Estate) separates the “identification of the assignment elements” from the determination of the scope of work.  The test of adequacy for scope of work is three-fold:  1) Does it lead to results that are worthy of belief?  2) Is it consistent with user expectations?  3) Is it consistent with peers’ actions?

The Data Science Approach follows a practice consistent with the philosophy of science, rather than a fixed procedure to be followed.  The analyst-scientist forms a hypothesis, based on abductive reasoning, which combines preliminary information with field-related prior knowledge: an educated guess as to the best way to proceed.  The context for the hypothesis comprises:

  • Preliminary research or consultation – to improve/refine the hypothesis;
  • The question(s) to be answered (value, risk, forecast, relevance);
  • Acceptable assumptions of fact and context;
  • Definition of variables at issue;
  • Data collection.

The statement (hypothesis) must be reproducible or testable.  This requires documentation, particularly for influential and outlier data points.  Valuation is substantially predictive in nature.  A value (expected selling price) is a prediction.  Predictive algorithms (adjustments) provide an estimate of a marginal or conditional change.  In traditional appraisal, this is called an opinion.  Other valuations (such as AVMs) provide an analytic result.
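The idea that an adjustment is an estimated marginal change can be illustrated with a one-variable regression.  A minimal Python sketch, with fabricated sale data: the slope is the predictive algorithm’s estimate of the marginal price change per square foot, and the fitted line yields a predicted (expected) selling price.

```python
# Fabricated sales, for illustration only: living area vs. sale price.
sqft  = [1400, 1600, 1850, 2100, 2400, 2750]
price = [310_000, 345_000, 388_000, 432_000, 489_000, 551_000]

n = len(sqft)
mx = sum(sqft) / n
my = sum(price) / n

# Ordinary least squares slope and intercept (single predictor).
# The slope is the estimated marginal change: the "adjustment"
# in dollars per extra square foot, conditional on this data.
slope = sum((x - mx) * (y - my) for x, y in zip(sqft, price)) / \
        sum((x - mx) ** 2 for x in sqft)
intercept = my - slope * mx

# The prediction: an expected selling price for a 2,000 sq ft subject.
prediction = intercept + slope * 2000
print(f"Adjustment: ${slope:,.0f}/sq ft; predicted price: ${prediction:,.0f}")
```

The same logic extends to multiple predictors; the point is that each coefficient is a testable, reproducible estimate rather than an unsupported opinion.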

The data science approach to valuation is:

  • Identify the Assignment Data Frame (ADF)©
  • Define the Competitive Market Segment (CMS)©
  • Enhance the CMS for bias-variance optimization (OIF)©
  • Apply one of the three fundamental predictive algorithms.
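The four steps above can be sketched as a pipeline.  This is a hypothetical illustration only: the function names, the size-based competition rule, and the “keep the k most similar sales” trimming are stand-ins, not a published EBV method.

```python
# A hypothetical sketch of the four steps; names and rules are illustrative.

def assignment_data_frame(raw_sales):
    """Step 1 (ADF): gather every sale potentially relevant to the assignment."""
    return [s for s in raw_sales if s.get("price") and s.get("sqft")]

def competitive_market_segment(adf, subject):
    """Step 2 (CMS): keep sales that compete with the subject (here, by size)."""
    lo, hi = subject["sqft"] * 0.8, subject["sqft"] * 1.2
    return [s for s in adf if lo <= s["sqft"] <= hi]

def optimize_segment(cms, subject, k=5):
    """Step 3 (OIF stand-in): trim toward a bias-variance balance by
    keeping the k sales most similar to the subject."""
    return sorted(cms, key=lambda s: abs(s["sqft"] - subject["sqft"]))[:k]

def predict(segment):
    """Step 4: apply a predictive algorithm (here, a simple segment mean)."""
    return sum(s["price"] for s in segment) / len(segment)

# Fabricated sales: 1,500-2,600 sq ft, price rising with size.
raw = [{"sqft": 1500 + 100 * i, "price": 280_000 + 35_000 * i} for i in range(12)]
subject = {"sqft": 2000}

adf = assignment_data_frame(raw)
cms = competitive_market_segment(adf, subject)
value = predict(optimize_segment(cms, subject))
print(f"Predicted selling price: ${value:,.0f}")
```

Each stage narrows the data deliberately and documentably, which is what makes the result reproducible and testable.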

Data Science hypotheses are broader and more comprehensive than vintage “scope of work.”