Does it matter, or can I just do better at the old way of doing things?

Big Data is a broad term, used and misused to describe many things.  One thing it clearly includes is the fact that we are being overwhelmed with data.  Much of that data is collected without any specific study or purpose in mind.  Yet we know that in science many discoveries were made ‘accidentally,’ or when a process, a piece of information, or a formula was applied to a different area.

It also clearly includes two things that are important to appraisers:  1) lots of data, often providing complete ‘populations’ of what is being researched; and 2) computation, working complex algorithms quickly and providing ever-improving visual displays.

What do these two technologies mean to appraisers?

Lots of data means we can now research and easily identify the complete CMS (competitive market segment): the segment which actually competed with the subject at the hypothetical sale date.  Having this complete (or substantially complete) data set means we can extract all the information possible out of that market.  If I use just three comps, or six comps, I am probably discarding information.  In the old days, this made sense.  To come to an “appraiser’s opinion,” it made sense to examine a handful of sales in depth, gaining ‘richness’ of information and a personal understanding of the market.  Individually verifying or confirming 10 or 20 sales was cost prohibitive, and still is in most cases.  Also, the human brain simply does not do well with more than six or seven pieces of information at a time.

Computation means that if there are more than, say, seven competitive sales, simple visual plots and simple descriptive parameters (“statistics”) can also help the analyst understand the market.  These plots and summary parameters are quick and easy to produce with modern analytics software (such as R).  If the CMS is the ideal data set, computation enables us to analyze that optimal data set regardless of its size.
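To make this concrete, here is a minimal sketch in Python (the post mentions R; the same few lines work in either tool) of the kind of quick summary and plot an analyst might run on a CMS.  The file name and column names (sale_price, sqft) are hypothetical placeholders for whatever the data export actually contains.

```python
# Minimal sketch: descriptive parameters and a quick plot for a small CMS.
# Assumes a CSV export with hypothetical columns "sale_price" and "sqft".
import pandas as pd
import matplotlib.pyplot as plt

cms = pd.read_csv("cms_sales.csv")                    # the competitive market segment
cms["price_per_sqft"] = cms["sale_price"] / cms["sqft"]

# Summary parameters: count, mean, median, spread, quartiles.
print(cms[["sale_price", "price_per_sqft"]].describe())

# Simple visual plot: price per square foot versus size.
cms.plot.scatter(x="sqft", y="price_per_sqft")
plt.title("CMS: price per square foot vs. size")
plt.show()
```

A dozen or a hundred sales takes the same few lines; the computer, not the analyst, carries the counting.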

Computation Algorithms

We have been using computation algorithms for a long time now.  Examples include residential form software, general spreadsheets, and DCF software that ‘automatically’ calculate the SqFt adjustment, expense ratios, size adjustments, or the present value of future cash flows.  These are simple algorithms, but very helpful and time-saving.
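As one illustration, here is a minimal sketch of the present value calculation such software runs behind the scenes.  It is not any particular vendor’s implementation, and the cash flows and discount rate shown are hypothetical.

```python
# Minimal sketch of a discounted cash flow calculation, the kind of simple
# algorithm that spreadsheet and DCF software runs automatically.
def present_value(cash_flows, discount_rate):
    """Discount a list of end-of-period cash flows back to time zero."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# Hypothetical example: five annual cash flows discounted at 8%.
pv = present_value([100_000, 102_000, 104_000, 106_000, 1_500_000], 0.08)
print(f"Present value: {pv:,.0f}")
```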

What is different now is:  1) if the CMS is more than a handful of comps, it is prudent to extract as much information as possible out of that CMS; and 2) there is no need to sample the data, which eliminates much of the explaining and justifying with words, because simple numbers and graphs explain themselves.

Big Data is a problem — too much data.

Appraisal is considered a “small data” problem.  It can be 12 SFR sales, 23 industrial lots, or 123 four-star hotel properties.  In each case, a more accurate appraisal is possible if information is not discarded.  We intend, in future blog posts and educational materials, to provide modern methods for appraisers, investors, and collateral lenders.

We can make progress from the old “trust me” world to a new “the evidence is” world.  The difference is an objectively optimized data set: the CMS.