
"Quantifiable Market-Derived Methods" for adjustments required by FNMA/USPAP

Yes. I said that already. Experience, in itself, is certainly not viable as the foundation for a fact-driven analysis. (But nothing replaces experience either, whether we are talking about life-altering surgery or professional analysis opinions from somebody who has "been there and done that" for years.)

Experience "should" also recognize corrupt data, which abounds today. You've not been a Tennessee appraiser in decades, so you've had no reason to review how questionable some of the data is that spews from our MLS. That same data is used in groupings to support many adjustments and is sent to the Billow and Fluff sites used by everybody, including the GSEs. It's often distorted by problems we catch all the time individually: GLA is often wrong, concessions are not reported, and buyer-agent commissions get added to the price at the 11th hour. Again, these things can be caught individually, but not in bulk data, and that distorts many things.

"Corrupt Data?" You have to have a pretty deep understanding of the data and statistics, including plenty of experience building models from domain data, to really understand what you are talking about.

1. First, what exactly do you mean by "Corrupt Data?"
a) Do you mean data that someone has deliberately modified to effect a certain outcome? That can happen, of course. We might expect that from data provided by a real estate agent.
b) Or, do you mean data that is in error due to some clerical mistake?
c) Or data that is in error through systemic failures in communication (updating) between different government agencies (e.g. owner/builder changes to construction plans not communicated to the official recorder)?
d) Or some combination of the above, or some other source of error or data inconsistency?

2. Are these errors you refer to random, systemic, or intentional changes?

3. The point is that random errors in large amounts of data usually cancel each other out and do not really impact the regression or statistical averages. We can address this issue by increasing, if possible, the amount of data analyzed: going further back in time, expanding the geographic scope, or looking at additional MLS sources, if they exist.
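To make point 3 concrete, here is a small numpy sketch (all numbers invented) of why random GLA reporting errors barely move a price-on-GLA regression slope, while a one-sided systemic error moves it a lot:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
gla = rng.uniform(1200, 3500, n)                      # true living areas
price = 50_000 + 150 * gla + rng.normal(0, 20_000, n)  # ~$150/sqft market

def slope(x, y):
    # least-squares slope of price on GLA, in $/sqft
    return np.polyfit(x, y, 1)[0]

clean = slope(gla, price)

# Random reporting errors (+/- a couple hundred sqft) largely wash out:
noisy = slope(gla + rng.normal(0, 150, n), price)

# A one-sided systemic error does not: a third of the records get
# ~1,000 sqft of basement rolled into GLA, dragging the slope down.
bad = gla.copy()
bad[: n // 3] += 1_000
biased = slope(bad, price)

print(f"clean slope:  {clean:6.1f} $/sqft")
print(f"noisy slope:  {noisy:6.1f} $/sqft")
print(f"biased slope: {biased:6.1f} $/sqft")
```

The random errors only attenuate the slope slightly; the one-sided basement error pulls the indicated $/sqft far below the true market figure.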

4. Systemic errors can be at least partially neutralized by understanding and removing them in the final set of comps (as we probably won't have the resources to clean them from all the data), or by discovering whether they cancel each other out (as is sometimes the case). My example is homeowners requesting changes to builder plans, such as removing second-floor areas to create vaulted ceilings or extending the living area (e.g., by creating overhangs), where these changes never get updated by the official recorder. One reduces GLA and the other increases it. Gaining this kind of experience requires having worked in a given market area for a significant period of time while conducting real statistical analysis.

So, in conclusion, "I" would say that experience should include performing accurate statistical analysis using MARS (Multivariate Adaptive Regression Splines) and getting good R² values from cross-validation, using separate training, validation, and possibly test data sets to avoid overfitting. The feedback from model building is where you acquire a real understanding of the vast majority of data errors.
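A rough sketch of that cross-validation workflow, using plain numpy and polynomial degree as a stand-in for model complexity (an actual MARS fit would need a package such as py-earth, but the train/validation logic is the same):

```python
import numpy as np

rng = np.random.default_rng(0)

def r2(y, yhat):
    # coefficient of determination
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

# Synthetic "market" signal with noise; 30 training sales, 90 held out.
x = rng.uniform(0, 3, 120)
y = np.sin(x) + rng.normal(0, 0.1, 120)
xtr, ytr, xval, yval = x[:30], y[:30], x[30:], y[30:]

results = {}
for degree in (1, 3, 12):
    coef = np.polyfit(xtr, ytr, degree)
    results[degree] = (r2(ytr, np.polyval(coef, xtr)),    # train R^2
                       r2(yval, np.polyval(coef, xval)))  # validation R^2
    print(f"degree {degree:2d}: train R2 {results[degree][0]:.3f}, "
          f"validation R2 {results[degree][1]:.3f}")
```

The overfit model posts the best training R² yet falls apart on the held-out sales, which is exactly the feedback loop described above.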

Any appraiser who does not understand overfitting can be assumed to lack knowledge of advanced statistics (e.g., MARS) and, therefore, probably lacks complete competence as an appraiser for the given market area.

CAVEAT: My experience is limited to California.
 
Hi Bert - That post was nearly a month ago.......

Corrupt: "(of a text or a computer database or program) made unreliable by errors or alterations"

In rural Tennessee, we have to deal with limited data, so inaccuracies become a bigger deal, and they are multi-faceted. We often correct the information shown in our MLS or tax assessment sources, which isn't easily done when pulling data for trend or regression purposes. In many cases, if we weren't involved in the appraisal of the comparable sales, we wouldn't have any idea of the errors out there.

And logically, the more limited the data in an area, the more impact any wrong information will have, for obvious reasons. There are multiple causes for the bad information, and there are no accountability mechanisms in place to force the issue. (Inaccuracies include GLA, acreage, sale dates, concessions, sale prices, DOM, etc.) I was told recently that the GSE information is much more accurate than anything we have out there, and that they are well aware of the data mishaps. It just seems odd to me that so much emphasis is put on the back of the appraiser to be accurate, yet those who slap us around do nothing about the data. If the data involves hundreds of sales, a few errors won't matter too much. If it involves a few dozen, it does.

Example: when we pull a grouping of 15-20 sales in a rural community around here and find 6 sales with basements included as 2nd stories, the data is going to be corrupted. That just happened in August.
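A hypothetical sketch (every figure invented) of how much six mis-coded sales out of 18 can move an indicator:

```python
import numpy as np

rng = np.random.default_rng(7)
true_gla = rng.uniform(1_100, 2_400, 18)           # 18 rural sales
price = true_gla * 145 + rng.normal(0, 8_000, 18)  # ~$145/sqft market

reported = true_gla.copy()
reported[:6] += 1_000       # 6 records: basement mis-coded as 2nd story

clean_ppsf = np.mean(price / true_gla)
dirty_ppsf = np.mean(price / reported)
print(f"mean $/sqft from accurate GLA: {clean_ppsf:.0f}")
print(f"mean $/sqft from reported GLA: {dirty_ppsf:.0f}")
```

With a third of a small sample corrupted, the indicated $/sqft drops well below the true market figure; in a sample of hundreds, the same six sales would barely register.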

My point: an appraiser who lacks the experience to smell those things out will end up using bad information in a report or in their ongoing adjustments. The post referenced a prior statement about "experience" being important.
 

That's why I added the caveat: "My experience is limited to California." Somebody does have to try to get the local and county planners to do a better job of acquiring accurate data. I imagine they get original building plans in the permit process, which are sent to the county recorder at some point, but which get altered later on by the original builder, or by the owner himself without additional permits. If those additional permits are submitted, they may be submitted with inaccurate measurements, which may or may not reach the County Assessor's office; and even if they do move up the proper chains, we need to question whether someone will take the time to modify the original recording. And likely there are large differences in how different states and counties deal with these issues.

It would also be useful if the MLS associations did a better job of getting measurements for sales listings, since appraisers rely on them. MLS organizations tend to just download data from the County Assessor. But it would be very nice for appraisers if they required new listings to submit accurate floor plans.

Then you have states like Texas.
 
IMO this drive for "supportable" adjustments using fancy software and regression theories only found in dusty texts has swung beyond any reasonable definition of credible. I'm sure there's a unicorn or two out there, but 99.999999% of appraisers aren't smart enough to understand what a team of programmers and data scientists backed by VC money has put together to "support" market adjustments. Heck, if you follow these software teams on social media, they have a hard time explaining the outputs too. And something that has been glossed over: how in the world can an average Joe borrower read a report and understand any of this?

At the end of the day, buyers aren't walking through properties and running a dozen different regression formulas to determine if an inground pool fits their lifestyle or not. Likewise, a seller isn't sitting through a listing pitch reading a dozen squiggly-line graphs/charts to determine if their extra 1/2 bath justifies a higher list price.

Crazy how we've evolved from made up adjustment lists to made up adjustment software. We've gone from ludicrous to plaid.
 
If you're using spreadsheets, charts, and graphs to determine your adjustments, you aren't qualified to be an appraiser. That's not how people buy homes, and that's not how we should be doing appraisals. If I throw a bunch of junk data into a spreadsheet and it tells me a screen porch is a $10K adjustment on a $700,000 home, I'd better use my experience and expertise in my market to know that that porch probably cost $40-60K to build and a $10K adjustment is wrong.

MLS data is inputted by real estate agents, and agent-inputted MLS info is one of the biggest examples of junk data there is out there. There's no uniformity to how things are entered, and there's no price to pay when agents don't enter things right. Half the basement homes in my market aren't separated into above and below grade. I still have agents who will list a 5,000 sq ft above-grade home with a 2,000 sq ft basement as a 7,000 sq ft home. I have to go in and separate everything out on almost every basement home I do. If I just hit a button and downloaded that stuff into a chart and graph, it would be horribly unreliable. And that's just one example: half the amenities aren't listed in the section they're supposed to be listed in. You have to read paragraphs and scan every picture to know what the hell you're dealing with.
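As an illustration only - the field names and the 100 sqft tolerance here are made up, since every MLS schema differs - a sanity check like the following can flag listings where the listed GLA looks like above-grade area plus basement lumped together:

```python
# Hypothetical records: "gla" is the agent-entered living area,
# "assessor_gla" the above-grade figure from public records.
listings = [
    {"mls": "A1", "gla": 5000, "basement_sf": 2000, "assessor_gla": 5000},
    {"mls": "B2", "gla": 7000, "basement_sf": 2000, "assessor_gla": 5000},
    {"mls": "C3", "gla": 1800, "basement_sf": 0,    "assessor_gla": 1800},
]

def looks_lumped(rec, tol=100):
    # listed GLA ~= assessor above-grade + basement -> likely mis-coded
    return (rec["basement_sf"] > 0 and
            abs(rec["gla"] - (rec["assessor_gla"] + rec["basement_sf"])) <= tol)

flagged = [r["mls"] for r in listings if looks_lumped(r)]
print("flag for manual review:", flagged)
```

A check like this only narrows the pile; someone still has to open the flagged listings, read the remarks, and look at the photos.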

Even major items like inground pools often aren't entered into the right cell. I've pulled many comps that I thought were good, then I scroll through the photos and see it has a pool, with no mention in the amenities or exterior features line item in the MLS. Imagine unknowingly inputting that home into your data set and leaving out a $100K-plus feature.

But for some reason, every old man in position of power is absolutely mesmerized by the tech bros. Sometimes I think they’re in love.

So that’s fine, you can’t argue with a grown man who has puppy love for the tech bro. So when I’m asked to provide a graph, I’ll provide them a graph if it makes them feel better. But just like measuring to the nearest 10th of a foot, it’s all for show. Meaningless.
 
Well said, Chad. That's exactly what we run into every week in Tennessee.
 
100% agree.
The problem is the very reason it is being pushed: glitzy charts, graphs, and stats look impressive. The fact that an appraiser clicked to get the result, and might not analyze how relevant it is or take the time to vet the data input, means it is a cover for the fast, piecemeal, fast-food product the entities have turned appraisal into, with an emphasis on hybrids and PDC collections, always denigrating what they call "traditional appraisals" as slow and expensive.

The only way an appraiser can survive in the low-fee AMC and deadline environment is to churn and burn; then the entities blame the appraisers for taking shortcuts, giving them an excuse to cut out the role of the appraiser even further.
There are not enough private and lender clients not using AMCs to sustain the majority of res-licensed appraisers, so the system is a Darwinian selection of who remains and why.

Imo, statistics and these kinds of graphs are best used for time/market-conditions adjustments, where a larger number of sales is needed to show a trend. For property and location adjustments, you correctly identify the problem: a larger sample of sales of non-similar properties skews the results. Appraisers who have field experience develop the geographic competence to reconcile dry data with the real-world market. But the PDC collections done by third parties will limit field experience for appraisers going forward, making them even more irrelevant - by intention.
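A minimal sketch of that time-adjustment use case, with invented prices: monthly medians need volume rather than close comparables, which is why stats fit this job better than feature adjustments:

```python
# Invented sale prices: market-conditions trend from monthly medians.
from statistics import median

sales = {  # month -> closed sale prices
    "2024-01": [300_000, 310_000, 295_000],
    "2024-02": [305_000, 315_000, 318_000],
    "2024-03": [320_000, 312_000, 330_000],
}

months = sorted(sales)
meds = [median(sales[m]) for m in months]
pct = (meds[-1] / meds[0] - 1) * 100   # total change over the window
print(f"median price trend {months[0]} -> {months[-1]}: {pct:+.1f}%")
```

In practice you would want far more than three sales per month, but the mechanics - pool the market, track the median, convert to a percentage adjustment - stay the same.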
 
MLS data is valuable, and it is practically the only alternative to public records, which might be more accurate (or not) and have limitations of their own. I vet the MLS data as you describe.

The problem is that, regardless of the data source, having each sale vetted by an appraiser or operator before the stats were run would be too time-consuming. Agree with your post overall, though!
 
Absolutely. A long time ago, the Appraisal Institute did research on the accuracy of settlement dates and found they were only 1 or 2% off. So how much statistical error is small enough that we don't waste time on perfection that isn't needed? Even a regression doesn't have the answers for the leftover, unaccounted-for dollar amounts. Somewhere, some government idiot savants think their system is perfect. No - maybe statistically perfect enough.

From the very beginning of appraising, the rule was: get it right the first time. But perfection creep - ANSI, linear or non-linear regression - can only point to a direction, not necessarily the holy-grail number. But statistically enough to call off the hounds.
Appraising Residential Properties, 4th Edition, Appraisal Institute, "Other Quantitative Adjustment Techniques," page 344, further states:
The process of supporting the contribution of individual variables (features) is limited and often difficult to quantify, with adjustment deemed to be qualitatively supported unless otherwise addressed. All methods of supporting adjustments are usually limited by inherent uncertainties within the applications themselves.
God bless this author.
 
...I better use my experience and expertise in my market to know that That porch probably cost 40-60k to build...
In my view, it is this type of language that drove the ASB to clarify that "experience" is not a valid method of support. You state that you are using "experience and expertise" when, in fact, you are (based on your own words) using data (the cost new), not "experience" :)
 