**How SPARK calculates the numbers at the top of the 1004MC**

All of the 1004MC numbers and the numbers at the top of Page 2 (Page 3 for condos) are calculated based on the Competing data uploaded by the appraiser.

**1) Status Types**

By default, Spark considers active listings and contingent sales to be active; pending sales are not included. However, if you believe pending sales should be included as active for the purposes of the 1004MC, that's a simple setting change. To make this change, do the following:

Click on the 'Gear' icon in the upper right and choose 'General Settings'. Then find the setting for how to calculate status types and make the change. Then click 'Save Changes' and then 'Take Me Back'.

**2) How many sales/actives in a time period?**

For closed sales Spark looks at all 'Closed' status types and then adds up all of those that have a Sold Date in the time period. **Example:** in the 0-3 month period, if your effective date is 12/15/2016 then it would look from the date 9/16/2016 to 12/15/2016 and add up all 'Closed' status types that had a sale date in that time-frame.
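As a sketch of that count in Python (using an assumed `(status, sold_date)` shape for each record, not Spark's actual data model):

```python
from datetime import date

def closed_sales_in_period(sales, start, end):
    """Count 'Closed' status types with a sold date inside [start, end]."""
    return sum(1 for status, sold in sales
               if status == "Closed" and start <= sold <= end)

# 0-3 month period for a 12/15/2016 effective date: 9/16/2016 - 12/15/2016
sales = [
    ("Closed", date(2016, 10, 1)),
    ("Closed", date(2016, 9, 1)),    # sold before the period starts
    ("Active", date(2016, 11, 1)),   # not a closed sale
    ("Closed", date(2016, 12, 15)),  # sold on the effective date
]
print(closed_sales_in_period(sales, date(2016, 9, 16), date(2016, 12, 15)))  # 2
```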

Regarding active listings, Fannie Mae's guidance is explicit that in order for a listing to be counted as active it has to be considered as active on the last (most recent) date in the time period.

**1004MC Actives Example:** The effective date is 12/15/2016 and we're calculating which properties were active in the 0-3 month period. According to Fannie Mae, we look at what was actively available on the most recent date in that time period which, in this case, is also the effective date. So Spark will include whatever properties were actively available on the effective date. For the 4-6 month period it would look at what was actively available on the most recent date in that time period, which would be, in this example, 9/15/2016. Then for the 7-12 month period it would be 6/15/2016. All Actives are calculated this way throughout Spark. If a feature refers instead to Listings, not Actives, then the calculation is based solely on when the property was originally listed.

**Reference:** Fannie Mae SEL-2010-09 - *...the "Total # of Comparable Active Listings" should be based on the most recent day in each of those time periods. For example, when completing the "Current - 3 Months" column for "Total # of Comparable Active Listings," the number should reflect the listings on the most recent date in the 3-month period (which is also the effective date of the appraisal), and not the cumulative number of listings for the entire 3-month time period.... For example, in the "Prior 4-6 Months" column, the "Total # of Comparable Active Listings" should reflect the listings on the last (most recent) day in that time period. Likewise, in the "Prior 7-12 Months," the "Total # of Comparable Active Listings" should reflect the listings on the last (most recent) day in that time period.*
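A minimal sketch of that snapshot rule, assuming each listing is represented as a hypothetical `(list_date, off_market_date)` pair (with `None` meaning still on the market):

```python
from datetime import date

def active_on(listings, snapshot):
    """Count listings that were active on a single snapshot date."""
    return sum(1 for listed, off in listings
               if listed <= snapshot and (off is None or off > snapshot))

listings = [
    (date(2016, 5, 1), None),               # still on the market
    (date(2016, 8, 1), date(2016, 10, 1)),  # active on 9/15, gone by 12/15
    (date(2016, 11, 1), None),              # listed after 9/15
]
# Snapshot dates for a 12/15/2016 effective date
print(active_on(listings, date(2016, 12, 15)))  # 0-3 month column:  2
print(active_on(listings, date(2016, 9, 15)))   # 4-6 month column:  2
print(active_on(listings, date(2016, 6, 15)))   # 7-12 month column: 1
```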

**3) Absorption Rate and Housing Supply**

Absorption rate is the rate at which homes have sold and is calculated as the number of sales in the period divided by the number of months in the period. **Example:** There are 4 sales in the 4-6 month period and there are 3 months in that period, so the absorption rate would be 4 divided by 3, which is 1.33.

Housing supply is calculated by taking the number of active listings in a period and dividing it by the absorption rate. **Example:** If there are 5 active listings in the 4-6 month period and the absorption rate is 1.33, Spark does the following calculation: 5 divided by 1.33. Therefore, the housing supply for that period is 3.8.
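Both calculations can be sketched in a few lines of Python (the 4 ÷ 3 and 5 ÷ 1.33 examples above):

```python
def absorption_rate(num_sales, months_in_period=3):
    """Sales per month: closed sales in the period / months in the period."""
    return num_sales / months_in_period

def housing_supply(num_actives, rate):
    """Months of housing supply: active listings / absorption rate."""
    return num_actives / rate

rate = absorption_rate(4)                 # 4 sales in the 4-6 month period
print(round(rate, 2))                     # 1.33
print(round(housing_supply(5, rate), 1))  # 3.8  (5 actives in that period)
```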

**Note:** When a calculation cannot be made because there are no sales or no listings, Spark will note it as such using the appraiser's settings (e.g. N/A, None, Unable, etc).

**Reference:** Fannie Mae Announcement 08-30 - *To determine the absorption rate, the appraiser divides the total number of settled sales by the time frame being analyzed. The months of housing supply is based on the total listings for the applicable period divided by the absorption rate.*

**4) Median Sale Price, List Price, and DOM**

To arrive at median price and median DOM, Spark takes the properties that were calculated as being Closed (for Median Sale Price and Median Closed DOM) or Active (for Median List Price and Median Listings DOM) in the given time period and calculates the median sale/list price and days on market for those. **Example:** There are 5 sales in the 0-3 month period with prices of $110,000, $104,000, $90,000, $107,000, and $115,000. The Median Sale Price for this period would be $107,000.
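The median from the example above can be reproduced with Python's standard library:

```python
import statistics

# Sale prices for the five closed sales in the 0-3 month period
prices = [110_000, 104_000, 90_000, 107_000, 115_000]
print(statistics.median(prices))  # 107000
```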

**Note:** In calculating DOM for active listings in the 4-6 month and 7-12 month periods, Spark calculates how many days there are between the effective date and the Prior Period Date and subtracts that from the current DOM for that listing. If the effective date is 12/15/2016, the Prior Period Date for the 4-6 month period would be 9/15/2016. In the very rare case where this leads to a negative DOM, Spark will instead calculate the DOM as the number of days between the List Date and the Prior Period Date.
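A sketch of that adjustment, with illustrative field names (not Spark's actual internals):

```python
from datetime import date

def prior_period_dom(current_dom, list_date, effective_date, prior_period_date):
    """DOM of an active listing as of a prior period's snapshot date."""
    # Subtract the days between the effective date and the Prior Period Date
    adjusted = current_dom - (effective_date - prior_period_date).days
    if adjusted < 0:
        # Rare fallback: days between the List Date and the Prior Period Date
        adjusted = (prior_period_date - list_date).days
    return adjusted

effective = date(2016, 12, 15)
prior = date(2016, 9, 15)  # Prior Period Date for the 4-6 month period
# A listing at 200 DOM on the effective date (listed 5/29/2016):
# 91 days separate 9/15 and 12/15, so its DOM as of 9/15 is 200 - 91
print(prior_period_dom(200, date(2016, 5, 29), effective, prior))  # 109
```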

**5) Sale Price as % of List Price**

Spark calculates the Sale-to-List Price ratio for each individual sale which is the sale price divided by the list price (or original list price if the appraiser chooses). Then Spark looks at the individual Sale-to-List Price ratios for those sales in the given time period and calculates the median. **Example:** In the 0-3 month period there are 5 sales. Those 5 sales have Sale-to-List Price ratios of: 99%, 100%, 100%, 99%, 100%. The median Sale-to-List Price ratio for the 0-3 month period would be 100%.
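This can be sketched with assumed `(sale_price, list_price)` pairs producing the 99% / 100% ratios from the example above:

```python
import statistics

# (sale_price, list_price) pairs for the five sales in the 0-3 month period
sales = [(99_000, 100_000), (200_000, 200_000), (150_000, 150_000),
         (198_000, 200_000), (120_000, 120_000)]

# Per-sale Sale-to-List ratios: 99%, 100%, 100%, 99%, 100%
ratios = [sale / list_price for sale, list_price in sales]
print(round(statistics.median(ratios) * 100))  # 100
```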

**Reference:** Fannie Mae Appraisal and Property Related FAQs (2/12/2016) - *The Median Sale Price as % of List Price is to be determined by analyzing the comparables that have sold and settled during the specific time frame, not by using the data from the lines above this section on the form.*

**Neighborhood Calculations**

The Low, High, and Predominant numbers in the Neighborhood section of the report (Page 1) are calculated based on the Neighborhood data uploaded by the appraiser. For the 1004 this would be all SFR properties in the defined neighborhood boundaries. For the 1073 (condo form) this would be all condo units in the defined neighborhood boundaries.

**Reference:** Fannie Mae Selling Guide (10/24/2016) - *The appraiser must indicate the price range and predominant price of properties in the subject neighborhood. The price range must reflect high and low prevailing prices for one-unit properties, two- to four-unit properties, condo units, or co-op units depending on the property type being appraised and the appraisal form being used.*

**1) Low/High:** In order to avoid using potentially outdated low and high prices, Spark only examines sale prices from the past year. The low price is the lowest sale price in the past year; the high price is the highest sale price in the past year. The low age is the lowest age in all the Neighborhood data and the high age is the highest age in all the Neighborhood data.
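A minimal sketch of the price portion, assuming a 365-day window for "the past year" and `(price, sold_date)` pairs as the data shape:

```python
from datetime import date, timedelta

def neighborhood_price_range(sales, effective_date):
    """Low/high sale price using only the past year's sales."""
    cutoff = effective_date - timedelta(days=365)
    recent = [price for price, sold in sales if sold >= cutoff]
    return min(recent), max(recent)

sales = [
    (95_000, date(2016, 3, 1)),
    (140_000, date(2016, 11, 20)),
    (80_000, date(2014, 6, 1)),   # older than a year -- excluded
]
print(neighborhood_price_range(sales, date(2016, 12, 15)))  # (95000, 140000)
```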

**2) Predominant:** Fannie Mae states that predominant price and age should be based on which number occurs most frequently. By default, the predominant value that Spark uses is also the number that occurs most frequently which is a measure of central tendency known as mode. The appraiser can easily change this to average or median if they prefer since, in certain cases (e.g. a glut of recent new construction listings in the neighborhood), the mode does not give an accurate picture of the neighborhood.
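The three measures of central tendency can be compared directly with Python's standard library; the hypothetical prices below show how an outlier pulls the average away from the mode:

```python
import statistics

# Hypothetical neighborhood sale prices
prices = [250_000, 250_000, 250_000, 260_000, 400_000]
print(statistics.mode(prices))         # 250000 (most frequent -- the default)
print(statistics.median(prices))       # 250000
print(round(statistics.mean(prices)))  # 282000 (skewed by the outlier)
```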

**Reference:** Fannie Mae Selling Guide (10/24/2016) - *the predominant price will be that which is the most common or most frequently found in the neighborhood.*

Active listings are calculated as noted above based on Fannie Mae guidelines; however, when a feature says Listings (not Actives) it calculates those in a normal manner based on when the property was originally listed.

**Trend Calculations (Regression)**

Scatter plot charts provide trend calculations (i.e. percent change) based on simple linear regression performed on all the data points. Grouped charts (bar charts and line charts) provide a trend analysis that uses simple linear regression as well but based on the data after it's been grouped together into periods (e.g. the median price of homes in the 4-6 month period would be data that is grouped together).

**How is the percent change calculated?** The results from simple linear regression provide the slope of the trend line (variable m) and the y-intercept (variable b) in the equation y = mx+b. To calculate the rate of change, the beginning y-value (at the earliest or oldest point in time on the chart) and the ending y-value (the y-value on the last day in the chart, which is the effective date) are utilized. Both are found by plugging the known m and b variables into the equation at the first and last x-axis points.

**Calculation Example:** In this example we are analyzing median price and looking at the past 12 months of data. Simple regression on this data gave us m = 50 and b = 100,000. These variables are utilized along with the x-variable, which is 0 for the first point and 365 for the last.

First point) y1 = 50*0+100,000 → y1 = 100,000

Last point) y2 = 50*365+100,000 → y2 = 118,250

Utilizing this information, the rate of change per month, quarter, or year is calculated by one of the following two methods. The first method (called simple growth) subtracts y1 from y2 and then divides by the absolute value of y1.

Total Change = (118,250-100,000) / |100,000| → Total Change = 18.25%

That is the total change over the entire 12 months. If the rate of change is chosen to be per month then Spark will convert that 18.25% change over 12 months by dividing the total change by the number of months analyzed.

% Change Per Month = 18.25% / 12 → % Change Per Month = 1.5%

The second method utilizes compounding to determine the % Change Per Month. This is done by dividing y2 by y1, raising that result to the power of 1/n (where n is the number of months analyzed), and then subtracting 1; the result is the monthly growth rate.

% Change Per Month = (118,250 / 100,000)^(1/12) - 1

% Change Per Month = 1.01407 - 1 → % Change Per Month = 1.4%
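Both methods can be sketched together. This sketch assumes x runs from 0 (oldest day on the chart) to 365 (the effective date), so the starting value of the line equals the intercept b; the regression fit itself is assumed already done:

```python
def percent_change(m, b, n_months, days=365):
    """Total and per-month change from a fitted trend line y = m*x + b."""
    y1 = m * 0 + b       # beginning y-value (oldest day)
    y2 = m * days + b    # ending y-value (effective date)
    total = (y2 - y1) / abs(y1)                         # change over the span
    simple_monthly = total / n_months                   # method 1: simple
    compound_monthly = (y2 / y1) ** (1 / n_months) - 1  # method 2: compounding
    return total, simple_monthly, compound_monthly

total, simple, compound = percent_change(m=50, b=100_000, n_months=12)
print(f"{total:.2%}")     # 18.25%
print(f"{simple:.2%}")    # 1.52%
print(f"{compound:.2%}")  # 1.41%
```

Rounded to one decimal place, the last two values match the 1.5% and 1.4% figures above.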

Note: When the percent change calculation is 1,000% or greater Spark will show "999%*" due primarily to space limitations. In addition, numbers this large are virtually meaningless and typically only take place when the starting point of the data approaches zero.

**Visual Exceptions for Scatter Plots** Quantity and Percent of Quantities (e.g. % of properties that are REO) are plotted differently in order to visualize the data more easily. As an example, in a scatter plot of REO %, rather than plot three arm's-length transactions and one REO on the same day as three 1's and one 0, that data is plotted as 0.25 (25%) instead. The simple regression trend is still calculated on the actual data since it's a scatter plot.
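The plotting transform for that REO example can be sketched as a per-day proportion (illustrative data shape, not Spark's internals):

```python
from datetime import date

# One day with three arm's-length sales (0) and one REO (1)
flags_by_day = {date(2016, 12, 1): [0, 0, 0, 1]}

# Plot the day's REO share as a single point rather than four 0/1 dots
plotted = {day: sum(flags) / len(flags) for day, flags in flags_by_day.items()}
print(plotted[date(2016, 12, 1)])  # 0.25
```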

Because of how many Actives there are in many data sets (under the Fannie Mae method), and because they are calculated daily for scatter plots, plotting every day would produce so many dots that the chart becomes unusable. Therefore, scatter plots involving Actives (Housing Supply, # of Actives, etc.) are plotted on a weekly basis, allowing for better "readability" of the data. The simple regression trend is still calculated on the actual data since it's a scatter plot.