Blog Archive

June 2017

Detroit Vacancy Rates: A Guide to Conscious Data Consumption

By Stephanie Quesnelle

What do all of these numbers have in common?

They’ve all been used in the past year to describe the vacancy rate in Detroit.  Some represent different parts of the city; others represent different types of buildings (residential vs. commercial), different types of tenants (renters vs. owners), and different data sources.  Declining vacancy rates are touted as a mark of Detroit’s comeback, but they’re surprisingly hard to measure!

17.8%: Metro Detroit office vacancy rate (Newmark Grubb Knight Frank)

4.89%: Detroit rental vacancy rate (Department of Numbers)

25.9%: Detroit business vacancy rate (Drawing Detroit)

16.5%: Metro Detroit office vacancy rate, 2017 (DBusiness)

13.3%: Downtown Detroit office vacancy rate (Detroit News)

21.9%: Detroit vacancy rate (Drawing Detroit)

18.7%: Metro Detroit office vacancy rate, 2016 (DBusiness)

14.2%: Detroit Central Business District office vacancy rate (Newmark Grubb Knight Frank)

2.6%: Metro Detroit regional apartment vacancy rate (Crain’s Detroit)

15%: Detroit Central Business District office vacancy rate (DBusiness)

22.4%: Detroit residential vacancy rate (Drawing Detroit)

9.4%: Prominent downtown office space vacancy rate (Crain’s Detroit)

These numbers measure very different things with very different methodologies, yet it would be easy to consume the data, turn to a friend, and say “Oh, Detroit’s vacancy rate is 26%” or “Detroit’s vacancy rate is 2.6%.” That’s why it’s important to be a responsible consumer of data.

When consuming or reporting data, it’s important to keep in mind five things about the numbers:

  • Scope
  • Geography
  • Availability
  • Scale
  • Source/Methods


Scope

The scope of the data refers to the size and reach of the data collection.  For example, if data comes from a survey, like the Crain’s Detroit prominent office building vacancy rate, it’s important to pay attention to who was surveyed and what the response rate was.  Smaller samples make data less reliable.  The length of time it took to collect the data also affects reliability.

Another important aspect of scope is how long the data took to collect.  This is especially true when reporting American Community Survey data.  Each year, estimates are released for less-populated areas as 5-year averages and for more-populated areas as 1-year estimates.  We wrote an informative blog post about the ACS methodology when the 3-year estimates were eliminated; one key point is that you can’t compare 1-year and 5-year ACS estimates with each other.
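The no-mixing rule is easy to see with a toy calculation. The sketch below uses made-up annual vacancy rates (not real ACS figures) and simplifies a 5-year estimate as a plain average of five annual values; the actual ACS pools survey samples rather than averaging published estimates, but the lag effect is the same:

```python
# Hypothetical 1-year vacancy estimates for one area (illustrative
# numbers only, not real ACS data).
one_year = {2011: 0.24, 2012: 0.23, 2013: 0.21, 2014: 0.18, 2015: 0.15}

# A 5-year estimate blends all five years, so it lags behind a rapid
# change that the most recent 1-year estimate captures.
five_year_2015 = sum(one_year.values()) / len(one_year)

print(f"2015 1-year estimate:      {one_year[2015]:.1%}")   # 15.0%
print(f"2011-2015 5-year estimate: {five_year_2015:.1%}")   # 20.2%
```

If vacancy is falling quickly, the 5-year figure (20.2%) overstates current vacancy relative to the 1-year figure (15.0%), which is exactly why the two can’t be compared head-to-head.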


Geography

Paying attention to the geographic boundaries the data represents is also important.  In our list of vacancy rate data, we have prominent downtown office space, the Central Business District, downtown, the city of Detroit, and the Metro Detroit region.  Each of these numbers contributes to a different narrative, so we have to be careful not to over-generalize a data point.  A number for downtown Detroit can’t be assumed to represent all of Detroit, let alone the Metro Detroit region.


Availability

The first consideration around availability should be: “Is this data available for the time period that makes the most sense?”  We still see people citing a literacy rate number from 1998, which we debunked back in 2011. Detroit is changing at a rapid pace, so citing recent numbers is important to capture an accurate picture.

The second consideration is whether the data and methodology are publicly available. It’s good to have access to the data so you can run the numbers yourself and confirm that the aggregations are correct.


Scale

Data is collected and reported at different levels of granularity.  For example, the U.S. Postal Service data used to calculate Drawing Detroit’s statistics are collected at the address level, which is different from the building-level data collected by Jones Lang LaSalle, which in turn differs from the city-level data reported in the 2015 American Community Survey.
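A toy example of how the level of aggregation changes the headline number, using made-up address records rather than real USPS or Jones Lang LaSalle data:

```python
# Hypothetical address-level records (made-up data): each record is
# (building_id, vacant). The same underlying addresses yield different
# "vacancy rates" depending on the level of aggregation.
addresses = [
    ("A", True), ("A", True), ("A", False), ("A", False),  # building A: 2 of 4 vacant
    ("B", False), ("B", False), ("B", False),              # building B: fully occupied
    ("C", True),                                           # building C: one vacant unit
]

# Address-level rate: share of individual addresses that are vacant.
address_rate = sum(v for _, v in addresses) / len(addresses)

# Building-level rate: share of buildings with ANY vacant address.
buildings = {}
for bid, vacant in addresses:
    buildings[bid] = buildings.get(bid, False) or vacant
building_rate = sum(buildings.values()) / len(buildings)

print(f"address-level rate:  {address_rate:.1%}")   # 3 of 8 addresses -> 37.5%
print(f"building-level rate: {building_rate:.1%}")  # 2 of 3 buildings -> 66.7%
```

Neither number is wrong; they simply answer different questions, which is why knowing the unit of analysis matters before quoting a rate.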


Source/Methods

Understanding methodology encompasses all of the above considerations, but we’ll call it out specifically here as an understanding of how the questions were asked to obtain the data.  For example, USPS data is collected through reports from individual postal employees who note whether addresses seem vacant.  The Jones Lang LaSalle data had building owners report vacancy rates and rents for their own buildings.  Self-reporting can bias the data because the person reporting usually has an incentive for the data to say something specific.

All of these factors affect the potential biases of the data itself.  Every number that’s reported relies on people who bring their own opinions to everything from survey writing and administration to reporting and interpretation.  Being conscious of the many challenges in measuring difficult-to-define aspects of our community helps us be better consumers of information as we watch the news, read a magazine, or interact with people on social media.

The NNIP Spring 2017 Retrospective

By Erica Raleigh

The National Neighborhood Indicators Partnership (NNIP) is a national network of local data intermediaries like D3 that work together to support the development and use of data in local policy-making and community building.  Twice a year, we gather in one of the partner cities to share project work, best practices, and new perspectives from each partner’s city.  We just got back from the Spring 2017 conference in Baltimore.

Usually, when we come back from an NNIP meeting, we share takeaways from formal conference presentations or links to interesting resources from other partners.  This spring, we’re mixing it up with a bird’s-eye view of some common themes from the presentations.  A holistic approach to data collection, analysis, and communication can have a powerful impact.  Four themes that we try to keep in mind while designing and implementing our work also came up in many of the NNIP sessions:

  1. Power of the narrative
  2. Power of evidence
  3. Power of genuine collaboration
  4. Power of restorative practice

Power of the Narrative

Storytelling is powerful because it connects our data points to a real human experience.  How we frame our data and analysis matters because the narrative drives how the information is received and the actions that result.  One of our favorite examples of powerful narratives is Solutions Journalism.  ModelD is a local publication of the Issue Media Group and they do great long-form narrative pieces, like this one about early childhood center quality in Detroit.

If you’re interested in learning more about solutions journalism, we recommend two podcasts, the first from On the Media and the second from It’s All Journalism.  

Power of Evidence

While narrative is important, facts highlight the real problems.  It’s so important in our community to have real data that identifies the problem, defines the scope of the challenge and a goal, and provides measurable indicators of whether that goal was achieved.  For example, we know that exposure to lead in childhood has lifelong impacts, we have access to blood lead levels from tests, and we can identify communities where lead poisoning is a big problem, which helps target prevention initiatives and allows us to measure decreases in blood lead levels.

Evidence can also make more abstract concepts like economic and racial segregation more concrete. For example, one of our projects right now is helping update the “Business Case for Racial Equity”, a report that provides data to business leaders about the financial cost of racial inequality in Michigan.  

Power of Effective and Genuine Collaboration

Along with giving back to the community, it’s powerful when we can have a two-way conversation that exchanges information beneficial to both parties.  Community members can help us shine a light on the data and tell the story, but it only comes together if both parties are willing to learn from the other. Knowing the problem is one thing, but engaging the people who might utilize the solution ensures it actually meets the need of the community.  One way to do this is through human-centered design thinking (the U.S. Department of Veteran Affairs implements it well in this report).

In our work for Turning the Corner, a national initiative related to neighborhood change, we’re engaging community leaders in a Citizens Advisory Group. We’re asking CAG members to contribute their on-the-ground knowledge of neighborhood change—quantitative indicators we should look into, critiques of our model related to predicting change, input into the actual membership of the CAG.  They’re also helping us identify interviewees from specific Detroit neighborhoods that will help our report tell a powerful story about how important it is to identify neighborhood change.

Power of Restorative Practice

In addition to collaboration, there is a serious need to give back, whether through providing research results in a digestible format or a new tool that is functional and easy to use.  Given the magnitude of the challenges in Detroit and the number of innovative ideas in the city, it can sometimes feel like Detroit has been researched to death.  This is exhausting for residents, who are usually promised that the research will inform a big new initiative, only to watch it fizzle out. Community participation in research is so important because it helps ensure the work makes it back to the people it studies.

One of our key initiatives with Microsoft is the Civic User Testing Group, or CUTGroup.  D3 recruited a network of Detroit residents to test key websites and apps designed for them, to ensure the information is useful and the design makes sense. Our most recent test was of the Detroit Ledger, which tracks grants given to Detroit nonprofits.

So there you have it: a glimpse into the collective brain at D3.  We strive to keep these four aspects of our work in mind at all times, incorporating compelling storytelling, understandable data points, giving back to the community, and engaging that community in the actual process.  Even more importantly, we were thrilled to see these themes reflected so frequently during the NNIP meetings.  They are important to keep in mind as we all pursue local data collection and encourage data-driven decision making.