2. Accuracy
The federal government collects information on crime trends in numerous ways: federal, state, local, and tribal law enforcement agencies report to the UCR Program, the NCVS is fielded annually, and public health surveillance systems record information on deaths and injuries caused by criminal actions. These systems are complex. Federal agencies and their state, local, tribal, and public health partners use practices and processes to maintain and improve data quality, ensuring that data provide an accurate representation of what occurred. But data quality depends on many factors. Some are common to large data enterprises (enhancing quality control and reducing errors), while others emerge when comparing data sources that measure crimes in different ways or reports that measure crime across different time periods. These factors can pose challenges to the interpretation and use of the data.
The Challenge of Quality Control
Working Group interviews and focus groups with local law enforcement officials and state UCR program managers conducted for this report revealed concerns about the quality of crime trends information reported to NIBRS.15 The transition from the Summary Reporting System to the more complex incident-based reporting system increased the burden for under-resourced law enforcement agencies, including the front-line police officers charged with recording the data that are ultimately reported. As part of its transition to NIBRS, for example, the New York Police Department increased the length of its incident report form by 50%, from four to six pages.
Other law enforcement officials shared concerns about the quality of data submitted by patrol officers and the difficulty of securing staff and resources for quality assurance and data infrastructure at a time when department leaders and elected officials were eager to put more officers on the street. Data gleaned from focus groups and a survey conducted with state UCR program managers indicated that the overwhelming majority of programs lacked the staff or resources to provide much quality control assistance to local law enforcement agencies.16
Discrepancies Between NIBRS and NCVS
While official crimes reported to law enforcement and published through the UCR Program have typically been the leading barometer of crime trends, most crimes are not reported to the police. For example, 2022 NCVS data indicate that about 42% of violent crimes were reported to law enforcement; rape and sexual assault were the least likely to be reported, at only about 1 in 5.17 About 32% of property crimes were reported.
The UCR and NCVS measures indicate very different total volumes of crime. But over the past three decades, they have generally shown fairly consistent year-to-year rates of change for violent and property crime. When they don’t agree, however, there can be significant confusion about the most fundamental question: Did crime go up or down?
This was especially the case in 2022, as noted by the founding chair of this Working Group, Richard Rosenfeld, and Group member Janet Lauritsen. In an October 2023 brief, the pair pointed out that law enforcement data reported through the UCR Program indicated that all types of violent crime fell 2% from 2021 to 2022, while NCVS victimization data showed an increase of 75%, a discrepancy primarily due to a much higher rate of aggravated assaults in the victimization survey (Figure 2). Potential explanations for the divergence include increased police response time due to reduced staffing levels; declining public trust in law enforcement, leading to fewer public calls for police service; data reporting periods that differ by a few months in an environment of changing trends; or potential issues encountered with data collected during the COVID-19 pandemic period.
Common Issues with Reporting Accurate Data
Beyond these big-picture methodological questions are the more mundane but critical issues of accuracy in the recording and reporting of law enforcement data. Many law enforcement agencies present their crime trends data in multiple formats, including reporting to the UCR Program, posting on agency websites and data portals, and publishing data and analysis in official reports.
Reporting to NIBRS is more complicated than reporting to the Summary Reporting System, and officers don’t always record incident data correctly or completely. As noted above, law enforcement agencies and state UCR programs often lack the staff to do much quality control on the back end. The FBI does provide state UCR programs and law enforcement agencies with a variety of quality assurance tools, but those tools generally focus on ensuring that data are complete and properly formatted; they cannot ensure that data were entered correctly.
As with any data enterprise, inaccuracies and inconsistencies within and between data sources are inevitable due to various types of errors, including simple data entry mistakes. Additionally, data published on agency websites may be preliminary and updated over time. Discrepancies between values published on websites and those shared by the FBI, however, may be substantial and lead to confusion and concern about what the actual volume of crime might be in a city or jurisdiction.18
Surveys such as the NCVS are also subject to errors because they rely on respondents to report on all crimes experienced by those in their household aged 12 and older.19 Errors occur when survey respondents are unaware of, do not accurately recall the nature or timing of, or are unwilling to tell a researcher about victimization events.
Small Differences, Large Impact
Even small discrepancies can have big consequences in the policy and political environment. This is especially true when an inaccuracy makes the difference between a particular crime trend going up or down. In a jurisdiction that had 50 murders in a prior year, for example, a count that’s off by 5 could mean homicide is either up 10% or down 10%.
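The arithmetic behind this sensitivity is straightforward. A minimal sketch, using the hypothetical counts from the example above, shows how a miscount of just five incidents flips the direction of a reported trend:

```python
def pct_change(prior: int, current: int) -> float:
    """Year-over-year percentage change in an offense count."""
    return (current - prior) / prior * 100

prior_year = 50     # hypothetical homicide count, prior year
true_current = 50   # true count this year (a flat trend)
error = 5           # a recording error of five incidents

# The same underlying reality reads as a 10% rise or a 10% drop.
print(pct_change(prior_year, true_current + error))  # 10.0
print(pct_change(prior_year, true_current - error))  # -10.0
```

The smaller the base count, the larger the swing each miscounted incident produces, which is why sparsely populated jurisdictions are especially exposed to this problem.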
Perhaps such differences shouldn’t have an outsized impact on public debate, especially when base rates are low compared to historical levels or are recorded in sparsely populated jurisdictions. But they do. When a crime type is declining, citizens, the media, and political opponents tend toward contentment. Rising crime, on the other hand, brings political attacks and a clamor for change. The potential fallout can be huge—in terms of crime-control tactics and strategies, whether the police chief stays on the job or gets fired, even whether the mayor or prosecutor is re-elected or driven from office. The high-stakes consequences of getting data on crime trends right put a premium on accuracy.
Limitations of National Data on Non-Fatal Firearms Assaults and Injuries
Limitations in the current federal data collection systems leave significant gaps in our understanding of levels of gun crime, particularly non-fatal firearms assaults and injuries.20 There are currently no national crime statistics counting the number or rate of non-fatal shooting victimizations; data are sometimes published by law enforcement agencies but are, in general, sparsely available.21 NIBRS does have a data element for “assault with a firearm,” which records whether a firearm was used in an incident, but this element does not allow specification of whether the gun was fired, a person was shot, a gun was used as a blunt instrument (e.g., pistol-whipping), a gun was brandished (i.e., unlawfully displayed), or a gun was otherwise used in the assault.
Existing data on non-fatal firearms assaults and injuries come from public health surveillance systems: the Firearm Injury Surveillance Through Emergency Rooms (FASTER) program, which runs the Advancing Violence Epidemiology in Real-Time (FASTER: AVERT) initiative;22 the Firearm Injury Surveillance System of the National Electronic Injury Surveillance System (NEISS-FISS);23 and the Healthcare Cost and Utilization Project (HCUP).24
While these systems provide valuable information, they also suffer from a significant drawback: Data involving firearms are often inaccurate due to the miscoding of some assaults as accidents.
Hospitals record the diagnoses assigned and treatment provided to patients using codes established in the International Classification of Diseases, Ninth and Tenth Revisions. Public health surveillance systems (along with health services researchers) use these codes to record the nature of firearms injuries, determining how many non-fatal firearms injuries occur and whether those injuries stemmed from an accident or an assault. Evidence suggests, however, that coding practices often result in firearms assaults being incorrectly coded as accidental injuries, which obscures understanding about how often firearms are used in assaults.26 The miscoding of data may be complicated by overlapping but not identical categorization of these types of incidents across data sources: in the public health data, a firearms assault is called an intentional injury, while in the crime data, it is categorized as an aggravated assault. One result of this miscoding is that many people incorrectly believe that the accidental discharge of a firearm in their home poses a greater risk than a firearm-related homicide or suicide.27 For example, of the roughly 49,000 people in the U.S. who died by firearms in 2021, only 1% died by unintentional gun injury.28
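The intent categories at issue are carried by ICD-10 external-cause codes. The following is an illustrative sketch, not a surveillance system's actual logic, assuming the commonly cited code ranges (W32–W34 unintentional discharge, X72–X74 intentional self-harm, X93–X95 assault, Y22–Y24 undetermined intent); it shows how an assault miscoded with a W-series code ends up counted as an accident:

```python
# Hypothetical classifier illustrating how ICD-10 external-cause codes
# assign intent to firearm injuries. The code ranges below are the
# commonly cited ones; real systems use fuller code lists.
INTENT_BY_CODE_PREFIX = {
    "W32": "unintentional", "W33": "unintentional", "W34": "unintentional",
    "X72": "self-harm",     "X73": "self-harm",     "X74": "self-harm",
    "X93": "assault",       "X94": "assault",       "X95": "assault",
    "Y22": "undetermined",  "Y23": "undetermined",  "Y24": "undetermined",
}

def firearm_intent(icd10_code: str) -> str:
    """Map an ICD-10 external-cause code (e.g., 'X93.0') to an intent category."""
    return INTENT_BY_CODE_PREFIX.get(icd10_code[:3].upper(), "unknown")

# A shooting correctly coded as assault vs. the same injury miscoded
# as an accidental discharge: the second lands in the wrong bucket,
# understating the assault count.
print(firearm_intent("X95.0"))  # assault
print(firearm_intent("W34.0"))  # unintentional
```

Because surveillance totals are simple sums over these buckets, every record miscoded from an X93–X95 code to a W32–W34 code shifts one injury from the assault count to the accident count, producing exactly the distortion described above.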
All of these databases provide policymakers and the public with valuable insights but fall short of what’s needed: a full, accurate national portrait of the number, circumstances, and health outcomes of non-fatal firearm assaults.29 Without accurate information on these and other aspects of non-fatal firearms assaults and injuries, policymakers and the public miss opportunities to shape effective strategies. In many shootings, chance dictates whether a gunshot is fatal,30 so the lack of clear understanding about how firearms are used in crimes that do not result in death—and whether these crimes were intended to result in death—also significantly hampers homicide prevention and intervention strategies.
