The GRID Measures – Aggregate Report is an interactive tool that allows you to benchmark radiology report turnaround-time performance against facilities in the same census region, of the same facility type, and in the same location type, as well as against the entire registry. The benchmark comparisons can help GRID participants identify areas for improvement and track quality improvement efforts over time. The report also provides performance insight for facilities reporting turnaround-time measures for the Centers for Medicare & Medicaid Services' (CMS) Merit-Based Incentive Payment System (MIPS).


The report is updated weekly and comprises three tabs (Peer Comparisons, Facility Comparisons, and Registry Comparisons), each displaying a distinct view of turnaround-time performance by:

  • Date range of interest – the default is the last 30 days

  • Place of service, e.g., ambulatory, inpatient, and emergency department

  • One or more imaging modalities

  • Report interpretation by a single radiologist vs. multiple radiologists, e.g., a resident and attending physician

The report's Peer Comparisons tab displays the performance ranking of a single facility against four peer groups:

  • Facilities in the same census region

  • Facilities of the same type (academic, community hospital-based, freestanding center, multi-specialty clinic, or other)

  • Facilities in the same location type (metropolitan, suburban, rural)

  • All facilities in the registry


The horizontal bars give a visual indication of the number of exams (grey bars) and the average turnaround time (blue bars) for the selected filters. Each circle represents how well the facility's turnaround time ranks against all sites within each of the four peer groups. Turnaround times are calculated by subtracting the exam completion date and time from the date and time the final report was signed.
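To make that calculation concrete, below is a minimal Python sketch of the subtraction described above. The function name and timestamps are illustrative assumptions, not part of the report itself; the report performs the equivalent calculation internally from the submitted exam and report timestamps.

    from datetime import datetime

    def turnaround_time_minutes(exam_completed: datetime, report_signed: datetime) -> float:
        """Turnaround time = final report signed minus exam completion, in minutes."""
        return (report_signed - exam_completed).total_seconds() / 60.0

    # Illustrative example: exam completed at 14:05, final report signed at 15:35 the same day
    tat = turnaround_time_minutes(
        datetime(2020, 3, 12, 14, 5),
        datetime(2020, 3, 12, 15, 35),
    )
    print(tat)  # 90.0 minutes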

Note: Submitting Place of Service data provides more insight into performance and helps to identify opportunities for improvement.

Below is an example of a facility's CT turnaround-time performance shown with Place of Service details for the first six months of 2020. The performance data are compared to peer data from the last six months of 2019.


Hover over a circle (see the red square in the example below) to view a pop-up with measure details, including the facility rate and a detailed comparison to peer group quartiles.

The Facility Comparisons tab displays performance rankings for multiple facilities affiliated with a Corporate Account, allowing comparison across facilities. The example below shows comparisons across three facilities, including Place of Service and # of Readers detail and a Registry Grand Total comparison. A blank cell indicates that too little data are available for comparison; hover over the center of the cell to see the details.

The Registry Comparisons tab displays performance rankings for a single facility or multiple facilities against the registry. The turnaround-time rankings are presented in deciles, similar to the approach used for reporting MIPS measures under the CMS Quality Payment Program.

Each circle represents how well the facility ranked against all sites in the registry. Rankings are calculated by comparing the facility's average turnaround time with the deciles for all sites in the comparison data set. Deciles are displayed in the report in order from worst to best. For example (a brief sketch of the decile assignment follows this list):

  • Average turnaround times greater than the Registry 10th percentile rank in the bottom 10% of all sites in the registry (Decile 1)

  • Average turnaround times falling between the Registry Median (50th percentile) and 60th percentile rank in the top 40-50% (Decile 6)

  • Average turnaround times less than the Registry 90th percentile are in the top 10% of all sites (Decile 10)
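As a rough illustration of this decile assignment, here is a minimal Python sketch. The helper function, registry values, and facility average below are hypothetical; the actual report derives its cut points from the registry comparison data set. Because lower turnaround times are better, Decile 1 is the slowest 10% and Decile 10 is the fastest 10%, matching the worst-to-best ordering above.

    def assign_decile(facility_avg_tat: float, registry_avg_tats: list[float]) -> int:
        """Assign a decile (1 = slowest 10%, 10 = fastest 10%) by comparing a facility's
        average turnaround time with the site averages in the registry."""
        slower = sum(1 for t in registry_avg_tats if t > facility_avg_tat)
        frac_slower = slower / len(registry_avg_tats)  # share of registry sites this facility beats
        return min(10, int(frac_slower * 10) + 1)

    # Hypothetical registry of 20 site averages (minutes) and one facility average of 52 minutes
    registry = [35, 42, 48, 55, 60, 65, 70, 75, 80, 85,
                90, 95, 100, 110, 120, 135, 150, 170, 200, 240]
    print(assign_decile(52, registry))  # faster than 17 of 20 sites -> Decile 9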

Note: Click the information tooltip icon in the upper left corner of the report for report details, including how to download the data to PDF reports.