Appendix D — Notifying Administration of Metrics and Data Issues

On September 26, members of the Statistics department contacted several members of the Office of Research and Innovation and of the team reportedly responsible for assembling and analyzing the data used as quantitative measures of department performance during the budget reduction process.


Everyone,

The Statistics department has been preparing for the APC hearing by digging into the data, and we have discovered a problem with implications not only for the budget proposal but also for UNL’s plans to rejoin the AAU. IANR leadership has indicated that any issues with the metrics used for budget cuts should be brought to the attention of ORI (for research) as soon as possible.

We have discovered that the UNL computations based on SRI produce misleading results (e.g. the z-score comparison method). Yesterday, we met with an Academic Analytics analyst and confirmed our suspicions. Ultimately, because SRI is a discipline-specific weighted average of different research factors, creating z-scores from SRI metrics is problematic and destroys the signal available in the data, particularly when those scores are calculated across different disciplines with different weightings. There are alternatives that would allow for cross-department comparisons, and we would be happy to discuss those alternatives.
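The problem described above can be illustrated with a small simulation (hypothetical numbers, not actual SRI data). Because SRI is a discipline-specific weighted average, its spread can differ between disciplines; pooling SRIs and computing campus-wide z-scores then conflates a department's standing within its own field with the scaling of its discipline's index:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical illustration: SRI is a discipline-specific weighted
# average of standardized research factors, so its spread can differ
# between disciplines with different weightings.
sri_a = rng.normal(0, 1.5, 500)  # discipline A: wide SRI distribution
sri_b = rng.normal(0, 0.4, 500)  # discipline B: narrow SRI distribution

pooled = np.concatenate([sri_a, sri_b])

def campus_z(x):
    """Z-score a value against the pooled campus-wide distribution."""
    return (x - pooled.mean()) / pooled.std(ddof=1)

# Two departments, each at the 75th percentile of its own discipline:
z_a = campus_z(np.percentile(sri_a, 75))
z_b = campus_z(np.percentile(sri_b, 75))

# Equally strong relative to their own fields, yet their campus-wide
# z-scores differ purely because of discipline-specific scaling.
print(z_a, z_b)
```

The department in the narrow-spread discipline receives a much lower campus z-score despite identical within-discipline standing, which is the sense in which the z-score conversion destroys the signal in the data.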

In the case of the Statistics department, the SRI numbers from Academic Analytics indicate that we are performing at a level of productivity equivalent to leading departments in Statistics: Iowa State, Michigan State, and Univ Illinois Urbana-Champaign, among others. Obviously, this is a far cry from the z-score SRI provided to the department that indicates that we’re performing poorly relative to other departments on campus.

We would like to meet briefly at some point between now and Tuesday to explain the issue and demonstrate the problem for our department, because this discovery has the potential to inflame an already volatile situation as campus reacts to the proposed cuts. If we have missed anyone who needs to be included in this discussion, please feel free to forward this and to include them in the meeting.

Thanks,


We received a response from someone in ORI (names excluded because they really aren’t necessary here).

Apologies for not getting back to you sooner; I was in meetings all day yesterday and had a university event that ran late into the evening.

The SRI for Statistics when the data snapshot was taken in May using AAU public institutions as a peer group was -0.1. The Z-score that resulted when comparing the Statistics SRI to other UNL departments was +0.549, indicating that on the SRI metric, Statistics is performing positively relative to other departments on campus.

Please let me know if you’re seeing something else in the data which leads to your understanding that the department was performing poorly relative to other departments on campus.

Kind regards,


Well, that’s great, but that wasn’t the issue we raised at all. We specifically identified that SRI z-score calculations were problematic, and that we’d confirmed with Academic Analytics that cross-discipline calculations are not appropriate using SRI (0.4) or custom SRI (-0.1). As we only had data for Statistics, that was all we could use to demonstrate the problem – but the issue really wasn’t that Statistics had low scores so much as that the method was incorrect for the data.


<name>, that’s not the issue we want to discuss. The issue is the way that UNL has used SRI - a z-score is fundamentally inappropriate for this comparison. This has several implications beyond Statistics that I want to make you aware of: there are at least 11 departments that are absolutely hurt by the way the averaging process was performed. I recognize that SRI isn’t the only metric used (but that is itself a concern – there are some other issues with how the research metrics are put together).

For what it’s worth, we confirmed our interpretation of this issue with someone at Academic Analytics before we reached out. So while yes, this has some implications for our department, it has many more implications in terms of how the decisions were made to cut departments overall. If you’re willing to meet at some point today I would be happy to stop by.


Another faculty member responded in kind:

The treatment of SRI is only one of the issues that we found in the analysis. I’d be happy to show what you are missing when dealing with SRI simply as one of the metrics. Grant numbers are severely underreported for Statistics faculty but included in the metrics multiple times. Despite being assured that faculty with secondary appointments would be appropriately included in the evaluation, this has not happened for any of the joint appointees in Statistics. I am also worried about the fallout from excluding the performance of 1/4th of the faculty hired after the cutoff date for Academic Analytics.

I realize that we will not change policies at this stage, but I would like to give you a chance to handle the factual errors before this becomes public knowledge and further damages the university’s reputation beyond the initial proposal to cut Statistics.


Finally, we received a positive response from the Office of Research and Innovation:

<We> would be happy to meet with you on Monday morning. Can it work to schedule 30 minutes sometime between 8-9:30 am? We can meet in 301 Canfield Administration. Let us know if there is a time that can work.

We set a time (8 am, Sept 29) and presented slides (http://srvanderplas.github.io/2025-stat-apc-report/statistics-slides.html) describing the problems we had identified to date.


On October 7, at 10 pm, we received a document from the Office of Research and Innovation. We have reprinted the document here with some information redacted to protect individuals’ privacy.

Response to Statistics

On behalf of the Executive Leadership Team and data analytics team, the following are responses to questions asked by Statistics faculty regarding the UNL budget reduction process and the metrics that were one part of the process.

  • Metrics one aspect of the budget reduction considerations
    Per the UNL budget reduction process website, please note the quantitative metrics approach was combined with other qualitative assessments, such as strength of the program, needs of the state, and workforce alignment. Quantitative metrics are one aspect of consideration.

  • Process and expertise in metrics development
    The metrics analysis part of the UNL budget reduction process was conducted by a team of data analytic professionals, including with graduate-level education and decades of experience working with institutional instructional and research administration data at UNL and other AAU-level institutions. In the metrics development process, feedback was received from UNL campus leaders (Deans, College leadership and Department Executive Officers), as well as the Academic Planning Committee, in Spring 2025. The Academic Planning Committee has also had the opportunity to validate analyses in Fall 2025. At this late stage in the process, the metrics themselves won’t be changed.

  • Access to data at the faculty level
    As the Chancellor has stated at various points in the process, the detailed source system data underlying the metrics calculations cannot all be released in full, given the unprecedented size and complexity of these data. It is also not appropriate to release individual-level data to those beyond their home program or with individuals not holding a supervisory or administrative role with the faculty member’s department or college. Much of the raw data is available to Department Executive Officers for their unit, such as through NuRamp, Academic Analytics, PeopleSoft, Watermark’s Activity Insights, SAP or HR and financial systems necessary for the operation of a given unit.

  • Academic Analytics Scholarly Research Index (SRI)
    The SRI was generated for each academic program relative to other AAU public institutions. Importantly, the set of reference institutions for the budget exercise was decidedly other AAU public institutions, an aspirational peer group. This is not the same as the default in Academic Analytics, which is all like programs across institutions of higher education captured in Academic Analytics. The chart distributed at the Board of Regents meeting was SRI relative to all institutions of higher education.
    The set of SRI scores across UNL academic programs was converted to Z-scores, as was the case for the other 17 instructional and research metrics included in the budget reduction process. While it is understood that the process of converting to Z-scores does not retain the interpretability of the original SRI for a given program, in terms of where it stands relative to like programs, it does retain the ordering across UNL programs (i.e., those with the highest SRIs relative to like programs will retain the highest Z-scores for this metric). This is a valid use of these data for the specific purpose of the UNL budget reduction metric analyses.
    There was a suggestion to consider the SRI percentile rather than index score. While we cannot change the overall metrics at this late stage in the process, we did reanalyze the research metrics replacing SRI with SRI percentile using the AAU public institutions as the aspirational peer group. There is no significant change to the ranked quantitative assessment of programs when using SRI percentile rather than SRI, and there is no change to the departments that ranked in the bottom tier using the quantitative assessment.

  • <stat department member’s> appointment
    SAP is the official HR data system for the University of Nebraska, and the official record leveraged to generate faculty appointment data for the purposes of the UNL budget reduction process. As has been pointed out, <stat department member’s> appointment in that system has not accounted for a continued appointment in Statistics, along with <their> appointments in <unit 2> and <unit 3>. The IANR HR team has been made aware of this error and is correcting it. In response to Statistics Department concern about this matter, we have reviewed the department research calculations. While we cannot change the overall metrics at this late stage in the process, the changes to the research Z-score would have been .001 lower had <stat department member’s> appointment in SAP reflected a .2 FTE appointment in Statistics. Additionally, if the authorship on the <book>, had been split between <stat department member’s> and <stat department member’s>, the research Z-score would have been .01 lower.

  • InCites access
We have confirmed that the University Libraries does not subscribe to InCites or any similar tool.


A point-by-point response:

Per the UNL budget reduction process website, please note the quantitative metrics approach was combined with other qualitative assessments, such as strength of the program, needs of the state, and workforce alignment. Quantitative metrics are one aspect of consideration.

The metrics analysis part of the UNL budget reduction process was conducted by a team of data analytic professionals, including with graduate-level education and decades of experience working with institutional instructional and research administration data at UNL and other AAU-level institutions.

In the metrics development process, feedback was received from UNL campus leaders (Deans, College leadership and Department Executive Officers), as well as the Academic Planning Committee, in Spring 2025.

The Academic Planning Committee has also had the opportunity to validate analyses in Fall 2025.

At this late stage in the process, the metrics themselves won’t be changed.

Information used in the reallocation and reduction process must be made available to the budget planning participants and affected programs in a timely manner so that corrections and explanations can be made before it is released to the public.

As the Chancellor has stated at various points in the process, the detailed source system data underlying the metrics calculations cannot all be released in full, given the unprecedented size and complexity of these data.

It is also not appropriate to release individual-level data to those beyond their home program or with individuals not holding a supervisory or administrative role with the faculty member’s department or college.

Much of the raw data is available to Department Executive Officers for their unit, such as through NuRamp, Academic Analytics, PeopleSoft, Watermark’s Activity Insights, SAP or HR and financial systems necessary for the operation of a given unit.

The SRI was generated for each academic program relative to other AAU public institutions. Importantly, the set of reference institutions for the budget exercise was decidedly other AAU public institutions, an aspirational peer group. This is not the same as the default in Academic Analytics, which is all like programs across institutions of higher education captured in Academic Analytics. The chart distributed at the Board of Regents meeting was SRI relative to all institutions of higher education.

Figure D.1: Dot plot of the importance of a unit/discipline for AAU compared to all universities.

The set of SRI scores across UNL academic programs was converted to Z-scores, as was the case for the other 17 instructional and research metrics included in the budget reduction process. While it is understood that the process of converting to Z scores does not retain the interpretability of the original SRI for a given program, in terms of where it stands relative to like programs, it does retain the ordering across UNL programs (i.e., those with the highest SRIs relative to like programs will retain the highest Z-scores for this metric).

This is a valid use of these data for the specific purpose of the UNL budget reduction metric analyses.
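The response is correct that z-scoring preserves within-campus ordering: a z-score over the pooled set is a monotone affine transform. What it does not preserve is each program's interpretable standing relative to its own peer group, which is the concern we raised. A minimal sketch, using made-up SRI values (the -0.1 echoes the Statistics SRI from the May snapshot):

```python
import numpy as np

# Made-up SRI values for five hypothetical programs; -0.1 echoes the
# Statistics SRI reported for the May snapshot.
sri = np.array([0.7, -0.1, -0.5, -1.2, -0.9])

# Z-scoring the pooled set is an affine transform with positive slope,
# so the within-campus ordering is preserved exactly.
z = (sri - sri.mean()) / sri.std(ddof=1)
assert (np.argsort(sri) == np.argsort(z)).all()

# But the transform discards each program's standing relative to its
# own discipline peers: an SRI of -0.1 (slightly below the peer mean)
# maps to a positive campus-wide z-score, inviting misreading.
print(z[1])
```

Order preservation is thus a much weaker property than the response implies: it holds for any increasing transform, and says nothing about whether raw SRIs from differently weighted disciplines were comparable in the first place.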

There was a suggestion to consider the SRI percentile rather than index score. While we cannot change the overall metrics at this late stage in the process, we did re-analyze the research metrics replacing SRI with SRI percentile using the AAU public institutions as the aspirational peer group. There is no significant change to the ranked quantitative assessment of programs when using SRI percentile rather than SRI, and there is no change to the departments that ranked in the bottom tier using the quantitative assessment.

SAP is the official HR data system for the University of Nebraska, and the official record leveraged to generate faculty appointment data for the purposes of the UNL budget reduction process. As has been pointed out, <stat department member’s> appointment in that system has not accounted for a continued appointment in Statistics, along with <their> appointments in <unit2> and the <unit3>. The IANR HR team has been made aware of this error and is correcting it.

In certain disciplines, especially the arts and humanities, there are forms of faculty scholarly activity that are not captured in the Academic Analytics database. These include residencies, exhibitions, and performances, as well as the research underpinning these activities.

When indices of research activity are employed, the components are weighted appropriately using discipline-specific measurements derived from nationally recognized sources.

We have confirmed that the University Libraries does not subscribe to InCites or any similar tool.

Figure D.2: Web of science database access from UNL Library.

  1. Presumably, with some error bars relative to the size of the comparison group, but again, we cannot access the data needed to estimate this.↩︎