The Integrated Institutional Effectiveness Model – Evidence and Improvement
Among the greatest resources available to an institution of higher education is information. This can and must include information about student learning and student success; however, simply gathering and inventorying data and information does not create an environment primed for continuous improvement. At its core, institutional effectiveness is about institutional improvement, and the integrated institutional effectiveness philosophy, model, and approach, as communicated by AHEE, is essential to enhancing student success. Within the model, which encompasses accountability, direction, and evidence and improvement, it is this third area that the institutional effectiveness professional most directly shapes on a day-to-day basis.
Despite the plethora of opinion editorials and articles haranguing the use of assessment of student learning within higher education, multiple occasional papers from the National Institute for Learning Outcomes Assessment (NILOA), as well as numerous articles and books (e.g., Kuh, Ikenberry, Jankowski, Cain, Ewell, Hutchings, & Kinzie, 2015), affirm the value of high-quality, faculty-led assessment of student learning. In fact, there are numerous examples of academic programs conducting rigorous assessment, as well as a growing focus on psychometrics and other rigorous methodological techniques to increase the validity of assessments. Where institutions run into trouble with the assessment of student learning, however, tends to be in three areas:
- Failing to link course-level assessment to program learning outcomes,
- Ineffective assessment of general education, and
- A lack of effective assessment of AES units
In each case, an integrated institutional effectiveness approach enhances the relevance and usability of assessment results. Regarding the link between course-level assessment and program learning outcomes, many departments lack the personnel or structures (e.g., curriculum mapping, assessment software) to align course assessment with program assessment. An integrated IE office, however, provides the support and leadership necessary to analyze and use results for improving student learning: IR staff can support data gathering, and assessment staff can assist with research methodology, maintain an assessment calendar and system, and meet with departments and assessment committees/liaisons. Maintaining similar relationships and responsibilities with general education committees/liaisons and department chairs can likewise strengthen general education assessment. These intentional relationships between academic departments and an IIE office ensure that assessments are conducted within a framework that meets accountability standards and aligns with planning efforts.
An integrated institutional effectiveness system can be especially helpful with the assessment of Administrative, Educational, and Student Support (AES) units. In many colleges, the assessment of student learning outside the classroom, of indirect support for student learning, and of the environment that supports student learning can remain of secondary importance. By establishing an institutional assessment/effectiveness plan that integrates all assessment, thereby communicating the importance of assessment outside the classroom, and by providing support and leadership from an office charged with institutional responsibility for assessment, AES units are better positioned to engage in the assessment of student learning and support outcomes, both of which are critical to understanding student success. For more information on how to support AES assessment, including a taxonomy for support outcomes assessment, feel free to examine the Borough of Manhattan Community College’s Institutional Effectiveness and Analytics Assessment page. BMCC operates with an IIE approach; accordingly, AES assessment is explicitly and directly linked to the college’s strategic planning efforts, serves as the foundational element of operational planning, and is part of a recognized model that exceeds regional accreditation requirements.
Assessment and evaluation are intentionally treated as distinct concepts by AHEE and within our integrated institutional effectiveness model. Assessment, at its core, is about using information to make measured gains and demonstrate improvement. Evaluation, by contrast, reflects an objective judgment of effectiveness. It is also important to note that assessment is most effective when conducted systematically and continually, while evaluation is most effective when conducted systematically, at regular intervals, and informed by previous assessments. There are two areas of evaluation where an integrated effectiveness approach supports improved student outcomes:
- The Academic Program Review and AES Unit Review Processes, and
- The operationalization of grants
While one review process is specific to academic departments and the other to AES units, both are most effective when previous annual assessment results inform and guide the discussion and review. Through an integrated IE approach, programs and units are placed on a calendar, given a flexible template for the process, and provided clear, concise guidelines. A comprehensive review includes a set of data to examine (with dashboards and analytics where possible), an assessment report detailing how assessment results have been used for improvement, and the organization of internal committees, SWOT/environmental scans/focus groups, and external evaluators. An integrated office able to provide support in each of these areas greatly benefits units and departments in developing action plans that lead to sustained improvement of learning and support outcomes. Examples of assessment guidelines can also be found on the BMCC IEA Assessment page. As was the case with assessment, evaluation is more impactful when maintained as part of an IIE model. The result of the evaluation process is a plan for improvement and future assessments in the program or unit. When an office tasked with IIE supports and provides leadership over these efforts, the plans are integrated into and aligned with institutional planning efforts and meet the integration-of-planning expectations associated with regional accreditation standards.
As higher education budgets have been reduced over the past decades, colleges and universities have been forced to rely on institutional grants to pilot programs designed to reduce achievement gaps and increase overall student success. Difficulties often emerge, however, when a grant ends and difficult financial decisions must be made about transitioning processes, offices, practices, and personnel. For IE professionals tasked with providing information and supporting the evaluation of these grants, it can be difficult to offer institutional guidance on operationalizing grant activities. Within an IIE model, however, grant objectives are intentionally aligned with institutional goals and strategic objectives, and IE professionals holding institutional leadership positions are charged with providing guidance to the leadership team. This alignment of reporting and planning positions the institution to scale up successful elements of the grant and ensures that accreditation criteria have been accounted for.
When considering an integrated approach to institutional research, the focus shifts from one premised primarily on data gathering, reporting, and compliance to one of leadership and support for improvement at the institutional, academic program, and AES levels. The IE professional maintains responsibility for a system designed to continuously question and assess progress through innovative presentation of data, use of dashboards and graphical representations of information, and the use of analytics to support planning and decision making. Rather than simply providing data, these individuals are the information storytellers and interpreters responsible for enhancing effectiveness and student success through the targeted, intentional use of evidence to guide direction setting and decision making. Two deliverables both common to and useful for evidence-based, information-influenced strategic and operational activities include:

- Dashboards, and
- Scorecards
While dashboards are by no means new to higher education, the capabilities associated with greater integration of data systems and data mining have led to the ability to develop robust user interfaces that provide greater amounts of self-service data than previously possible. It is my contention that the most innovative and effective colleges are those where questions are welcomed; faculty, staff, and administrators are curious; and major decisions occur within the context of thorough data analyses and accurate interpretation. By providing drop-down menus with significant amounts of customizable data regarding student success, faculty, staff, and administrators are able to gather the data they deem relevant. Access can also be limited to specific groups when data is necessary and regularly requested but more sensitive in nature. It is the responsibility of the IE professional to work with the college community to determine which information is most impactful and to ensure its integrity and reliability. It is also the responsibility of the IE professional, within an IIE model, to provide guidance on interpreting this information and to work with institutional leaders to use it in a manner that drives innovation and effectiveness. At BMCC we offer what we refer to as Tables for Chairs to meet specific data requests at the department level, but we also maintain student success dashboards on our public website. By working with departments and units on assessments and program/unit reviews while also reviewing our inventory of data requests and external reporting requirements, we have developed a robust dashboard process that provides the college community with information regarding student success. We also use the dashboards during operational planning to provide guidance on targeted interventions based on institutional priorities.
While the Balanced Scorecard has been used by numerous colleges, often supported by data from the IR office, the scorecards I am referring to are institutional effectiveness and mission-based report cards developed in collaboration with IE professionals and for which the IE office holds institutional responsibility. Colleges and universities utilizing an IIE approach place leadership, responsibility, and accountability for the final development, reporting, and analysis with the IE professional. In these settings, the IE professional works with the community to determine which metrics hold the most weight and works with units to ensure they are pursuing both direction-setting and evidence-and-improvement activities aligned with these metrics. During the reporting phase, results from the scorecards are presented by the IE professional to document areas where the institution can leverage strengths, mitigate weaknesses, and address shortcomings. At BMCC, in addition to the institutional effectiveness scorecard, the Office of Institutional Effectiveness and Analytics also produces a student success scorecard, a foundational document for yearly operational planning at the institutional and unit levels. The metrics within this card were shaped by institutional discussions, especially those of the college-wide, representative Equity and Inclusion Task Force. The information gathered in each of these institutional reports is used institutionally and within divisions to drive annual goal setting, guide assessments, and target grant activity.
Christopher Shults is Dean of Institutional Effectiveness and Strategic Planning and MSCHE Accreditation Liaison at the Borough of Manhattan Community College and Vice President of the AHEE Board. He has served on/led self-studies in three regions, authored/coauthored numerous publications and presentations on effectiveness, assessment, and leadership, provided regional accreditation workshops, and is currently working on a book designed to help community colleges build student focused business models.