Webster’s ongoing evaluation and assessment processes provide reliable evidence of institutional effectiveness that clearly informs strategies for continuous improvement.
Webster University has a decentralized yet coordinated evaluation and assessment process that has proven effective for the University’s continuous improvement.
The worldwide network relies upon the Jenzabar CX integrated database. Each year a snapshot of institutional data is captured by the Office of the Registrar and the Institutional Research Office (IRO) to provide a current state-of-affairs picture of the University. The University Registrar provides an opening semester headcount compared to the previous year. The IRO creates the University fact book Sum & Substance. An overview of the IRO project inventory appears in the exhibit below.
[EXHIBIT: HLC2c.1 Fall ‘07 Opening Enrollment, HLC0.3 Sum and Substance, HLC2c.2 IRO Project Inventory]
For the many customized reports needed for management decision-making, a core group of expert users trained in COGNOS report writing, representing each vice presidential unit, together form an institutional research group. This loosely formed group routinely checks assumptions, facts, definitions, and report outcomes with one another, helping to ensure uniformity in institutional information.
Worldwide enrollment trends and accompanying tuition dollars are closely monitored and evaluated by the Office of Enrollment Management, the Finance Office, and the Academic Affairs Office. Term-by-term enrollment and fiscal data are assembled and analyzed for current year-to-date performance, compared both to budget and to the prior year’s actual performance. This information is sorted and viewed from numerous angles to isolate instances where a campus may be underperforming relative to expectations or to previous years.
Additionally, extended campus five-year enrollment profiles are prepared and updated on an ongoing basis to assess longer-term trends. The annual data and the five-year trend data are used to assess program and other operational performance and may lead to adjustments in program offerings, site personnel, or other resource commitments.
[EXHIBIT: HLC2c.3 Extended Campus Enrollment Profile]
Data to monitor retention of the full-time undergraduate population are provided after the second semester and between the freshman and sophomore years, resulting in a yearly retention report from the Dean of Students. In response to this information, the University has formed a Student Success Committee to flag students at risk of dropping out, increased staffing in the Academic Resource Center, adopted proactive advising strategies, and improved student services.
[EXHIBIT: HLC2c.4 Retention]
The Academic Affairs Office, Students and Enrollment Management, Finance, and Institutional Research offices collaborate to identify areas of critical need and to plan for full-time faculty growth.
Three times per year, data collected by this group are synthesized into reports for the University’s Board of Trustees.
[EXHIBIT: HLC2c.5 Board of Trustees Report]
Graduation rates are monitored yearly and compared to past rates to determine whether further action is needed.
The Office of Assessment issued the Assessment Excellence 2006 report, which documents the learning outcomes for all programs in the schools and colleges and establishes the metrics by which academic programs are measured in terms of student outcomes. This is an internal report under the Academic Affairs division.
Web site: [EXHIBIT: HLC2c.6 Assessment Excellence 2006, http://www.webster.edu/academics/assessment/excellence.shtml]
The University also engages in ongoing assessment through outside consultants and external reviews. Some examples are:
- Engagement of the consulting firm Hardwick Day to help monitor net revenue and tuition discounting, which has resulted in increased revenue per student along with a larger freshman class.
[EXHIBIT: HLC2c.7 Hardwick Day]
- Engagement of the consulting firm The Lawlor Group for advice on restructuring the Office of Marketing for greater effectiveness and staff integration. This resulted in a reorganization that has improved staff collaboration, output, and marketing planning.
[EXHIBIT: HLC2b.4 Lawlor]
- Five-year departmental reviews based on the CAS standards, instituted for all Student Affairs units in 2006. Reviews have been completed for Health Services, Student Employment, and the University Center and Student Activities. Housing and Residential Life and the Webster Village Apartments utilize annual Quality of Life surveys to get feedback from residents. Campus Dining has used the NACUFS Customer Satisfaction Benchmarking Survey annually since spring 2006. Student Affairs staff attended a full-day retreat on assessment in October 2004 and developed guidelines for assessing student learning outcomes.
[EXHIBIT: HLC2c.8 Student Life Assessment]
- Military Installation Voluntary Education Review (MIVER) – ACE conducts reviews at our military locations.
[EXHIBIT: HLC0.17 Summary of MIVER Reviews]
- State licensures – Certain states in which Webster operates require periodic reports and/or reviews. These requirements vary from state to state, ranging from annual reports and unannounced visits to more formal five-year reviews.
[EXHIBIT: HLC0.14, HLC0.15 State Approval and Licensure]
- In 2006-07 Webster secured NCATE accreditation in the School of Education.
[EXHIBIT: HLC3a NCATE]
- In 2007, Webster is seeking ACBSP accreditation in the School of Business & Technology and undertaking another NLN review.
[EXHIBIT: HLC3a ACBSP, HLC3a NLN]
Webster’s decentralized and loosely coupled process for assessment and continuous quality improvement has proven effective for Webster and its culture of consensus.
It has built a cadre of experts dispersed across departments and readily available to their constituents. Expert users embedded within functional units develop a deep understanding of nuanced data, providing richer interpretation of sometimes complex reports. However, this model poses challenges to the institution for the following reasons.
- Lack of Information Architecture – Webster has a sophisticated, complex, and integrated data architecture in ERP terms. However, we lack the reporting and analytic tools needed to pull important information from the data. Further, even with such tools, the University has not agreed upon the big-picture (dashboard) reports needed to measure success, from which other more detailed reports would derive. Currently, analysis can be duplicative, ad hoc, and slightly different depending on the need and perspective of the requesting unit.
- Use of Live Data – One office can report information that does not jibe with that of another, simply because live data was accessed at different times. It has been difficult to agree on “frozen” data snapshots from which to make decisions.
- Lack of Clear Definition of Terms – Different units sometimes apply different assumptions about the definition of a data element.
- Communication Challenges – Coordinating and communicating properly across units requires continuing effort.
Webster realizes it needs to reassess its institutional research model. As the University has grown in size and complexity, centralizing this function makes more and more sense, and reviewing the institutional research model is a high priority for the future. We are currently in the process of purchasing an updated set of reporting and analysis tools, and we are developing plans for data snapshots or small static data warehouses for reporting purposes, which should address the inconsistency caused by the use of live data.