Comprehensive Standard: The institution identifies expected outcomes for its educational programs (including student learning outcomes for educational programs) and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.
X Compliance
Partial Compliance
Non-compliance
Georgia State University is committed to providing educational programs and administrative and educational support services that set high standards, to assessing whether the desired outcomes have been achieved, and to using the results of those assessments to improve. The University uses parallel processes for academic and administrative units, requiring each to identify desired outcomes, assess whether targets were met, and develop action plans to improve future performance.
Each educational major has developed an assessment plan specifying what students must know and be able to do in order to graduate. The pilot versions of the assessment plans are available online [1]. Assessment plans and results for the 2005-2006 and 2006-2007 assessment cycles are available on the WEAVEonline site, the assessment management program used by the University [2]. Academic programs are encouraged to use direct assessments of learning and to incorporate effective assessment practices, especially course-embedded assessments.
For the 2005-2006 and 2006-2007 academic years, academic programs reported the following:
For the 2005-2006 assessment cycle, all 170 academic programs posted their mission statements, learning outcomes, and measures for assessing the outcomes. All programs but one posted findings from their assessments, and 93% of the programs included action plans that described steps they would take to improve. As of July 13, 2007, data for the 171 academic programs in the 2006-2007 assessment cycle showed that, in addition to all programs having mission statements, learning outcomes, and measures, 94% had posted assessment findings. It is anticipated that this number will be close to 100% by September 1.
There is evidence that programs move to more effective strategies for assessing student learning as a result of their annual efforts. In the most recent reports, 35 programs mentioned using portfolios as part of their measures of student learning (more than 150 individual references), up from the 32 programs that reported using portfolios in 2005-2006.
All assessment reports may be examined at the WEAVEonline web site [2], and an example of an assessment report is provided to illustrate the format (Geology BS) [3]. Please note that reports for the bachelor's degree program in the Robinson College of Business consist of two parts: the generic college core outcomes and the program-specific outcomes.
There are a number of examples of actions taken by programs as a result of the assessment process, departmental discussions, and feedback provided by the Office of the Associate Provost for Institutional Effectiveness. The following action plans were implemented as a result of this process:
Administrative Programs
A parallel process has been developed to set standards, assess progress, and develop action plans for administrative and student support units. Assessment plans and results for the 2005-2006 and 2006-2007 assessment cycles are available on the WEAVEonline site, the assessment management program used by the University [2]. While the academic process for assessing student learning outcomes served as a model, the assessment paradigm developed for the administrative and support units accommodates the disparate responsibilities of those units. Although most of these units do not have student learning outcomes, the underlying tenet (that an outcome reflects what the student learns, not what was taught) was mirrored in defining an administrative outcome as the impact the unit has on its customers, constituents, or the institution (as distinct from a strategic goal or objective).
The 90 administrative and student support units, spread across six divisions of the institution, developed initial assessment plans in the spring of 2004. These plans identified expected outcomes, measures for those outcomes, targets for success on each measure, and plans for collecting and utilizing performance data. The units received feedback from the Associate Provost for Institutional Effectiveness before they began data collection at the start of the fiscal year (July 1, 2004).
The first round of reporting (for fiscal year 2005) took place between July 1 and September 1, 2005. At this point, the units described and interpreted their results and indicated how the information had been or would be used to improve performance. Again, all units received feedback on their reporting efforts. For the second round of reporting (for fiscal year 2006), the University converted to the online system (WEAVEonline) described above, and the reports were completed between July 1 and September 2006.
The breadth of faculty and staff participation in the assessment process is suggested by the fact that nearly 250 different individuals have logged on to the assessment reporting website to enter or review the information for their units. Many other individuals collected and reviewed their units' assessment information before it was posted.
A number of improvements in administrative programs can be cited as a result of setting outcomes, assessing performance, and developing action plans. The following are examples: