The institution
identifies expected outcomes, assesses the extent to which it achieves these
outcomes, and provides evidence of seeking improvement based on analysis of the
results.
The following
preface introduces sections 8.2.a, 8.2.b, and 8.2.c.
UL Lafayette has
established and maintains a systematic, comprehensive, and effective process by
which outcomes are identified, assessed, and analyzed leading to continuous
improvement efforts. The University continues to set goals and evaluate results
in order to improve educational programs, general education, and academic and
student support services.
Within the past
decade, UL Lafayette has made significant strides in formalizing assessment
practices. From 2010 through 2014, assessment was overseen by the Assistant
Vice President for Institutional Planning and Effectiveness (Office of Academic
Affairs). Responding to the growing assessment needs across campus, the
University created the Office of Institutional Assessment, and a Director of
that new office (reporting to the Assistant Vice President for Academic Affairs
– Academic Resources) was named.
Since January
2015, the Office of Institutional Assessment has worked with academic programs
and administrative departments to support and guide the University’s
institutional assessment efforts through the collection, analysis, and
distribution of data. The Office of Institutional Assessment promotes ongoing
and systematic assessment processes and best practices by:
· evaluating and sharing external survey data with University departments and divisions in order to enhance their overall assessment portfolio;
· reviewing assessment plans and providing feedback to all academic and non-academic entities; and
· promoting a consistent dialogue across campus regarding assessment.
Since 2010, a council composed of faculty and administrators has
guided University policy on assessment. The charge of the current University
Assessment Council (UAC) is to support the ongoing process of systematic planning,
evaluation, and continuous improvement across campus through a research-based,
integrated, and institution-wide approach (Sample Assessment Council Agendas, Minutes and Presentations). The UAC is
chaired by the Director of Institutional Assessment and includes two Assistant
Vice Presidents for Academic Affairs (ex-officio members); eight academic
Deans, Associate Deans, or faculty college representatives; and eight
additional administrators and professional staff representing Student Affairs,
Academic Affairs, Enrollment Management, Administration and Finance, and
Advancement. In 2015, Assessment Liaisons were identified within each academic
college (eight) and vice-presidential area (five) to communicate assessment
information from the Office of Institutional Assessment to the assessment
coordinators tasked with managing the assessment plans of each academic program
and administrative unit. Beginning in 2018-2019, each Assessment Liaison also
serves on the UAC.
Through a
consistent and systematic process, all academic programs and administrative
departments track goals, measures, criteria for success, results, improvements,
and reflections on an annual cycle.
The practice of
assessment allows departments to reflect on their missions and adopt changes to
ensure alignment. The annual assessment cycle follows the academic calendar,
beginning and ending in mid-Fall, and provides a structure to the assessment
process. The assessment cycle occurs in three stages:
1) Start of the Assessment Cycle (early Fall): Academic programs and
administrative departments are responsible for:
· Reviewing (or establishing) the department (or program) mission, vision, and values (as applicable);
· Affirming that the mission aligns to the University’s mission and, if applicable, any external accreditation agencies;
· Defining the goals, measurements, and criteria of success for that cycle; that is, What do you want to do? and How will you know you were successful?;
· Ensuring that any action plans or unmet goals from previous cycles have been addressed or updated in the current cycle;
· Entering the unit mission, goals, objectives, criteria, and assessment narrative into LiveText; and
· Aligning goals to the University’s strategic plan or accreditation board standards, as applicable.
2) Middle of the Assessment Cycle (Fall through Spring): Academic
programs and administrative departments are responsible for:
· Conducting the assessments that have been established in the assessment plan;
· Tracking results/entering findings, and securing additional documentation (if necessary);
· Discussing preliminary results and possible implementation plans;
· Communicating any dates and planning any meetings for the “End of Cycle” discussions about findings and implementation plans; and
· Reflecting on and discussing findings and possible improvements within the department.
3) End of the Assessment Cycle (late Spring/Summer into early Fall):
Academic programs and administrative departments are responsible for:
· Reviewing all findings that have been submitted by the Assessment Coordinator, and recommending implementation plans for those goals that were not met; that is, now is the time to answer the question: How did we do?;
· Entering all findings, implementation plans, and reflections in LiveText; and
· Identifying which goals, measures, and criteria may need to change for the following cycle.
At the start of
each assessment cycle, the Office of Institutional Assessment creates and
distributes an Assessment Handbook (2016-17, 2017-18, 2018-19). Additionally, the Director meets
individually with new Assessment Coordinators throughout the year to provide
guidance on the assessment cycle, best practices, and timelines.
Over the past
decade, the University has utilized two assessment platforms: WEAVEonline
(2010-2015) and Assessment Insight System (AIS) by LiveText (2016-present).
Both platforms provide consistency to the annual assessment reporting process.
In WEAVEonline, academic programs and administrative departments tracked
student learning and program outcomes, measures, results, and action plans.
Similarly, in LiveText’s AIS, programs and departments record mission
statements (aligned to the University’s mission statement), assessment plans
(including goals, measures, criteria for success), assessment reports (findings
and improvement types), and reflections. In the 2017-2018 assessment cycle, an
additional set of questions was added to the assessment plan. These questions
prompted departments to reflect on assessment strategies, past improvement
attempts, and ongoing assessment needs:
1. What strategies exist to assess the outcomes?
2. What does the program/department expect to
achieve with the goals and objectives identified above?
3. How might prior or current initiatives
(improvements) influence the anticipated outcomes this year?
4. What is the plan for using data to improve
student learning and/or operations?
5. How will data be shared within the
program/department (and, where appropriate, the College/VP area)?
While the majority of academic programs and
administrative departments use the annual “Assessment Cycle” template, some
have opted for customized templates aligned to national accrediting boards
(such as the academic programs within the College of Engineering) or nationally
accepted best practices (such as Student Affairs and University Advancement).
In those cases, all sub-units assess the same unit-wide goals but customize
metrics and assessment tools. In all cases, the annual cycle of assessment is
followed.
In 2015, the UAC reviewed the assessment plans of academic and non-academic units across the
University, and met with each
unit to discuss results and improvements. In 2016, when LiveText’s AIS
was implemented, a set of rubrics for evaluating assessments was incorporated
into the platform and into the assessment handbooks (2016-17, 2017-18, 2018-19). At UAC
meetings, the Council reviews compliance of submitted assessment reports by
college and VP area; assessment liaisons and departmental assessment
coordinators are charged with overseeing the quality of individual assessments.
Now that LiveText’s AIS is fully implemented across the University, a more centralized quality audit is in
development.
8.2.a Student Outcomes: Educational Programs
The institution identifies expected outcomes, assesses the extent
to which it achieves these outcomes, and provides evidence of seeking improvement
based on analysis of the results in the areas below: a) Student learning
outcomes for each of its educational programs.
[X] Compliance   [ ] Non-Compliance   [ ] Partial Compliance
UL Lafayette has established and maintains a systematic,
comprehensive, and effective process by which student learning outcomes are
identified, assessed, and analyzed leading to continuous improvement efforts.
Evidence of institution-wide assessment infrastructure, governance, cycle, and
review is provided in the Assessment Preface in Section 8.2.
Since 2009-2010, the University’s academic programs have
consistently participated in the annual assessment process of establishing
goals and reviewing results to improve student learning and program outcomes.
Table 8.2.a – 1 shows that in the three most recent assessment cycles
(2015-2016, 2016-2017, and 2017-2018), nearly all academic programs entered
Assessment Plan Elements, Assessment Report Elements, and Reflections. The
Office of Institutional Assessment continues to work with Assessment Liaisons
to share information on best practices related to assessment plans and
reporting and aims to obtain 100% participation throughout the assessment
cycle.
Table 8.2.a – 1: Completion by
Academic Units over Three Assessment Cycles
| | 2015-2016 | 2016-2017 | 2017-2018 |
| Total academic entities | 101 | 100 | 100 |
| Assessment Plan Elements (2015-16: Outcomes/Measures/Targets; 2016-17: Goals/Measures/Criteria; 2017-18: Goals/Measures/Criteria/Assessment Narratives) | 101 | 100 (100%) | 100 (100%) |
| Assessment Report Elements (2015-16: Findings/Action Plans; 2016-18: Findings/Improvement Narratives) | 95 (94.06%) | 94 (94.00%) | 91 (91.00%) |
| Reflections (2015-16: Achievement Summary; 2016-18: Reflection) | 101 (100%) | 87 (87.00%) | 93 (93.00%) |
Through its academic colleges and departments, the University
identifies, assesses, and improves its student learning outcomes for each
academic program. Assessment reports for the nearly 100 academic programs
(including University College and some centers in the Colleges of Liberal Arts
and Business) are available in LiveText’s AIS for assessment cycles
2015-present; archived assessment reports generated from WEAVEonline for
assessment cycles 2009-2015 are available upon request from the Office of
Institutional Assessment. Table 8.2.a – 2 provides direct access to each
assessment report by academic program. To illustrate examples of student
learning assessment, summaries from selected academic programs are provided
after the table. The summary samples represent approximately 25% of the
academic programs from each college, and represent all degree levels (bachelor,
master’s, and doctoral), as well as traditional and online course deliveries.
Table 8.2.a – 2: Assessment Reports by
Academic Units over Three Assessment Cycles
| Academic Programs by College | WEAVEonline | LiveText’s AIS | |
| | 2015-2016 | 2016-2017 | 2017-2018 |
(• = assessment report available for that cycle; -- = no report for that cycle)
| College of the Arts | | | |
| Architectural Studies BS | • | • | • |
| Architecture M in Arch | • | • | • |
| Industrial Design BID | • | • | • |
| Institute for Traditional Music | • | • | • |
| Interior Design BID | • | • | • |
| Music BM | • | • | • |
| Music M in Music | • | • | • |
| Performing Arts BFA | • | • | • |
| Visual Arts BFA | • | • | • |
| College of Business Administration | | | |
| Accounting BSBA | • | • | • |
| Accounting MS | • | • | • |
| Economics BSBA | • | • | • |
| Finance BSBA | • | • | • |
| Hospitality Management BSBA | • | • | • |
| Insurance and Risk Management BSBA | • | • | • |
| Management BSBA | • | • | • |
| Marketing BSBA | • | • | • |
| MBA | • | • | • |
| MBA / Health Care Administration | • | • | • |
| Professional Land and Resource Management BSBA | • | • | • |
| Small Business Development Center | • | • | • |
| College of Education | | | |
| Athletic Training BS | • | • | • |
| Center for Gifted Education | • | • | • |
| Counselor Education MS | • | • | • |
| Curriculum and Instruction BS | • | • | • |
| Curriculum and Instruction MEd | • | • | • |
| Education of the Gifted MEd | • | • | • |
| Educational Leadership EdD | • | • | • |
| Educational Leadership MEd | • | • | • |
| Exercise Science BS | • | • | • |
| Health and Physical Education BS | • | • | • |
| Health Promotion and Wellness BS Online | • | • | • |
| Kinesiology MS | • | • | • |
| Sport Management BS | • | • | • |
| College of Engineering | | | |
| Chemical Engineering BS | • | • | • |
| Chemical Engineering MS | • | • | • |
| Civil Engineering BS | • | • | • |
| Civil Engineering MS | • | • | • |
| Electrical and Computer Engineering BS | • | • | • |
| Electrical Engineering MS (Note: Program created in 2013-14; began assessment in 2016-17) | -- | • | • |
| Industrial Technology BS | • | • | • |
| Mechanical Engineering BS | • | • | • |
| Mechanical Engineering MS | • | • | • |
| Petroleum Engineering BS | • | • | • |
| Petroleum Engineering MSE | • | • | • |
| Systems Engineering PhD | • | • | • |
| Systems Technology MS | • | • | • |
| College of Liberal Arts | | | |
| Anthropology BA | • | • | • |
| Applied Language and Speech Sciences PhD | • | • | • |
| Center for Louisiana Studies | • | • | • |
| Child and Family Studies BS | • | • | • |
| Communication MS | • | • | • |
| Criminal Justice BS | • | • | • |
| Criminal Justice MS | • | • | • |
| Early Childhood Studies Lab | -- | -- | • |
| English BA | • | • | • |
| English MA | • | • | • |
| English PhD | • | • | • |
| Francophone Studies PhD | • | • | • |
| French MA | • | • | • |
| History BA | • | • | • |
| History MA | • | • | • |
| Mass Communication BA-Broadcasting | • | • | • |
| Mass Communication BA-Journalism | • | • | • |
| Modern Language BA | • | • | • |
| Moving Image Arts BA | • | • | • |
| Political Science BA | • | • | • |
| Professional Writing Graduate Certificate (Note: Program created 2014-2015; assessment began in 2016-17) | -- | • | • |
| Psychology BS | • | • | • |
| Psychology MS | • | • | • |
| Sociology BA | • | • | • |
| Speech Pathology and Audiology BA | • | • | • |
| Speech Pathology and Audiology MS | • | • | • |
| Strategic Communication BA-Advertising (formerly Mass Communication BA-Media Advertising) | • | • | • |
| Strategic Communication BA-Organizational Communication (formerly Organizational Communication BA) | • | • | • |
| Strategic Communication BA-Public Relations (formerly Public Relations BA) | • | • | • |
| University of Louisiana Press | • | • | • |
| College of Nursing and Allied Health Professions | | | |
| Dietetics BS (Note: Program discontinued) | -- | -- | • |
| Doctor of Nursing Practice (DNP) | • | • | • |
| Health Information Management BS | • | • | • |
| Health Services Administration BS | • | • | • |
| Nursing BS | • | • | • |
| Nursing MS | • | • | • |
| College of Sciences | | | |
| Biology BS | • | • | • |
| Biology MS | • | • | • |
| Biology PhD | • | • | • |
| Chemistry BS | • | • | • |
| Computer Engineering MS | • | • | • |
| Computer Engineering PhD | • | • | • |
| Computer Science BS | • | • | • |
| Computer Science MS | • | • | • |
| Computer Science PhD | • | • | • |
| Environmental Science BS | • | • | • |
| Geology BS | • | • | • |
| Geology MS | • | • | • |
| Informatics BS | • | • | • |
| Mathematics BS | • | • | • |
| Mathematics MS | • | • | • |
| Mathematics PhD | • | • | • |
| Physics BS | • | • | • |
| Physics MS | • | • | • |
| University College | | | |
| General Studies BGS | • | • | • |
An approximately 25% sampling of assessment summaries from the nine academic
programs in the College of the Arts includes:
· Architecture (BS)
· Architecture (M Arch)
· Industrial Design (BID)
The School of
Architecture and Design offers three nationally accredited degrees (BS in
Architecture, Master of Architecture, and Bachelor of Industrial Design). In the
Fall of 2017, the BS in Architecture and Master of Architecture programs
adjusted their goals and outcomes to align with new National Architecture
Accrediting Board’s Student Performance Criteria. In evaluating student
learning goals, the BS in Architecture adjusted the course content of ARCH 409,
Comprehensive Integrated Design Studio (comprehensive building project studio)
and its alignment with a related course in the curriculum sequence. The ARCH
409 studio now centers the course notebook as a measure of theoretical and
applied research methodologies, allowing for more thorough documentation of
student decision-making during the design process. The sequence of the building
systems courses was adjusted to ensure that ARCH 409 and ARCH 434, Building Systems
II (integrated building practices with emphasis on materials and assemblies,
environmental, structure, envelope, and service systems), are taken
concurrently. These changes allow the department to meet the specific learning
goals and requirements outlined in Realm
C of the NAAB’s conditions
for accreditation. The
results of these changes are being assessed over a three-year (academic) period
and will be reported during the next assessment cycle. The Master of
Architecture program introduced a more structured, “prescriptive” path in
response to assessment results. The structured path limits the variables at
play in the design process by prescribing the site and the program, and it
provides a more focused set of requirements to keep students concentrated
on the core requirements that demonstrate mastery. These changes have been
statistically successful in moving students from non-pass to low-pass, and from
low-pass to pass, allowing for a higher completion rate in the ARCH 599:
Master’s Thesis studio.
Similarly, the
Bachelor of Industrial Design program dramatically updated its goals, outcomes,
and assessment to align with the National Association of Schools of Art and
Design (NASAD) requirements. Preceding the NASAD accreditation visit, the
Industrial Design faculty realized that student outcome assessments no longer
aligned fully with the accrediting body’s standards. The NASAD provides a list
of essential competencies, experiences, and opportunities outlining what a
student should know in the Industrial Design field upon graduation. That list
became the primary source for devising the five current goals and outcomes for
the program. In evaluating student-learning goals, the faculty rewrote for
clarity the objectives of INDN 499: Senior Project, with the goal of improving
students’ verbal communication and presentations. The senior project
presentations, both verbal and visual, improved in three key ways. To improve
visual presentations, the program began, in the second year and continuing
throughout the program, to require a 36” wide by 20” high poster that defined
the overall narrative of the project. In the third year, students now present
to actual clients and local professionals. Additionally, a greater emphasis was
placed on the student merit competition practice presentations for the
Industrial Designers Society of America (IDSA) Merit Awards. As a result, more
students were better prepared to present during the student merit competition
in the Spring semester of their fourth year.
An approximately 25% sampling of assessment summaries from the 12 academic
programs in the College of Business Administration includes:
· Marketing (BS)
· Accounting (BS)
· Accounting (MS)
For the 2015-2016 assessment, one of the goals for the Marketing
BS program was that “teams will be able to effectively target customer segments
and effectively position brand(s) within these respective segments.” This goal
was assessed in the capstone marketing course’s simulation game; the goal of an
80% success rate was not met. In response to these results, faculty
agreed on two instructional changes related to the game. First, the faculty
would spend more time on the importance of segmentation, targeting, and market
positioning. Additionally, because it was noted that teams spending the least
amount of time typically perform poorly, faculty would focus more attention on
identifying teams exhibiting weakness earlier in the semester. Instructors
were asked to highlight the time spent on decisions in class as a way of
communicating the importance of investing the necessary effort to perform well
as a team. In 2016-2017, this goal remained unmet, but faculty continued to
focus on more instructional time related to these topics, and on identifying
underperforming groups earlier while readjusting the criteria. By 2017-2018,
the program successfully met its goal.
Beginning in 2016, academic programs within the Department of
Accounting updated the assessment process in order to increase focus on making
meaningful changes to the curriculum and other practices. The programs assess
all objectives using two or more measures, which allows for more informed
decision-making: when one measure is met and another is not, the program can
look to see what is working (or not) and more accurately prescribe an
improvement going forward. Additionally, beginning in the 2016-2017 cycle,
the department instituted a semester-end meeting devoted to assessment, where
results from prior semesters or the year were presented and discussed. This
meeting is mandatory for all full-time faculty; detailed, in-depth feedback is
shared and, often, decisions are made right then about future efforts.
The department solicits feedback about accounting majors from area
employers, as well as from CPA exam results. Employers consistently report that
students need more exposure to data analysis and stronger analytical skills. In
response, the Accounting BS program faculty have expanded the analytical
coverage in ACCT 333 to include specific foci, such as advanced Microsoft Excel
skills and accounting software knowledge. The subsequent feedback from
employers and students has been positive. In addition, a curricular change,
effective 2018-2019, has been implemented so that accounting majors are now
required to complete another information systems course that addresses advanced
data analysis.
The Accounting BS assessment was also modified in 2017 by removing
group work as a metric. Although the learning objectives were typically met,
the department questioned whether the results were indeed generalizable. The
department concluded that while group work is an important learning tool, it is
not necessarily a reliable representation of accounting students’ knowledge.
The assessment metric was therefore eliminated, prompting the restructuring of
some measures and the replacement of others.
Additionally, the department responded to
changing student interests, such as an increased demand for courses that help
students prepare for the CPA exam. In response, the department introduced into
ACCT 333: Accounting Information Systems a new, expanded unit with material
that helps students prepare to sit for the CPA exam.
The Master of Accounting program offered its first
class in Fall 2014. The 2014-2015 assessment included the goal of “advanced
knowledge of core accounting disciplines” in ACCT 420: Tax Accounting; the goal
of 90% of students scoring 75% or higher on the tax knowledge project was not
met. Assessment results were shared at the semester-end
departmental meeting, and faculty questioned whether the standards of
evaluation were appropriate. The faculty then revised the rubric to ensure
stronger rater reliability between the outside evaluator and the instructor of the
course.
An approximately 25% sampling of assessment summaries from the 12 academic
programs in the College of Education includes:
· Health Promotion and Wellness Concentration (BS in Kinesiology) – online
· Counselor Education (MS)
· Educational Leadership (MEd) – hybrid
· Educational Leadership (EdD)
The Health Promotion and Wellness (HPW)
concentration in the School of Kinesiology is the only fully online program in
the School of Kinesiology. All health courses in the curriculum are Online
Certified and were reviewed by content experts and peers following the Quality
Matters Rubric. The HPW program identifies four outcomes for student success,
measured through student-created work and an internship supervisor evaluation.
Performance targets are set at 85% achievement and have been met consistently,
with few exceptions. Faculty review outcomes each cycle and include action
items for improvement. The current cycle has shown increased success on all
outcomes; however, an additional English writing course (outside of the School
of Kinesiology) was recently added to the curriculum to support professional
writing competency.
The Counselor
Education program has identified three specific student learning outcomes that
align with the Comprehensive Professional Counseling Examination (CPCE). The
CPCE is a good indicator of mastery of the eight core curriculum areas as
measured by other professional tests and addressed in the Council for the
Accreditation of Counseling and Related Educational Programs (CACREP) standards
for accreditation. All students in the department’s concentrations in Clinical
Mental Health Counseling and School Counseling must pass the CPCE in order to
graduate. The three areas selected for annual assessment are: 1) Ethics and
Professional Development, 2) Helping Relationships, and 3) Group Processes. The
CPCE is administered each semester as an exit exam. The department has
determined that adequate content mastery is achieved when students score no
lower than one-half of one standard deviation below the mean (based on the
national norming sample) for each content area (expressed as a formula
below). Scores from the three
identified core areas indicate an overall high level of success across the
student body. Students whose scores are weak (even if they are passing) are
interviewed to determine why they believe they did not do as well as they may
have expected or as well as their peers. These interviews have generated ideas
for program improvement. For example, students who did well in “Helping
Relationships” typically did well in “Group Processes” (and vice versa). This
led to collaboration between the instructors to reinforce critical concepts
across courses in order to improve retention and skills development. Likewise,
students who performed poorly in “Ethics and Professional Development” also did
poorly in “Group Processes.” The faculty suspected that the abstract nature of
ethics and professional development over a semester, without the benefit of
application, was less effective. Thus, the faculty implemented a group
experience in the Ethics curriculum and observed the subsequent CPCE scores.
Modest improvement was observed across the board. Scores are
typically high, within passing range, and often substantially higher than
national averages. Consistently high scores have prevailed, and the program is
planning to change the target core areas for the next cycle of observation.
Specifically, the faculty have observed Multicultural Counseling scores and
Career and Lifestyle Development scores fluctuate more than some others, so
those are of particular interest, as is Theories of Counseling. This last area,
while more consistent across administrations, corresponds to a foundational
course upon which others are built. Some of the program’s required courses are
not tested on the CPCE; the department continues to explore ways to analyze
those as well.
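The department’s cut-score rule can be expressed compactly. For a given CPCE
content area with national norming-sample mean μ and standard deviation σ, a
student demonstrates adequate content mastery when:

score ≥ μ − 0.5σ

For illustration only (these are hypothetical numbers, not actual CPCE norms):
if a content area had a national mean of 12 with a standard deviation of 4, the
cut score would be 12 − 0.5(4) = 10.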
The student
learning outcomes for the Master’s in Educational Leadership (MEd) program
directly align with Educational Leadership Constituent Council (ELCC)
Standards. Using instructor-designed rubrics aligned to the ELCC Standards and
sub-elements addressing professional knowledge and skills, faculty assess
student learning outcomes by evaluating the content and quality of
performance-based tasks assigned in each course. To measure the overall
effectiveness of the MEd program, faculty analyze student learning outcomes at
the summative level, using a series of course-based artifacts (Using Data to
Affect Change, Analysis of Instruction, Analysis of Classroom Assessment, and
the Capstone Project). The MEd faculty also use non-course related activities
that are required of all students, including the six Mandatory Internship
Activities and the Standards Defense (a written and oral defense in which
students provide evidence that they have mastered the professional knowledge
and skills addressed in each of the ELCC learning standards). MEd faculty
analyze the results of these summative activities to identify any of the ELCC
standards in which student performance falls below a 90% passing rate. As a
result of this constant monitoring of student outcomes, assessments, and course
content, the six Mandatory Internship Activities were modified to better
reflect and address the most recent changes in K-12 administrative
responsibilities. These modifications provide the candidates with experiences
that mirror the evolving responsibilities of today’s educational leaders in the
areas of teacher evaluation, student diversity, educational equity, and
professional development.
The aim of the
Doctor of Education in Educational Leadership program is to prepare students to
analyze and solve problems likely to be encountered when leading and managing
modern complex organizations in either educational or non-educational contexts.
The EdD program has identified six standards assessed through three major
benchmarks, including the qualifying paper, proposal, and dissertation. The six
standards include: 1) problem statement, rationale, and key terms; 2)
literature review; 3) methodology; 4) data analysis and discussion; 5) summary,
conclusion, and recommendations; and 6) writing and formatting. Assessment
rubrics aligned to the three major benchmarks are used to determine student
performance. The rubrics delineate performance at four levels, including: Unacceptable
(0); Approaches Expectations (1); Meets Expectations (2); and Exceeds
Expectations (3). Performance targets are set at 100% achievement of students
meeting expectations (2). The target of Meets Expectations (2) is set for all
applicable standards for each major benchmark: qualifying paper (standards 1,
2, 6); proposal (standards 1, 2, 3, 6); and dissertation (standards 1, 2, 3, 4,
5, 6). Based on analyses of the rubrics for all benchmarks, doctoral students
have consistently reached the performance target of Meets Expectations (2) on
the identified standards for each benchmark. Action plans aligned with the
standard performance rubrics (specific rubrics for each major benchmark) have
been developed and are implemented when a student falls below Meets
Expectations. To correct deficiencies, supplemental activities within specific
courses or through the program structure ensure student success on the critical
standards. For example, analyses of benchmark rubrics indicated that some
students were performing below Meets Expectations (2) on standard 6 (writing
and formatting). To address these student needs, collaborative activities have
been developed and implemented in conjunction with the Writing Center to
strengthen student scholarly writing. In addition to partnering with the
Writing Center, the EdD program sponsored Dissertation Boot Camps consisting of
weekend writing sessions with faculty present to provide one-on-one and small
group assistance. The incorporation of additional writing opportunities and
the consistent support of writing sessions offered by the Graduate School have
resulted in the strengthening of scholarly writing, as evidenced through an
increase in rubric scores for standard 6 on all benchmarks, as well as quicker
student transitions from the first benchmark (qualifying paper) to the next
benchmarks (proposal and dissertation).
An approximately 25% sampling of assessment summaries from the 13 academic
programs in the College of Engineering includes:
· Civil Engineering (BS)
· Civil Engineering (MSE)
· Industrial Technology (BS)
The undergraduate
Civil Engineering program’s published student learning outcomes align with the
Accreditation Board for Engineering and Technology (ABET) standards. Through the annual assessment process, the program
has used assessment results and data to inform decision making. For example,
the Civil Engineering Advisory Board reviewed the Civil Engineering seniors’
performance both overall and in specific knowledge areas. The data provided by
the National Council of Examiners for Engineering and Surveying (NCEES), which
develops the national exam (FE Exam) and administers the test, indicated that
students were consistently performing above the national average pass rate;
however, several areas measured below the national average. One such area was
engineering construction and project management. To address this problem, an ad
hoc committee reviewed the performance data, the exam specifications, the
course catalog description, the course syllabus, and construction example exam
questions provided by the NCEES. Based on this data, the committee recommended
changes to the course syllabus for CIVE 480, Construction Engineering. These
changes have been adopted, and the board will continue to monitor performance
in future exams to track any improvements.
Additionally, the
decision was made to remove the electrical and thermodynamics principles from
the Civil Engineering curriculum in an effort to add other important topics.
The need for greater instruction or academic credit in CIVE 442, Senior
Design (the capstone design course), and the inability to cover some
introductory topics in CIVE 328, Geotechnical Engineering, in the time
allotted led to curriculum reforms focusing on course content and
sequencing. CIVE 442 is an extensive design experience involving a
multi-faceted project, with design teams and individuals responsible for
design components ranging across the sub-disciplines of Civil Engineering.
The project, which involves open-ended design, analysis, and research that
simulate the experience of professional practice, consumes much of graduating
seniors’ time. The original academic credit of two credit hours associated
with this course was insufficient to reflect the effort and knowledge gained.
At that time, CIVE 328, Geotechnical Engineering, was structured as a two-hour
lecture (2 credit hours) with a three-hour lab (1 credit hour), for a total of
five contact hours (3 credit hours). The lecture time initially allowed only
brief coverage of strength parameters for soils. Faculty desired better
coverage of strength parameters and an introduction to lateral pressure and/or
bearing capacity, to provide a transition into the following foundation course,
CIVE
438.
In 2014, the
opportunity to address these issues came with the one-hour reduction in the
campus-wide first year seminar (UNIV 100). In response to this reduction and
changes in the FE Exam, the faculty decided to drop the required electrical
(ENGR 201) and thermodynamics (ENGR 301) courses, resulting in a reduction of
six credit hours. These topics were replaced by a requirement for the second
physics and physics lab courses, PHYS 202 and PHYS 215 (5 credit hours). With
the available two hours, one additional hour of credit was given to the
capstone course, CIVE 442, and an additional hour of lecture was assigned to
CIVE 328, Geotechnical Engineering.
In reviewing the
results of the changes, the faculty observed the following. First, the
additional credit hour for CIVE 442, Senior Design, allows better coverage of
the subject material and more adequately reflects the effort required. Student
evaluations indicate improvement in this area. Second, the additional
hour of lecture in CIVE 328 provides greater coverage of the material on shear
strength and an introduction to lateral pressure and bearing capacity.
The undergraduate
program in Industrial Technology (ITEC) is accredited by the Association of
Technology, Management, and Applied Engineering (ATMAE). The ITEC department
follows the continuous improvement assessment model required by ATMAE, which
has been implemented by establishing three general outcomes, leading to nine
measurable program competencies. The program competencies are assessed on a
regular basis using three indirect measurements of student achievement
(student, employer, and alumni surveys), as well as a direct assessment of
student achievement through the evaluation of 20 specific course work products.
Any deficiencies in student achievement are noted, corrective actions are
planned and implemented, the results of the corrective actions are observed,
and adjustments are made as needed. For example, the ITEC BS program observed
in 2013 that students were not meeting expected goals in the area of industrial
safety. This finding led the department to incorporate certain aspects of
safety into several courses across the curriculum and to change the way one
specific class (ITEC 268: General Safety and Accident Prevention) was taught,
thus providing students with a better understanding of the practical
applications of incidence and severity rates. After the course content was
modified, continued monitoring showed improvement in one of the two areas.
Additionally, because of ongoing requests from ITEC’s Industrial Advisory Board
for additional training in the area of safety, and the fact that all 92 job
titles disclosed by Industrial Technology alumni indicated some form of safety
risk in their jobs, the department determined that additional emphasis on
safety was needed. To address this request and the remaining low score in one
of the two areas, the department introduced ITEC 498, a
pilot course entitled “Applied Industrial Safety” in 2017, designed to address
the most common safety training needs of local industries. The course was
pilot-tested for two semesters. Data gathered from the pilot course indicated
that adding this course improved student knowledge retention in the area of
safety.
An approximately 25% sampling of assessment summaries from the 30 academic
programs in the College of Liberal Arts includes:
· Speech Pathology and Audiology (BA)
· Speech-Language Pathology (MS)
· Applied Language and Speech Sciences (PhD)
· English (PhD)
· Psychology (BS)
· Psychology (MA)
· Modern Languages (BA)
· Criminal Justice (BS)
· Sociology (BA)
The purpose of the undergraduate major
in Speech Pathology and Audiology is to prepare students to enter graduate
programs; thus, the program’s student learning outcomes cover the general
foundational knowledge required for eventual certification as a speech-language
pathologist or audiologist. In the 2015-16 assessment cycle, the Communicative
Disorders department took a close look at what student learning outcomes were
being assessed, and realized that the undergraduate students received very
little experience in clinical settings. An appointed ad hoc committee developed
a pilot project focused on one course (CODI 302), typically taught in the
fourth year, and created probe questions targeting each of the student learning
objectives for the program. In this class, undergraduate students act as
assistants to graduate students who are the primary therapists; this experience
allows students to be immersed in the therapeutic process. At the end of the
semester, students were asked to reflect on how specific information about
hearing, speech production, language development, etc. helped them understand
the client, the disorder, and the actual therapy being applied. The department
expected to use this student feedback to look at the undergraduate curriculum
in a more holistic way.
The department began using this
assessment plan in 2016-2017 and continued in 2017-2018. In the first year of
implementation, a modest goal was established: 70% of students would score at
Level Two (Adequate) or higher on a four-point rubric. The 70% goal was met on
three of the learning objectives, and nearly achieved for the other two.
Students who
fell below the adequate level provided some indication in their responses that
they perhaps had not understood the task. Initially, students were simply asked
to “think broadly,” and attempt to integrate various sources of knowledge in
their responses. With this feedback, the department changed some of the prompts
to be more specific and incorporated example(s) to give students some ideas of
how they might answer the probe question, depending on the deficits seen in
specific clients. The department also noted that there were as many students
being rated in the highest category on the rubric as in the lowest level.
As such, in the following year the criteria were modified to also require at
least 20% of students at the outstanding level.
In the 2017-2018 assessment cycle,
students continued to reflect on all probe questions for all five learning
objectives, though only three were rated by the assessment team. Data was
encouraging: the number of students achieving a rating of adequate or better
ranged from 88% to 96%. However, upon closer reflection, a couple of trends
were noted. Overall there was improvement over the last assessment cycle, which
may reflect the changes made in the prompts, as well as the additional
instruction and examples students received. However, except for SLO 1, in which
88% of students scored in the highest two categories, the number of students at
the adequate level fell for the other two objectives. The raters for these
student products suggested that the lower ratings on these objectives (compared
with SLO 1) could be due to fatigue (students were asked to respond to all five
prompts even though the program was only gathering data on three, and the
length and overall depth of responses showed a clear decline from SLO 1 to SLO
5), or instructor prompts (the course instructor used the prompt for SLO 1 to
give examples of how students might respond, based on their own clinical
case).
In these two most recent assessment
cycles since changing how the program assessed basic knowledge, results have
allowed the program to see gaps in many students’ ability to think critically
and apply the knowledge obtained in the foundational classes to inform therapy.
Faculty recognize that students at this level may not possess the tools to do
this without specific guidance, and these recent results show the implemented
changes may have resulted in better understanding of the assignment. The fact
that the learning objective that was presented in class with concrete examples
of how to apply course content to a clinical case showed greater gains than the
others may indicate a need to focus more on critical thinking skills in
undergraduate classes, rather than on rote learning of specific facts.
Several more cycles of data and analysis will reveal whether changes are needed
in the undergraduate curriculum in order to meet program goals. The department
is confident, though, that the newly created assessment tool will assist in
providing those answers.
For the past 10 years, the MS in
Speech-Language Pathology program has included PRAXIS exam data in its
assessment process. Data for the past three years show that while the overall
pass rate has stayed relatively consistent (between 95% and 97%), and students’
average performance in the treatment-related subtest has stayed at 76%, scores
on the other two subtests have decreased from the 2016-2017 cycle. The
reduction is not dramatic but does fall below both state and national averages.
In the 2017-2018 assessment cycle, the department saw a similar trend in
competency ratings given to students by their clinical supervisors (on- and
off-campus). Graduating students consistently achieve proficiency ratings of
semi-independence in the areas of evaluation, intervention, and interaction
with clients. However, the assessment area has been consistently rated
lower than treatment and professional practice for several assessment cycles.
Faculty were presented with these observations, and they agreed that this was
an area of the curriculum that needed strengthening.
In the Fall of 2018, a professional
seminar component was added to both clinical courses, CODI 510 and CODI 512.
The goal of this seminar is to provide additional opportunities for academic
and clinical faculty to work together in addressing areas of perceived weakness
in individual cohorts. The first targeted area is enhancement of students’
exposure to the assessment process. Specific assessment modules using actual
clinic cases or commercial simulations are presented, with faculty guiding
first-year students through the process. Second-year students also use the
seminar to present
the results of assessments they have completed in the clinic, with the group
brainstorming ways to improve or change the process if needed. In addition,
clinical faculty have begun preparing best-practice video examples of
various assessment procedures and will make these videos available to students
to view in the student workroom. Data will be collected over the next two years
using scores from the PRAXIS exam and competency ratings for graduating
students.
One of the Applied Language and Speech Sciences PhD program’s student
learning goals in 2015-2016 was that students “will demonstrate a depth and
breadth of knowledge within the areas of specialization emphasized in their
program of study.” The measure called for 90% of students to be rated Competent
or above on both aspects of the comprehensive exam scale, and for 50% of
students to be rated Exemplary or Highly Competent; this goal was not met. The
department
implemented the following changes as recommended by an ad-hoc committee of key
faculty working with PhD students: 1) new courses were added to the five-course
theoretical core and the three-course research core; 2) a professional issues
colloquium was added to accompany the existing research colloquium; and 3)
seminar courses were expanded to include the neurosciences, speech sciences and
disorders, and language sciences and disorders. It is thought that these new
courses will broaden students’ knowledge base relative to the basic sciences.
Preliminary results on two students who took comprehensive exams in the Fall
2018 semester reveal the type of improvements the department anticipated, with
one student rated competent on both aspects of the scale, and the other student
rated as competent on content knowledge and exemplary in application of that
content knowledge. Additional data was gathered in Spring 2019.
The English PhD program aims for 90%
or more of its students to complete their secondary-area exams with an
assessment of "pass" or "pass with distinction" within two semesters, yet
this goal has not been met. As a result, the department reformed its
comprehensive
exam process. First, the department changed the primary exam format from a
five-hour timed exam (responding to questions that students don't see in
advance, with no books, notes, or internet) to a portfolio. This allows for
better professionalization, as the genres in the portfolio lend themselves to
conference presentation and publication. Next, the department instituted a
policy whereby students may complete two courses in one of their secondary areas
instead of taking a timed exam; this gives students a richer learning
experience for that secondary area. Finally, the department adjusted curriculum
requirements for some concentrations to provide students more freedom and
flexibility when choosing their three secondary areas, potentially positioning
them for increased success in the exam process.
Additionally, the department has
identified a new goal to assess students' professional development:
"Students will develop their professional identities through such
activities as attending/presenting at conferences, publishing, performing
academic/community service, and seeking external training." To provide
scaffolding, the departmental Placement Committee has greatly increased the
frequency of professional development workshops, providing at least one per
week. The department has also distributed surveys to students who have achieved
candidacy to track their professional development over time.
In the Fall of 2013, the Psychology
department established a four-year assessment plan to systematically assess
various learning goals for the undergraduate Psychology BS degree. Assessment
data from the 2013-2014 assessment cycle confirmed the teaching faculty’s insight
that, although students did well in mastering content knowledge in the various
content domains established by the American Psychological Association (APA),
students showed significant shortcomings in the area of psychological and
scientific inquiry and were not effectively learning APA-style writing.
Recognizing the need for a
departmental resource to simplify the presentation of key concepts in APA-style
writing, in 2015 the department created a customizable PowerPoint presentation
and shared it via the faculty Moodle page. Distribution of this resource
resulted in greater
faculty awareness of APA-style writing issues, and a coherent teaching strategy
across the curriculum. Following implementation, assessment results for the
200-level students indicated that mastery of APA style rose from 34% to 71% by
the end of 2016; however, this improvement was not sustained. During the
2016-2017 assessment cycle, students did not consistently maintain an
acceptable level of performance (approximately 65% correct use of APA style).
By the 2017-2018 cycle, performance again increased to acceptable levels (>
70% correct).
Additionally, in the Fall of 2017, the
department conducted a Curriculum Map Assessment to determine at which points
APA-style writing was being taught in the curriculum. Based on this, the
department developed a Writing Throughout the Psychology Curriculum program
designed to expose students to all APA-style writing components at least twice
during their college career. As part of this program, in the Spring of 2018,
the department created customized content for introductory textbooks to provide
students with more information about the different types of writing encountered
in psychology, and, in the Fall of 2018, the department created Moodle-based
lessons and activities that could easily be adapted into any course for
APA-style writing instruction. The department will continue to assess student
writing outcomes, as well as faculty use of the APA-style writing resources.
Assessment has helped the department to clearly identify learning gaps, to
develop instructional interventions to address the gaps, and to transform the
way teaching faculty think about teaching.
The Psychology department collects
data on each of the following three areas from MS Psychology students:
research-based thesis, comprehensive exams, and internship evaluation. The
evaluation of the thesis proposal and defense is completed independently by
each thesis committee member. The evaluation of comprehensive exams is
completed by
three faculty readers independently. The internship evaluation is completed by
the Field Practicum supervisors who supervise students’ internships. Data on
the students who have successfully completed the NIH online ethics training are
also recorded.
Before Fall 2015, the program offered
two tracks to students: Experimental and Applied. Students in the Applied Track
could choose to complete a thesis (Applied-with-thesis), but few did (≤2
in two years). These students either took longer than expected to complete the
program, or they dropped the thesis and switched to the Applied-without-thesis
track. The program also assessed the outcomes of graduates from the Applied-without-thesis
track and found these students rarely applied to doctoral programs. Instead,
many went on to pursue a second master's degree in Counselor Education or
obtained generally low-paying social service jobs. These inconsistencies
between the program mission and student outcomes raised questions regarding the
utility of the Applied Track. To address these concerns, the department engaged
in efforts to redesign the curriculum and combine the best elements of its
Experimental and Applied-with-thesis tracks into a single master's program in
Psychology. The redesigned program and curriculum went into effect in Fall
2015. Thus, the 2015-2016 academic year is the last year with data from two
tracks of students (who were in their second and final intended year of the
program). Starting in Fall 2015, students pursue a master’s degree in General
Psychology,
and are required to do research, including a thesis, under the supervision of a
faculty member throughout their graduate training. Clinically oriented students
may elect to complete up to 500 hours of supervised field practicum.
For the thesis, students are required
to pass the thesis proposal and defense, each with a rating of 1
(Satisfactory) or above (0 = Unsatisfactory; 1 = Satisfactory; 2 = Exemplary).
In general, the program has seen satisfactory ratings on the thesis proposal
(ranging from 1.05 to 1.50) and defense (ranging from 1.44 to 1.75). On average
per year, seven students successfully complete the thesis proposal, and eight
students complete the thesis defense. Despite students successfully passing the
thesis proposal and defense, students’ thesis progress tends to be slower than
recommended. The program’s goal is to have students propose the thesis by the
end of the first year. To this end, revisions were made to the graduate
curriculum in the Fall 2017 semester to incentivize timely thesis progress and
allow students to earn completion credit toward their Comprehensive Exams for
achieving thesis milestones in a timely manner. Specifically, students earn
credit for the Quantitative Psychology question in the Comprehensive Exams for
completing a successful thesis proposal by the end of the first year as
comprehensive evidence of quantitative knowledge. Additionally, students earn
credit for the Ethics and Standards in Psychology question for successful
submission of an IRB proposal before the end of the first summer as evidence of
comprehensive ethics knowledge.
Comprehensive exams are administered
at the beginning of the second year to evaluate the degree to which students
understand the basic principles of the science of psychology. In 2015-2016,
the goal was an 80% pass rate. Students who did not pass the exam had
to retake the whole exam, but with different questions. All 22 students took
the comprehensive exam, and 20 passed, for a pass rate of 91%. Although most
students were passing, their responses were
generally not very strong. In addition, the content of the comprehensive exam
was limited in assisting students in meeting program goals. Thus, at the end of
AY2015-2016, the Graduate Curriculum Committee elected to revise the
comprehensive exam in both form and grading structure to more closely align
with the new graduate curriculum, which focuses on developing knowledge of
research methods, classic theories of psychology, and ethics standards, and
understanding the application of knowledge in the real world. As such, the
comprehensive exam was redesigned to assess mastery of ethics and standards in
psychology, conceptual and philosophical issues in psychology, and quantitative
psychology within the framework of each student’s individual research
interests.
The new comprehensive exam was
implemented in the Fall 2016 semester. Because completion of the comprehensive
exams is a requisite for completing the degree, a 100% pass rate is expected. A
new point-based scoring system is used to evaluate Comprehensive Exam
performance: 1 = Fail with substantially poor performance; 2 = Fail; 3 = Pass;
4 = Pass with above average performance; 5 = Pass with exemplary performance.
Thus, to pass the comprehensive exam, a student must earn an average of 3 or
higher across raters for each of the three questions (see the sketch following
this paragraph). The new grading procedures provide a more refined system of
assessing knowledge of required material and content, and they provide a higher
ceiling than the simple pass/fail procedures used in the previous version. The
new exam also provides remediation when a student fails a question: the student
is given an opportunity to reflect on perceived weaknesses and convey
understanding verbally. The committee members then provide oral feedback on the
student’s
exam performance and clarify expectations if necessary. Students revise their
exam answer(s) for the second review. The remediation allows students to
demonstrate the ability to process and improve performance following feedback.
In 2016-2017, all 12 students who attempted the comprehensive exam successfully
passed.
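The pass rule described above lends itself to a short illustration. The sketch
below is illustrative only: the function name, question labels, and scores are
hypothetical, while the 1-5 scale and the requirement of an average of 3 or
higher across raters for each question are taken from the description above.

# Hypothetical sketch of the comprehensive exam pass rule described above.
# A student passes when the average rating across raters is 3 ("Pass") or
# higher for each of the three questions (scale: 1 to 5).

def passes_comprehensive_exam(ratings_by_question):
    """ratings_by_question: dict mapping each exam question to a list of
    the raters' scores on the 1-5 scale."""
    return all(
        sum(scores) / len(scores) >= 3
        for scores in ratings_by_question.values()
    )

# Hypothetical example with three raters per question:
student_ratings = {
    "Ethics and Standards in Psychology": [3, 4, 3],   # average 3.33: pass
    "Conceptual and Philosophical Issues": [4, 4, 5],  # average 4.33: pass
    "Quantitative Psychology": [3, 3, 2],              # average 2.67: fail
}
print(passes_comprehensive_exam(student_ratings))  # False: remediation follows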
In 2017-2018, nine students attempted
the exam, and eight passed; one student’s revised responses did not earn
passing scores. At the completion of the comprehensive
exam cycle, a pass rate of 88.9% was achieved. Per departmental policy, the
student who did not pass after revisions was asked to leave
the program. As stated above, in
2017-2018 the program added an alternative mechanism to the Comprehensive Exams
in order to incentivize timely thesis progress. The program is still collecting
data on the assessment of the new alternative mechanism, and the report will be
available in the 2018-2019 assessment cycle. The program continues to evaluate
the Comprehensive Exam policy and to look for additional revisions that may
help the program more effectively meet comprehensive exam goals.
The Department of Modern Languages
(MODL) aims to communicate creative and intellectual understanding of diverse
worldviews through languages and culture, fostering multicultural strength and
insight. The BA in MODL measures oral and written proficiency through oral
interviews and written portfolios (guided by the 2012 Proficiency Guidelines established
by the American Council on the Teaching of Foreign Languages [ACTFL]);
cultural awareness and awareness of career opportunities are also assessed by
the faculty. In the 2015-2016 assessment cycle, the goal of 80% of students
demonstrating oral proficiency at the Intermediate High level or higher was not
met, with 68% of graduates meeting or surpassing the Intermediate High level of
oral language proficiency. The objective of 80% of students meeting or
surpassing the Intermediate High level of written language proficiency was
partially met, with 73% of graduating seniors reaching the Intermediate High
level. The objective of 85% of students being rated overall as "Good"
or "Excellent" on the evaluation rubric for awareness of cultural
diversity and of international perspectives based on knowledge of the
Francophone or Hispanophone world across its broad geographic distribution was
met, with 91% of students meeting the criteria. The goal of 75% of students
being rated overall as "Good" or "Excellent" on the
evaluation rubric for the ability to understand and analyze significant works
of literary or cultural importance was also met, with 77% of graduates
obtaining those ratings. The objective of 100% of students being aware of
career opportunities and describing their training as meeting or exceeding
their perceived professional development needs after graduation was met, with
100% of graduates of MODL programs expressing satisfaction with their education
and training.
For 2015-2016, the department drew the
conclusion that more practice and feedback were needed to improve oral and
written competence of students in the program, although the approach to helping
students develop cultural awareness and the ability to analyze seemed
successful. Students reported that faculty provided adequate information about
careers with the languages. The department determined that more speaking
practice should be incorporated into courses at all levels to help students
practice and thus develop better oral proficiency. For written proficiency,
professors should include discussion of common grammatical errors and writing
strategies to help weaker students develop their writing skills in the second
language, especially at advanced levels. The department fosters cultural awareness
and an ability to analyze in its courses, but faculty are also encouraged to
mentor activities and organizations outside the classroom to foster interest in
and knowledge of the Modern Languages, as well as to support current students
and enrich the learning atmosphere.
In 2016-2017, the BA program exceeded the goal of 80% of students demonstrating
proficiency in their respective target languages at the Intermediate High level
or higher, with 82% doing so. In cultural
awareness in written work, measured by graduate portfolios, all graduates were
rated Excellent (64%) or Good (36%). In measuring students’ ability to analyze,
the Assessment Committee rated 91% of students as Excellent on the Evaluation
Rubric. Through interviews, the Committee evaluated students’ awareness of
career opportunities. It found that 55% planned to pursue graduate study, 9%
joined a Teaching Assistant Program, and 36% were considering joining a
Teaching Assistant Program. The committee concluded that the program needed to
raise expectations in all fields and added a new subgoal: that 50% of all
graduating students attain the Advanced level or higher in written language
proficiency. Interviews with graduating students revealed several
suggestions for program improvement: 1) more emphasis on grammar and vocabulary
development; 2) more opportunities for natural speech, including
accommodation/recognition/promotion of the Hispanic/Spanish-speaking population
in Lafayette; and 3) more classes for the practical application of the
languages.
In the 2017-2018 cycle, 88% of the students were rated at the Intermediate High
level or above for both oral and written proficiency in their target language.
The subgoal for written proficiency was also met, with 55% of the graduating seniors
achieving Advanced level or higher. With regard to cultural awareness, students
were rated as either excellent (78%) or good (22%); for the measure of
students' ability to understand and analyze works, 55% were rated as excellent,
22% as good-to-excellent, 11% as good, and 11% as average-to-good. All (100%)
of the students demonstrated knowledge of career opportunities in sectors in
which the ability to communicate in French or Spanish would be beneficial. Of
the graduating seniors, 67% planned to pursue a master's degree either in the
language of study (33.5%) or in a field in which a second language would be
advantageous (33.5%), 11% planned to pursue a career in education, and 22%
planned to enter the workforce after graduating and pursue careers in which
their language knowledge would be a helpful or even significant asset. Student
recommendations included requests for a greater diversity in the content of
courses offered (especially Spanish and French for Specific Purposes [e.g.,
law, medicine, etc.] and linguistics), more advanced course offerings in a
given semester, more opportunities to do novel research, and improved
communication on study abroad opportunities.
Teaching faculty in French began
offering online courses with a wider scope and greater diversity of topics in
2016. These courses have been overwhelmingly successful, attracting new
students, minors, and majors. The University’s first Spanish for the Legal
Profession course is scheduled for Fall 2019 and is cross-listed with Political
Science and Criminal Justice. To promote more opportunities for conversation,
the department increased the frequency of conversation tables in French,
Spanish, and German; and Arabic was added to the language offerings, attracting
new groups of students, as well as current majors and minors wanting knowledge
of Arabic. Because low enrollment limits the number of upper-level courses
offered, MODL faculty are teaching more Independent Study courses, which help
majors graduate on time and focus on specialized topics.
Beginning in 2015, the Criminal
Justice department reset its assessment strategy to focus on the foundations of
teaching/learning criminology and criminal justice. This decision led the
department to emulate aspects of the assessment strategies embedded in the
Academy of Criminal Justice Sciences (ACJS) accreditation standards in order to
better understand how undergraduates performed on generally accepted,
industry-set priorities.
The 2015-2016 assessment cycle for the department's undergraduate program
lacked the resolution to shed light on the struggles of students in classes
that faculty had noted anecdotally. Building on earlier assessment cycles, a
critical enhancement was made to help close these anecdotally observed gaps in
performance in key criminology and criminal justice areas: understanding and
applying criminological theory, applying critical thinking skills to
policy-relevant decision-making, and gaining skills in research methodology and
analysis. This enhancement was achieved by adding CJUS 499, Senior Seminar, to
the undergraduate degree plan as a requirement for all students. The course was
meant to help seniors reinforce important elements of the discipline before
graduation.
The 2016-2017 undergraduate assessment cycle introduced two significant changes
from previous cycles: 1) adopting ACJS standards, and 2) enhancing the grading
rubrics with templates customized by the faculty. The results of this cycle's
assessment were more in line with the faculty's long-standing anecdotal
feedback: students were having difficulty with critical thinking and with using
evidence to propose policy solutions. Further,
undergraduates were having difficulty applying criminological theory. Despite
these gaps in critical areas, students reported satisfaction with the
curriculum, felt that the curriculum challenged them, and felt prepared for the
workforce. Based on this feedback, the faculty further adopted the ETS Major
Field Exam to better understand aspects of the undergraduates’ difficulties in
key areas of criminology and criminal justice.
The undergraduate program yielded outcomes in 2017-2018 similar to those in
2016-2017, with the added results of the ETS Major Field Exam showing
additional shortcomings by critical subject area for undergraduates. While the
ETS exam had not yet been made mandatory for exiting
seniors, the results of these exams gave the faculty pause. Subsequently, the
decision was made to get a broader sample by making this exam mandatory for the
2018-2019 assessment cycle. The faculty have decided to focus on criminological
theory, critical thinking regarding policy decision-making via
evidence-informed thought, research methodology, and the improvement of key
subject matter areas as informed by the ETS exam.
The first proposed intervention is to create an explicit linkage between CJUS
305, Criminal Behavior, and CJUS 499, Senior Seminar, through better
coordination among the faculty teaching these courses, so that critical
thinking skills and the ability to use an evidence base to support
policy-related decision-making are reinforced in Senior Seminar. To that end,
the faculty have decided to begin developing assignments in both courses that
use similar strategies as a way to test and re-test this ability in 300- and
400-level coursework. Ongoing discussions about support material to enhance
this ability are occurring in the 2018-2019 assessment cycle.
Sociology BA students learn about
people as social beings and gain an understanding of the relationship between
society and the individual. Undergraduate students should demonstrate strong
research skills. This includes an ability to synthesize a body of sociological
literature and use it to support an argument that is then tested empirically
using appropriate qualitative or quantitative methodologies and results in a publishable
capstone project. These goals were refined over the last three assessment cycle
years as follows:
At the end of the 2015-2016 assessment
cycle and beginning of the 2016-2017 assessment cycle, several program
objectives were refined and assigned new assessment measures. For example, the
decision was made to link the synthesis and methods courses, and to require a
capstone project that bridged the two. The changes at the start of this cycle
were, in part, due to a newly developed understanding of the purpose and
methods of assessment. These new or refined objectives were developed,
implemented, and assessed as a team, and frequent informal and formal meetings
on the subjects covered by these objectives occurred throughout the three
assessment cycle years.
First, the team designed a uniform
plan of teaching the (relatively new) synthesis course (SOCI 301) that serves
as the gateway to both methods courses, and as part one of the capstone
project. Initial evidence indicated that students who took 301 after
implementation of the changes were better prepared for the next level of
required methodological coursework. The decision to link the synthesis and methods
courses with a required capstone project bridging the two courses worked well,
as demonstrated in the 2017-2018 assessment, specifically with Fall 2017 and
Spring 2018 course data (from SOCI 301/308/309). The data allow the department
to document improved student learning outcomes via the final combined
qualitative capstone projects. During that assessment cycle, the department
came to realize that pedagogical differences between instructors created
unexpected difficulties for students, as well as for assessing outcomes. In
response, the department has further adjusted teaching assignments to include a
new team teacher for the 2018-2019 academic year, and to assign a two-semester
sequence in the teaching load, so the same instructor teaches part one and part
two of the capstone project courses. The
next assessment cycle should allow for evaluation of SOCI 301 with both SOCI
306/07 and SOCI 308/09. In assessment year 2019-2020, the department will have
two dedicated instructors for the two-semester qualitative methods sequence
and two dedicated instructors for the two-semester quantitative methods
sequence; all will use team-developed teaching methods and lesson plans for the
universal synthesis course (301), and team-developed teaching methods and
lesson plans for the qualitative and quantitative portions of the
sequences.
Over the past three assessment cycles,
the overall impact of closing the loop has resulted in a) a refined
understanding of what was needed for assessment, b) a refined understanding of
what was needed for our relatively new synthesis course (301), c) a refined
understanding of what was needed to create a capstone project that bridged a
two-semester process, d) a refined understanding of the need for faculty
willing to work and teach as a team, especially for critical courses, and e)
an appreciation of the value of teamwork in improving the learning outcomes of
our students. One
example of improvements in student learning is found in the report for SOCI
301, after implementing a team-developed uniform plan of teaching the course.
Students are also now exploring the potential to publish their newly completed
capstone projects.
Three years ago, capstone projects
were not required; there was no uniform plan for teaching the synthesis or
methods courses, and students often focused attention on multiple
partial-research projects with little understanding of how the parts might
piece together into a publishable whole.
Students today have a much firmer grasp of the connection between theory and
research, of the different methods of research, and of how the individual parts
are synthesized into a publishable and informative product that may be used for
important policy decisions in the public or private sector.
An approximate 25% sampling of
assessment summaries from the five academic programs in the College of Nursing
and Allied Health Professions includes:
· Health Information Management (BS)
· Nursing (BSN, including online RN-to-BSN)
· Nursing (DNP) – online
One of the annual program goals for
Health Information Management (HIM) is that American Health Information
Management Association (AHIMA) data show that UL Lafayette graduates score at
or above the national average for all domains and subdomains on the
certification exam. The 2017 outcomes indicated that in the "regulatory"
subdomain, UL Lafayette graduates' scores were 89% of the national average
(that is, the average score of graduates divided by the national average
score). Program faculty convened to plan corrective action to meet or exceed the
national average. The topics of this subdomain are mainly covered in the first
semester of the junior year of the HIM curriculum; to reinforce this knowledge,
faculty incorporated additional time during review sessions in the students’
final semester and added new review sessions. These corrective measures were
instituted in AY2017-2018, and the 2018 outcomes showed that for the three
“regulatory” tasks in the subdomain, UL Lafayette graduates scored 110% of the
national average, a significant improvement.
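As a worked illustration of this metric (the notation here is ours, not AHIMA's official reporting format), the reported figure is simply the ratio of the program's average score to the national average:

\[
\text{relative score} = \frac{\text{average UL Lafayette graduate score}}{\text{national average score}} \times 100\%
\]

A value of 89% therefore indicates performance below the national average, while the 2018 value of 110% indicates performance above it.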
In addition to pass rates on the NCLEX-RN licensure exam, outcomes
for the BSN program are linked to elements highlighted in the mission and goals
of the Department of Nursing. One example of expected congruency between
mission, goals, and expected outcomes is in the area of leadership. Senior-level
students are required to complete modules developed by the Institute for
Healthcare Improvement as they prepare for transition into professional
practice. For the past three years, the benchmark of 100% completion has been achieved. As
this information is critical to ensuring safe healthcare practitioners, this
outcome will continue to be tracked, with measures put into place in a timely
manner in the event that the benchmark is not attained.
One example of measurement of online student
learning outcomes for the RN-to-BSN program is the successful attainment of the
benchmark related to NURS 327: Community Health Nursing with Diverse
Populations. All students must successfully demonstrate the ability to conduct
a community health assessment, with the benchmark that at least 75% will earn a
grade of 77 or higher on all three components of the final project. This
benchmark was met in 2016, 2017, and 2018. Student feedback on this experience
revealed that they perceive it to be very informative and integral to
understanding the core tenets of community health, which they do not receive in
their associate degree programs.
Accredited programs like the BSN often assess and document outcomes
outside of the Live Text platform. For example, the RN-to-BSN students
participate in virtual simulation during NURS 355: Health and Physical
Assessment. For 2016, the benchmark of a score of 77% was not met. Additional
online nursing health assessment resources were provided, along with
implementation of virtual conferencing sessions with faculty for students
having difficulty understanding the simulation. In 2017, 86.7% of the students
achieved the benchmark.
For the DNP program, one assessment target was that 90% of students would include documentation in their residency logs indicating that a minimum of 30% of their hours included inter-professional activities. It was determined that students were consistently not meeting this benchmark in their residency logs, although key assignments in other DNP courses met the overall objective. These assignments focus on interdisciplinary collaboration that advances the level and quality of care across aggregates, populations, and systems. Effective 2018-2019, the assessment measure for this objective was revised, with a goal of 95% of students engaging in inter-professional activities by attending at least one day at the Louisiana State Capitol during a legislative session.
An approximate 25% sampling of
assessment summaries from the 18 academic programs in the Ray P. Authement
College of Sciences includes:
· Biology (BS)
· Computer Science (BS)
· Environmental Science (BS)
· Physics (MS)
· Mathematics (PhD)
Biology BS [2015-2016, 2016-2017, 2017-2018]
The Department of Biology recently adopted the core concepts of
“Vision and Change” endorsed by the Partnership for Undergraduate Life Science
Education (PULSE). As such, the Biology BS faculty sought to map course
objectives to the “Vision and Change” core concepts and competencies in order
to determine if any areas of the curriculum needed to be updated or adjusted to
ensure alignment. During Fall 2017, faculty evaluated courses using the PULSE curriculum mapping worksheet. During the
end-of-semester departmental retreat, faculty discussed the curriculum mapping results as they related
to the core concepts and competencies and developed
recommendations. Regarding core concepts, the Biology BS faculty determined that
they were addressing most concepts as appropriate to their courses. The 100-
and 200-level courses, however, did not report balanced treatment of all five
Core Concepts: Evolution, Information Flow, and Pathways of Energy
Transformation scored lower than Structure/Function and Systems, even though
three courses (111, 203, and 233) include evolution as a major part of the
course, 110 has significant metabolism and genetics modules, and 233 has
Genetics and Evolution in its course title. The discrepancy may point to a real issue,
but could also be explained by a few instructors (n = 5) having more
conservative estimates of coverage compared with instructors of upper-level
electives (n = 15). Upper-level electives have strengths in addressing
evolution and structure/function. Faculty who teach these courses will consider
expanding coverage of other core concepts or offering electives that are
focused on genetics and energy pathways. Regarding core competencies, faculty agreed
that courses at all levels could do more to address the core competencies of
quantitative reasoning, modeling/simulation, and communication/collaboration.
Although many reported that their students have direct experience with the
scientific process, there may be opportunities to offer more of these authentic
experiences using large datasets in the public domain or those produced in the
course, and requiring students to work in teams to formally communicate their
results. Incorporating research into more courses, especially electives, would
simultaneously address the low scores in authentic research, team-based
learning, and model-based learning among the student-centered practices.
After this extensive curriculum review, the faculty agreed to
several actions. First, each course coordinator would review and refine her/his
stated learning objectives, look for opportunities to fill any gaps in content
and competencies, and redesign the courses to address and assess them using
recommended student-centered practices. Faculty agreed to consult with
colleagues to generate ideas for successful adoption of new practices, and to
build in assessments that would satisfy course needs. Additionally,
stakeholders for the required 100- and 200-level core lectures and labs were able to
propose common learning objectives for each course that all
sections/instructors will adopt and that, taken together, will give a balanced
treatment of the core concepts and competencies. This would give coordinators
of upper-level electives a firm foundation of prerequisites on which to build
their courses. Catalog descriptions that do not align with these objectives are
being revised.
Computer Science BS [2015-2016, 2016-2017, 2017-2018]
The BS in Computer Science program assesses seven outcomes over a
two-year assessment period (three in the first year and four in the next year);
this robust assessment
schedule measures outcomes through direct and indirect means. In
2016-2017, for example, Outcome 3 (“Be proficient in more than one programming language on more than
one computing platform”) was assessed in three computer science courses (CMPS 351, 450,
and 460). The target was for 70% of students to average 2.8 on the departmental
rubric (where a score of 1 is “amateur,” 2 is “developing,” 3 is “developed,”
and 4 is "exemplary"). Faculty used the students' scores on three different
assignments to determine overall performance against the criterion. In the
courses offered in Fall 2016, 83% of CMPS 351 students achieved “developed” or
“exemplary;” 51% of CMPS 450 students achieved “developed” or “exemplary;” and
in CMPS 460, 77.4% achieved “developed” or “exemplary.” After analyzing the results, the department
observed that students in CMPS 450 needed more practical examples, particularly
for functional and logical programming. Thus, in a subsequent offering of the
course, faculty will demonstrate functional and logical programming on the
computer and solve programming problems in the classroom. As a result, students
will better understand the ideas and programming in different programming
paradigms. Even though this particular outcome will not be assessed again until
the 2018-2019 cycle, the department's assessment committee distributes each
cycle's results and recommendations to the faculty to encourage ongoing
improvement.
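The rubric target described above reduces to a simple proportion check. The Python sketch below is a minimal illustration of that check; the function names and cohort data are hypothetical, not the department's actual rubric tooling:

```python
# Sketch of the rubric target described above (hypothetical names and data):
# each student receives rubric scores (1 = amateur ... 4 = exemplary) on
# three assignments; the target is met when at least 70% of students have a
# per-student average of 2.8 or higher.

def target_met(per_student_scores, threshold=2.8, proportion=0.70):
    meeting = sum(
        1 for scores in per_student_scores
        if sum(scores) / len(scores) >= threshold
    )
    return meeting / len(per_student_scores) >= proportion

# Example cohort of five students (three assignment scores each):
cohort = [[3, 3, 4], [3, 2, 4], [3, 4, 4], [2, 2, 3], [3, 3, 3]]
print(target_met(cohort))  # True: 4 of 5 students (80%) average 2.8 or higher
```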
Environmental Science BS [2015-2016, 2016-2017, 2017-2018]
The Department of Environmental Science expects its students to demonstrate a
high level of work quality, problem-solving skills, and practical application
of theoretical knowledge. Environmental Science BS students participate in ENVS
472, which includes an internship with an off-campus agency or organization.
Supervisors are asked to rate their interns on a 1-5 scale across various
criteria, including attendance, punctuality, general attitude, work quality,
appearance, attitude toward suggestions, initiative, problem-solving skills,
practical application of theoretical knowledge, and professionalism. Success is
achieved if more than 75% of the student interns receive "Excellent" ratings on
work quality, problem-solving skills, and application of theoretical knowledge.
In 2016-2017, 73.3% of interns were rated as "Excellent"
on the categories of work quality and
practical application of theoretical
knowledge, which was an increase from the previous cycle (70% for work quality and 60% for practical application), but still below
the threshold of 75%. Interns scored 60% for problem-solving skills, lower than
the previous year's 70%. A close study of the internship reports revealed that, in general,
the interns were required to have some level of problem-solving skills on soil
analysis and mapping, field surveys, plant identification, use of GPS/compass,
radio telemetry, Excel data entry and charts, water testing, GIS skills,
organization of public events, social skills, public speaking, communication
skills (emails), understanding of research articles, and database management.
The faculty discussed these results in order to develop a plan for reinforcing
these skills in the existing curriculum. Specifically, faculty confirmed that
student interns in their junior year lacked the necessary knowledge and skills
in the areas of water quality, soil health, field techniques, and data handling
and analysis to excel in their internships. As such, a change was made to place
interns only during their senior year, or to place only those students who have
completed the necessary courses and developed the skills required for a
successful internship. Additionally, in order to develop students' skills in
data handling, management, and graphing, the faculty required students enrolled
in laboratory courses to pool their laboratory data, create Excel databases,
and perform the necessary analysis for their lab reports.
As a result of these changes, the Environmental Science BS program
saw improvements in 2017-2018. 86% of interns were rated as “Excellent” on work quality and problem-solving skills, and 72% were rated “Excellent” on practical application of theoretical
knowledge. Upon further reflection, faculty determined that the interns were
generally required to have some proficiency in laboratory analysis of
environmental samples, mapping, data collection and handling, field surveys,
plant identification, and use of GPS/compass. These expectations were consistent with
previous results, and thus a plan was developed to address these specific
areas. Specifically, a 1-hour lab credit was introduced to the ENVS 490:
Environmental Pedology course to emphasize "hands-on" activities in
lab classes; it is expected to improve students’ work quality, problem solving,
and practical application skills.
Physics MS [2015-2016, 2016-2017, 2017-2018]
The Department of Physics expects its master's students to demonstrate
knowledge across the discipline and a deeper understanding in their areas of
specialization.
General knowledge is assessed through regular evaluation in general classes,
while knowledge in their specialized fields is assessed through two seminar
presentations, a proposal defense, and a thesis/project defense. Non-thesis
track students take an additional written exam.
In recent assessment cycles, three targets have been set: 1) each candidate's
proficiency in the specific subject of a class is evaluated through a final
grade, with the target that all students pass with a grade of B or higher; 2)
for the non-thesis track, the written exam is considered passed if the
candidate obtains a minimum of 50% in each of the tested areas, with the target
that all students taking the exam pass; and 3) for the thesis track, the
committee votes to give a score from 1 to 5, where 1 = does not meet
expectations, 2 = approaching expectations, 3 = meets expectations, 4 =
slightly above expectations, and 5 = exceeds expectations; a score of three (3)
is considered a pass, and the target is for 100% of the students taking this
exam to pass. These three assessment measures are reviewed together to make a
determination of a candidate's preparation.
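The two exam-based targets can be illustrated with a short sketch. In the Python below, all names and data are hypothetical, and averaging the committee members' scores for the thesis track is an illustrative assumption, not a statement of departmental procedure:

```python
# Sketch of the two exam pass rules described above (hypothetical names/data).

def passes_written_exam(area_scores):
    """Non-thesis track: pass requires a minimum of 50% in every tested area."""
    return all(score >= 50 for score in area_scores.values())

def passes_defense(committee_scores):
    """Thesis track: committee scores of 1-5; 3 or higher is a pass
    (averaging across members is assumed here for illustration)."""
    return sum(committee_scores) / len(committee_scores) >= 3

# One area below 50% fails the written exam despite a strong overall average.
print(passes_written_exam({"area 1": 62, "area 2": 55,
                           "area 3": 48, "area 4": 71}))  # False
# All scores of 4 ("slightly above expectations") constitute a pass.
print(passes_defense([4, 4, 4]))  # True
```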
In 2016-2017,
Target 1 was not met, while Targets 2 and 3 were met. For Target 1, 14 students
each took an average of three classes and a seminar each semester; the majority
obtained grades of A and B, but one student earned grades of C and F. The
Physics department worked with other on-campus departments to deliver resources
to the student, but he withdrew before the end of the spring term (earning the
F). For Target 2, two students on the non-thesis track took the written
comprehensive exam, and both passed all four parts of the test (one student
passed on the second of the two allowed attempts). For Target 3, one student on the
thesis track passed the comprehensive exam and defended his thesis. He obtained
all scores of 4 ("slightly above expectations") and therefore passed. Although
two direct measures were met, the department reflected on all results and took
additional actions. First, faculty enforced an
early research proposal defense. The program had one student defend his
proposal one semester earlier than the norm and defend his thesis in the third
semester (the average is four semesters); he graduated in three semesters and
became gainfully employed. Four other students defended their proposals on time
and are on track to graduate. Next, the department designed and implemented a
four-semester individualized plan for each student in order to help students
work towards their goals. Each customized plan is reviewed during one-on-one
meetings with the Graduate Coordinator, and requirements are marked as achieved
as the student progresses through the program. Finally, the Graduate Coordinator
organized a seminar to discuss professional and ethical behavior in academia. Topics
included student-advisor and professional relations, recommendation letters and
rules, technical presentations, addressing requests, expected skills at
graduation, forms expected to be submitted as progress is made, and advice from
former graduate students in the department.
In 2017-2018, two
targets were met, and one was not assessed. For Target 1, all 15 students
obtained grades of A or B. Target 2 was not assessed because there were no
students on the non-thesis track. For Target 3, six students on the thesis
track passed the comprehensive exam and defended their theses, with average
scores ranging from 3 to 5. The graduate faculty are confident that the
four-semester individualized plans for students, as well as the professional
development seminars implemented previously, have positively contributed to the
achievement of these targets; as such, both of these efforts will continue.
Mathematics PhD [2015-2016, 2016-2017, 2017-2018]
The Department of Mathematics expects its doctoral students to gain a deep
understanding of the subject matter and its connections with other areas, and
to apply that knowledge to problem solving in the real world, in research
institutions, or in academia. As such, students in the Mathematics PhD program
are expected to
demonstrate a depth of knowledge by passing an oral exam in their area of research
specialization, following at least two semesters of advanced courses in that
area. The exam is given by a committee of at least three Mathematics graduate
faculty members with expertise in the field, and evaluated in accordance with
departmental rubrics. Success is defined by at least 75% of students who
attempt the oral exam in a given calendar year being rated as at least
"Satisfactory" in accordance with the departmental rubric. In
2016-2017, four students completed the oral portion of the Comprehensive Exam.
All four passed on their first attempt, with all examiners rating their
performance "Satisfactory" or better. After several students failed
to pass the oral portion of the Comprehensive Examination during previous
academic years, the department put in place processes to better educate both
students and junior faculty on the Oral Examination and help the students
better prepare for the exam. As a result, the students are waiting less time to
take the exam on average, and yet the performance and outcomes have improved,
with no failed attempts in the last 18 months. Additionally, the department has
encouraged students to engage in "mock oral exams" with more advanced
students, and to interact with the faculty members on their committee ahead of
time so that expectations are clear. In 2017-2018, the two students who attempted
the exam were scored as "Highly Satisfactory" or
"Outstanding" by all examiners, which demonstrates how the oral exam
results have improved. While the student outcomes are being met, the department
is cognizant that the expected level of performance on the comprehensive exams
is not always consistent from year to year. As a result, the department is
working to establish solid baselines that can be used (and slowly modified as
needed) and expects these data to inform the content of the basic courses. This
will ensure more uniform performance among graduates.
University College offers one academic
degree, the Bachelor of General Studies, summarized below.
The BGS, administered through
University College, is an interdisciplinary degree and, within its 120 credits,
students are able to choose 36 credit hours across three academic areas of
enrichment. Of these, one enrichment area serves as a foundation to complete a
concentration of 24 upper-level credits. Because students’ skills are acquired
through multiple academic disciplines, BGS graduates are expected to earn their
baccalaureate degree having demonstrated adequate oral and written competencies
through pre- and post-assessment essays and interviews, based on a
college-created rubric. Each semester, data are collected and compiled; the
benchmark is that 75% of students assessed in their graduating semester show
improvement, meeting or exceeding expectations, based on comparisons of pre-
and post-assessment oral (interview) and written (essay) data. These assessment data are
used to determine if students are prepared to: a) orally present themselves and
articulate their skills as they enter the job market, b) demonstrate adequate
ability to express their ideas in writing, and c) present feedback regarding
their career planning and other support obtained through the college. All of
these analyzed data are reviewed by the Dean and discussed annually with the
advising staff in University College. After each review, the team identifies
needed strategy changes aimed at improving advising and academic support to
students as it strives to produce stronger student outcomes.
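The college's benchmark reduces to a pre/post comparison. The following Python sketch is a minimal illustration of that check; the function name and rubric values are hypothetical, not University College's actual tooling:

```python
# Sketch of the BGS benchmark described above: at least 75% of assessed
# graduating students must show improvement from the pre-assessment to the
# post-assessment. Names and rubric values here are hypothetical.

def benchmark_met(pre_post_pairs, required=0.75):
    """pre_post_pairs is a list of (pre, post) rubric scores per student."""
    improved = sum(1 for pre, post in pre_post_pairs if post > pre)
    return improved / len(pre_post_pairs) >= required

# Three of four students improved (75%), so the benchmark is met exactly.
print(benchmark_met([(2.0, 3.0), (2.5, 3.5), (3.0, 3.0), (2.0, 2.5)]))  # True
```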
UL Lafayette has established and
maintains a systematic, comprehensive, and effective process by which student
learning outcomes are identified, assessed, and analyzed, leading to continuous
improvement efforts.
2016-17 Assessment Cycle Handout
2017-18 Assessment Cycle Handout
2018-19 Assessment Cycle Handout
Assessment Rubric: Non-Academic
BIOL Curriculum Mapping Summary
BIOL Vision and Change Curriculum Mapping Worksheet
Office of Institutional Assessment: Assessment Cycle
Review of Assessment Plan email
Sample University Assessment Council Agendas
Sample University Assessment Council Minutes