Glossary of Terms


Academic Area

A group of closely related programs, disciplines, and courses that carry a common discipline designator.

Academic Assessment Year

A designated timeframe for assessment activities that begins in August of the Fall academic semester and runs through October 1st of the following year (due dates for major reports overlap with the beginning of the next assessment timeframe to provide extra time for discussion/completion).

Example: the 2021-2022 Academic Assessment Year runs from August 1, 2021 – October 1, 2022.

Academic program

Any course of study that results in a certificate or degree.

Examples: Biotechnology AAS, Communication Studies AA, Nursing AS, Audio and Production Certificate, etc.

Administrative program

Any administrative area or department that provides services to the College, including instructional or special opportunities for students or employees, that falls outside the realm of regular academic curricula.

Examples: Student Services, The Assessment Center, WDCE, or the MC Library.

Administrative and Special Programs Assessment

The process that examines an administrative or special program’s success in achieving outcomes and institutional priorities. The results of this examination are incorporated into an institutional 6-year review for the program and include recommendations for change/improvement.

Analysis of data

A detailed examination of the assessment data that has been collected to look for patterns and to determine essential features that might point out areas to celebrate or areas of concern.

Assessment (and Reassessment)

Assessment:
The process of systematically examining patterns of student learning across courses and programs and then using this information to improve educational practices (i.e., Years 1-3 of the assessment cycle).10


10 Carnegie Mellon University. (2022a). Adapted from: www.cmu.edu/teaching/assessment/basics/grading-assessment.html


Reassessment:
Examines the same patterns above, but with a comparative analysis of pre- and post-performance results within a given cycle period (i.e., comparing results from years 1-3 with new data from years 4-5 as part of the CAR report in Year 6).

Assessment Cycle

An ongoing process of identification, evaluation, reflection, and improvement for defined standards of performance. An assessment cycle generally consists of the following activities:

  • Planning - identify and define outcomes/goals and standards of performance
  • Data Collection & Analysis- collect data that reflect this performance and analyze results
  • Implementation of Actions for Improvement – implement specific, measurable actions that will contribute to the success of the identified goals/outcomes
  • Reflection & Revision – determine whether the implemented actions have had the desired impact and make any necessary revisions to the original plan to support further improvement
  • Repeat the Cycle

Assessment Cycle Year

Any year between 1-6 of the 6-year assessment cycle that identifies a discipline’s/program’s place within the cycle and the required activities that should be performed by that discipline/program; each cycle year aligns with a specific Academic Assessment Year.

*Also see: Assessment Group Designation and Academic Assessment Year

Assessment for Accountability

The assessment of some unit, such as a department, program, or the entire institution, which is used to satisfy some group of external stakeholders. Stakeholders might include accreditation agencies, state government, or trustees.11


11 Carnegie Mellon University. (2022b). Adapted from: www.cmu.edu/teaching/assessment/basics/glossary.html


Assessment for Improvement

Assessment activities designed to feed results directly, and ideally immediately, back into revising the course, program, or institution with the goal of improving student learning.12


12 Ibid.


Assessment Group Designation

A group number that corresponds to a discipline’s/program’s assigned placement in the assessment cycle and determines what activities are required for that discipline/program for the assessment year in question.

Assessment Instrument

An artifact or activity that will be used to collect data that reflects student performance or levels of goal/outcome attainment. These may take many forms and depend on the type of assessment being conducted.

Examples: exams, surveys, practical exercises, papers, lab exercises, capstone projects, high impact practices, observation/demonstration checklists, portfolios, etc.

Assessment Plan

A formal document that includes details of what is being assessed (outcomes, competencies, goals, etc.), how it is being assessed (instruments, performance benchmarks, etc.), where it’s being assessed (where are data being collected), and when the assessment will take place.

*Templates of assessment plans can be found in the Assessment Repository in Blackboard – see “Assessment Repository” below or consult the specific assessment section in the Handbook for more information.

Assessment Repository

A comprehensive assessment site on Blackboard that stores documents and reports pertaining to the Montgomery College assessment process and provides up-to-date resources for assessment activities being performed at the College. It serves as a central location where faculty and staff can easily access assessment materials, promoting collaboration, organization, and efficiency during assessment work.

*This is a Blackboard Community site that requires an MC ID to gain access. Instructions for joining this Blackboard Community are available.

Assessment Tools

The collection of plans, instruments, performance criteria, and tasks that make up the assessment process for evaluating student performance or goal/outcome attainment.


Benchmarking

Determination of a standard measurement for evaluating or comparing student performance on assessment instruments.


Capstone course

The course where students reach the highest level of proficiency in program outcome criteria. This course presents an opportunity to assess all or most of the program outcomes.

Capstone assignment/rubric

A method of assessment that allows students to demonstrate the successful acquisition of program outcomes. Predetermined rubric(s) help to gather and report this information using the college assessment software. This assignment could take many forms including an exam, a research paper, an oral presentation, or an industry-based project.

CAR Recommendations

The culmination of the College Area Review is a set of recommendations for improvement or direction for the program(s) being reviewed. These recommendations are linked to the Academic Affairs goals.

Closing the Loop

A term that applies to the comprehensive fulfillment of all stages of an assessment cycle, with particular emphasis on closing out the last step of the cycle, where actions are implemented for improvement as a result of assessment findings.

Campus/Curriculum Advisory Person (CAP)

A member of the Collegewide Curriculum Committee (CCC) who assists faculty members on each campus with specific tasks regarding curriculum proposals and general education certification/recertification (assisting with forms, explaining the overall process, answering questions, etc.).

College Area Review (CAR)

A comprehensive, ongoing process of self-evaluation for all academic areas (credit and non-credit), student affairs, and administrative units that takes place over a 6-year cycle. The CAR is an analytical review process that engages faculty, staff, and administrators. Evaluation methods utilize both qualitative and quantitative data to inform decision-making, resulting in implementable recommendations for institutional improvement. An official report that documents this process is completed every 6 years.

College Area Review (CAR) Report

A report that documents the results of any discipline, program, or administrative area’s self-evaluation during the 6-year College Area Review (CAR) process. Approved recommendations and comments from the Dean, Provost, College Area Review Committee (CARC), and the Vice President of Academic Affairs compose the final elements of the report, and progress updates on all recommendations are required during the subsequent 6-year period.

College Area Review Committee (CARC)

A collegewide committee that examines report findings and recommendations including key data metrics, with consideration for how best to use College resources to support student success and institutional effectiveness. This committee is charged with the following duties:

  • Review academic areas (including degree programs, disciplines, and special programs), student affairs, and administrative unit reports and make substantive comments on recommendations
  • Define the program viability process and identify key indicators
  • Recommend programs for the academic program viability review based on selected key indicators
  • Participate in the viability review process and make recommendations to the senior vice president for academic affairs

Collegewide Assessment Team (CAT)

A collegewide committee that identifies needs and develops recommendations regarding collegewide assessment of student learning in order to strengthen the college and enhance its accountability. This committee is charged with the following duties:

  • Review individual programs’ assessment plans and reflections and recommend ways for improvement
  • Review General Education assessment reflections and recommend ways for improvement
  • Review guidelines and templates/forms for assessment plans
  • Make recommendations for improving the assessment processes at the college

Collegewide Curriculum Committee (CCC)

A collegewide committee charged with reviewing, evaluating, and updating the curriculum; overseeing the initiation, design, development, modification, and discontinuance of courses and programs offered by Montgomery College; and informing the College administrators responsible for implementation, as well as the College community, of modifications to the curriculum.

Combined Assessment Schedule

A comprehensive schedule that provides a timetable of required assessment activities for all areas at the College which corresponds to a 6-year integrated assessment cycle. Assigned groups for all disciplines, programs, and administrative areas are reflected on the schedule, and the schedule aligns with the respective calendar years in which the activities are expected to be completed.

Course level outcomes

Each course should have clearly defined, student-friendly outcomes that align with program outcomes and reflect a progression of skills and knowledge that build on each other. Course outcomes should reflect the appropriate level of cognitive skill and competency based on that progression. (Student-friendly does not mean avoiding all terminology or academic language; rather, students learn more when they can assess their own learning, so clear outcomes written in terms they can understand increase their ability to achieve those outcomes.)

Curriculum mapping

Determining the alignment of course outcomes to program outcomes by indicating the course(s) where the knowledge, skills, and attitudes reflected in program outcomes are introduced (I), reinforced (R), or mastered (M), and what method of assessment is currently being administered.


Data

The results of assessments that are reported and collated by the college software data collection and reporting system.

Data Collection

Systematically scoring and recording assessment results into the college software system.

Data collection plan (Assessment cycle planning sheet)

The timing and scope of assessment data collection based on size and complexity of a course or program. The data collection plan form for Gen Ed and Programs includes a schedule for collection and reporting based on the 6-year cycle and denotes when data collection will take place for Gen Ed courses and for program outcomes.

Demographic data

Once the basic scoring takes place, the reporting system links student scores to their M numbers and generates a report of student success levels that includes gender, race and other demographics that could help target planned improvements to support specific groups of students.

Direct assessment

Measures of learning that are based on student performance or that demonstrate the learning itself.

Examples: Scoring performance on tests, term papers, or the execution of lab skills13


13 Ibid., 70.


Discipline

Generally, the name given to a set of courses which are identified by a particular four-letter prefix in the College catalog. The meaning is similar to “subject" in secondary school.

Examples: ENGLish, NURSing, BIOLogy, etc.


External Peer Reviewer

An outside faculty evaluator who has expertise in the particular academic area under review. The academic area is asked to suggest possible peer reviewers, with final approval from the Dean. The peer reviewer conducts a one-to-two-day visit to the College. After this objective review, the reviewer provides a written report that becomes part of that area’s CAR report and is forwarded to the College Area Review Committee (CARC) as part of the final review package (after approval from the area’s Dean and Provost).


Faculty Workgroup or Workgroup

An ad hoc faculty committee that reviews the data, evaluates the academic area, and completes the College Area Review Reports. The members, appointed by the lead Dean along with input from the faculty chairperson of the academic area, are faculty who teach in the area being reviewed.

Formative Assessment

Formative assessment refers to a wide variety of methods that teachers use to conduct in-process evaluations of student comprehension, learning needs, and academic progress during a lesson, unit, or course. The general goal of formative assessment is to collect detailed information that can be used to improve instruction and student learning while it’s happening.

What makes an assessment “formative” is not the design of a test, technique, or self-evaluation, per se, but the way it is used—i.e., to inform in-process teaching and learning modifications.

Examples: Questions posed in class during the learning process to determine which specific concepts or skills students may be having trouble with; specific, detailed, and constructive feedback on student work, etc.14


14 The Glossary of Education Reform (2014a). Adapted from: www.edglossary.org/formative-assessment/


General Education Assessment

The process that examines student acquisition of General Education competencies across all courses that are certified in the General Education Program.

General Education Competencies and Proficiencies

Skillsets that are considered fundamental to any undergraduate student’s academic curriculum. These skillsets are usually initiated through a General Education Program and continuously improved upon as the student advances through course levels within any academic major or curriculum; ideally, these skillsets should reach a level of proficiency upon graduation.

*See page 3 of the Handbook for specific criteria

General Education Program (Gen Ed)

A program designed to introduce students to the knowledge, skills, and values that are essential to the study of academic disciplines, to encourage the pursuit of life-long learning, and to foster the development of educated members of the community and the world. The program requires students to take a central group of foundation and distribution courses in English, mathematics, arts, behavioral and social sciences, humanities, and science, and gives them the option to take additional courses in health and communications.

General Education Certification/Recertification

The process of ensuring that any course applying for General Education status for the first time, or any certified course reapplying for that status, meets the standards of the General Education Program at Montgomery College and the State of Maryland. This process takes place every time a discipline enters its designated 2nd year in the assessment cycle.

General Education Standing Committee (GESC)

A collegewide committee that guides the General Education course (re)certification process and makes recommendations for course (re)certifications to the Office of the Senior Vice President for Academic Affairs.

Goals (Area or Institutional)

Area Goals:
The end results or achievements that any discipline, program, or administrative area is trying to accomplish. These should reflect, and be directly supportive of, Montgomery College’s institutional and strategic goals.

Institutional Goals:
Clearly defined performance and behavioral expectations that support the achievement of the institution’s mission.


Holistic Assessment Process

Organized assessment activities that are ideally faculty and staff driven, and that reflect an interconnected, comprehensive process to evaluate and improve student performance and goal achievement across and between all levels of learning and all College services.


Indirect Assessment

Assessments that use perceptions, reflections, or secondary evidence to make inferences about student learning.15

Examples: surveys, self-assessments, etc.


15 Ibid., 70.


Institutional Effectiveness

The ongoing assessment of how well an institution is meeting its mission and institutional goals.

Institutional level Outcomes

The collective knowledge, skills, and competencies that every student is expected to attain upon completion of their educational experience.

Integrated Assessment Cycle (see also: Combined Assessment Schedule)

A sequence of assessment functions that are combined within a given time period to promote efficiency, effectiveness, and an understanding of how assessment functions are interconnected and interdependent for improving student learning and institutional effectiveness.

Interim Yearly Update Report (or Annual Update)

A brief report completed at the end of years 1, 2, 4, and 5 of the 6-year assessment cycle for disciplines that have general education courses and/or a degree or certificate program. This report provides information on the previous year’s assessment activities (i.e., data collection, etc.) and is due by August 1st of the assessment year in which it is being completed.


Middle States Commission on Higher Education (MSCHE)

MSCHE is a regional accreditation agency that is recognized by the United States Department of Education and oversees accreditation activities for institutions of higher education. MSCHE is the agency that conducts Montgomery College’s accreditation process.


Office of Assessment

The office that oversees and organizes collegewide assessment activities; supports and maintains assessment committees that assist, evaluate, and provide feedback for assessment processes; provides training on assessment techniques, assists faculty/staff with assessment planning, and ensures that MC’s assessment practices meet accreditation standards.

Outcome

Something that follows as a result or consequence.16

*See also: Student Learning Outcome (SLO), Course Outcome, or Program Outcome 


16 Amended from: Merriam Webster (2022). Definition of Outcome from: www.merriam-webster.com/dictionary/outcome


Planned Actions for Improvement

Clearly defined statements that indicate specific and measurable changes for improvement that address any weaknesses or gaps in performance or services. Upon approval, these changes should be expeditiously implemented.

Planned Data Collection

The organized process of collecting, scoring, and recording results of any required assessment activities that are applicable to any course, discipline, or program within their assigned assessment cycle schedule.

*See also: Data Collection Plan

Program

Any course of study that leads to a certificate or degree.

*See also: Administrative Program or Special Program

Program Learning Outcomes (or Program level Outcomes) (PLOs)

Outcomes that reflect the broader knowledge, skills and attitudes that are reinforced and mastered through the courses in the program. PLOs indicate the expectations for a student who has completed the series of courses indicated by the program curriculum.

**For more information, access the ELITE Course: “Writing Great Learning Outcomes” – (1.5 hours to complete)

Program Viability

The program viability review process will ensure that all programs effectively use the College’s instructional resources, support the College’s mission, and serve the needs of students and the College community. As part of the academic program review process, programs will be identified for a viability review at the request of the dean, vice president/provost, the College Area Review Committee (CARC), or the senior vice president for academic affairs. The review can occur at the end of the regular academic program review cycle or during other designated times. Triggers for program viability review include both quantitative and qualitative data metrics.


Reliability

The extent to which assessment results are consistent and reproducible across scorers, instruments, and administrations. Reliability can be achieved through consistent adherence to planned scoring and data gathering by all involved, a clearly defined rubric, and an appropriate sample size.

Rubric

A clearly defined set of criteria for scoring student work that includes descriptions of various levels of performance success, such as “Advanced,” “Proficient,” “Novice,” and “Not Evident,” or “Mastery,” “Proficient,” and “Inadequate Progress.”

A rubric for assessing a program or a college area should include the program outcomes or area goals as well as these measures of performance.


Sampling

Selecting and analyzing a predetermined number of subjects (or data sets) from a larger population that will be representative of the population being studied.

Scoring data

Evaluating performance levels based on preset criteria (such as benchmarks).

Signature Assignment

Each General Education course should have one assignment, or a combination of assignments, administered near the end of the course or program, that measures student performance for each applicable General Education competency.

Special Program

Any area at the College that offers specific academic instruction or experiential and learning opportunities that support or enhance a student’s academic curriculum. Examples: General Education Program, Scholars’ Programs, Paul Peck Humanities Institute, etc.

Student Learning Outcomes (SLOs)

Clearly defined, measurable, student-friendly achievement goals that reflect a progression of skills and knowledge building on one another; SLOs include unit outcomes, course outcomes, and program outcomes. All outcomes should reflect the appropriate level of cognitive skill and competency based on that progression of skills.

Summary Reflection

The summary reflection report is required for both Gen Ed and Program assessment. After discussing and analyzing the data collected for the current period, and comparing it to the data and planned actions of the previous reporting period, the faculty workgroup should capture the group’s collaborative thinking and submit the summary reflection to the Office of Assessment. (This form is available in the Assessment Repository on Blackboard.) The summary reflection report includes the number of students assessed, the semester(s) in which data were collected, a summary of the data including percentages and demographic results, a comparison of results to previous data collection periods, and planned actions for future improvements or innovations.

Summative Assessment

Summative assessments are used to evaluate student learning, skill acquisition, and academic achievement at the conclusion of a defined instructional period—typically at the end of a project, unit, course, semester, program, or school year. They are generally evaluative, rather than diagnostic—i.e., they are more appropriately used to determine learning progress and achievement, evaluate the effectiveness of educational programs, measure progress toward improvement goals, or make course-placement decisions, among other possible applications.17

Examples: midterm/final exams, capstone projects, culminating presentations or portfolios, etc.


17 The Glossary of Education Reform (2014b). Adapted from: www.edglossary.org/summative-assessment/


Validity (In assessment)

The extent to which an assessment instrument is actually measuring the skills, knowledge or competencies that it is intended to measure.

Example: assessment instruments such as exams, projects, etc.


Year-3 Integrated Reflection Report

An assessment report that is completed in the third year of the assessment cycle by any discipline that has certified general education courses, any discipline that has a certificate or degree program, and certain special programs. Disciplines and programs are asked to report on integrated assessment activities over the previous two years, to include information on the following: data collection activities, a reflective discussion of student performance based on data results, actions for improvement, updates for College Area Review (CAR) recommendations, and program enrollment/graduation information.

*For more information on this report, please see the handbook sections on “General Education Assessment” or “Program Assessment.”

Year-6 Reflection Report

An assessment report that is completed in the sixth year of the assessment cycle by any discipline that has certified general education courses, any discipline that has a certificate or degree program, and certain special programs. This reflection is attached as part of the College Area Review (CAR) report and includes a discussion of student performance based on data results and actions for improvement.

*For more information on this report, please see the handbook sections on “General Education Assessment” or “Program Assessment.”