The General Education program at Ferris State University has been systematically collecting student-level assessment data since the fall semester of 2017. The initial process attempted to use Nuventive Improve assignments to collect assessment data directly from course instructors. To accomplish this, instructors were required to indicate their intent to submit data by registering online via a Google form. The General Education assessment coordinator then used that information to create individual assignments in Nuventive Improve. These assignments were automatically delivered to the respective faculty members by email.
Instructors recorded course assessment data using standardized Excel workbooks. To complete these documents, the names and student ID numbers for the course first needed to be copied and pasted in. Then an individual score (either an ordinal or an interval value) was assigned to each student. Dropdown lists in the workbook collected several pieces of standardized meta-data (e.g. standard measure, standard pedagogy, outcome, etc.). The workbook automatically created a statistical summary of the results in the form of a short paragraph. This summary was used as the basis for the result that was entered into Nuventive Improve.
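As an aside, the kind of statistical summary paragraph those workbooks produced is straightforward to reproduce in R. This is only a sketch: the wording is mine, and the actual workbooks built their summaries with Excel formulas rather than R.

```r
# Illustrative stand-in for the workbook's auto-generated summary paragraph;
# the real wording came from Excel formulas, not this function.
summarize_scores <- function(scores) {
  sprintf(
    "N = %d students were assessed. Mean score: %.1f (SD %.1f); median %.1f; range %g to %g.",
    length(scores), mean(scores), sd(scores), median(scores),
    min(scores), max(scores)
  )
}
```

A vector of section scores goes in and one sentence comes out; the real workbooks also folded in the dropdown meta-data.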
The instructor assignment included a link to the appropriate location within Nuventive Improve to record each result. The summary generated above was copied and pasted into the result field in Improve. The dropdown values from the spreadsheets were also used to populate similar fields in the Improve results. Finally, the completed data workbook was attached to the Improve result and submitted. This process was intended to be standardized and simple for people to complete. In practice, however, it has proved to be more time-consuming, complicated, and difficult than envisioned.
Limitations and challenges
Many lessons have been learned during the implementation of the initial data collection process. The majority of faculty members have made a good-faith effort to report data for their General Education courses. In the process of doing so, however, three main challenges have arisen.
Low levels of participation – The registration process unintentionally made the data reporting process more-or-less voluntary. If instructors declined to register online, they would not receive an assignment. Furthermore, there was no mechanism to easily track how many General Education sections were registering or reporting data. An average semester saw about 200 registrations (one registration could account for more than one section), while there are around 1,000 to 1,500 General Education sections offered each semester. That places overall participation at about 20-25%.
Completion difficulties – The data workbooks ended up being more difficult to complete than anticipated. The process of getting student names and ID numbers required several steps and many faculty struggled with it. In addition, the steps involved in attaching the workbook to the Nuventive Improve result were confusing to many (that is a structural issue within Improve and cannot be altered on our end). Most faculty members needed about an hour to complete and submit their data, whereas the intention was to require only about 15 minutes.
Administration difficulties – The initial process also proved to be difficult to sustain administratively. Several manual steps were required that made the assessment coordinator’s job more difficult. First, the registration form had to be configured and monitored each semester. Next, the registration data had to be used to create assignments for all faculty. Due to the complexity of completion mentioned above, the assessment coordinator often had to meet with faculty groups to facilitate data reporting. And finally, because of faculty misunderstandings, many pieces of reported data had to be manually edited (e.g. adding missing student IDs or fixing other data entry errors).
Magnitude of the project
The sheer size of the General Education program contributes to the difficulty in obtaining census assessment data. For instance, the General Education instructional dataset for the Spring semester of 2020 included a total of 25,394 student records. This represented 409 courses which were delivered in 1,148 total sections. These classes were taught by 522 different instructors. Some students were enrolled in more than one General Education course for a particular competency. Therefore, there were fewer unique students than records. In addition, some courses count for more than one of the General Education core competencies (double- or triple-dippers). Because of this, there are more total values (when adding up all the competencies) than the original dataset actually contained. Here is a brief breakdown of the records for this semester.
|Competency|Courses|Sections|Instructors|Students|Records|
|---|---|---|---|---|---|
|Communication - Oral|3|31|12|818|823|
|Communication - Written|8|136|46|2,594|2,605|
|Diversity - Global|32|68|32|1,420|1,548|
|Diversity - Race, Ethnicity, Gender|31|82|27|2,064|2,436|
|Self and Society|42|123|40|2,868|3,500|
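These tallies can be reproduced from the instructional dataset in a few lines of R. The sketch below is an approximation, not our production code: it assumes a data frame using the Banner column names described in Step 1 of the modified process (SUBJ, CRSE_NUMB, CRN, INSTR_NAME, ID, and the designator flag columns ACTC through SSOF, in that order), and it tallies raw designators rather than the combined competency labels.

```r
# Count courses, sections, instructors, students, and records per General
# Education designator. Assumes the Banner export column names and that the
# fourteen designator flag columns sit together, ACTC through SSOF.
library(dplyr)
library(tidyr)

tally_competencies <- function(banner) {
  banner %>%
    pivot_longer(ACTC:SSOF, names_to = "designator", values_to = "flag") %>%
    filter(flag == "X") %>%
    group_by(designator) %>%
    summarise(
      courses     = n_distinct(paste(SUBJ, CRSE_NUMB)),
      sections    = n_distinct(CRN),
      instructors = n_distinct(INSTR_NAME),
      students    = n_distinct(ID),
      records     = n(),
      .groups     = "drop"
    )
}
```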
The code involved
I don’t know about you, but I do not like black boxes. To make this overall process as transparent as possible (within the confines of FERPA regulations), I have published all of the code necessary for anyone to replicate our work on GitHub.
This repo contains all of the code needed to run the project on your own machine. It also includes a small (fake) dataset to practice on. You will need a few things to get started.
- R - obtain here: https://www.r-project.org
- RStudio - obtain here: https://rstudio.com/products/rstudio/download/
- OAuth for the Gmail API - learn about here: https://blog.mailtrap.io/send-emails-with-gmail-api/
- Then configure a few variables as described in the project
The modified process
To address the challenges described above and streamline the assessment data collection procedure, a new process is described here. This new workflow involves six steps but will be much easier to sustain into the future. Four of the steps have now been completely automated using R scripts. The faculty are only involved in reporting data and no longer need to directly interact with Nuventive Improve. And finally, compensated assessment coordinators are being used to ensure accurate and timely data entry into our Improve database.
Step 1) Generating a census of all General Education sections offered
Each semester, a standardized query will be run against the University’s Banner enrollment database to create a report of all General Education offerings. The output will be a single flat file in Excel (.xlsx) format. Due to the large amount of FERPA-protected information in this file, it will be stored in an encrypted folder and not widely distributed.
Produced by: Elise Gramza, Registrar and Jerome Forbes, IRT
Time required: Unknown, but this should be scriptable. Jerome has been very quick in responding to changes, so I assume that this is not too labor intensive.
Used by: Clifton Franklund, General Education Assessment Coordinator
Structure: The Excel file will consist of twenty-four variables (columns of data), which are defined below. Each record (row) will correspond to a student enrolled in a section of a General Education course. A typical semester will have somewhere around 15,000 to 20,000 students, and the file sizes will vary between 1.5 and 2.0 MB. This file will give an accurate report of the number of students, faculty, courses, and sections involved in General Education for each semester. The Banner codes that are currently used are listed below.
```
Variables from Banner
============================================================
TERM             = Current academic term (e.g. 202001)
CRN              = Unique course number for each section of every course
SUBJ             = The course prefix (e.g. BIOL)
CRSE_NUMB        = The course number (e.g. 121)
SECTION          = The section number (e.g. 001 or VL1)
COURSE_COLL_CODE = The academic college code (e.g. AS or BU)
ID               = The eight-digit student ID for every registered student
                   in the course section
STU_NAME         = The full name of each student registered in the course
                   section
INSTR_NAME       = The full name of the instructor of record for the
                   course section
INSTR_EMAIL      = The university email address of the instructor of record
ACTC, CATC, CLTR, COMS, COMW, COLL, DIVG,
DIVU, NSCI, NSCL, PROB, QUAL, SSOC, SSOF
                 = General Education designator flags; every course meeting
                   a designator is denoted with an X in that column
```
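As a sanity check before running anything downstream, the export can be loaded and validated in R. This is only a sketch; the file name is a placeholder, and readxl is assumed as the reader. Reading every column as text keeps Excel from mangling IDs and CRNs into numbers.

```r
# Load the Banner census file; the file name below is a placeholder.
library(readxl)

banner <- read_xlsx("GenEd_Census_202001.xlsx",
                    col_types = "text")  # keep IDs and CRNs as text

# Fail fast if the export does not match the structure described above
stopifnot(ncol(banner) == 24)
stopifnot(all(c("TERM", "CRN", "ID", "INSTR_EMAIL") %in% names(banner)))
```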
Step 2) Generating a summary report
The raw data from step one are necessary but not suitable for sharing widely. Therefore, an R script has been written that creates a human-readable summary (in Word, HTML, and PDF formats) that can be shared with others. The intention is to present this report to the General Education committee each semester. In addition, these reports will be entered into Improve and shared on the General Education website. ACTC, CATC, and CLTR are combined as “Culture”; NSCI and NSCL are combined as “Natural Sciences”; and SSOC and SSOF are combined as “Self and Society”.
Produced by: General Education Assessment Coordinator using an R script, “SummaryReport.Rmd”
Time required: Total time to configure the data and script and execute it is less than two minutes.
Used by: Academic Affairs General Education Committee and other interested parties
Structure: An actual example of a summary report for the spring 2020 semester is included above to show the magnitude of the General Education program.
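The designator-to-competency mapping that the summary script applies can be written as a small helper. Only the combinations and labels stated in this post are mapped here (the Communication and Diversity pairings are inferred from the code names); the remaining codes (COLL, PROB, QUAL) are left unmapped in this sketch, and the function name is mine.

```r
# Map Banner designator codes to the competency labels used in the summary
# report. COLL, PROB, and QUAL are intentionally left unmapped in this sketch.
library(dplyr)

competency_label <- function(designator) {
  case_when(
    designator %in% c("ACTC", "CATC", "CLTR") ~ "Culture",
    designator %in% c("NSCI", "NSCL")         ~ "Natural Sciences",
    designator %in% c("SSOC", "SSOF")         ~ "Self and Society",
    designator == "COMS" ~ "Communication - Oral",
    designator == "COMW" ~ "Communication - Written",
    designator == "DIVG" ~ "Diversity - Global",
    designator == "DIVU" ~ "Diversity - Race, Ethnicity, Gender",
    TRUE ~ NA_character_
  )
}
```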
Step 3) Generating Excel data reporting worksheets
The key to simplifying the process of assessment reporting is providing faculty with a streamlined Excel workbook. Each course section will require a workbook with a completed roster (student names and ID numbers) along with a single column for the outcome score. A small amount of metadata will also need to be collected to facilitate later entry into Nuventive Improve. Since this will require the creation of over 1,000 documents, another R script has been written to automate the process. This script reads the file generated in step one, splits the information into individual Excel workbooks for each section, and stores them in folders by core competency.
Produced by: General Education Assessment Coordinator using an R script, “CreateSpreadsheets.R”
Time required: The time needed to configure and execute the script is less than two minutes.
Used by: Individual faculty members
Structure: The workbooks will use a standardized file naming system: the instructor’s last name, college code, course, section, and learning outcome, all separated by hyphens (e.g. Franklund-CAS-BIOL286-211-SCI1.xlsx). The workbooks will each have two sheets (sheet 1 = meta-data, sheet 2 = scores). The actual student scores are entered in the score sheet (shown below). Instructors only need to enter a score for each student listed (the class roster is now provided for each section). The values in the score column can either be an actual assignment score for each student (any floating point number) or a rubric score for an assignment (0-4, integers only). Examples of workbooks can be generated using the test dataset in the GitHub project.
The meta-data sheet will be used to collect additional information that is necessary to later enter the results into Nuventive Improve. The total number of points will be later used to calculate percent scores for conversion to rubric values. If rubric scores were used, the instructor would enter “rubric” in this cell. The assignment description and reflection fields will be used to provide some context for the assessment results in Nuventive Improve. Instructors will also be asked to mark an “X” to indicate which standard measure and action they wish to be used in the Improve database.
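The core of “CreateSpreadsheets.R” can be sketched as below. This is a simplified stand-in rather than the production script: writexl is assumed for output, the meta-data sheet is reduced to three fields, the outcome code is omitted from the file name, the competency sub-folders are skipped, and instructor names are assumed to be in “Last, First” form.

```r
# Split the Banner census by CRN and write one two-sheet workbook per section.
# Simplified sketch: no outcome code in the file name, no competency folders.
library(writexl)

write_section_workbooks <- function(banner, out_dir = "workbooks") {
  dir.create(out_dir, showWarnings = FALSE, recursive = TRUE)
  for (sec in split(banner, banner$CRN)) {
    info <- sec[1, ]  # section-level fields are constant within a CRN
    file_name <- sprintf(
      "%s-%s-%s%s-%s.xlsx",
      sub(",.*", "", info$INSTR_NAME),  # last name, assuming "Last, First"
      info$COURSE_COLL_CODE, info$SUBJ, info$CRSE_NUMB, info$SECTION
    )
    sheets <- list(
      metadata = data.frame(TotalPoints = NA, AssignmentDescription = NA,
                            Reflection = NA),
      scores   = data.frame(Name = sec$STU_NAME, ID = sec$ID, Score = NA)
    )
    write_xlsx(sheets, file.path(out_dir, file_name))
  }
}
```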
Step 4) Distributing the Excel worksheets via email
Creating assignments one at a time proved too difficult in the initial assessment process, and expanding that approach to over 1,100 emails is not practical. Therefore, yet another R script has been written to automate this step. The Gmail Application Programming Interface (API) is used to send bulk emails, each of which can be customized using information about the course, section, outcome, and instructor.
Produced by: General Education Assessment Coordinator using an R script, “SendEmails.R”
Time required: The time needed to configure and execute the script is less than two minutes.
Used by: Individual faculty members
Structure: The instructor for each course section will receive an email with the corresponding data workbook as an attachment.
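The sending step looks roughly like this, assuming the gmailr package as the R client for the Gmail API. Everything specific here (credentials file, sender address, message wording, helper name) is a placeholder, not a quote from the actual “SendEmails.R”.

```r
# Send one workbook per instructor through the Gmail API via gmailr.
# Placeholder credentials, addresses, and wording throughout.
library(gmailr)

gm_auth_configure(path = "credentials.json")  # OAuth client from Google Cloud
gm_auth()                                     # interactive sign-in on first run

send_workbook <- function(workbook_path, instructor_email) {
  msg <- gm_mime() |>
    gm_to(instructor_email) |>
    gm_from("assessment@ferris.edu") |>  # placeholder sender address
    gm_subject(paste("General Education data workbook:",
                     basename(workbook_path))) |>
    gm_text_body("Please enter your scores and upload to SharePoint.") |>
    gm_attach_file(workbook_path)
  gm_send_message(msg)
}
```

Looping `send_workbook()` over the files produced in step three, with the instructor email pulled from the Banner data, completes the distribution.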
Step 5) Completing and submitting the Excel workbooks
Each course instructor will complete their workbooks as described above. Then we need some mechanism to collect all of that work. The thought of receiving 1,100 emails with completed Excel worksheets attached is very unappealing. What is needed, instead, is some sort of central repository where faculty can submit their completed worksheets. Fortunately, Ferris has SharePoint.
Produced by: Individual faculty members
Time required: It is hoped that a typical instructor could complete and submit a data workbook in less than 15 minutes.
Used by: Assessment coordinators and others appointed to enter Improve results
Structure: To receive completed assessment workbooks, a Microsoft SharePoint site has been created. You can access it using this URL.
This site is available to all Ferris State University faculty and administrators. A folder will be created in the “Documents” section for every academic term data is collected. Sub-folders will be created for each of our 10 core competencies (not 8, since there are two for Communication and two for Diversity). Faculty can simply drag and drop their completed workbooks into the correct folder.
Step 6) Entering results into Nuventive Improve
Produced by: Assessment coordinators and others appointed to enter Improve results
Time required: It is estimated that, with 100% participation (roughly 1,200 documents), it will take about 60 person-hours to enter all of the data into Nuventive Improve.
Used by: Anyone wishing to analyze program effectiveness using Nuventive Improve data
Structure: The folders contain the completed workbooks from the instructors. An extra column (Yes/No, No by default) has been added to indicate if the data have been transferred into Nuventive Improve. A screenshot is included below.
The person or persons tasked with moving these data into Nuventive Improve would follow this procedure.
- Open a data workbook from the SharePoint site.
- Open a blank General Education reporting workbook (from our first process).
- Rename the blank workbook using the name from the SharePoint file.
- Complete the “instructions” sheet using information from “sheet 1” and the file name of the SharePoint file.
- Copy and paste the student names, IDs, and scores from “sheet 2” to the correct columns in the “scores” sheet and save to a folder.
- Open Nuventive Improve and navigate to the correct Core Competency, Outcome, and Standard Measure.
- Enter a new result, using the generated report and meta-data to complete all of the required fields, and attach the Standard Workbook that was just created.
- Change the No to Yes in SharePoint for that file to indicate that the data have now been entered into Improve.
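One computational detail in this step is converting raw assignment scores into rubric values using the total points from the meta-data sheet (see Step 3). Here is a minimal sketch, with the strong caveat that the cut points below are purely illustrative; the real thresholds come from our rubric documentation, not this post.

```r
# Convert raw scores to 0-4 rubric values. The percentage cut points here
# (60/70/80/90) are illustrative placeholders, not our actual rubric.
to_rubric <- function(scores, total_points) {
  if (identical(total_points, "rubric")) {
    return(as.integer(scores))  # already 0-4 rubric values
  }
  pct <- scores / as.numeric(total_points) * 100
  findInterval(pct, c(60, 70, 80, 90))  # 0 below 60%, up to 4 at 90%+
}
```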
I estimate that this entire sequence of steps can be done in less than five minutes per file. The General Education assessment coordinator alone could probably get this done. But some thought might be given to providing PDI to a couple of people to speed up the process. Overall, this process should greatly improve the reproducibility and sustainability of our General Education assessment process. I would really appreciate any suggestions, thoughts, or comments.