Disability Resource Center Student Satisfaction Survey

Spring 2008 Survey Report

Ryan Collier, Access IT Specialist, Bellevue Community College

Contents

Introduction
Data
Discussion
Endnotes

Introduction

Purpose and Scope

The purpose of this survey is to collect student feedback on Disability Resource Center (DRC) programs and services. The initial data collected establish a baseline for the DRC program. The survey instrument is designed to be implemented iteratively, allowing for continued data collection on at least two occasions per academic year. The continued collection of survey data and student feedback provides opportunities for program assessment, improvement, and adjustment of the survey instrument. Additionally, the survey is intentionally anonymous, affording students the opportunity to share their views of current service offerings openly. The DRC will use the data collected to make program adjustments and to address student concerns that have broad impact. The DRC carries the institutional responsibility for compliance-based, equal-opportunity services as mandated by the Americans with Disabilities Act, the Rehabilitation Act of 1973, and other federal and state legislation. Because of this role, proactive engagement in service improvement is a primary objective of the DRC and its data collection activities.

Overview and Procedure

The survey was conducted at the end of Bellevue Community College's Spring Quarter 2008. A 5-point rating scale was used to collect data on each of the following constructs: (1) overall effectiveness of DRC in meeting student needs, (2) effectiveness of the accessible media program, (3) effectiveness of alternative testing services, (4) effectiveness of assistive technology equipment, training and facilities, (5) effectiveness of services for Deaf and hard of hearing students, and (6) effectiveness of note taking and classroom scribe services.

The survey was made available online and in paper form to all DRC students who had been enrolled during the previous academic year (Summer 2007 through Spring 2008). Participants were notified of the survey through their self-reported email addresses, collected at the time of DRC intake or gathered through college email address listings. In total, 316 participants were identified and notified of the survey. Of those 316, messages to 72 email addresses did not reach their destination, leaving a total of 244 potential respondents. Ultimately, 81 students completed the survey (33% of n=244). DRC data indicate a total enrollment of 137 students for the Spring 2008 quarter, of which 53% (n=73) completed the survey.
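For reference, the response-rate arithmetic above can be restated as a short calculation. The sketch below simply restates the figures reported in this section; it is not output from the DRC's records system.

```python
# Response-rate arithmetic for the Spring 2008 survey (figures from the text above).
identified = 316                      # DRC students identified and notified
bounced = 72                          # notifications that did not reach their destination
potential = identified - bounced      # 244 potential respondents
completed = 81                        # surveys completed overall

spring_enrolled = 137                 # DRC enrollment for Spring Quarter 2008
spring_completed = 73                 # Spring 2008 students who completed the survey

print(f"Potential respondents: {potential}")                                    # 244
print(f"Overall response rate: {completed / potential:.0%}")                    # 33%
print(f"Spring 2008 response rate: {spring_completed / spring_enrolled:.0%}")   # 53%
```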

Data

Demographics

Most recent quarter attended. (n=81)
Disability (n=81)
Type of services/accommodations received. (n=81)
Are you enrolled part time or full time? (n=75)
Age (n=78)
Gender (n=77)
Ethnic Background (n=78)
Degree or program of study (n=68)

The top degrees or programs sought by participants are (1) Direct Transfer (n=19), (2) Business (n=7), and (3) health care-related fields (n=5). The survey provided an open-ended way for participants to describe their own areas of study; as a result, little consistency was found among answers. The next iteration of the survey should include specific programs or degrees to select so that the data can be better interpreted.

General Questions (n=81)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=81, α=.934

The DRC staff responds to my questions and concerns.
The DRC staff effectively communicates my needs to faculty/instructors.
The DRC staff is effective in providing services and addressing my needs.
If warranted, I would use DRC services again.
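Endnote [1] describes Cronbach's alpha conceptually. As an illustration of how the internal-consistency values reported throughout this document could be computed, the following is a minimal sketch using the standard item-variance form of alpha; the example ratings are hypothetical and are not drawn from the survey data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items array of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the construct
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: four 5-point items (as in the General Questions construct)
# rated by five respondents. These ratings are illustrative only.
ratings = [
    [5, 5, 4, 5],
    [4, 4, 4, 5],
    [3, 4, 3, 4],
    [5, 5, 5, 5],
    [2, 3, 2, 3],
]
print(round(cronbach_alpha(ratings), 3))
```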

Accessible Media (n=17)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=17, α=.689

I received my alternative text in a timely manner and/or by syllabus dates.
I was able to navigate my data CDs or emailed ZIP files and successfully access course materials.
My accessible media allowed for clear understanding of my course materials.
The DRC staff provided adequate training in the DRC regarding the screen reader and/or text-to-speech software that I used to access my course materials.
Receiving alternative text from the DRC increased my ability to master course requirements.

Alternative Testing (n=58)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=58, α=.580. Raw data demonstrated confusion on question 5**. We removed this question from consistency testing, and reliability improved substantially: for n=58, n of items=5, α=.792 (see the sketch following the item list below). Future iterations of this survey will refine this aspect of the Alternative Testing construct.

Exams were scheduled within the preferred time that I requested.
Exams arrived/were delivered on time.
When necessary, the DRC staff communicated my testing needs effectively to faculty.
The physical environment of the testing rooms inside the DRC provided for an appropriate alternative testing situation.
The physical environment of the testing rooms outside the DRC provided for an appropriate alternative testing situation.**
Receiving alternative testing increases my opportunity to demonstrate my understanding of course content.
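The decision to drop question 5** from consistency testing can be checked mechanically with an "alpha if item deleted" pass, sketched below. The responses generated here are hypothetical stand-ins rather than the survey's raw data; they simply illustrate how a single item that does not track the rest of the scale depresses the overall alpha.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items array (rows = respondents)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(axis=0, ddof=1).sum()
                            / items.sum(axis=1).var(ddof=1))

def alpha_if_deleted(items):
    """Recompute alpha with each item removed in turn; a jump in alpha when an
    item is dropped flags that item as inconsistent with the rest of the scale."""
    items = np.asarray(items, dtype=float)
    return [round(cronbach_alpha(np.delete(items, j, axis=1)), 3)
            for j in range(items.shape[1])]

# Hypothetical data: five items built from a shared rating plus small noise,
# and one unrelated noisy item in position 5 standing in for question 5**.
rng = np.random.default_rng(1)
base = rng.integers(2, 5, size=(30, 1))
coherent = np.clip(base + rng.integers(-1, 2, size=(30, 5)), 1, 5)
noisy = rng.integers(1, 6, size=(30, 1))
responses = np.hstack([coherent[:, :4], noisy, coherent[:, 4:]])

print("alpha (all items):", round(cronbach_alpha(responses), 3))
print("alpha if item deleted:", alpha_if_deleted(responses))
```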

Assistive Technology (n=8)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=8, α=.820

The equipment/computers in the DRC operated reliably.
The DRC staff provided adequate training on the assistive technology equipment that I used to access my classes and course materials.
The location of the equipment was appropriate and met my needs.
Assistive technologies provided to me by the DRC, such as assistive listening devices, digital recorders, etc., functioned properly and were effective in meeting my needs.
The use of the assistive technology has increased my ability to complete work independently.

Captioning (n=2)

The captionist arrived on time and was prepared at the beginning of the class.
The captionist presented themselves in a professional manner (acted appropriately, refrained from providing me any counsel, advice, or personal opinion, and maintained confidentiality).
The information on the screen was clear and understandable.
The captionist was able to keep up with the pace of the instructor.
The captionist provided communication access which allowed me to actively participate during class.
The captionist was a good match for me and the subjects/topics of my courses.
The process of requesting CART services was easy and met my needs.

Interpreting (n=5)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=5, α=.977

The interpreter arrived on time.
The interpreter presented themselves in a professional manner (acted appropriately, refrained from providing me any counsel, advice, or personal opinion, and maintained confidentiality).
I understood the interpreter.
The interpreter understood me when I signed or finger spelled.
The interpreter was able to keep up with the pace of the instructor.
The interpreter accurately conveyed the teacher's comments.
The interpreter provided communication access which allowed me to actively participate during class.
The interpreter was a good match for me and the subjects/topics of my courses.
The process of requesting interpreters was easy and met my needs.

Note Taking (n=15)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=15, α=.951

A note taker was set up for my classes in a timely manner.
I received my notes from my note taker in a timely manner.
I received clear and understandable notes.
My note taker made arrangements for me to receive notes in the event of his/her absence.
Receiving note taking services from the DRC increased my ability to master course requirements.

Scribe (n=3)

A scribe was set up for my classes in a timely manner.
In a testing situation, my scribe accurately conveyed my responses.
I received clear and understandable notes from my scribe.
Receiving scribe services from the DRC increased my ability to master course requirements.

Comments and Qualitative Feedback

Comments have been omitted from the public survey report. If you would like to discuss this document further, please contact the DRC.

Discussion

Overall, DRC students have a positive view of DRC services and feel they are effective in meeting academic accommodation needs. Data analysis revealed that participants rated the DRC much higher than would be expected by chance alone. Scale statistics showed the overall DRC ratings were 3.296 standard deviations above the expected mean (where n of items=4, μ=12, X̄=17.79, and s²=12.893).
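One possible reading of this comparison, offered only as an assumption about the underlying calculation rather than a description of the analysis actually run, is a one-sample comparison of the four-item General Questions scale against its chance-level mean (four items at the 5-point scale midpoint of 3), using the standard error of the scale:

```latex
\mu = 4 \times 3 = 12, \qquad s = \sqrt{12.893} \approx 3.59, \qquad
\frac{\bar{X} - \mu}{s/\sqrt{n}} = \frac{17.79 - 12}{3.59/\sqrt{4}} \approx 3.2
```

Under this reading the result lands in the neighborhood of the reported 3.296, with the difference attributable to rounding of the scale statistics quoted above.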

Some survey data can be generalized to the overall DRC population. Because this was the first administration of the survey, the sections completed by larger numbers of participants are the most reliable (e.g., demographics, disability information, general DRC questions, and alternative testing questions). Tests of reliability were conducted where sufficient data/responses were available. Testing of the construct of overall DRC effectiveness revealed highly reliable results, where n=81, α=.934.

Reliability should continue to be tested as the survey is administered in the future and sample sizes increase. One interesting finding about reliability emerged in the Alternative Testing section, where the question about testing locations outside of the DRC offices (e.g., in a faculty office, private classroom, or other facility where faculty provide test proctoring) seemed to cause confusion among participants. This was somewhat anticipated as the survey questions were developed. It is likely that many students taking the survey had little knowledge of this practice, as nearly all students test within the DRC. Removing this question from reliability testing substantially improved the results.

The DRC should use the data from this survey to improve services where possible. Student comments are largely positive and provide further explanation of the survey results. Nonetheless, many students offered constructive suggestions on specific areas of concern. Outward recognition of and response to this feedback will help to sustain student satisfaction and could enhance student participation in future surveys. The survey should continue to be made available to students on a regular basis.

Endnotes

[1] "Cronbach's alpa (or α) is a special measure of reliability known as internal consistency, where the more consistently individual item scores vary with the total score on the test, the higher the value. And, the higher the value, the more confidence you can have that this is a test that is internally consistent or measures one thing, and that one thing is the sum of what each item evaluates. (Salkind, 2008)