Disability Resource Center Student Satisfaction Survey

Spring 2010 Survey Report

Phil McGilton, Assistive Technology Specialist, Bellevue College

Contents

Introduction

Purpose and Scope

The purpose of this survey is to collect student feedback on Disability Resource Center (DRC) programs and services. The survey instrument is designed to be implemented iteratively, allowing for continued data collection. The initial two years of data collected establish a baseline for the DRC program. Continued collection of survey data and student feedback provides opportunities for program assessment, improvement, and adjustment of the survey instrument. Additionally, the survey is intentionally anonymous so that students can share their views of current service offerings openly. The DRC will use the data collected to make program adjustments and address student concerns that have broad impact. The DRC carries the institutional responsibility for compliance-based, equal opportunity services as mandated by the Americans with Disabilities Act, the Rehabilitation Act of 1973, and other federal and state legislation. Because of this role, proactive engagement in service improvement is a primary objective of the DRC and its data collection activities.

Overview and Procedure

The survey was conducted at the end of Bellevue College's Spring Quarter 2010. A 5-point rating scale was used to collect data on each of the following constructs: (1) overall effectiveness of DRC in meeting student needs, (2) effectiveness of the alternative media program, (3) effectiveness of alternative testing services, (4) effectiveness of assistive technology equipment, training and facilities, (5) effectiveness of services for Deaf and hard of hearing students, and (6) effectiveness of note taking and classroom scribe services.

The survey was made available online to all DRC students who had been enrolled during the previous academic year (Summer 2009 through Spring 2010). Participants were notified of the survey through the email addresses they self-reported at DRC intake or through college email address listings. In total, 889 participants were identified and notified of the survey; 695 emails were sent in 2009, and 316 students had been sent the survey the previous year, in 2008. Of those 889, 34 emailed surveys did not reach their destination, leaving a total of 855 potential respondents, of whom 90 ultimately completed the survey. DRC data indicate a total enrollment of 469 students for Spring Quarter 2010, of which 18% (n=90) completed the survey.

Data

Demographics

Most recent quarter attended (n=90)
Disability (n=90)
Type of services/accommodations received (n=90)
Are you enrolled part time or full time? (n=83)
Age (n=85)
Gender (n=86)
Ethnic Background (n=89)
Degree or program of study (n=70)

The top degrees or programs sought by participants were (1) Technology-related fields (n=18), (2) Direct Transfer (n=15), (3) Business (n=10), and (4) Health care-related fields (n=7). The survey provided an open-ended way for participants to describe their own areas of study, and the responses were sorted into general fields of study.

General Questions (n=90)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=90, α=.916

The DRC staff responds to my questions and concerns.
The DRC staff effectively communicates my needs to faculty/instructors.
The DRC staff is effective in providing services and addressing my needs.
If warranted, I would use DRC services again.

Alternative Media (n=16)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=16, α=.839

I received my alternative text in a timely manner and/or by syllabus dates.
I was able to navigate my electronic text or mp3 files and successfully access course materials.
My alternative media allowed for clear understanding of my course materials.
The text-to-speech software (TextAloud, Natural Reader, etc.) and/or screen reader was an effective reading tool.
The DRC staff provided adequate training in the DRC regarding the screen reader and/or text-to-speech software that I used to access my course materials.
Receiving alternative text from the DRC increased my ability to master course requirements.

Alternative Testing (n=65)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=65, α=.813

Exams were scheduled within the preferred time that I requested.
Exams arrived/were delivered on time.
When necessary, the DRC staff communicated my testing needs effectively to faculty.
The physical environment of the testing rooms provided for an appropriate alternative testing situation.
Receiving alternative testing increases my opportunity to demonstrate my understanding of course content.

Assistive Technology (n=13)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=13, α=.86

The equipment/computers in the DRC operated reliably.
The DRC staff provided adequate training on the assistive technology equipment that I used to access my classes and course materials.
The location of the equipment was appropriate and met my needs.
Assistive technologies provided to me by the DRC, such as assistive listening devices, digital recorders, etc., functioned properly and were effective in meeting my needs.
The use of the assistive technology has increased my ability to complete work independently.

Captioning (n=1)

The captionist arrived on time and was prepared at the beginning of the class.
The captionist presented themselves in a professional manner (acted appropriately, refrained from providing me any counsel, advice, or personal opinion, and maintained confidentiality).
The information on the screen was clear and understandable.
The captionist was able to keep up with the pace of the instructor.
The captionist provided communication access which allowed me to actively participate during class.
The captionist was a good match for me and the subjects/topics of my courses.
The process of requesting CART services was easy and met my needs.

Interpreting (n=0)

The interpreter arrived on time.
The interpreter presented themselves in a professional manner (acted appropriately, refrained from providing me any counsel, advice, or personal opinion, and maintained confidentiality).
I understood the interpreter.
The interpreter understood me when I signed or finger spelled.
The interpreter was able to keep up with the pace of the instructor.
The interpreter accurately conveyed the teacher's comments.
The interpreter provided communication access which allowed me to actively participate during class.
The interpreter was a good match for me and the subjects/topics of my courses.
The process of requesting interpreters was easy and met my needs.

Note Taking (n=22)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=22, α=.905

A note taker was set up for my classes in a timely manner.
I received my notes from my note taker in a timely manner.
I received clear and understandable notes.
My note taker made arrangements for me to receive notes in the event of his/her absence.
Receiving note taking services from the DRC increased my ability to master course requirements.

Scribe (n=0)

A scribe was set up for my classes in a timely manner.
In a testing situation, my scribe accurately conveyed my responses.
I received clear and understandable notes from my scribe.
Receiving scribe services from the DRC increased my ability to master course requirements.

Comments and Qualitative Feedback

Comments have been omitted from the public survey. If you would like to discuss this document further, please contact the DRC.

Discussion

Overall, Disability Resource Center students have a positive view of the services the DRC provides and feel those services are effective in meeting their academic accommodation needs. As this is the third administration of the survey, the sections completed by larger numbers of participants are the most reliable (e.g., demographics, disability information, the general DRC questions, and the alternative testing questions). Tests of reliability were conducted where sufficient responses were available. As in previous years, there were not sufficient data to conduct a test of reliability in the areas of Interpreting, Captioning, and Scribing. Testing for the construct of overall DRC effectiveness revealed highly reliable results (n=90, α=.916), up from α=.896 in the 2009 student satisfaction survey. Reliability should continue to be tested as the survey is administered in the future and sample sizes increase.

One interesting finding in this year's results compared with last year's is that many more Disability Resource Center students are majoring in technology-related fields. Among the students surveyed last year, technology certificates and/or majors ranked fourth on the list; they have jumped to the top academic area of focus among students surveyed in 2010. Health care-related fields ranked first among students polled last year and fourth this year, effectively switching positions with technology-related fields.

DRC student comments and responses to questions are largely positive and provide further explanation of the survey results. When students were asked whether the DRC responds to their questions and concerns, 86 of 90 students responded positively, three answered neutrally, and only one provided a negative response. When students were asked whether, if warranted, they would use services again, only a single student indicated they would not use DRC accommodations in the future.

Nonetheless, many students offered constructive suggestions on specific areas of concern. For example, under the category of alternative testing, some students disagreed that their exams were proctored in a non-distracting environment. Due to large increases in enrollment and limited DRC testing space, students gave more negative feedback regarding noise and space on this year's survey than in previous years. The DRC has always strived to provide the least distracting testing environment possible, offering solo testing rooms where available and partitions when two or more students test in one room, but because of restricted space and the large increase in students, we are forced to place three and sometimes four students in one testing room. The DRC staff has advocated for additional space for many years, even meeting with workplace consultants, but the issue is out of our hands at the moment. Since the DRC does not control its physical space, we have chosen to address the issue by reducing the noise in the space provided.

The DRC team discussed this constructive student feedback and has implemented new strategies to provide a less distracting extended-time testing environment. First, the DRC purchased twenty-five new sets of noise-canceling headphones to help students focus on their tests and avoid being distracted by fellow testers. These headphones are now available for student use in the main DRC office as well as in the finals testing overflow room. Second, we have emailed all DRC students who qualify for the accommodation of a non-distracting environment to advertise the availability of the noise-canceling headphones and/or earplugs. Third, the DRC has met and worked with the Assessment Office managers to address the noise generated during finals testing when students line up to take the Compass exam. Students waiting to take the Compass exam must line up directly in front of the DRC testing lab rooms, and this has caused excess noise in previous quarters. The resolution reached between the DRC and the Assessment Office managers was that Compass testers would be asked to wait outside the front door of B132 and be brought in to the testing room all at once, reducing the noise from students waiting in line.

The DRC should use the data from this survey to improve services where possible. Outward recognition and response to this feedback will help to sustain student satisfaction and could enhance student participation in future surveys. The survey should continue to be available to students on a regular basis.

Endnotes

[1] "Cronbach's alpha (or α) is a special measure of reliability known as internal consistency, where the more consistently individual item scores vary with the total score on the test, the higher the value. And, the higher the value, the more confidence you can have that this is a test that is internally consistent or measures one thing, and that one thing is the sum of what each item evaluates. (Salkind, 2008)