Disability Resource Center Student Satisfaction Survey

Spring 2009 Survey Report

Phil McGilton, Assistive Technology Specialist, Bellevue College

Introduction

Purpose and Scope

The purpose of this survey is to collect student feedback on Disability Resource Center (DRC) programs and services. The survey instrument is designed to be implemented iteratively, allowing for continued data collection. The initial two years of data collected establish a baseline for the DRC program. After the third year of data collection, a longitudinal report will be created. The continued collection of survey data and student feedback provides opportunities for program assessment, improvement, and adjustment of the survey instrument. Additionally, the survey is intentionally anonymous, affording students the opportunity to share their views of current service offerings openly. The DRC will use the data collected to make program adjustments and address student concerns that have broad impact. The DRC carries the institutional responsibility for compliance-based, equal-opportunity services as mandated by the Americans with Disabilities Act, the Rehabilitation Act of 1973, and other federal and state legislation. Because of this role, proactive engagement in service improvement is a primary objective of the DRC and its data collection activities.

Overview and Procedure

The survey was conducted at the end of Bellevue College's Spring Quarter 2009. A 5-point rating scale was used to collect data on each of the following constructs: (1) overall effectiveness of DRC in meeting student needs, (2) effectiveness of the alternative media program, (3) effectiveness of alternative testing services, (4) effectiveness of assistive technology equipment, training and facilities, (5) effectiveness of services for Deaf and hard of hearing students, and (6) effectiveness of note taking and classroom scribe services.

The survey was made available online and in paper form to all DRC students who had been enrolled during the previous academic year (Summer 2008-Spring 2009). Participants were notified of the survey through the email addresses they self-reported at DRC intake or through college email address listings. In total, 695 participants were identified and notified of the survey; for comparison, 316 students were sent the survey the previous year. Of those 695 notifications, 93 were undeliverable, leaving a total of 602 potential respondents. Ultimately, 87 students completed the survey (14% of n=602). DRC data indicate a total enrollment of 405 students for the Spring 2009 quarter, of whom 18% (n=73) completed the survey.
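The response-rate figures above follow from simple arithmetic on the counts reported in this section. The short Python sketch below is a minimal illustration only, using just the numbers stated in the text, and reproduces the 602 potential respondents and the 14% and 18% response rates.

# Reproduces the response-rate arithmetic reported above; all counts are taken from the text.
identified = 695            # students identified and notified of the survey
undeliverable = 93          # email notifications that did not reach their destination
potential = identified - undeliverable              # 602 potential respondents
completed = 87              # students who completed the survey
spring_enrollment = 405     # DRC enrollment for Spring 2009
spring_completers = 73      # Spring 2009 students who completed the survey

print(potential)                                            # 602
print(round(100 * completed / potential))                   # 14 (percent)
print(round(100 * spring_completers / spring_enrollment))   # 18 (percent)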

Data

Demographics

Most recent quarter attended (n=87)
Disability (n=87)
Type of services/accommodations received (n=87)
Are you enrolled part time or full time? (n=73)
Age (n=74)
Gender (n=74)
Ethnic background (n=74)
Degree or program of study (n=66)

The top degrees or programs sought by participants were (1) Health care-related fields (n=14), (2) Direct Transfer (n=12), (3) Business (n=12), and (4) Technology-related fields (n=8). The survey provided an open-ended way for participants to describe their own areas of study, and the responses were sorted into general fields of study.

General Questions (n=87)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=87, α=.896

The DRC staff responds to my questions and concerns.
The DRC staff effectively communicates my needs to faculty/instructors.
The DRC staff is effective in providing services and addressing my needs.
If warranted, I would use DRC services again.

Alternative Media (n=14)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=14, α=.714

I received my alternative text in a timely manner and/or by syllabus dates.
I was able to navigate my electronic text or mp3 files and successfully access course materials.
My alternative media allowed for clear understanding of my course materials.
The text-to-speech software (TextAloud, Natural Reader, etc.) and/or screen reader was an effective reading tool.
The DRC staff provided adequate training in the DRC regarding the screen reader and/or text-to-speech software that I used to access my course materials.
Receiving alternative text from the DRC increased my ability to master course requirements.

Alternative Testing (n=62)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=62, α=.821

Exams were scheduled within the preferred time that I requested.
Exams arrived/were delivered on time.
When necessary, the DRC staff communicated my testing needs effectively to faculty.
The physical environment of the testing rooms provided for an appropriate alternative testing situation.
Receiving alternative testing increases my opportunity to demonstrate my understanding of course content.

Assistive Technology (n=12)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=12, α=.863

The equipment/computers in the DRC operated reliably.
The DRC staff provided adequate training on the assistive technology equipment that I used to access my classes and course materials.
The location of the equipment was appropriate and met my needs.
Assistive technologies provided to me by the DRC, such as assistive listening devices, digital recorders, etc., functioned properly and were effective in meeting my needs.
The use of the assistive technology has increased my ability to complete work independently.

Captioning (n=1)

The captionist arrived on time and was prepared at the beginning of the class.
The captionist presented themselves in a professional manner (acted appropriately, refrained from providing me any counsel, advice, or personal opinion, and maintained confidentiality).
The information on the screen was clear and understandable.
The captionist was able to keep up with the pace of the instructor.
The captionist provided communication access which allowed me to actively participate during class.
The captionist was a good match for me and the subjects/topics of my courses.
The process of requesting CART services was easy and met my needs.

Interpreting (n=0)

The interpreter arrived on time.
The interpreter presented themselves in a professional manner (acted appropriately, refrained from providing me any counsel, advice, or personal opinion, and maintained confidentiality).
I understood the interpreter.
The interpreter understood me when I signed or finger spelled.
The interpreter was able to keep up with the pace of the instructor.
The interpreter accurately conveyed the teacher's comments.
The interpreter provided communication access which allowed me to actively participate during class.
The interpreter was a good match for me and the subjects/topics of my courses.
The process of requesting interpreters was easy and met my needs.

Note Taking (n=19)

Internal consistency reliability was measured using Cronbach's alpha for the questions in this section [1].

For n=19, α=.824

A note taker was set up for my classes in a timely manner.
I received my notes from my note taker in a timely manner.
I received clear and understandable notes.
My note taker made arrangements for me to receive notes in the event of his/her absence.
Receiving note taking services from the DRC increased my ability to master course requirements.

Scribe (n=1)

A scribe was set up for my classes in a timely manner.
In a testing situation, my scribe accurately conveyed my responses.
I received clear and understandable notes from my scribe.
Receiving scribe services from the DRC increased my ability to master course requirements.

Comments and Qualitative Feedback

Comments have been omitted from the public version of this report. If you would like to discuss this document further, please contact the DRC.

Discussion

Overall, Disability Resource Center students have a positive view of the services the DRC provides and feel they are effective in meeting academic accommodation needs. Because this is only the second administration of the survey, the sections completed by larger numbers of participants are the most reliable (e.g., demographics, disability information, general DRC questions, and alternative testing questions). Tests of reliability were conducted where sufficient data/responses were available; there were not sufficient data to conduct a test of reliability in the areas of Interpreting, Captioning, and Scribing. Testing for the construct of overall DRC effectiveness revealed highly reliable results (n=87, α=.896). Reliability should continue to be tested as the survey is administered in the future and sample sizes increase.

One interesting finding regarding last year's survey reliability emerged in the Alternative Testing section, where the question about testing locations outside of the DRC offices (e.g., in a faculty office, private classroom, or other facility where faculty provide test proctoring) seemed to cause considerable confusion among participants. This was somewhat anticipated when the survey questions were developed. It is likely that many students taking the survey had little knowledge of this practice, as nearly all students receive extended-time testing accommodations within the DRC. Removing this question from reliability testing and from this year's survey substantially improved the results.

DRC student comments and responses to questions are largely positive and provide further explanation of the survey results. When asked whether the DRC responds to their questions and concerns, 82 of 87 students responded positively. When asked whether, if warranted, they would use DRC services again, not a single student indicated that they would decline to use DRC accommodations in the future.

Nonetheless, many students offered constructive suggestions on specific areas of concern. For example, under the category of alternative testing, one student disagreed that their exams were scheduled within their preferred time. It may or may not have been the same student who expressed dissatisfaction that the DRC is not open in the evenings to proctor extended-time testing for students in evening classes. The DRC team discussed this constructive feedback and has implemented two evenings per week of extended-time testing proctoring starting Fall Quarter 2009.

The DRC should use the data from this survey to improve services where possible. Visibly acknowledging and responding to this feedback will help sustain student satisfaction and could enhance student participation in future surveys. The survey should continue to be made available to students on a regular basis.

Endnotes

[1] "Cronbach's alpa (or α) is a special measure of reliability known as internal consistency, where the more consistently individual item scores vary with the total score on the test, the higher the value. And, the higher the value, the more confidence you can have that this is a test that is internally consistent or measures one thing, and that one thing is the sum of what each item evaluates. (Salkind, 2008)