Virtual School Symposium. VSS2008. Evaluation workshop.

Some notes from the workshop entitled: “Evaluation of Online Learning Programs”
PowerPoint and other notes are on the wiki at: http://vss2008.wikispaces.com/clark

Presenters: Tom Clark; Donna Scribner (VHS, Inc.); Mickey Revenaugh (Connections Academy); Martha Donaldson (Alabama ACCESS); Talbot Bielefeldt (ISTE).

Discussed the online guide from the Dept. of Ed. for evaluating online courses:

http://www.ed.gov/admins/lead/academic/evalonline/index.html

Evaluating online learning is a political process because stakeholders differ in their levels of understanding of online learning. Whether evaluation results are perceived as showing an effective program is itself a political question.

Evaluation needs to be ongoing.

There is a difference between research and evaluation: research asks whether online learning works in general, while evaluation asks whether a particular program is achieving its goals.

For example, an evaluation question: is this particular online course (e.g., Algebra 1) getting the same results as the same class taught face to face?

Research Committee of NACOL. (Donna Scribner)

The interest of NACOL members has been driving this research.

  • Looking at online teacher support models and mentoring.
  • Research book: Online teacher support programs: Mentoring and Coaching Models
  • New research is coming out on what policies exist. Surveys were sent out and 81 different schools responded, giving a snapshot in time of their policies. 16% of those schools had no written policy.

Why evaluate online learning? (Tom Clark)

  • To demonstrate the value of program
  • To improve program over time
  • To document participant outcomes

What happens if evaluation is neglected?

  • Program set up without clear goals
  • Focus on activities and simple outputs
  • Desired change in participant outcomes undefined
  • Data essential to studying success not collected
  • Focus on anecdotal evidence
  • Stakeholder information needs not met
  • Program unable to demonstrate value or worth

Prepare to Evaluate

  • Plan for evaluation early
  • Think about program goals and how to show they have been achieved
  • Think about program roles
    • What outcomes are desired?
    • What data need to be collected?
    • What program elements/policies would support success?

At the start of your online learning program, define what you will do to evaluate quality, effectiveness, and impact.

Martha Donaldson – Alabama

  • Sent out an RFP for evaluation; ISTE won the bid.
  • Had online learning and interactive video conferencing
  • Look at program growth.
  • Enrollments grew from 300 in 2004-05 to 18,955 by summer 2007.

Alabama (ACCESS) Virtual School evaluation reports here: http://accessdl.state.al.us/showaccess.php?lnk=resources .

  • Classroom-based program with a non-credentialed adult facilitator serving as lab aide. Students taking online classes all sit in the same computer lab while taking these courses. Any school with a computer lab could participate in the ACCESS program. Courses are in addition to regular classes (part-time enrollment).
  • Financial support from the state
  • Rapid expansion of the infrastructure
  • Regional support centers
  • Counselors had to sign up students.
  • Teaching practice is changing because of technology integration and is becoming more student-centered.
  • Student surveys: some dissatisfaction, requests for better communication, concerns about student readiness, and technical issues (working on pre-qualifying students).

Donna Scribner, VHS, Inc. Dscribner@goVHS.org.

  • VHS began in 1996 – Tech Innovation Grant – 5-year, $7.4 million
  • Once the grant ran out, they had to become self-funded.
  • So they changed to a consortium/fee-based structure, which makes student and school satisfaction very important.
  • Not diploma granting
  • Quality of courses (tie back to mission)
  • Quality of professional development (tie back to mission)
  • Quality of services & programs (tie back to mission)

Be careful to evaluate over time; some years can be up and some down.

Evaluating success of AP courses – challenge is getting AP test scores from member schools.

  • Part of the problem was having someone to collect the data
  • Quality indicators for first-year teachers: someone now goes into the classes to see whether teachers are following the online teaching indicators (e.g., posting announcements, how long it takes to respond to students). Teachers mark off when they have talked to students.
  • Level 1 – All new teachers start at level 1, then based on some criteria, move to Level 2.
  • Be aware of your “pain” points
  • Another challenge is that VHS, Inc. teachers teach one online period and are also employed by a school district. If they are not performing well teaching online, it is a little sensitive to report that back to the teacher.

Question: How do you do evaluation without a budget?

  • Use internal staff and decide which indicators to track – there was no clear answer on this.

Mickey Revenaugh – Connections Academy – has been around for 8 years

  • When a student enrolls, gather all information from the student, IEP, etc.
  • Built a robust student information system (SIS) right into the course management system to gather this information.
  • On-the-fly data analysis is possible because of the data collected about each student.
  • Everything the student does is logged in the system so that teachers have access to it.
  • Robust yearly parent satisfaction survey – 14 schools in 12 states.
  • STAR tracker – a rating system built into every lesson in every online course (like Amazon’s). The entire school can be rated on a 5-star basis (5 is great, 1 is terrible, and comments can be made for each lesson). Ratings can change daily. (A rough sketch of how such ratings might be stored appears after this list.)
  • Measurable school goals.
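Since the STAR tracker itself is Connections Academy’s internal tool, the following is only a rough sketch of how per-lesson 5-star ratings with comments might be stored and rolled up to a per-lesson average; every name and field below is hypothetical, not their actual schema:

```python
from dataclasses import dataclass
from statistics import mean
from typing import Optional

@dataclass
class LessonRating:
    """One student's rating of one lesson: 1 (terrible) to 5 (great), with an optional comment."""
    course_id: str
    lesson_id: str
    student_id: str
    stars: int                      # 1-5
    comment: Optional[str] = None

def average_by_lesson(ratings: list[LessonRating]) -> dict[tuple[str, str], float]:
    """Roll ratings up to a per-lesson average so curriculum staff can spot weak lessons."""
    buckets: dict[tuple[str, str], list[int]] = {}
    for r in ratings:
        buckets.setdefault((r.course_id, r.lesson_id), []).append(r.stars)
    return {key: mean(stars) for key, stars in buckets.items()}

# Example: two ratings for the same (hypothetical) Algebra 1 lesson average to 3.5 stars.
sample = [
    LessonRating("ALG1", "unit2-lesson5", "s001", 4, "Clear examples"),
    LessonRating("ALG1", "unit2-lesson5", "s002", 3, "Too long"),
]
print(average_by_lesson(sample))   # {('ALG1', 'unit2-lesson5'): 3.5}
```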

Interactive Discussion – Student feedback tools.

Connections Academy (K-12) management system allows students to rate individual lessons.

Curriculum people can download this data at any time.

Teachers and curriculum team look at ongoing evaluations.

They look for patterns; there is always a range of students who like or dislike a given lesson.

They compare how popular a course is with how well students actually did in the course.

They are beginning to see that popularity doesn’t necessarily equal effectiveness.

Challenging courses may be popular even though many students are not passing them.
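To make “popularity doesn’t equal effectiveness” concrete, one could simply line up each course’s average lesson rating against its pass rate. The numbers below are invented for illustration, not Connections Academy data:

```python
# Hypothetical per-course summaries: average 5-star rating vs. share of students passing.
courses = {
    "Spanish 1":  {"avg_stars": 4.6, "pass_rate": 0.68},
    "Algebra 1":  {"avg_stars": 3.1, "pass_rate": 0.91},
    "AP Biology": {"avg_stars": 4.4, "pass_rate": 0.55},  # popular but challenging
}

# Sort by popularity and print both measures side by side.
for name, stats in sorted(courses.items(), key=lambda kv: kv[1]["avg_stars"], reverse=True):
    print(f"{name:<11} rating={stats['avg_stars']:.1f}  pass_rate={stats['pass_rate']:.0%}")
# The most highly rated courses are not necessarily the ones with the highest pass rates.
```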

Even young children can rate a lesson.

Learning coaches (a helpful adult) also evaluate lessons; the program has invested heavily in these paraprofessionals and provided lots of training on using the system.

Also have parents rate the lessons.

Feedback from students, learning coach, and parents

Learning coaches sign an agreement that defines their roles.

Student feedback tools:

  • Individual rating by lesson
  • Rating by course – learning coach, parents
  • Teacher-to-student communication (end-of-course survey embedded in the learning management system; the system keeps track of all emails, phone calls, etc., through advanced academics)

Students self-report the number of hours they spend on each class at the end of each week.
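As a sketch of what those weekly self-reports might look like once collected (the structure and field names are my assumptions, not the actual system’s):

```python
from collections import defaultdict

# Each entry: (student_id, course_id, week_ending_date, self_reported_hours) -- hypothetical data.
weekly_hours = [
    ("s001", "ALG1",  "2008-10-24", 4.5),
    ("s001", "SPAN1", "2008-10-24", 3.0),
    ("s001", "ALG1",  "2008-10-31", 2.0),
]

# Total self-reported hours per student per course, summed across weeks.
totals: dict[tuple[str, str], float] = defaultdict(float)
for student, course, _week, hours in weekly_hours:
    totals[(student, course)] += hours
print(dict(totals))  # {('s001', 'ALG1'): 6.5, ('s001', 'SPAN1'): 3.0}
```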

See the VSS Tuesday workshop: all emails stay within the system, students are provided system email, and the system keeps track of all interaction.

No one is collecting data on student-to-student interaction.

Satisfaction surveys of students, mentors and parents.

See the book The Ultimate Question (c. 2006) on tracking student/client loyalty: http://www.theultimatequestion.com/theultimatequestion/home.asp
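The Ultimate Question is Fred Reichheld’s book on the Net Promoter Score, so the loyalty tracking presumably follows an NPS-style survey. A minimal sketch of that calculation, assuming the standard 0–10 “would you recommend us?” question (the program’s actual survey may differ):

```python
def net_promoter_score(responses: list[int]) -> float:
    """NPS = % promoters (9-10) minus % detractors (0-6), on a 0-10 'would recommend' scale."""
    if not responses:
        raise ValueError("no survey responses")
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return 100.0 * (promoters - detractors) / len(responses)

# Example: 10 hypothetical parent responses -> 4 promoters, 2 detractors -> NPS of +20.
print(net_promoter_score([10, 10, 9, 9, 8, 8, 7, 7, 5, 3]))  # 20.0
```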

Pop-up questions are used to get student feedback – just three questions that are content specific.

Lessons Learned/Next Steps (Dept. Ed, OII, 2008 – Evaluating Online Learning)

  • From the seven evaluation challenges for online programs:
    • Meeting needs of multiple stakeholders
    • Building on existing base of knowledge
    • Evaluating multifaceted online resources
    • Finding appropriate comparison groups
    • Solving data collection problems
    • Interpreting the impact of program maturity
    • Translating evaluation findings into action

Panelist Question: What are lessons learned about effective evaluation practices?

Mickey: Missouri – MoVIP K-5 Results. User satisfaction high, teachers make the difference, validated the model.

Grades 3, 4, and 5 results were mixed. Parents often choose this because their children were not successful in regular school, or they came from independent homeschooling. Standardized test results were not great.

Connections Academy provides field trips as part of the courses, e.g., to the state capitol.

They also provide an area amusement park trip; all parents and students in the school show up.

Positive results are no guarantee that the program will continue. However, every evaluation we do adds to the research base for online learning.

Lessons learned

  • No substitute for familiarity
  • Data transmission is an art: there is a challenge in coordinating between the collectors of data, the data reviewers, and the main evaluator.
  • Positive results are no guarantee…In Mississippi, they had positive results but the school did not continue.

Donna…

  • Hire evaluators who know what they are doing and ask the right questions. You have to rely on this person for continuation of the program.
  • Need longitudinal data and look for trends…don’t just do something after one semester.
  • Define your evaluation questions from your mission. (A question might be interesting, but does it inform? E.g., does it really matter to know which school the student comes from?)
  • Less is MORE…focus on the basics of the program for evaluation.
  • Continuous course improvements – students don’t need to learn how to be online anymore.
  • Teachers/staff development need more time.
  • Need for teacher support – online, we need to connect more. Research says every student wants to be valued and every student wants to have a voice.
  • Do we develop courses to match the student or do we develop students to match the courses?

Martha Donaldson – Alabama

(face to face lab courses)

Develop an action plan as needed.

A problem area was foreign language – there were 5 courses and some problems were found. They needed to find better voice tools.

Needed headphones.

More detailed alignment and gap analysis process.

Needed to develop some courses in-house.

Communication is one of the biggest factors; communication between teacher and student is a critical part of success.

Moved to Elluminate for faculty meetings

Assigned a state department liaison for each support center. (Using the web conferencing system rubbed off on other state Dept. of Ed. departments.)

Need a professional development plan – we need to let teachers know what is expected.

Talbot

Alabama Dept of Ed has a relationship with local school districts.

Alabama support centers (trainings of paraprofessionals)

Training –

Satisfaction surveys…

  • Lack of prerequisite skills by students
  • Enrollment process cumbersome
  • Need more science courses
  • School technology inadequate

(A challenge for the evaluator is that things change quickly – e.g., a new course is added. Also, the system is not set up to collect the data we want: there are no statewide student ID numbers, and student confidentiality must be protected.)

  • In all the trainings, people said they want more interaction between students and teachers.
  • A big push is to develop a system where we can survey teachers who teach the same course content face to face and online – a comparison study (see the sketch below).
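At its simplest, such a comparison study would put outcomes for face-to-face and online sections of the same course side by side. A minimal sketch with invented scores and a plain difference of means; a real evaluation would control for student characteristics and test for significance:

```python
from statistics import mean, stdev

# Hypothetical end-of-course scores for the same course content, two delivery modes.
f2f_scores    = [78, 85, 69, 90, 74, 81, 88, 73]
online_scores = [80, 72, 91, 77, 84, 86, 70, 79]

diff = mean(online_scores) - mean(f2f_scores)
print(f"face-to-face: mean={mean(f2f_scores):.1f} sd={stdev(f2f_scores):.1f}")
print(f"online:       mean={mean(online_scores):.1f} sd={stdev(online_scores):.1f}")
print(f"raw difference (online - f2f): {diff:+.1f} points")
```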

Small Group Discussion

B. Changing course: Early warnings/interim evaluations/mid-course corrections.

What can you change mid-stream?

  • How close are you sticking to the goals and mission?
  • Student – is the student learning the material? Is there enough interaction?
  • Teacher
  • Content
  • Technology

Research shows that 20% of the students get 80% of the teacher’s attention.

What measure can we use to determine that teachers care for the student?

Phone interview with students. Talk with the students.
