
Week 5 TEL One Planning

In my previous role at a university, I managed a project to deploy new e-learning tools to make training in the university’s student record system more efficient. Thinking about that project….

Who were your stakeholders?

I was part of the Student Registry, which was responsible for managing the university’s student record system and had overall responsibility for training staff in its use. So, Registry staff, along with staff users of the system across the university, were stakeholders in the project, as were the managers who would provide the funding for the new e-learning tools.

What resources were used?

Microsoft Project was used to allocate tasks and manage timescales, and it worked very well.

How clear/achievable was the project plan?

The project plan was fairly clear and included a stage for evaluating various e-learning tools, including seeking advice from other HE institutions that were using e-learning for similar staff training.

What fallback position, if any, did you build into your plan in the event of full or partial project failure?

There was no fallback position as such. Though e-learning tools would make staff training more efficient, not having them wouldn’t stop the delivery of training.

What methods did you use to evaluate your project?

Though no formal evaluation methods were used, project progress was monitored on a regular basis through meetings with line managers. Colleagues in other teams and departments were kept up to date through team and inter-departmental meetings.

How did you measure project success?

Feedback was a major factor in measuring project success. A university-wide survey about staff training (in general) was used to measure it. Past survey results about Registry-offered training courses were compared with results from surveys run after the new tools were introduced. There was a significant improvement in areas like how the training was delivered, how accessible the material was offsite/offline, and engagement with the training courses.

Did you celebrate your success and did this encourage further developments?

The project success was celebrated as a big achievement for the Registry. The project helped to modernise an area of the university’s staff training provision which had not kept up with changing times and technologies. To help encourage further developments (within my institution and in others), I was encouraged to join an HE “trainers network” to learn and share techniques for delivering this type of training to staff.


Week 4 TEL One e-Assessments

I have fairly limited experience of e-Assessments (and all of it from the learner perspective). So, here are my thoughts on the following questions….

  • Why did/would you choose a particular type of e-assessment? Describe why you think it is effective and how it can help deepen knowledge and understanding.
  • In your experience, what type of approach creates an environment conducive to self-directed learning, peer support and collaborative learning? How might technology help?
  • What opportunities and challenges does this approach present to tutors?

The best types of e-Assessments are probably MCQs or quiz-type assessments which provide learners with the correct answer after a response is submitted. These assessments give learners explanations of the correct answers and help them consolidate what they have learned.

The two types of e-Assessment approaches suited to self-directed learning are MCQs and quiz-type assessments. I experienced both in an online course I took over six months, and its feedback mechanism was excellent. Tutors could post feedback on the assessment as a whole, as an overview, but could also give feedback on individual questions, and questions that weren’t answered adequately could be marked for resubmission. The learner could then read the feedback and resubmit only the “marked” questions for re-assessment.
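That resubmission workflow can be sketched in a few lines of code. This is purely illustrative, assuming a simple model of questions and tutor feedback; the class and field names are my own invention and not taken from any real e-assessment platform.

```python
# Hypothetical sketch of per-question feedback with selective resubmission,
# as described above. Names are illustrative, not from a real system.

class Question:
    def __init__(self, text):
        self.text = text
        self.feedback = ""
        self.marked_for_resubmission = False

class Assessment:
    def __init__(self, questions):
        self.questions = questions
        self.overall_feedback = ""  # tutor's overview of the whole assessment

    def give_feedback(self, index, feedback, resubmit=False):
        """Tutor feedback on one question; optionally mark it for resubmission."""
        q = self.questions[index]
        q.feedback = feedback
        q.marked_for_resubmission = resubmit

    def questions_to_resubmit(self):
        """The learner resubmits only the marked questions."""
        return [q for q in self.questions if q.marked_for_resubmission]

# Example: two questions, one answered inadequately and marked for resubmission
quiz = Assessment([Question("2 + 2 = ?"), Question("Capital of France?")])
quiz.overall_feedback = "Good overall; one question needs another attempt."
quiz.give_feedback(0, "Correct, well done.")
quiz.give_feedback(1, "Check your spelling and try again.", resubmit=True)
print([q.text for q in quiz.questions_to_resubmit()])
```

The key design point is that feedback exists at two levels (whole-assessment and per-question), and only the marked subset goes back to the learner for another attempt.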

I have also seen “Who wants to be a millionaire” style clickers used in a large lecture room, where the lecturer shows the question on an OHP and the learners choose the correct answer using their personal clickers. The percentage of answers (both correct and incorrect) is immediately displayed on screen, and the lecturer provides the rationale for the correct answer where appropriate. This type of e-Assessment suits both individual and collaborative learning, as groups could share one clicker between them.
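The on-screen percentage display is simple aggregation. A minimal sketch, assuming responses arrive as a list of option letters (the function name and options are my own, not from any real clicker system):

```python
# Illustrative only: how a clicker system might tally responses and compute
# the percentage for each option immediately, as described above.
from collections import Counter

def tally(responses, options=("A", "B", "C", "D")):
    """Return the percentage of responses received for each option."""
    counts = Counter(responses)
    total = len(responses)
    return {opt: round(100 * counts[opt] / total, 1) for opt in options}

# 10 learners click; 6 choose "B"
responses = ["B", "A", "B", "B", "C", "B", "B", "D", "B", "A"]
print(tally(responses))  # {'A': 20.0, 'B': 60.0, 'C': 10.0, 'D': 10.0}
```

In a lecture setting this would run as each response arrives, so the bar chart of percentages can update live before the lecturer reveals the answer.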

Technology can play a big part in making assessments and feedback mechanisms more efficient and more engaging for learners. To make assessments more fun, lecturers should consider using personal response clickers and similar techniques as much as possible. I have first-hand experience of how interest and engagement in lectures go up when such technology is used. Using technology for feedback is also crucial: the feedback is recorded and can be typed in directly rather than written on paper first. Email/SMS can then be used to encourage learners to view the feedback once it becomes available online.

E-Assessments are a great opportunity for lecturers to provide quick and immediate feedback. They are especially useful in online courses where (as described above) the tutor can provide feedback on individual questions and on assessments as a whole. Tutors can also track whether the feedback has been viewed by the learner or not (as access to course materials on the course I took was tracked to ensure learners actually read the materials). The biggest challenges with using technology are the availability of technological solutions, adequate training for staff in the use of these technologies and the support mechanisms to maintain/further develop the technology once implemented.

Week 3 TEL One Perspective of a learner

After the bank holiday weekend, I am in catch-up mode this week (as it’s “reading week”). So, apologies for the late post. I reviewed the following…

Khan Academy’s YouTube video: https://www.youtube.com/watch?v=2IfWIGby7K0

ElearningExamples: Candidate Match Game II (mainly)

iEthiCS simulation: Introduction to the Andy Dufrayne Case http://www.elu.sgul.ac.uk/iethics/

1) What elements of these do you think are appealing to different learners?

The YouTube videos appear to be designed for a younger audience, using simplistic examples and characters/voices that younger viewers are likely to relate to. I only watched a couple of videos (incl. https://www.youtube.com/watch?v=0VIi4kxbbqw), but they reinforced that impression. The videos would appeal to learners who are patient and willing to spare the required time. They are of high quality and very well put together, with good on-screen explanations and audio. Learners would need a good internet connection and possibly headphones (depending on where they are), which means the videos are probably not accessible to everyone – although I have not tried them on a phone.

The ELearningExamples games were very interactive and, again, of very high quality. They were designed to be informative, especially the Candidate Match Game, which gave learners the opportunity to find out where both candidates in the last US presidential election stood on various policies. This would ultimately help people choose which candidate to vote for (I am guessing). Users could either get a snapshot of where each candidate stood on a particular issue, or read more about what the candidate said and why/where/when. This kind of learning would suit someone who has some background in the subject area and is looking to learn more and thus make informed choices.

iEthiCS: This resource was very interesting, especially since it allowed laymen (like me) to get a glimpse into the medical world. From a technical standpoint, however, I thought the resource could have been better: though videos are used, the material didn’t seem to flow in a logical manner. Still, the way the outcomes of various actions were explained through text and scenarios would certainly benefit medical professionals, in my opinion.

2) What learners, if any, would they be inappropriate for and why?

These materials may not work for learners who don’t have access to good computing power, because almost all the resources use video, animation and audio. Learners also need to pay attention, engage and spend time on the resources in order to get the most out of them, so they have to be really interested in the subject matter. Otherwise, they may not gain anything, as the subject matter is almost always very specific.

3) How do each of these resources differ from that of the resources we’re using in ocTEL? Do they promote social learning, re-use of their materials, or open access?

These resources are similar to the ones being used in ocTEL. Some of them do encourage social learning (like the YouTube videos), and there is no reason the materials cannot be re-used.

4) What ways can you see to improve the effectiveness or potential reach of these resources? Effectiveness can be considered as allowing students to work at their own pace and review areas they need to, providing a richer learning experience by expanding the range of expertise which students will confront, or providing a range of materials in different media formats to suit students’ different learning preferences.

Making the materials compatible with mobile devices would make the resources accessible to more people. Organising the resources better (for example, sub-categorising the YouTube videos) and improving the flow of resources like iEthiCS would help enhance the effectiveness of the materials, in my opinion.