Sunday, March 24, 2013

Triple A priorities for Open Education - Activity 4 #h817open

My initial thoughts for activity and research in the area of Open Education

Activity 4, Week 1:  Imagine you are advising a funding organisation that wishes to promote activity and research in the area of open education.  Set out the three main priorities they should address, explaining each one and providing a justification for your list.

Thinking in broad terms, I would advocate research in the areas of Accessibility, Assessment and Accountability.
  • Priority One- Accessibility
    • Accessibility research would involve assessing who has access to Open Education, who doesn't, and for what reasons.  The term "accessibility" often focuses on making web information available to people who have physical disabilities (for example, hearing or visual impairments).  As described by the Web Accessibility Initiative: "Inclusive design, design for all, digital inclusion, universal usability, and similar efforts address a broad range of issues in making technology available to and usable by all people whatever their abilities, age, economic situation, education, geographic location, language".  Barriers to accessibility include lack of access to the necessary technology and lack of access to digital information because of low income or geographic or political barriers. 
    • Web technology has the potential to overcome many of these barriers and to make education more open, but it also faces barriers of its own.  For example, in distance learning the Internet makes information available in one's home, but only if you live in a place where you can obtain Internet access.  
    • Is open education more inclusive of learners who may not be accessing information through a traditional academic pathway?  Does open education allow for collaboration between people that is different from what occurs in more traditional education? 
  • Priority Two- Assessment
    • Assessment of learners- What are different ways to assess learners?  What is the role of peer assessment and how effective is it? What are authentic learning assessments in an open education environment?
    • Assessment of open education courses/materials- How should open education courses be assessed for quality?  How do you measure the effectiveness of open education courses beyond looking at completion rates and satisfaction rates?
  • Priority Three- Accountability
    • Accountability includes looking at who is responsible for the quality of open education courses.  Is there transparency regarding the process in open education and intellectual property rights/copyrights?  Is there transparency regarding how open education is funded and potential conflicts of interest?

Saturday, March 23, 2013

Joining Open Education and developing a learning plan #h817open

Somewhat to my surprise, I have decided to take another MOOC (Massive Open On-line Course).  This one is from the Open University in the United Kingdom and is on Open Education.  I recently completed E-learning and Digital Culture (EDC MOOC), and I learned about this course from the EDC Google+ group and on Twitter.

I am a pediatrician, and I am involved in medical education during the clinical years (medical students and pediatric residents).  I learned a tremendous amount through the E-learning and Digital Culture MOOC, particularly about the process of learning from peers in social media groups and about the many tools available for on-line education, both for content delivery and for collaborative learning and creation.  It was a bit overwhelming and time-intensive, though I really enjoyed it and it "energized" me in my work.  I did wonder whether I would apply what I learned in my teaching. 

Open Education #h817open is an opportunity for me to think about e-learning in a more systematic way and to think about what aspects might be applicable to medical education.  I include in medical education: trainee education, family and patient education, and collaboration/knowledge creation across disciplines and groups.

My Personal Learning Plan for this course includes:

1) Develop a strategy to organize what I am learning.
    - I am still using paper files (or the digital equivalent of Dropbox for Word documents).  I do not have a system for organizing what I learn, what I want to read in the future, etc.  I plan to explore the use of Pinterest, Diigo, Evernote and Pearltrees, both for my own organization and as teaching tools.  First stop is

2) Utilize social media to learn from peers
   - I joined Google+ and Twitter six weeks ago as a part of the EDC MOOC.  I found it to be a wonderful community and way to learn.  Who knew I would like Twitter chats (thank goodness for TweetChat during those frenetic chats)?  I haven't yet figured out circles, following, etc., but I will participate in the #h817open Google+ group and have more opportunity to learn how I want to use social media for learning.

3) Use a variety of tools in creating learning objects
   - My technology skills have for the most part been limited to Word and PowerPoint.  I plan to explore more of the many different tools that I was introduced to during the EDC MOOC and to think about when to use various tools (if at all).  The starting list of tools from the EDC MOOC is below, and I plan to continue to look at the various digital artifacts created:
  1. Facebook Interaction Tracker-
  2. Timeline-
  3. Scoopit- (Laurie Niestrath)
  4. Tiki-Toki Timeline - (HB Hessler)
  5. Diigo - (Rick Bartlett) (Laurie Niestrath)
  6. Pinterest - (Ary Aranguiz)
  7. Glogster -
  8. Youtube-
  9. Ustream -
  10. Infogr.am - http://infogr.am
  11. Mixbook -
  12. Storify-
  13. New Hive -
  14. Slideshare-
  15. WebDoc-
  16. BlogTalkRadio-
  17. Knovio-
  18. Google Hangouts - record your hangouts
  19. Prezi - (Laurie Niestrath)
  20. Voicethread- (Ary Aranguiz)
  21. Photostory- (Laurie Niestrath)
  22. Thinglink - (Kay Oddone)
  23. Animoto -
  24. Piktochart -
  25. Wix - (Jono Purdy)
  26. Popplet - (Jono Purdy)
  27. Animaps - (Jono Purdy)
  28. Museum Box - (Jono Purdy)
  29. Sqworl - (Jono Purdy)
  30. Popcorn Maker - (Jono Purdy)
  31. Ipiccy - (Anne Robertson)
  32. Sketchguru - free Android app (Anne Robertson)
  33. Picmonkey - (Marina Shemesh)
  34. Wordle - create word clouds (Marina Shemesh)
  35. Adobe Captivate - authoring tool (Madhura Pradhan)
  36. Articulate Suite - authoring tool (Madhura Pradhan)
  37. Storybird - (Cristina Silva)
  38. ImageChef - (Cristina Silva)
  39. Dipity - (Cristina Silva)
  40. Livebinders - (Eileen Lawlor)
  41. Videoscribe - (Angela Towndrow)
  42. PearlTrees - (Cathleen Nardi)
  43. SlideRocket - (Annie Oosterwyk)
  44. Meograph - (Annie Oosterwyk)
  45. Wallwisher - (Ora Baumgarten)
  46. "Organize anything, together!" - (Gianni Buspo)
  47. Mahara - (Linda Pospisilova)
  48. Jing - screen capture and screencasting tool, great for creating tutorials! (Kay Oddone)

Sunday, March 3, 2013

Assessment, Miller's triangle and the importance of reflection #edcmooc

For EDC MOOC, the assessment was to create a digital artefact and to peer review at least three other digital artefacts.

I thought that the creation of a digital artefact was a meaningful and authentic way to assess this course.  For those not familiar with Miller's triangle of assessment (which is often referred to in medical education): creating an artefact ranks towards the top of the triangle, as it "shows how" in a simulated environment.  The next step is "does", which means that one has incorporated what one has learned in the course into one's work/life.  For those of us who are educators, time will tell if participation in this class changes what we do in e-learning and e-teaching; I suspect it will.
                         Miller's Triangle

Creating an artefact was a way for us to experiment with different digital modalities to express ideas from the course that were meaningful to us and to demonstrate an understanding of at least some of the course material.

In my job I am involved in work-based assessment- the evaluation of clinical skills.  We categorize assessment as low-stakes or high-stakes, sometimes called formative and summative feedback.  High-stakes assessment is an assessment that you must pass to move on in your profession; an example would be having to pass a clinical skills evaluation in order to be eligible for a medical license in the US.  With high-stakes feedback, great care must be taken to be sure that it measures what you intend to evaluate, that it is reliable (different evaluators would give the same score) and that it is free of bias. 

Low-stakes feedback is given primarily to improve the learner's performance (though, of course, learning should happen with any assessment).  The evaluation of the EDC MOOC digital artefact is a low-stakes assessment.  A certificate is given regardless of the grade received, so the main purpose of the assessment is for both the creator of the digital artefact and the evaluators to learn from the experience.  A specific rubric was given for evaluation, and there were also explicit instructions regarding the purpose of the feedback and how to learn from feedback.

One of the difficulties in assessing the artefacts and giving meaningful comments was that we viewed the artefacts in isolation and didn't know the author's learning goals.  I wonder whether one of the course requirements should have been a self-reflection/self-evaluation submitted along with the artefact itself.  It would have been easier to give specific feedback about the artefact- what worked well, how it could perhaps be improved- if one knew the author's intent.  I suspect that a number of people were "stepping out of their comfort and experience zones" in creating their digital artefacts; I certainly was. 

Formative feedback can be both positive and corrective.  It should be specific in nature, stimulate reflection (and perhaps an action plan for the future) and be given in a supportive environment.  My personal experience with the official feedback was that two evaluators gave thoughtful and specific feedback.  The other two evaluators wrote less than a sentence, with one of them giving me the "feedback kiss of death", which is "very good".  What's very good?  How could it be better?  On the other hand, I did get constructive feedback on my artefact from the Google+ group.  I do know that many people spent a lot of time giving thoughtful feedback and evaluating more than the three mandatory artefacts. 

One thing that we all should have learned from this course is that human expression has many forms and perspectives.  We all come from different cultures and experiences.  While I may not have fully understood your artefact, I should not assume you have nothing to say.  I may have been entertained by your artefact, but did I also learn from it?