Full paper now published in ABS http://abs.sagepub.com/content/early/2013/03/03/0002764213479366.abstract?rss=1
Characterising the flow or transfer of knowledge in knowledge-creating open online communities. Knowledge building as the creation of epistemic artefacts shared in a community. Lots of actors, lots of artefacts with induced relationships. Given a graph of dependencies between knowledge items, how do we identify the main pathways of the evolution of knowledge? Use references (citation links) between publications. The Main Path Analysis approach helps understand how often different paths or routes are used.
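To make the idea concrete, here is a minimal sketch of one common flavour of Main Path Analysis: weight each citation edge by its Search Path Count (the number of source-to-sink paths passing through it), then greedily follow the heaviest edges. The edge-list input shape and the greedy traversal are my assumptions, not necessarily what the presenters used.

```python
from collections import defaultdict

def main_path(edges):
    """Sketch of Main Path Analysis: weight each citation edge (u cites
    nothing earlier than v) by its Search Path Count, then greedily
    follow the heaviest edges from a source to a sink."""
    succ, pred = defaultdict(list), defaultdict(list)
    nodes = set()
    for u, v in edges:
        succ[u].append(v)
        pred[v].append(u)
        nodes.update((u, v))

    def paths(n, nbrs, memo):
        # number of paths from n to any terminal node, following `nbrs`
        if n not in memo:
            memo[n] = sum(paths(m, nbrs, memo) for m in nbrs[n]) or 1
        return memo[n]

    nm, np = {}, {}  # paths back to sources / forward to sinks
    spc = {(u, v): paths(u, pred, nm) * paths(v, succ, np) for u, v in edges}

    # start at the source whose best outgoing edge has the highest SPC
    sources = [n for n in nodes if not pred[n]]
    node = max(sources, key=lambda s: max(spc[(s, v)] for v in succ[s]))
    path = [node]
    while succ[node]:
        node = max(succ[node], key=lambda v: spc[(node, v)])
        path.append(node)
    return path, spc
```

On a toy dependency graph this picks out the single route carrying the most source-to-sink traffic, which is the intuition behind "main pathways of the evolution of knowledge".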
Very technical discussion, mostly going over my head after about 10 mins. Note to self, concentrate harder, it is a best paper nomination after all.
Blog posts as part of assessment. Students were encouraged to promote each other's work and use a variety of 'like' badges to flag good work. Aim: to find out if high-quality work was likely to be promoted by other students. Blogs were shared before assignment deadlines and could be used by other students to improve their own submitted work. Students were required to post about a series of set topics, to make a reflective post using a blog template, and to comment on at least 2 other students' blogs.
3 questions – did students actually promote other students' work? Yes. Did they act on the promotion of another student's work? Was there any link between quality and the number of promotions a post received? Some promotion was self-perpetuating: a popular post received more views and so generated further promotions.
Pilot study revealed that students actively promote posts as they read them, high-quality material was largely promoted, and students were seen to be generally reliable (some students are better promoters than others, some students do promote poor material, and students became better as time went on). A suggestion that student promotions could be used as a highlighting mechanism in the blogosphere and as a means of making preliminary assessments for graders. We can filter content based on student promotions and be reasonably confident that there's a link to high quality, but be aware of the 'that's my friend, so I'll promote their work' factor.
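The filtering idea might look something like this: rank posts by the number of *distinct* promoters, so one enthusiastic friend repeatedly clicking 'like' counts only once. The data shape and function name are hypothetical; the study itself didn't specify a mechanism.

```python
from collections import defaultdict

def shortlist(promotions, top_n=3):
    """Sketch of using promotions as a pre-assessment filter: rank posts
    by distinct promoters, damping the 'friend promotes repeatedly' effect.
    `promotions` is an assumed shape: a list of (promoter, post_id) pairs."""
    backers = defaultdict(set)
    for who, post in promotions:
        backers[post].add(who)  # a set, so repeat promotions don't stack
    return sorted(backers, key=lambda p: len(backers[p]), reverse=True)[:top_n]
```

A grader could then look at the shortlist first, treating promotion counts as a preliminary quality signal rather than a verdict.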
Good practical discussion which focused on analysis of online discussions, based on click-stream data and on live events. Allows goals to be set and students to react accordingly. Challenges:
- capturing meaningful traces
- presenting data in a useful form
- supporting interpretation in decision making
Two basic underlying processes, speaking and listening online. Students can choose how to interact with online posts. Students can though get overwhelmed by volume, so helping students to understand how to react to others and create useful interaction is important.
In speaking, need to ensure that what is written is rational, spread over time and moderately portioned, ie not one long rant.
In listening, need to attend to the ideas of others, and to ideas that are broad, while looking at ideas together in context.
Evidence that some students try to read all messages in a social sense, or feel pressure to read them all, but get little from the content, while others are more targeted.
Use a series of metrics to see what is going on: range, number of sessions, % of posts read, number of posts made etc. Found there's a lot of reading activity that is not reading at all, but simply skimming or viewing. Produced a table for students so that they could see their own metrics compared to a peer average. Ideally would like some live means of feedback whereby students can see their viewed and unviewed posts, so set up a visualisation for students which shows which discussions the student has already engaged with, which discussions have more activity associated with them etc. Students are asked to take collective responsibility for posts and so are more likely to respond to posts which are indicated as having no response to date.
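A rough sketch of how such metrics could be derived from click-stream logs. The event shape, the 30-minute session gap, and the action labels are all assumptions for illustration, not details from the talk.

```python
from collections import defaultdict

SESSION_GAP = 30 * 60  # assumed: a gap over 30 minutes starts a new session

def engagement_metrics(events, total_posts):
    """Sketch of per-student participation metrics of the kind described:
    session count, % of posts read, posts made. `events` is an assumed
    shape: (student, timestamp_secs, action, post_id) tuples."""
    by_student = defaultdict(list)
    for student, t, action, post in events:
        by_student[student].append((t, action, post))

    metrics = {}
    for student, evs in by_student.items():
        evs.sort()  # chronological order by timestamp
        sessions = 1 + sum(
            1 for (t1, _, _), (t2, _, _) in zip(evs, evs[1:])
            if t2 - t1 > SESSION_GAP)
        read = {p for _, a, p in evs if a == "read"}  # distinct posts opened
        made = sum(1 for _, a, _ in evs if a == "post")
        metrics[student] = {
            "sessions": sessions,
            "pct_read": round(100 * len(read) / total_posts, 1),
            "posts_made": made,
        }
    return metrics
```

Note that "read" here is really "viewed" – as the talk pointed out, distinguishing genuine reading from skimming needs more than a click event (eg dwell time).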
Re supporting interpretation, how do we make tools integrated into the learning experience and part of the pedagogy, actionable by students and tutors? Offered 6 principles for supporting interpretation:
- integration into the learning activity, what does the instructor expect?
- diversity of metrics, students found different metrics useful, trust in the numbers is very important
- agency in interpreting meaning, metrics not seen as absolute arbiters of activity engagement, students found goal setting useful and used multiple strategies in response
- reflection, metrics could be a distraction from the activity itself, so given separate time to reflect on own metrics
- dialogue, between students and instructor, grounded in the analytics, offered opportunities for regular discussion which were seen as supportive
- parity between instructor and students, analytics with, not on, students. Instructor seen as having a positive overseeing role.
Interesting presentation which most practitioners would be able to relate to.
Collaborative writing, an essential skill in academia, is a highly complex process. It combines cognitive and communication requirements. Linear writing is not considered an optimal process.
Described a project which involved a large group of students working together on a jointly authored document with a bounded word count and a time constraint. The groups of students were re-formed every 2 weeks. Students were required to formulate their individual understanding and then collaboratively work together to relate their readings to given themes. Changes to the joint document could be seen via a document revision history with version numbers and author id. As tutors, it was difficult to keep track of the various revisions and individual student input, so they needed a tool to support this. So, they developed a revision map, topic evolution charts and topic-based collaboration networks – all as visualisation tools. Colour coding gives a quick snapshot view of the work undertaken on a document, paragraph by paragraph.
Able to see where most (and least) revision took place, when work was carried out, whether students worked sequentially or in parallel, and how many students worked on each paragraph. Also able to see how topics develop over time and are referenced in later work. Some discussion of the potential discomfort with editing or even deleting text inserted by another student, although it was not clear how this was managed, ie was there a discussion forum or some other way of communicating about text changes?
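The underlying data wrangling for a revision map is simple enough to sketch: fold the revision history into per-paragraph counts of edits and distinct authors, which is exactly what the colour coding then displays. The revision-tuple shape here is a hypothetical stand-in for whatever the document platform exposes.

```python
from collections import Counter, defaultdict

def revision_map(revisions):
    """Sketch of the per-paragraph 'revision map' idea: for each paragraph,
    count how often it was edited and by how many distinct authors.
    `revisions` is an assumed shape: (version, author, paragraphs_changed)."""
    edits = Counter()
    authors = defaultdict(set)
    for version, author, paras in revisions:
        for p in paras:
            edits[p] += 1
            authors[p].add(author)
    return {p: {"edits": edits[p], "authors": len(authors[p])}
            for p in edits}
```

Mapping each paragraph's counts onto a colour scale would give the kind of snapshot view described: heavily reworked, multi-author paragraphs stand out at a glance from untouched single-author ones.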
Interesting to see some simple visualisation tools reflecting back the core activities in a resource-heavy piece of collaborative text-based work. Collaboration does not in itself equal quality of output, although it is clearly recognised as a key skill, and one which needs to be developed. Perhaps an allocation of some marks to collaborative input might encourage this?
Abelardo’s journey to middle space – starts with dialectics, considering things that are opposing in order to better understand. Interesting metaphor about 2 islands separated by sea, one with inhabitants who are interested only in tools and the other inhabited by educators. Need to build a bridge between them to bring both together and gain further insight. So, Abelardo sees learning analytics as an overarching bridge which provides agile and continuous feedback between the 2 ‘islands’. He thinks of himself as a natural inhabitant of the tool island, but became aware that he needed to become more familiar with educational theory and practice in order to teach more effectively. Has tried to move away from ‘more knobs = better tool……’ Hmmmm, OK, point understood.
But he became aware that in order to understand a subject more effectively, he needed to expand his own understanding and look at issues around the core topic. Kugel argues that the first experience of teaching is about self, how will I be perceived, etc.; teachers then build in confidence, become more able to focus on the topic as important, and the material gets more sophisticated. But the final step is understanding that this should be about how students learn and how they perceive the topic. Students are at the centre, as active and independent learners.
Taking this into his own practice, he needed to design a new course from scratch. Normally we would jump straight into the design stage and write a course agenda with specific design activities in mind. On advice, he started by focusing on university objectives, ie what kind of skills should graduates have at the end? Then he translated these into specific objectives for his particular module, with design activities as the final step rather than the first, so that finally a course agenda just falls out. Students can then use the activities to achieve the desired objectives. Important to get students involved in reflecting on what they are learning and how.
First time that I, and I suspect many others, have come across the term ‘dogfooding’, using what you have developed yourself so that you are most aware of issues with it. A quick gallop through his recent research which was too interesting to type to, so…. Very skimpy notes indeed…
Development of technology is key but the need to facilitate the understanding and use of that technology is even more important. How courses are designed is now affected by the fact that we can observe activity use in real time.
Concluded with how important it is to remain in the middle space. In life, there is a choice between 2 paths: simply survive, or face reality and let practitioners drive the design and adoption of learning analytics tools. Yay! I can only agree with that one…