
OER Collective Intelligence

Collective Intelligence (CI) refers to the augmented capacity of a community to think more effectively than its members could if they were unaware of each other's ideas.

In OLnet, we're asking: What does CI for the OER movement look like?

The primary tools that we are working with are:

  • Compendium
  ...we dub this "Excel for Knowledge": a generic tool for managing ideas and their connections, including real-time capture of discussions [home] [blog]
  • Cloudworks
  ...informal information exchange, discussion and resource sharing -- often used to provide an online space to accompany face-to-face (f2f) meetings, including several OER conferences [home] [blog]
  • Cohere
  ...web-based annotation and discussion (Compendium-lite on the web). This will be refined and extended as we research and prototype infrastructure to aggregate the community's collective experiences of designing and using OERs, plus the current debates confronting the OER movement more broadly [home] [blog]

Right now...

As we set out in the original project proposal (see the extract below), we are interested in gathering many forms of evidence about an OER, since there are many stakeholders with different kinds of experience and varying amounts of time to devote to sharing it.

As the figure below shows, we intend to use the OLnet project team as a testbed community, placing our own collective intelligence into the prototype infrastructure. This will then be extended to the wider OER community as we build confidence in the tools.

Building layers of OER impact evidence (extract from original project proposal)

OER practitioners and researchers come from many intellectual traditions. What “counts” as legitimate evidence for making claims varies accordingly. Hard pragmatics often constrain the kinds of data that can be collected, from anecdotal stories from the field that are compelling but isolated, to more structured case studies, through to narrower qualitative and quantitative laboratory evaluations that strip out the “noise” of real life in order to yield insights into specific group or cognitive processes. A collective memory that is owned by such a diverse community will not only value these different kinds of narrative, but also highlight the weight of evidence around a given issue by providing a way to build and navigate the layers of narrative through an engaging website.

Thus, we can imagine a user interface onto the evidence base that makes clear which of the following “evidence layers” underpin a particular OER:

Technical Reports on the Design Principles: Such principles may be of value to those making an OER selection decision (e.g. the following pedagogical philosophy and disciplinary principles informed the OER design; here is the rationale behind the use of the particular multimedia presentation mode.)

Contexts of Use: A description of the curricular locations where a particular OER might fit and the characteristics of the student population that would typically use the OER (e.g. this introductory course in symbolic logic is a requirement for computer science majors. Students who take the course are usually sophomores and over half of them are philosophy majors.)

Anecdote: Stories perhaps using text/images/video from the field that can help build understanding, even though they may lack hard evidence or conclusions (e.g. we’ve just completed the first trial of this OER and it has not met our hopes — but we have some clues as to why, which we’re chasing up.)

Comparative Review: Analytical comparisons of OER materials aimed at identifying strengths and weaknesses in terms of learning resources, technical requirements, and content coverage and treatment (e.g. we have classified these OER in terms of their technical requirements and how these map onto assistive and mobile technologies.)

Portraits: Illustrations of OER in use similar to what Lawrence-Lightfoot calls portraiture, that is, qualitative accounts of “the complexity, dynamics, and subtlety of human experience and organizational life” (e.g. we followed, videotaped, and questioned a user over a specific chunk of time and across multiple settings and present here some unintended side effects of simple design, sequencing, and formatting decisions.)

Case study – anecdotal with informal evidence: Partial descriptions and data that would benefit from further analysis and discussion (e.g. we have the following screencasts and interview MP3s that we’re happy to share because we need help to analyze them... OR we used the OER and have completed a snapshot of the experience using the template provided in the Carnegie Foundation KEEP Toolkit.)

Case study – structured research methodology and data analysis: Reports about a particular situation supported by analysis that draws conclusions (e.g. this article/website tracks a cohort of trainee teachers over three months as they seek to apply OER; video analysis using Grounded Theory leads us to propose three key factors that influence their success.)

Controlled experiment: Comparative studies supported by qualitative and/or quantitative data (e.g. 48 undergraduate chemistry students, grouped by ability and cognitive style, used the ChemTutor OER to complete Module X; statistical analysis combined with think-aloud protocols supports the hypothesis, based on Learning Theory Y, that higher-ability students would benefit most.)

Learning Analysis Studies: These provide a detailed picture of the experience that students are likely to go through, and constitute a resource for iterative design improvement (e.g. we examined the data log files and can articulate how students benefit from the different components and instructional devices that make up this OER, such as explanatory text, built-in videos, animated illustrations, self-assessment, learning-by-doing applets, and virtual labs.)

The mere presence of evidence layers can provide an approximate cue to the level of validation a resource has received, but it is not, of course, a guarantee of suitability for a given context (content may be culture-specific, conclusions may be controversial, methodology may be flawed, etc.).
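To make the layers concrete, here is a minimal sketch of how a prototype evidence base might record items against them and surface the approximate validation cue just described. Everything here is illustrative on our part: the class names, the shorthand layer labels, and the simple "fraction of layers represented" measure are assumptions for the sake of the sketch, not a feature of Compendium, Cloudworks, or Cohere.

```python
from dataclasses import dataclass, field
from enum import Enum


class EvidenceLayer(Enum):
    """The nine evidence layers described above (labels are our shorthand)."""
    DESIGN_PRINCIPLES = "Technical Reports on the Design Principles"
    CONTEXT_OF_USE = "Contexts of Use"
    ANECDOTE = "Anecdote"
    COMPARATIVE_REVIEW = "Comparative Review"
    PORTRAIT = "Portraits"
    CASE_STUDY_INFORMAL = "Case study - anecdotal with informal evidence"
    CASE_STUDY_STRUCTURED = "Case study - structured research methodology"
    CONTROLLED_EXPERIMENT = "Controlled experiment"
    LEARNING_ANALYSIS = "Learning Analysis Studies"


@dataclass
class EvidenceItem:
    layer: EvidenceLayer
    summary: str
    source_url: str = ""  # link back to the report, story, dataset, etc.


@dataclass
class OER:
    title: str
    evidence: list[EvidenceItem] = field(default_factory=list)

    def layers_present(self) -> set[EvidenceLayer]:
        """Which layers have at least one evidence item attached."""
        return {item.layer for item in self.evidence}

    def validation_cue(self) -> float:
        """A deliberately crude cue: the fraction of layers represented.
        As noted above, presence of layers is no guarantee of suitability."""
        return len(self.layers_present()) / len(EvidenceLayer)


# Example: an OER with two kinds of evidence attached.
logic_oer = OER("Introductory Symbolic Logic")
logic_oer.evidence.append(EvidenceItem(
    EvidenceLayer.CONTEXT_OF_USE,
    "Required course for CS majors; students are mostly sophomores."))
logic_oer.evidence.append(EvidenceItem(
    EvidenceLayer.ANECDOTE,
    "First trial fell short of our hopes; clues being followed up."))
print(f"{logic_oer.validation_cue():.0%} of evidence layers represented")
```

A user interface built on such a model could then display, for each OER, which layers are populated and link through to the underlying items, leaving the judgement about suitability to the reader.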
