MobSOS - Community Evaluation & Learning Analytics
Theoretical Basis
As part of their central goal of learning how to improve their practice, communities of practice [1] must be able to measure, reflect on, and stay aware of their own successes and failures. Wenger et al. provide evidence for this essential need from several professional community contexts, such as car manufacturers, insurance companies, and consulting agencies [1][2]. Such awareness is essential for a self-sustainable overall community life cycle [2][3]. Communities must constantly create new and measurably relevant values and impacts for their members to justify and secure their existence [2]. In their endeavor to sustain, renew, and optimize their practice, professional communities require support for reflection, custom-tailored to their specific goals, practices, needs, and notions of community success, which differ and change over time. These values also include community tools, which must undergo constant scrutiny and development to match the requirements of an ever-changing, emergent community practice.
Community Information Systems (CIS) are combinations of people, technology, and rules [4], inherently subject to the emergent mutual influence between the social and the technical, a manifestation of structuration theory [5][6]. As such, any research on CIS success awareness must follow an anti-positivist approach, i.e. acknowledge different individual notions of CIS success that change dynamically over time. We define CIS success awareness as an ongoing, comprehensive, ideally real-time situational awareness of CIS success, serving as informed self-reflection and guidance for community activity. Within communities of practice, we naturally find multiple independent, conflicting perceptions of CIS success among different stakeholders. Across communities and domains, differences in perception are even stronger. Especially within specialized niche communities with ongoing development activity, as we find them in Layers, these different perceptions must not be abstracted into a generic, imposed notion of CIS success, but rather negotiated into a community-specific, shared one.
Process applied in Learning Layers
The MobSOS (Mobile Community Information System Oracle for Success) CIS success awareness framework developed in Learning Layers supports the above theoretical basis with an evaluation methodology that seamlessly integrates CIS success modeling into Hevner et al.’s Design Science Research (DSR) framework [7]. MobSOS understands evaluation as a cross-cutting activity, separate from, yet in strong mutual influence with, design and development. In contrast to typical positivist, one-off research studies on enterprise information systems success, MobSOS postulates an inherently anti-positivist perspective with the following implications:
- CIS Success Modeling: MobSOS conceptualizes CIS success modeling as an ongoing collaborative activity to achieve collective CIS success awareness from a negotiated fusion of different individual stakeholder notions. CIS success is thereby modeled as a complex multi-dimensional construct that changes dynamically with the emergent life cycles of communities, individuals, and artifacts.
- CIS Evaluation Perspectives: MobSOS conceptually includes summative, formative, and developmental evaluation for different community life cycle phases. Summative and formative evaluation perspectives are appropriate for phases of relative stability within and around a community. For phases of emergent change, progress, or innovation, MobSOS emphasizes the need for participatory, agile, and data-driven developmental evaluation approaches [8][9], interwoven with agile development methodologies such as DevOpsUse.
- CIS Evaluation Culture: concrete CIS evaluation processes embedded into DSR must first be established within a community-wide culture that embraces CIS success awareness as a major benefit, providing agency for self-reflection and informed guidance of future developments.
- IS Success & DSR Research: DSR provides a suitable research framework, including process models for CIS design and development with an emphasis on evaluation [7]. As a template for CIS success modeling, MobSOS adapts the most influential measurement models from IS success research. In particular, the MobSOS CIS success meta-model adapts the structural template of Gable et al.’s IS-Impact model [10], an influential reconceptualization of DeLone & McLean’s seminal I/S Success model [11], and formalizes CIS success models as multi-dimensional formative indices [12][10], following a hierarchical meta-model of dimensions, factors, and measures. Fig. 1 shows a schematic overview of the resulting MobSOS-specific Community-Oriented DSR [13] framework.
- Competing CIS Success Notions: given an inherently anti-positivist epistemology in communities of practice, MobSOS supports the existence of multiple success models for different community contexts, different CIS artifacts, different points in time, and different community-dependent constraints. CIS success models typically consist of a balanced mix of generic, universally applicable constructs on the one side and very community-specific, sometimes idiosyncratic factors and metrics on the other. For universal or at least reusable CIS success factors and measures, we introduced the concept of CIS success measure catalogues to improve measure reusability.
- CIS Success Modeling as Media Transcription: MobSOS furthermore conceptualizes CIS success modeling as a digital adaptation of transcriptivity theory [14][4]. CIS success models serve as transcripts, able to generate CIS evaluation products such as written reports or interactive dashboards as scripts from pre-scripts such as usage or survey response data. MobSOS thus conceptualizes CIS success models as fluid digital media, supporting CIS success negotiation processes with operations such as monitoring, assessing, exploring, modeling, measuring, visualizing, validating, and sharing. Fig. 2 depicts the process of creating and successively shaping CIS success models as a result of community negotiation through transcription.
- CIS Success Data Sampling: MobSOS acknowledges observation and surveying as two inexpensive, established, and complementary pillars for basic data collection on CIS success, following the principle “Observe where possible, survey only where inevitable”. Observation is preferable due to its automated collection of quantitative, objective evidence of CIS use (usage data). Online surveys complement observation-based metrics with additional assessments of subjective, qualitative success aspects. With this combination, communities can collect sufficiently expressive data to capture both qualitative and quantitative as well as subjective and objective aspects of CIS success.
- CIS Success Model Formalization & Validation: before communities can reliably measure CIS success, the respective models must first be validated. MobSOS therefore postulates a combination of expert panel techniques and statistical regression techniques to assess content and predictive validity. Expert panel techniques are effective means of validation in all stages of the CIS success model building process, as every community includes a core of domain experts. They become even more effective when backed by quantifiable evidence in the form of CIS success models and regression-based validation. Pivotal statistics generated by correlation and regression analysis, both for a whole model (e.g. the adjusted coefficient of determination for model fit) and for its particular constructs (e.g. regression coefficients for effect strength, statistical significance for relevance), serve as a basis for reflecting on past actions and deciding on future ones. Intuitively, improvements on statistically significant factors with high absolute regression coefficients are expected to have stronger effects than improvements on insignificant factors with coefficients close to zero. On a higher level, regression statistics enable heuristic strategies for model refinement and operationalization. Intuitively, a significantly relevant but low-scoring success metric for a given factor indicates an urgent need to improve the CIS artifact with respect to that factor.
- CIS Success Awareness Infrastructure & Tool Kit: the MobSOS framework additionally includes a CIS technical infrastructure component, including a tool kit that provides a collection of fundamental services necessary to achieve a comprehensive sense of CIS success awareness. This component is further described in the Infrastructure section of this page.
- Community Learning Analytics: with particular respect to applying MobSOS in systems dedicated to learning, the MobSOS methodology and its support tools serve well as a community Learning Analytics (LA) framework [15][16][17].
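As an illustration of the hierarchical meta-model of dimensions, factors, and measures described above, the following minimal sketch represents CIS success as a formative index. All names, values, and the simple averaging aggregation are hypothetical assumptions for illustration, not part of the MobSOS API:

```python
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str
    source: str          # e.g. "observation" (usage data) or "survey"
    value: float = 0.0   # assumed normalized to [0, 1]

@dataclass
class Factor:
    name: str
    measures: list = field(default_factory=list)

    def score(self) -> float:
        # Formative index: the factor is composed of its measures;
        # plain averaging stands in for a community-negotiated weighting.
        return sum(m.value for m in self.measures) / len(self.measures)

@dataclass
class Dimension:
    name: str
    factors: list = field(default_factory=list)

    def score(self) -> float:
        return sum(f.score() for f in self.factors) / len(self.factors)

# A minimal, hypothetical community-specific success model fragment
model = Dimension("System Quality", [
    Factor("Responsiveness", [
        Measure("median API latency (inverted, normalized)", "observation", 0.8),
        Measure("perceived speed (survey item)", "survey", 0.6),
    ]),
    Factor("Availability", [
        Measure("uptime ratio", "observation", 0.99),
    ]),
])

print(round(model.score(), 3))  # → 0.845
```

In a real MobSOS success model, the weighting of measures and factors would itself be subject to community negotiation rather than fixed averaging.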
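The regression-based model validation described above (adjusted coefficient of determination for model fit, regression coefficients for effect strength) can be sketched as follows. The synthetic data, the factor count, and the plain ordinary-least-squares implementation are illustrative assumptions, not the MobSOS tool kit:

```python
import numpy as np

def validate_model(X, y):
    """Fit an OLS regression of an overall success rating y on factor
    scores X; return coefficients, R^2, and adjusted R^2."""
    n, k = X.shape
    Xd = np.column_stack([np.ones(n), X])          # add intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)  # least-squares fit
    residuals = y - Xd @ beta
    ss_res = float(residuals @ residuals)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    r2 = 1 - ss_res / ss_tot
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
    return beta, r2, adj_r2

# Synthetic example: three factor scores predicting an overall rating,
# where only the first two factors actually drive perceived success.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(50, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, size=50)

beta, r2, adj_r2 = validate_model(X, y)
# beta[1] (near 2.0) dominates beta[3] (near 0): improving factor 1
# should have a far stronger effect on perceived success than factor 3.
```

A full validation would additionally compute p-values per coefficient to judge relevance, as the methodology above requires.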
Examples of Application in Learning Layers
Initially in the absence of a technical infrastructure, and later in parallel to ongoing development in Learning Layers [18], we evaluated MobSOS in various service-oriented community information systems in the areas of cultural heritage documentation [19], healthcare [13], and technology-enhanced learning [16][17]. We applied the same methodology to Layers and integrated technical evaluation support into Layers Box [18][20][21], one of the key technical achievements of Learning Layers. For more than two years, Layers Sandbox, a software-as-a-service (SaaS) deployment of Layers Box hosted at RWTH (https://api.learning-layers.eu), has served as the environment for evaluation studies with MobSOS. In particular, we employed MobSOS to analyze end-user activity around the Layers Box’s integrated OpenID Connect provider, based on authentication and authorization log data. Layers Sandbox will continue to operate beyond the project lifetime of Learning Layers in order to enable long-term community data analysis.
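As a rough illustration of such observation-based analysis, the following sketch counts authentication events per user. The log line format and event names are hypothetical assumptions and do not reflect the actual log schema of the OpenID Connect provider:

```python
from collections import Counter
from datetime import datetime

# Hypothetical, simplified log lines: timestamp, user id, event type.
log_lines = [
    "2016-03-01T09:12:03 alice login_success",
    "2016-03-01T09:15:44 bob login_success",
    "2016-03-02T10:01:20 alice token_refresh",
    "2016-03-02T11:42:05 carol login_failure",
]

def activity_by_user(lines):
    """Count successful authentication events per user as a simple
    observation-based usage metric."""
    counts = Counter()
    for line in lines:
        ts, user, event = line.split()
        datetime.fromisoformat(ts)  # validate the timestamp format
        if event in ("login_success", "token_refresh"):
            counts[user] += 1
    return counts

print(activity_by_user(log_lines))  # alice appears most active
```

Such per-user activity counts can then feed observation-based measures in a CIS success model, complemented by survey data for subjective aspects.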
Glossary
- CIS - Community Information System
- DSR - Design Science Research
- MobSOS - Mobile Community Information System Oracle for Success
Contributing Authors
References
- [1] E. C. Wenger, Communities of Practice: Learning, Meaning, and Identity. Cambridge, UK: Cambridge University Press, 1998.
- [2] E. C. Wenger, R. McDermott, and W. M. Snyder, Cultivating Communities of Practice: A Guide to Managing Knowledge. Boston, MA, USA: Harvard Business School Press, 2002.
- [3] A. Iriberri and G. Leroy, “A Life-Cycle Perspective on Online Community Success,” ACM Computing Surveys, vol. 41, no. 2, pp. 1–29, 2009. DOI: 10.1145/1459352.1459356
- [4] R. Klamma, “Social Software and Community Information Systems,” Habilitation, RWTH Aachen University, Aachen, Germany, 2010.
- [5] W. J. Orlikowski, “The Duality of Technology: Rethinking the Concept of Technology in Organizations,” Organization Science, vol. 3, no. 3, pp. 398–427, 1992.
- [6] W. J. Orlikowski, “The sociomateriality of organisational life: considering technology in management research,” Cambridge Journal of Economics, vol. 34, pp. 125–141, 2010.
- [7] A. R. Hevner, S. T. March, J. Park, and S. Ram, “Design Science in Information Systems Research,” MIS Quarterly, vol. 28, no. 1, pp. 75–105, 2004.
- [8] M. Q. Patton, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. Guilford Press, 2011.
- [9] S. Gopalakrishnan, H. Preskill, and S. Lu, “Next-Generation Evaluation: Embracing Complexity, Connectivity, and Change,” in Next Generation Conference, 2013, pp. 1–20.
- [10] G. G. Gable, D. Sedera, and T. Chan, “Re-conceptualizing Information System Success: the IS-Impact Measurement Model,” Journal of the Association of Information Systems, vol. 9, no. 7, pp. 377–408, 2008.
- [11] W. H. DeLone and E. R. McLean, “Information Systems Success: The Quest for the Dependent Variable,” Information Systems Research, vol. 3, no. 1, pp. 60–95, 1992.
- [12] C. B. Jarvis, S. B. Mackenzie, and P. M. Podsakoff, “A Critical Review of Construct Indicators and Measurement Model Misspecification in Marketing and Consumer Research,” Journal of Consumer Research, vol. 30, pp. 199–218, 2003.
- [13] D. Renzel, R. Klamma, and M. Jarke, “IS Success Awareness in Community-Oriented Design Science Research,” in New Horizons in Design Science: Broadening the Research Agenda, Switzerland, 2015, vol. 9073, pp. 413–420. DOI: 10.1007/978-3-319-18714-3_33
- [14] L. Jäger, M. Jarke, R. Klamma, and M. Spaniol, “Transkriptivität: Operative Medientheorien als Grundlage von Informationssystemen für die Kulturwissenschaften,” Informatik-Spektrum, vol. 31, no. 1, pp. 21–29, 2008. DOI: 10.1007/s00287-007-0218-9
- [15] R. Klamma, “Community Learning Analytics: Challenges and Opportunities,” in Advances in Web-Based Learning - ICWL 2013, vol. 8167, J.-F. Wang and R. Lau, Eds. Berlin-Heidelberg, Germany: Springer, 2013, pp. 284–293. DOI: 10.1007/978-3-642-41175-5_29
- [16] D. Renzel and R. Klamma, “From Micro to Macro: Analyzing Activity in the ROLE Sandbox,” in Proceedings of the Third International Conference on Learning Analytics and Knowledge, 2013, pp. 250–254. DOI: 10.1145/2460296.2460347
- [17] D. Renzel, R. Klamma, M. Kravcik, and A. Nussbaumer, “Tracing Self-Regulated Learning in Responsive Open Learning Environments,” in Advances in Web-Based Learning - ICWL 2015, Berlin-Heidelberg, 2015, vol. 9412, pp. 155–164. DOI: 10.1007/978-3-319-25515-6_14
- [18] M. Derntl, R. Klamma, I. Koren, P. Nicolaescu, D. Renzel, K. Ngua, J. Purma, D. Zaki, T. Treasure-Jones, G. Attwell, O. Gray, T. Ley, V. Tomberg, C. Henry, C. Whitehead, D. Theiler, C. Trattner, R. Maier, M. Manhart, M. Schett, and S. Thalmann, “Initial Architecture for Fast Small-Scale Deployment,” Learning Layers Project, Deliverable D6.1, 2013.
- [19] D. Renzel and R. Klamma, “Semantic Monitoring and Analyzing Context-aware Collaborative Multimedia Services,” in Proceedings of the 2009 IEEE International Conference on Semantic Computing (ICSC 2009), Sep 14-16, Berkeley, CA, USA, Los Alamitos, CA, USA, 2009, pp. 630–636. DOI: 10.1109/ICSC.2009.112
- [20] M. Derntl, M. Kravcik, R. Klamma, I. Koren, P. Nicolaescu, D. Renzel, A. Hannemann, M. Shahriari, J. Purma, M. Bachl, E. Bellamy, R. Elferink, V. Tomberg, D. Theiler, and P. Santos, “Customizable Architecture for Flexible Small-Scale Deployment,” Learning Layers Project, Deliverable D6.2, 2014.
- [21] R. Klamma, I. Koren, P. Nicolaescu, D. Renzel, M. Kravčík, M. Shahriari, M. Derntl, G. Peffer, and R. Elferink, “DevOpsUse - Scaling Continuous Innovation,” Learning Layers Project, Deliverable D6.3/Report 4, 2015.