MobSOS - Community Evaluation & Learning Analytics

Theoretical Basis

As part of their central goal of learning how to do things better, communities of practice [1] must be able to measure, reflect on, and stay aware of their own successes and failures. Wenger et al. provide evidence for this essential need from several professional community contexts, such as car manufacturers, insurance companies, and consulting agencies [1][2]. Such awareness is essential for a self-sustainable overall community life cycle [2][3]. Communities must constantly create new and measurably relevant values and impacts for their members to justify and secure their existence [2]. In their endeavor for success in terms of sustaining, renewing, and optimizing their practice, professional communities require support for reflection, custom-tailored to their specific goals, practices, needs, and differing notions of community success over time. These values also include community tools, which must undergo constant scrutiny and development to match the requirements of an ever-changing, emergent community practice.

Community Information Systems (CIS) are combinations of people, technology, and rules [4], inherently subject to the emergent mutual influence between the social and the technical as a manifestation of structuration theory [5][6]. As such, any research on CIS success awareness must inherently follow an anti-positivist approach, i.e. acknowledge different individual notions of CIS success that change dynamically over time. We define CIS success awareness as an ongoing, comprehensive, ideally real-time situational awareness of CIS success, serving as informed self-reflection and guidance for community activity. Within communities of practice, we naturally find multiple independent, conflicting perceptions of CIS success among different stakeholders. Across communities and domains, differences in perception are even stronger. Especially within specialized niche communities with ongoing development activity, as we find them in Layers, these different perceptions must not be abstracted into a generic, imposed notion of CIS success, but rather negotiated into a community-specific, shared one.

Process applied in Learning Layers

The MobSOS (Mobile Community Information System Oracle for Success) CIS success awareness framework developed in Learning Layers supports the above theoretical basis with an evaluation methodology that seamlessly integrates CIS success modeling into Hevner et al.'s Design Science Research (DSR) framework [7]. MobSOS understands evaluation as a cross-cutting activity, separate from design and development yet in strong mutual influence with them. In contrast to typical positivist, one-off research studies on enterprise information systems success, MobSOS postulates an inherently anti-positivist perspective, whose implications are detailed in the two subsections below.

Community-Oriented Design Science Research

Figure 1 - Community-Oriented Design Science Research Framework

CIS Success Modeling as Media Transcription

Figure 2 - CIS Success Modeling as Media Transcription
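To make the notion of community-specific CIS success modeling concrete, the following sketch outlines one way such a model could be represented in code. It assumes the dimension/factor/measure structure common in the IS success literature [10][11]; all concrete names (the community, dimension, and factor labels, and the SQL-like measure strings) are illustrative assumptions, not the actual MobSOS implementation.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: a community-specific success model structured as
# dimensions -> factors -> measures, following the IS success literature.
# Names and queries below are hypothetical examples, not real MobSOS artifacts.

@dataclass
class Measure:
    name: str
    query: str  # e.g. a query over monitored community usage data

@dataclass
class Factor:
    name: str
    measures: list = field(default_factory=list)

@dataclass
class SuccessModel:
    community: str
    # maps a success dimension name to the factors a community negotiated for it
    dimensions: dict = field(default_factory=dict)

    def add_factor(self, dimension: str, factor: Factor) -> None:
        self.dimensions.setdefault(dimension, []).append(factor)

# A hypothetical community negotiates which factors and measures matter to it:
model = SuccessModel(community="Example Healthcare Network")
model.add_factor("System Quality", Factor("Availability", [
    Measure("uptime", "SELECT AVG(up) FROM monitoring_log")]))
model.add_factor("Usage", Factor("Active Users", [
    Measure("monthly_logins", "SELECT COUNT(DISTINCT user) FROM auth_log")]))
```

Because the model is plain data, two communities can hold entirely different instances for the same tool, reflecting the negotiated rather than imposed notion of success argued for above.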

Examples of Application in Learning Layers

In the absence of a first technical infrastructure, and later in parallel with ongoing development in Learning Layers [18], we evaluated MobSOS in various service-oriented community information systems in the areas of cultural heritage documentation [19], healthcare [13], and technology-enhanced learning [16][17]. We applied the same methodology to Layers and integrated technical evaluation support into Layers Box [18][20][21], one of the key technical achievements of Learning Layers. For more than two years, Layers Sandbox, a software as a service (SaaS) deployment of Layers Box hosted at RWTH (https://api.learning-layers.eu), has served as the environment for evaluation studies with MobSOS. In particular, we employed MobSOS to analyze end-user activity around the Layers Box's integrated OpenID Connect provider, based on authentication and authorization log data. Layers Sandbox will continue to operate beyond the project lifetime of Learning Layers in order to enable long-term community data analysis.
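The following sketch illustrates one simple analysis of the kind mentioned above: deriving monthly active users from authentication log entries of an OpenID Connect provider. The log format, field names, and event label are assumptions for illustration, not the actual Layers Box log schema.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical authentication log entries (timestamp, user, event);
# the schema is an assumption, not the real Layers Box log format.
log_entries = [
    ("2015-03-02T09:15:00", "alice", "token_issued"),
    ("2015-03-05T11:30:00", "bob",   "token_issued"),
    ("2015-03-21T16:02:00", "alice", "token_issued"),
    ("2015-04-01T08:45:00", "alice", "token_issued"),
]

def monthly_active_users(entries):
    """Count distinct authenticated users per calendar month."""
    users_per_month = defaultdict(set)
    for timestamp, user, event in entries:
        if event != "token_issued":
            continue  # ignore other event types
        month = datetime.fromisoformat(timestamp).strftime("%Y-%m")
        users_per_month[month].add(user)
    return {month: len(users) for month, users in sorted(users_per_month.items())}

print(monthly_active_users(log_entries))  # -> {'2015-03': 2, '2015-04': 1}
```

Aggregations like this one can feed the usage-oriented measures of a community's success model, closing the loop between monitored log data and success awareness.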

Glossary

Contributing Authors

References

  1. E. C. Wenger, Communities of Practice: Learning, Meaning, and Identity. Cambridge, UK: Cambridge University Press, 1998.
  2. E. C. Wenger, R. McDermott, and W. M. Snyder, Cultivating Communities of Practice: A Guide to Managing Knowledge. Boston, MA, USA: Harvard Business School Press, 2002.
  3. A. Iriberri and G. Leroy, “A Life-Cycle Perspective on Online Community Success,” ACM Computing Surveys, vol. 41, no. 2, pp. 1–29, 2009. DOI: 10.1145/1459352.1459356
  4. R. Klamma, “Social Software and Community Information Systems,” Habilitation, RWTH Aachen University, Aachen, Germany, 2010.
  5. W. J. Orlikowski, “The Duality of Technology: Rethinking the Concept of Technology in Organizations,” Organization Science, vol. 3, no. 3, pp. 398–427, 1992.
  6. W. J. Orlikowski, “The sociomateriality of organisational life: considering technology in management research,” Cambridge Journal of Economics, vol. 34, no. 1, pp. 125–141, 2010.
  7. A. R. Hevner, S. T. March, J. Park, and S. Ram, “Design Science in Information Systems Research,” MIS Quarterly, vol. 28, no. 1, pp. 75–105, 2004.
  8. M. Q. Patton, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. Guilford Press, 2011.
  9. S. Gopalakrishnan, H. Preskill, and S. Lu, “Next-Generation Evaluation: Embracing Complexity, Connectivity, and Change,” in Next Generation Conference, 2013, pp. 1–20.
  10. G. G. Gable, D. Sedera, and T. Chan, “Re-conceptualizing Information System Success: the IS-Impact Measurement Model,” Journal of the Association of Information Systems, vol. 9, no. 7, pp. 377–408, 2008.
  11. W. H. Delone and E. R. McLean, “Information Systems Success: The Quest for the Dependent Variable,” Information Systems Research, vol. 3, no. 1, pp. 60–95, 1992.
  12. C. B. Jarvis, S. B. Mackenzie, and P. M. Podsakoff, “A Critical Review of Construct Indicators and Measurement Model Misspecification in Marketing and Consumer Research,” Journal of Consumer Research, vol. 30, pp. 199–218, 2003.
  13. D. Renzel, R. Klamma, and M. Jarke, “IS Success Awareness in Community-Oriented Design Science Research,” in New Horizons in Design Science: Broadening the Research Agenda, Switzerland, 2015, vol. 9073, pp. 413–420. DOI: 10.1007/978-3-319-18714-3_33
  14. L. Jäger, M. Jarke, R. Klamma, and M. Spaniol, “Transkriptivität: Operative Medientheorien als Grundlage von Informationssystemen für die Kulturwissenschaften,” Informatik-Spektrum, vol. 31, no. 1, pp. 21–29, 2008. DOI: 10.1007/s00287-007-0218-9
  15. R. Klamma, “Community Learning Analytics: Challenges and Opportunities,” in Advances in Web-Based Learning - ICWL 2013, vol. 8167, J.-F. Wang and R. Lau, Eds. Berlin-Heidelberg, Germany: Springer, 2013, pp. 284–293. DOI: 10.1007/978-3-642-41175-5_29
  16. D. Renzel and R. Klamma, “From Micro to Macro: Analyzing Activity in the ROLE Sandbox,” in Proceedings of the Third International Conference on Learning Analytics and Knowledge, 2013, pp. 250–254. DOI: 10.1145/2460296.2460347
  17. D. Renzel, R. Klamma, M. Kravcik, and A. Nussbaumer, “Tracing Self-Regulated Learning in Responsive Open Learning Environments,” in Advances in Web-Based Learning - ICWL 2015, Berlin-Heidelberg, 2015, vol. 9412, pp. 155–164. DOI: 10.1007/978-3-319-25515-6_14
  18. M. Derntl, R. Klamma, I. Koren, P. Nicolaescu, D. Renzel, K. Ngua, J. Purma, D. Zaki, T. Treasure-Jones, G. Attwell, O. Gray, T. Ley, V. Tomberg, C. Henry, C. Whitehead, D. Theiler, C. Trattner, R. Maier, M. Manhart, M. Schett, and S. Thalmann, “Initial Architecture for Fast Small-Scale Deployment,” Learning Layers Project, Deliverable D6.1, 2013.
  19. D. Renzel and R. Klamma, “Semantic Monitoring and Analyzing Context-aware Collaborative Multimedia Services,” in Proceedings of the 2009 IEEE International Conference on Semantic Computing (ICSC 2009), Sep 14-16, Berkeley, CA, USA, Los Alamitos, CA, USA, 2009, pp. 630–636. DOI: 10.1109/ICSC.2009.112
  20. M. Derntl, M. Kravcik, R. Klamma, I. Koren, P. Nicolaescu, D. Renzel, A. Hannemann, M. Shahriari, J. Purma, M. Bachl, E. Bellamy, R. Elferink, V. Tomberg, D. Theiler, and P. Santos, “Customizable Architecture for Flexible Small-Scale Deployment,” Learning Layers Project, Deliverable D6.2, 2014.
  21. R. Klamma, I. Koren, P. Nicolaescu, D. Renzel, M. Kravčík, M. Shahriari, M. Derntl, G. Peffer, and R. Elferink, “DevOpsUse - Scaling Continuous Innovation,” Learning Layers Project, Deliverable D6.3/Report 4, 2015.