Thursday, May 29, 2014

Cognitive Engagement in Online Postings

Shukor, N. A., Tasir, Z., Van, M. H., & Harun, J. (2014). A Predictive Model to Evaluate Students’ Cognitive Engagement in Online Learning. Procedia - Social and Behavioral Sciences, 116, 4844-4853.

Shukor, Tasir, Van & Harun use an interesting methodology to explore cognitive engagement in online course postings.  Since educators need to verify the quality of online learning, examining the level of cognitive engagement generated serves as a good starting point.  After citing studies on how cognitive engagement increases motivation, improves grades and enhances learning, the authors look at studies investigating cognitive engagement in online environments.  Using a 12-point coding system developed in one of these studies (Van der Meijden, 2005), Shukor et al. examined all postings for 20 students enrolled in a web development course at a Malaysian university.  Combining coded responses allowed students to be ranked as showing high, high-low or low cognitive engagement.  Finally, the researchers compared these rankings with course management data to investigate how often these students had logged in and read postings.  Using their content analysis results and data mining techniques, the researchers offered a predictive model for what factors lead to cognitive engagement.
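For readers curious what that aggregation step might look like in practice, here is a minimal Python sketch.  The category labels and the majority-count rule are my own illustrative assumptions for the sake of the example; they are not the actual categories or cutoffs of the Van der Meijden coding scheme or of this paper.

```python
# Hypothetical sketch: each posting has already been coded into a
# category; per-student counts are then collapsed into an overall
# engagement ranking. Labels and thresholds are illustrative only.
from collections import Counter

def engagement_level(coded_postings):
    """Rank a student as 'high', 'high-low', or 'low' from coded postings."""
    counts = Counter(coded_postings)
    high = counts.get("high", 0)   # elaborated, knowledge-building responses
    low = counts.get("low", 0)     # unelaborated, information-only responses
    if high > low:
        return "high"
    if high == low:
        return "high-low"
    return "low"

# Toy data standing in for the 20 students' coded postings.
students = {
    "S1": ["high", "high", "low"],
    "S2": ["low", "low", "high"],
    "S3": ["high", "low"],
}
rankings = {s: engagement_level(p) for s, p in students.items()}
```

These rankings could then be set alongside login and reading counts from the course management system, which is essentially the comparison the researchers ran.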

I was impressed with the coding system and intrigued by the use of data mining techniques but less impressed by the predictive model.  The researchers concluded that two factors predicted the level of cognitive engagement: the total number of high cognitive engagement responses and the total number of unelaborated responses that only offered information.  Perhaps I am missing something, but this sounds circular: if you do this behavior, you'll show this behavior.  Nevertheless, it does suggest that students can be encouraged to increase cognitive engagement by writing more elaborated responses.  This connects to another, more informative finding: levels of cognitive engagement were not correlated with participation.  In other words, students who logged in frequently and read discussions did not necessarily show higher levels of cognitive engagement.  This does offer an instructor a starting point for encouraging engagement.  By offering more elaborated responses, students can deepen their learning.

Several other factors limit the conclusions.  For one thing, since the online component counted for only 20% of the course grade, the bulk of the course was apparently face to face.  Thus, students who showed high engagement in the online portion of the course may not have been highly engaged overall, or vice versa.  Since the quality of the course was not entirely invested in the online components, can we generalize these findings to fully online courses?  Another confounding factor is that English is unlikely to be the first language of most students at a Malaysian university, albeit an English-medium one.  The ability to demonstrate higher cognitive engagement through elaborated responses will inevitably be affected by linguistic proficiency.  Finally, the course studied here was a practical class where students were solving problems related to designing websites.  It would be interesting to see whether the results might differ for humanities or social science classes designed to probe ideas.


Because of the narrow scope and research concerns, I would not recommend this article as a general online learning resource.  However, it could be useful for course developers seeking to increase cognitive engagement or for researchers exploring the topic.  If nothing else, the coding methodology looks like a useful research tool, and the literature review points to some promising studies.

2 comments:

  1. This article's analysis of cognitive engagement also ties in with one of my posts on activity theory and motivation related to students' success in an online classroom. I am most interested in your article's finding that quantity of participation doesn't equate to cognitive engagement, but rather that depth of response, elaborating within the discussion, does. As you point out, it does sound like there were other factors that could have affected the results. I am looking at ESL learners in a few of my posts and articles, so I focused on that aspect of your review, as well as your post, in my blog entry. I would love to talk more about your teaching and experience in the ESL classroom.

  2. I would love to talk more about what I have observed of ESL learners. We alluded to this, of course, in class discussions. I haven't done any interactive writing projects with ESL learners, but my guess would be that they would like asynchronous discussion better.
