Directing Collaborative Research Practice in a Global Arena

Research study presented at 2014 Midwestern Regional Conference of the Comparative International Education Society (MCIES). Please join us as we discuss how researchers are using collaborative and qualitative technologies to shape their research process.

For additional questions, please email me at nsabir@indiana.edu

Abstract:

Information technologies have rapidly reshaped the scope of international education research as contemporary software developments enable global collaborative research projects. This study evaluates value-based judgments about technologies considered relevant to the educational research process. To explore the role of these digital tools within the context of researcher development and ongoing qualitative research projects, this participatory action research study examines the reflexive journaling of ten doctoral students across a 14-week period. The shift in international education research paradigms and the constant updating of expectations in practice prompt a need to better understand which technological resources are valued. The thematic journal analysis revealed a call for: (1) open educational resources across information management systems, used in the literature review and theory-building stages; (2) resources that promote accessible, collaborative, and transparent information collection and analysis; and (3) technologies that integrate reflective practices and commentary throughout the research process. Digital tools that reflected these characteristics were highly valued in conducting collaborative research in a global arena. The findings from this study provide groundwork for international education researchers to successfully navigate the technological logistics of conducting collaborative research endeavors.

 

Suggested citation:

Sabir, N. (2014, October). Directing collaborative research practice in a global arena. Presentation at the 2014 Midwestern Regional Conference of the Comparative International Education Society, Bloomington, IN.

Real-Time Audio Reflection of Video Analysis for Action Research

A rite of passage for many teachers is a foundational video case analysis of their teaching, recorded and reviewed by external faculty and staff members.  This practice is in place for pre-service teachers, teachers under review at new schools, and even associate instructors at the university level. While often viewed as a slightly intimidating process, the video review process is integral to establishing action research ideals for teachers.

The reflection process is crucial not only to the teacher’s development but also to enhancing their instructional approaches.  Many teacher preparatory programs strive to teach future teachers reflective practices that directly inform their action (Hatton & Smith, 1995).  Teachers need to think critically about, and learn from, their past experiences through meaningful reflective practices.

While reflections can take place through listening, speaking, drawing, and any other way imaginable, the most meaningful reflections often take place after watching yourself perform tasks.  The idea is for teachers to video record themselves; capture objective descriptions of what happened; discuss feelings, ideas, and analysis; and discuss how they reacted as a result of the experience.  The figure below represents the reflection process (adapted from Quinsland & Van Ginkel, 1984).

[Figure: The reflection process, adapted from Quinsland & Van Ginkel (1984)]

According to Quinsland and Van Ginkel (1984), processing is a practice that encourages one to reflect on, describe, analyze, and communicate one’s experiences. This processing and reflection will not only allow for an enhanced learning experience but will also contribute to the teaching and learning of future students.  Past literature has shown that critical reflection increases learning, understanding, and retention (Daudelin, 1996).  Additionally, it invokes a process of taking meanings and moving them into learning (Mezirow, 1990).

The process of reflection is critical to action research (Kemmis, 1985), and action research needs to be systematic (Gore & Zeichner, 1991; Price, 2001), creating questions and answering them in the teaching context.  Historically, many teachers have used a variety of tools for this, such as observation logs and reflective journals (Darling-Hammond, 2012).

This activity will walk you through that process:

The first step is to insert the video observation into ELAN.  Give the video an initial viewing and add in annotations.  Annotations should be reflections on your teaching and immediate methods; they can also be ideas that you wish to further explore and revisit.

The second step is to create an audio-based discussion. As the video is playing, create an audio recording of your immediate reflections.  During the second video run-through, pause periodically to voice-record your thoughts.

Place the audio recording into the ELAN platform, synchronizing the waveform with the video observation.  Once you mute the observational video, you will be able to listen to your thought process overlaid on your observational data.  Another way of looking at this overlay: the reflected audio file replaces the audio component of the video observation.  This will allow you to pair your analysis with the observation, reflecting on the moments of instruction.

Once audio and video, with annotations, are embedded and synced, add a second layer of annotations based on the alignment between your audio reflections and the video.  These can be areas for improvement, implications for future practices, and moments that surprise you.  By integrating aspects of verbal, visual, and kinesthetic cues, teachers can establish retrieval systems that will allow them to change practices on the fly.
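For those curious what this two-tier structure looks like under the hood, ELAN stores annotations in an XML-based .eaf file. The following is a minimal sketch, in Python, of generating that structure programmatically: one tier for the first-pass annotations and one for the second layer aligned to the audio reflection. The tier names, timestamps, media path, and annotation text are illustrative placeholders, and in practice you would manage this file through ELAN's own interface (or a library such as pympi) rather than by hand.

```python
# Sketch: build a minimal ELAN-style (.eaf) annotation document with two
# tiers. Tier IDs, timestamps, and text below are placeholders.
import xml.etree.ElementTree as ET

def build_eaf(annotations, media_url="file:///observation.mp4"):
    """annotations: list of (tier_id, start_ms, end_ms, text) tuples."""
    root = ET.Element("ANNOTATION_DOCUMENT", AUTHOR="",
                      DATE="2014-10-01T00:00:00Z", FORMAT="2.7", VERSION="2.7")
    header = ET.SubElement(root, "HEADER", MEDIA_FILE="",
                           TIME_UNITS="milliseconds")
    ET.SubElement(header, "MEDIA_DESCRIPTOR", MEDIA_URL=media_url,
                  MIME_TYPE="video/mp4")
    time_order = ET.SubElement(root, "TIME_ORDER")
    tiers, slot_count, ann_count = {}, 0, 0
    for tier_id, start, end, text in annotations:
        # Create each tier once; annotations are appended to their tier.
        if tier_id not in tiers:
            tiers[tier_id] = ET.SubElement(root, "TIER", TIER_ID=tier_id,
                                           LINGUISTIC_TYPE_REF="default-lt")
        # Every annotation references two time slots (start and end).
        refs = []
        for t in (start, end):
            slot_count += 1
            sid = f"ts{slot_count}"
            ET.SubElement(time_order, "TIME_SLOT", TIME_SLOT_ID=sid,
                          TIME_VALUE=str(t))
            refs.append(sid)
        ann_count += 1
        ann = ET.SubElement(tiers[tier_id], "ANNOTATION")
        align = ET.SubElement(ann, "ALIGNABLE_ANNOTATION",
                              ANNOTATION_ID=f"a{ann_count}",
                              TIME_SLOT_REF1=refs[0], TIME_SLOT_REF2=refs[1])
        ET.SubElement(align, "ANNOTATION_VALUE").text = text
    ET.SubElement(root, "LINGUISTIC_TYPE", LINGUISTIC_TYPE_ID="default-lt",
                  TIME_ALIGNABLE="true")
    return root

doc = build_eaf([
    ("first_pass", 12000, 18000, "Rushed the warm-up; revisit pacing."),
    ("audio_sync", 12000, 18000, "Audio reflection: students needed more wait time."),
])
eaf_xml = ET.tostring(doc, encoding="unicode")
```

Serializing `doc` and opening it in ELAN would show the two tiers stacked beneath the timeline, which is exactly the layered view the steps above produce manually.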

These approaches will allow teachers to self-reflect and create keys that indicate needs for change.  This systematic approach to identifying problems and providing solutions takes a critical stance on teacher-based action research.  The benefit of using video- and audio-based reflections is the fluid and organic nature of reflection that allows teachers to improve their instructional techniques effectively (Altrichter, Feldman, Posch, & Somekh, 2013).

References

Altrichter, H., Feldman, A., Posch, P., & Somekh, B. (2013). Teachers investigate their work: An introduction to action research across the professions. New York, NY: Routledge.

Darling-Hammond, L. (2012). Powerful teacher education: Lessons from exemplary programs. San Francisco, CA: John Wiley & Sons.

Daudelin, M. (1996). Learning from experience through reflection. Organizational Dynamics, 24(3), 36-48.

Gore, J. M., & Zeichner, K. M. (1991). Action research and reflective teaching in preservice teacher education: A case study from the United States. Teaching and Teacher Education, 7(2), 119-136.

Hatton, N., & Smith, D. (1995). Reflection in teacher education: Towards definition and implementation. Teaching and Teacher Education, 11(1), 33-49.

Kemmis, S. (1985). Action research and the politics of reflection. In D. Boud, R. Keogh, & D. Walker (Eds.), Reflection: Turning experience into learning (pp. 139-164). New York, NY: Routledge.

Mezirow, J. (1990). How critical reflection triggers transformative learning. In Fostering critical reflection in adulthood (pp. 1-20). San Francisco, CA: Jossey-Bass.

Price, J. N. (2001). Action research, pedagogy and change: The transformative potential of action research in pre-service teacher education. Journal of Curriculum Studies, 33(1), 43-74.

Quinsland, L. K., & Van Ginkel, A. (1984). How to process experience. The Journal of Experiential Education, 7(2), 8-13.

 

Issues with Online Data Collection

In recent years, the use of online data sources has led to new developments, including the standardization of resources and a widening of scope, even though not all critical information is visible to the researcher.  However, online conversations are constantly evolving, and perceptions online are always in flux.  Seymour (2001) claims that the internet releases the interview stage from “its imprisonment in time and place” (p. 158).

There are issues that cross over between virtual spaces and traditional data collection methods, stemming mainly from the effects of the medium on the data-gathering step.  Traditionally, qualitative research collects documents/artifacts, interviews, and observations. In traditional data gathering, humans are responsive and adaptive, whereas in electronic media, data are often constructed via computer assistance. Electronic forms of all these sources now exist, where illustrations, programs, and networks serve as artifacts.  Data collection differs with the nature of the medium; computer-mediated communication has a unique effect on information construction.

There are also ethical considerations to address.  Merriam (2009) discusses four main ethical concerns with online research: obtaining informed consent; ensuring confidentiality and security; determining what is public and what is private; and debriefing participants.  Many of these issues came up during class, the most prominent being privacy and publicity, gaining informed consent, and issues of “reality.”

For the issue of public and private information, the discussion centered on expectations for participants. For instance, if the researcher is reflecting in a public space about the participants, then it might endanger the participants’ identities. Additionally, while participants often agree to having their information published as part of a study, they might be less comfortable with having their information blogged about.  Personally, I think this is an especially important issue for in-service teachers who wish to engage in action research, since self-reflection is a critical piece of the action research process (Carr & Kemmis, 2003).

 

References

Carr, W., & Kemmis, S. (2003). Becoming critical: Education, knowledge and action research. London, UK: Routledge.

Merriam, S. B. (2009). Qualitative research: A guide to design and implementation (pp. 156-163). San Francisco, CA: John Wiley & Sons.

Seymour, W. S. (2001). In the flesh or online? Exploring qualitative research methodologies. Qualitative Research, 1(2), 147-168.

 

Reflection on “Flexner, Accreditation, and Evaluation”

I want to start off by mentioning that evaluations are a little different from research in that they are client driven and often begin with a different frame of reference.  Also, I have often heard that evaluations are a study in common sense.

Hang on to these two thoughts, as we will revisit them in a little bit.  Additionally, Flexner (1910) mentions common sense as a valid method.  Floden (1980) discusses accreditation as the evaluation of a school, where “schools” refers to “departments, programs or colleges” (p. 35) and other educational institutions.

One of the first concepts to hit me while reading this article is the idea of weighing internal and external evaluators’ strengths.  Floden mentions the importance of member checking: there is incredible value in leveraging different perspectives during an educational evaluation. However, Floden brings up a critical question: who guides the evaluation process? Is it the clients? The major stakeholders? All of the participants? Floden (1980) questions the impact of certain groups and asks, “which groups will control the process” (p. 36).

To apply this concept of evaluation in educational contexts, guided by differing parties, we can adopt Flexner’s procedures and educational approaches.  Flexner (1910) offers three guiding questions for educational programs, projects, and/or interventions.  The first question: 1) How should evaluation procedures be determined? Who should be in charge of directing procedures? Is it the evaluators (often experienced in such processes) who should guide the evaluation? Or is it the clients (often paying the bill) who should take the lead in the evaluation process? In the end, there needs to be communication between client, stakeholders, and evaluators to come to an agreement on procedures.

The next main question: 2) Who should participate in the evaluation process?  While there is a need to communicate with key stakeholders and include ‘everyone’ in the process, priority should be placed on which parties can give the most valuable information (considering resources as well).  In fact, Flexner (1910), as cited by Floden (1980), mentions that “insider knowledge” is required (p. 39) for successful educational evaluations.  This goes back to the idea of using internal and external evaluators for a well-rounded evaluation. The final question adopted from Flexner concerns final recommendations and effects: 3) What are the effects of the evaluation?  This includes both the positive and negative impacts of an evaluation.

References:

Flexner, A. (1910). Medical education in the United States and Canada: A report to the Carnegie Foundation for the Advancement of Teaching (No. 4). Carnegie Foundation for the Advancement of Teaching.

Floden, R. E. (1980). Flexner, accreditation, and evaluation. Educational Evaluation and Policy Analysis, 2(2), 36-46. Retrieved from http://www.jstor.org/stable/1163932

Digital Tools (Visual Representation)

This graphic represents many of the conversations noted from class.  As we discuss the potential and possible uses of digital tools in (qualitative) research, the process of research, and where each tool plays a part, is an image that keeps popping into my head. I finally had to create a skeleton adapted from several sources (Coffey, Holbrook, & Atkinson, 1996; Davidson & di Gregorio, 2011; Gratton & O’Donnell, 2011; Matthews & Cramer, 2008; Miles & Huberman, 1994; Paulus, Lester, & Dempster, 2013) as well as in-class discussion notes.

When we talk about information and research, there are really two different types of data: researcher-generated information and participant-generated information.  For example, Silverman (2011) writes that certain kinds of data, such as focus groups or interviews, could not exist without the researcher’s facilitation. So as you look over the image below, consider the types of information and how different entities can facilitate or create meaning.

When recording field notes and conversations, you can go old school with pen and paper, jazz it up with video or audio recordings, or use new tech such as audio-recording pens (e.g., LiveScribe).  The abundance of smartphone applications that allow you to record audio is amazing, regardless of platform.  This shift and expansion of technological tools means that “new sites of study and ways of engaging with participants” (Hine, 2008, as cited in Paulus, Lester, & Dempster, 2013, p. 70) are growing ever more prominent and accessible.  So not only do we have new ways of constructing meaning, but we can also gather information across greater distances (e.g., Gratton & O’Donnell, 2011; Matthews & Cramer, 2008).  Through archival tools and web conferencing, new sites of study are being revealed to educational researchers.

These ideas of creating knowledge through data reminded me of a classroom discussion on revamping Miles and Huberman’s (1994) list of computer uses in qualitative research.  This idea of shifting from traditional ethnographies to netnographies, using “ethnographic approaches to study unique digital spaces” (Paulus, Lester & Dempster, 2013, p. 76), really pushed me to think about online communities, social networks, virtual worlds, and serious games as new research environments.

[Figure: Digital tools across the qualitative research process]

References:

Coffey, A., Holbrook, B., & Atkinson, P. (1996). Qualitative data analysis: Technologies and representations. Sociological Research Online, 1(1). Retrieved from http://www.socresonline.org.uk/1/1/4.html

Davidson, J., & di Gregorio, S. (2011). Qualitative research and technology: In the midst of a revolution. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (4th ed., pp. 627-643). London, UK: Sage.

Gratton, M., & O’Donnell, S. (2011). Communication technologies for focus groups with remote communities: A case study of research with First Nations in Canada. Qualitative Research, 11(2), 159-175.

Matthews, J., & Cramer, E. P. (2008). Using technology to enhance qualitative research with hidden populations. The Qualitative Report, 13(2), 301-315.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. London, UK: Sage.

Paulus, T. M., Lester, J. N., & Dempster, P. (2013). Digital tools for qualitative research. London, UK: Sage.

Silverman, D. (2011). Interpreting qualitative data: Methods for analysing talk, text, and interaction. London, UK: Sage.

Last Looks Out The Window

Eight months from now I will be presenting my Dossier II. In our department, Dossier presentations are much like an oral examination: a formal presentation (15 min.) followed by an inquisition (30 min.), after which the faculty members leave the room to decide your fate. Well, this was my last set of presentations to watch before I have to present my own. There were a couple of things I learned (and many more I re-acknowledged), and I wanted to share a few of my musings.
The need to concentrate and focus – despite the plethora of research topics, researchers need to pick concentrations and focus, and they need to build upon the common feedback. There are many areas of this with which I agree, but many more where I am reserved in claiming specificity. It seems that the more I focus on a singular research topic, the more information about the field at large I seem to lose.
And I think this is especially going to be a problem for me…in my presentation. I have a variety of research interests across diverse contexts, and I am fearful that my contexts are going to throw off the review committee, that the contexts are going to matter more than the questions…Still, I don’t see my scatterbrainedness as a detriment; I think it gives me an edge and makes me more marketable.
If I had to define a research area, it would be making professionals/instructors even better at their jobs. Yes, that’s it! Everything I have done has been for the development of professionals, whether it was teaching W200 or working on the PFF conference. There seems to be this underlying trend of trying to enhance the experience of future (and current) instructors by giving them the opportunity to develop a particular skill set. Well, at least I think that is a common trend…perhaps I need to mull it over a bit more.
The other two concepts that struck me were the need to be consistent with word choice, and admitting to the gaps and areas for improvement. I think “it requires further investigation” is going to become a new favorite phrase. It is unrealistic to assume that you will be able to address all aspects of a particular research project…and I don’t think students should fudge their responses; rather, admitting to faults openly allows for greater conversation (sometimes).
Another point that was made very clear: the need to tell a cohesive story. For the career shifters out there, this might be a bit harder. But I think it is important to relate your research interests to your experiences and past. Sometimes more important than describing what you hope to achieve is explaining your past and describing how it shaped who you are today.
And before I forget, I thought my peers had brilliant graphics displaying their research projects and teaching agendas. So I thought I would make my own, see below!

Teaching Experience

[Figure: Teaching experience overview]


Creating meaning from the world around ‘you’

As I peruse Facebook, I come across my little cousin’s posts scattered about my timeline.  As one of the few adult “friends” she has given unlimited access to view all her information, it becomes instantly obvious that her profile is geared towards her friends.  There were Instagram photos and comments about all the cool stuff she has been doing, encoded messages to special friends, and rants (upon rants, upon rants) about school and homework.  The thought occurred to me that her parents might not know that she recently nearly failed a test…but all of her “friends” did.  Her entire identity was constructed with the pretense of creating an image that best represented her to her friends.

I can only imagine how much work would be required to “clean up” her profile before she added her parents as “friends.” In fact, this cleaning-up process is often instituted when students are entering the workforce. During one of my teaching sessions, I gave a short lecture on identity creation via social media…and I was amazed at how many of my students removed inappropriate comments and pictures.  (Keep in mind these are people who are going to be future teachers.)

But this whole notion of changing your online identity with a couple of clicks is very interesting. What if we could do that in the real, physical world? Wait, can’t we? What is stopping us from constructing an image specific to certain groups or individuals?  Let me give a personal example: When I step foot on campus (mainly the Education building), I revert to professional mode – the lecturer, hard-working student, and caring mentor come out to play! But when I am at home (with my family), I am more of a hyper ball of energy, playing pranks and making a mess of things.  Two distinct identities presented in different situations, to the point where my academic peers can’t imagine a goofball side and my family can’t understand how I was a teacher and professional.  Where am I going with all of this? Hang on…I have a point, I promise.

This week the idea of observation struck me. What if what I am observing isn’t the whole picture, or even the most important bits? Watt (2007) comments on the impossibility of true or “pure” observations, but refers to the opportunity to interact with participants. Well, that may be so, but something about my inability to capture the truth doesn’t sit well with me…after all, aren’t we truth seekers?  By nature, as researchers, we are attempting to capture the truth and accurately represent information, to the point where issues of reliability and validity are such sticking points in research defenses; even so, there have been decades of paradigm wars arguing over how information should be represented (Alise & Teddlie, 2010). If by design we are attempting to capture truth, and then accepting that we can’t truly do so…doesn’t that sound a little odd?

In McKernan’s (1996) handbook on conducting action research in education settings, he mentions the value of observations.  In fact, many of the cases and designs/methods presented in this cookbook begin with observation.  A supplementary text by Bogdan and Biklen (1998) also scaffolds a handbook-style approach to conducting educational research in classroom settings.  But I think we have already established that observations can be greatly flawed.

Hatch (2002) claims that observations “capture more in-depth insights” into specific contexts (p. 133); he goes on to describe that observations are used for “constructing questions” (p. 133).  So if we are using observations to construct questions for interviews, focus groups, or even surveys (mixing data collection methods is fairly common), we are possibly compounding, even confounding, our error.  One of the key arguments for using a more qualitative approach in research is to allow for better understanding, one that is more descriptive and explanatory in nature.  But what if we are confounding our errors during observation?

Here’s a recent case of the perceived use of internet searches: About a year ago, Google’s chairman visited North Korea and witnessed students using Google’s search engine for educational research. An outside observer witnessing this interaction might assume that this is a regular occurrence and forget to note that North Korea has only a national intranet.

For more information about observations and bias check out Grimes & Schulz (2002).

References:

Alise, M. A., & Teddlie, C. (2010). A continuation of the paradigm wars? Prevalence rates of methodological approaches across the social/behavioral sciences. Journal of Mixed Methods Research, 4(2), 103-126.

Bogdan, R. C., & Biklen, S. K. (1998). Qualitative research in education: An introduction to theory and methods. Needham Heights, MA: Allyn & Bacon.

Grimes, D. A., & Schulz, K. F. (2002). Bias and causal associations in observational research. The Lancet, 359(9302), 248-252.

Hatch, J. A. (2002). Doing qualitative research in education settings. Albany, NY: SUNY Press.

McKernan, J. (1996). Curriculum action research: A handbook of methods and resources for the reflective practitioner. London, UK: Kogan Page Limited.

Watt, D. (2007). On becoming a qualitative researcher: The value of reflexivity. The Qualitative Report, 12(1), 82-101.

Evaluation – The song that never ends

This theme of evaluation seems to keep coming up, over and over again, much like a song whose words are at the tip of your tongue but you just can’t seem to get out. This week I was actually evaluated…it wasn’t a self-reflection on what I could have changed but an external review that involved a camera recording my entire class, a formal sit-down, and a report. Despite that external pressure, I didn’t feel the need to over-prepare or fluff up my class with cool new tricks. In fact, even though I was being recorded, I found myself just teaching, and the evaluation stuff faded away. At that moment I was focused on delivering content in the best way I could; I wasn’t thinking about what I could have changed from last week or five minutes past.

So, 24 hours post-instruction, it still hasn’t hit me, the whole you-are-being-evaluated thing. I think that may be because I know everyone out there is going to have their two cents about teaching and the best way to do ‘this’ or ‘that.’ At the end of the day, I am my own worst critic, and more than getting that report or being told how I performed, I want a copy of myself teaching so that I can analyze what I did and what I can do to make it better.
