Help! How do I write a good AERA proposal?

Ring, ring! —  It is a hot Sunday afternoon.

It is the first day of my journey to visit family over a holiday break. Looking down at my phone I realize I have a missed call from…a professor…what? The education faculty never call me.

“Hello. Sorry I missed your call…” After a few seconds my mentor asks, “What are you doing on June __? Will you be in town?” I hesitantly answer with a negative, then affirmative. Turns out I am recruited, alongside a couple of other doctoral students, to do a presentation/workshop on how to submit successful AERA proposals.

As I crawl back into my car, it hits me – I have never had a conference proposal rejected. Why? Surely in my oh so very short journey as an alt-academic I must have had at least one, right?

Then I am struck with another, larger dilemma. I have no idea what I did right. I can’t even begin to describe to my mentees what I did “right” to get nearly two dozen peer-reviewed proposals accepted. If I have no clue, how can I explain it to others?

Writing conference proposals had become so ingrained in my writing schedule that I hardly noticed when I took the time to compose them. I needed to shed this second skin and think about my overall process (and the struggles I faced while writing AERA proposals the first time around).

Please remember that these notes are based on my experiences and preferences. Your faculty and friends are certain to have contradictory ideas.


Here is the list I came up with; Justin Whiting and Verily Tan from IU’s Creativity Labs also helped, and were present for the “Writing an AERA Proposal” Workshop. Please note that most of the links direct to specific 2017 AERA content.

Writing a successful proposal

General info

  • Don’t wait until the night before. This is not a 1-page proposal. Also, log into the AERA submission system a few days before you start writing; seeing the required fields will help you structure your sections in a word processor.
    • You can revise and upload a newer version of the proposal before the deadline.
    • If you are submitting a research project for presentation you will need IRB approval. If you don’t have it, be prepared to justify why!
  • Make sure your proposal follows APA guidelines and is spell-checked and grammar-proofed. Duh, right? But when you are pulling an all-nighter to write an AERA proposal you will be surprised what slips through the cracks. Get someone to proofread for you.
  • Work smart, not hard. If you have preexisting work, you can adapt it and build your AERA proposal off that.
  • Look over some example proposals. You can always ask your peers, faculty, and even family members to look it over.
  • Figure out what kind of presentation (Paper, Poster, or Roundtable) you want to do before you start writing and tailor your proposal. More info here.
  • If you select the option of submitting a Paper, think about also letting your proposal be considered for a Poster or Roundtable; that way you at least have a chance at presenting something.
    • Don’t be stuck on only wanting to do a Paper presentation. Posters and Roundtables can be more productive, especially for students, as they give you a chance to interact more with the audience. Posters are great for getting feedback, especially if you have some flexibility/are in the early stages of your research project.
    • Also figure out what division and section, or SIG you want to send it to. That way you can tailor the proposal to better align with their call.
    • If you are confused about which section/SIG to submit to, read over their Call for Proposals. You can find individual SIG/Section calls in the submission portal too!
      • Link to Division descriptions
      • Link to Special Interest Group (SIG) descriptions
  • Remember that AERA likes completed studies. Write in the past or present tense. Also you should present some data…even if they are preliminary findings.
    • The methods section is particularly important – i.e., research question, data sources, analysis methods
    • Preliminary findings may suffice. If accepted the expectation will be to report more detailed findings and insights.
  • Review all of the detailed guidelines. Did you include all the parts that you needed to? If you are missing even one element, such as a discussion, it hurts your chances greatly.
  • Here are the sections you need to include:
    • Objectives or purposes
    • Perspective(s) or theoretical framework
    • Methods, techniques, or modes of inquiry
    • Data sources, evidence, objects or materials
    • Results and/or substantiated conclusions or warrants for arguments/point of view
    • Scientific or scholarly significance of the study or work
  • Make sure you meet the word-count limits!
    • Title (15 words)
    • Abstract (120 words)
    • Paper (no more than 2,000 words)
  • You do not have to incorporate the annual theme into your paper, but it helps if you can. You shouldn’t completely change your research, presentation, or writing style just to fit the theme, but it helps if your research aligns with others in the program. Again, an important part of this is submitting to the correct SIG with similar research.
  • Omit author identification information from your proposal. That includes your writing, in-text citations, and references.
    • Confused? Here is how to do it: following APA, the in-text citation would be (Author, YEAR), and the reference entry: Author. (YEAR).
    • Also you may have to change the order of authors on your proposal. AERA only allows a certain number of first author entries per person.
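One low-tech safeguard is to check your word counts before pasting sections into the portal. Here is a minimal, illustrative Python sketch; the limits below reflect the 2017 call, so verify them against the current call before relying on it:

```python
# Sanity check against AERA word-count limits.
# NOTE: these limits are from the 2017 call and may change year to year.
LIMITS = {"title": 15, "abstract": 120, "paper": 2000}

def word_count(text: str) -> int:
    """Count whitespace-separated words."""
    return len(text.split())

def check_limits(sections: dict) -> dict:
    """Return each section's word count and whether it is within the limit."""
    return {
        name: (word_count(text), word_count(text) <= LIMITS[name])
        for name, text in sections.items()
    }

# Hypothetical proposal text, for illustration only.
proposal = {
    "title": "Directing Collaborative Research Practice in a Global Arena",
    "abstract": "This study evaluates value-based judgments about technologies...",
    "paper": "Objectives or purposes: ...",
}
for name, (count, ok) in check_limits(proposal).items():
    print(f"{name}: {count} words ({'OK' if ok else 'over limit'})")
```

Nothing fancy, but it catches the “oops, my abstract is 140 words” problem before the submission system does.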

Specific to the proposal

  • Make your proposal stand out from the start! The abstract and introduction are very important. Remember, most reviewers end up reading 10 or more proposals – make yours memorable.
  • Be descriptive (yet concise) in your methods section. Let your reader know what you did during your study! A good test for this is calling up a parent/sibling and having them read then explain the section to you. If they can accurately describe your methods after reading the section, then you win!
    • This is a hard skill to master, and it takes time – So, don’t expect your first proposal to be perfect even when you submit it. Just focus on getting it done to the best of your ability.
  • Explain why your research is important. Yes, we know it furthers the knowledge base…but really, why should someone care? What is the impetus?
  • Your research doesn’t need to be perfect. Do your best and get feedback, but don’t get stuck on trying to change the world or have the perfect paper. Remember that you will have more than one reviewer, so don’t be discouraged! Proposal scores are averaged across three or more reviewers, so you get feedback from different perspectives too!

 

Questions? Shoot me a line at nsabir@indiana.edu

Please add anything I missed in the comments below.

AERA 2016 Workshop

You can view the hour-long workshop on YouTube: https://youtu.be/gT73BwAvFxQ

 

 

KM Self Study Part III

I began a self-study of my learning organization’s growth and flexibility, in hopes of better understanding and articulating the challenges we faced in adapting to new educational paradigms and standards. These postings cover not only the daily occurrences at my college of education, but also some of the experiences I have had while working at other organizations. If you have questions or thoughts please email me at nsabir@indiana.edu.

Institutes of education managing change and shift: A reflective piece utilizing Daft’s framework (part III)

Bias, mission & goals

I think the “most dangerous” of the biases Daft discusses is seeing what you want to see: not because you truly want to “see” something, but rather because the information we gather, interpret, and present to others is constantly shaped by our epistemological and ontological underpinnings. I’m stealing this story from a workshop I ran while working with Organization D.

So, Organization D’s mission statement is something along the lines of a peaceful world, and each department has specific/specialized goals targeted at regions or themes. The organization’s goal doesn’t face conflicting departmental goals so much as the message gets lost in all the moving parts, and sometimes there are no clear guidelines. On September 21, Peace Day, K-12 educators tackle teaching complex issues of global conflict and cultural awareness, often using social media and synchronous technologies. A couple of years ago there were not too many model schools for this initiative, and teachers dreamt up great projects for their students to do. One of the more common examples was Skyping/emailing a classroom across the world and then reflecting on the experience. In some cases the lack of organizational support/resources led to a propagation of negative stereotypes, especially at the K-5 level (“Japanese people know origami.” “Koreans eat dogs and that is gross!” “Baby girls are killed because parents want a son.” and “Peace means you can travel wherever you want.” – just to name a few that were brought up at this workshop). As teachers began to post their students’ work on online forums, the banter began about how irresponsible the teachers and Organization D had been to allow students to internalize stereotypes that could lead to conflict.

I think that working with an organization’s mission statement can sometimes be a game of Telephone, particularly when the statement is broad and allows departments a little too much freedom. While a broad mission statement can be great because it allows you to cover a lot of ground, different parts of the organization need to have specific objectives that support the official goal.

In many of the teams I have worked on, the problem is rather apparent – there is a ____ need – and the difficulties lie in crafting solutions. Oftentimes the problem is oversimplified by managers, as they are not the individuals placed in the field and their foundation for identifying and understanding the problem is relayed through secondary channels. Because the services the organizations deliver are so embedded in in-country dynamics, crafting solutions becomes the greater focus. Our organizations’ proposed solutions and timelines are also heavily impacted by major stakeholders – so it becomes a bit of a juggling act.

In our case, because the focus shifts toward finding sustainable and compatible solutions for all parties, the identification of the issue at hand tends to get brushed over. In the past I have seen this lead to all sorts of communication-based issues, loss of funding, misuse of fiscal and time-based resources, and I’ve even seen projects fail shortly after the first round of evaluations.

While the analytical and research-based curriculum team I work with attempts to institute changes by leveraging an incremental decision model, where instructional concepts are put through an iterative development process to ensure a best fit, this is far from what actually happens. In the past four years each term has begun with the goal of replicating a systematic and incremental decision process, yet after a couple of weeks that fades away. Because the curriculum development team has to account for student (user) diversity, variation in needs, and constant flux in environmental factors, a garbage can model would be more appropriate. Throw into the mix that we have 60% instructor turnover annually, and retaining structure can become problematic. Overarching goals are ill-defined, and problems and solutions are often identified vaguely and attended to simultaneously, leading to additional problems. More often than not our team crafts “solutions” for problems that don’t exist, while major problems are sometimes pushed to the side for another term (another batch to deal with). Do I think this is effective? No. However, it is the culture of my organization and my managers’ preferred style to allow for a more organic approach.

It is far too easy to go along with a group or majority decision. This happens a bit differently on my team. We currently have one very hands-off manager and 11 instructor-designers, who have equal say in how we facilitate our programme. While there are some team members who naturally take on more of a leadership role and others who are more outspoken about distinct issues, a groupthink mentality is often applied to major decisions. Now, this isn’t because of a lack of expertise or diversity but rather an understanding of “let’s agree to disagree” – and if we continue to “disagree,” the team isn’t nearly as productive, if at all.

Fin.

KM Self Study Part II

I began a self-study of my learning organization’s growth and flexibility, in hopes of better understanding and articulating the challenges we faced in adapting to new educational paradigms and standards. These postings cover not only the daily occurrences at my college of education, but also some of the experiences I have had while working at other organizations. If you have questions or thoughts please email me at nsabir@indiana.edu.

Institutes of education managing change and shift: A reflective piece utilizing Daft’s framework (part II)

Structure, control & culture

At my current job, we have a fairly decentralized system (for immediate interactions), which leads to a series of issues. In an effort to empower workers and support their professional development, management takes a more horizontal approach. However, our departments have major communication issues, and because some people are so attached to their work/frame of mind, this has also led to discrepancies in policies. Since all parties are not on the same page, this imposes stagnant approaches to problems.

In our organization the most frequent trap is, “people don’t have enough time to learn.” Because our organization experiences a disruptive environment that requires constant reevaluation, our employees are always having to relearn skills and procedures. For example, just last week a peer created a learning module that involved a specific set of software, and today we discovered that our student-consumers don’t have access to it. This required a complete revamp of the module, and all our staff had to learn new material and procedures in the span of a couple of days.

I think the most difficult step I have seen organizations encounter is the first one: setting up procedures and guidelines. My experiences have shown me that a lack of clear vision often makes this initial process difficult. I have seen firms hold a vague vision of execution, access to resources, and reflection, and only in the process of establishing these steps does the organization distinguish its current direction from its ideal process. That said, I believe the research findings lack one step: reevaluation of the model. Somewhere along the process there should be a place for leadership to pause and assess whether their current process is allowing employees to learn in the most effective and efficient means possible, and whether the scaffolding holds up to desired outcomes.

I believe that organizations’ design and structure should be driven by their goals, needs, and access to resources, rather than by what is considered “best practice.” The text talks about how vertical structures are more efficient while horizontal structures encourage more personnel growth, so I think that leveraging both aspects would be a good approach. Our process aligns more with the virtual network grouping model. While this approach enables flexibility and is responsive to changes in the environment, it can be difficult to coordinate and communicate with all members of the organization. The only times I have seen this grouping truly be effective/efficient is when someone very motivated, organized, and patient took on the role of integrator.

For day-to-day operations the college of education functions as a pooled interdependence system; however, on a larger scale it serves as a reciprocal interdependence system. All of the departmental outputs and procedures feed into one another at the end of the terms, yet weekly operations allow individual offices and departments to function as separate entities with standardized procedures. This multifaceted approach requires our organization to function with a very high level of communication and collaboration. More often we feel like a cross-departmental team working as a single entity rather than separate offices. The logistical strains are not felt by most employees; rather, the supervisors take on the initiative of coordinating all of the reciprocal activities.

I think that organizational decisions should be driven by needs, and not by what managers see working at another organization and hope to replicate in their own. Before an organization (re)creates a hybrid structure, leadership should consider the weaknesses that need to be addressed.

I used to freelance for a pharmaceutical company from Country B, Company A, working with both the local branch and the overseas offices. In Country B the organization is very centralized and represents a typical functional grouping model; however, the counterpart organization in State Z is completely different. Because the US branch is considerably smaller, it outsources most of its marketing and large-scale production, and much of the staff work across several departments. As citizens of Country B come to State Z as liaisons for a year or two, many of them actually struggle with adapting to the “lack of structure,” and several have even opted to return to Country B’s company because the hybrid model in the US organization was uncomfortable for them. I think it is interesting how an environmental culture impacts an organization’s culture so heavily.

The term effectiveness, and measuring an organization’s effectiveness, does seem to get a bit ambiguous without a bounded case paired alongside. The goal-based approach focuses more on the holistic meeting of an overarching organizational goal, while the internal process approach focuses more on internal workings, such as a positive work environment and organizational morale, and not on the organization’s output. One of the great features of using a resource-based approach is that it includes the initial bargaining mix and can include environment-organization factors.

At my current position several leadership departments and offices have a very centralized command structure, which is bounded by accreditation, fiscal, and international policy constraints. This bureaucratic system constrained employees to follow set protocols and stifled creativity. Several supervisors and office directors have shifted their organizations’ directions by changing their leadership style to account for employee empowerment, moving from a bureaucratic to a clan style of leadership and management. This transformation has been a slow process, with slight changes over several years and plenty of employee turnover. The shift has allowed employees to bring more of their expertise into the design of services.

To be continued…

Directing Collaborative Research Practice in a Global Arena

Research study presented at 2014 Midwestern Regional Conference of the Comparative International Education Society (MCIES). Please join us as we discuss how researchers are using collaborative and qualitative technologies to shape their research process.

For additional questions please email me at nsabir@indiana.edu

Abstract:

Information technologies have rapidly shaped the scope of international education research practices, as contemporary software developments allow global collaborative research projects. This study evaluates value-based judgments about technologies considered relevant to the educational research process. To explore the role of these digital tools, bounded within the context of researcher development and ongoing qualitative research projects, this participatory action research study looks at the reflexive journaling of ten doctoral students across a 14-week period. The shift in international education research paradigms and the constant updating of expectations in practice prompt a need to better understand which technological resources are valued. The thematic journal analysis revealed a call for: (1) open educational resources across information management systems, used in the literature review process and theory-building stages; (2) resources that promoted accessible, collaborative, and transparent information collection and analysis; and (3) technologies that integrate reflective practices and commentary during the entire research process. Digital tools that reflected these characteristics were highly valued in conducting collaborative research in a global arena. The findings from this study provide groundwork for current international education researchers to successfully navigate the technological logistics of conducting collaborative research endeavors.

 

Suggested citation:

Sabir, N. (2014, October). Directing collaborative research practice in a global arena. Presentation at the 2014 Midwestern Regional Conference of the Comparative International Education Society, Bloomington, IN.

System Update Available for Education 3.0

“Education 3.0 is characterized by rich, cross-institutional, cross-cultural educational opportunities within which the learners themselves play a key role as creators of knowledge artifacts that are shared, and where social networking and social benefits outside the immediate scope of activity play a strong role” (Keats & Schmidt, 2007, as cited in Lwoga, 2012).

The notion of Education 3.0 was first introduced into literature circles in a First Monday article by Keats and Schmidt (2007) and later expanded upon by Professor Lengel (2013). In short, Education 3.0 is a shift in how information is generated, communicated, validated, and disseminated within a technology-supported learning environment. The progression from education 2.0 to 3.0 mirrors the progression from web 2.0 to web 3.0 technologies. The move towards Education 3.0 results from growing dissatisfaction with current education paradigms and a need to design a system that meets the challenges of today’s society (Abas, 2010; Daggett, 2012; Toffler, 1984; Watson, Watson, & Reigeluth, 2013).

According to Lengel (2007), education 3.0 describes transformative practices, while education 2.0 focused on industrial-age skills and education 1.0 focused on agricultural talent. Harkins (2008) takes this notion one step further by describing education 3.0 as “knowledge-producing,” while education 4.0 is marked as “innovation-producing” education (p. 19). However, Harkins (2008) disagrees with Lengel’s (2007) historical description of how education 3.0 was established; he writes that education 2.0 was internet-enabled, while education 1.0 was focused on memorization. Moravec (2008) and McPheeters (2010) mark the shift into education 2.0 with the emergence of 21st-century learning skills. Gerstein (2013) writes that education 3.0 is a connectivist, heutagogical approach to teaching and learning, whereas education 2.0 was a cooperative and social teaching and learning process. Siemens (2005) defines connectivist learning as one that is connected, interactive, and transformative. Additionally, Gerstein (2013) calls for educators to implement Education 3.0 practices instead of “talking about doing education 2.0” while actually doing education 1.0 (n.p.).

One of the fundamental backbones of Education 3.0 is the shift in openness and the expansion of the learning environment (Paskevicius & Ng’ambi, 2011), where students are producers and collaborators using the new tools and information available to them (Keats & Schmidt, 2007). With the shift towards Education 3.0, Free and Open Education Resources (FOERs) (Blackall, 2009; Heller et al., 2007; Lwoga, 2012), mobile learning (Gerstein, 2013), and social networks have become imperative to successful implementation (Blackall, 2009). Furthermore, Keats and Schmidt (2007) claim that the interactivity of emerging technologies has the potential to connect students to larger “socio-political learning environments” (Carmichael & Farrell, 2012). Instructors are seen as conductors and facilitators of learning, while students, armed with internet resources, contribute to the classroom experience. Furthermore, the roles of institutions are also changing; their primary role has shifted to one of “accreditation” (Bradwell, 2009), moving away from the role of information gatekeeper. Several reports and texts (e.g., Davidson & Goldberg, 2009; Wiley, 2009) echo the call to restructure education to meet the changing needs of students and society.

The chart below is aggregated from several readings, including Moravec (2008b), Gerstein (2013), Keats and Schmidt (2007), and Lengel (2013). It shows how the purposes and values of certain instructional elements have changed over time. For example: the ways in which meanings are constructed have differed; the technology competencies of learners have grown; the learning pathways have changed, in that students no longer learn only from instructors; the spaces in which learning occurs have also changed; as have instructor roles.

Edu3 Table

It is imperative that learners have a positive experience with the learning tools and environments, in that the resources are user-friendly and accessible (Wang, 2013). Additionally, as individual learners have varying preferences, instructors leveraging Education 3.0 techniques need to consider learning styles (Oblinger & Oblinger, 2005). Kolb and Kolb (2005) identify four different learning modes: concrete experience is a receptive and experience-based mode; abstract conceptualization is an analytical and conceptual mode; active experimentation is an authority-directed and impersonal learning mode; and reflective observation is a reflective mode. These four approaches can be further combined to include additional learning models. The ways in which people interact with technology also differ and produce varying opportunities ingrained in the world around them (Orlikowski, 1992). Watson et al. (2013) call for reform to current educational practices to better engage students. Furthermore, Wang’s (2013) empirical study found that students engaged in traditional learning displayed less satisfaction than students using web 3.0 technologies.

References

Abas, Z. W. (2010). A framework for higher education 2.0: 21st century education for 21st century learners.

Blackall, L. (2009). Open educational resources and practices. Journal of e-Learning and Knowledge Society, 3(2), 63-81.

Bradwell, P. (2009). The edgeless university. London, UK: Demos.

Carmichael, E., & Farrell, H. (2012). Evaluation of the Effectiveness of Online Resources in Developing Student Critical Thinking: Review of Literature and Case Study of a Critical Thinking Online Site. Journal of University Teaching and Learning Practice, 9(1), 4.

Daggett, W. R. (2010). Preparing students for their technological future. International Center for Leadership in Education.

Davidson, C. N., & Goldberg, D. T. (2009). The future of learning institutions in a digital age. Cambridge, MA: The MIT Press.

Gerstein, J. (2013, May 13). Education 3.0 and the Pedagogy (Andragogy, Heutagogy) of Mobile Learning. Retrieved September 8, 2014.

Harkins, A. M. (2008). Leapfrog principles and practices: Core components of education 3.0 and 4.0. Futures Research Quarterly, 24(1), 19-31.

Heller, R. F., Chongsuvivatwong, V., Hailegeorgios, S., Dada, J., Torun, P., Madhok, R., & Sandars, J. (2007). Capacity-building for public health: http://peoples-uni. org. Bulletin of the World Health Organization, 85(12), 930-934.

Keats, D., & Schmidt, J. P. (2007). The genesis and emergence of Education 3.0 in higher education and its potential for Africa. First Monday, 12(3).

Kolb, A. Y., & Kolb, D. A. (2005). Learning styles and learning spaces: Enhancing experiential learning in higher education. Academy of management learning & education, 4(2), 193-212.

Lengel, J. G. (2012). Education 3.0: Seven Steps to Better Schools. New York, NY: Teachers College Press.

Lwoga, E. (2012). Making learning and Web 2.0 technologies work for higher learning institutions in Africa. Campus-Wide Information Systems, 29(2), 90-107.

McPheeters, D. (2009, October). Cyborg learning theory: Technology in education and the blurring of boundaries. In World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (1), 2937-2942.

Moravec, J. (2008a). Moving beyond education 2.0. Education Futures.

Moravec, J. (2008b, September 29). Toward Society 3.0: A New Paradigm for 21st century education. Retrieved September 12, 2014.

Oblinger, D., Oblinger, J. (Eds.), (2005). Educating the Net Generation, Educause. Retrieved from: http://www.educause.edu/educatingthenetgen/

Orlikowski, W. J. (1992). The duality of technology: Rethinking the concept of technology in organizations. Organization science, 3(3), 398-427.

Paskevicius, M., & Ng’ambi, D. (2011). The potential for Education 3.0 in a developing context using Giddens’ structuration theory. Retrieved from: http://www.bluelightdistrict.org/wp/wp-content/uploads/2010/08/mpaskevi_Research_Paper_v2.pdf

Siemens, G. (2005). Connectivism: A learning theory for the digital age. International journal of instructional technology and distance learning, 2(1), 3-10.

Toffler, A. (1984). The third wave: The classic study of tomorrow. New York, NY: Bantam Publishing Group.

Wang, J. (2013). Education 3.0: Effect learning style and method of instruction on user satisfaction. European Academic Research, 1(5), 755-769.

Watson, W. R., Watson, S. L., & Reigeluth, C. M. (2013). Education 3.0: Breaking the mold with technology. Interactive Learning Environments, (ahead-of-print), 1-12.

Wiley, D. (2009). Openness, disaggregation, and the future of schools. TechTrends, 53(4), 37.

Small Numbers = Big Problems

Response to Appadurai, A. (2006). Fear of small numbers: An essay on the geography of anger. Durham, NC: Duke University Press.

 

Arjun Appadurai discusses the Fear of Small Numbers in the context of national populations, identity, and minority groups.  The “small numbers” represent the minority groups present within nation-states.  Interestingly, Appadurai addresses these “small numbers” in a very negative light, attributing to them acts of terrorism and violence, and overall problems for the governance of nation-states.

Appadurai argues that minorities “create uncertainties about the national self and national citizenship because of their mixed status” (2006, p. 44); this view of minorities causes intolerance and strains governance.  In general, the “small numbers” represent not only the minorities but also the marginalized, stereotyped, and scapegoated of society, disrupting the everyday balance of life for common (majority) citizens.

While Appadurai argues that these “small numbers” are a disruptive force, he also argues that these demographics are carved into the fabric of society, not born into it.  This “fear of the weak” is a concept derived from the counting of populations (enumeration and census representations), divisions created by the nation-state itself.

Appadurai also mentions that these “small numbers” retain a cultural identity often linked to the nation-state and religion; language is another component that needs to be added.  Taking a look at the case of Singapore, one truly sees how the “small numbers” were created and their effect within an education system.

The case of Singapore is a prominent one.  Early in its history, primary and secondary schools were restructured to fall in line with the British-English system.  Notice in the table below that Westerners were not a majority; rather, this was implemented as a measure to ensure global competency. The new education policies actively drew children away from Chinese-medium schools, since English-medium schools provided a means of education for smaller fees.  Although the 1956 White Paper on Education pushed for Chinese schools to receive the same grant-in-aid, there was an increase in enrolments at English schools while no efforts were made to improve employment for Chinese school graduates.  Furthermore, Chinese students were no longer able to travel to China for further study.

Singapore

Official language(s): English, Malay, Mandarin Chinese, and Tamil

Religion(s): Buddhism (33%), Christianity (18%), Islam (15%), Taoism (11%), Hinduism (5.1%), and no religion (17%)

Ethnic groups: Chinese (75.2%), Malay (13.6%), Indian (8.8%), and other/Eurasian (2.4%)

Notice in the table that Chinese make up the majority of citizens, yet they were treated like the “small numbers” Appadurai describes.  There was intense pressure to govern border travel, to the point where study abroad was restricted, and such a strong “urge to purify” (2006, p. 53) that Chinese schools were slowly pushed out of the mainstream system.  The result of this back-and-forth clash was the establishment of a linguistic identity paralleled with a cultural identity.

The Singaporean-Chinese citizens were viewed as the disruptive force requiring supplementary effort from the governing parties.  The case of Singapore clearly follows Appadurai’s argument on the creation of minorities and the fear of “small numbers.”  However, Singapore represents a nation-state where it was the majority that was not tolerated by the government.

 

Importance of Teamwork in Mixed Method Research Projects

With the implementation of survey instruments there is little movement in quantitative data and minimal opportunity for varying interpretation of responses or question items (Bryman & Burgess, 1994).  With qualitative instruments integrated into multimethod studies, the structure is not as pronounced, and difficulties may therefore arise in interpreting and evaluating qualitative data (Manderson, Kelaher, & Woelz-Stirling, 2001).  Hence, in ongoing data collection phases, the management of information can become problematic when data are qualitative, collected by more than one researcher, and intended for multiple users (Bryman & Burgess, 1994).

As two researchers working on individual projects combine data within a single research endeavor, teamwork becomes crucial to the success of data analysis. Teamwork paired with reflexivity leads to improved productivity, effectiveness, and more robust, higher-quality research (Barry et al., 1999). At the qualitative stage specifically, West (1994) reports that teamwork enhances the rigor of the methodological design, analysis, and interpretive elements of a research project.

Additionally, teams can foster deeper conversations and higher levels of conceptual thinking than researchers working alone, enriching the coding and analysis process at each stage (Barry et al., 1999).  These benefits include integrating differing perspectives and more easily identifying bias (Liggett et al., 1994); better standardization of coding and improved accuracy in theme creation and application (Delaney & Ames, 1993); and advancing the overall analyses to a higher level of abstraction (Olesen, Droes, Hatton, Chico, & Schatzman, 1994).  To achieve a more rigorous data analysis process and reduce personal bias, teamwork is crucial to the multiphase research model.

During the analysis of both the quantitative data and the qualitative information, the team aspect is crucial to the development of coding schemes and to the interpretation of findings.  Multidisciplinary discussion frames both main analysis phases, sharpening the researchers’ attention to themes they might not have considered individually.
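The standardization of team coding that Delaney and Ames (1993) describe is usually checked with an inter-coder agreement statistic. As an illustration only (not a procedure from our study, and with invented example labels), here is a minimal Cohen’s kappa computation for two coders who applied theme labels to the same items:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical labels on the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of items both coders labeled identically.
    po = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    pe = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical theme labels from two coders on six interview excerpts.
a = ["identity", "policy", "identity", "language", "policy", "identity"]
b = ["identity", "policy", "language", "language", "policy", "identity"]
print(round(cohens_kappa(a, b), 2))  # 0.75
```

A kappa well below 1.0 is exactly the kind of result that should send a team back into conversation about the coding scheme before analysis proceeds.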

 

References

Barry, C. A., Britten, N., Barber, N., Bradley, C., & Stevenson, F. (1999). Using reflexivity to optimize teamwork in qualitative research. Qualitative Health Research, 9(1), 26-44.

Bryman, A., & Burgess, B. (Eds.). (1994). Analyzing qualitative data. New York, NY: Routledge.

Delaney, W., & Ames, G. (1993). Integration and exchange in multidisciplinary alcohol research. Social Science and Medicine, 37, 5-13.

Friedman, T. (2005). The world is flat. New York, NY: Farrar, Straus & Giroux.

Liggett, A. M., Glesne, C. E., Johnston, A. P., Hasazi, B., & Schattman, R. A. (1994). Teaming in qualitative research: Lessons learned. Qualitative Studies in Education, 7, 77-88.

Manderson, L., Kelaher, M., & Woelz-Stirling, N. (2001). Developing qualitative databases for multiple users. Qualitative Health Research, 11(2), 149-160.

Olesen, V., Droes, N., Hatton, D., Chico, N., & Schatzman, L. (1994). Analyzing together: Recollections of a team approach. In R. G. Burgess (Ed.), Analyzing qualitative data (pp. 111-128). London, UK: Routledge.

West, M. A. (1994). Effective teamwork. Leicester, UK: BPS Books.

 

Global classrooms promoting citizenship

As schools tout their mission to create global citizens competent in 21st century skills, loosely defined as those skills are, the mission of schools must also be to create globally responsible citizens. Noddings (2005) writes that knowledge by itself is not sufficient for success; it can be used on behalf of global concerns as well as self-interest, pointing to a value-based understanding of global citizenship. A sense of social justice, active civic engagement, and participation in service learning lead to globally responsible citizenship.  Teachers are able to collaborate worldwide to develop curricula that help establish global classrooms.

Teachers and students often view “good citizens” and “good persons” as synonymous (Ladson-Billings, 2004).  School-based institutions and instructors are essential in promoting these transformative ideals, and in developing students’ capacities and commitments for effective citizenship and global awareness (Davies, 2006; Westheimer & Kahne, 2004a, 2004b).

From a historical anthropological perspective, all fundamental education is citizenship education (Ladson-Billings, 2004). This might be due to the historic aim of creating students who would become active citizens in a better society.  Citizenship education should focus on students’ lives once they leave school and go out into the world; instead, classes focus on propositional knowledge rather than critical values, attitudes, and skills (Ladson-Billings, 2004).

Much akin to global competency and 21st century skills, terms such as global citizenship, transformative approaches, and global classrooms are difficult to define. There have been many attempts to define them based on research, etymology, and personal opinion. However, the terms are often contextual, and the emphases shift as necessary. There are broad conceptualizations of each term, but also additional goals that should be incorporated into the definitions.  Global citizenship can be defined as awareness of membership in the global community and environment, working toward the benefit of all its members; the goal of integrating it into education is to develop the skills and attitudes necessary to engage with other cultures and the global community in addition to the national community (Banks, 2004, p. 7).  Citizenship education is a means to establish important democratic ideals and develop students who are active, informed, and critical global citizens.  The basis of one’s citizenship and active participation in the nation is “an outgrowth of the prevailing worldview of his or her society” (Ladson-Billings, 2004, p. 100).

Cotton’s (1996) criticisms of the citizenship literature center on gaps in meaningful content, in relevance to lived experience, and in action-based learning.  In an attempt to remedy these criticisms, networks such as Round Square, TakingITGlobal, and Flat Classrooms incorporate innovative approaches and collaborative techniques.  The goal of integrating global classrooms into the teaching context is to move beyond a mass curriculum: to create citizens who are not merely personally responsible and/or participatory, but justice-oriented and motivated to effect social change.

As discussed by Westheimer and Kahne (2004b), there are three main kinds of “good citizens”: personally responsible citizens concern themselves with how their actions will affect themselves; participatory citizens go a step further by actively taking part; the ultimate goal, however, is to create justice-oriented students who seek to eradicate the root causes of problems.

 

References

Banks, J. A. (2004). Democratic citizenship education in multicultural societies. In J. A. Banks (Ed.), Diversity and citizenship education: Global perspectives (pp. 3-15). San Francisco, CA: Jossey-Bass.

Cotton, K. (1996). Educating for citizenship. School Improvement Research Series. Retrieved from http://www.nwrel.org/scpd/sirs/10/c019.html

Davies, P. (2006). Educating citizens for changing economies. Journal of Curriculum Studies, 38(1), 15-30. doi: 10.1080/00220270500185122

Ladson-Billings, G. (2004). Culture versus citizenship: The challenge of racialized citizenship in the United States. In J. A. Banks (Ed.), Diversity and citizenship education: Global perspectives (pp. 99-126). San Francisco, CA: Jossey-Bass.

Noddings, N. (2005). Global citizenship: Promises and problems. In N. Noddings (Ed.), Educating citizens for global awareness (pp. 1-21). New York, NY: Teachers College Press.

Westheimer, J., & Kahne, J. (2004a). What kind of citizen? The politics of educating for democracy. American Educational Research Journal, 41(2), 237-269. doi: 10.3102/00028312041002237

Westheimer, J., & Kahne, J. (2004b). Educating the “good” citizen: Political choices and pedagogical goals. Democratic Dialogues, 38(2), 57-61. Retrieved from http://www.democraticdialogue.com/DDpdfs/WestheimerKahnePS.pdf

 

 

IRB approval & cloud storage

The Institutional Review Board (IRB) approval process drives even the best of faculty and researchers a little batty. The entire process can be confusing, convoluted, and inconvenient, depending on the institution and potential partners. The common pushback to IRB approval processes is grounded in the belief that it is the researchers, not the members of the IRB, who hold the specialized experience and knowledge required to make final decisions (Howe & Dougherty, 1993).

Our university recently moved the document creation and submission process online. While this green effort nobly attempted to streamline the application process and digitize documentation, the constant shifts and recent changes only further irked faculty. In fact, one of my personal mentors was so troubled by the system that we called our local IRB office and refused to let them hang up until we had completed the application process.

While annoying, it is important to remember where the IRB process stemmed from: only a few decades ago, researchers were observing participants with untreated syphilis (Gjestland, 1954) and convincing volunteers that they were electrocuting other people (Milgram, 1963). Oh, how we have progressed in standardizing research ethics!

While irksome, the IRB is in place to ensure that all participants are protected and appropriately informed of their rights. Over the past two decades, IRBs have transformed the conduct of research involving human subjects. Researchers are no longer able to implement research projects without weighing the risks they are asking participants to assume (Edgar & Rothman, 1995). Because people hold a special status, even when their participation in a research project is a “means to a higher end, they also deserve to be treated with a particular kind of moral regard or dignity” (Pritchard, 2002).

IRBs have often shared a concern about information exchanged, stored, and analyzed via cloud storage (Carrell, 2011; Kumbhare, Simmhan, & Prasanna, 2011). In fact, in a recent application for IRB approval our research team received much pushback when proposing to host research files on a cloud-based server. Through conversations with IRB members, our research team was able to reach a compromise and ensure secure storage of data.
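The details of our compromise are not the point here, but one common safeguard for cloud-hosted research files, sketched below purely as an illustration (not our team’s actual protocol; all names and values are hypothetical), is to pseudonymize direct identifiers before anything leaves the local machine, keeping the re-identification key offline:

```python
import hashlib

def pseudonymize(participant_id, salt):
    """Replace a direct identifier with a salted SHA-256 pseudonym.

    The salt acts as the re-identification key and stays on a local,
    offline machine; only pseudonymized records go to cloud storage.
    """
    digest = hashlib.sha256((salt + participant_id).encode("utf-8")).hexdigest()
    return digest[:12]  # short, stable pseudonym

# Hypothetical record and salt, for illustration only.
records = [{"id": "jane.doe@example.edu", "response": "open-ended answer..."}]
salt = "keep-this-secret-and-offline"
cloud_safe = [{"id": pseudonymize(r["id"], salt), "response": r["response"]}
              for r in records]
```

The same salt always yields the same pseudonym, so records can still be linked across files, while the identifiers themselves never sit on the cloud server.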

References:

Carrell, D. (2011, January). A strategy for deploying secure cloud-based natural language processing systems for applied research involving clinical text. In Proceedings of the 44th Hawaii International Conference on System Sciences (HICSS) (pp. 1-11). IEEE.

Edgar, H., & Rothman, D. J. (1995). The institutional review board and beyond: Future challenges to the ethics of human experimentation. Milbank Quarterly, 73(4), 489-506.

Gjestland, T. (1954). The Oslo study of untreated syphilis: An epidemiologic investigation of the natural course of the syphilitic infection based upon a re-study of the Boeck-Bruusgaard material. Acta Dermato-Venereologica, Supplementum, 35(Suppl. 34), 3-368.

Howe, K. R., & Dougherty, K. C. (1993). Ethics, institutional review boards, and the changing face of educational research. Educational Researcher, 22(9), 16-21.

Kumbhare, A. G., Simmhan, Y., & Prasanna, V. (2011, November). Designing a secure storage repository for sharing scientific datasets using public clouds. In Proceedings of the Second International Workshop on Data Intensive Computing in the Clouds (pp. 31-40). ACM.

Milgram, S. (1963). Behavioral study of obedience. The Journal of Abnormal and Social Psychology, 67(4), 371-378.

Pritchard, I. A. (2002). Travelers and trolls: Practitioner research and institutional review boards. Educational Researcher, 31(3), 3-13.

 

What package do you pick?

My research partner and I were having a conversation about which programs we should use to code our open-ended survey data and interviews. So we began listing all our options, from Google Docs to complex CAQDAS packages and then some. We soon discovered an issue…we didn’t have a shared skill set: software my partner had experience with, I did not, and vice versa.  So instead of trying to find common ground, we started looking into which packages offered trial versions long enough for us to complete data coding over the course of a semester.

That narrowed down our list some…only then we ran into another issue. I am a Windows/Linux user, and my partner is an Apple user who works off their mobile device more often than not. We couldn’t really make a decision because of our differences…we decided that we would have to compromise.  And so, for the third time, we rewrote our list and began the conversation anew. This time it was guided by functionality and by a discussion from Taylor, Lewins, and Gibbs (2005).

Our first criterion was data set size: since ours is fairly small, this didn’t knock any packages off our list.  The next was collaboration: we wanted to be able either to send documents back and forth via email/cloud or to collaborate directly without much hindrance. Surprisingly, we didn’t think about our data types until the third conversation.  Since we have text, audio, and PDF artifacts, we needed a CAQDAS package that could support coding audio directly in the platform.  Our fourth criterion was support for frequency and comparative matrices across our data types: because we are doing a mixed methods study, we are very concerned about convergence (or even divergence) across themes.  Additionally, we are very interested in working with quantitative data (for minimal descriptive statistics) within the same package. Finally, we settled on two possible software options.
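In hindsight, the winnowing we did by hand over three conversations is just filtering a feature matrix against a criteria list. A minimal sketch (the package names and feature entries are invented for illustration, not claims about any real CAQDAS product):

```python
# Hypothetical feature matrix; entries are NOT real vendor capabilities.
packages = {
    "PackageA": {"trial": True, "collaboration": True, "audio_coding": True,
                 "matrices": True, "quant_support": True},
    "PackageB": {"trial": True, "collaboration": False, "audio_coding": True,
                 "matrices": True, "quant_support": False},
    "PackageC": {"trial": False, "collaboration": True, "audio_coding": False,
                 "matrices": True, "quant_support": True},
}

# Our criteria, in the order we argued about them.
criteria = ["trial", "collaboration", "audio_coding", "matrices", "quant_support"]

def shortlist(packages, criteria):
    """Keep only the packages that satisfy every criterion."""
    return [name for name, feats in packages.items()
            if all(feats[c] for c in criteria)]

print(shortlist(packages, criteria))  # only PackageA survives here
```

Filling in a real feature matrix is exactly what the comparison charts mentioned below already do; the filtering step is then trivial.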

What a hassle! There had to be a simpler comparison chart that would have let us check properties across various CAQDAS packages and cross off options that didn’t meet our criteria.  Turns out such charts are already floating around the web; we just didn’t look hard enough. Here is an example from UNC that compares ATLAS.ti, MAXQDA, NVivo, and Dedoose, and another from Stanford that compares NVivo, HyperRESEARCH, Studiocode, ATLAS.ti, and TAMS Analyzer.

 

References:

Taylor, C., Lewins, A., & Gibbs, G. (2005, December 12). Debates about the software. Retrieved from http://onlineqda.hud.ac.uk/Intro_CAQDAS/software_debates.php
