Help! How do I write a good AERA proposal?

Ring, ring! —  It is a hot Sunday afternoon.

It is the first day of my journey to visit family over a holiday break. Looking down at my phone I realize I have a missed call from…a professor…what? The education faculty never call me.

“Hello. Sorry I missed your call…” After a few seconds my mentor asks, “What are you doing on June __? Will you be in town?” I hesitantly answer with a negative, then affirmative. Turns out I am recruited, alongside a couple of other doctoral students, to do a presentation/workshop on how to submit successful AERA proposals.

As I crawl back into my car, it hits me: I have never had a conference proposal rejected. Why not? Surely, in my very short journey as an alt-academic, I must have had at least one, right?

Then I am struck with another, larger dilemma: I have no idea what I did right. I can’t even begin to describe to my mentees what I did “right” to get nearly two dozen peer-reviewed proposals accepted. If I have no clue, how can I explain it to others?

Writing conference proposals had become so ingrained in my writing schedule that I hardly noticed when I took the time to compose them. I needed to shed this second skin and think about my overall process (and the struggles I faced while writing AERA proposals the first time around).

Please remember that these notes are based on my experiences and preferences. Your faculty and friends are certain to have contradictory ideas.


Here is the list I came up with; Justin Whiting and Verily Tan from IU’s Creativity Labs also helped, and were present for the “Writing an AERA Proposal” Workshop. Please note that most of the links direct to specific 2017 AERA content.

Writing a successful proposal

General info

  • Don’t wait until the night before. This is not a 1-page proposal. Also log into the AERA submission system days before you start writing. This will help you structure sections on a word processor.
    • You can revise and upload a newer version of the proposal before the deadline.
    • If you are submitting a research project for presentation you will need IRB approval. If you don’t have it, be prepared to justify why!
  • Make sure your proposal follows APA guidelines and is spell-checked and proofread for grammar. Duh, right? But when you are pulling an all-nighter to write an AERA proposal, you will be surprised by what slips through the cracks. Get someone to proofread for you.
  • Work smart, not hard. If you have preexisting work, you can adapt it and build your AERA proposal off that.
  • Look over some example proposals. You can always ask your peers, faculty, and even family members to look yours over.
  • Figure out what kind of presentation (Paper, Poster, or Roundtable) you want to do before you start writing, and tailor your proposal accordingly. More info here.
  • If you select the option of submitting a Paper, think about also letting your proposal be considered for a Poster or Roundtable; that way you at least have a chance at presenting something.
    • Don’t be stuck on only wanting to do a Paper presentation. Posters and Roundtables can be more productive, especially for students, as they give you a chance to interact more with the audience. Posters are great for getting feedback, especially if you have some flexibility/are in the early stages of your research project.
    • Also figure out what division and section, or SIG you want to send it to. That way you can tailor the proposal to better align with their call.
    • If you are confused about which section/SIG to submit to, read over their Call for Proposals. You can find individual SIG/Section calls in the submission portal too!
      • Link to Division descriptions
      • Link to Special Interest Group (SIG) descriptions
  • Remember that AERA likes completed studies. Write in the past or present tense. Also you should present some data…even if they are preliminary findings.
    • The methods section is particularly important – i.e., research question, data sources, and analysis methods.
    • Preliminary findings may suffice. If accepted the expectation will be to report more detailed findings and insights.
  • Review all of the detailed guidelines. Did you include all the parts that you needed to? If you are missing even one element, such as a discussion, it hurts your chances greatly.
  • Here are the sections you need to include:
    • Objectives or purposes
    • Perspective(s) or theoretical framework
    • Methods, techniques, or modes of inquiry
    • Data sources, evidence, objects or materials
    • Results and/or substantiated conclusions or warrants for arguments/point of view
    • Scientific or scholarly significance of the study or work
  • Make sure you meet the word count limit criteria!
    • Title (15 words)
    • Abstract (120 words)
    • Paper (no more than 2,000 words)
  • You do not have to incorporate the annual theme into your paper, but it helps if you can. You shouldn’t try to completely change your research, presentation, or writing style just to fit the theme, but it helps if your research aligns with others’. Again, an important part of this is submitting to the correct SIG, where the research is similar to yours.
  • Omit author identification information from your proposal. That includes your prose, in-text citations, and references.
    • Confused? Here is how to do it: per APA, the in-text citation becomes (Author, YEAR) and the reference entry becomes Author. (YEAR).
    • Also you may have to change the order of authors on your proposal. AERA only allows a certain number of first author entries per person.

Specific to the proposal

  • Make your proposal stand out from the start! The abstract and introduction are very important. Remember, most reviewers end up reading 10 or more proposals; make yours memorable.
  • Be descriptive (yet concise) in your methods section. Let your reader know what you did during your study! A good test for this is calling up a parent/sibling and having them read then explain the section to you. If they can accurately describe your methods after reading the section, then you win!
    • This is a hard skill to master, and it takes time – So, don’t expect your first proposal to be perfect even when you submit it. Just focus on getting it done to the best of your ability.
  • Explain why your research is important. Yes, we know it furthers the knowledge base…but really, why should someone care? What is the impetus?
  • Your research doesn’t need to be perfect. Do your best and get feedback, but don’t get stuck on trying to change the world or write the perfect paper. Remember that you will have more than one reviewer, so don’t be discouraged! Proposal scores are averaged across 3+ reviewers, so you get feedback from different perspectives too!

 

Questions? Shoot me a line at nsabir@indiana.edu

Please add anything I missed in the comments below.

AERA 2016 Workshop

You can view the hour-long workshop on YouTube: https://youtu.be/gT73BwAvFxQ

 

 

KM Self Study Part III

I began a self-study of my learning organization’s growth and flexibility, in hopes of better understanding and articulating the challenges we faced in adapting to new educational paradigms and standards. These postings cover not only the daily occurrences at my college of education, but also some of the experiences I have had while working at other organizations. If you have questions or thoughts, please email me at nsabir@indiana.edu.

Institutes of education managing change and shift: A reflective piece utilizing Daft’s framework (part III)

Bias, mission & goals

I think the “most dangerous” of the biases Daft discusses is seeing what you want to see: not because you truly want to “see” something, but rather because the information we gather, interpret, and present to others is constantly shaped by our epistemological and ontological underpinnings. I’m stealing this story from a workshop I ran while working with Organization D.

So, Organization D’s mission statement is something along the lines of a peaceful world, and each department has specific/specialized goals targeted at regions or themes. The organization’s goal doesn’t so much face conflicting departmental goals as the message gets lost in all the moving parts, and sometimes there are no clear guidelines. On September 21, Peace Day, K-12 educators tackle teaching complex issues of global conflict and cultural awareness, often using social media and synchronous technologies. A couple of years ago there were not many model schools for this initiative, and teachers dreamt up great projects for their students to do. One of the more common examples was Skyping/emailing a classroom across the world and then reflecting on the experience. In some cases the lack of organizational support/resources led to the propagation of negative stereotypes, especially at the K-5 level (“Japanese people know origami.” “Koreans eat dogs and that is gross!” “Baby girls are killed because parents want a son.” and “Peace means you can travel wherever you want.” – just to name a few that were brought up at this workshop). As teachers began to post their students’ work on online forums, the banter began about how irresponsible the teachers and Organization D had been to allow students to internalize stereotypes that could lead to conflict.

I think that sometimes working with an organization’s mission statement can be a game of Telephone, particularly when the statement is broad and allows departments a little too much freedom. While a broad mission statement can be great because it lets you cover a lot of ground, the different parts of the organization need specific objectives that support the official goal.

In many of the teams I have worked on, the problem is rather apparent – there is a ____ need – and the difficulties lie in crafting solutions. Oftentimes the problem is oversimplified by managers, as they are not the individuals placed in the field, and their foundation for identifying and understanding the problem is relayed through secondary channels. The services these organizations deliver are so embedded in in-country dynamics that crafting solutions becomes the greater focus. Our organizations’ proposed solutions and timelines are also heavily impacted by major stakeholders – so it becomes a bit of a juggling act.

In our case, because the focus shifts toward finding sustainable and compatible solutions for all parties, the identification of the issue at hand tends to get brushed over. In the past I have seen this lead to all sorts of communication-based issues, loss of funding, misuse of fiscal and time-based resources, and I’ve even seen projects fail shortly after the first round of evaluations.

While the analytical, research-based curriculum team I work with attempts to institute changes leveraging an incremental decision model, where instructional concepts are put through an iterative development process to ensure a best fit, this is far from what actually happens. In the past four years, each term has begun with the goal of replicating a systematic and incremental decision process, yet after a couple of weeks that fades away. Because the curriculum development team has to account for student (user) diversity, variation in needs, and constant flux in environmental factors, a garbage can model would be more appropriate. Throw into the mix that we have 60% annual instructor turnover, and retaining structure can become problematic. Overarching goals are ill-defined, and problems and solutions are often identified vaguely and attended to simultaneously, leading to additional problems. More often than not our team crafts “solutions” for problems which don’t exist and may not even be an issue, while major problems are sometimes pushed to the side for another term (another batch to deal with). Do I think this is effective? No. However, it is the culture of my organization and my managers’ preferred style to allow for a more organic approach.

It is far too easy to go along with a group or majority decision. This happens a bit differently in my team. We currently have one very hands-off manager and 11 instructor-designers, who have equal say in how we facilitate our programme. While some team members naturally take on more of a leadership role and others are more outspoken about particular issues, a groupthink mentality is often applied to major decisions. Now, this isn’t because of a lack of expertise or diversity, but rather an understanding of “let’s agree to disagree” – and if we continue to “disagree,” the team isn’t nearly as productive, if at all.

Fin.

KM Self Study Part II


Institutes of education managing change and shift: A reflective piece utilizing Daft’s framework (part II)

Structure, control & culture

At my current job we have a fairly decentralized system (for immediate interactions), which leads to a series of issues. In an effort to empower workers and support their professional development, management takes a more horizontal approach. However, our departments have major communication issues, and because some people are so attached to their work/frame of mind, this has also led to discrepancies in policies. Since all parties are not on the same page, this imposes stagnant approaches to problems.

In our organization the most frequent trap is “people don’t have enough time to learn.” Because our organization experiences a disruptive environment that requires constant reevaluation, our employees are always having to relearn skills and procedures. For example, just last week a peer created a learning module that involved a specific set of software, and today we discovered that our student-consumers don’t have access to it. This required a complete revamp of the module, and all our staff had to learn new material and procedures in the span of a couple of days.

I think the most difficult step I have seen organizations encounter is the first one: setting up procedures and guidelines. My experience has shown me that a lack of clear vision often makes the initial process difficult. I have seen firms hold only a vague vision of execution, access to resources, and reflection, and it is in the process of establishing these steps that the organization comes to distinguish its current direction from its ideal process. That said, I believe the research findings lack one step: reevaluation of the model. Somewhere along the way there should be a place for leadership to pause and assess whether the current process allows employees to learn in the most effective and efficient means possible, and whether the scaffolding holds up to desired outcomes.

I believe that organizations’ design and structure should be driven by their goals, needs, and access to resources, rather than by what is considered “best practice.” The text talks about how vertical structures are more efficient and horizontal structures encourage more personnel growth, so I think that leveraging both aspects would be a good approach. Our process aligns more with the virtual network grouping model. While this approach enables flexibility and is responsive to changes in the environment, it can be difficult to coordinate and communicate with all members of the organization. The only times I have seen this grouping truly be effective/efficient is when someone very motivated, organized, and patient took on the role of integrator.

For day-to-day operations the college of education functions as a pooled interdependence system; however, at the large scale it serves as a reciprocal interdependence system. All of the departmental outputs and procedures feed into one another at the end of the term, yet weekly operations allow individual offices and departments to function as separate entities with standardized procedures. This multifaceted approach requires our organization to function with a very high level of communication and collaboration. More often than not we feel like a cross-departmental team working as a single entity rather than separate offices. The logistical strains are not felt by most of the employees; rather, the supervisors take on the initiative to coordinate all of the reciprocal activities.

I think that organizational decisions should be driven by needs, and not by what managers see working at another organization and hope to replicate in their own. Before an organization (re)creates a hybrid structure, leadership should consider the weaknesses that need to be addressed.

I used to freelance for a pharmaceutical company in Country B, Company A, working with both the local branch and the overseas offices. In Country B the organization is very centralized and represents a typical functional grouping model; however, the counterpart organization in State Z is completely different. Because the US branch is considerably smaller, it outsources most of its marketing and large-scale production, and much of the staff works across several departments. As liaisons from Country B come to State Z for a year or two, many of them actually struggle with adapting to the “lack of structure,” and several have even opted to return to Country B’s company because the hybrid model in the US organization was uncomfortable for them. I think it is interesting how heavily an environmental culture impacts an organization’s culture.

The term effectiveness, and the measuring of an organization’s effectiveness, does seem to get a bit ambiguous without a bounded case paired alongside. The goal-based approach focuses more on the holistic meeting of an overarching organizational goal, while the internal process approach focuses more on internal workings, such as a positive work environment and organizational morale, and not on the organization’s output. One of the great features of using a resource-based approach is that it includes the initial bargaining mix and can include environment-organization factors.

At my current position, several leadership departments and offices have a very centralized command structure, which is bounded by accreditation, fiscal, and international policy constraints. This bureaucratic system constrained employees to follow set protocols and stifled creativity. Several supervisors and office directors have shifted their organizations’ directions by changing their leadership style to account for employee empowerment, moving from a bureaucratic to a clan style of leadership and management. This transformation has been a slow process, with slight changes over several years and plenty of employee turnover. The shift has allowed employees to further bring their expertise into the design of services.

To be continued…

IRB approval & cloud storage

The Institutional Review Board (IRB) approval process drives even the best of faculty and researchers a little batty. The entire process can be confusing, convoluted, and inconvenient, depending on the institution and potential partners. The common push back to IRB approval processes is grounded in the belief that it is the researchers, not members of the IRB, who hold the specialized experience and knowledge required to make final decisions (Howe & Dougherty, 1993).

Our university recently underwent changes to move the document creation and submission process online. While the green effort was noble in attempting to streamline the application process and digitize documentation, the constant shifts and recent changes proved to further irk faculty. In fact, one of my personal mentors was so troubled by the system that we called our local IRB office and refused to let them hang up until we had completed the application process.

While annoying, it is important to remember where the IRB process stemmed from: recall that only a few decades ago, we were giving people syphilis (Gjestland, 1954) and convincing them that they were responsible for electrocuting individuals (Milgram, 1963). Oh, how we have progressed in standardizing research methodology!

While irksome, the IRB is in place to ensure that all participants are protected and appropriately informed of their rights. Over the past two decades IRBs have transformed the conduct of research involving human subjects. Researchers are no longer able to implement research projects without weighing the risks they are asking participants to assume (Edgar & Rothman, 1995). Because people hold a special status, even when their participation in research projects is a “means to a higher end, they also deserve to be treated with a particular kind of moral regard or dignity” (Pritchard, 2002).

IRBs have often shared concerns about information exchanged, stored, and analyzed via cloud storage (Carrell, 2011; Kumbhare, Simmhan, & Prasanna, 2011). In fact, in a recent application for IRB approval, our research team received considerable push-back when proposing to host research files on a cloud-based server. Through conversations with IRB members, our research team was able to reach a compromise and ensure secure storage of data.

References:

Carrell, D. (2011, January). A strategy for deploying secure cloud-based natural language processing systems for applied research involving clinical text. In System Sciences (HICSS), 2011 44th Hawaii International Conference on (pp. 1-11). IEEE.

Edgar, H., & Rothman, D. J. (1995). The institutional review board and beyond: Future challenges to the ethics of human experimentation. Milbank Quarterly, 73(4), 489-506.

Gjestland, T. (1954). The Oslo study of untreated syphilis; an epidemiologic investigation of the natural course of the syphilitic infection based upon a re-study of the Boeck-Bruusgaard material. Acta Dermato-Venereologica. Supplementum, 35(Suppl 34), 3-368.

Howe, K. R., & Dougherty, K. C. (1993). Ethics, institutional review boards, and the changing face of educational research. Educational Researcher, 22(9), 16-21.

Kumbhare, A. G., Simmhan, Y., & Prasanna, V. (2011, November). Designing a secure storage repository for sharing scientific datasets using public clouds. In Proceedings of the second international workshop on Data intensive computing in the clouds (pp. 31-40). ACM.

Milgram, S. (1963). Behavioral study of obedience. The Journal of Abnormal and Social Psychology, 67(4), 371-378.

Pritchard, I. A. (2002). Travelers and trolls: Practitioner research and institutional review boards. Educational Researcher, 31(3), 3-13.

 

What package do you pick?

My research partner and I were having a conversation about which programs we should use to code our open-ended survey data and interviews. So we began listing out all our options, from Google Docs to complex CAQDAS packages and then some. We soon discovered an issue…we didn’t have a shared skill set, meaning the software my partner had experience with, I did not, and vice versa. So instead of trying to find common ground, we started looking into which packages offered trial versions long enough for us to complete data coding over the course of a semester.

That narrowed down our list some…only then we ran into another issue. I am a Windows/Linux user and my partner is an Apple user who works off their mobile device more often than not. In the end we couldn’t really make a decision because of our differences…we decided that we would have to compromise. And then, for the third time, we rewrote our list and began the conversation anew. This time it was guided by functionality and a discussion by Taylor, Lewins and Gibbs (2005).

Our first consideration was data set size: since ours is fairly small, this wasn’t too much of an issue, so it didn’t knock any packages off our list. The next topic was collaboration: we wanted to be able to either send documents back and forth via email/cloud or collaborate directly without much hindrance. Surprisingly, we didn’t think about our data types until the third conversation. Since we have text, audio, and PDF artifacts, we needed a CAQDAS package that could support coding audio elements directly in the platform. Our fourth criterion was based on using frequency and comparative matrices across our data types. Because we are doing a mixed methods study, we are very concerned about convergence (or even divergence) across themes. Additionally, we are very interested in working with quantitative data (for minimal descriptive statistics) within the same package. Finally, we settled on two possible software options.

What a hassle! There has to be a simpler comparative chart that would have allowed us to check properties across various CAQDAS packages and cross off options that didn’t meet our criteria. Turns out these charts are already floating around the web; we just didn’t look hard enough. Here is an example from UNC that compares ATLAS.ti, MAXQDA, NVivo and Dedoose, and another from Stanford that compares NVivo, HyperResearch, Studiocode, ATLAS.ti, and TAMS Analyzer.
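The hand-filtering we ended up doing can also be sketched in a few lines of code. The package names below are real, but the feature values are hypothetical placeholders, not the packages’ actual capabilities – always check the comparison charts or vendors’ documentation before relying on anything like this.

```python
# Criteria-based CAQDAS shortlisting sketch.
# NOTE: feature values are hypothetical placeholders for illustration only.
packages = {
    "NVivo":         {"windows": True,  "mac": True,  "audio_coding": True,  "matrices": True},
    "Dedoose":       {"windows": True,  "mac": True,  "audio_coding": True,  "matrices": True},
    "ATLAS.ti":      {"windows": True,  "mac": True,  "audio_coding": True,  "matrices": False},
    "TAMS Analyzer": {"windows": False, "mac": True,  "audio_coding": False, "matrices": False},
}

# Our criteria: runs on both platforms, codes audio in-platform,
# and supports frequency/comparative matrices.
required = ["windows", "mac", "audio_coding", "matrices"]

# Keep only the packages that satisfy every required feature.
candidates = [name for name, features in packages.items()
              if all(features.get(feature, False) for feature in required)]
print(candidates)
```

Swap in a real feature chart and this becomes an easy way to re-run the shortlist whenever your criteria change.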

 

References:

Taylor, C., Lewins, A., & Gibbs, G. (2005, December 12). Debates about the software. Retrieved from http://onlineqda.hud.ac.uk/Intro_CAQDAS/software_debates.php

Organic writing process requires audio recorder?

During my digital tools class the discussion moved from representing findings in innovative ways to constructing meaning via organic processes.

A small group discussion began with the idea of leveraging digital tools to support a non-linear writing process. The first issue brought to the table related to this idea of “non-linear.” So let’s talk about that notion first (like any good writer, we must define our terms, no?). Take the construction of a typical research article. What do you read first in a paper? The abstract, then the introduction, literature review, methodology, results, discussion, and finally the conclusion. Yet more often than not, the paper is not actually written in this order. In fact, students are often encouraged to write the abstract last and to rewrite the introduction after the conclusions have been reached.

In this discussion of non-linear writing, a student brought up the idea that their writing differed greatly between personal and academic pieces. They wanted to adapt their creative and fluid system of poetic writing to the construction of research papers. How do you do that? The easy answer – use the same medium across all your writings. What does that even mean? Well, if you discover that you write better by outlining sentences on paper and then digitizing your notes, do that. If you find that you think best on whiteboards that allow you to construct plans, reorganize, and scribble, well then do that. One of the writing processes that really struck a chord was the notion of writing by recording audio thoughts while out on contemplative walks – taking natural conversations and moving them into typed words.

 

On a side note – another idea that came up in the conversation about representing findings was video. An example of video representation is prominent in projects like the 1000 Voices project. This online archive collects, displays, and analyses life stories of individuals with disabilities from around the globe. The project not only allows users to upload video but also encourages them to submit images, films, audio, text, or any combination of media. Recently one user even passed along personal art projects that told their story.

Another great example of video-based findings presentations is the PhD Comics 2 Minute Thesis contest videos. These short animated clips encompass the introduction, research questions, methods, and sometimes conclusions in the span of two minutes. Here is an example that talks about how distant reading techniques can be used to acquire information.

Pictures are worth a thousand words, but you don’t get that many when coding.

Mitchell (2011) claims that there is no roadmap when trekking through visual data, and there is no set way to engage in fieldwork or analyzing multimedia information (Pink, 2007).  Through the course of my development as a budding researcher I have often felt lost when trying to carry out carefully constructed research projects. Things don’t always seem to go as planned, and interpretations of procedures and information can sometimes become muddled.

In an attempt to explore how to best carry out such tasks as interpreting visual data, engaging in field work, and analyzing information I have turned to several texts and scholars.

One of the first courses taken during my doctoral studies was grounded in evaluation research and needs analysis. My journey through the semester allowed our research team to work with a client to establish goals that needed to be investigated. Surprisingly, this course was very much cookbook-style, prescribing that specific steps and assessments be taken in a precise order.

Interestingly enough, none of my subsequent methods courses truly provided the same structure. During one qualitative inquiry course we worked through the Merriam (2009) and Seidman (2013) texts. While these are great resources that provide starting points, they offer a daunting variety of approaches. It seemed that every unique project had a different approach to engaging in fieldwork and analyzing information. To further explore qualitative research designs, methods, and approaches, several students turned to Creswell’s (2012; 2013) texts. While these cookbook resources are a great base for understanding and comparing qualitative (and mixed methods) approaches to fieldwork, they don’t cover many approaches to visual data analysis.

While texts such as Emmison’s (2010) chapter on visual data offer great insight into the history of visual data and analysis approaches, they could be improved by commenting on best practices and providing guidance for budding researchers. Other book chapters (e.g., Cohen, Manion, & Morrison, 2011) provide an introductory discussion of visual data interpretation; but the larger lesson is that ‘it depends’ on the research questions and context.

We seem to hear “it depends” a lot in our field, no?

 

References

Cohen, L., Manion, L., & Morrison, K. (2011). Visual media in educational research. In Research Methods in Education (7th ed., pp. 526-534). New York, NY: Routledge.

Creswell, J. W. (2012). Qualitative inquiry and research design: Choosing among five approaches. Thousand Oaks, CA: Sage.

Creswell, J. W. (2013). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: Sage.

Emmison, M. (2010). Conceptualizing visual data. In D. Silverman (Ed.), Qualitative Research (3rd ed., pp. 233-249). San Francisco, CA: Sage.

Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San Francisco, CA: John Wiley & Sons.

Mitchell, C. (2011). Doing visual research. London, UK: Sage.

Pink, S. (2013). Doing visual ethnography. London, UK: Sage.

Seidman, I. (2013). Interviewing as qualitative research: A guide for researchers in education and the social sciences. New York, NY: Teachers College Press.

 

 

 

Journal Review of Compare

Compare: A Journal of Comparative and International Education, the official journal of the British Association for International and Comparative Education (BAICE), began publication in 1975.  From 1975 to 1992 the journal was published semi-annually; it grew to three annual issues by 2002 and now publishes six, including special editions.

Compare focuses on secondary articles that relate to educational development and change across the globe in an effort to analyze educational discourse, policy, and practices as they relate to interdisciplinary fields.  Using research papers, literature reviews, book reviews, editorials, case studies, and even obituaries, Compare attempts to cover a wide range of topics.  It investigates the implications of various theories on teaching, learning, and management across the primary, secondary, and tertiary levels as well as adult and specialty education (Journal Details, 2014).  The articles offer insight to academics, including graduate students, policy makers, and development agency staff.

Compare promotes cross-disciplinary research and teaching and encourages networking among professional organizations.  In an effort to extend the coverage of the publication, the editors seek papers with a comparative dimension and have a particular interest in case studies from under-researched fields (Journal Details, 2014).  Often, the yearly BAICE conference themes are reflected in the themes of subsequent issues.  Compare includes the work of many European authors, and its contributor base appears European-dominated, with many authors affiliated with European organizations and universities; few are based in Canada or the United States.  Education consulting firms, research organizations, graduate students, and authors in related journals, primarily in the Western world, appear to cite the works published in recent volumes of Compare heavily.

Beginning in the early 2000s, Compare emphasized primary and secondary education with economic undertones.  Articles addressed contemporary topics such as gender issues, school health in primary education, and integrating technology into the classroom.  Starting around 2002, the publications gave attention to issues of globalization, citizenship, and national identity.  In 2005, the journal began to examine matters related to education in emergencies and conflict, peace-building, and diversity in the classroom.  In the next few years, a variety of subjects were discussed, including nationalism and identity as well as education in regions affected by the AIDS crisis.  During 2007 and 2008, the majority of the focus was on factors outside the school system at the primary and secondary levels and within adult education.  In the later part of the decade, the research included issues of social justice and K-12 students, but also began to focus on teachers.  Despite widespread interest in the results of international comparative tests such as the Progress in International Reading Literacy Study (PIRLS, released in 2001, 2006, and 2011), the Trends in International Mathematics and Science Study (TIMSS, released in 1999, 2003, and 2007), and the Programme for International Student Assessment (PISA, released in 2000, 2003, 2006, and 2009), there was little to no coverage of these results and no discussion of comparative assessment.

The regions covered in Compare from 2000-2011 were numerous, although coverage leaned toward European countries.  Outside the Western world, emphasis fell on South Africa and Japan, with little attention given to Hong Kong and Latin America.  Beginning in 2003, there was a noticeable regional focus on African countries, with some attention to Central America and Canada.  Middle Eastern countries were featured for the first time during the decade in 2003, and in 2005 that focus grew.  It was from 2006-2008 that Asia first came under the microscope.  Still, the United Kingdom and Europe, particularly the eastern nations, seemed to dominate the scope of Compare, alongside a growing emphasis on sub-Saharan Africa and Central Asia.  From 2009 to the present, the journal became almost entirely focused on Asian and African countries.  Notably, the United States was mentioned only twice from 2003-2005 and twice again from 2009-2011; when Canada was mentioned, it was in relation to French language integration and post-colonialism.  Overall, case studies typically involved two regions within one or two countries, and other research papers compared two or three countries, rarely more.

BAICE comprises academics, researchers, policy makers, and members of governmental and non-governmental organizations (NGOs) (Sprague, 2014).  Through Compare, BAICE encourages the continual development of international and comparative studies in education.  Compare aims to illustrate the effects of globalization and critical post-modern thinking on learning, in a variety of respects, for professionals and everyday citizens.  While BAICE claims that Compare will reach the everyday citizen, upon closer scrutiny the journal targets a narrower audience and is not accessible to much of the developing world: the high cost of a subscription, coupled with the unavailability of translations, limits its overall accessibility.

References

Journal Details. (2014). Retrieved from http://www.tandf.co.uk/journals/ccom

Sprague, T. (2014, February 11). About BAICE. Retrieved from http://www.baice.ac.uk/about-baice
