This blog post has been cross-posted from Education Week.
Today's post is written from the researcher perspective. Stay tuned: Thursday we will share the practitioner's perspective on this research.
This post is by Joshua Lin, Research Analyst for the Philadelphia Education Research Consortium (@PHLedResearch).
In May 2016, fresh out of graduate school, I began my research career at the Philadelphia Education Research Consortium (PERC), a research-practice partnership (RPP) between the city's public education sector and local education researchers. My first project was, I thought, a simple descriptive analysis of the characteristics and outcomes of English learner students in the city's schools. As it turned out, this analysis, like others conducted in the context of RPPs, was not "simple." The research questions and analysis led to increasingly complex understandings of our city's schools. Ultimately, the research informed a community-wide conversation about educating English learners in Philadelphia.
Philadelphia educates about 200,000 school-age children in its public schools, with 32% of these students enrolled in 86 different charter schools. PERC supports schools in both sectors by partnering with the School District of Philadelphia (SDP) and the city's charter sector. Philadelphia is also home to nationally respected education researchers at institutions including Drexel University, Temple University, the University of Pennsylvania, and Research for Action (the non-profit where I work). PERC serves as one platform for faculty at these institutions to use research to deeply understand, and identify solutions for, pressing issues facing Philadelphia schools.
Launching our research agenda
In the fall of 2014, SDP and charter sector leaders agreed that we did not know enough about the school experiences and outcomes of the city's rapidly increasing population of English learner (EL) students. PERC launched a long-term research agenda focused on understanding how best to serve ELs and the challenges educators faced in meeting their needs. To get there, we first needed to understand who these students were and how the population had changed over time.
SDP provided us with seven years of data on the District's EL population, from 2008-09 through 2014-15. SDP served an average of 12,435 EL students per year during this period, with the count peaking at 12,789 students in 2014-15. In all, we had data on more than 35,500 unique EL students whom SDP had served during the observation period.
Through the partnership, I was able to work with practitioners in SDP throughout the research project to bring these data to life, a very different experience from my graduate training. Together, we began to parse what the data could and could not tell us about the District's 220 schools, each of which served a different population of ELs and had its own set of assets and challenges.
How practitioners and stakeholders informed our analysis
Informed by our companion qualitative studies based on interviews with school administrators and teachers, we examined schools' EL populations along three dimensions: EL concentration, linguistic diversity, and English proficiency levels. In 2014-15, while the city's median school EL concentration was just 6%, schools ranged from having zero ELs to having 57% ELs. At both ends of this concentration range, schools also varied in linguistic diversity: schools below the median concentration had between 1 and 12 unique student home languages, and schools above the median had between 1 and 46.
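To make these measures concrete, here is a minimal sketch of how school-level EL concentration and linguistic diversity might be computed from student-level records. The records, field names, and values below are entirely synthetic illustrations, not the District's actual data schema.

```python
from statistics import median

# Hypothetical student-level records; school names, fields, and
# languages are made up for illustration only.
students = [
    {"school": "A", "is_el": True,  "home_language": "Spanish"},
    {"school": "A", "is_el": True,  "home_language": "Arabic"},
    {"school": "A", "is_el": False, "home_language": "English"},
    {"school": "B", "is_el": True,  "home_language": "Vietnamese"},
    {"school": "B", "is_el": False, "home_language": "English"},
    {"school": "B", "is_el": False, "home_language": "English"},
]

def el_profile(records):
    """Per-school EL concentration and count of unique EL home languages."""
    schools = {}
    for r in records:
        s = schools.setdefault(r["school"], {"total": 0, "el": 0, "langs": set()})
        s["total"] += 1
        if r["is_el"]:
            s["el"] += 1
            s["langs"].add(r["home_language"])
    return {
        name: {
            "el_share": s["el"] / s["total"],   # EL concentration
            "n_languages": len(s["langs"]),     # linguistic diversity
        }
        for name, s in schools.items()
    }

profile = el_profile(students)
median_share = median(p["el_share"] for p in profile.values())
```

The same two summary measures, computed over all of a district's schools, yield the kind of distribution described above: a citywide median concentration alongside a wide school-to-school range.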
PERC presented these findings at a community forum in June 2016 attended by almost 100 teachers, administrators, education policymakers, journalists, and community members. Fueled by our commitment to make research understandable and actionable, we presented our descriptive data in a variety of ways, including maps. Graphics were a simple but important tool for communicating our findings: practitioners finally had the visualizations and numbers to assess how their own observations compared to data for the entire city.
Implications and next questions
Most notably, presenting the descriptive characteristics of the EL population to the Philadelphia community started conversations about how to evaluate the successes and challenges of EL programming. While stakeholders agreed that a student's time to English proficiency (as measured by the ACCESS for ELLs assessment) was a meaningful program outcome, questions remained. For example, what amount of time is ideal? Based on our descriptive analyses, we knew these student trajectories would look different across student and school characteristics. For instance, only 36% of ELs who started kindergarten in 2014 were at the lowest English proficiency level, suggesting that nearly two-thirds of these students had at least some prior exposure to English. Compare that to newcomer ninth-grade students, 74% of whom entered the district in 2014 at the lowest proficiency level. Although this complexity raises more challenges for research, our increased understanding has undoubtedly improved our upcoming analysis of EL student trajectories.
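The cohort comparison above boils down to one tabulation: the share of entering EL students at the lowest proficiency level, grouped by entry grade. The sketch below illustrates that computation on synthetic records; the field names and proficiency coding (level 1 as lowest) are assumptions for illustration, not the District's actual data.

```python
from collections import defaultdict

# Illustrative records of entering EL students; grades and ACCESS-style
# proficiency levels are invented values, not District data.
entrants = [
    {"entry_grade": "K", "access_level": 1},
    {"entry_grade": "K", "access_level": 3},
    {"entry_grade": "K", "access_level": 2},
    {"entry_grade": "9", "access_level": 1},
    {"entry_grade": "9", "access_level": 1},
    {"entry_grade": "9", "access_level": 4},
]

def share_at_lowest_level(records, lowest=1):
    """Fraction of entering students at the lowest proficiency level, by entry grade."""
    counts = defaultdict(lambda: {"total": 0, "lowest": 0})
    for r in records:
        c = counts[r["entry_grade"]]
        c["total"] += 1
        if r["access_level"] == lowest:
            c["lowest"] += 1
    return {grade: c["lowest"] / c["total"] for grade, c in counts.items()}

shares = share_at_lowest_level(entrants)
```

Run over real cohorts, a tabulation like this is what surfaces the kindergarten-versus-ninth-grade contrast described above.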
In the research world, descriptive analysis is sometimes viewed as just a stepping stone to causal analysis. As a result, guidance on doing descriptive analysis well is scarce, with the exception of a recent guide from the Institute of Education Sciences. In Philadelphia, however, descriptive analysis has not just opened the door to more advanced analyses; it has started data-based conversations among communities of stakeholders and deepened our understanding of Philadelphia's social world. And thanks to our partnership with practitioners, our questions are more grounded in practice, our analyses more applied, and our recommendations more focused on effecting real change for our city's schools.