Contributions

Robert Pruzek’s Contributions to the Methodology of Educational Science

Dear Colleagues,

We at ACASE believe that scientifically based research in education has the potential to provide far more useful findings for improving the quality of educational processes than it has generally produced to date. A stronger basis is needed for clear thinking about goals, assessments, research design, data collection, analysis, display and interpretation.

There has been a persistent tendency among educational researchers to treat methods of analysis as the leading element of methodology, with the consequence that planning, formulation of questions, design, data display and nuanced interpretation have been given short shrift in the workaday practice of educational research. This might not be so damaging had research methods been refined and elaborated in ways that make their strengths and weaknesses clear; but in fact most of what passes for methodology, and what is most commonly, almost exclusively, taught, focuses on generic statistical modeling using summary statistics and formal inference, even though many other matters cry out for more attention. A large part of the problem is that measurement and (statistical) analysis have, in their research disciplines and in their teaching, been divorced from one another for many years. As a result, most students, and now most professors and applied educational researchers, rarely think about the two in relation to one another, much less concern themselves routinely with their fundamental interconnectedness in research practice.

When applied educational researchers make conventional statistical methods the centerpiece of their methodological thinking, the most basic or essential educational questions are often obscured or masked. We need more often to ask questions such as: What should we try to measure, and in what specific contexts? How can we develop sound measuring instruments that achieve our central goals of measurement? How effectively can we design studies that are likely to yield useful information about students and student learning even when sample sizes are limited? How should we collect our data? How can we effectively accommodate (and take advantage of) individual differences? How should we proceed when we have multiple outcomes? How can we display our data most effectively? And, can we generalize (at least some of) our results to larger contexts or well-defined populations, allowing for the possibility that caveats and qualifications are likely to require careful use of language? Note that only the last of these questions makes clear reference to (conventional) methods for research, or implies an interest in formal inference.

We generally find ourselves without the outcome measures we most need, or effective ways to use them to advance knowledge about what works, what does not, and for whom.

At ACASE, our work has been focused especially on building sound measures of learning outcomes. But we recognize that using the most appropriate instruments effectively requires special attention to questions such as those noted above. In his forty years at SUNY Albany, Professor Pruzek has helped to develop reasoned approaches to design, data collection, display and analysis that connect naturally with our interests in goals and instruments for assessment. We envision educational studies that are more effective and useful because they begin with a clear focus on goals and instruments, and then apply sound methods for design, data collection, display, analysis and interpretation. Our ultimate aim is to help advance the effectiveness of research directed to improving curriculum, instruction, learning and assessment.

As noted, for several decades Professor Pruzek has focused on a number of concepts, processes and methods that appear to have special potential to facilitate better research design, data collection and analysis. These ideas, while in the mainstream of research thinking, often look quite different from what has become conventional. They involve such things as the use of relevant (prior) information to form homogeneous subgroups (of students, classrooms, etc.) and designs that account for subgrouping (or, more generally, dependency in the data). They also entail analysis that begins with visualization of the data, possibly from several perspectives, so that data summaries are employed only after sufficient thought has been given to individual data points, where they came from, and what they mean. When such modern methodological ideas inform applied research, it is not uncommon to find that data for some students must be set aside or considered separately, with specific reference to the individual's background, skills or deficits, before results are summarized. Use of such refined ideas and methods can sometimes make clear that the initial research questions need modification or revision before they can be answered effectively. Statistical significance, while sometimes important to note, is seen as less central and less relevant than understanding what the data have to say about individuals or (sub)groups.
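To make that workflow a little more concrete, the short sketch below uses only base R and invented illustrative data (the variable names pretest and gain, the cut points and the simulated values are all assumptions, not Pruzek's own functions or data). It forms homogeneous subgroups from a prior measure, plots the individual data points within those subgroups, and only then computes summaries.

    # Illustrative data: a prior (pretest) measure and a gain score per student
    set.seed(1)
    dat <- data.frame(pretest = rnorm(60, mean = 50, sd = 10))
    dat$gain <- 0.3 * (dat$pretest - 50) + rnorm(60, mean = 5, sd = 4)

    # Use the prior measure to form three homogeneous subgroups (blocks)
    dat$block <- cut(dat$pretest,
                     breaks = quantile(dat$pretest, probs = seq(0, 1, length.out = 4)),
                     include.lowest = TRUE,
                     labels = c("low", "middle", "high"))

    # Look at the individual data points within subgroups before any summary
    stripchart(gain ~ block, data = dat, vertical = TRUE, method = "jitter",
               pch = 1, xlab = "Pretest block", ylab = "Gain score")

    # Only then compute block-level summaries
    aggregate(gain ~ block, data = dat,
              FUN = function(x) c(n = length(x), mean = round(mean(x), 1), sd = round(sd(x), 1)))

The point of the ordering matters more than the particular functions: the plot of individual points comes first, and any summary or formal comparison comes only after those points have been examined.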

Take note that the focus is not just on methods, but also on more effective thinking about design, data collection, analysis and interpretation of data. What is to be found in the documents on this website is not in all ways novel; it lies largely in the mainstream of methodological thinking. It is nonetheless new in a number of ways, particularly in its aim for sensitive and nuanced approaches to the collection and use of carefully defined data. Graphics hold special prominence in this framework.

In due course, five or more different systems of analysis will be illustrated on this website, starting with two that seem most likely to show what these particular ways of thinking have to offer. As always, the proof of the pudding lies in the eating. To help with the cooking, this website includes a number of software functions that Pruzek has written to run in R. R is particularly appropriate here because it is free to everyone and because its already extensive libraries continue to be updated regularly. (R is perhaps now the de facto standard software system for statisticians around the world.) It is available from http://www.r-project.org.
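For readers new to R, the lines below sketch how one might keep R's libraries current and read a downloaded file of functions into a session; the file name pruzek_functions.R is only a placeholder for whatever file is actually saved from this site.

    # Keep installed packages (R's add-on libraries) up to date
    update.packages(ask = FALSE)

    # Read a file of functions saved from this website into the current session.
    # "pruzek_functions.R" is a placeholder name, not the actual file name used here.
    source("pruzek_functions.R")
    ls()            # list the objects, including functions, now available
    # help.start()  # open R's browser-based documentation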

We at ACASE, and Bob Pruzek of course, look forward to your reactions, whether they are thoughts about prospects for use or actual attempts to carry out research along the lines indicated. The documents provided here show several ways one can begin to think about, design and execute more efficient and effective experiments or observational studies.