[AAACE-NLA] (REVISED) Making adult education relevant
Kohring, Aaron M
akohring at utk.edu
Fri Nov 19 12:38:33 EST 2010
Oops, I also forgot to mention Juliet Merrifield's report, which addresses Tom's five points below.
Equipped for the Future Research Report: Building the Framework<http://eff.cls.utk.edu/PDF/merrifield_eff.pdf>, 1993-1997
March 2000. Juliet Merrifield. This historical report gives an overview of the development of the national Equipped for the Future system reform initiative from 1993 to 1997.
From: aaace-nla-bounces at lists.literacytent.org [mailto:aaace-nla-bounces at lists.literacytent.org] On Behalf Of Kohring, Aaron M
Sent: Friday, November 19, 2010 9:23 AM
To: National Literacy Advocacy List sponsored by AAACE
Subject: Re: [AAACE-NLA] (REVISED) Making adult education relevant
I have highlighted below some additional information on the EFF research.
UT Center for Literacy Studies
From: aaace-nla-bounces at lists.literacytent.org [mailto:aaace-nla-bounces at lists.literacytent.org] On Behalf Of tsticht at znet.com
Sent: Thursday, November 18, 2010 12:29 PM
To: aaace-nla at lists.literacytent.org
Subject: Re: [AAACE-NLA] (REVISED) Making adult education relevant
Colleagues: Regarding my comments on EFF research methodology, George asked:
When you say "extremely weak research methodology," what exactly do you mean, especially by the adjective and what is your definition of valid methodology?
My answer is that I was critical of the EFF project's research methodology from the outset, when I realized that it had not followed even the most basic procedures for undertaking such expensive and extensive research and development. For instance, it had not critiqued earlier work on what adults should know and be able to do, such as the Adult Performance Level and CASAS projects. It did not specify the advantages and shortcomings of those projects and indicate how the EFF project would extend the advantages and eliminate the shortcomings. This is basic practice when taking on a large-scale project costing taxpayers millions of dollars.
I also criticized the project for its lack of evaluation. I was assured that such evaluation would be forthcoming but it never happened. When a project purports to improve something the burden of proof is on the project to demonstrate that it indeed improved something. This never happened in the EFF project. In fact, in some cases, claims that EFF had improved retention turned out to be invalid when I checked them out. Therefore, the validity of EFF as a successful intervention to solve some adult education problem is questionable.
Valid methodology is methodology that permits accepting the research findings and conclusions as accurate rather than questionable.
In a report colleagues and I noted a couple of limitations to the EFF work, one of which the researchers themselves identified (see extracts below).
Following is from a report from around 1996:
Limitations to the NIFL Study. In the report on Equipped for the Future, the NIFL researchers themselves noted that the study had some important methodological shortcomings.
Threats to Generalizability. Though some 6,000 letters were mailed out to adult literacy programs by the NIFL, only 1,500 responses were obtained from 149 programs: a 25 percent response rate if each learner writing is counted as a reply to one of the 6,000 letters (which it wasn't), or a less than 3 percent response rate if the 149 programs that replied are considered a sample from the 6,000 requests. In recognizing that the limited number of replies potentially limits the generalizability of the results to the thousands of literacy programs and millions of adult learners in the nation, the NIFL researchers noted that "Since participation was wholly self-generated in response to the process described above, we can make no claims about how representative the writings we received are of the entire range of adult learners. We don't know why programs chose to participate or not to participate. We made no effort to control the number of responses from any one program. Some sent two or three. Some sent dozens of responses." The latter factor means that a few programs may have heavily biased the database.
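The two response-rate calculations above reduce to a quick back-of-the-envelope check. A minimal sketch, using only the figures stated in the report:

```python
# Back-of-the-envelope check of the two response-rate figures cited above.
# Numbers come from the report: 6,000 letters mailed, roughly 1,500
# learner writings received, from 149 programs.
letters_mailed = 6000
writings_received = 1500
programs_responding = 149

# Counting each writing as a reply to one of the 6,000 letters:
per_writing_rate = writings_received / letters_mailed
print(f"{per_writing_rate:.1%}")  # 25.0%

# Counting each responding program as a reply to one of the 6,000 letters:
per_program_rate = programs_responding / letters_mailed
print(f"{per_program_rate:.1%}")  # 2.5%
```

The tenfold gap between the two figures is exactly the point of the critique: which denominator one picks changes the apparent coverage of the sample.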
--- Note: The writings from 1,500 students reflect only some of the early data collected throughout the EFF initiative. During all phases of the research, voices representing a variety of stakeholders provided invaluable input into the EFF Framework. For example, adult learners and their instructors from different types of programs across the country were integral to the development of the EFF Standards and the Performance Continua. You can read more about the research in the publications and reports available online, and specifically about the two examples I just gave in these publications:
Equipped for the Future Content Standards: What Adults Need to Know and Be Able to Do in the 21st Century<http://eff.cls.utk.edu/PDF/standards_guide.pdf>
Equipped for the Future Assessment Report: EFF/NRS Data Collection Project, 2000-2001<http://eff.cls.utk.edu/PDF/EFFNRS%20Interim%20Report2.pdf>
Results That Matter: An Approach to Program Quality Using Equipped for the Future<http://eff.cls.utk.edu/PDF/results_that_matter.pdf>
Available at: http://eff.cls.utk.edu/products_services/online_publications.htm
Threats to Validity. A second major methodological problem encountered by the NIFL researchers concerns the reliability and validity of the four purposes for literacy that were identified in the research. While extensive subjective coding of responses was performed, limitations in resources meant that data were not available on the reliability of the coding scheme. That is, no inter-rater reliabilities were obtained, and no cross-validation using independent, separate coding teams was conducted to determine how replicable the research findings were. This means that, in the EFF study, it is not clear to what extent the four purposes accurately and reliably captured the statements of the "customers" or "clients," or instead expressed the beliefs and attitudes of the researchers.
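The missing inter-rater check is straightforward to run when resources allow. One common statistic is Cohen's kappa, which measures agreement between two coders corrected for chance agreement. A minimal sketch follows; the rater labels are hypothetical stand-ins for purpose codes, not the NIFL data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Proportion of items on which the two raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(counts_a) | set(counts_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical codes: two raters assign each learner statement to one
# of four purpose categories (labels invented for illustration).
a = ["access", "voice", "action", "access", "bridge", "voice"]
b = ["access", "voice", "action", "voice", "bridge", "voice"]
print(round(cohens_kappa(a, b), 2))  # 0.77
```

A kappa near 1.0 would indicate that the coding scheme is reproducible across independent raters; the critique above is precisely that no such figure was reported for the four-purposes coding.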
And some additional background resources that address some of the other issues you and George have discussed in this forum:
EFF Research To Practice Note 1<http://eff.cls.utk.edu/PDF/01research-practice.pdf>
EFF Research Principle: A Purposeful and Transparent Approach to Teaching and Learning (PDF version, 232 Kb)
October 2002. Marilyn Gillespie. This 8-page digest summarizes the research basis for "a purposeful and transparent approach to learning", the first key research principle underlying the Equipped for the Future system reform initiative. It provides three examples of EFF implementation and program practices that support the research principle, related to designing education specifically around the goals of students in their real-life roles as family members, community members, and workers. Key terms and concepts are included in a glossary.
EFF Research To Practice Note 2<http://eff.cls.utk.edu/PDF/02research-practice.pdf>
EFF Research Principle: An Approach to Teaching and Learning That Builds Expertise (PDF version, 220 Kb)
October 2002. Marilyn Gillespie. This 8-page digest describes how research findings related to building expertise have been applied to the development of the Equipped for the Future Content Framework and assessment system. It provides three examples of EFF implementation and program practices that support this research principle, related to how learners use prior knowledge and experience to construct meaning and acquire new knowledge. Key terms and concepts are included in a glossary.
EFF Research To Practice Note 3<http://eff.cls.utk.edu/PDF/03research-practice.pdf>
EFF Research Principle: A Contextualized Approach to Curriculum and Instruction (PDF version, 244 Kb)
October 2002. Marilyn Gillespie. This 8-page digest identifies the research basis for a contextualized approach to teaching and learning, the third concept underlying the Equipped for the Future system reform initiative. It provides examples of EFF implementation and program practices that support this research principle, related to the active application of knowledge and skills and how the EFF approach to education encourages learning transfer. Key terms and concepts are included in a glossary.
EFF Research to Practice Note 4<http://eff.cls.utk.edu/PDF/Research_to_Practice.pdf>
EFF Research Principle: An Approach to Assessment Based on Cognitive Science (PDF version, 211 Kb)
2005. Regie Stites. This 8-page digest provides an overview of the cognitive science and measurement theory and research findings that support the EFF approach to assessment. It discusses the key findings learned from what research says about good assessment and provides examples of this principle in practice. Key terms and concepts are included in a glossary.
Following are five issues from the 1996 report, identified after studying five projects that aimed to identify what adults need to know and be able to do. Note that the critique also applies to the EFF project.
(1) The "proliferation" issue. There is a tendency for these projects to develop very long lists of "competencies," as in the APL and CASAS projects, or "attributes," as in the O*NET project. In these three projects, the number of adult knowledge and skill areas, sub-areas, and sub-sub-areas of content ranged from 190 for the O*NET project to 317 for the CASAS. Generally, in such "outcome-based" methodologies for specifying what people should know and be able to do, no rationale is given for how many sub-areas should be identified, and the lists can get very specific, as in the 5,000 test items the CASAS has for assessing the 317 "competencies." In this case, each item can be seen as a specific "competency." In contrast, the SCANS and GED projects specify a few very broad categories of knowledge and skill.
(2) The "overlap" issue. The "overlap" issue concerns the interactions and similarities among the many "competencies" identified in the various projects. Factor analysis revealed only three factors in the APL study, not 270 competencies. As noted above, a general factor underlying both the GED and the National Adult Literacy Survey has been labeled "...the ability to understand and use written information and to analyze information embedded in printed materials." It accounts for a 60 percent overlap in performance on the two tests. The Tests of Applied Literacy Skills (TALS), a commercial version of the NALS, has been found in one study to have about a 50 percent overlap with the CASAS (CASAS (1995, June). National Summer Institute 1995: Assessment in an Era of Change. San Diego, CA: CASAS). These findings raise the important question of just what competence actually underlies the many things that adults can be identified as being able to do.
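One way to unpack those overlap figures: if "overlap" is read as shared variance (r squared), the implied correlation between the two tests is its square root. This reading of "overlap" is an assumption on my part, not something the cited reports state:

```python
import math

# If "overlap" between two tests means shared variance (r squared), then
# a 60 percent overlap implies a correlation of about 0.77, and a 50
# percent overlap a correlation of about 0.71.
for overlap in (0.60, 0.50):
    r = math.sqrt(overlap)
    print(f"overlap {overlap:.0%} -> implied r = {r:.2f}")
```

Correlations that high are consistent with a single general ability accounting for most of the performance on nominally distinct instruments, which is the point of the "overlap" critique.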
(3) The "levels" issue. Across all these projects there is a concept of "levels" that suggests that people can be assessed to discover their "level" on some competency or attribute. For instance, the APL study found 20 percent of adults in the lowest "level" of functional competence. The CASAS has measurement tests that both assign a person to a general level of competence and identify performance on separate competencies that people may need to work on improving (though see the criticism by Greg Jackson, above). The O*NET identifies seven levels for each of the skills of Table 5. The GED has levels for each of the five parts of the test. All these projects raise the question of just what it means to say that people have "levels" of knowledge, skills, competence, or literacy.
(4) The "developmental" issue. While all of the projects reviewed discuss knowledge and skill outcomes, such as "competencies," "attributes," "levels," or "high school equivalency," none of them present information on how adults come to possess these outcomes. How do they get developed? Have adults who possess them at "high levels" at the age of 18 or 19, at the end of secondary school, developed them in the same way that adults must who choose to develop them in adult literacy programs? What must one do to help adults move from scoring at the 200 level of a CASAS reading test to the 236 level (a growth of about three standard deviations)? Given the relatively high inter-correlations among these various tests, will the person who moves from the 200 to the 236 level on the CASAS also show similar improvements on the NALS and the GED tests?
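The growth claim just above can be checked arithmetically. If a gain of 36 scale points (200 to 236) is "about three standard deviations," the implied standard deviation of the scale is roughly 12 points; this SD is an inference from the text, not a figure from CASAS documentation:

```python
# Arithmetic behind the "three standard deviations" growth claim:
# a gain from 200 to 236 is 36 scale points; three SDs over 36 points
# implies a scale SD of about 12. (The SD is inferred from the text,
# not taken from CASAS documentation.)
start_score, target_score = 200, 236
gain = target_score - start_score   # 36 points
implied_sd = gain / 3               # 12.0
growth_in_sds = gain / implied_sd   # 3.0
print(gain, implied_sd, growth_in_sds)
```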
(5) The "who decides what the content standards shall be" issue. Finally, all of the foregoing projects have derived competencies from the statements of business leaders, teachers, adult education program administrators, workers, and various other stakeholders and constituencies. But only the APL project reported listening to adult students, and even then the students' expressed wishes and desires for learning were not identified as such. As mentioned earlier, the "learner-centered, participatory" approach to adult literacy education (e.g., Fingeret, A. & Jurmo, P. (Eds.). (1989). Participatory literacy education. New Directions for Continuing Education, No. 42. San Francisco, CA: Jossey-Bass) argues against programs in which the curriculum designer decides in advance what people should know and be able to do to fulfill their roles as parents, citizens, or workers. Rather, its proponents argue that adults should identify what they want to learn, and the teacher should help them find resources, both within themselves and from outside sources, to pursue their learning objectives.
AAACE-NLA mailing list: AAACE-NLA at lists.literacytent.org<mailto:AAACE-NLA at lists.literacytent.org>
LiteracyTent: web hosting, news, community and goodies for literacy