[AAACE-NLA] Literacy Today article
tsticht at znet.com
Wed Dec 8 19:48:30 EST 2004
Reprinted From Literacy Today for December 2004
Published in the United Kingdom by the National Literacy Trust
Both the US and UK governments have declared that millions of their adults
have problems with basic literacy skills. Tom Sticht, an international
consultant in adult education, examines the statistics, policies and
learners' attitudes from both sides of the Atlantic.
Adult Literacy is Testing My Wits
Strange things go on in adult literacy education. For instance, in 1993,
the US federal government declared 47 per cent (90 million) of adults
deficient in literacy skills, then lowered funding for literacy education
for each of the next three years. Now, in 2004, federal funds are less
than US$220 per enrollee; fewer than four per cent of literacy-deficient
adults enrol in programmes in a given year and most do not stay for more
than 50 to 100 hours of instruction (U. S. Department of Education, 2003).
Similarly, in the UK, the 1999 Moser study reported that some seven
million adults assessed by the International Adult Literacy Survey were
seriously deficient in literacy. Unlike the US, the UK reacted to the
Moser report by instituting a new government office with a mandate to
deliver a new Skills for Life strategy and invested billions of pounds
into adult basic skills programmes. But a Guardian newspaper article
earlier this year indicated that only about 18 per cent (135,000) of the
750,000 adults taking courses under the Government's Skills for Life
strategy were from the lowest level of literacy identified as "at risk" by
the Moser report (Kingston, 2004).
So, what's going on here? Why are millions of adults being declared
"at risk" of deficient basic skills in the US and UK, yet programmes are
not being overrun with adults trying to get into them? While I know of no
certain answer, one thing is common in both countries: most of the adults
declared functionally incompetent in literacy, based on their test scores,
do not think they have a literacy problem. In the US, two-thirds to
three-quarters of the adults in the lowest level of literacy on the 1992
National Adult Literacy Survey thought they read well or very well.
Overall, more than 93 per cent of adults thought their literacy skills
were just fine and met their everyday needs at work and daily life.
In the UK, the 2003 Skills for Life Survey reported that some 96 per cent
of adults estimated that they were fairly good or very good at reading for
everyday life; only around five per cent estimated their reading skills to
be below average. Adults were somewhat less optimistic about their writing
skills, but were still overwhelmingly (93 per cent) apt to rate themselves
as fairly or very good at writing to meet everyday needs. Surprisingly,
4.3 million (83 per cent) of the 5.2 million adults classified in the
three Entry levels of the Skills for Life standardised test, and hence
considered to be the most poorly literate, estimated their skills to be
fairly or very good.
So, in both the US and the UK, the great majority of adults that the
governments say are deficient in literacy, based on government tests, do
not think they have a basic skills problem. Maybe this is one reason that
millions more adults in these nations do not enrol in provision.
This raises questions about how well the government tests reflect adults'
use of their literacy skills in their everyday lives. Are these tests
ecologically valid? For instance, it is well established that adult
cognitive abilities such as short-term memory and information-processing
efficiency change with age, so should the same tests be used for 16 to
24-year-olds as for 50 to 60-year-olds?
Another problem with these tests comes when they are used to measure
progress in learning basic skills. In the UK, a government-sponsored study
of learning in adult basic skills programmes found that while most adults
made improvements in their skills from the beginning to the final testing,
30 per cent lost over 12.5 points from what they scored at the beginning
of the course (Brooks et al, 2001). This is like unlearning literacy for
almost a third of the adult students. Can this be true? If not, then why
are tests being used that permit this sort of negative-gain score change?
Surprisingly, in both the US and the UK, literacy programmes aren't
allowed to teach what is on the tests used to evaluate learning. Teaching
to the test is considered cheating. But no rationale is given for why
tests that measure something that isn't being taught should be used in
preference to tests that measure what is being taught.
Is it appropriate to assess adult literacy skills in national surveys and
evaluate learning in literacy programmes with these sorts of tests and
procedures? It tests one's wits to think so.
U.S. Department of Education (2003) Adult Education and Family Literacy
Program Year 2001-2002: Report to Congress on State Performance,
Washington, DC: Office of Vocational and Adult Education
Brooks, G. et al (2001) Progress in Adult Literacy: Do Learners Learn?,
London: Basic Skills Agency
Kingston, P. (2004) Question over Labour's skills target claim, Guardian
Education, 29 June 2004, www.guardian.co.uk
Department for Education and Skills (2003) Skills for Life Survey: A
national needs and impact survey of literacy, numeracy and ICT skills,
London: DfES
Contact Tom Sticht at tsticht at aznet.net