[Reading-hall-of-fame] Unlearning Literacy in ABE

Thomas Sticht tsticht@znet.com
Sat, 6 Nov 2004 16:14:20 -0800 (PST)


Research Note                                        November 6, 2004

  Unlearning Literacy in Adult Basic Education Programs

  Tom Sticht
  International Consultant in Adult Education

  Thirty years ago a colleague and I prepared a paper entitled "The Problem
  of Negative Gain Scores in the Evaluation of Reading Programs."(1) Today,
  as then, the issue of zero or negative gain in adult literacy education
  is hardly ever discussed, though it is still occurring. For example, a
  year 2000 report from MassInc. (2) reported some 4 percent negative gain
  scores and 40 percent no change in scores in the Adult Basic Education
  programs it examined. A year later a report from the United Kingdom's
  Basic Skills Agency (3) reported over 30 percent negative gain scores
  with no discussion of how zero and negative gain scores were treated.

  Generally, data from pre- and post-test scores are interpreted as
  indicating the extent to which learning is taking place in a program. For
  instance, if average reading test scores for students increase, say from
  a grade level of 4.8 to 5.8, this is interpreted to mean that students
  gained one grade level in their reading skills. However, by the same
  logic, if some students do better on the pre-test than they do on the
  post-test, say they score at the 4.8 grade level on the pre-test but only
  4.2 on the post-test, this would indicate an "unlearning" of six months
  in reading skills. From the point of view of interpretive validity, if we
  interpret positive gain to mean learning has occurred, shouldn't we
  interpret zero gain to mean no learning and negative gain to mean
  unlearning has occurred?
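
  The symmetric interpretation described above can be sketched in a few
  lines of code. This is only an illustration of the arithmetic, using the
  grade-level values from the example; the function names are hypothetical.

  ```python
  def gain_score(pre: float, post: float) -> float:
      """Grade-level gain from pre-test to post-test (rounded to tenths)."""
      return round(post - pre, 1)

  def interpret(gain: float) -> str:
      """Apply the same logic in both directions, as the argument requires."""
      if gain > 0:
          return "learning"
      if gain == 0:
          return "no learning"
      return "unlearning"

  # The two cases from the text:
  print(gain_score(4.8, 5.8), interpret(gain_score(4.8, 5.8)))  # 1.0 learning
  print(gain_score(4.8, 4.2), interpret(gain_score(4.8, 4.2)))  # -0.6 unlearning
  ```

  The point of the sketch is that nothing in the arithmetic privileges the
  positive case: whatever license lets us read +1.0 as "learning" equally
  reads -0.6 as "unlearning."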

  This raises fundamental questions about the validity of using tests for
  which zero or negative gain scores are not infrequent to assess the
  extent to which programs are promoting learning by students. Can the same
  tests measure learning, no learning, and unlearning? How are zero gain
  and unlearning to be interpreted for program accountability? Can students
  who have spent time in programs and learned nothing, and those who have
  had their literacy unlearned, claim they have been subjected to
  mal-literacy practice and sue for damages?

  National Negative Gain Reports

  A different, though related, problem of interpreting zero or negative
  gain can be found in the federal accountability system for adult
  education and literacy programs. Today, to assess learning in the
  federally funded State Grants program, the National Reporting System
  (NRS) obtains data from the States and U.S. Territories to determine the
  percentage of students in the State Grants program who increase their
  literacy proficiency enough to progress from one of the NRS's six levels
  of proficiency to a higher level. These data are derived by the States
  and Territories from pre and post scores on nationally normed and
  standardized tests, such as the TABE, ABLE, CASAS, and others. However,
  in reporting these percentages of students progressing from one learning
  level to another, the data on those making zero or negative test score
  gain are not revealed; only the percentage of program enrollees moving
  upward from one level of proficiency to another is reported by the NRS.

  Each year now the NRS prepares a report for the U.S. Congress that
  presents, among other things, the percentage of adult learners moving
  upward from one level of basic skills proficiency to a higher level. The
  report for Program Year 2001-2002 presents tables showing the percent of
  enrolled adults who acquired the level of basic skills needed to complete
  at least one education level (minimum Grade Level Equivalent gain of 2
  years) in Program Year 2000-2001, which is called the Baseline Year, and
  then these same kinds of data are presented for Program Year 2001-2002.
  This allows one to determine whether the percent of adults progressing
  from one level up to another has increased from the Baseline Year
  (2000-2001) to the current year (PY 01-02).

  Averaged over the 50 States, the District of Columbia, and Puerto Rico,
  the data indicate that in the Baseline Year of 2000-2001, 36% of program
  enrollees moved up from one level to a higher level as measured by a
  particular State's testing system. In PY2001-2002 this percentage
  increased by one percentage point to 37%, suggesting improvement in the
  federally funded State Grants programs overall, but still short of the
  U.S. Education Department's performance goal of having 40 percent of
  adults show increases in their learning by moving from one level to a
  higher level. Interestingly, the report gives no indication of how the
  federal performance goal of 40 percent was determined.

  More to the point here, though, are the individual data for the 50 States
  and Territories. These data are given below for Adult Basic and Secondary
  Education. Similar data are available for English as a Second Language
  learning, but those data are not discussed here.

  The following table shows the percentage of adults enrolled in adult
  basic and secondary education programs who acquired the basic skills
  needed to complete the level of instruction in which they were initially
  enrolled and move up to a higher level. The first column shows the State
  or Territory, the second column shows the Baseline Year of PY00-01, and
  the third column shows the current PY01-02 year being reported to
  Congress. The table shows that 20 States made negative gains in percent
  of adults making level improvements from PY00-01 to PY01-02, 3 States
  made zero gain, and 29 made positive gains (note that there are 52 total
  reports, but reference will be just to States to avoid redundancy in
  referring to Territories).

  Now the question is, what do these data mean? Do they mean that in 20
  States the ability of programs to teach basic skills to adults declined,
  and in three States their teaching ability stayed the same from one year
  to the next? Do they mean that in 29 States the programs got better at
  teaching basic skills to adults? Does it mean that students in PY 01-02
  got more difficult to teach in those States where improvements declined
  from the baseline, or that teachers were less apt? Why do States have
  such large differences between baseline and current program years in the
  percentage of adults making some movement up the NRS levels?

  Along with the issues of zero and negative gain using individual
  standardized, nationally normed tests, the problem of interpreting data
  such as that reported by the NRS raises serious questions regarding the
  validity of our national accountability system for federally supported
  programs across the nation and its Territories. At the present time I do
  not see how the U.S. Congress, or anyone else for that matter, can read
  these data and use them to determine anything about teaching and learning
  in adult education and literacy programs funded by the U.S. Education
  Department's State Grants program. I also wonder about the legality of
  continuing to use such blatantly invalid methods for holding individual
  teachers or programs accountable for teaching.

  State            PY 00-01   PY 01-02   Change

  Alabama             26         25      down
  Alaska              55         44      down
  DC                  55         37      down
  Georgia             29         28      down
  Hawaii              39         36      down
  Idaho               55         44      down
  Illinois            30         29      down
  Indiana             38         37      down
  Kentucky            58         57      down
  Maryland            55         46      down
  Minnesota           24         22      down
  Montana             53         35      down
  N. Hampshire        55         40      down
  N. Mexico           42         29      down
  New York            35         31      down
  N. Dakota           82         69      down
  Ohio                59         55      down
  Puerto Rico         75         53      down
  Utah                43         42      down
  W. Virginia         55         54      down
  20 down

  Florida             36         36      same
  Missouri            31         31      same
  N. Carolina         36         36      same
  3 same

  Arizona             34         39      up
  Arkansas            36         44      up
  California          25         29      up
  Colorado            48         50      up
  Connecticut         28         39      up
  Delaware            27         35      up
  Iowa                27         39      up
  Kansas              53         76      up
  Louisiana           38         45      up
  Maine               38         43      up
  Massachusetts       22         23      up
  Michigan            26         29      up
  Mississippi         42         43      up
  Nebraska            30         36      up
  Nevada              31         46      up
  New Jersey          27         32      up
  Oklahoma            29         42      up
  Oregon              43         47      up
  Pennsylvania        29         33      up
  Rhode Island        55         82      up
  South Carolina      20         36      up
  South Dakota        34         42      up
  Tennessee           38         40      up
  Texas               25         29      up
  Vermont             10         11      up
  Virginia            30         40      up
  Washington          33         41      up
  Wisconsin           57         80      up
  Wyoming             48         53      up
  29 up
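
  The down/same/zero tally in the table above is a simple comparison of
  each State's two percentages. As a minimal sketch of that tally, the
  snippet below uses a small subset of rows from the table (the `rows`
  list would need all 52 entries to reproduce the full 20/3/29 counts):

  ```python
  from collections import Counter

  # A few representative rows copied from the table: (State, PY00-01, PY01-02)
  rows = [
      ("Alabama", 26, 25), ("Alaska", 55, 44),      # down
      ("Florida", 36, 36), ("Missouri", 31, 31),    # same
      ("Arizona", 34, 39), ("Wisconsin", 57, 80),   # up
  ]

  def direction(baseline: int, current: int) -> str:
      """Classify the year-to-year change in percent making level gains."""
      if current < baseline:
          return "down"
      if current == baseline:
          return "same"
      return "up"

  tally = Counter(direction(b, c) for _, b, c in rows)
  print(dict(tally))  # → {'down': 2, 'same': 2, 'up': 2} for this subset
  ```

  With all 52 reports included, this tally yields the 20 down, 3 same, and
  29 up noted in the text. Note that the classification says nothing about
  the size of the change: Kentucky's one-point drop and Puerto Rico's
  22-point drop both count simply as "down."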



  1. Caylor, J. & Sticht, T. (1974, April). The Problem of Negative Gain
  Scores in the Evaluation of Reading Programs. Paper presented at the
  meeting of the American Educational Research Association, Chicago, IL.

  2. Comings, J., Sum, A. & Uvin, J. (with others) (2000, December). New
  Skills for a New Economy: Adult Education's Key Role in Sustaining
  Economic Growth and Expanding Opportunity. Boston, MA: MassInc.

  3. Brooks, G. et al. (2001, January). Progress in Adult Literacy: Do
  Learners Learn? London: Basic Skills Agency.



  Thomas G. Sticht
  International Consultant in Adult Education

  2062 Valley View Blvd.
  El Cajon, CA 92019-2059
  Tel/fax: (619) 444-9133
  Email: tsticht@aznet.net