Adults don’t read much

I ran across an alarming survey of adult reading patterns from the Pew Research Center. Less than three-fourths of adults (just 72%) said they had read a book in the previous 12 months (the survey was taken in March/April 2015), down from 79% in 2011. Men were less likely to have read a book than women (67% compared to 77%).

Younger adults were most likely to have read a book: 80% of those ages 18-29 had read one, compared to 71% of those 30-49, 68% of those 50-64, and 69% of those 65 and older. Those with more education and higher incomes were more likely to have read.

But even those who read don’t read much. According to the study, the median was 4 books read and the mean was 12. (Unfortunately, the study does not say whether the calculation excluded those who had not read any book.) Again, there was a gender difference: the median was 3 books for men and 5 for women, and the mean was 9 for men and 14 for women. Again, those with more education had read more books, but even those who graduated from college had a median of just 7, although the mean was higher at 17.

The large gap between the median and the mean shows the distribution is heavily skewed: a relatively small group of heavy readers reads far more than the typical (median) reader and pulls the average up.
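To see why, here is a toy illustration. The numbers below are invented purely to match the survey’s reported median of 4 and mean of 12; they are not actual Pew responses.

```python
from statistics import mean, median

# Hypothetical books-read counts for ten respondents (illustrative only, not Pew data).
# Most people report a handful of books; two heavy readers report far more.
books_read = [1, 2, 3, 4, 4, 4, 6, 8, 38, 50]

print(median(books_read))  # 4.0 -> the "typical" reader
print(mean(books_read))    # 12  -> pulled up by the two heavy readers
```

Most of these hypothetical respondents read only a handful of books, yet the two heavy readers alone are enough to triple the average.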

Now, one can argue that there is nothing miraculous about reading that makes someone a better person or even better informed. A person who reads the newspaper every day and watches PBS science shows may be better informed than someone who read 20 Star Wars novels in the last year (although I’d argue that reading a book forces the reader to interpret words and so is less passive than watching even good television).

I’d love to see a survey that compared the views and civic participation of readers versus non-readers (controlling for income and education of course). Unfortunately, this study did not.

Claimed College Readiness Numbers Are Lower than Actual College Attainment Numbers

Over the holidays, people may have missed this NYT piece about fears that rising graduation rates (82% for the class of 2014) show it has become too easy to graduate from high school. Fears of lower standards are one reason for the Common Core and the whole standardized testing movement. There is little evidence of falling achievement; some people point to declines in SAT scores, but those have more to do with changes in the number and makeup of students taking the test. Still, the percentage deemed college-ready is very low. The article notes that on the most recent 12th-grade NAEP (2013), less than two-fifths of seniors (about 39%) met the National Assessment Governing Board’s criteria for college readiness.

However, this is misleading, because more than 40% of students go on to college and graduate. According to census data from 2014, 64.3% of Americans ages 25-29 had attended at least some college. Yes, some of them left before earning a degree, but that could be due to nonacademic reasons such as the high cost of college. The percentage who finished college is still higher than the NAEP readiness figure: among those ages 25-29, 44.1% had a college degree, and among those ages 30-34, 47.3% did.

Moreover, the NAEP figure only includes students taking the exam at the end of their senior year, so it excludes dropouts. When I exclude K-12 dropouts from the college calculations, the percentage of high school graduates who earn a college degree rises to 48.5% for ages 25-29 and 53% for ages 30-34.
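For readers who want to check the arithmetic, here is a minimal sketch of that adjustment. The post does not state the high school completion shares it divides by, so the roughly 91% (ages 25-29) and 89% (ages 30-34) values below are back-solved assumptions chosen only to reproduce the reported results.

```python
def degree_rate_among_hs_grads(degree_rate_all, hs_completion_rate):
    """Degree attainment among high school completers, assuming essentially
    no one earns a college degree without first finishing high school."""
    return degree_rate_all / hs_completion_rate

# Census degree-attainment rates cited above; the completion shares are assumptions.
print(f"{degree_rate_among_hs_grads(0.441, 0.909):.1%}")  # ~48.5% for ages 25-29
print(f"{degree_rate_among_hs_grads(0.473, 0.892):.1%}")  # ~53.0% for ages 30-34
```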

Since some students most likely left college for nonacademic reasons, the actual percentage who could have finished college is probably higher. Yes, colleges have remedial courses and resources to help low-achieving students, and some who were not college-ready in their senior year of high school may have become ready after spending time in the workplace. Still, it does not seem realistic that 53% of high school graduates would be able to graduate from college if only 39% were college-ready.

Now, since those ages 30-34 would have graduated from high school 12-16 years ago, it is possible that today’s high school graduates have fallen behind. However, comparing NAEP long-term trend data from 2012 and 1996 (http://nces.ed.gov/nationsreportcard/lttdata/) shows only a one-point drop in math and a one-point drop in reading. So it is unlikely that a large difference in student performance could account for 30- to 34-year-olds having been more college-ready while they were in high school.

Of course, America needs more students to graduate college-ready and more students to complete college. Because so many jobs require further education, all Americans need to graduate from high school and complete some form of postsecondary education or job training. Still, it is overly alarmist to claim that less than 40% of seniors are college-ready when 53% of high school graduates go on to earn a college degree.

Misleading information on charter schools

The National Alliance for Public Charter Schools released a statement challenging Hillary Clinton’s town hall statements about charters. I examined their claims and found them technically accurate but misleading. The first few make an apples-to-oranges comparison of all charters to all other public schools, even though the percentage of charters in urban locations (which have a higher percentage of English Language Learner students) is more than twice that of noncharter public schools. The other claims cherry-pick data.


“There is no difference in the percentage of English Language Learner (ELL) students served between charter and non-charter public schools.”

Technically, this is correct; the percentage is actually slightly higher in charter schools, according to their source: 9.1% at traditional public schools and 9.8% at charters. However, this is an apples-and-oranges comparison. The same source says that city schools are 15.1% ELL students, suburban schools 8.6%, and rural schools 4.8%. But Digest of Education Statistics Table 216.30 shows that over half of all charters are urban (56.7%) compared to only a quarter (25%) of noncharter public schools, and just 10.8% of charters are rural compared to 29% of noncharters. So based on location alone, if charters reflected the ELL percentage of their communities, we would expect a higher percentage for charters. A more accurate comparison would look at the percentage of ELL students in charters and traditional public schools within the same district.

Also, the same source says that only 65.6% of charters have at least one ELL student compared to 74.3% of traditional public schools.
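To make the location argument concrete, here is an illustrative weighted-average sketch. The city (15.1%), suburban (8.6%), and rural (4.8%) ELL rates and the urban/rural school shares come from the sources cited above; the town ELL rate and the suburb/town splits are placeholder assumptions I added so the shares sum to 100%, so the outputs only show the direction of the effect, not the published 9.1%/9.8% figures.

```python
# ELL rate by locale. City, suburban, and rural rates are cited above;
# the "town" rate is a placeholder assumption.
ell_rate = {"city": 0.151, "suburb": 0.086, "town": 0.060, "rural": 0.048}

# Share of schools in each locale. The urban and rural shares come from
# Digest Table 216.30 as quoted above; the suburb/town splits are assumptions
# that simply fill in the remainder.
charter_mix    = {"city": 0.567, "suburb": 0.250, "town": 0.075, "rural": 0.108}
noncharter_mix = {"city": 0.250, "suburb": 0.330, "town": 0.130, "rural": 0.290}

def location_expected_ell(mix, rates):
    """Overall ELL share implied purely by where a sector's schools are located."""
    return sum(share * rates[locale] for locale, share in mix.items())

print(f"charters:    {location_expected_ell(charter_mix, ell_rate):.1%}")    # ~11.7%
print(f"noncharters: {location_expected_ell(noncharter_mix, ell_rate):.1%}")  # ~8.8%
```

Under these assumptions, location alone would predict an ELL share several points higher for charters than for noncharters, so near-identical published rates (9.8% vs. 9.1%) suggest charters enroll fewer ELL students than their locations would predict. This is the point restated at the end of the post.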


 “37% of charter schools have at least 75% of their students in poverty as compared to 23% of non-charter schools.”

Again, because the percentage of charters that are urban is more than twice that of noncharters, this comparison is misleading. According to the Department of Education’s Condition of Education report, in 2010, 38% of city schools had at least 75% of students eligible for free or reduced-price lunch, compared to 10% of rural schools. A better comparison would be poverty in charter schools versus poverty in their surrounding districts.


“Nationally, in the 2013-14 school year, charter schools served a higher-percentage of low-income students (57%) – than district-run schools (52%) – and have better outcomes”

The first half is just a variant on the previous statement. Again, since poverty is not evenly distributed, a better comparison would be between charter schools and noncharters in the same district.

Also, the link provided as evidence of better outcomes was to a CREDO study that compared urban charter schools to other urban schools in the same community (precisely the comparison I advocate above for their other measures). Moreover, that report covered urban schools only; it does not demonstrate better outcomes for the roughly half of charter schools that are not urban. A different CREDO study looked at charters in all locations in the participating states, but this national study found only a slight advantage for charters in reading (the equivalent of eight extra days of learning) and no difference in math.


“2015 NAEP scores show that in Los Angeles, there was dramatically better student performance in charter schools than with district-run schools. Proficiency rates were triple that of non-charter schools. Los Angeles charter schools demographics are 75% low-income students and 85% of student have minority status.”


This is a classic example of cherry-picking the data. Los Angeles is just one of several districts for which NAEP provides information, and without the cherry-picking they could not claim an advantage for charters: overall, there was no statistically significant difference between charter and noncharter NAEP scores for large cities.

I also compared charter and noncharter scores for the national public school sample overall. The only statistically significant difference was in fourth-grade math, where noncharter public students outscored charter students.


“In New York City, charter public schools do a better job of retaining students with disabilities than their non-charter public school counterparts. Specifically, 53% of charter school kindergarteners with disabilities were still in the same schools 4 years later, compared with 49% of non-charter schools.”

This is accurate, and unlike the claims above it makes the right comparison: NYC charters versus NYC noncharters. Again, however, this is only one district, and since the report looked only at NYC schools, there is no way of knowing whether the same holds elsewhere.


So, overall, the Alliance’s statement is technically truthful in what it says but highly misleading. Mathematically, because the percentage of urban schools among charters is more than twice that of noncharters, urban charters could serve a lower percentage of ELL and low-income students than their neighboring noncharter schools and still show higher percentages than public schools overall (in all locations).