The Jacksonville Public Education Fund has millions of reasons to spin for the district. (rough draft)

The Jacksonville Public Education Fund (JPEF) once
again waded into territory it never should, and that's policy. It did so
in a piece about the release of the tenth-grade ELA scores.

This year there was a new test that has been
controversial for a whole host of reasons, and the JPEF's gist was: don't worry
about Duval's poor performance.

Before I continue, I have to tell you the test has not
even been validated yet, and furthermore I think it's a bunch of crap anyway. But
what we can do is use it to compare how Duval is doing against other districts
throughout the state.

JPEF, on the other hand, all but discounts the results,
and Jason Rose, their policy expert, cites three reasons why.

To be more specific,
55% of students statewide passed the 2014 10th grade FCAT 2.0 Reading assessment
last year. Because passing-level cut scores have not yet been established
for the 2015 10th grade FSA English/Language Arts
assessment, the results released today were approximated by finding the score
at which 55% of students on this year's test were above and making that the
"passing score", and evaluating everyone else relative to that.

For what it's worth,
48% of 10th graders
in Duval County Public Schools passed the FSA English/Language Arts test
according to the scores released today (same as last year), and 57% of DCPS
students passed the Algebra 1 EOC (down slightly from 58% last year).
Statewide, scores in this release remained the same as last year (~55% passing
10th grade
ELA, 67% passing Algebra 1 EOC) – as would be expected with equipercentile
linking.

It should be noted that
waiting until after a new test is administered for the first time to establish
appropriate cut scores or other properties (particularly without sufficient
field-testing) is not inappropriate or uncommon.
To some degree, students' performance relative to
each other in the first large-scale administration of the test will influence
where appropriate levels are set – in conjunction with input from content-area
experts, psychometricians, and others. This is a common and accepted component
of test specification and validation.
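For readers who want to see the mechanics, the equipercentile linking described above can be sketched in a few lines of Python. The scores below are made up purely for illustration; the only real input is last year's 55% statewide passing rate.

```python
import random

random.seed(0)

# Made-up 2015 scale scores for students statewide (illustrative only;
# the real FSA score distribution is not public in this release).
scores_2015 = [random.gauss(350, 25) for _ in range(100_000)]

# 55% of students statewide passed the 2014 FCAT 2.0 Reading test.
last_year_pass_rate = 0.55

# Equipercentile linking: rank this year's scores and pick the score
# that 55% of students met or beat, i.e. the 45th-percentile score.
ranked = sorted(scores_2015)
cut_score = ranked[int(len(ranked) * (1 - last_year_pass_rate))]

# By construction, the statewide passing rate matches last year's.
pass_rate = sum(s >= cut_score for s in scores_2015) / len(scores_2015)
print(f"provisional cut: {cut_score:.1f}, statewide passing: {pass_rate:.0%}")
```

Note that nothing in this procedure validates the test itself; it only pins the statewide passing rate to last year's number.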

Now, Rose is right that the problems with the test are many,
and cut scores have not been released (determined??) yet either, but as I said
earlier, we can use the scores to compare where Duval stands in relation to the
other big districts. Spoiler alert: we are dead last, and that is something Rose never
mentions.

Furthermore, his data is a bit off and his reasoning is
way off. First, 55 percent of students passed in 2014, but this year the state
says it's only 54 percent. Furthermore, it's true that both this and last year
Duval had 48 percent passing, but a lot of other districts saw differences in
their scores. If the state was just using
last year's stats to set this year's bar, as Rose suggests, then somebody screwed
the pooch, because there are different scores all over the place.

Then when Rose says, "It
should be noted that waiting until after a new test is administered for the
first time to establish appropriate cut scores or other properties
(particularly without sufficient field-testing) is not inappropriate or
uncommon," he
minimizes the lack of field testing, basically saying it's not necessary
because once students take the tests, the state can figure things out.
 

This is a different point of view than most superintendents
have. In fact, superintendent after superintendent, including Alberto Carvalho of
Miami and Nikolai Vitti of Jacksonville, chided the state for not field testing
the exam. This was one of their and many experts' chief complaints, and Rose
makes it sound like it is done all the time and doesn't matter anyway.

Here are a couple of links
citing superintendents who disagree with Rose.
It was not long ago that
JPEF was the butt of jokes from then school board member Tommy Hazouri, who
asked, "Who are these guys?" Fast forward a few years, and the founder of JPEF, Gary
Chartrand, has his handpicked superintendent and school board in place, and
suddenly it's hard to tell where the JPEF ends and the district begins. JPEF
has a lot of self-serving interests, as they are currently managing millions of
dollars for the district, which gives them every reason to spin Duval's poor
performance in a positive fashion.

Again, Rose is right:
the test has lots of problems. Chartrand is also on the state board, picked AIR,
our testing provider, and has steered us into the quagmire we find ourselves in,
so he has to bear some responsibility for that too, right? He, however, misses, I
am sorry, tries to hide, the bigger problem, and that's that compared to the other big
districts, we are badly underperforming.

One Reply to “The Jacksonville Public Education Fund has millions of reasons to spin for the district. (rough draft)”

  1. Hi Chris,

    Just wanted to clarify a few potential misunderstandings here in terms of the concerns you raise with our post. One apparent area of confusion in your post is the difference between how statewide and district-level results were determined, as evidenced in the following selection:

    “First 55 percent of students passed in 2014 but this year the state says it’s only 54 percent. Furthermore it’s true that both this and last year Duval had 48 percent passing but a lot of other districts had difference in their scores. If the state was just using last year’s stats to set this year’s bar as Rose suggests then somebody screwed the pooch because there are different scores all over the place.”

    There seem to be two different sources of confusion going on in this paragraph. The first is the apparent disconnect between the explanation of equipercentile linking and the fact that the state reported 55% passing in 2014 and 54% in 2015. In the links provided to the score reports at both the beginning and end of our blog post, the state provides the following disclaimer at the top of each page addressing this issue:

    “Note: The equipercentile linking method used this year holds constant the results from last year at the state level, with a small difference due to rounding.”

    I felt this disclaimer was clear enough at the time, but agree that perhaps I should have incorporated a mention of it directly in the blog post for anyone who did not click through to review the source material provided.

    The second apparent source of confusion in the paragraph above is the conflation of how equipercentile linking produces results at the statewide (Florida overall) level versus at the individual district levels. As explained in our post, “Changes will still be apparent at the district and school levels because cut scores are established at the statewide level, and districts or schools may still do better or worse relative to each other than they did last year.”
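    The state-versus-district distinction above is easy to demonstrate with made-up numbers: a single statewide cut score pins the overall passing rate at roughly 55%, while individual districts land above or below it depending on how their students actually scored. The district labels and score distributions below are invented for illustration only.

```python
import random

random.seed(1)

# Made-up district score distributions (illustrative only): same test,
# different local performance levels.
districts = {
    "Higher-performing": [random.gauss(358, 25) for _ in range(30_000)],
    "Average":           [random.gauss(350, 25) for _ in range(30_000)],
    "Lower-performing":  [random.gauss(342, 25) for _ in range(30_000)],
}

# One statewide cut score, set so that 55% pass overall
# (the equipercentile link to last year's statewide rate).
statewide = sorted(s for scores in districts.values() for s in scores)
cut_score = statewide[int(len(statewide) * 0.45)]

# The statewide rate is held at ~55%, but district rates diverge from it.
rates = {name: sum(s >= cut_score for s in scores) / len(scores)
         for name, scores in districts.items()}
```

    The point is only the mechanism: holding the statewide total constant says nothing about whether any one district moved up or down relative to its peers.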

    While I agree with you and the superintendents cited that a full administration of sufficient local field-testing would have been preferable and could have avoided some of the current confusion by helping to establish FSA appropriate cut scores prior to this year’s actual baseline administration and release, that ship has clearly already sailed. Because real-world political demands, financial requirements, and planning needs don’t often line up with research ideals, the fact is that this type of problem is not uncommon in new test development and rollout cycles and certainly not unique to Florida.

    The reality now is that the first full administration of the test has been given and, being the only comprehensive sampling available on how the test performs, will be what the state has to use in the cut score setting process. While not ideal, the other options would have been to (a) double end-of-year testing on a significant portion of students while FCAT 2.0 was administered again this year for accountability purposes and FSAs were administered in full for field-testing, or (b) only administer a field-testing year of FSAs and suspend all accountability-related reporting or use of those test results for a year.

    As a reminder, we have long advocated for the latter (option b) in anticipation of the issues we are now starting to see in terms of confusion around these scores. The state has chosen an option closer to (b), in terms of suspending accountability measures associated with this year's results, but has not yet made clear how they plan to report on the results. Last week's initial release seems to indicate they may release two different sets of scores – one before and one after new cut scores are determined. If so, this is likely to cause understandable public confusion, and the overall point of our post, in anticipation of that, is to remind everyone that these are not the final results that will be considered our baseline scores moving forward.

    As always, if you have any other questions please feel free to contact me directly.

    Jason Rose
    JPEF
