Wednesday, January 6, 2010

CODE77 Rubrics for Administrators 2010 Part 4 of 10

I warned you these were coming.

Self-evaluation Rubrics for Basic Administrative Technology Use (2002, revised 2010)

 Not everything that counts can be measured. Not everything that can be measured counts. - Einstein

IV.          Data Use (TSSA Standards I.E, III.A, IV.D; NETS-A 2009 Standards 4b, 4c)

Level One:             I do not use, or have available to me, reports or data produced by information systems in the district to help make operational or policy decisions.

Level Two:             I can analyze census, discipline, scheduling, attendance, grading, and financial data reports produced by administrative systems to spot trends and highlight problems in my building or department. I can communicate the conclusions to staff, parents, and the community in understandable ways. I help my staff access, analyze, and use student performance data to design instructional strategies. I have the statistical knowledge to make meaningful and accurate judgments based on data.

Level Three:             I recognize areas in administration for which additional data is needed for the efficient and effective operation of the building, department, or district and can make recommendations about how that data can be gathered, stored, and processed electronically. I can use data mining techniques to draw conclusions about programs’ effectiveness and use such data to create building plans and evaluate their success.



Why leaders need to use technology to analyze data:

For far too long, educational leaders have used only past practice, political pressure, generalized research, and gut instinct to make decisions. It was what was available. But technology’s ability to store, retrieve and analyze large amounts of information is allowing us to begin to make decisions based on real, site-specific data.

A stand-alone, individually created database built with a program such as FileMaker Pro can be used to analyze small sets of data, such as the test scores of a single grade. But the creation of larger databases of any degree of sophistication is best left to experts. Relational databases (those that draw data from multiple linked tables) are very powerful, but very complex.
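As a rough illustration of what "relational" means here, the sketch below (the table and column names are hypothetical) links a student table to a test score table through a shared student ID, the kind of cross-referencing a single flat-file database cannot do:

```python
import sqlite3

# A minimal sketch, assuming hypothetical table and column names,
# of how a relational database links separate tables through a key.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE students (student_id INTEGER PRIMARY KEY, "
            "name TEXT, grade_level INTEGER)")
cur.execute("CREATE TABLE test_scores (student_id INTEGER, "
            "test_name TEXT, score REAL)")

cur.executemany("INSERT INTO students VALUES (?, ?, ?)",
                [(1, "Pat", 4), (2, "Sam", 4)])
cur.executemany("INSERT INTO test_scores VALUES (?, ?, ?)",
                [(1, "State Reading", 72.0), (2, "State Reading", 88.0)])

# The join combines demographic and assessment data kept in
# different tables, which is where both the power and the complexity lie.
for row in cur.execute("""
        SELECT s.name, s.grade_level, t.test_name, t.score
        FROM students s JOIN test_scores t ON s.student_id = t.student_id"""):
    print(row)
```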

The thoughtful, combined efforts of curriculum specialists, assessment specialists, building administrators, and technology departments are beginning to ensure that school leaders have the software tools to successfully extract and interpret data to determine program effectiveness or the need for program change. Special databases are being built that merge and interpret data from many sources over multiple years and can be used to give meaning to this data through what is commonly referred to as data-mining.

Databases that provide data-warehousing/data-mining operations do some or all of the following:

  • Keep accurate information about individual student progress.
  • File timely, accurate state reports.
  • Identify individuals or groups of students who are performing outside the standard performance range as demonstrated on a range of assessment tools.
  • Track, identify and isolate the strategies, programs and interventions that may be impacting student performance.
  • Judge the total effectiveness of building and district programs and improvement plans.

These databases hold basic student data imported from the school’s student information system, program information identified by the school, and test score data imported from state or commercial testing services. A database’s ability to import data that already exists in digital form saves time and reduces the errors introduced by rekeying it by hand.
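A loose sketch of that kind of import (the file names and column headings here are hypothetical) might simply merge a student information system export with a testing service’s score file on a shared student ID:

```python
import pandas as pd

# Hypothetical exports: a student information system file and a
# state test score file, both already in digital (CSV) form.
students = pd.read_csv("sis_export.csv")       # student_id, grade_level, ...
scores = pd.read_csv("state_test_scores.csv")  # student_id, reading_score, ...

# Merging on the shared ID avoids re-entering data by hand, which is
# slower and more error-prone than importing what already exists.
warehouse = students.merge(scores, on="student_id", how="left")
warehouse.to_csv("merged_warehouse.csv", index=False)
```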

The concept behind data-driven decision making is that certain sets of data (indicators such as test scores) can be used to determine whether programs or circumstances (interventions such as summer school) have an effect on certain types of students (identifiers such as grade level). In larger districts, searches need to allow cross-building comparisons as well as queries about individual buildings.

The database search feature needs to enable the user to find and understand the data through sorting, filtering, and summarizing. At a basic level, the user should be able to sort by multiple combinations of each of these areas (a brief sketch follows the definitions below):

Identifiers            Identification of the person or group. These are factors that are not changeable or controllable. (Name, ethnicity, gender, grade level, date of enrollment, teacher, socio-economic background, attendance rate, etc.)

Interventions             The programs, strategies, or other factors that may cause or may be correlated with change. (Summer school, special reading programs, Title One, special education programs, ESL programs, gifted and talented program participation, etc.)

Indicators            The data that indicate the extent to which change has occurred. (State test scores, standardized test scores, course grades, GPA, etc.)
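To make the three categories concrete, the sketch below (the column names are hypothetical, and pandas is only one of many tools that could do this) filters on an identifier, groups by an intervention, and summarizes an indicator:

```python
import pandas as pd

# Hypothetical merged data: one row per student, with identifier,
# intervention, and indicator columns.
df = pd.read_csv("merged_warehouse.csv")

# Identifier filter: look only at fourth graders.
fourth = df[df["grade_level"] == 4]

# Intervention grouping and indicator summary: did students who attended
# summer school score differently on the state reading test?
summary = (fourth.groupby("attended_summer_school")["state_reading_score"]
                 .agg(["count", "mean"]))
print(summary)
```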

Of all the technology skills required of educational leaders, the ability to make good decisions using meaningful data is probably the newest and most challenging, especially since training in statistics may be rudimentary at best. Yet as budgets tighten, these skills are becoming increasingly important. We need to spend our finite resources on programs we can show actually improve student performance, or on improving the programs that don’t. Using data wisely can help us do just that.
