Kithaven Connections

Location: New Castle, Indiana, United States

Friday, May 27, 2011

Ignorance is Bliss...Until You Learn What You Missed

I missed the disconnection from the BSU iWeb account, so I lost all my files. Note to self: learn the art of backing up. Luckily Blogger backed up some of them, or this blog would be completely lost in the world of bits and bytes. But it means all those links from 2007 are dicey; some work, some don't.

Connecting is still important to me, so I'm going to still call this Kithaven Connections, and hope that any readers bring their own connections to share. In that way, we all teach and learn with each other.


Wednesday, July 18, 2007

Online Classes for High School? Why Not!

I've been looking at all my classmates' blogs (yes, it's an assignment, but really, they have a LOT of interesting stuff), and this one caught my eye. Kevin Biddle created an online history course using Angel, and then talked about the reasons his school system is moving in this direction. Check out the reasons in the post entitled Applied Informatics Project. I think he has something significant happening in his district!


Exercise Your Noodle with Moodle

Ok, ok, that's a bad title, but really, the latest and greatest in online classroom management is a free application/website called Moodle. My classmate's blog mentioned it, so I had to go check it out. It's totally open source, and can be used by anyone from a single teacher to a whole district full of them. At least, that's what the website says, and at least one of my EDTEC classmates says his school is moving to that platform this fall. That's high school, NOT college. I'm familiar with Blackboard, since that's what BSU uses, so I thought I'd wander around the Moodle website and see what I could find. Oh, and here's a link to a paper comparing Blackboard with Moodle: Learning Management Systems for the Workplace: A Research Report
And what I found looks nice...I admit I haven't used it yet, but I'm thinking about creating a course for my emergency preparedness program and seeing what happens. (See this blog for that.) One thing I really like is that it provides a wiki capability. In my EDTEC class last semester we played around with a wiki, and it was painful: it wasn't associated with Blackboard, the interface was (as a coworker described an application the other day) "user abusive", and I felt I always had to keep half a dozen windows open in order to do the tasks. Having the wiki integrated with Blackboard would have been nice -- just another tab to work through. (Moodle admits its wiki isn't very powerful yet, but they're working on that.)
Surveys are also included as standard Moodle content. Again, I had to learn how to use Survey Monkey for this course (here's Kirk's bit on Survey Monkey). I admit the technology isn't usually difficult to learn -- it's the fear of going to a new place that I have to deal with -- but I like the one-stop-shop concept.
More on Moodle after I've actually registered and played with it a bit.
Resources:
http://moodle.org/
Winter, M. (2006). Learning management systems for the workplace: A research report (Electronic copy). Christchurch, NZ: Core Education Ltd. Retrieved July 18, 2007, from http://www.tanz.ac.nz/pdf/LMS_Final.pdf


Keep Your Cool with Data-Mining Software

I admit, I got a bit behind with my posts here, but I plead that I published two the last time I wrote! I suspect I'll be doing the same again today. Work was insane last week....

Anyway, last week, besides being insane, some of my fellow EDTEC classmates and I attended a presentation on K12 Datamine, a software application created by MISi, located in Indianapolis, IN. This powerful application is a data warehouse with a wonderful user-friendly portal for schools to manage their day-to-day administrative "stuff", as well as all those reporting requirements that various governmental bodies have imposed on schools these days, from NCLB at the federal level all the way down to the reports the local school board wants to see.

Everyone knows that teachers need to be spending their time teaching, not sorting through piles of papers to gather statistics to meet some kind of reporting requirement, right? If teachers wanted to gather statistics, they would have studied that instead of whatever subject they elected to teach (with the possible exception of math teachers). So, when teachers complain, they probably have some basis for being grumpy. And, when you get right down to it, those same statistics show that when teachers spend more time teaching, the students do better. A dilemma....

K12 Datamine to the rescue! (I've added some screen shots so you can see what I'm talking about.)
The first thing I noticed at the top of the launch page was the DOE Reports section. The template for every report is automatically included in this application. Find the data, plug it into the template, and voila, your report is finished. Being a technical writer, I love templates -- they really speed up the work, and when it is just a matter of letting a computer application fill in all the blanks, even better. I want one for my job (just kidding -- I might find myself out of a job!).

At the other end is another reports section, Report Writer. This is for all those non-standardized reports that someone somewhere decides they absolutely have to have. Need attendance statistics for last semester? This place can create it, and save it for the next time someone wants that report. Need to know how many students got kicked off the bus this year for something? You can create a report for that, too. Good management is usually about getting the kind of information you need quickly. This application will do that.

So, ok, you're the vice principal in charge of discipline. Mrs. Jones calls in the middle of basketball practice and wants to know why Suzy Smith got detention because Suzy never gets into trouble. You haven't a clue what Mrs. Jones has to do with Suzy Smith...until you open up your K12 Datamine application and search by guardian. Not only is Mrs. Jones Suzy's guardian, she is a foster parent for a couple other children, as well as having her own child in the school.

Now, you're the teacher, and you really suspect that Jack Helpful is much brighter than his grades would indicate, and you wonder if there is any way to find out. Aha! If you could see his ISTEP scores compared with his grades over a specific period of time, like the last 2 or 3 years, maybe that would give you a clue. You pull out trusty K12 Datamine, and look at the comparison chart with grades and ISTEP scores. Just as you thought! He is passing English ISTEP with flying colors, and almost failing your class. Now that you have the answer, maybe you can come up with some ideas on how to motivate Jack in class.


So, what's the bottom line here? The president of MISi said $50,000 to $500,000 depending on the size of your school district. That sounds pricey...but I ran some numbers (making some assumptions, because I don't know how your school works, but you can compare your numbers with mine and tell me if my assumptions are any good).
These are my assumptions:
Each teacher spends (on average) 1 hour a day doing administrative stuff -- grading, taking attendance, handling discipline -- the data-gathering types of activities K12 Datamine addresses.
That teacher is paid $37,500 per year, and works 5 days a week, 40 weeks a year. That is 200 days, and 1,600 hours per year (at an 8-hour day), or about $23.44 an hour. So, you are paying that teacher $23.44 a day to spend that hour on administrative stuff. Multiply that by the number of teachers in your school district. My daughters' school district had 72 FTE teachers last year. That is $1,687.50 on administrative stuff PER DAY. Multiply that by the number of days in a school year (remember that number, 200, above?) = $337,500 per school year. Ouch! That makes K12 Datamine seem cheap!
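If you want to plug in your own district's numbers, here is the same back-of-the-envelope arithmetic as a little Python sketch. The salary, schedule, and 72-teacher figure are just my example assumptions from above, not universal numbers:

```python
# Back-of-the-envelope cost of teacher admin time, using the
# assumptions from this post (swap in your own district's numbers).
salary = 37_500            # dollars per teacher per year (assumption)
days_per_year = 5 * 40     # 5 days a week, 40 weeks = 200 school days
hours_per_day = 8
hourly_rate = salary / (days_per_year * hours_per_day)   # about $23.44

admin_hours_per_day = 1    # hours each teacher spends on admin (assumption)
teachers = 72              # FTE teachers in the district (my example)

daily_cost = hourly_rate * admin_hours_per_day * teachers   # $1,687.50
yearly_cost = daily_cost * days_per_year                    # $337,500

print(f"hourly rate: ${hourly_rate:.2f}")
print(f"district-wide admin cost per day: ${daily_cost:,.2f}")
print(f"per school year: ${yearly_cost:,.2f}")
```

Change the three assumption lines at the top and the script tells you what an hour a day of paperwork costs your district.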
I'm not suggesting you run right out and buy this product, but it seems to me it would save schools time and money that could better be spent on teaching.

You do the math, and tell me what you come up with for your school district.

Meanwhile, if you want a great analysis of it from the development perspective, here's what Becky Hammons had to say about it: k-12 data mining/MISi review

Resource:
Management Information Solutions, Inc., 11611 North Meridian Street, Suite 705, Carmel, IN 46032. http://www.misi.com/. 800-464-6191.
Contact John Hayden, and tell him Ball State's EDTEC 685 class sent you.


Monday, July 2, 2007

More on Rubrics

As I was reading some of my classmates' blogs, one caught my eye. In Reading Counts for Everyone, the author, Ms. McClung, provided a few links to a variety of rubric development applications and websites. In addition, the blog included a great article on rubric development, Scoring rubrics: What, When, and How? which explained the difference between analytic and holistic rubrics. I knew rubrics were becoming more popular. I didn't realize how much more popular -- at least based on the literature I've been seeing. This article provides a great explanation on how to write and score a rubric, regardless of the tool you use (or don't use) to develop one.

Resource
Moskal, B. M. (2000). Scoring rubrics: What, when and how? Practical Assessment, Research & Evaluation, 7(3). Retrieved July 2, 2007, from http://PAREonline.net/getvn.asp?v=7&n=3


Rubric Development Using rGrade: 72-Hour Kit

I am in charge of an emergency preparedness program for my church (The Church of Jesus Christ of Latter-day Saints). We recently conducted a needs assessment to identify how prepared the members of the congregation are in various areas of preparedness. One of the results of the assessment showed that most did not have a 72-hour kit, which is one thing that is considered the absolute minimum that members should have to be prepared for an emergency.
Based on this information, I decided to create a rubric using rGrade that would both help members know what things they should include in a 72-hour kit, and help them assess how much they already have done and what they still need to do. It can also be used to help them build a 72-hour kit using the criteria included within the rubric. I have provided a URL with the rubric for more information on 72-hour kits. Members could also refer to other materials they might identify pertaining to 72-hour kits. The idea is that members should not go spend a lot of money on a professionally-prepared kit, but that they can create one from the materials they may already have at hand.
I used three threshold levels: Beginning, Developing, and Accomplished. These represent the way I believe members should perceive their preparation; that they can range from beginning to accomplished, or somewhere in-between. I created 10 rows based on elements that should be included and/or considered when building a 72-hour kit, each worth 10 percent. Quite often the areas of Food/Water, Bedding/Clothing, and Personal Supplies are the ones which have the most focus, and the other areas are neglected. We have a tendency to emphasize the physical support, and forget about the emotional needs that people have. Therefore, I made each category equal in weight to emphasize the idea that all of these elements are necessary to be fully prepared for an emergency.
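For the curious, the equal-weighting idea can be sketched in a few lines of Python. The category names and point values here are my own illustration of the structure, not anything taken from rGrade:

```python
# Sketch of an equally weighted rubric: three threshold levels,
# ten categories, each worth 10 percent of the total score.
# Category names and level points are illustrative assumptions.
LEVELS = {"Beginning": 1, "Developing": 2, "Accomplished": 3}

CATEGORIES = [
    "Food", "Water", "Bedding", "Clothing", "Personal Supplies",
    "Fuel/Light", "Equipment", "First Aid", "Documents", "Emotional Needs",
]

def score(ratings):
    """Each category contributes equally; return percent of the maximum."""
    total = sum(LEVELS[ratings[c]] for c in CATEGORIES)
    return 100 * total / (len(CATEGORIES) * max(LEVELS.values()))

# A family that is Accomplished in the physical areas but only
# Beginning in the often-neglected ones still has work to do:
ratings = {c: "Accomplished" for c in CATEGORIES[:5]}
ratings.update({c: "Beginning" for c in CATEGORIES[5:]})
print(score(ratings))  # about 67 percent of the maximum
```

Because every category carries the same weight, no amount of stockpiled food can mask a neglected area -- which is exactly the point of the rubric.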
This rubric will be used as members participate in a workshop to help them prepare 72-hour kits. Each individual who attends the workshop will be asked to complete the rubric (in paper form) at the beginning of the session. This will help each person determine what areas need work. Then upon completion of the workshop, all participants can take a copy of their rubric with them to continue working on their 72-hour kits.

I believe the rubric is a good method to use for assessment of this nature because it provides those who use it with insights about the nature of being prepared -- that it encompasses multiple aspects of their lives -- as well as the idea that it doesn't need to be hard. I used the specific threshold names to demonstrate that while everyone is not necessarily accomplished in emergency preparedness across the board, we all have some areas where we are doing well, and other areas we can work on. I attempted to build it to give those who complete it a sense of accomplishment, and ability to succeed rather than a sense of failure. Using it as both a beginning point, and an ending point enables those who use it to see progress.
After creating the rubric, I searched for standards that might be relevant to the idea of being prepared for an emergency, or the principles that might apply to the elements included in an emergency kit. I was pleasantly surprised to find standards that might relate to each area within the rubric. I concluded that the 72-hour kit lesson and rubric might be one that could be included as an object lesson for many lessons around the standards I included beyond the particular use I intend to make of it.
For instance, in a lesson addressing the Indiana High School Standards > GHW.9.1 on natural disasters and interventions/preparedness (see link for description), part of the unit might include having the students assess their family's preparation for a natural disaster, and prepare a 72-hour kit for their family. The standards I have used in this rubric demonstrate that the idea of being prepared for an emergency, and knowing what should be included in a kit to help a person be prepared, can span multiple age groups, and could be adapted for any class, depending on the focus.
This approach relates to the ideas presented in the readings and video by Rick Stiggins on incorporating student input into the assessment process. He suggested that when students are involved in assessment from the beginning -- knowing what they will be assessed on, and working through the process as they learn -- they are better learners, and are more attuned to what they need to learn throughout the learning process. I think that as individuals use the rubric to identify how they can better prepare, they will have a better sense of what they need to do to be better prepared, and will understand the entire process more completely.
If the rubric were incorporated into lessons on nutrition, health, and safety, understanding what it takes to achieve sound nutrition, good health, and improved safety could give a foundation to what it takes to make sure those things happen in an emergency. Likewise, understanding and evaluating special needs, or transportation issues, are necessary to know how to take care of them in emergencies. This could give a broader introduction to the human experience, and a deeper understanding of why certain elements of that experience are important even in times without emergencies.
As I used the rubric to evaluate my own family's level of preparedness in this area, I felt it helped me identify areas of weakness that we need to work on, as well as areas of strength. I think using it as a beginning and progress assessment would be valuable for those interested in this subject.

Resource:
72-Hour Kit Rubric


Tuesday, June 26, 2007

Assessment Technologies and Real World Answers

While looking for some examples of assessment technology, I ran across the following:
The Development of a New Scientific Instrument: "Views on Science-Technology-Society" (VOSTS)
This article describes the method for developing an instrument that would assess what students already know about science, as well as their views about what they know. The authors state that many instruments designed to assess scientific knowledge assume that students understand the test questions in the same way the researcher does. This introduces questions of validity for the instrument. To address this, the authors decided to use a different approach to instrument development.
The authors wrote a series of questions to which 5,000 students responded with multiple-choice, Likert-scale, paragraph, and oral responses, to determine what prevailing beliefs students had about science-technology-society content. Based on these responses, they created an instrument with a variety of answer choices that could potentially reflect student views and knowledge. They stated that this empirical development methodology provides a more reliable instrument for accurately assessing what students understand, and could be used for both pre- and post-test scenarios. The instrument was developed in 1992 in Canada. The methodology was described in detail, and could be replicated to meet national or state standards and to reflect regional diversity.
Copies may be obtained by writing VOSTS, Department of Curriculum Studies, College of Education, University of Saskatchewan, Saskatoon, SK. S7N OWO, Canada.
It seems to me it could be highly useful in determining the "stickiness" of science teaching. I have often felt that simple multiple choice assessments don't capture the full understanding of a learning experience, and are limited by the individual writing them. I have often argued (silently) with multiple choice tests over a variety of responses that I felt did not reflect what the question was asking. The approach described here seems to address those issues well.

Resource:
Aikenhead, G. S., & Ryan, A. G. (1992). The development of a new scientific instrument: "Views on science-technology-society" (VOSTS) [Electronic version]. Science Education, 76, 477-491. Retrieved June 26, 2007, from http://www.usask.ca/education/people/aikenhead/vosts_2.pdf
