Qualitative data analysis references

I just had to share this photo I took in the Botanical Gardens in Manurewa last Sunday. Dene and I went back this morning so that I could find out what it’s called. It’s the ‘Bird Lady’, one of many sculptures being exhibited there at the moment. It’s truly eye-catching, isn’t it?

Below are two references which I found useful for qualitative data analysis. I’ve also included the points each one covers for ease of reference.

  1. Ryan, B. R. (2008). Methodology: Analyzing qualitative data and writing up your findings. http://eprints.nuim.ie/871/1/methodology.pdf
  • Why one should analyse data and not just assume that it can ‘speak for itself’;
  • What analysis entails;
  • Getting started: preliminary analysis and coding;
  • The nature of evidence;
  • Doing more advanced analysis;
  • Writing up your findings;
  • Discussion-oriented writing.

  2. Tips for analysing qualitative data. (May 2007). http://www.insites.org/CLIP_v1_site/downloads/PDFs/TipsAnalzQualData.5D.8-07.pdf

  • Reviewing your data
  • Organizing your data for analysis
  • Example – questionnaire data
  • Developing your codes
  • Coding your data
  • Finding themes, patterns and relationships
  • Summarizing your data
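
Looking at the two sets of points side by side, the core sequence is the same: organise the data, develop codes, code the data, then find themes and summarise. To make that concrete for myself, here is a minimal Python sketch of the coding and summarising steps. The responses and the keyword-based codes are entirely made up for illustration; real qualitative coding is done by careful reading and judgement, not keyword matching.

    # A toy sketch of organise -> code -> summarise for open-ended responses.
    # Responses and codes are invented; keyword matching only stands in for
    # careful manual coding.
    from collections import Counter

    responses = [
        "The tool was easy to use once I found the menu",
        "I got lost trying to save my lesson plan",
        "Easy to use, but saving was confusing",
    ]

    # Develop your codes (here, as keyword lists).
    codes = {
        "ease_of_use": ["easy to use"],
        "navigation": ["lost", "found the menu"],
        "saving": ["save", "saving"],
    }

    # Code the data: tag each response with every code whose keywords appear.
    coded = {
        i: [c for c, kws in codes.items()
            if any(kw in r.lower() for kw in kws)]
        for i, r in enumerate(responses)
    }

    # Summarise: frequency of each code across all responses.
    summary = Counter(c for tags in coded.values() for c in tags)
    for code, n in summary.most_common():
        print(f"{code}: {n} response(s)")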

Wimba class today

I do find the online Wimba classes extremely useful as you get immediate feedback on your work 🙂

Tom, Fred and I attended today’s session, where Bronwyn gave Fred very useful feedback on his raw data … yes, he has completed his data collection quick smart 🙂

Bronwyn also gave Tom and me her feedback on our questionnaire in Google Forms, which Tom has since revised further. Refining the instruments is an ongoing process, isn’t it? But that’s the way it goes …

We also talked about how the data collection was going in our group, and I feel we are making good progress. We each have 2-3 participants to collect data from, and logistics and practicality played a key role in how we divvied them up.

My ‘To do list’ includes the following:

  1. Continue with data collection. This week Tom completed the cover sheet, consent form and instructions for the triallists of the iLP. He also trialled it this morning with a user, and the input has been used to further refine the data collection instruments we’re using. What Tom is doing is extremely useful. My target is to complete mine by the end of next week at the latest.
  2. Look at the results section in the exemplars to get ideas on how to report our results for Assessment 2.
  3. Identify relevant literature to be used in the discussion of the results section.

Data collection Part 1


Reflection on Assignment 1

As I reflect on the process, it does seem we’ve come a long way since we first started. In this post I will share the following:

  1. Sections I was responsible for in completing Assignment 1
  2. Working in a team
  3. Giving feedback to other groups’ evaluation plans

Below I have itemized the sections I was responsible for in the planning process and how I went about completing them.

A. Sections

1. Background and rationale – I deliberately gave as much information as I could about the design and development of the iLessonPlan (iLP) because it is a prototype lesson planning tool still in development. Initially it was a concept in my head; then it took shape and became a concrete object, the tool itself. It is still in the alpha stage of development, so it is challenging for people on the outside to fully understand what it is. That was my reason for including as much information as I could about it in this section.

I do appreciate Bronwyn’s comments to keep the background directly related to the evaluation, because if you don’t you could lose your reader … so I trimmed the Background section and moved the relevant parts to the Appendices.

2. Process of using the iLP – The same argument as above applies. Tom and Mark gave very useful feedback; they thought it better to list the steps. I agreed, as previously I had described the steps in paragraphs. Again, the rationale was to make it reader friendly. Then Tom made it even better by including screenshots for each stage in the process. I’m very pleased with the end result, as ‘a picture says a thousand words’.

3. Aims and Evaluation Questions – If I remember correctly, either Tom or Mark or both had started off this section, and then I added my part.

I knew that usability testing was an important part of the formative evaluation, as the iLP is a tool which had not been trialled formally before. Reading the e-Learning Guidelines (Massey, 2007) provided valuable insight into how the aims could be structured. I felt quite excited as I was reading the Guidelines, thinking how brilliant it was to have a set of user-friendly guidelines to work from.

We started with something general and have now refined the aims and sub-questions further. Now there are two aims linked to two e-Learning Guidelines, as below:

a. ST9: Does the iLP help ESOL trainee teachers to successfully develop their knowledge of lesson planning?

b. MO1: Can the ESOL trainee teachers easily use the iLP?

Earlier I had included another aim and e-Learning Guideline as below:

SD3: Do ESOL trainee teachers gain knowledge relevant to planning an ESOL lesson?

I felt this was a relevant aim, but for pragmatic reasons we decided to stick to the two above. I might add it at a much later phase of evaluating the iLP … possibly next year, after it has been hosted on a website.

Table 1: An overview of the essential elements in a formative evaluation plan – I find working with tables one of the best ways to relate information logically and coherently, so I often use them to do just that before I start writing the report. Using tables also helps me to easily identify missing information and information that doesn’t quite fit in the boxes. So I was quite pleased to find that Tom, Mark and Bronwyn liked Table 1.

4. Methodology – It was interesting to read up on the different evaluation models. Both quantitative and qualitative data collection methods were included, though each has its limitations. Using both together provides richer data, which in turn provides stronger evidence for the results and conclusion.

I must remember to note things which work and do not work while collecting the data, and also how emerging problems were resolved; if they couldn’t be resolved, then to account for them in the interpretation of the findings. Hmm … the next few weeks will be very challenging for sure …

5. Sample and Instrumentation – Again, this section was started off by Tom and/or Mark, and I added my contributions. Due to the discontinuation of the CATESOL Programme, the target population for whom the iLP was developed was unavailable, so we had to resort to using trained ESOL teachers. A real challenge has been to highlight this point from the beginning of the evaluation plan. Mark and I did that, and I tidied it up further to make it more consistent.

6. Conclusion – In hindsight I should have included an evaluation component in the design and development of the iLP, as it is an important part of the design and development of the tool. The ideas which informed the design and development were conceived with one product/outcome/output in mind, but by incorporating evaluation, many of the issues could have been resolved much earlier. But then that is part of the creative process, isn’t it? Learning from one’s mistakes and making sure they don’t happen in future. That to me is what learning is all about. Evaluation is a cyclical process, and I do expect to re-evaluate the iLP at a later stage. To me this is an essential part of refining the tool.

B. Working in a team

As Fred said in his blog, there are advantages and disadvantages to working in a group. A great advantage is the multiple perspectives brought to the project and the collaboration, which enhances the quality of the work. Each member contributes differently, and to me it is important to ensure each member is valued and appreciated for the knowledge, experience and skills they bring to the task.

Communication is crucial to keep everyone in the loop and updated on what’s happening. Working on the wiki was initially a nightmare. We resolved this problem by having weekly f2f meetings, constantly emailing each other our contributions and versions of the plan as attachments, and posting the updated versions on the wiki so that Bronwyn had an idea of the collaboration taking place behind it.

Emailing also has its challenges, as the tone, language and message communicated may not necessarily be what the writer intended, and how it is received and perceived by the reader is another matter altogether. One option would be to use Skype, as this synchronous form of communication would help resolve some of the problems of asynchronous communication. But as I found out, MIT does not allow lecturers to have Skype installed for professional development (PD) purposes. So on the one hand we are encouraged to do more PD, but on the other we are not allowed the tools, in this case Skype, to do so.

On a personal note, I’m someone who constantly reflects on how things can be improved and further progress made, so I do tend to take it upon myself to initiate changes. I must remember that I’m part of a team and that consultation is crucial before action is taken.

Tom and Mark are excellent team members and I’m happy to be working with them on this assignment.

C. Feedback provided to other teams’ evaluation plans

a. Fred

b. Sandra H and Rosanne


Wimba session today

Hi

The YouTube video linked here is on using Google Forms as an alternative to Survey Monkey. The link is from Gary, whom I met today in the Wimba session: http://www.youtube.com/watch?v=H-lNffCvY3A
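
Gary’s tip got me thinking ahead to the analysis stage: Google Forms can export all the responses as a CSV file (via the linked spreadsheet), which makes it easy to pull them into other tools. Here is a tiny sketch of what that might look like in Python with pandas; the file name and column names are invented for illustration.

    # Hypothetical sketch: loading questionnaire responses exported from
    # Google Forms as a CSV. The file and column names are invented.
    import pandas as pd

    df = pd.read_csv("iLP_questionnaire_responses.csv")

    # Frequency counts for a closed-ended (Likert-type) item.
    print(df["The iLP was easy to use"].value_counts())

    # Set aside the open-ended comments for qualitative coding later.
    comments = df["Any other comments?"].dropna().tolist()
    print(f"{len(comments)} open-ended comments to code")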

I find the Wimba sessions extremely useful as there’s input from Bronwyn and other classmates. Today’s session was pretty good as well 🙂 We were joined by Tom, Gary and Fred. It was the first time my Wimba audio worked, so I was quite excited about using it. But then I resorted to using the phone as well, as the volume was much louder.

Some useful things that were discussed were:

  1. Clarifications about our draft evaluation plan – This was extremely useful, as Tom and I clarified with Bronwyn her comments and feedback on our wiki. At times our emails would be flying past at the rate of ninety miles an hour … 🙂 Having Tom there was particularly good, as he gave his insight on various things we had done in our plan.
  2. Problems working with wikis – We all agreed that wikis are NOT user friendly. Tom, Mark and I emailed each other what we had done and had f2f discussions once a week. Last Monday we had a whole-day marathon session getting our draft plan ready … what a feat that was! Bronwyn suggested Google Docs as an alternative to working with wikis. Hmmm, what about using it for the next assignment? Will have to check with Tom and Mark.

Got to go now … catcha later 🙂


Piha and Exemplars

I’m posting this again as it had disappeared into cyberspace earlier 😦

I had posted it on 25th Sept…

Hi

I took the picture at Piha last Wednesday on a cold, grey day. Lion Rock is on the left, and believe it or not, there were 4 surfers out … I was dressed in my winter coat :-0 It was nice to be out spending time with Dene and our good mates 🙂

We’ve been putting together our evaluation plan and it’s good to see it all come together … the more you read, the better you understand how the pieces fit together. I found Reeves and Hedberg (2008) extremely readable compared to their earlier 2003 work. Chapter 5 on formative evaluation gave good insights into what we’ve been doing. It’s like a jigsaw puzzle, isn’t it … the pieces fit together, and the more pieces you have, the more you see the whole taking shape. As my nephew Luc would say … like Lego, Aunty Al 🙂

Our evaluation questions started off quite general, but now they are more specific, as they need to be tied directly to the decisions, goals and instruments. Bronwyn’s comments and feedback on the other groups’ work have also pointed us in the right direction. The exemplars are useful too, in illustrating what the final products, i.e. the evaluation plan and report, should look like. Very often I like to look at what the final product should be so that I can focus on the outcome. I also look at the assignment criteria to ensure that all the requirements have been met.

Chapter 3, Planning and Managing Evaluation, by Reeves and Hedberg (2003) is excellent in elaborating, with clear examples, what needs to be included in the evaluation plan. The data collection methods have to be unbiased, and if they are biased then this should be mentioned as a limitation when reporting the findings. We have listed our questions, and I’m sure they’ll be tweaked further as we go along.

Reliability and validity are two important criteria to consider when using instruments to collect data. ‘Reliability deals with the consistency of measurement … and validity is about the degree to which an instrument achieves its aims’ (Reeves & Hedberg, 2008, p. 9). At times it is not possible to fully achieve both, but it is important to show that they have been considered when analysing and interpreting the findings. Data analysis, as I said in my earlier blog post, is extremely challenging. I’m still working on it 🙂
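
Out of interest, one standard way to put a number on the reliability of a set of Likert-style questionnaire items is Cronbach’s alpha, which measures internal consistency. This isn’t from Reeves and Hedberg, just a common technique, and the ratings below are invented sample data.

    # Cronbach's alpha as an internal-consistency (reliability) check.
    # Rows = 5 respondents, columns = 4 Likert items; all values invented.
    import numpy as np

    items = np.array([
        [4, 5, 4, 4],
        [3, 3, 2, 3],
        [5, 5, 5, 4],
        [2, 3, 3, 2],
        [4, 4, 5, 5],
    ])

    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
    print(f"Cronbach's alpha = {alpha:.2f}")   # ~0.7+ is often taken as acceptable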

References

  1. Reeves, T. C., & Hedberg, J. G. (2008). Evaluating e-learning: A user-friendly guide. (In press.) http://emit.manukau.ac.nz/bbcswebdav/courses/906.709-112A/ElearnEvalBook_Chapter7.pdf
  2. Reeves, T. C., & Hedberg, J. G. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.

Sunset in Spring

At the end of the day ...

Spring is one of my favourite times of the year. It gets gradually warmer, and there is a zing in the air as Pixel, my cat, bounces about the garden with ounces of energy.

Thinking back on the evaluation plan, we have come a long way from the first version, which has since gone through many revisions. Working as a group has meant that we can delegate the work; we also have a lot more resources at hand, as well as the knowledge, experience and expertise in our group and the different perspectives that each of us brings to the project.

I’ve been thinking about the process involved so far, and this is what it looks like to me in a nutshell:

1. What do we want to find out? Answers to evaluation questions.

2. How do we go about finding it? By using data collection methods:

a. Quantitative methods, e.g. closed-ended questionnaires, surveys, etc.

b. Qualitative methods, e.g. interviews, verbal reports, concurrent think-alouds, etc.

3. How do we analyze the data?

This depends on what we want to find out, so it should link back to the evaluation questions and the data collection methods.

a. Quantitative methods, e.g. questionnaires – frequency counts, percentages, averages, etc. (see the sketch after this list)

b. Qualitative methods, e.g. verbal reports, concurrent think-aloud protocols – use content and thematic analysis to identify the predominant trends in the data.

4. What conclusions can we make at the end?  

We look at the evaluation questions and answer them using the data that has been collected and analysed. Within the time frame given to complete our project, there are limitations on what we can do, and these should be stated in the plan.
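
To see what step 3a looks like in practice, here is a tiny sketch of frequency counts and an average for one closed-ended item; the ratings are invented (1 = strongly disagree … 5 = strongly agree).

    # Step 3a in miniature: frequencies and an average for one item.
    from collections import Counter
    from statistics import mean

    ratings = [4, 5, 3, 4, 4, 2, 5, 4]  # invented responses to one item

    counts = Counter(ratings)
    for value in sorted(counts):
        print(f"rating {value}: {counts[value]} response(s)")
    print(f"average rating: {mean(ratings):.1f} (n = {len(ratings)})")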

My understanding is that the process of planning the evaluation is dynamic and cyclical, so when we carry out the evaluation itself, it may not be possible to implement everything as stated in the plan. The reasons why should be given in the report, so it is good to note these as they arise, for easy reference later when we write the report.

Having a small sample to work with helps make the process of analysing data more manageable. There are several statistical packages available, e.g. SPSS (Statistical Package for the Social Sciences). Getting someone in stats to help you would certainly be a BIG help 🙂 I myself feel very challenged in this area 😦

The challenge in using qualitative data is ensuring that collecting, recording and analysing the data is done in a systematic and organised manner. As Bronwyn has said in http://wikieducator.org/Evaluation_of_eLearning_for_Effective_Practice_Guidebook/Planning, for this project it is best to keep the analysis simple.

I am a visual learner to some extent, and so I like to have tables displaying the data in relation to the categories identified in the evaluation questions. They help me to identify the predominant patterns in the data and make reporting the results easier. Our team is discussing how to analyse the data at this stage, so I will report on the progress in my next posting 🙂
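
As a rough illustration of the kind of table I mean, here is a sketch that cross-tabulates coded comments against the evaluation question they relate to (ST9 and MO1 from our plan). The codes and counts are invented.

    # Invented example: tally coded comments against evaluation questions.
    import pandas as pd

    coded_comments = pd.DataFrame({
        "question": ["ST9", "ST9", "MO1", "MO1", "MO1", "ST9"],
        "code": ["planning_knowledge", "planning_knowledge",
                 "navigation", "saving", "navigation", "terminology"],
    })

    # Rows = codes/themes, columns = evaluation questions.
    print(pd.crosstab(coded_comments["code"], coded_comments["question"]))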

References

Planning Evaluation (28 August 2011). http://wikieducator.org/Evaluation_of_eLearning_for_Effective_Practice_Guidebook/Planning

SPSS (18 August 2011). http://en.wikipedia.org/wiki/SPSS
