Yellow Ribbon evaluation plan


Evaluation Plan for the Youth Suicide Awareness Trust:

Yellow Ribbon NZ


Professor Ian Evans

Human Behaviour Solutions


Narelle Dawson

Registered Psychologist, DAIAS

August 7th, 2002

Revision: October 3rd, 2002

Introduction and Background

The Youth Suicide Awareness Trust was established in 1998 for the purpose of raising awareness of the unacceptably high rate of suicide among young people in New Zealand. The Yellow Ribbon NZ initiative is the educational arm of the Trust. The ultimate goal of the Trust is to reduce youth suicide, and the Yellow Ribbon programme is designed to increase the likelihood of school students recognising the need for help and asking for help when needed, and to strengthen the culture of peer support in schools.

Thelma French, a founding member of the Trust and Executive Director of Yellow Ribbon, approached Narelle Dawson to investigate the possibility of conducting an evaluation study of the programme activities, and she in turn recommended the involvement of Ian Evans. Ms Dawson is a Registered Psychologist in private practice (Dawson Assessment Intervention Advocacy Services, DAIAS) in Hamilton, with long experience of working with teenagers and adolescent concerns. Dr Evans is a Professor of Psychology (quarter time) at Massey University, Palmerston North, who has an extensive record of research in child, school, and family issues, and is also a consultant in the development and evaluation of clinical interventions (Human Behaviour Solutions).

Narelle Dawson and Ian Evans met with Thelma French and the Yellow Ribbon staff on July 31st and were impressed with their dedication and commitment. Thelma and her team were aware that an independent evaluation component was essential for ensuring that the programme is meeting its objectives and monitoring its effectiveness in a credible way. We were also made aware of various levels of controversy surrounding the programme, with one government agency, for example, arguing that the programme must demonstrate that it is “safe, effective, and evidence based, in a rigorous and scientific way.” We were asked to further develop an initial plan, and this proposal is in response to that invitation.

Evaluation Principles Guiding this Proposal

Evaluations can be either formative or summative, and in some circumstances can be both. Formative evaluations provide the project managers with information and data that can guide the activities of the programme and future developments. This may also include management decisions surrounding related themes, such as publicity activities, or ways of introducing the Yellow Ribbon programme in schools. Summative evaluations provide information that allows one to assess the outcomes of the programme, essentially asking the question “does this programme work?”

Evaluation often involves issues other than simple outcome assessment:

* There is the question of whether the approach developed is “evidence based”, that is to say whether the programme uses principles and procedures that have been empirically validated by other researchers.
* A somewhat more manageable version of the same question is whether a programme is based on “best or most promising practices”, which can usually be judged by accepted experts in a given field.
* Another issue is that programmes often have unintended or hidden outcomes that might be judged as favourable or unfavourable. A concern that has been raised, for example, is that “awareness-raising initiatives may impact [negatively] on those young people who are already vulnerable.” A related concern is that there may be trickle-on effects in which promotional materials and publicity have unintended negative effects for parents, families, or whanau.
* In addition, programmes have costs (financial, as well as others) that need to be considered in relationship to their benefits.
* Also there are the perennial questions of whether, if a programme can be considered successful, it is better than other alternatives; whether it can be sustained over time; and whether it remains true to its original objectives.
* Closely related to the previous issues is the degree to which the programme can be integrated with other programmes or government initiatives that have the same general goals; is Yellow Ribbon a valued component of the national strategy?
* A further important principle is whether the programme, if it is achieving some success, is doing so equally for all segments of the intended recipient social groups. Is the programme acceptable to the different cultural and ethnic groups in New Zealand and is it reaching all segments of the vulnerable population? Are cultural needs being addressed? Do families value the programme?

A number of leading commentators on suicide prevention note that there may be benefits in targeting only young people who are at particularly high risk. For example, suicide is thought to be related to significant levels of prior mental health needs. It would be potentially useful at some stage in the evaluation to introduce a more formal psychological measurement component that considers the nature of mental health symptoms in students. The Yellow Ribbon philosophy, however, would argue that while there may be a need for schools to be vigilant in recognising signs of significant emotional distress, there is an important principle in not stigmatising young people as having psychiatric syndromes before appropriate supports can be introduced. In other words, the message of it being “OK to ask for help” potentially applies to anyone at certain critical times, not just to those who might have a diagnosable mental disorder. This evaluation model, therefore, presupposes that there could well be an outcome of increased acceptability of professional referral, while at the same time avoiding the presumption that one first has to experience a mental disorder before help-seeking behaviour is appropriate. At the same time it is important to ensure that suicide is not portrayed as an inevitable, likely, viable, and predictable response to stress. This is a delicate balance, and data will need to be gathered as to how well this balance is being maintained in the “mythologies” around suicide that exist in targeted and non-targeted schools.

The potentially most difficult question to answer is the scientific validity of a programme, which deals with the question of whether a programme produces meaningful results and does so for the reasons stated—in other words the outcomes are not simply due to the injection of money, interest, energy, and so on, into a system. This in turn raises some interesting questions regarding what exactly might be the mechanism of change: if the Yellow Ribbon programme is successful in promoting help seeking behaviour and peer support, what were the critical, necessary, effective ingredients and is this immediate goal causally related to the ultimate goal of reducing youth suicide? (Despite its tragic personal consequences and tremendous social cost, the base rate for youth suicide is quite low, making preventative efforts uncertain.) This level of scientific credibility is rarely reached in the evaluation of programmes that are already ongoing, because there are practical difficulties in introducing proper controls when programmes are already in operation. Nevertheless, if the resources are available, sound quasi-experimental evaluation is possible.

Finally, in a document prepared by Dawson and Evans (2002) entitled Yellow Ribbon’s Commitment to the Principles of the NZ Youth Suicide Prevention Strategy, it was pointed out that young people themselves could be actively involved in the process of evaluation and development of the programme. Programmes that are youth-led and that encourage young people to take personal responsibility for the type of social conditions they would like to enjoy, are more likely to be shaped by young people’s own needs and interests. In this spirit of empowerment, we will endeavour to include high school students in the planning and gathering of information. This may be done formally in conjunction with social science classes, or informally in the spirit of “participatory research.” The same concept will apply to other influential figures in the educational system, such as educational psychologists, teachers interested in research (or completing higher degrees such as Masters and doctorates in education), and guidance counsellors.

The following proposal addresses some of these issues and offers different levels of information gathering that can answer different types of question.

Overview of Possible Evaluation Questions

The following general questions and issues need to be addressed at some point in time. Note that the categories of information described below are slightly arbitrary, i.e., there is some overlap between categories and they are not supposed to represent absolute groups of issues:

Short-Term or Immediate Outcomes

This refers to the sorts of changes in individuals or groups (such as schools) which the Yellow Ribbon programme targets as the direct benefits of the programme, regardless of whether or not they lead in turn to the ultimate desired outcomes described above.

* Does introducing the Yellow Ribbon programme in a school influence the school atmosphere or emotional climate in a positive way?
* Does the Yellow Ribbon programme result in young people actually asking for help?
* What form does this asking for help take and how widespread is it?
* Is asking for help an effective coping strategy for young people’s emotional distress?

Process Variables

Sometimes known as “moderator variables”, these issues relate to the process of change: does the programme have its influence (if any, as defined above) through mechanisms that can be identified? These intermediate changes resulting from the programme may not have any particularly desirable value of their own, but are considered important because they are the means to an end. If these process variables are not occurring, then the programme may still have good short-term or long-term outcomes, but we will not know why, and therefore cannot enhance its effectiveness or modify the programme for even better outcomes in the future.

* What actually happens when someone asks for help? Are young people initiating such requests actually receiving support from peers and adults, and is it perceived as positive?
* Are there some young people who do not respond to help seeking encouragement and are there ways of making this strategy more acceptable to those who are least likely to use this approach to coping with personal problems?
* Do those offering help/support, such as the ambassadors, know what the important issues might be that require serious professional input?

Programme Monitoring

This type of information is more like an audit: are things happening the way they are supposed to? Is the implementation of the programme true to the principles or ideals as described in programme materials, and are the methods used acceptable to stakeholders? Programme monitoring is more a question of measuring the actions of those implementing Yellow Ribbon, rather than assessing the effects on those who are the recipients of the programme.

* As the Yellow Ribbon programme expands, is it being delivered in a way that is true to its own principles?
* Do young people accept and value the ideas of the Yellow Ribbon programme?
* What are the opinions of adult and professional stakeholders regarding Yellow Ribbon in the schools?

Ultimate Goals

This refers to the achievement of the ultimate aims of the programme: does it work or really produce the results that are hoped for?

* Does the Yellow Ribbon programme lower the expected rate of suicide among the targeted group (currently high school students)?
* Does the Yellow Ribbon programme contribute directly to a reduction in problems (personal emotional distress) that sometimes lead to suicidal ideation (thoughts, plans) in students attending the targeted schools?
* Will exposure to the Yellow Ribbon message (It’s OK to ask for HELP) provide young people with additional useful coping skills that might reduce their risk of self harm in the future?


In addition to designing specific evaluation studies, it is the intention of this proposal to lay out a realistic timetable of when different studies can be implemented.

A. Short-term or Immediate Outcomes

Comment. A presumption of the YR model is that as a result of the programme, young people in school will be more inclined to seek help with a personal problem. Another issue is that when a student asks a peer (YR Ambassador) for help, does he/she receive it, and is it in fact helpful? Then, of course, we would need to document the consequences of asking for help: having asked for help, does the individual’s personal issue (problem, challenge) diminish in some way? Does learning to ask for help increase a young person’s resilience in any way?

Does introducing the “It’s OK to ask for HELP” theme influence the atmosphere of a school in ways that might be considered valuable for dealing with depression, other emotional needs such as family and relationship concerns, drug and health issues, or thoughts of self-harming behaviour? Aspects of atmosphere that might be important would be a reduction in bullying, a greater level of acceptance of students who are different in some way, reduced feelings of alienation in some students, and increased emotional sensitivity (“emotional intelligence”) among teachers and possibly senior pupils/student leaders. These are important variables to consider as they are thought to be among the known risk factors for youth suicide.

To investigate these questions it would be necessary to assess levels of understanding and awareness in a school prior to the YR programme being initiated. It might be possible to do this with focus groups in which the participants, pre-selected as emotionally mature students, would be asked to report on the general level of understanding, knowledge, and discussion around harmful, risky, and suicidal ideation among their friends. In other words, what is the current “mythology” around these issues that is believed by most teenagers?

Study 1: Young people’s exposure to the topics of help-seeking, coping, and suicide, and Yellow Ribbon’s influence on this.

This study will gather some information on what kind of exposure typical Kiwi teenagers have to the topic of suicide. In order to reduce the possibility of a reactive effect, we will limit this study to a number of small focus groups of high school students (5-8 per group). The focus groups will be conducted by experienced psychologists (probably Narelle Dawson and Ian Evans, or one or two additional people specifically employed for this task). Focus groups will be conducted in a number of different settings, such as Auckland, Hamilton, and Palmerston North, and will cover schools at various decile levels (1, 5, and 10, for example), thus allowing for the conduct of about 12 focus groups in all. We will conduct half of the focus groups in schools where YR has been introduced (where possible we will try to avoid YR ambassadors as participants), and half in matched schools where it has not. Focus groups would last between 90 and 120 minutes, refreshments would be served, and the meetings would take place on school grounds.

Participants will be selected by school counsellors and principals as students for whom it would be safe to discuss a variety of topics, including suicide. Participation would be invited after full information is provided and parental and student consent obtained. Prior to and after the focus group there will be a de-briefing session to ensure that the participants have not been upset or unduly influenced by the discussion that has taken place. Ground rules for the discussion of suicide in a respectful, honest, and non-melodramatic way will be laid out.

The format for the focus groups will be to address the following questions:

* How do young people like you cope with stress or emotional and family problems? What are some of the most appropriate ways and what are some of the ways that are maladaptive?
* Does anyone in the group know of anyone, among friends or family, who has attempted or completed suicide?
* What is your personal understanding of what suicide is? Why do you think it comes about?
* What do you know about suicide in New Zealand? (A check on understanding, and on information versus misinformation)
* How, when, and where, does suicide get discussed among your friends (and family)? Is it considered a taboo topic that shouldn’t be talked about? Do people joke about suicide or refer to it in a “black humorous” way? Do teachers ever talk about it?
* What do you hear about suicide from the media? Do you come across themes of suicide in literature, the media, song lyrics, web sites, etc?
* What do you think the dangers might be of young people getting a glorified image of suicide, or seeing it as an acceptable way of dealing with a problem?
* What do you know about which young people might be at risk? Can you always tell if someone is joking or serious if they make a threat? How should you respond to someone indicating suicidal thoughts?
* What do you think should be done to reduce attempted and completed suicide?
* What do you think should be done to support those who are affected by the suicide of someone known to them?

During the discussion, the group will be asked to come to some sort of consensus on the major issues and concepts that could be taken away from the groups. Two weeks after the focus group, each participant will be personally contacted and given the opportunity to comment on any additional thoughts, images, or feelings they may have had after the focus group. This will serve as a check on the debriefing process and allow individuals to raise any remaining issues that were not addressed in the focus group discussion itself.

Data analysis:

Careful notes will be kept during the focus group discussion, the groups’ summaries will be collected, and if the group consents the discussions will be tape-recorded and transcribed. The themes and content of the discussion will then be synthesised, and a reliability check will be conducted whereby psychology graduate students, naïve to the issues and the nature of the schools, will be asked to review the raw data and confirm the synthesis provided. An overall summary of issues and themes will provide some insights into ideas young people in New Zealand have about suicide.
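If the thematic synthesis is independently coded by two raters, the reliability check could be quantified with a chance-corrected agreement index such as Cohen’s kappa. A minimal sketch in Python; the theme labels and codings below are entirely hypothetical and serve only to illustrate the calculation:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters' categorical codes, corrected for the
    agreement expected by chance given each rater's category frequencies."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical theme codes assigned to ten focus-group excerpts by two judges
judge1 = ["stigma", "media", "peers", "media", "stigma",
          "peers", "peers", "media", "stigma", "peers"]
judge2 = ["stigma", "media", "peers", "stigma", "stigma",
          "peers", "media", "media", "stigma", "peers"]
print(round(cohens_kappa(judge1, judge2), 2))
```

Values above roughly 0.6 are conventionally taken as substantial agreement; low values would indicate that the coding scheme needs refinement before the synthesis is trusted.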

Judges naïve as to which schools each focus group came from will rate the consensual opinions of the students in the focus groups in terms of the following criteria:

* Sophistication of understanding about the causes of suicide
* Knowledge of strategies existing within the school and among individual students
* Degree to which understanding comes from reliable and respected sources, such as teachers, versus from the media

This will allow comparison of YR versus non-YR schools.


The major time and effort required to conduct this study lies in contacting schools, nominating participants, and obtaining fully informed consent. The focus groups will need to be conducted by the principal evaluators; however, we may be able to appoint a few other experienced psychologists to conduct some of them. Refreshments will be required, as will tape recorders with grab mikes, butcher’s paper, and pens; it would also be appropriate to provide the participants with some reward for their time, but not to promise this ahead of time, so as to reduce any sense that they may have been coerced into participating.

Study 2: Effects of Yellow Ribbon on school atmosphere and students’ perceptions of connectedness with the school

Ideally this survey should be carried out before and after YR has been introduced to a particular school. It would also strengthen the design if the measure of school atmosphere were administered twice to schools who have requested Yellow Ribbon, but for whom it has not yet been possible to set up the programme. We would need at least two schools in each condition.

Measure of school atmosphere/climate: various measures have been developed by Moos and his colleagues. We would review the available measures and select one or design our own based on the assumptions of the YR model. Atmosphere consists of a huge variety of possible dimensions, but common ones are the “mood” of the school, the degree to which it is seen as fair and supportive, the involvement of students in important decisions, and respect shown for students’ opinions. The dimensions are measured by a questionnaire format in which aspects of school life are judged by the students. Examples of items are as follows:

Each item would be rated on an Agree ………………………… Disagree continuum, with negatively worded items scored in the reverse direction:

* This is a school in which bullying is dealt with firmly by teachers
* In this school most of the students feel that teachers and counsellors are very approachable
* Students in this school are listened to if they have concerns
* Generally I look forward to coming to school each day
* Teachers in this school tend to favour their own “pet” students (scored in negative direction)
* I know of another high school that I would much rather attend if I could (scored in negative direction)
* Although this school has rules, they are always fair and reasonable


At the same time we would administer the ERS-Youth Self Report (Kovacs, 2000), which is a measure of major coping strategies used by young people when “sad or upset”. We will also administer the BarOn EQ-i:YV (Bar-On & Parker, 1997), a measure of young people’s emotional responsiveness.


The atmosphere questionnaire would need to be designed and piloted and put into a format that is attractive to teenagers; the other measures are available and are currently being used in a doctoral thesis supervised by Evans. Administration is fairly routine, and possibly class time can be used. A good survey questionnaire would only take about 15-20 minutes to complete. Most of the work would involve obtaining school agreements (there might be some teacher or principal opposition as such measures look as though they are encouraging pupils to evaluate the school staff), administering the questionnaires, and ensuring their return.

Data analysis is quite straightforward. Since “atmosphere” consists of a number of different dimensions, a questionnaire such as this yields factor scores and there may be good reason to think that there would be a change in one or more of those factors with the introduction of Yellow Ribbon to the school.
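As an illustration of how a change in one factor might be summarised, the sketch below assumes the same students complete the questionnaire before and after YR is introduced, so pre-post differences can be paired; the factor name and all scores are hypothetical:

```python
from statistics import mean, stdev

def subscale_change(pre, post):
    """Mean pre-to-post change on one atmosphere factor, plus a paired
    effect size (mean difference divided by the SD of the differences)."""
    diffs = [after - before for before, after in zip(pre, post)]
    return mean(diffs), mean(diffs) / stdev(diffs)

# Hypothetical "approachability/support" factor scores for eight students
pre  = [3.1, 2.8, 3.4, 2.9, 3.0, 3.2, 2.7, 3.3]
post = [3.6, 3.1, 3.5, 3.4, 3.2, 3.8, 3.0, 3.4]
delta, effect = subscale_change(pre, post)
print(f"mean change = {delta:.2f}, effect size = {effect:.2f}")
```

The comparison schools would receive the same computation over the same interval, so that any change attributable to YR can be separated from general drift in school atmosphere.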


Questionnaires would be administered before YR is introduced and again once it is well established in the school, say about 6 months later. The same time frame would be used for schools not receiving the YR programme. The questionnaire could be routinely administered before and after each school receives the YR programme, and the matching of schools and obtaining the balance of deciles etc, could be done later as data are accumulated.

Study 3: Help-seeking behaviour and social support: What happens when a young person asks for help and how does it make them feel?

If someone does seek help for a problem, what sort of reaction do they typically experience? Is the help forthcoming? What sort of help is it? Did the help contribute to the solution of the problem?

These questions can be addressed either by a series of in-depth interviews, based on a random selection of young people, or by a questionnaire designed for ratings of help-seeking experiences. Both strategies may need to be used.

For a measure of programme outcomes, this study would be conducted in schools where Yellow Ribbon has been in place for some months, and the results compared to comparable schools without Yellow Ribbon.

The questionnaire needs to be developed, although we will first review the literature to see if any suitable measure currently exists. The basic strategy will be to ask the teenager to think back to the last time he/she experienced emotional distress (end of a relationship, loss of a loved one in the family, unexpectedly poor grades, family conflict, etc.), to identify a specific incident, to think about the situation, and to recall the details. This image would be followed by questions such as:

* Who did you first talk to (share information with) about your concerns?
* How helpful was this person? (Did they preach or lecture you, or did they listen and provide constructive support?)
* What could this person have said or done that would have been the most helpful to you at that time?
* What other people did you seek help from?
* What were the results of that experience?
* Were there any individuals you would have liked to have asked for help but felt that you could not? Why not? (What were the barriers to seeking help from these other people? List each person this is relevant to.)
* If someone came to you with a similar problem, how well do you think you would be able to offer them constructive support?

Resources: This type of questionnaire requires open-ended responses which need to be classified and then counted to cover the range of possibilities. Once this measure has been developed and refined it can be used as a routine measure to evaluate the changes that might take place in a given school as a result of the YR focus on peer support. Changes might be that peer support is more available and acceptable, that peer support is more effective, and that peer support is just one part of a hierarchy of help-seeking behaviours leading to professional involvement.
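The classify-then-count step might look like the following sketch, in which the coding scheme, keywords, and responses are all hypothetical; in practice the classification would be done by trained raters rather than by keyword matching, which is used here only to make the tallying concrete:

```python
from collections import Counter

# Hypothetical coding scheme mapping keywords to help-source categories
CATEGORIES = {
    "friend": "peer", "mate": "peer", "ambassador": "peer",
    "mum": "family", "dad": "family", "sister": "family",
    "teacher": "school adult", "counsellor": "school adult",
}

def code_response(text):
    """Return the first category whose keyword appears in the response;
    naive substring matching, purely for illustration."""
    lowered = text.lower()
    for keyword, category in CATEGORIES.items():
        if keyword in lowered:
            return category
    return "uncoded"

responses = [
    "I talked to my best friend first",
    "Told my mum about it",
    "Went to the school counsellor",
    "Kept it to myself",
]
tally = Counter(code_response(r) for r in responses)
print(tally)
```

Category frequencies collected before and after YR is introduced would show whether peer support has become a more common first port of call.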

Study 4: Stories of success

Schools that have introduced YR will be canvassed on an annual basis with an invitation to all students to come forward if they have an important story to tell about their experience of facing an overwhelming problem that was alleviated by asking for help. It would be important not to appear to be fishing only for success stories; we would simply need to create an opportunity for someone to come forward.

After possible pupils with such a story have been identified, they would be interviewed by Dawson or Evans, and the interviews tape-recorded and later transcribed. The interview protocol would be written so as to provide some corroborating evidence that the perceived benefit to the young person came about because of asking for help, or some closely related feature of the YR programme. For example:

* Why have you come forward now to tell your story? (What do you expect to gain from telling your story?)
* In your own words, what actually happened to you? (Has it happened before? Have you had these feelings for some time?)
* Explain what it was you did that allowed you to address your problem.
* What was it about your friends/counsellor, etc., that made a difference to the way you were coping with the problem? (Can you recall specific words said? How others made you feel?)
* What do you see as the benefits of YR?
* What other things take place at school that might have been as useful or more useful if you had made use of them? (What other things have you done in the past to cope with problems?)
* What have you learned from this experience? (What would you like to tell other students? How might you cope more effectively in the future?)

B. Process Variables

Comment. To have any influence, the message of the YR programme must be received, heard, and understood; to have any impact, it must also be accepted, which means that young people should hold a positive attitude towards the programme. Who does and does not warm to the ideas of the programme after being exposed to it? Is it actively rejected by any type of student or group of students? This is an important question that relates to the psychological concept of individual differences. Young people have different personalities, and there may be some individuals for whom the YR message is very helpful and some for whom, in its present form, it is less so. We need to know more about how different students are influenced by different strategies.

Since the Yellow Ribbon concept was originally an American one, and since New Zealand culture has traditionally been more focused on self-reliance, is the entire exercise of asking for help something that comes less readily to Kiwis of all kinds than to young people in the US? And if this is so, are there some other ways the essential message can be modified so as to increase its impact on Kiwi youth? What exactly is the meaning of “It’s OK” to Kiwi youth? Is that phrase interpreted as permission giving, or something deeper, such as encouraging interdependence? Do teachers, when asked for academic help, not encourage pupils to try to solve a problem on their own? What kind of help is it OK to ask for and when should one attempt to use one’s own resources first? Or is the actual meaning of asking for help something more like being willing to self-disclose, open up one’s feelings, talk about personal issues with friends or peers, and again, is that an acceptable behaviour across all youth cultures in New Zealand? It will be necessary to find out if there is a group of students for whom the message is negative in some way, possibly because of cultural or religious issues/beliefs.

It would be useful to know how often students do in fact “ask for help” from friends, ambassadors, teachers, parents, etc. What do we really know about the barriers to making such requests: pride, shame, distrust? Is there a problem with some young people not recognising that they have a problem and need help, rather than not being able to ask for it; is it more a matter of insight and the kinds of coping strategies that they are used to using? There is a large literature on children’s coping strategies and a very large literature on the sorts of social networks children have that allow them to talk about their issues. There are gender differences in the degree to which girls and boys talk about their own personal feelings. There is also the problem of relational bullying, in which girls, predominantly, torment each other with gossip, innuendo, and manipulation of relationships. Is there a risk that opening yourself up to the wrong person could radically change your social status if confidences were not kept?

Ambassadors are told they are not counsellors or therapists, but in fact peers very often do provide something very similar to counselling. Just how good are ambassadors at dealing with really tricky situations that might come to their attention? What sort of advice do they actually give, and are there some students who are becoming ambassadors because of emotional needs of their own? Conversely, are highly successful, competent, and popular students becoming ambassadors who, as a result of their own status, are not easily able to recognise problems in others, or are not approached by others as potentially sympathetic listeners? For example, does a very popular, successful young person really understand loneliness, rejection, and alienation?

Study 5: Barriers to help-seeking behaviour

This study will further investigate the behavioural details of help-seeking behaviour in adolescents of this age range. Interest focuses on what sorts of issues or problems young people feel comfortable discussing with peers. What are some of the barriers to help-seeking: lack of trust; not having close friends to confide in; not thinking the YR ambassadors could be helpful; not liking or trusting the YR ambassadors; others?

One issue is that even if young people are encouraged to ask for help, the most vulnerable will not do so. Possible reasons for this are:

1. distrust of peers and social isolation
2. prior history of help-seeking not being successful
3. shyness and anxiety about seeming to be weak
4. not having the skills about who to approach or when to approach someone
5. strong cultural beliefs in self-reliance
6. not being aware of the need for support
7. being too depressed to initiate a new positive action

These possible barriers to help-seeking could be examined in a more intensive programme on asking for help that might be conducted by school counsellors under the supervision of the principal evaluators. The counsellors would be paid for their time, or they might be counselling students still in training. (We will discuss this project with Susan Webb, the Director of the counsellor training programme at Massey University.)

Counsellors would be asked to conduct a series of small groups of volunteer students interested in enhancing their “interpersonal skills”. Participants will be told that their work will enable other students to be more willing to ask for and receive peer support in the future. Groups would be school-based and conducted during convenient school hours or after school. Each group would run for approximately four sessions and would focus on the value of social support from peers or family. Participants in each skills development group (approximately 7-10 students per group) would follow the protocol listed below:

Session 1: introduction to the purpose of the group. Administration of the questionnaire described in Study 3. Questionnaires are gathered up for later scoring and participants are asked to discuss the sorts of things they reported in the questionnaire. The group leader organises these reasons on a whiteboard, followed by discussion of the reasons among the group.

Explanation of first homework task: sometime during the next week, tell someone something that you would normally not disclose. Observe their reaction and report back to the group how the other person responded and what you felt about it.

Session 2: participants report back to the group on their homework assignment. If the assignment was not done, emphasise that it is hard to do something new. Start to discuss common barriers. Role-play some approaches to asking for help.

Assign next homework task: ask a close friend whether it is OK to ask for help and bring back his/her answer to the group, whatever it is.

Session 3: participants report on their homework assignment. Discuss any negative attitudes revealed by participants’ friends. Use brainstorming procedures to come up with solutions to possible barriers to help-seeking. Plan the final homework assignment: sometime during the next week, actually ask someone for help with some aspect of your life. (Use a relapse prevention strategy in which failure to complete the homework task is not a failure but an opportunity to learn more about yourself and your needs.)

Session 4: report back on the homework exercise; discuss successes and failures regarding help-seeking. Make a list on the whiteboard of all the factors that seem to make help-seeking a good thing; write a list of all the factors still remaining that people think are hard. Ask the group to come up with strategies to overcome the hard parts. Summarise the group sessions and review what has been learned. Ask the group to suggest ideas that would make it easier for other young people to ask for help.

The data would be the reasons given by the students as to why help-seeking has barriers. These barriers would be collated and the frequencies of the different issues reported would be recorded. Other students’ responses to help-seeking behaviour would be recorded as evidence of what sorts of reactions young people are likely to encounter. These reactions would be catalogued and, on the basis of them, a model developed of help-seeking behaviour: why it is difficult, what barriers there are, and how other individuals respond to requests for help. This model would then yield questions to be posed in the student evaluation questionnaire on their Yellow Ribbon training experience.

Study 6: Students’ awareness of the Yellow Ribbon programme

This study will measure the short-term impact on students’ attitudes towards and understanding of the YR programme.

The measurement instrument to be used will be the questionnaire developed by Dr Carolyn Coggan and her team at the Auckland University Injury Prevention Research Centre. The study will be conducted in consultation with Dr Coggan. This is a very useful questionnaire and there are already data based on the introduction of the YR campaign at Howick College. In that evaluation there was a very high return rate and the questionnaire yielded some interesting basic data that can be used as a benchmark against which to compare the introduction in other schools over time.

Thus the same questionnaire will be used in order to allow comparisons over time. Some minor modifications will be made, especially to the typing format and to make sure the form is suitable for universal use (not restricted to one form or one school). A more important modification would be to consider asking students which cultural group they identify with, how old they are, and whether they have any religious affiliation. The questionnaire instructions would be made more explanatory and the format more teenager-friendly.


The administration of this survey would become routine for every school in which YR is introduced. As part of the packet of information provided to schools when they apply for the YR programme, schools would be informed that a follow-up questionnaire would be administered, and their support would be requested in ensuring that the questionnaire reaches as many students as possible.

We would need to keep track of the questionnaires, ensure that they are returned, and estimate the number returned compared to the number distributed and the eligible number of students in the school—which gives us information on return rate.
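
This bookkeeping is simple enough to automate. As a minimal sketch (using hypothetical figures; the function name and field labels are ours, not part of the IPRC procedure), the relevant quantities can be computed as follows:

```python
def return_rates(distributed, returned, eligible):
    """Return-rate bookkeeping for one school's questionnaire round.

    distributed: questionnaires handed out
    returned:    questionnaires received back
    eligible:    students enrolled who could have taken part
    """
    return {
        "return_rate": returned / distributed,   # of forms given out
        "coverage": distributed / eligible,      # of the school reached
        "overall_rate": returned / eligible,     # of all eligible students
    }

# Hypothetical figures for one school
rates = return_rates(distributed=620, returned=530, eligible=700)
```

Reporting all three figures, rather than the return rate alone, makes it visible when a low overall rate is due to distribution rather than non-response.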

Data analysis:

Initial data analysis would be descriptive, using the same reporting strategies as the IPRC report, for benchmarking purposes.

Second, however, we would conduct cross-tabulation analyses of the responses. The IPRC report did this for boys and girls, and for age ranges. However, it would be useful to try to predict, from other responses on the questionnaire, the characteristics of those students who do not seem to favour the YR programme, do not know about it, or do not feel comfortable asking for help.
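
A sketch of the kind of cross-tabulation intended, using only hypothetical records and invented field names:

```python
from collections import Counter

def crosstab(records, row_key, col_key):
    """Cross-tabulate two questionnaire fields, e.g. gender by
    'feels comfortable asking for help' (yes/no)."""
    counts = Counter((r[row_key], r[col_key]) for r in records)
    rows = sorted({key[0] for key in counts})
    cols = sorted({key[1] for key in counts})
    # Counter returns 0 for unseen cells, so the table is complete.
    return {r: {c: counts[(r, c)] for c in cols} for r in rows}

# Hypothetical responses
records = [
    {"gender": "girl", "comfortable": "yes"},
    {"gender": "girl", "comfortable": "no"},
    {"gender": "boy", "comfortable": "no"},
    {"gender": "boy", "comfortable": "no"},
]
table = crosstab(records, "gender", "comfortable")
```

The same function can be re-used for any pair of items (age band by programme awareness, and so on) once the questionnaires are in tabular form.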

In addition to all the items on the current IPRC questionnaire, we might add just a few items to help identify more about those young people who seem to be uninfluenced by the YR campaign or who might be negative about it. In particular we would look for one or two items to add that would indicate whether the student feels emotionally connected to the school, as a study by Resnick et al. (1997) of 12,000 American teenagers indicated that emotional connectedness was the best (and the only school-related) predictor of suicidal ideation.

C. Programme Monitoring

Comment. Sometimes called “treatment integrity”, programme monitoring involves obtaining some estimate that the programme is being delivered as stated (in a programme guide or procedural manual). Of the four categories of information proposed, this is the easiest to carry out, since it falls mostly within the operational procedures of Yellow Ribbon. Programme integrity can be evaluated by such methods as summarising the ambassadors’ evaluations of their training, monitoring the introductory material of the coordinator, or checking with guidance counsellors that the programme has been introduced as promised.

Part of programme monitoring requires understanding of the face validity of the programme to important stakeholders. What is the impression of the programme held by principals, teachers, and school counsellors, for example? Does the introduction of YR into their school make their task easier or harder?

What are the impressions of the YR programme by major cultural/ethnic groups in New Zealand? What are the opinions of educational leaders and youth workers in Maori, Pacific nation, and Asian groups? What has been the involvement of young people from these groups, thus far, in the Yellow Ribbon programme?

Study 7: Ambassador training evaluation

These evaluation forms have been routinely gathered after all Ambassador training programmes. A considerable number are currently held on file but have never been analysed. They are a useful source of data; the forms need to be summarised and the data put into a usable form.

In order to quantify the answers in a more manageable way, we will develop a simple coding strategy for the open-ended answer items. This will be done by first reading a large number of responses and listing the various themes reported. From then on, individual questionnaires can be scored against these categories.

For example, for Item 1, “What made you decide to become a YR Youth Ambassador”, we might reduce all the answers to the following categories:

Perceived social need, e.g., “There are a lot of unhappy kids in this school” “Something has to be done about teen suicide”

Influence of friends, e.g., “My best friend encouraged me” “My rugby team said they were going to do it”

Personal gain, e.g., “I thought it would be fun,” “There are neat handouts”, “I was kind of bored”

Publicity, e.g., “Saw it advertised” “Saw Michael Jones on TV”

Other, e.g., “Don’t know” “I just came along”

If these categories cover all possible answers, then the distribution of motives and the change in this distribution can be studied over time.
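
Once the coding categories are fixed, the tallying can be partly mechanised. The sketch below assumes hypothetical keyword lists of our own invention; in practice the categories and keywords would come from the initial reading of responses, and ambiguous answers would still need to be coded by hand:

```python
from collections import Counter

# Hypothetical keyword lists; the real categories would come from
# reading a large sample of responses first.
CATEGORIES = {
    "perceived social need": ["unhappy", "teen suicide", "something has to be done"],
    "influence of friends": ["friend", "team"],
    "personal gain": ["fun", "handouts", "bored"],
    "publicity": ["advertised", "tv"],
}

def code_answer(answer):
    """Assign a free-text answer to the first matching category,
    falling back to 'other'."""
    text = answer.lower()
    for category, keywords in CATEGORIES.items():
        if any(k in text for k in keywords):
            return category
    return "other"

answers = [
    "There are a lot of unhappy kids in this school",
    "My best friend encouraged me",
    "Saw Michael Jones on TV",
    "Don't know",
]
distribution = Counter(code_answer(a) for a in answers)
```

Running the same tally over successive training cohorts would give the distribution of motives over time directly.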

Belief in the programme can also be a critical factor and the reasons for the belief can be coded. Optimism/pessimism by Ambassadors can be related to the school, the students’ ages, and also to the person doing the presentation or training.


The instrument has already been in use for some time and the current data need to be analysed. For future use there are some features of the instrument that need to be looked at:

As it is not anonymous, respondents may be less candid or truthful in their replies.

Those Ambassadors willing to be contacted again should put their name and address on a separate slip of paper and hand that in at the same time they return the questionnaire.

The questionnaire needs a brief explanation that the information will be used to improve training and the programme in general.

Data analysis:

Once some of the forms have been perused and we have developed a scoring system, the remainder of the questionnaires could easily be scored by an assistant, such as a university student.

Study 8: Ambassador experience survey (end of year or of ambassadors’ term)

We will conduct a survey of the experiences of ambassadors who have been functioning in schools in which the Yellow Ribbon programme was introduced. For some of these young people, 2002 will be their last year in the programme. Surveys will be anonymous in order to try to get as accurate a self-appraisal as possible. Themes for the questionnaire will cover:

Satisfaction with being an ambassador

What I have learned from being a YR ambassador

Highlights or specific experiences that were very meaningful

Thoughts about how the programme could be improved

This survey will be designed and piloted as soon as possible.

Study 9: Independent observation of YR programme segments (programme integrity check)

Two independent observers will sit through and observe the YR staff (coordinators) implement each aspect (segment) of the programme. It will be important to emphasise to the staff that this is not an evaluation of them as individuals but an audit to ensure that critical features of the Yellow Ribbon programme are being effectively and accurately communicated.

Two observers are needed so as to ensure that the ratings of the implementation are not simply due to the personal views of the observer; one observer could be from the evaluation team, and another could be someone from the school, such as a teacher. Observers will rate the presentations according to specific criteria and note down any information that might be useful for the future. The rating criteria will be developed by Thelma French and her staff, but as an illustration it is envisaged that it will look something like this:

Observational Rating of Programme Implementation

A. Presenter explained the purpose of the programme in a way young people could understand and appreciate:



B. Presenter was enthusiastic:



C. Students were attentive, not restless:



D. Presenter emphasised the “It’s OK to ask for HELP” theme:



E. Presenter mentioned “suicide” only in the context of how the programme started in America and New Zealand:



F. If specifically asked about suicide, presenter discussed the topic in a factual way, without glamorising it, indicating that suicide is serious but not an epidemic among high school students:



Students asked the following interesting questions (reply by presenter given in brackets):

1. (response: )

2. (response: )

Impressions of the presenter (underline all that apply):

prepared confident student-oriented respectful gentle honest warm lively facilitative sensitive responsive culturally aware professional authoritative

[Possible positive score: 14]

Impressions of the students (underline all that apply)

serious mature enthusiastic reflective engaged confident respectful focussed genuine committed curious happy open-minded thoughtful

[Possible positive score: 14]

Rating of segments of the launch:

1. Introductions (should be interesting, culturally sensitive, and convey critical information)

Excellent (above expectations)   Good (meets criteria)   Fair (needs some work)   Poor (revisions needed)


2. Activities (should include everyone, be fun, and be age-appropriate)

Excellent (above expectations)   Good (meets criteria)   Fair (needs some work)   Poor (revisions needed)


(other segments as needed)

This sample rating form might be used for the activities that launch a particular programme. Each component of a launch might be rated by the observer.

Ratings of implementation will be gathered routinely and will be scored for the general level of success. Specific criticisms or comments can be fed back to staff in a positive way. Scores will reflect:

Was the programme implemented according to stated criteria?

Was the programme positive?

Was the programme well received by students?

Did the presenter (co-ordinator) seem professional, knowledgeable, and able to communicate with the participants?

Subsequently these scores might be related to the outcomes and apparent success of the programme in that particular school.

Study 10: Views of key stakeholders

The evaluation effort will include focus groups of key stakeholders. Group members will be encouraged to express any views or attitudes they like. They will first be asked to comment on Yellow Ribbon as they have experienced it; they will then be shown various Yellow Ribbon materials and asked to comment on these.

The following groups of key stakeholders will be represented in the study:

* High school principals, deputy principals, teachers, and guidance counsellors from YR schools
* High school principals, DPs, teachers, and guidance counsellors from non-Yellow Ribbon schools
* Mental health professionals (social workers, psychologists, child psychotherapists) from Child, Adolescent, and Family Mental Health teams, CYFS, and SES
* Medical practitioners (GPs, child psychiatrists, paediatricians)
* Parents of teens in Yellow Ribbon schools
* Parents of teens not currently in Yellow Ribbon schools
* Maori professionals and youth workers
* Pacific nation professionals and youth workers
* Asian professionals and members of groups working with Asian and migrant students

Focus groups will be conducted by the evaluation team, with additional members recruited for the Maori, Pacific Island, and Asian groups.

Focus groups will last 90-120 minutes and will approximately follow the protocol of the focus groups already described. Participants will be volunteers, approached and asked to participate in an open process. Confidentiality will be assured and the ground rules explained.

The following questions will be addressed:

What do you know about YR?

What is your involvement with the YR programme?

What are your impressions of the programme?

Are there any negative features of the programme that concern you?

How might the programme be improved?

Once that discussion has concluded, various Yellow Ribbon materials, brochures, and publications will be passed out to all group members, who will be asked to look them over for 5-10 minutes.

OK, now that you have looked at these materials, what impresses you most positively about YR?

What impresses you the least or causes you concerns?

What is your overall impression of the materials examined?

What is your final judgment about the value of Yellow Ribbon in our schools/communities?

Data analysis: Discussion will be recorded in notes or on tape recorders if the group agrees and summarised by the evaluation team according to general themes emerging.

Resources: Since experienced evaluators will need to conduct the focus groups, this study will be labour intensive. Organising the focus groups, finding neutral locations in possibly different parts of the country, and covering any expenses incurred by focus group members in getting to the meetings will entail some costs.

Study 11: Participants’ and schools’ satisfaction survey

This component is designed to obtain some sense of how well received the entire programme has been at each individual school. Satisfaction with the programme is important for ensuring that it has a positive reputation and will be continued, but should not be confused with effectiveness or the degree to which a programme has met its objectives. This survey will be in the form of a questionnaire that can be administered to teachers and students in a given school, after the programme has been in place for a set period of time, say one year.

As this is an attitude survey, it will consist of ratings by the respondents as well as some other means of judging how the Yellow Ribbon programme has been conceptualised by those exposed to it. Attitudes are rarely simply good or bad, positive or negative; they are dimensional, and semantic differential methodology can be used to assess these attributes or perceptions. In this method a pair of polar-opposite adjectives is presented and the respondent is asked to indicate on a 7-point scale the extent to which each construct applies. Examples are given below:

School Survey: Impressions of the Yellow Ribbon Programme

Name of school……………….


I was involved with Yellow Ribbon:


If I were to give the Yellow Ribbon programme in my school a grade, I would give it a(an)


I would judge my degree of involvement with Yellow Ribbon

About right for me Too much Too little

When I think of my involvement with Yellow Ribbon, I feel (underline all that apply):

happy enriched saddened frustrated appreciated worthwhile

humbled fulfilled doubtful neutral worried regretful

fortunate educated optimistic satisfied hopeful

[positive score possible = 10]

Please rate your perceptions of the Yellow Ribbon programme, as you know about it or experienced it, according to the following descriptions

Hot :……:……:……:……:……:……:……: Cold

Good :……:……:……:……:……:……:……: Bad

Safe :……:……:……:……:……:……:……: Dangerous

Useless :……:……:……:……:……:……:……: Useful

Frivolous :……:……:……:……:……:……:……: Serious

Active :……:……:……:……:……:……:……: Passive

Commercial :……:……:……:……:……:……:……: Spiritual

Pakeha :……:……:……:……:……:……:……: Multi-cultural

Dumb :……:……:……:……:……:……:……: Smart

Kaupapa Maori :……:……:……:……:……:……:……: non-Maori

Childish :……:……:……:……:……:……:……: Mature

Calming :……:……:……:……:……:……:……: Exciting

The adjectival pairs are examples only and would need to be selected by YR staff. The answers would be scored according to overall satisfaction and perceptions of the programme along the dominant dimensions arising from the semantic differential. The satisfaction survey would allow the YR organisation to monitor opinion about the programme among those centrally involved, over time and according to demographics: is there some segment of the population, some type of school (decile, location), or some aspect of the programme that is less appealing to the intended recipients than others?
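
Scoring of the semantic differential can be illustrated as follows. This sketch assumes (our assumption, to be confirmed by YR staff when the pairs are finalised) that a high score should always mean a favourable perception, so items whose positively valued adjective sits at the right-hand pole (Useless-Useful, Frivolous-Serious, Dumb-Smart) are reverse-scored:

```python
# Hypothetical scoring key (ours, not YR's): items whose positively
# valued adjective appears at the RIGHT-hand pole are scored as marked;
# all others are reversed, so high always means favourable.
REVERSED = {"Useless-Useful", "Frivolous-Serious", "Dumb-Smart"}

def score_item(item, rating):
    """Score one semantic differential item.

    rating is 1..7, where 1 means the respondent marked the
    left-hand pole and 7 the right-hand pole.
    """
    if not 1 <= rating <= 7:
        raise ValueError("rating must be between 1 and 7")
    return rating if item in REVERSED else 8 - rating

def total_score(responses):
    """Sum of keyed item scores; higher means more favourable."""
    return sum(score_item(item, rating) for item, rating in responses.items())

# A hypothetical respondent who views the programme favourably
responses = {"Good-Bad": 2, "Useless-Useful": 6, "Dumb-Smart": 7}
```

Reverse-keying at scoring time, rather than re-ordering the printed pairs, keeps the form itself balanced so respondents cannot simply tick one side throughout.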

Data can be easily analysed as a routine part of YR staff activities and collated over time.

Resource needs are minimal: once the questionnaire has been developed and piloted, it would routinely be administered in each school at a predetermined time after the programme was launched there.

D. Long-term Outcomes (Ultimate Goals)

Comment. In principle it might be expected that the Yellow Ribbon programme would have the effect of reducing youth suicide nationally, which is a stated goal of the Trust. However it is not reasonable to expect this to happen beyond the boundaries of the schools where the programme has been formally introduced.

To demonstrate that YR has had a specific causal effect on the suicide rates in the high school age range is a near impossible task. The only approach is statistical and would require careful examination of the annual suicide rates in the relevant age group over quite long periods of time. As the rate is going down already, one would need to show that the rate of decline accelerated at the time that YR was introduced, or maybe a year or so later, and that no other new national initiatives occurred at the same time. This is called an interrupted time series design and it requires data over a long time period. As stated earlier, it seems unreasonable to expect YR to influence rates quite this noticeably; YR is only one possible component of a national suicide prevention initiative.
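
The design described above can be illustrated with a minimal segmented-regression sketch on hypothetical monthly rates. The function and variable names are ours, and ordinary least squares is used purely for illustration; a real analysis by a biostatistician would also need to model seasonality and autocorrelation in the series:

```python
import numpy as np

def fit_segmented(rates, start):
    """Segmented regression for an interrupted time series.

    rates: monthly rates; start: index of the month the programme
    was introduced. Returns (baseline level, pre-existing trend,
    level change at introduction, change in trend after it).
    Plain least squares only -- no autocorrelation modelling.
    """
    t = np.arange(len(rates), dtype=float)
    after = (t >= start).astype(float)            # level-shift term
    since = np.where(t >= start, t - start, 0.0)  # slope-change term
    X = np.column_stack([np.ones_like(t), t, after, since])
    beta, *_ = np.linalg.lstsq(X, np.asarray(rates, dtype=float), rcond=None)
    return beta

# Hypothetical series: a slow pre-existing decline that steepens
# after month 24, with no sudden level shift.
months = np.arange(60)
rates = 20.0 - 0.05 * months - np.where(months >= 24, 0.10 * (months - 24), 0.0)
intercept, trend, level, slope_change = fit_segmented(rates, start=24)
```

The point of the design is visible in the fitted coefficients: evidence for a programme effect would be a negative slope-change (or level-change) term over and above the pre-existing trend, which is exactly why a decline that was already under way does not by itself demonstrate anything.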

To look at rates within schools where the programme has been introduced would require data on the number of suicides in each school. Rates per school will be very low, especially considering that some young people in the high school age range who die by suicide are not actually in school.

A closely related goal is presumably to reduce suicidal ideation in young people. In other words, can the programme reduce adolescents’ thinking about suicide, feeling that they have few or no reasons for living, and contemplating plans? This sub-lethal behaviour might also include self-harm (e.g., cutting) and related attempts at harm. Attempts requiring hospitalisation or medical attention can be documented, but ideation can only be assessed by self-report, either by interview or by questionnaire.

Another approach to thinking of the YR programme as a suicide prevention effort is not to argue that it will lower the national statistics, but to suggest that it can and has helped specific individual children. Since the causes of suicide are highly complex, there is no guarantee that the “asking for help” strategy is a universal solution; there may well be young people who are beyond the reach of such intervention. But there may be other young people for whom help at the right time could make the difference between suicide and life. It can be argued that if this is true for just one person, then the strategy is worthwhile. The implication of this argument is that outcomes can be considered not in terms of the overall decline in rates, but in terms of individual success stories. These would need to be just that: stories, highly personal, detailed anecdotal accounts by a young person regarding their experience of feeling suicidal but getting peer or other social support (help) at exactly the right moment. “Testimonials” of this kind do not carry a lot of scientific weight, but if well documented they are just as valid as group statistics.

Another possible major outcome is that because of the YR message, young people learn a skill (asking for help) that becomes accepted or ingrained in their approach to problem solving, so that in the future they are more likely to seek external help or personal support when they are having the most serious difficulties. If this were true it would support the idea that YR has a delayed effect and that it might still be useful as a suicide prevention strategy for the very vulnerable young adult age range (18-25). National statistical data will tell us the suicide rate but presumably not provide details of which school a young adult who died by suicide attended when they were at high school; and one suspects that there would be a high rate of young people who had dropped out of school early or attended a variety of different schools. Would schools themselves ever obtain reliable information on how many of their former pupils have died within X years of leaving school?

These issues and concerns lead to the following suggested evaluation studies:

Study 12: Statistical analysis of the national database

Obtain access to the national database and plot monthly suicide rates/incidents for the target population (age range). Gather these data for as many years preceding YR’s introduction as possible. Plot onto the time series specific YR activities and the introduction of other possibly relevant events or initiatives. Use time series analysis methods to try to detect any relationship between the introduced initiatives and changes in the rate/incidence of suicides.


Resources:

Access to the national database

Monitoring of YR and other suicide prevention activities

Assistance from a bio-statistician in plotting the time series and relating any changes to identifiable events.

Time frame:

Ongoing, over at least 5 years—this could be put in place as a routine exercise and simply monitored over time. Someone would need to be responsible for maintaining the data on a monthly basis.

Study 13: School-based frequencies of pupil safety

We would need to obtain the cooperation of each school that introduces YR, so that it agrees to provide us with any information it has regarding harm coming to its currently enrolled pupils. The information may not be kept systematically at present, but schools could perhaps start to do so, simply providing us with the numbers, perhaps at quarterly intervals, of high school students known to them to fall into the following categories:

Death by suicide

Hospitalisation for attempted suicide

Death by accident (road, drowning, accidental over-dose etc)

In-patient psychiatric admissions

Suspensions or expulsions for drug use

Number of new school-counsellor referrals

Information would be strictly confidential, but would be collated on a regular basis in order to examine rates of unsafe episodes of student behaviour in the targeted schools. Schools would be provided with a form in which the above incidents are defined, and someone designated by the school principal would be responsible for sending the completed form, with the information, to the YR office, where it would be collated.

The information would allow monitoring of unsafe student behaviour in targeted schools, but would not allow us to compare these rates with (a) the rates before YR was introduced, or (b) the rates in schools where YR has not been introduced. This additional comparison information would be valuable but difficult to obtain, since it would require cooperation from schools not yet associated with YR.

Study 14: Influence of Yellow Ribbon on suicidal ideation

While the above data might be monitored for every school eventually included in the YR programme, a closely related study might focus on suicidal ideation in a random sample of teenagers in just one or two schools: before YR is introduced, immediately after, and 6 months after YR has been ongoing in the school. A random sample of students, no fewer than 40, would need to be interviewed by an experienced and trained counsellor or psychologist. After establishing some rapport, the interview would move on to questions such as (not necessarily in these words):

Do you ever get really blue, down in the dumps?

What do you do to deal with these feelings?

Have you ever felt so miserable that you wanted to harm yourself in some way?

Have you ever thought of committing suicide?

What were your thoughts?

Were the thoughts fleeting, or did you dwell on them?

What did you think would be the consequences (for yourself, friends, and family)?

Did you have a plan?

Do you still have these thoughts?

What have you been doing about them?

Can you think of three things that would help keep you safe if you felt any suicidal thoughts in the future?

There would be a formal appraisal of the actual level of ideation expressed, with follow-up in the form of referral and other support.

Resources: the second part of this study would require an experienced interviewer to conduct at least 120 clinical interviews per school (40 students at each of the three time points), which is quite labour intensive. Both this study and Study 13 require ethical approval from an established ethics committee, and we would need to make sure every possible precaution is in place. However, both would be important components of an overall evaluation effort for some time to come.