
Reflections on Assessing Mission-Related Outcomes at Zoos and Aquaria: Prevalence, Barriers, and Needs 
Louise Bradshaw and Ed Mastro

We continue our Reflection on Evaluation and Research feature this month with a dialogue sparked by Jerry Luebke and Alejandro Grajal's recent article in Visitor Studies [Volume 14(2)], Assessing Mission-Related Learning Outcomes at Zoos and Aquaria: Prevalence, Barriers, and Needs. The article raises questions about the role of institutional mission and the feasibility of mission-related evaluation in informal learning institutions.

For this discussion, Louise Bradshaw, Director of Education at the Saint Louis Zoo, and Ed Mastro, Director of Exhibitions at Cabrillo Marine Aquarium in San Pedro, CA, were asked to reflect on the article and the role of evaluation as it relates to mission.

Jim: In Luebke and Grajal's article, we learn about current practices related to evaluation in AZA (Association of Zoos and Aquariums) member organizations, as well as the extent to which evaluation efforts are linked to or driven by institutional mission. To begin, I wondered if you might share your initial reaction to the article.

Louise: I do think this is a very key question in our business, how we actually perform against our mission, so I was intrigued from the start. There seems to be an assumption by the researchers that if you do visitor research, the primary focus would of course be on the impact of the educational mission: looking for cognitive, affective, and behavioral gains. My experience is that visitor research in AZA institutions is often primarily focused on business-oriented measures, which we all need to continue to do well as institutions. Visitor research does well at our institution because our manager of audience research spends the vast majority of her time on business-oriented measures. She also does a wonderful job on targeted educational mission projects, for example recent front-end and formative work that helped us improve interpretation and the visitor experience at a new exhibit under construction, as well as designing our program impact tool and supervising our work in gathering and analyzing those data. I'm sure we are not alone in feeling a pull for resources between business and educational mission. It would have been helpful to know what percentage of resources institutions with visitor research spend on business versus mission impacts.

Jim: Louise, could you say more about what you mean by “business-oriented measures”?

Louise: These are measures related to guest experience—level of satisfaction for guest services, café experiences, and so on. A larger-scale example might be the study we did a few years ago to find out how an extended freeway closure might impact visitation at our site. We wanted to learn more about visitor perceptions of the closure, and what concerns they had, if any, about visiting the zoo. This helped inform our communications to guests so they could choose the best route to the Zoo and avoid any confusion and frustration.

Ed: There is no surprise [to me] that marketing departments drive audience research, and no surprise that the bigger facilities have more resources and perform more research. But that might be an artifact of the need to keep people coming in the gates and signing up for our programs. Many zoos’ and aquariums’ survival depends on the "gate," and we need to keep attendance up. Meeting our mission is desirable, but paying the bills is critical.

Jim: So what I’m hearing is that while we would probably all agree that it would be useful to know whether or not the institution is achieving its mission, it may be more critical, from a logistics standpoint, or maybe a ‘survival’ standpoint, for many (if not most) institutions to gather information that will help the business to survive.  And given the limited resources of the institution, survival takes priority.  Ed, what about you?  What caught your attention as you read the article?

Ed: I agree [with Luebke and Grajal’s idea] that our industry is in flux. Old school education and exhibits were focused on presenting the animals with some facts (cognitive goals)—featuring not the habitats, but the animals’ adaptations and their unique habits. “See the turtle. Did you know they can sleep for six months and can hold their breath for 10 minutes? How long can you sleep? How long can you hold your breath? Get ready, let's try....”

Then there is the goal we are all after: changing behavior, and saving the world and all the plants and animals in it. But how do we evaluate that? How do we measure success when we have limited contact with the public, and when it is hard to measure behavior change once people are not at our facilities? And then, which behaviors do we want to change? It seems that we are trying to focus education and behavior modification on two fronts—the local environmental issues, such as watershed protection, as well as the issues that are worldwide, such as climate change or ocean acidification. And there are also issues out there that do not seem to have a direct effect on us, such as many of the conservation programs that are saving habitats or species in other countries. Which behavior do we want to "modify"?

Jim: Well that’s part of the challenge, isn’t it? Evaluating whether you ‘inspire wonder’ or help visitors to ‘develop a lasting relationship with nature’ is not something you can do directly. These statements need to be operationalized, or re-framed into something that can be measured. But with that come questions like ‘Which behavior do we want to modify?’ So I’m wondering, then, from your perspectives, how would mission-related evaluation support what you do? As the paper suggests, having the capacity to conduct any visitor studies is an obstacle. But if you DID have the means, how important would such evaluation be?

Louise: I was on a small working group that developed our new mission statement. That one [the new mission statement] is actually a slightly reworded version of the one we developed in 1913, when we started as an institution. One of the things that I found frustrating in the process is that I kept sitting there thinking ‘How do we measure this?’ while everyone else was just focusing on “We need to stay true to 1913. We’ve gotta have the words from 1913, that’s our heritage.”

Jim: So people weren’t really thinking of the mission as a measurable outcome.

Louise: It was like this reverse process.  It was about making sure we could describe what we already do.  It didn’t seem aspirational, but rather more descriptive.

Ed: Yes, it’s more of a target.  But you don’t really think about measuring it.

Jim: So what we’re talking about is the perceived role of the mission.  So maybe it’s not about being measurable.  Rather it’s a guide that helps take you where you want to go.  I’m not suggesting that you shouldn’t try to assess how programs or activities stack up to the mission.  It’s just that it’s not framed that way.  It serves as a guide for what you do.

Ed: Our institutional mission is related to engaging visitors in the marine life of Southern California through recreation, education, and research to promote better appreciation and conservation. It is just like many other mission statements throughout AZA. This mission determines what we can do, or can’t do—guiding decisions related to the programs we run and the exhibits we develop. I couldn’t do an exhibit related to the geology of Canada unless it connected directly to Southern California marine life. Engagement might be evaluated, but our mission is so large in scope that the challenge is figuring out which part we were successful in achieving. I like to think of our mission statement as somewhat like a stew, complex and composed of many ingredients. Each ingredient can be evaluated on its own, but put them together in a pot… Now this gets interesting…

Jim: So the mission, and its underlying complexity, serves as a guide for what you do, or should do.  But it may be less important to specifically assess that?

Louise: Agreed. It’s assumed that we got it.  It’s all taken care of.  And I don’t think we want to [assess it].  Because, perhaps, we don’t want to know that we’re not [meeting the mission].

Jim: Of course, but that’s the nature of evaluation, right?  It’s not wrong to have that concern—it’s human nature.

Ed: It’s not that we’re not interested in visitor research, or even in staying true to our mission. But I think we’re more focused on changing what we do so we can do it better, not necessarily on the big “are we meeting the mission in the first place?” question. Rather, we have program-specific questions, such as ‘Do visitors understand that research is done here?’ or ‘Do kids understand global climate change better as a result of participating in that program?’

Louise: I think we don't evaluate mission at the global level because it is really big and amorphous and means a lot of different things to our many stakeholders. We all do it at the compartmentalized level: attendance numbers, budget, learning, and satisfaction levels. It isn't that we can't, but I do believe we need to develop new research tools to get there.

Jim: One last question.  As a researcher, I am always concerned about whether my findings or publications are useful to the practitioners and practices I’m studying.  In what ways does this article help you as a zoo or aquarium practitioner?

Ed: This article is a good reminder that, as practitioners, we need to “keep our eyes set on the goal.” We may never truly know whether we are meeting the goals defined in our mission statements, but segments of the mission can be teased out and evaluated, and success in those areas may help indicate whether we are indeed heading in the direction our missions define. Visitor satisfaction is critical to our continued success, but this article also serves as a reminder that we need to continue conducting evaluations to determine whether we are indeed “making a difference” in conservation education.


Louise Bradshaw is Director of Education at the Saint Louis Zoo, St. Louis, MO.

Ed Mastro is Director of Exhibitions at Cabrillo Marine Aquarium in San Pedro, CA.

Jim Kisiel is Associate Professor of Science Education at California State University, Long Beach.