We are excited to announce the relaunch of the Southern California Evaluation Association (SCEA)! This year, a group of evaluation professionals came together to envision how SCEA could address the interests and needs of those involved in evaluation in the Southern California area. We have recently updated the website and are planning events and other programming that provide professional development, collaboration, and networking opportunities for evaluation professionals. We will also survey our member list to get input on how SCEA can meet members’ needs.

If you are interested in learning more about SCEA, there are a few ways to connect with us:

  • Review our website and learn about our mission statement, leadership team, and upcoming events
  • Follow us on LinkedIn
  • Join our mailing list to hear updates about the group
  • Email us at scea.us.info@gmail.com

We look forward to connecting with you!


SCEA Member Spotlight: Jacob Schreiber

Name: Jacob Schreiber

Education:
California State University, Long Beach – Master of Arts in Applied Anthropology
University of California, Irvine – Bachelor of Arts in Psychology and Social Behavior

What is your current position, and what do you do?
I am currently a Research/Evaluation Analyst in the Department of Medical Education at the Keck School of Medicine of USC (KSOM). In this role, I support the Director of Research and Evaluation in shepherding student and faculty projects toward presentation and publication. I also track the completion of research/evaluation projects, coordinate research/evaluation workshops and meetings, analyze data and write reports, and disseminate those reports to various stakeholders throughout KSOM.

What led you to the field of evaluation?
I took a program evaluation course as a component of my master’s program at CSULB and enjoyed the applied aspect of the field. I knew that I wanted to do something that made a difference, and to me the most important way to do that is to ground decision-making in data and facts. Although I don’t have a formal background or degree in evaluation, I enjoy that the field brings many different disciplines together to solve real-world problems.

What has been your favorite moment in your career so far?
I am conducting a longitudinal qualitative study of student feedback on the medical school curriculum over a four-year period. When I presented preliminary findings to our leadership, they were impressed by the methodology and surprised by the detail of the results. Working in a medical school alongside primarily MDs and life scientists, I have found it rewarding to show them how social research can inform decision-making.

What motivates you at work?
Hearing how much the students appreciate what our office does, and that they can see changes based on the feedback they provide us, is a big motivator. With my background in anthropology, I also enjoy knowing that the research I do affects the way training is conducted at medical schools, helping produce doctors who treat the whole person: taking cultural, socio-economic, and personal experience into consideration when treating their patients in the future.

If you could give advice to young professionals in evaluation, what would it be?
As a young professional in evaluation myself, I’m somewhat hesitant to offer too much advice. But if you are still in school, I’d recommend building your skillset as a mixed-methods analyst. I have found that being open to taking on a variety of responsibilities, sometimes outside of my training, and being able to interact with people from other disciplines to understand their theoretical perspectives and methods have been keys to growing in the field. If you are in the market for an evaluation job, I recommend being persistent and getting creative with your search strategies. There are great organizations hiring people with evaluative skillsets that aren’t necessarily posting a position for an “evaluator.” That was something I learned after graduating with a degree in anthropology, a discipline whose skills fill many organizations’ needs but rarely appear in a job title. I think that applies to many of the social sciences.

What are your favorite resources for evaluators?
In my opinion, AEA remains the best resource for evaluators, and their job boards were where I found my current position. I think that it’s also important to keep in touch with related disciplines, so I still follow trends at the American Anthropological Association and the American Psychological Association.

What do you like to do in your free time?
I’m a big nerd who loves to play Dungeons & Dragons and video/board games. I also enjoy listening to and collecting records, seeing live music any chance I get, and singing karaoke.

SCEA will be posting member spotlights every other month on our website. Stay tuned to learn more about SCEA members and check out our events page for ways to connect with other local evaluators.

AEA Evaluation 2018 Proposals

We are looking forward to connecting with SCEA members and other AEA local affiliates this fall at the American Evaluation Association (AEA) Evaluation 2018 Conference. This year, the conference will take place in Cleveland, Ohio, from October 31 to November 3. The theme for this year’s conference is “Speaking Truth to Power.”

Proposals are due Thursday, March 15. Submission information can be found here.

Stay tuned for details on SCEA networking activities at Evaluation 2018!



An opportunity for evaluation professionals: CDC Evaluation Fellowship Program

The Centers for Disease Control and Prevention (CDC) is seeking applicants for the 2018 class of the CDC Evaluation Fellowship Program. This is the eighth year of this initiative and represents a major commitment by CDC to program evaluation and program improvement. Fellows will work under the leadership of CDC’s Chief Evaluation Officer; they will be matched with CDC host programs in Atlanta, Georgia to work on program evaluation activities for/with those programs.

About the Fellowship

  • The CDC Evaluation Fellowship Program aims to expand the capacity of CDC programs to conduct evaluation and increase its usefulness and impact.
  • The Evaluation Fellowship Program signifies CDC’s dual commitment to making program evaluation a standard part of practice and to developing a cadre of professionals with the skills to make that happen. With the CDC Evaluation Fellowship, programs will have the resources, tools, and leadership to continuously improve their work.
  • Started in 2011 with an initial class of five Fellows, the Fellowship has about 15-20 Fellows each year.
  • Though the Fellowship has grown dramatically in just a few years, it remains highly competitive, with only about 10% of applicants selected for the Evaluation Fellowship Program.
  • The Fellowship is intended to be a two-year program, with the second year contingent on satisfactory performance and availability of funds.

About the CDC Evaluation Fellows

  • Experience: Fellows are PhD or master’s degree professionals with backgrounds in evaluation, behavioral and social sciences, public health, and other disciplines relevant to CDC’s work. Successful applicants also typically have significant experience in applied evaluation projects.
  • Placement: Fellows are based in Atlanta working within one of CDC’s programs.
  • Funding: Fellows receive a monthly stipend based on education level and experience. Unlike many fellowships, the CDC Evaluation Fellowship also provides substantial financial support for training and professional development, as well as a supplement to offset health insurance costs.

Applying for the Fellowship

  • Applicants for the Fellowship go through an extensive selection process, including interviews with potential host programs, that culminates in host programs and Fellows being matched based on mutual interest.
  • Candidates must have received their qualifying degree within the past 5 years.
  • The qualifying degree must be completed by the end of the spring semester of the year the candidate applies and must appear on the transcript by June.
  • This is a training fellowship: Fellows are brought on through the Oak Ridge Institute for Science and Education (ORISE) and are not considered employees of CDC.
  • Non-U.S. citizens are eligible to apply. For more information, see the ORISE Guidelines for Non-U.S. Citizens.
  • The deadline for applying to the CDC Evaluation Fellowship Program is April 13, 2018, 1 pm Eastern Time.

To apply, create a profile, complete a short survey, and upload your CV/resume here: https://www.zintellect.com/Posting/details/4050.


Fellowship Timeline

  • April 13: Application deadline
  • Late April: Finalists selected and notified
  • April 26 – May 29: Host programs interview finalists
  • Early June: Fellows selected and notified
  • August 13: Selected Fellows must start by this date

Share your Data Viz Techniques with SCEA!

We are looking for a few people to give short presentations on different aspects of data visualization. Are you interested?

The presentation can be about a visualization tool you use, a report you have completed, a technique you are fond of, an evaluation you are currently doing, or anything else that relates to data visualization. This is a great opportunity to reuse an existing presentation from a conference or previous talk.

If you are interested, post a comment here or contact Alana Kinarsky by Sunday, Feb 25 with a few sentences about your presentation idea. Your proposed presentation does not need to be fully developed; we can work with you to finalize the idea in the coming weeks. Once we have heard back from everyone, we will pick speakers and topics to ensure a cohesive panel.

The event will take place at a co-working space in Hollywood, but we have not yet finalized the date. We hope to have the event details finalized by the end of February so we have plenty of time to advertise to our network.

We look forward to seeing some new data visualization ideas!

The Evaluators’ Institute (TEI) March 2018 Training in Southern California

The Evaluators’ Institute (TEI) welcomes SCEA members to participate in its March 2018 CA Program. Contact TEI@cgu.edu to access the special SCEA discount. 

(March 12 – March 17, 2018)

The Evaluators’ Institute (TEI) offers top-tier training in evaluation for beginning, mid-career, and advanced evaluation professionals. Taught by leaders in the field, TEI courses offer technical rigor in evaluation methods and practice through participant-centered, adult-learning approaches.

For participants who are interested in charting a full course of study, TEI offers a number of professional certifications.

You can find course descriptions and information on how to register on the TEI website.

Don’t miss this opportunity for SCEA members to build professional skills alongside leading faculty and fellow participants. Act now before the classes fill up!

The TEI team is standing by at tei@cgu.edu to answer any questions.



Katy Nilsen on the ALERT and Robopad Systems

Dear fellow evaluators,

I want to share what I learned from my research about the importance of developing computational thinking skills in young children. Recent policy in the Next Generation Science Standards (NGSS) outlines science and engineering practices for K-12 students, including computational thinking. Creating programming environments where children have opportunities to develop these skills is essential, as is evaluating children’s experiences in these contexts.

In a recent study on the affordances of spatial programming, we conducted a qualitative investigation of children’s interactions in two programming environments: the Active Learning Environment with Robotic Tangibles (ALERT) and Robopad. The ALERT system allows for physical human-robot interaction (HRI): children can use their whole bodies to interact with a robot in a room. In contrast, Robopad is a virtual programming environment that parallels the ALERT experience on a computer screen.

We recruited nine first-grade students (seven boys, two girls) and divided them into pairs and a trio, with the two girls forming their own pair. Students interacted with each other and with the robots over the study’s five-day period. They first participated in a “play robot” session, followed by free-play and task sessions with each technology. Video data were collected and transcribed, and the transcripts were coded and analyzed for trends across groups and days.

We found that both systems afford opportunities for young children to engage in spatial programming, including sequential programming and improvisational “just-in-time” programming. In addition, children collaborated with each other in their programming and interacted readily with the technology in these environments. The study had limitations: the sample was small and drawn from a single school, so the results are not generalizable. Also, all students engaged with the ALERT system before the Robopad system, so they may have applied what they learned in the ALERT environment to the Robopad environment.
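As a rough illustration of the kind of analysis described above, here is a minimal sketch, written in Python with pandas, of how coded transcript segments might be tallied to surface trends across groups, days, and systems. The records, column names, and code labels below are hypothetical placeholders, not data from the study.

    import pandas as pd

    # Hypothetical coded-transcript records: one row per coded segment.
    # Groups, days, systems, and code labels are illustrative placeholders,
    # not data from the actual study.
    segments = pd.DataFrame([
        {"group": "pair_1", "day": 1, "system": "ALERT",   "code": "sequential"},
        {"group": "pair_1", "day": 1, "system": "ALERT",   "code": "just-in-time"},
        {"group": "pair_2", "day": 2, "system": "ALERT",   "code": "sequential"},
        {"group": "trio_1", "day": 4, "system": "Robopad", "code": "just-in-time"},
        {"group": "pair_2", "day": 5, "system": "Robopad", "code": "just-in-time"},
    ])

    # Tally code frequencies by system, code, group, and day so that trends
    # (for example, just-in-time programming dominating Robopad sessions)
    # can be read off the resulting table.
    trends = (
        segments
        .groupby(["system", "code", "group", "day"])
        .size()
        .rename("count")
        .reset_index()
    )
    print(trends)

A real analysis would of course start from the full coded transcripts rather than a handful of rows, but the same group-and-count step is a common way to move from qualitative codes to cross-group comparisons.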

In conclusion, both the ALERT and Robopad systems supported children’s spatial programming. Students engaged in planned sequential programming in both environments, though this was less evident in Robopad. There was also evidence in both systems that children engaged in just-in-time programming; these interactions made up the majority of activity in Robopad because programming the virtual robots tended to be more instantaneous than planned. Technologies like ALERT and Robopad show promise for providing spatial programming opportunities that foster students’ engagement in computational thinking.

For more information, please refer to the following reference and follow the DOI link to access the full text from the publisher: Burleson, W., Harlow, D. B., Nilsen, K. J., Perlin, K., Freed, N., Jensen, C., … & Muldner, K. (2017, July 7). Active learning environments with robotic tangibles: Children’s physical and virtual spatial programming experiences. IEEE Transactions on Learning Technologies, 99. doi:10.1109/TLT.2017.2724031

Katy Nilsen, PhD, is a researcher and evaluator with a focus on K-12 Science, Technology, Engineering, and Mathematics (STEM). She examines teacher practice and student learning within the disciplines of environmental science, computer science, and robotics. She also studies technology implementation initiatives in schools.