The Southern California Evaluation Association (SCEA) is a community of evaluators who are committed to quality and innovation in evaluation practice and research. SCEA’s mission is to strengthen the evaluation field by providing professional development, collaboration, and networking opportunities to evaluators.
Who We Are:
The SCEA was established in 2003 as a local affiliate of the American Evaluation Association (AEA). SCEA members are evaluation practitioners and scholars who are committed to the advancement of the evaluation field. SCEA members practice, research, and teach in various sectors, including education, energy, social services, health, and more. Employment settings include universities, foundations, government, non-profit organizations, and private consulting firms. Members are located throughout Southern California from Santa Barbara to San Diego and from West Los Angeles to the Inland Empire.
To achieve its mission and serve its members, the SCEA offers opportunities to:
- Engage in presentations on topics relevant to evaluators,
- Network with other evaluators,
- Access educational learning resources made available through partnerships with other AEA local affiliate groups,
- Connect to a national network of evaluators, and
- Have insightful and fun discussions with fellow evaluators.
Anne T. Vo, Ph.D. is Assistant Professor of Clinical Medical Education and Director of Evaluation at the Keck School of Medicine of the University of Southern California. Anne’s research interests lie at the intersection of comparative evaluation theory, evaluation capacity building, and organizational development. Her scholarship aims to enhance the field’s understanding of how evaluation can be done rigorously and how evaluative knowledge can be leveraged to drive change. Anne’s work can be found in the field’s flagship journals (American Journal of Evaluation, New Directions for Evaluation, Evaluation and Program Planning) as articles, book chapters, and books. Anne’s evaluation practice is centered on serving organizations whose constituents come from communities of need, including low-income disadvantaged youth in the K-20 pipeline across Greater Los Angeles, educators working in politically challenging environments in Southeast Asia, and those with stigmatized health conditions such as HIV and obesity. She has also lectured and offered workshops on logic modeling, data analysis and research methods, and publishing in peer-reviewed journals at various universities, evaluation societies, and medical conferences across the country.
Alana Kinarsky, M.A. is currently pursuing her Ph.D. in Social Research Methodology with a focus on Program Evaluation at the University of California, Los Angeles (UCLA), where she earned her Master's in 2015. Her research focuses on campus support for undergraduates with foster care experience, how evaluation can strengthen nonprofit organizations and foundations, and what contributes to evaluation use. At UCLA, she is a Graduate Student Researcher for the SRM Evaluation Group, where she works with campus programs that aim to increase access to higher education and support student success. In recent years she has conducted evaluation work with organizations such as the Bronfman Youth Fellowship, Jewish Federation of North America, and UCLA. Prior to returning to graduate school, she worked for Interfaith Youth Core in Chicago, where she led a team of researchers and staff in assessing the religious and spiritual climate on U.S. college campuses.
Alma Boutin-Martinez, Ph.D. is the Senior Institutional Research Analyst at Fielding Graduate University. In this role, Alma provides data and analyses to support program evaluation, institutional and program accreditation, learning assessment, and senior management decision-making to advance the university's mission. Alma earned a Ph.D. in Education from the University of California, Santa Barbara (UCSB) and a Master's degree in Psychology from California State University, Fullerton. Prior to joining Fielding, Alma worked at UCSB Collaborate, the University of California Educational Evaluation Center, and the Fullerton Longitudinal Study on projects related to instructional design, educational program evaluation, and longitudinal data analysis.
Emi Fujita-Conrads, M.A. is a Ph.D. student in the Social Research Methodology division at the University of California, Los Angeles (UCLA), specializing in Evaluation. She earned her M.A. from UCLA in 2017. Emi is a graduate researcher at UCLA, where she conducts evaluations with campus support programs and community colleges. Prior to graduate school, Emi supported research and evaluation work at a national youth mentoring nonprofit. Her research interests include using evaluation for program improvement and methods for conducting social justice-focused and culturally responsive evaluations.
Erika E. B. Kato, Ph.D. is a faculty member in the Educational Leadership Department at California State University, Long Beach (CSULB). She serves as the Center for Evaluation and Educational Effectiveness (CEEE) Project Director for the CSULB HSI-STEM campus evaluation and the CSU HSI-STEM systemwide evaluation, among other campus research projects. She has over seven years of experience in evaluation, quantitative research, and institutional research. Prior to joining CSULB, Erika served as Senior Manager of Data Analytics at the CSU Office of the Chancellor, was a researcher for five years at the University of California Educational Evaluation Center, and taught high school and middle school mathematics.
Jacob Schreiber, M.A. is an assistant in the Keck Evaluation, Institutional Research/Reporting, Assessment (KEIRA) office at the Keck School of Medicine of USC. In this position, Jacob supports data collection and analyses in efforts to evaluate the medical school’s curriculum and faculty. Prior to working at Keck, Jacob earned an M.A. in Applied Anthropology from California State University, Long Beach in 2016, and worked as a behavioral analyst for the Newport-Mesa Unified School District. His research interests concern power and epistemic knowledge in the medical sciences, and the social constructions of illness and wellness in educational and occupational settings.
Katy Nilsen, Ph.D. is an independent research and evaluation consultant. She graduated with her doctorate in Education from the University of California, Santa Barbara. She has extensive experience in both conducting STEM research studies and in performing program evaluations. Her interests include examining teacher practice and student learning within the disciplines of environmental science and computer science. She is also interested in studying and evaluating teaching and learning with technology.
Lara Hilton, M.P.H., Ph.D. (ABD) is a Senior Analyst at the RAND Corporation and consults on program evaluation design and implementation. She has worked since 2000 as a research methodologist and analyst specializing in program evaluation and evaluation capacity building with a stakeholder-centered approach. Her content specialties are integrative medicine, military behavioral health, chronic pain, and patient-centered care. She has led or coordinated large-scale research and evaluation projects for the National Institutes of Health, the National Center for Complementary and Integrative Health (formerly NCCAM), the Agency for Healthcare Research and Quality, the Department of Defense, and the Veterans Administration. She is published in top-tier medical and evaluation journals (JAMA, JGIM, Annals of Behavioral Medicine, American Journal of Evaluation) and presents annually at the American Evaluation Association conference and the Annual Meeting of the American Public Health Association.
Miriam Jacobson, Ph.D. is a technical specialist at ICF, where she evaluates workforce development, training, and technical assistance programs for government and higher education clients. Prior to this position, Miriam completed her Ph.D. in Evaluation and Applied Research Methods at Claremont Graduate University, where she conducted evaluations of education and youth programs. She regularly presents at the American Evaluation Association conference and publishes in evaluation journals. Her evaluation interests include collaborative and responsive evaluation approaches and mixed methods analysis.
Nida Rinthapol, Ph.D. received her doctorate in Education, with an emphasis in Cultural Perspective and Comparative Education and Quantitative Methods in Social Sciences, from the University of California, Santa Barbara (UCSB). Nida is the Associate Director for Data Resources in the Office of Academic Planning and Budget at UCLA. Prior to UCLA, Nida was a research associate at the National Institute of Excellence in Teaching (NIET) and a graduate researcher at the University of California Educational Evaluation Center (UCEC), where she had the opportunity to work with nationally prominent scholars in teacher effectiveness and educational evaluation research. Her most recent presentation, on data mining to identify grading practices, received the Best Presentation Award at the California Association for Institutional Research (CAIR) conference. Nida also teaches both undergraduate and graduate research methods classes at various educational institutions.