Category: Adult Education

  • Training Needs Assessment: Methods, Tools, and Techniques

Training Needs Assessment: Methods, Tools, and Techniques by Jean Barbazette
    Get it at AMAZON

    This book covers the essentials of needs analysis from the emerging trainer’s perspective by providing just the right amount of support and knowledge without going too deep into the subject. The topics covered include when and how to do a training needs analysis; using informal and formal analysis techniques; goal, task and population analysis; and how to develop and present a training plan for management approval. Each chapter includes appropriate data gathering tools. “The Skilled Trainer” series provides practical guidance for those who’ve had some exposure to training and would like to take their career to the next level.

  • Experiential Learning: Experience as the Source of Learning and Development

Experiential Learning: Experience as the Source of Learning and Development by David A. Kolb
    Get it at AMAZON

    Experiential learning is a powerful and proven approach to teaching and learning that is based on one incontrovertible reality: people learn best through experience. In this book, David A. Kolb offers a systematic and up-to-date statement of the theory of experiential learning and its modern applications to education, work, and adult development. Kolb models the underlying structures of the learning process based on the latest insights in psychology, philosophy, and physiology. Building on his comprehensive structural model, he offers an exceptionally useful typology of individual learning styles and corresponding structures of knowledge in different academic disciplines and careers. Kolb also applies experiential learning to higher education and lifelong learning, especially with regard to adult education. This is an indispensable resource for everyone who wants to promote more effective learning: in higher education, training, organizational development, lifelong learning environments, and online.

  • The Adult Learner: The definitive classic in adult education and human resource development

The Adult Learner: The definitive classic in adult education and human resource development by Malcolm S. Knowles, Elwood F. Holton III, and Richard A. Swanson
    Get it at AMAZON

    How do you tailor education to the learning needs of adults? Do they learn differently from children? How does their life experience inform their learning processes? These were the questions at the heart of Malcolm Knowles’ pioneering theory of andragogy which transformed education theory in the 1970s. The resulting principles of a self-directed, experiential, problem-centered approach to learning have been hugely influential and are still the basis of the learning practices we use today. Understanding these principles is the cornerstone of increasing motivation and enabling adult learners to achieve. If you are a researcher, practitioner or student in education, an adult learning practitioner, training manager, or involved in human resource development, this is the definitive book in adult learning you should not be without.

  • Andragogy in Action: Applying Modern Principles of Adult Learning

Andragogy in Action: Applying Modern Principles of Adult Learning by Malcolm S. Knowles
    Get it at AMAZON

    This classic work by a pioneer in the field of adult learning provides over thirty case examples from a variety of settings illustrating andragogy (principles of adult learning) in practice, including applications in business, government, colleges and universities, religious education, remedial education, and continuing education for the professions.

  • Introducing “The Journey Toward Diversity, Fairness, and Access Through Education” Curriculum Design

    by Michael Roosevelt

    We are excited to announce the completion of NASJE’s newest curriculum design!

The history of this effort began when NASJE undertook, with support from the State Justice Institute (SJI), the task of developing a comprehensive set of curriculum designs to advance the profession of judicial branch education based on core competency areas.

Soon after the project got underway, it became apparent that not all topics would or could be covered—namely fairness, diversity, and access—based on the original core competency areas that had been identified.

Upon completion of the first round of designs, the Diversity Committee (now the Diversity, Fairness, and Access Committee) recommended to the NASJE Board that it undertake the development of a new design to address diversity, fairness, and access. The recommendation was approved by the Board, and the Diversity, Fairness, and Access (DFA) Curriculum Workgroup was formed to develop the design.

Early on, the Workgroup decided to focus on a design for the “Entry Level” rather than the “Experienced Level.” The decision to create an entry-level design was practical: since knowledge of diversity-related topics varies greatly among the membership, we thought much would be gained by approaching the design at the level where most in the profession likely fall. The design is not intended to make judicial educators subject matter experts on race, bias, stereotypes, and diversity, but to help them understand the importance of these topics to the profession and to the delivery of education programs.

    NASJE’s Curriculum Committee (now the Education and Curriculum Committee) put in many hours to see this curriculum design realized. Check out this excellent resource for judicial branch educators!

    Click here for a link to the electronic version of the Diversity, Fairness, and Access curriculum design. You also can always find it and the other curriculum designs on the NASJE website via the pulldown menu “Resources” > “Curriculum Designs.”

    A hard copy of the DFA curriculum design will be sent to NASJE members in the Spring.

    Finally, the DFA Committee looks forward to formally presenting this new and exciting curriculum design to the membership in October during the Annual Meeting in Seattle.

  • Conducting Needs Assessments: A NASJE National Webcast

Robin Wosje

    NASJE MEMBERS: There is a link to a recording of the webcast in the Members Only area (go to the pull-down menu “Resources” > “Member Area”).

    Wednesday, March 18, 2015, 12pm Pacific / 1pm Mountain / 2pm Central / 3pm Eastern (1 hour 30 minutes)

    One of the most important practices of a judicial branch educator is determining what education your entity needs. What are some of the best practices in the judicial education community?

    This upcoming webcast will assist you with developing, administering, and using the results of your needs assessment. It will also be a great opportunity to share with your peers your challenges and successes with different types of needs assessments.

Gordon Zimmerman

    After this webcast, you will be able to define needs assessment and its use and application in judicial branch education; list the benefits and drawbacks of various data-gathering approaches and methodologies; and list methods and tools to assist faculty using needs assessment.

    Faculty

    Professor Gordon Zimmerman, University of Nevada, Reno, and former NASJE President Robin Wosje, Justice Management Institute, are the faculty for the session.

  • My Experience with the Mentor Program

    by Dr. Anthony Simones

    When I accepted the job as Manager of Judicial Education and Programming for the Missouri Office of State Courts Administrator, one of my first actions was to join the National Association of State Judicial Educators (NASJE). As a new member, one of the first inquiries directed my way was whether I wanted a mentor. My immediate reaction was to say, “No thanks, you can assign a mentor to somebody who needs one.” After all, I had been a professor of constitutional law and government for two decades, and served as a mentor for dozens, if not hundreds, of people.

Dr. Anthony Simones, second from left, with Lee Ann Barnhardt, third from left. Photo by Margaret Allen.

    Then I remembered that even though I had known success in other arenas, I was new to the field of judicial education. Even if I had been experienced in judicial education, I would have been wise to bear in mind I was new to this specific position and could benefit greatly from regular conversations and consultations with someone in a similar job for some time. Finally, I recalled the wisdom of Horace: people “cease to think when they think they know it all.” I agreed to be assigned a mentor.

    It was one of the best decisions I ever made.

    I had the extraordinarily good fortune to have Lee Ann Barnhardt of North Dakota assigned as my mentor – a bundle of energy, experience, and enthusiasm. From the beginning, she was a beacon in an environment frequently fraught with uncertainty. Lee Ann taught me about the profession of which I had become a part and what I needed to do to transcend adequacy and enter the sphere of excellence. She revealed what was expected of me. She educated me about the work of those who’d come before me, pointed me in the right direction for information, and explained how I could benefit from my predecessors to avoid reinventing the wheel. She answered my questions and listened to my concerns, serving as an effective sounding board when I needed one most. She told me about her experiences, passed along wisdom she had gained, and spared me from making some of the mistakes she had made. She made suggestions about what I should say and do that impressed and amazed the people at my agency.

    Lee Ann was able to be that most valuable of advisers – the objective expert. She was someone who had no stake in the outcomes of the situations I encountered in Missouri and no history with the individuals with whom I was interacting. She was able to look at my situations with complete objectivity and provide input based upon logic and experience. As much as I appreciated her suggestions when I did not know what to do, I more often appreciated Lee Ann when her guidance served to confirm my hunches. If someone as impressive as Lee Ann confirmed my instincts, then I knew I was on the right path. Armed with the confidence I derived from our relationship, I found myself trusting my judgment and making the proper call time after time.

    Would I have known success without Lee Ann Barnhardt as my mentor? Perhaps, but I submit
    it would have been hit-or-miss, as I would not have had constant access to a source of expertise and experience. I would not have known a welcoming and reassuring voice was always just at the other end of the phone line. I would not have realized that someone who had experienced great success in my field was there for me and cared about my success. Lee Ann made me feel like I was a valuable, and valued, member of an important field. She helped me believe I had made the right choice by changing careers and taking this job.

    I can say this. Without Lee Ann, I would not be on the Board of Directors of the National Association of State Judicial Educators in my fourth year as a judicial educator. She taught me how to excel in this profession. She inspired me to push myself and not settle for the back bench that is so comfortable to those who are new to a field. She expressed her confidence in my ability to follow in her footsteps and assume a leadership role. I consider her a good friend as well as a colleague, and one of the greatest thrills of my new career was when she asked me to facilitate a session with her at last year’s conference.

I cannot guarantee you will be assigned a mentor as amazing as Lee Ann Barnhardt. However, I do guarantee you will not have an experience like mine if you choose not to take advantage of NASJE’s mentor program. If your mentorship even approximates the pleasure and value of mine, you will come to regard your decision as one as rewarding as it is wise.

    If you are interested in having a mentor, please contact:

  • Conducting Effective Training through Careful Evaluation

    by Theresa L. Bohannan, MPH

Disclaimer: The following is an excerpt and adaptation from A Guide to Conducting Effective Training Evaluations: Recommendations, Strategies and Tools for Dependency Court Improvement Programs (Guide), authored by Dr. Sophia Gatowski and Dr. Shirley Dobbin.

Professionals who want to increase their knowledge about specialty topics in their field typically attend conferences, continuing education programs, or other forms of training. Professional development can take resources away from daily activities, so it is critical that we know we are spending our time wisely and that trainings are effective. Trainees fill out surveys, either online or in person, asking what they thought of the food, the venue, the materials, the speaker, and so on. But is that training increasing trainees’ knowledge of the subject matter? Is the training effective at changing attitudes and behaviors? The way to get these answers is by conducting effective training evaluation, which is imperative to ensuring people receive the intended benefits of a training event.

    State Court Improvement Program (CIP) coordinators tasked with judicial trainings related to child abuse and neglect issues may find evaluation a difficult step in planning. The stress on Continuous Quality Improvement (CQI) in recent years has illuminated the need to improve and tailor training. A component of this is careful evaluation before, during, and after training events.

    The National Council of Juvenile and Family Court Judges (NCJFCJ), as part of the National Resource Center on Legal and Judicial Issues, a service of the Children’s Bureau, developed A Guide to Conducting Effective Training Evaluations: Recommendations, Strategies and Tools for Dependency Court Improvement Programs. The Guide assists in identifying training needs, developing training methodologies and evaluation tools, and assessing training outcomes. The Guide does not recommend a universal method to evaluation, but rather provides the tools and resources needed for training coordinators to adopt their own evaluation method. It offers approaches for measuring objectives and outcomes. The Guide seeks to help training managers better target, design, and deliver trainings.

    Development of the Guide first started with a comprehensive review of effective strategies for adult education and training programs. We evaluated resource materials and interviewed judicial educators. We examined options for training design, mode of delivery, and obtained perspectives on training evaluation generally and specifically for dependency court systems. An advisory committee made content recommendations and several state CIPs piloted the strategies, guidance, and template evaluation tools. Feedback and lessons learned from these pilot sites enhanced the final product.

    The main goals of the Guide are to:

    1. Provide guidance and strategies for the program planning, design, implementation, and evaluation stages of dependency court training efforts;
    2. Encourage training coordinators to use the strategies to support local, state, and national training agendas; and
    3. Provide template tools that facilitate an assessment of satisfaction, skill or knowledge acquisition, behavior, practice and attitude change, and training impacts or outcomes.

    The Guide consists of six chapters that lead training program managers through the entire evaluation process.

    Chapter One, “Training Program Planning and Evaluation”, focuses on the critical program planning tasks and training evaluation basics. In order to tailor training, a first step is to conduct a training needs assessment to discover gaps in professional development. Training needs assessments will help inform future training planning and ensure professionals in the field are receiving up-to-date information that is relevant to their daily work. Having a comprehensive resource available to training coordinators can greatly improve their planning and eventually improve outcomes for trainees.

Chapter Two, “Training Satisfaction and Reaction Measurement”, provides recommendations and strategies for ensuring that the measurement of participants’ satisfaction and reactions yields useful information.

    Chapter Three, “Measuring Learning Acquisition”, centers on measuring learning acquisition by explaining the primary learning modes that occur during training and providing methods for evaluating learning.

Chapter Four, “The Assessment of Behavior and Practice Change”, focuses on measuring behavior and practice change and provides guidance on how to assess whether trainees applied their newly acquired skills.

    Chapter Five, “The Assessment of Training Outcomes”, reviews approaches to determining the outcomes associated with training programs.

    Chapter Six, “Analyzing, Interpreting, and Reporting Training Evaluation Data”, provides ideas for analyzing, reporting and utilizing training evaluation data and covers the analytical techniques needed to understand results.

    Associated with each chapter are Tools and Resources that facilitate development of unique evaluation and assessment tools.

This comprehensive resource will assist state CIPs and other training professionals tasked with designing effective dependency court training across the nation. The methods introduced in the Guide allow training coordinators to go beyond reporting the number of training programs held and participants trained, and to assess the impacts and outcomes of training events. The Guide is available on the NCJFCJ website, along with a navigation tool to help users determine their understanding of training evaluation and where to begin.

  • Procedural Fairness: Using Blended Learning to Extend the Reach

    By Nancy Smith, Field Trainer, Pima County Superior Court, Tucson, Arizona

Knowledge is of no value unless you put it into practice. – Anton Chekhov

    As judicial educators, we focus on how to present important topics to the judiciary in ways that not only inform them, but also assist them in changing practices which at times are deeply ingrained in judicial tradition and legal practice. When considering how to teach procedural fairness, Washington state judicial educators searched for a way that extended beyond the traditional conference plenary session so commonly used in our state. We sought to do more than inform, but also to convince people to change.

    Adult learning theory teaches that adults learn better when they see both the extrinsic and intrinsic value of what they are being taught, when the topic is relevant to their work, and when they get to practice what they are learning. We also know that chunking a topic into easily digestible pieces and spreading learning out over time helps. This article describes a recent learning series on procedural fairness executed using a blended model in Washington State, following those adult learning precepts.

Procedural fairness concerns the public’s perception of how they are treated by the judicial system. Studies have shown that the perception of unfair or unequal treatment in the courts is the most important factor in public dissatisfaction with the American legal system (Burke and Leben, p. 1). If court users feel that they have been treated fairly, they are more likely to accept the outcome of their case, even if they lose. Judges and lawyers are concerned with fair legal outcomes—did the defendant get what he deserved? The public, on the other hand, is more concerned with whether they were treated fairly from a procedural standpoint—was the decision arrived at using fair methods? Professor Tom Tyler describes four key components of procedural fairness:

    1. Voice – Court customers expect to be able to express their viewpoint, or their side of the story.
    2. Neutrality – Court users believe that decisions are made consistently based on sound legal principles by unbiased decision makers.
    3. Respect – Individuals are treated with dignity and their rights are protected.
4. Trustworthy authorities – Authorities care about the individuals before them and listen carefully to what they say. They address litigants’ needs by explaining decisions. (Burke and Leben, p. 6)

    The need for procedural fairness begins at the courthouse door (or on the courthouse web page) and permeates many aspects of the administration of the courts, as well as what occurs in the courtroom. Thus, things like good signage, a useful website, and helpful staff impact procedural fairness, as does what actually occurs when the defendant stands in front of a judge during hearings and trials.

    For more information about procedural fairness, what it means and how it affects courts, please go to the Procedural Fairness website.

    In order to understand how a blended learning model can be applied to the topic of procedural fairness, it is important to know what blended learning is. According to the Sloan Consortium, blended learning consists of courses or programs in which 30%-79% of the learning is offered online while the rest is face-to-face (Allen, Seaman, and Garrett, p. 5). The online portions and the face-to-face portions can be combined in whatever order best fits the learning objectives of the program. Online segments can be webinars, self-paced elearning, web-based research and activities, wikis, blogs, email and much more. Face-to-face portions can be lectures, workshops, seminars, discussion groups, or experiential learning modes such as field trips and interviews. For a longer discussion on a blended learning model for the courts, see Smith, N., Blended Learning: Seven Lessons Learned through Experience in the August 2012 NASJE News.

    In Washington, we sought to put together a blended learning series that incorporated many best practices for adult education. Based on input from our diversity committees, our education committees and two Supreme Court commissions, we knew the material was important and relevant to our intended audience of judges and court administrators. We looked for methods to chunk the content, spread the learning out over time, offer opportunities to practice what was being taught, and provide practice activities to be able to self-assess progress. Ideas for the learning series are based on the model Context-Challenge-Activity-Feedback, as explained by Ethan Edwards at Allen Interactions (Edwards, p. 6). In short, and in the context of eLearning, Edwards counsels instructional designers to provide a relevant, work-related context for the learning design so learners’ interest is heightened. He suggests providing realistic challenges for learners, and interactive activities to allow them to practice solving the challenges. Finally, he advises providing feedback that offers guidance to the learner, not just “you’re right” or “incorrect, try again.”

    To accomplish these goals and decide on learning modes that provided the necessary context, challenge, activity and feedback, several questions needed answers.

1. In what order and in what time frame should the various parts of the program be presented? (context)
    2. What groups in the court community would serve as resources and planners for the program? (context)
    3. Who could be the respected expert(s) to meet judicial officers face-to-face and teach the topic? (context)
    4. What did judicial officers need to know prior to the face-to-face session, if anything? (context)
    5. How would judicial officers and court administrators know how they were doing with respect to procedural fairness? How could they measure their current and future status? (activity, feedback)
    6. What concrete steps could judicial officers take to be able to see themselves through the public’s eyes in the courtroom? (activity)
    7. Could judges be persuaded to take concrete steps? How could they be persuaded of the value of the steps? (challenge)
    8. How could judges measure their progress in their courtrooms? (activity, feedback)
    9. How could judges and court administrators measure progress in their courthouse? (activity, feedback)

    One question we did have to answer was how to fund the project. The Washington Courts Blended Learning Project, a grant from the State Justice Institute, provided funds.

    We found a willing expert and advocate in Judge Kevin Burke of Hennepin County Courts in Minnesota. Judge Burke has conducted research on the topic and written a white paper and other articles about it, as well as speaking nationally. His passion for the topic served as an inspiration for the Washington Courts educators involved in the project.

With Judge Burke on board, we were able to enlist the help of the Washington Supreme Court Minority and Justice and Gender and Justice Commissions. The Diversity and Education Committees of the District and Municipal Court Judges’ Association and the Equity and Fairness Committee of the Superior Court Judges’ Association all climbed on board the project train early in the process. As it turned out, the judges on these committees took a big risk later in the project to make it real to the audience, as described below. Washington Court Education Services educator Nancy Smith and several support staff completed the project team. The importance of buy-in from the various commissions and committees in establishing the legitimacy of the project cannot be overstated.

    We decided on a four- or five-part learning series:

    • Part One: Read the Burke/Leben white paper called Procedural Fairness: A Key Ingredient in Public Satisfaction.
    • Part Two: Complete a Web-based Self-Assessment to measure procedural fairness throughout the court house.
    • Part Three: Attend a face-to-face session at spring conferences with Judge Burke and Washington judges as faculty.
    • Part Four: Participate in a webinar: Procedural Fairness: Real Steps for Real Improvement; facilitated by Judge Burke, with faculty from Washington State.
    • Part Five: Repeat the web-based self-assessment to monitor progress. This part is optional, but encouraged.

    To begin the series and establish a context for the topic, a lesson on the basics of procedural fairness was essential. We chose Procedural Fairness: A Key Ingredient in Public Satisfaction, a white paper of the American Judges Association written by Judge Burke and Judge Steve Leben as our basic “text.” Reading this article would help judges to understand what is meant by procedural fairness and how good techniques in this area could impact not only their workload, but also compliance with court orders, while also increasing public satisfaction. The article also cites sociological research on the topic, thus providing credibility about the value of the concepts put forth. We provided all judges and administrators with a hyperlink to the article and included a printed copy in conference materials.

In addition to having basic definitions and concepts, we decided it would be helpful for judges and administrators to have a good tool to assess where they stood with regard to procedural fairness in their courtrooms and courthouses. We searched for a means to allow and encourage them to measure their effectiveness. Through the assistance of a court educator in California, we found the answer in a report on procedural fairness in California published in 2011 by the Center for Court Innovation. We are grateful to Ms. Diane Cowdrey and Mr. Douglas Denton of the California Administrative Office of the Courts for sharing this report.

    Besides a wealth of information, tools, techniques and suggestions, this report contains a self-assessment for court leadership. We loved the idea of a self-assessment as an activity that would provide court officials with a means to see how procedural fairness impacts many aspects of their courts, as well as to get feedback about their effectiveness. They could also quickly see their strengths and weaknesses. The problem was persuading court officials to actually complete the survey.

    In order to make it as easy as possible for the court officials, we worked with our web designers to transform the paper assessment into an online tool, accessible at any time, easy to complete, and providing instant results. Another benefit realized from this method was that the data from the survey fed a spreadsheet from which analysis could be made and shared. The online self-assessment became part two of the project, with the hope that it could also be a part five for officials willing to reassess themselves at a later date. The assessment gathered information on date of completion and court level, but was otherwise totally anonymous.

    Over 200 judicial officers and court administrators completed the self-assessment, although not everyone completed it before the face-to-face session which constitutes part three. In addition to providing a contextual activity with feedback for participants, results of the self-assessment were used to guide the focus of part four of the learning series, a webinar, as will be explained below. Thus, parts one and two of the learning series provided in-depth background and context for the learning, as well as serving to inform judges of their own level of procedural fairness through an interactive activity.

    With the way prepared for him, Judge Burke presented face-to-face sessions at two judges’ spring conferences as part three of the series. In April, he presented a 2-hour session to Washington’s Superior Court Judges and Administrators, an audience of close to 200 people. In June, he returned to the state to present to the District and Municipal Court Judges at their conference. We videotaped the June presentation so that anyone who missed the live conference session could also view it when they wished to do so. In June, the presentation lasted three hours and included a listening self-assessment for 175 audience members.

While Judge Burke is an important and well-known figure nationally, one of our goals was to involve local judges in the conference sessions to provide relevance. To do this, two judges from the Equity and Fairness Committee of the Superior Court Judges’ Association and two judges from the Diversity Committee of the District and Municipal Court Judges’ Association agreed to be videotaped for an entire half day while presiding in their courtrooms. We shared the videotapes with Judge Burke, who subsequently worked with each of the four judges on what he saw in their videos and what techniques they could use to improve their fairness. During the live presentations, Judge Burke shared video of judges in other states, led discussion, and provided commentary on these videos. Next, the local judges testified to how the videotaping had helped them and what changes they planned in order to improve their procedural fairness. After the live session, judges had a solid understanding of what procedural fairness is, why it is important, and how it can make a difference for them in their jobs every day.

    The two conference sessions reached over 350 judges and court administrators and were highly rated for content and effectiveness. We also found that many more judges completed the online self-assessment after hearing the live presentation.

    The goal of the fourth part of the learning series, the webinar, was to provide concrete steps for improving procedural fairness in areas Washington judges identified as being less developed in their courts. It would also serve as a reminder of what they had previously learned, and add to their knowledge of the topic. Finally, it would suggest activities for judges and administrators to undertake in order to get more feedback about their progress on procedural fairness in both the courtroom and the courthouse. In preparation for the webinar, we analyzed the results of the self-assessment to identify where Washington judges believed they needed the most help.

    Table 1: Self-assessment analysis of responses for all courts completing the survey.

    KEY: W=Weak, D=Developing, S=Strong, Total=total number of respondents, %S=percentage of respondents indicating a strong response.

    The table shows the nine areas of self-assessment as developed in the California Courts report. As can be seen, Washington judges considered themselves and/or their courts weaker in three areas, as indicated not only by the number of weak responses, but also by the lack of strong responses. These areas are Understanding Court Proceedings, Ensuring a Voice in the Court, and Limited English and Culturally Diverse. Due to time constraints, faculty for the webinar decided to focus on the first two areas of need; another webinar is planned for the third area.

    During the webinar, we showed clips from the Washington judges’ videotapes, and the judges themselves provided commentary related to the specific clips they chose. We also offered concrete activities that judges and administrators could do to gather feedback and make improvements in each of the focus areas. We polled the audience during the presentation to encourage them to seriously consider being videotaped and to discover their opinions on relevant topics. We also asked for their feedback using chat about steps they have taken or would like to take in their jurisdictions.

    Sixty-five people attended the live webinar, called Procedural Fairness: Real Steps for Real Improvement, and another 80 have viewed the recording. According to polling taken during the webinar, 82% of participants said they are “very likely” to videotape themselves in order to see how they look and sound to court users, while 18% said “maybe.” No one said they definitely would not be videotaped. Participants also shared many ideas they thought would work in their courts to improve procedural fairness. Participants rated the webinar highly: for applicability to their jobs, likelihood of implementing what they learned, increase in content knowledge, and content delivery, over 90% agreed or strongly agreed in every case.

    While our polling and evaluations indicate high interest and satisfaction with the learning they experienced about procedural fairness, we have no way of knowing how much change has actually occurred, or is occurring, because of participation in all or part of the learning series.

    We do know several things:

    1. We reached many judges and court administrators through our efforts. Over 400 people participated in one or more of the components of the process.
    2. We did much more than simply present a conference session. We provided several concrete activities with opportunities for participation, discussion, or feedback to help make the learning more real. We chunked the content and spread it out over a five-month timeframe.
    3. There is a “buzz” around Washington Courts on the topic. Anecdotally, the author has heard judges from several different court levels talking about incorporating aspects of procedural fairness into sessions at the Washington State Judicial College. In addition, several conference sessions are planned for spring 2013 that tie into the topic—judges have stated this while planning the sessions, and as a reason for having the sessions. “This will tie in well with what we learned last year about procedural fairness.”
    4. Concepts integral to procedural fairness are appearing in plans for reorganization of the Washington State Judicial Branch.

    Will this learning series improve the public’s perception of procedural fairness in Washington’s Courts? One can always hope so. If nothing else, evaluations show that participants found the series relevant to their work, and they tell us they will apply what they learned in their jobs. If they follow through, positive change will occur.

    References

    Allen, I. E., Seaman, J., and Garrett, R. (2007) Blending In: The Extent and Promise of Blended Education in the United States, Sloan Consortium.

    Burke, K. and Leben, S. (2007) Procedural Fairness: A Key Ingredient in Public Satisfaction. The American Judges Association.

    Edwards, E. (2012) Creating e-Learning that Makes a Difference, http://info.alleninteractions.com/?Tag=CCAF.

    Porter, R. (2011) Procedural Fairness in California: Initiatives, Challenges, and Recommendations. Center for Court Innovation, New York, NY and Judicial Council of California/Administrative Office of the Courts, San Francisco, CA.

    Smith, N. (2012) Blended Learning: Seven Lessons Learned through Experience, NASJE News, https://nasje.org/blended-learning-seven-lessons-learned-through-experience/.

    Tyler, T. R. (2006) Why People Obey the Law. Princeton University Press.

    NASJE member Nancy Smith recently moved from the Washington State AOC to the Pima County Superior Court in Tucson, AZ, where she has assumed the position of Field Trainer. In this position, Nancy is responsible for providing training to six courts of limited jurisdiction in Pima County on topics such as the case management system, legislative changes, ethics, and more. She has worked in judicial branch education since joining the Washington State Administrative Office of the Courts (AOC) in September 2008 as a Court Education Professional. She has worked in education for most of her career, including 14 years as a teacher at the community college and secondary levels in Tucson. Prior to moving into court education, she assisted the Deans of Curriculum at the Evergreen State College in Olympia, WA, planning and producing the curriculum for Evergreen’s full-time programs. In a past life, she spent four years as an Army Intelligence Officer.

     Ms. Smith has produced education events for judges and court staff at all levels and in several different formats, most recently adding eLearning to her repertoire. She completed a certificate in Electronic Learning Design and Development at the University of Washington. She has organized a variety of webinar and self-paced learning modules for different court groups. In 2009, Ms. Smith was awarded a grant from the State Justice Institute to establish a model for blended learning (combining e-learning with face-to-face learning) for Washington Courts. The Procedural Fairness learning series was the last project for the grant.

     Ms. Smith has broad experience in multi-cultural education, and has traveled widely in the United States and abroad. A French linguist, she earned her bachelor’s degree from the College of William and Mary in Virginia, and her master’s in French Language and Literature from the Université Libre de Bruxelles in Brussels, Belgium. She is a certified community college and secondary teacher. She also studied Spanish at the University of Arizona. When not at work, she enjoys travel, gardening, and a variety of outdoor activities. 

     

  • Blended Learning: Seven Lessons Learned through Experience

    By Nancy Smith, MA

    In June of 2009, employees of the Washington State Administrative Office of the Courts gathered to hear the news about state budget cuts. Within minutes, State Administrator for the Courts Jeff Hall relayed that the money for the Presiding Judges conference had been eliminated from the budget. The Washington Court Education Services team had to figure out how to provide appropriate learning opportunities for this important group. The answer came a few months later when the State Justice Institute provided a grant of $25,000 to fund a test of a blended learning model for the courts. This grant took advantage of resources already on hand at the Washington AOC: skilled educators, well-established processes, and the newly acquired web-conferencing system Adobe Connect Pro and its associated eLearning software Adobe Presenter, all of which helped solve the problem of diminishing budgets.

    In this article, you will find a definition of blended learning, a description of the grant project, an outline of web-based learning capabilities used, and lessons learned for future blended learning projects.

    What constitutes blended learning? According to the Sloan Consortium, blended learning consists of courses or programs in which 30%-79% of the learning is offered online while the rest is face-to-face (Allen, Seaman, and Garrett, p. 5). To test a blended model for our courts, we envisioned three blended learning series, each consisting of two or three electronic learning modules and one single-day or part-day face-to-face module. Using this model, travel could be limited to one day for most participants, hotel and meal costs could be drastically reduced, and topics could still be dealt with in some depth while leaving time for networking and collegiality.
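    For readers who like to see the Sloan ranges laid out explicitly, the definition above can be sketched as a simple classification rule. This is only an illustration: the source cites just the 30%-79% blended range, and the other category labels and cutoffs shown here follow the same Sloan Consortium report, not this article.

    ```python
    def classify_delivery(percent_online: float) -> str:
        """Classify a course by the share of its content delivered online.

        The 30-79% "blended" range is the Sloan Consortium definition cited
        in the text; the remaining labels and cutoffs are taken from the same
        report and included here only for context.
        """
        if not 0 <= percent_online <= 100:
            raise ValueError("percent_online must be between 0 and 100")
        if percent_online >= 80:
            return "online"
        if percent_online >= 30:
            return "blended"
        if percent_online >= 1:
            return "web-facilitated"
        return "traditional"
    ```

    By this rule, a series that delivers three of four modules online (75%) falls squarely in the blended range, which is the shape of the model described below.
    
    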

    The Education Team proposed the creation of three (later increased to four) iterations of a blended learning model. For the electronic learning modules we used either webinars, which are live (synchronous) instructor-led sessions, or self-paced (asynchronous) eLearning sessions, followed by a face-to-face session conceived of as a capstone event. Important to the plan was the idea that all sessions needed to be interrelated and interactive, with ideas and concepts building on those previously taught. The face-to-face session was designed to provide the opportunity to synthesize, analyze, and evaluate the learning from the web-based events and to create a tangible product learners could use. We used the word “product” very loosely to mean anything from action plans, forms, bench cards, and checklists to skills acquisition through role plays, discussion groups, coaching, and the like. We wanted our face-to-face events to enhance education in ways that are difficult to reproduce in web-based sessions.

    We designed our first iteration of the model for the presiding judges and their court administrators. With budget development as the topic, we presented two live webinars that talked about leadership, relationships, and processes both within and outside of the courts. We recorded the webinars for future viewing by those unable to attend the live event. The self-paced module consisted of a nuts and bolts approach to budget controls and performance monitoring, complete with exercises for practice. The last event, a face-to-face symposium, brought together judges and administrators to share concerns and ideas with elected and appointed officials from the executive and legislative branches of Washington State and local government.

    The second iteration, for courthouse facilitators, had a similar design, although the three electronic modules were more sequential than in the budget series. The facilitators learned and practiced techniques on the topic of Temperaments and Dealing with Difficult People.

    In the third iteration, we combined several traditional teaching methods with electronic learning modalities. With Search and Seizure as the topic, participants completed a reading and self-test, answered discussion questions, considered hypotheticals, and participated in a live discussion via web-conference for each of four learning modules. Learners accessed all parts of the series through an Adobe Connect curriculum, which served as an online “one-stop shop.”

    Developing and teaching in the blended model proved to be so cost-effective that we were able to apply to SJI to continue the model testing with grant funds so far unused. Our fourth iteration is underway, and considers the topic Procedural Fairness. It is organized quite differently, and involves reading an article and completing a web-based court self-assessment, attending a live conference session with national speaker Judge Kevin Burke, participating in a webinar with Washington judges as facilitators, and completing the self-assessment a second time to gauge improvement. As can be seen from the contents of the learning series, the modalities used to present learning modules evolved with practice, and the order in which the different modalities occurred varied as time went on.

    This evolution occurred for several reasons. First of all, as we practiced and tested, we added different methods to our repertoire. Thus, by round three, we had added readings, self-tests, and discussion groups to the web sessions, and used those sessions like a classroom discussion instead of a lecture. By round four, we had added a web-based self-assessment designed to be completed twice, functioning as a pre-test and post-test for individual judges or courts.

    Thus, we arrive at lesson number one for creating blended learning: choose the modality to suit the learning objectives, not the other way around. Always choose the most effective means to teach your learning objectives. If, for example, the material requires working with content, not people, a self-paced module might be perfect. If your audience needs to access the information at their own pace, or at any time, self-paced is a good choice. If coaching is involved, face-to-face may be best. Wikis, threaded discussions, blogs and many other tools can be used. The possibilities are many, but the modality used should fit your learning objectives.

    Lesson number two: whichever modality is chosen, the learning needs to be interactive. As defined in Michael Allen’s Guide to e-Learning, instructional interactivity is “interaction that actively stimulates the learner’s mind to do those things that improve ability and readiness to perform effectively” (Allen, p. 94). This can be particularly challenging with self-paced learning, where there is no live instructor. In a self-paced design, a read-and-click module will likely put learners to sleep. Strive to create modules where learners have to make decisions and choices through scenario-based learning or simulations. For webinars, a talking head will drive learners directly to their email or other work. Create opportunities for interaction through polling, status checks, the chat pod, and, in smaller sessions, by opening the microphone to audience members. Experts advise an interaction anywhere from every 3-5 minutes to every 6-7 minutes (Hofmann, p. 12).

    In addition to interactivity, ensure all the elements of your blend are coordinated. This is lesson number three. If, for example, you require a self-paced lesson prior to learners attending a classroom event, devise a way to make sure they have completed the module on time. Do not teach again something you expect learners to know before they arrive in your classroom. You will frustrate those who did the work ahead of time, enable those who did not, and waste valuable teaching time on unneeded review. By the same token, allow learners to skip parts of a series they already know (Allen, p. 88). For example, if a judge has worked extensively on a particular kind of case, why should she have to read the case again to advance through your curriculum? A quiz could work as a “test-out” and allow the learner to move on.

    Lesson four: make sure the technology works. Quizzes, pre-work, webinars, job aids: whatever your modality, it should function the way you expect it to. Things that function well at a central location with powerful servers and excellent internet capabilities do not necessarily work in a remote location. For example, we stopped using webcams in our webinars because while audio usually works fine, video takes a lot of bandwidth and can be problematic. Screen resolution and older software can also be issues. Sometimes it is the computer user who is the problem. Be flexible, try modules on computers away from your system, and test, test, test. That said, don’t be afraid to make use of technology just because it might not work. If you do your homework, you will likely find a solution and be successful.

    Lesson five also has to do with success, and the lesson is: be patient! You will encounter resistance to online learning among your clients. Gradually increase your offerings, teaching their use and publicizing them carefully. Guide your learners to accept that online learning can be just as effective as face-to-face learning by producing excellent learning modules in appropriate modalities. It will take time. We have found more judges are beginning to participate in webinars, and there is greater acceptance of web-conferencing for meetings. Our court staffers often prefer self-paced modules because they can complete them at their own pace when they have time. You won’t persuade everyone right away, but with time, solid learning opportunities, interactive modules, and well-coordinated events, you will eventually build your audience.

    You not only have to convince your audience; you also have to convince your faculty. Lesson six is to train your faculty for their new role. For a self-paced module, they may fill the role of subject-matter expert while you do instructional design, while for a webinar, they should act more like a facilitator than a talking head. In a December 2010 Adobe event, trainer Ken Molay of Webinar Success provided valuable insights about presenting webinars in a recording called “Training Webinars 101.” A more advanced lesson on facilitating webinars comes from author Cynthia Clay in “Great Webinars: Crossing the Chasm to High-Performance Virtual Delivery,” sponsored by the eLearning Guild and recorded February 2, 2012. Consider asking your faculty facilitators to watch Mr. Molay’s presentation first and, as they build confidence, move them on to Ms. Clay’s presentation. Follow the blogs of various eLearning publications to improve your skills and your faculty’s skills in this domain.

    Lesson seven is that money talks: as long as you have a web-conferencing system and eLearning development software, it can be cheaper to produce a blended learning event, or any online event, than face-to-face training. Given the choice between no training and eLearning due to lack of funding, our audience chose eLearning. We limited travel time and expense by holding our face-to-face events in locations easily accessible to large numbers of learners. In addition to reducing travel expenses, our webinars and self-paced events limit time away from the office and make learning more accessible.

    It is not easy to persuade judicial branch clients of the value of these new ways of learning. It takes research, planning, testing, coordination, and a new skill set to successfully put together blended learning events, or any online learning event. It might provide perspective if you keep these numbers in mind as you tackle the topic: approximately 5.6 million U.S. college and university students enrolled in one or more online courses in the fall of 2009, reflecting a 21% increase in online enrollment compared to only a 2% increase in overall student population in the same period (Allen and Seaman, p. 8). College and university administrators predict continued growth in this sector. These students represent our future court employees, lawyers, judges, and educators. Soon, I believe we will find our learners demanding online learning opportunities from us.

    Is blended learning always the best solution? No, not always, but appropriately planned and implemented, it can be the best approach in many situations.

    References
    1. Allen, I. E., Seaman, J., and Garrett, R. (2007) Blending In: The Extent and Promise of Blended Education in the United States, Sloan Consortium.
    2. Allen, I. E. and Seaman, J. (2010) Class Differences: Online Education in the United States, 2010, Babson Survey Research Group.
    3. Allen, M. (2008) Michael Allen’s Guide to e-Learning, New Jersey: John Wiley & Sons.
    4. Hofmann, J. (2011) “Top 10 Challenges of Blended Learning,” Training Magazine, pp. 10-12.

    *****
    Nancy F. Smith joined the team at the Administrative Office of the Courts (AOC) in September 2008 as a Court Education Professional. She has worked in education for most of her career, including 14 years as a teacher at the community college and secondary levels in Tucson, Arizona. Prior to moving to the AOC, she assisted the Deans of Curriculum at the Evergreen State College in Olympia planning and producing the curriculum for Evergreen’s full-time programs. In a past life, she spent four years as an Army Intelligence Officer.

    At the AOC, Ms. Smith supports the Appellate Judges, County Clerks, Presiding Judges and District and Municipal Court Judges Education Committees. In addition, she enjoys running the Institute for New Court Employees in the fall. In June of 2010, Ms. Smith completed a certificate in Electronic Learning Design and Development at the University of Washington. She has organized a variety of webinar and self-paced learning modules for different groups supported by the AOC. In 2009, Ms. Smith was awarded a grant from the State Justice Institute to establish a model for blended learning (combining e-learning with face-to-face learning) for Washington Courts.

    Ms. Smith has broad experience in multi-cultural education, and has traveled widely in the United States and abroad. A French linguist, she earned her bachelor’s degree from the College of William and Mary in Virginia, and her master’s in French Language and Literature from the Université Libre de Bruxelles in Brussels, Belgium. She is a certified community college and secondary teacher. She also studied Spanish at the University of Arizona.