This case study provides critical insights into how coaching creates value in an organization.

Phillips, J. J. (2012). "Coaching for Business Impact: Creating Value, Including ROI, Through Executive Coaching." In Phillips, P. P. and Phillips, J. J., Measuring the Success of Coaching. ATD Press, pp. 183–201.


One of the aims of the Japan Institute of Workers’ Evolution (JIWE) is to contribute to the industrial development of Japan by promoting opportunities for female workers to make full use of their vocational abilities and skills.


One of the joys of writing this column is that it gives me the excuse to pause and reflect upon my recent learning about research and practice and to consider how it may be of interest to you, my readers. Over the last month or so I have been particularly taken by what it means to be a master practitioner and how research can help us attain that level of practice. It started with my recent move to Middlesex University (England) as Director of Programs for their Professional Doctorate Program. The candidates in this program undertake a doctorate in and through their own practice. Unlike the conventional doctorate where the focus is on academic knowledge, this doctorate’s focus is on practice itself, including all the messiness of real life and context. Within my new role, I have the opportunity to work with senior practitioners from a range of professions and talk with their professors and senior academics. It is fascinating to note that we are all intrigued by the question, “What makes mastery?”

All of us are struck by the great similarities between different disciplines–it seems that the process is the same, although the technical knowledge may be vastly different. For example, I had the delight of working with Dr. Susan Melrose, a professor of the performing arts, and I loved her perspective–to quote: “Disciplinary mastery is always relational: it is undertaken somewhere, by and for someone, with reference to (and thereby rearticulating the terms of) one or another disciplinary tradition”–this has a resonance for me when thinking about coaching. As we meet with our clients we are co-constructing a ‘performance’ with them. As we seek to probe what mastery really looks like and how it can be acquired, we are in the same realm as the performer seeking to construct a depiction of Hamlet or Sleeping Beauty which communicates and explores anew some aspect of what it means to be human.

The question of mastery has real power for coaching when we consider where we are as a profession. If we are to construct the boundaries of what constitutes our body of knowledge and practice, we need to be able to articulate in a clear manner what it means to be a master practitioner in our field. Here we differ from a performance artist in that we need to differentiate ourselves from other related disciplines. The academic requirements, i.e., the amount of stuff we need to know, are relatively straightforward. They are not easy, but they are straightforward. There may be differences in the focus of some courses depending upon the preference of the professors teaching them–but the amount and depth of study are monitored by the university accreditation boards and audited against the standard of a current body of knowledge in the area. However, with all due respect, we know that passing a master’s degree is not indicative of mastery in a profession. A master’s degree identifies that you have the required technical knowledge, NOT that you have the required professional knowledge and skills. For this we need to develop–through practice–the professional know-how and ‘gut feel’ indicative of a seasoned practitioner. This is the elusive but necessary ingredient of mastery.

So what might it be? The literature shows us a variety of perspectives and comes up with ‘practice wisdom’ and ‘expert intuition,’ both of which try to identify the process by which a practitioner produces a decision or constructs a flexible innovative intervention within the context they find themselves, i.e., their particular client or situation. It is relational, as Susan Melrose says. Let us take a moment to reflect: When was the last time you surprised yourself in practice and thought, “I wonder where that came from? Why did I do that? It worked but where did I get it from?” Probably quite recently! Your expert intuition was in full flight. You probably rationalized your decision or design AFTER the event, but it arrived like magic at the time. As Schön1 would have said, you were ‘knowing in action.’

We are starting, as researchers, to get some sense of what is happening in practice wisdom so we can help practitioners attain the holy grail of mastery. It is not appropriate to call it ‘intuition’ –expert or not–as this is a catchall phrase suggesting it is innate and without rational basis. My own view is that we are working with a kaleidoscope (I thank one of my students, Steve Wigzell, for this metaphor), each color contributing to the pattern is one aspect of what we are bringing to the interaction. For instance, we will bring technical knowledge from various disciplines: learning theory, change management, etc., but also our knowledge of context, the pragmatics in operation, our own values and beliefs, our experience in similar situations, etc. All these and more are part of the color spectrum we have in our kaleidoscope. For each client and situation, we rotate the kaleidoscope again to produce a pattern unique and specific to that client and situation.

The creation of each new pattern has to happen fast and effortlessly ‘in the moment’ through ‘reflection in action,’ and, as such, is the result of using images, examples, and understandings achieved through practice. A person’s performance nearly always uses several kinds of knowledge (technical, experiential, etc.) in some integrated form and is influenced by both context and feelings.

What recent research has shown is that the transition from novice to competent practitioner can happen when one or two areas of work are mastered. The transition from competent to master practitioner needs the practitioner to not only be using a broad and deep knowledge base, but also to be actively creating knowledge by applying their expertise in new arenas. To create new knowledge, experts must be well versed in the problems and methodologies of the field in which they work and actively engaged in problem finding. These experts are posing questions and instituting investigations that push the boundaries of their work.

So there we have it–if you want to develop expertise and be a Master Practitioner, you must be a problem finder and hence a researcher!

Enjoy your problem finding!

This article first appeared in Business Coaching Worldwide (October Issue 2010, Volume 6, Issue 3).


1 D. Schön, The Reflective Practitioner (New York: Basic Books, 1983). An old one but a good one, and well worth a read.

Dr. Annette Fillery-Travis

Dr. Annette Fillery-Travis is a senior researcher and education coach with the Professional Development Foundation. The author of more than 60 research articles and studies, her book, The Case for Coaching, presenting a literature review with research case studies and interviews from over 20 organizations on coaching efficacy, was published in 2006 by CIPD, UK.

I have recently experienced some of the gifts offered to coaches worldwide to enable them to develop their discipline. These include practitioner research, international conferences, and research grants. My first column for the year discusses the importance of these gifts and how we can make good use of them.

Practitioner Research and Reflective Practice

What do we really know about how coaching works, exactly how well it works, and when it works best? In essence, not much. Our “knowledge” is based mainly on what coaches say they do, or on what they think makes sense, rather than on observation of what they really do, or on research into coaching outcomes experienced by individuals, teams, and organizations. As a coaching practitioner, it is essential to continually research your own practice, ultimately developing your own professional competence through reflective practice.

David Peterson (2009) suggests simple ways to do this. First, try different techniques in your coaching: for example, with alternate clients conduct a background interview only one-third the length of your normal interview, then see what happens and take notes on what you observe. Second, generate a list of experimental ideas for your coaching from reading about new techniques, new types of questions, or new processes; try one new thing every coaching session and record your findings. Third, ask your coaching participants what was the most effective thing you (as coach) did in the session, and why it was helpful.

Also ask what was the least effective thing, and why it was not helpful. Record your feedback, look for patterns, and substitute new processes for the least effective elements. Think about participating in coaching research studies, or finding clients from your own practice to participate in such studies. Most importantly, read and think critically about current coaching research, and try to incorporate its findings into your own practice.

The general characteristics of practitioner research are that (Fillery-Travis, 2009):

Coaching Conferences

Coaching in Medicine and Leadership

In late September 2009, I attended and spoke at the second International Harvard Coaching Conference on Coaching in Medicine and Leadership. Coaching has emerged as a competency dedicated to helping individuals grow, develop, and meet personal and professional goals while at the same time building personal and professional capacity and resilience. Coaches now service a market worth some US$1.5 billion a year, yet the most developed market segment is leadership coaching in organizations; less than 20 percent of professional coaches come from the mental health or medical fields. The Harvard conference was therefore a groundbreaking event, with lectures and workshops by world leaders in coaching and coaching research. There were three tracks: Overcoming the Immunity to Change; Coaching in Leadership – Theory and Practice; and Coaching in Health Care – Research and Application.

ICRF2 London: Measuring Results

In November I participated in the second International Coaching Research Forum (ICRF2) held in London, sponsored by the IES (UK Institute for Employment Studies) and the International Coaching Research Forum (ICRF). ICRF2: Measure for Measure looked specifically at how to design coaching measures and instruments, with the ultimate aim of discovering what makes coaching effective. Researchers from around the world met to discuss three major topic groups: process measures, outcome measures for executive/leadership coaching, and outcome measures for health, wellness, and life coaching. The format for each discussion was:

  1. Discussion of what inputs should be measured;
  2. Identification of aspects of the coaching process to be measured;
  3. Identification of outcomes to measure, based on coaching purpose;
  4. Specific suggestions on how best to measure areas of greatest interest.

Critical issues in measurement and methodology were discussed, the biggest concerns relating to:

  1. How do we evaluate instruments and measures? What are the important considerations, such as reliability, validity (quantitative research), and trustworthiness (qualitative research)?
  2. How can we incorporate measures into our research? What are the issues and considerations in research design and methodology for incorporating measures and interpreting results?
  3. What qualitative research issues have arisen in recent coaching research?
  4. What are some of the most compelling coaching topics and challenges and how can they be measured?

A final report will be made available on the websites of both the International Coaching Research Forum and COMENSA (Coaches and Mentors of South Africa) early next year. All of the group forums were recorded, and key points from each discussion will be included in the final report.

GCC Rainbow Convention-Cape Town 2010

These recent conferences have implications for all coaches worldwide, and particularly for the work being carried out by the Global Coaching Community (GCC), an international dialogue aimed at furthering the development of coaching. The GCC’s last convention took place in Ireland in July 2008 and produced the momentous Dublin Declaration on Coaching. The declaration was supported by recommendations from the GCC’s ten working groups, and has been endorsed by organizations and individuals representing over 15,000 coaches around the world.

It is now South Africa’s turn to host this pivotal event and help take the dialogue forward, and so the GCC Rainbow Convention will be held in Cape Town during 10-16 October 2010. The convention will showcase the results of pioneering practitioner research being undertaken by “pods” of coaches around South Africa. It will also continue the development work undertaken by the GCC’s ten working groups, as well as host specialist workshops on aspects of coaching practice.

Grants from the Institute of Coaching

Another boost to the professional development of coaching practitioners is an endowment of US$2,000,000 from the Harnisch Foundation to the Institute of Coaching based at Harvard Medical School/McLean Hospital. The Institute is able to translate this generous endowment into grants totaling US$100,000 per year to fund rigorous research into coaching, thereby helping develop the scientific foundation and professional knowledge base of the field.

The Institute offers four types of grant, with deadlines for applications on the first day of February, May, August, and November each year:

  1. Graduate student fellowships of up to US$10,000 for high-quality research projects. To qualify, applicants must be Masters or Doctoral candidates looking for financial support for dissertation research on coaching.
  2. Research project grants of up to US$40,000 annually for individuals who would like to conduct empirical research in coaching.
  3. Research publication grants of up to US$5,000 to assist with the writing, editing, and publication of coaching research in peer-reviewed journals.
  4. Travel awards to cover travel expenses related to presenting coaching research at the annual Harvard Coaching Conference.

Please visit the Institute’s website to learn more about its various grants, membership programs, current research, and publications, and for information on the recent Harvard Conference. As a Founding Fellow of the Institute of Coaching and a member of its Research Advisory Board, I am keen that all practitioner researchers in coaching are aware of these research grants. It is crucial that we begin to build the body of knowledge on what is working and what still needs work within the discipline of business coaching worldwide.

How Can You Play a Part in the Development of the Field?

Our goal in developing reflective research and enquiry is to make a substantial contribution to the emerging practice of coaching worldwide (Stout Rostron, 2009). Your gift to our emerging discipline is to play a practical part. For example, you can:

Systemic Team Coaching Research Survey

Professor Peter Hawkins, creator of the Seven-Eyed Supervision Model1 and founder of the UK Bath Consultancy Group, is currently writing a new book on Systemic Team Coaching to be published by Kogan Page in 2010. He would like this book to best represent what is known and practiced in the field of team coaching. He is asking thought leaders, leading researchers, and senior team coaches to contribute from their experience. All contributions will be fully acknowledged and you will be referenced. Everyone who fills in the questionnaire will also be invited to the book launch in the UK in autumn 2010. Key questions are as follows:

Please email responses to: Professor Peter Hawkins or send to Barrow Castle, Rush Hill, Bath, UK BA2 2QR.

This article first appeared in Business Coaching Worldwide (February Issue 2010, Volume 6, Issue 1).


Fillery-Travis, A. (2009). Practitioner Research Workshop, GCC Rainbow Convention, notes.

Peterson, D. (In press). “Executive Coaching: A Critical Review and Recommendation for Advancing the Practice.” In Handbook of Industrial and Organizational Psychology, edited by S. Zedeck. Washington, DC: American Psychological Association.

Stout Rostron, S. (2009). Business Coaching Wisdom and Practice: Unlocking the Secrets of Business Coaching. Johannesburg: Knowledge Resources. Available from

Wilkins, N. (2009). “Countdown to the GCC Rainbow Convention!” COMENSAnews, November. Available from

1Hawkins, P. and Shohet, R. (2007) Supervision in the Helping Professions. United Kingdom: Open University Press.

Dr. Sunny Stout Rostron, DProf, MA

Dr. Sunny Stout Rostron, DProf, MA is an executive coach and consultant with a wide range of experience in leadership and management development, business strategy and executive coaching. The author of six books, including Business Coaching Wisdom and Practice: Unlocking the Secrets of Business Coaching (2009), Sunny is Director of the Manthano Institute of Learning (Pty) Ltd and founding president of COMENSA (Coaches and Mentors of South Africa).

In my writing so far, I hope I have whetted your appetite for coaching research and put a convincing argument that it cannot be left as an “academic” pastime, but should be part of every practitioner’s arsenal.

Most of us do not have the time to carry out research per se, but given that our profession is in its infancy, there is much to discover in the literature about the true potential of what we can offer as coaches and how this can impact upon our clients and their organizations! We can contribute to the growing body of knowledge ourselves by delving into journals and articles, discussing “hot topics” within our networks and generally making our literature our own. As practitioners we contribute a valuable perspective when we talk with our peers and to academic researchers.

Over the last year I have been contributing to a working group on research as part of the Global Coaching Convention. This convention was established to create a collaborative framework of stakeholders in coaching with the aim of professionalizing the industry. Quite a job and at times I think the size and complexity of the aim has daunted even the hardiest souls. As in any undertaking of this size, there has been a debate about the value of such an initiative. Detractors say that the coaching community has grown organically so far and it should be left to continue doing so. Others say that the convention is taking on too big a job, and with so many diverse agendas on the table, there is little hope of getting collaboration or consensus so people are wasting their time. I will not go further into the debates other than to mention that if any of our clients came out with such a view, we might well consider challenging it! But enough of my soap box, as a good researcher I shall admit my bias and point everyone in the direction of the GCC’s website for the latest news and events.

Sunny Stout Rostron and Carol Kauffman chaired and facilitated the research working group and they did a grand job in challenging our process and thinking as well as generally bringing the project home. The core piece of work was a review of where we are, as a community, in terms of our research. Sunny and Carol will be publishing the full piece in the near future, but I would like to share with you some of the thinking it sparked with me.

First and foremost, we agreed that if we are thinking of moving to becoming a profession then we have to define what our body of knowledge is—what is it that makes our offer different to that of occupational psychologists, management consultants or other related fields? Research is the route to defining our knowledge. Even if we are simply looking at, and comparing, each other’s practice we are engaging in research.

The second point that struck a real note with me was our discussion around whether we should define what is “good” and what is “bad” research. This question and its real depth gets in the way of many of us entering the world of research. It throws up all kinds of questions about what is the “correct” way of doing it, reporting it or even defining it.

Let us first consider our purpose in doing research. For me and many others it is to find something out or to learn, and the best evidence of learning is to change behaviors. So we are effectively saying to our colleagues:

“Trust me—I have looked at this issue and found XYZ. You can now take my findings and apply them directly to your practice.”

That is quite a thing to say. We are suggesting people change their practice and behaviors because of what we have found out. To do this (and still sleep at night) we need to know that we are right (or valid) and not leading people down the garden path on a scenic route to nowhere. Some researchers have taken the easy route out of this dilemma and stuck to one way of doing research, irrespective of the question they are asking. Usually the method of choice is a controlled experimental study where one group gets coaching and one group doesn’t, and at the end there is some measure of impact on behaviors (with everyone hoping there is a positive effect on those who have been coached)!

Everyone breathes a sigh of relief as they are doing a scientific study and don’t have to justify themselves any further. Oh if only life was that easy! As we have discussed before, what you research and how you do it is determined by the question you are asking NOT the other way around. A controlled experiment would be terribly complicated and confusing if we wanted to explore how and what elements of the coaching engagement are of most value to a diverse range of clients. Trying to control for all the factors that would be different between groups would make it untenable (and unusable).

Identifying the research method used as the main differentiator between good and bad research is therefore not a sensible path and will only lead to restricting the type of research question we will be able to ask (and answer). Our criteria for whether an inquiry is “good” research must be: Is there coherence between the question, the method used to research it and the analysis undertaken, and has everything been done to the standards of good practice? Let us leave the question of what is good practice to one side for another day and take that as read; we can then be happy to consider good research to include any method or even mix of methods that makes sense for the question we are asking.

The same thinking should be brought to bear on the question: What is the best research to do? Everyone wants to do research that will set the world alight, but choosing a topic isn’t easy. Governments have been engaged in foresight exercises for many years trying to second guess the research investment they should make to enable them to meet the challenges for the future. They have invested a significant amount of money, but it has resulted in quite a lot of what has been described as crystal gazing.

Experience has shown that such exercises are useful for mapping current drivers for research, but usually fail to foresee the big issues for the future, e.g., the exponential rise in the use of the mobile phone and the personal computer. If we cannot see what will be the main issue for the future then the best research to do is the research that speaks to you and your practice, i.e., the research that follows your passion. Chances are that your passion will be shared by others—go ahead and ask them—and if that is the case then you can be confident that there will be an audience for your work.

This article first appeared in Business Coaching Worldwide (October Issue 2008, Volume 4, Issue 3).

Worth Reading:

As an introduction to how people are thinking about research for the future, have a look at these two papers. If you do not have access to these journals through a library or database, then just go to the website of the journal and order the download direct to your computer.

Linley, Alex P. 2006. “Coaching Research: Who? What? Where? When? Why?” International Journal of Evidence Based Coaching and Mentoring Vol. 4, No. 2 (Autumn): 1

Bennett, J.L. 2006. “An Agenda for Coaching-Related Research: A Challenge for Researchers.” Consulting Psychology Journal: Practice and Research Vol. 58, Part 4: 240-249

Dr. Annette Fillery-Travis

Dr. Annette Fillery-Travis is a senior researcher and education coach with the Professional Development Foundation. The author of more than 60 research articles and studies, her recent book The Case for Coaching, presenting a literature review with research case studies and interviews from over 20 organizations on coaching efficacy, was published in 2006 by CIPD, UK.

I am sometimes surprised to learn that coaches are failing to gather a range of information from their clients prior to the commencement of the coaching process. In the absence of such information, how is it possible to calculate an ROI?

In the late 1950s, Donald Kirkpatrick introduced a model for evaluating the benefits of training. This same model is used today by training and human resources departments to evaluate the ROI of coaching. The model has four levels:

  1. Reaction: How well did the client like the coaching?
  2. Learning: What principles, facts and techniques did the client learn?
  3. Behavior: What changes in job behavior resulted from the coaching?
  4. Results: What were the quantitative results of the coaching in terms of reduced costs, improved performance, improved efficiency, etc.?

Each level in the model requires information from both the client and the organization. In order to be of value, the information must be gathered before, during and after the coaching process.
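As a sketch, the four levels and the before/during/after data-gathering can be organized in code. The class, field, and phase names below are illustrative assumptions, not part of Kirkpatrick's model:

```python
# Illustrative sketch: organizing coaching-evaluation data by Kirkpatrick level
# and by gathering phase (before, during, after the coaching process).
from dataclasses import dataclass, field

PHASES = ("before", "during", "after")

@dataclass
class LevelRecord:
    level: int       # 1=Reaction, 2=Learning, 3=Behavior, 4=Results
    question: str    # the question this level asks
    data: dict = field(default_factory=lambda: {p: [] for p in PHASES})

    def record(self, phase: str, observation: str) -> None:
        if phase not in PHASES:
            raise ValueError(f"phase must be one of {PHASES}")
        self.data[phase].append(observation)

evaluation = {
    1: LevelRecord(1, "How well did the client like the coaching?"),
    2: LevelRecord(2, "What principles, facts and techniques did the client learn?"),
    3: LevelRecord(3, "What changes in job behavior resulted from the coaching?"),
    4: LevelRecord(4, "What were the quantitative results of the coaching?"),
}

# Hypothetical observations for Level 3 (Behavior):
evaluation[3].record("before", "Baseline 360-degree feedback collected")
evaluation[3].record("after", "Follow-up 360-degree feedback collected")
```

The point of the structure is simply that every level carries slots for all three phases, so a missing "before" measurement is visible rather than silently absent.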

In an article entitled An ROI Method for Executive Coaching: Have the Client Convince the Coach of the Return on Investment (2005), Mary Beth O’Neill outlines a method for engaging the client in taking responsibility for the gathering of this data. By taking ownership of their learning and measuring their progress and outcomes, clients support their own development through the coaching process.

A recent client of mine began to keep a reflective journal of each of our coaching sessions. In this journal, he identified different areas on which he wanted to comment (leadership, finances, emotions, family, learning, and ideas). After each session, he would record what he had learned about himself during the coaching or note something that had stimulated his thoughts or feelings. This journal became an invaluable resource for the client, as he would often revisit entries that were several months old, reflecting on how much he had changed and the progress he had made.

He and I took the outcomes he had recorded in his journal and applied them to Kirkpatrick’s model. The Results (Level 4 of the model) showed improvement in his leadership style, his interaction with his staff, the speed with which he could think creatively, and his understanding of self on an emotional level.

With all of this in mind, what data should you be gathering? Be very clear and specific about what, why and when you require any data from your client or from the organization. Provide a clear context for the use of the information.

A list of potentially useful data to gather is provided below. This list is by no means exhaustive:

Once you have captured your data, apply it to the different levels of Kirkpatrick’s model.

One small but rather important tip is to remember that coaching results occur in both the short term and the long term. ROI calculations sometimes focus solely on the “now,” computing gains, savings, and losses as of the completion date of the coaching partnership. Yet the benefits of coaching continue long after the coaching relationship has ended. Building in an evaluation at the completion of the coaching process, and then re-evaluating results at subsequent time intervals, will provide you with some excellent information. And, if the client takes ownership of the process and can see the benefits for him or herself, the result is a great win-win for both of you.

This article first appeared in Business Coaching Worldwide (Spring Issue 2006, Volume 2, Issue 1).


O’Neill, Mary Beth. 2005. “An ROI Method for Executive Coaching: Have the Client Convince the Coach of the Return on Investment.” International Journal of Coaching in Organizations 3:39-47.

Bronwyn Bowery-Ireland

Bronwyn Bowery-Ireland is the CEO of International Coach Academy, an international coach training school. She has been an executive coach for over 10 years.

Business coaching is expanding as a means of improving programs, processes, and even people. Sponsors, clients, and corporate executives–those who fund coaching activities–want to hear about successes in terms that they understand, terms related to organizational needs. Everyone might know that a coaching program made a positive difference, but someone insists on getting to the bottom line: What did we spend, and what did we get in return? The following is a brief summary of a real-life case study of a coaching intervention demonstrating measurement and evaluation, including the calculation of the return on investment (ROI).


A US-based, internationally established, prosperous hotel company, the Nations Hotel Company (NHC), sought to maintain and improve its status in the highly competitive hospitality industry. With hotels in 15 countries, 98% brand awareness worldwide, and a 72% customer satisfaction rating, NHC wanted to help executives find ways to improve efficiency, customer satisfaction, revenue growth, and retention of high-performing employees. Challenged to execute this project, the Nations Hotel Learning Organization (NHLO) developed a program whose pivotal component was a formal, structured coaching program called Coaching for Business Impact (CBI). NHC corporate executives wanted, as part of the process, to see the actual ROI for the coaching project.


The NHLO first surveyed executives to identify learning needs and to assess their willingness to be involved in coaching. Most of the executives indicated that they would like to work with qualified coaches to assist them through a variety of challenges and issues, and that this would be an efficient way to learn, apply, and achieve results. The measurement and evaluation goal for the senior executive team was to assess results for 25 executives, randomly selected (if possible) from the participants in CBI. Figure 1 depicts the 14 steps in the new coaching program, from the beginning to the ultimate outcomes. For the planned ROI analysis, step #4 was critical; executives made a commitment to provide data on action plans and questionnaires.


Although these steps are self-explanatory as to the coaching process, the ROI process involved gathering data throughout the coaching engagement so that evaluation results could be reported for all five levels:

  1. Reaction;
  2. Learning;
  3. Application;
  4. Business impact;
  5. ROI.

To collect complete and reliable data for Levels 4 and 5, executive-participants completed action plans that posed the following questions about each of the four business impact measures they sought to improve:

1. What is the unit of measure?
2. What is the value (cost) of one unit in monetary terms?
3. How did you arrive at this value?
4. How much did the measure change during the evaluation period? (Monthly value)
5. What other factors could have contributed to this improvement?
6. What percentage of this change was actually caused by this coaching for business impact program?
7. What level of confidence do you place on your estimate of the change attributable to this program? (100% = Certainty and 0% = No Confidence)

Using the action plan responses and collecting data through executive questionnaires, senior executive questionnaires, and company records, the NHLO obtained information to convert data to monetary values (items 1-4 above), to isolate the effects of the coaching on this business impact data (items 5-6 above), and to adjust for errors in estimation (item 7 above).
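The adjustment arithmetic implied by items 4-7 can be sketched in a few lines of code. The figures below are hypothetical, not data from the study: a reported monthly improvement is annualized, then discounted by the participant's attribution estimate (item 6) and confidence level (item 7).

```python
def adjusted_annual_benefit(monthly_value, contribution_pct, confidence_pct):
    """Annualize a monthly improvement, then apply the two conservative
    discounts from the action plan:
    - contribution_pct: share of the change attributed to coaching (item 6)
    - confidence_pct: participant's confidence in that estimate (item 7)
    """
    annual_value = monthly_value * 12
    return annual_value * contribution_pct * confidence_pct

# Hypothetical example: a $10,000/month improvement, 60% attributed
# to coaching, reported with 80% confidence.
benefit = adjusted_annual_benefit(10_000, 0.60, 0.80)
print(benefit)  # 57600.0
```

Summing these adjusted values across responding participants yields a conservative total benefit figure of the kind reported below.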

Evaluation Results

Careful data collection planning allowed the NHLO team to measure the results of the coaching program at all levels. Level 1: Reaction, Level 2: Learning, and Level 3: Application all showed positive results and comments.

Impact: To assess the business impact, the NHLO team assimilated the information on the action plans for the 22 CBI executive-participants who responded. Using these responses, the NHLO arrived at a total adjusted value of $1,861,158 for the program's benefits.

ROI: The fully loaded costs of the CBI program included both the direct and indirect costs of coaching (needs assessment/development, coaching fees, travel, time, support, overhead, telecommunications, facilities, and evaluation). CBI costs for the 25 executives totaled $579,800.
Using the total monetary benefits ($1,861,158) and the total program costs ($579,800), the NHLO developed two ROI calculations. The first is the benefit-cost ratio (BCR), the monetary benefits divided by the costs:

BCR = $1,861,158 / $579,800 = 3.21

This value suggests that for every dollar invested, $3.21 was returned. The ROI formula for investments in any human performance intervention is calculated as it is for other types of investments: earnings (net benefits) divided by investment. For this coaching solution:

ROI (%) = ($1,861,158 - $579,800) / $579,800 × 100 = 221%

For every dollar invested in the coaching program, the investment dollar was returned and another $2.21 was generated. In this case, the ROI far exceeded the 25% target.
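The two calculations can be reproduced directly from the totals reported above (the variable names here are illustrative, not part of the ROI Methodology):

```python
# Benefit-cost ratio and ROI from the CBI program totals.
benefits = 1_861_158   # total adjusted monetary benefits
costs = 579_800        # fully loaded program costs

bcr = benefits / costs                      # BCR = benefits / costs
roi_pct = (benefits - costs) / costs * 100  # ROI = net benefits / costs

print(f"BCR = {bcr:.2f}")       # BCR = 3.21
print(f"ROI = {roi_pct:.0f}%")  # ROI = 221%
```

Note the distinction the two ratios express: BCR compares gross benefits to costs, while ROI compares net benefits to costs, which is why the ROI figure is exactly one less than the BCR (times 100).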

Intangibles: The NHLO chose not to convert all measures to monetary values; instead, it compiled a list of intangible benefits: improved commitment, teamwork, job satisfaction, customer service, and communication.


Credibility of the data and of the ROI process itself is always critical. The NHLO made a convincing case for the CBI program by drawing data from credible sources (executives and company records), collecting it conservatively, isolating the program's impact, adjusting for errors in estimates, using only first-year benefits in the analysis, fully loading program costs, and reporting results at all levels.


To communicate results to target audiences, the NHLO produced three documents:

To convey a clear understanding of the methodology, the conservative process, and the information generated at each level, the NHLO team held meetings with the sponsor and other interested senior executives. The conservative, credible process and competent communication led senior executives to decide that, with a few minor adjustments, they would continue to offer the program on a voluntary basis. Pleased with the process and the progress, they were delighted to have data connecting coaching to business impact.

This article first appeared in Business Coaching Worldwide (Fall Issue 2005, Volume 1, Issue 3).

Jack J. Phillips, Ph.D

Jack J. Phillips, Ph.D, is a world-renowned expert on measurement and evaluation, chairman of the ROI Institute, and consultant to many Fortune 500 companies. He facilitates workshops for major conference providers throughout the world. His most recent books are Proving the Value of HR (SHRM, Winter 2005) and Investing in Your Company’s Human Capital (AMACOM, Spring 2005). Find out more about Jack’s work at

As businesses and organizations increasingly turn to coaching for performance improvement and leadership development, questions about the value of coaching naturally arise, and calculating the return on investment (ROI) of coaching can seem daunting. Here are five of the questions business coaches most frequently ask about measuring the ROI of coaching.

  1. Do I have to learn finance and accounting principles to understand and effectively measure the ROI in coaching? No. Most of the basic principles of finance and accounting aren't required for developing the ROI in business coaching, but it is important to understand concepts such as revenue, profit, and cost. Ultimately, the payoff of coaching, or of any human resources project or program, will be based either on direct costs saved or on additional profits generated, so it helps to understand the nature and types of costs and the different types of profits and profit margins.
  2. Do I have to know complicated statistics to understand ROI? No. Basic statistical measures (simple averages, variance, and the standard deviation) are all that are necessary to develop most ROI impact studies. By design, the process is simplified as much as possible so that the ROI can be determined successfully for all types of business coaching solutions.
  3. Isn’t ROI too complicated for most business coaching professionals? No. The ROI calculation itself is a very simple ratio: net benefits divided by costs. The process follows a methodical, step-by-step sequence with guiding principles for collecting data and calculating benefits and costs. In all, six types of data are collected: reaction, learning, application, impact, ROI, and intangibles (to review, see Figure 1). What can make the process somewhat complicated are the many options at each step. Several methods are available for isolating the effects of coaching and for converting data to monetary values, and the choice of data collection methods for a given assignment will depend on the nature of the coaching engagement and its particular environment and setting. To keep the process simple and clear, the coach, the participant, and the sponsor or client organization should establish the parameters and expectations for the coaching experience at the beginning of the assignment.
  4. Shouldn’t business coaches focus on the human dynamics rather than on the numbers? Certainly, within the coaching assignment the business coach’s attention is on the coaching task and on developing rapport with the participant so that learning and change can happen. The astute coach and coaching firm will realize the need for accountability and for measurement and evaluation of the coaching engagement, including ROI. Assessing the value of business coaching and reporting that information to decision-makers enhances the likelihood of continuing and even increasing the opportunities to coach.
  5. Isn’t this just a fad? No. This methodology is comprehensive, consistent, and credible. ROI has been used as a business evaluation tool for 300 years. Although ROI has only recently begun to be used to evaluate coaching, its significance as the benchmark in measurement and evaluation is well-established and well-documented.
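The simple statistics mentioned in question 2 need nothing beyond the standard library. A minimal sketch, using purely hypothetical monthly impact figures:

```python
import statistics

# Hypothetical monthly revenue-growth improvements (in $000s)
# reported by five coaching participants.
values = [12, 18, 9, 15, 21]

mean = statistics.mean(values)      # simple average
var = statistics.variance(values)   # sample variance
sd = statistics.stdev(values)       # sample standard deviation

print(mean, var, round(sd, 2))
```

The average summarizes the typical improvement, while the variance and standard deviation show how much results spread across participants.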

Measuring and evaluating the return on investment validates the critical role of coaching as a performance improvement solution. Expressing value in monetary terms puts business coaches on track to meet the growing demand for accountability.

This article first appeared in Business Coaching Worldwide (Summer Issue 2005, Volume 1, Issue 2).

Jack J. Phillips, Ph.D


Measuring ROI? In business coaching? Yes and yes.
Isn’t this just a fad? Isn’t this impossible? No and no.

As more and more organizations use business coaching as a human resources, performance improvement, and leadership development approach, many executives question its value, particularly as coaching expenditures grow. Whether the engagement is handled by an internal coaching department or arranged through a business coaching firm, coaching assignments and commitments are planned and executed with good intentions. Unfortunately, however, not all coaching engagements produce the value desired by either the individual being coached (the participant) or the sponsor who often pays for it. It will be increasingly important for business coaches to measure the return on investment (ROI) and show the value of business coaching in terms that managers and executives understand.

It’s Not a Fad . . .
Measuring ROI enjoys a history of nearly thirty years of application in a variety of human resource and performance improvement processes and across the full spectrum of industries and organizations. Thousands of trained practitioners implement an ROI process in their own settings and thousands of impact studies are generated annually worldwide. The methodology is the subject of many books in many languages.

It’s Not Impossible . . . 
Successfully measuring ROI for business coaching involves much more than simply assessing results achieved. The most effective ROI processes involve four phases: planning, data collection, data analysis, and reporting.

In the planning phase the coach, the person being coached, his or her manager, and the sponsor (client organization) agree on the evaluation plans and establish a baseline for expectations.

The data collection phase occurs in two time frames: data is collected first during the coaching experience and then at the conclusion of the engagement or at an appropriate follow-up time. The data collected include satisfaction and reaction, learning, application and implementation, business impact, and ROI. See Figure 1.

Evaluation Levels

Level 1. Reaction & Planned Action: Measures participant satisfaction with the coaching experience and captures planned actions
Level 2. Learning: Measures changes in knowledge, skills, and attitudes
Level 3. Application and Implementation: Measures changes in on-the-job behavior and progress with application
Level 4. Business Impact: Captures changes in business impact measures
Level 5. Return on Investment: Compares the monetary benefits of the coaching engagement to the program costs

Figure 1 – The Levels of Data

The third phase in the ROI Methodology, data analysis, isolates the effects of the coaching on the business. This phase includes converting data to monetary values using conservative figures (higher figures for costs, lower figures for benefits), capturing the fully loaded costs, calculating the return on investment, and identifying intangible measures and benefits.

Phase four, reporting, requires reaching conclusions, generating reports, and communicating the information to target groups. This new knowledge gives everyone involved, from the coach and the person being coached to senior executives in the client organization, the ability to assess the value of the coaching engagement and the opportunity to make adjustments going forward.

Final Thoughts . . .
Developing the ROI in business coaching is not a fad, and it’s not impossible. Measuring ROI in business coaching is, and will increasingly become, an imperative for organizations and coaching firms pursuing the highest standards of accountability.

This article first appeared in Business Coaching Worldwide (Premier Issue 2005, Volume 1, Issue 1).

Jack J. Phillips, Ph.D
