Leadership and Management Education and Training (LMET): Its Relationship to Shipboard Effectiveness and Readiness
Monterey, California
THESIS
by
Teresa C. Cissell
and
David P. Polley
December 1987
Thesis Advisor: Carson K. Eoyang
REPORT DOCUMENTATION PAGE
Report Security Classification: UNCLASSIFIED
Subject Terms: Leadership; U.S.; Shipboard Effectiveness; Readiness
Teresa C. Cissell
Lieutenant, United States Navy
B.A., University of Alabama in Huntsville, 1978
and
David P. Polley
Lieutenant Commander, United States Navy
B.S., Miami University, 1975
from the
NAVAL POSTGRADUATE SCHOOL
December 1987
Authors: Teresa C. Cissell
         David P. Polley
TABLE OF CONTENTS
I. INTRODUCTION.......................................... 9
A. ORGANIZATION OF THE THESIS ....................... 9
B. DEFINITION OF LMET ................................ 9
C. BASIS FOR RESEARCH ............................... 10
II. BACKGROUND .......................................... 12
A. MILITARY LEADERSHIP.............................. 12
B. HISTORY OF NAVY LEADERSHIP TRAINING ............ 13
III. LITERATURE REVIEW.................................... 18
A. PREVIOUS EVALUATIONS OF LMET.................... 18
1. NPRDC/ONR/NPGS (1977) .......................... 18
2. SDC (1979)........................................ 19
3. Davies (1980) ..................................... 20
4. Parker (1981)...................................... 21
5. Vandover and Villarosa (1981)......................... 21
6. Abe and Babylon (1982) ............................. 22
7. Foley (1983) ...................................... 22
8. Glenn (1987) ...................................... 23
9. LMET Sites ....................................... 24
10. Command Effectiveness Study (1985) .................... 24
11. Command Excellence Seminar Feedback Analysis (1987) ...... 26
12. Summary ........................................ 26
B. STUDIES ON MEASURES OF EFFECTIVENESS ............ 27
1. Horowitz (1986).................................... 28
2. Davilli and Schenzel (1986)............................ 28
3. Chatfield and Morrison (1987)......................... 29
IV. METHODOLOGY
A. SELECTION OF VARIABLES
B. SCOPE ............................................ 32
C. DATA SOURCES..................................... 33
D. ANALYSIS TECHNIQUE .............................. 33
V. RESULTS ........................................... 34
A. ATLANTIC FLEET RESULTS........................... 37
B. PACIFIC FLEET RESULTS............................. 38
C. SUMMARY......................................... 38
VI. CONCLUSIONS AND RECOMMENDATIONS.................. 40
A. CONCLUSIONS ...................................... 40
B. RECOMMENDATIONS ............................... 42
1. Improvements to LMET............................. 42
2. Areas for Future Research ............................ 43
C. SUMMARY ......................................... 44
LIST OF REFERENCES............................................ 50
LIST OF TABLES
ACKNOWLEDGEMENTS
The authors wish to acknowledge the assistance and cooperation provided by CAPT
Ernie Haag, USN (Ret.); CAPT William Fahey, USN (Ret.); Mr. Mike Glenn; LT Barbara
Korosec of the U.S. Naval Academy staff; Dr. Robert F. Morrison, NPRDC; Mr.
Mike Dove and Ms. Michelle Saunders, DMDC; LT Duke Kammerer, LMET
instructor at the Naval Supply Corps School; CWO1 Millican, LMET instructor at
Naval Aviation Schools Command, Pensacola; CDR Rich Pearce and LTJG Sanford,
SURFPAC staff; LCDR Singleton, SURFLANT staff; EMCM (SS) Winston Posey,
NMPC-62; LMET sites who responded to queries; QM1 Kinsey of NMPC-482; ENS
Saldowski of OP-64; and Dr. Carson Eoyang of PERSEREC and the Naval Postgraduate
School.
I. INTRODUCTION
B. DEFINITION OF LMET
LMET is a formal, Navy-specific training program designed to prepare
supervisors and managers for leadership and management positions at key (threshold)
points in their careers. LMET is based on research done by McBer and Company on
effective Navy leader behavior (as is discussed in Chapter II) and focuses on specific
skills and individual initiative. LMET is now taught at 21 sites to about 30,000 Navy
personnel each year. In fiscal year 1980 LMET cost the Navy about $17 million
[Ref. 1: p. 207]; in 1986 the approximate cost of LMET was $21 million.1 There are 19
varieties of the course, each geared to the appropriate level in the chain of command
(i.e., Leading Petty Officer, Senior Officer) and tailored to the warfare or staff
community (surface, aviation, medical, supply) [Ref. 2: p. vi]. LMET uses lecture, case
studies, role playing, simulations, small group discussions, instrumented feedback (self-
assessment questionnaires), and individual written and reading assignments to convey
leadership competencies to participants.2 [Ref. 2: p. 39]
II. BACKGROUND
A. MILITARY LEADERSHIP
Good leadership is essential to the effectiveness of any organization. One finds,
however, little agreement among scholars, researchers, or practitioners as to what
leadership is, much less how to define good leadership. Definitions and theories range
from those focusing on an individual's personality and genetic traits to those describing
leadership more as a process involving interaction between organizational purpose and
individual behavior.
Competent military leadership is essential to the effectiveness of each military
unit as well as to the success of the U.S. Defense Department in accomplishing its
goals. Military organizations have unique missions which often require humans to
perform tasks which might otherwise be considered inappropriate, immoral, or even
unlawful in any other setting. The military also may require submission to stricter
rules, adverse environmental conditions, and any number of tasks contrary to personal
preference. The men and women in the armed services are expected to perform ever
more diverse and demanding tasks with existing or often fewer resources. The future
role of the Navy, as well as that of the other services, will place increasing pressure on
military leaders to do more and better with less. The shape of the future, because it
points to increased technology, automation, and reduced manning levels, only sharpens
the need for Navy officers and senior NCOs to acquire requisite leadership and
management skills.
In seeking those skills encompassed by leadership and management, one must
first understand the concepts: how do key people in the Navy define leadership and
management? At a June 1987 conference on leadership held at the United States Naval
Academy,5 VADM William Rowden, Commander Naval Sea Systems Command,
distinguished leadership from management by defining leadership as "the ability to
motivate people" and management as "a process of getting things done" [Ref. 6: p. 3].
Also at the conference, Professor Ben Schneider, University of Maryland, made a
5 On June 10-12, 1987, the Naval Academy and Navy Personnel Research and
Development Center (NPRDC) co-sponsored a conference on leadership. There were
90 participants and 35 speakers. Twelve active and retired flag officers attended,
including ADM Trost, Chief of Naval Operations. Several academic researchers in the
leadership field spoke, as did the Master Chief Petty Officer of the Navy.
similar distinction: leadership is "energizing and directing others" and management is
"a process of getting things done" [Ref. 6: p. 3]. Admiral Rowden also said that
management is more easily learned than leadership. Perhaps this is because there is
greater agreement over what management is, while leadership still exists in a haze of
theory and disagreement.
headed by CAPT Carl Auel (Chaplain Corps) assisted by Fred Fiedler (a scholar in the
leadership field). Over a three month period, the panel examined earlier and existing
leadership training and proposed a method for designing an "ideal" training model.
Their report referred to development of a system, not a course, implying that much
more than a single course would be necessary to correct the leadership training
program. "Without an LMET system, the first phase of which is a clear and
comprehensive definition of requirements by line managers, any further expansion,
consolidation, or reprogramming of current training efforts would meet fleet needs at
the level of chance." [Ref. 9: p. iv]
It was in 1975 that McBer and Co. became involved in the Navy Human
Resources Management Program. McBer, a Boston-based consulting firm established
by Dr. David C. McClelland and David Berlew in 1970 [Ref. 10: pp. 35 & 39], was
contracted to improve the effectiveness of Human Resource Management (HRM)
Specialists. McClelland, a Harvard psychologist, had focused much of his work on
improving the screening process for hiring employees. He found that in many
organizations, the tests used to screen applicants tested for academic
potential rather than for skills that would be reflected in job proficiency. McClelland
believes that people should be hired and trained based upon competencies:
"Competencies are not aspects of the job but characteristics of the people who do their
job best." [Ref. 10: p. 40]
After identifying what behaviors superior HRM specialists demonstrate better or
more often than average specialists do, McBer devised a training model based on
"competencies" [Ref. 11]. McClelland's theories on competencies and how they relate
to achievement are explained in further detail in The Achieving Society [Ref. 12] and
"Testing for Competence Rather Than for 'Intelligence'" [Ref. 13].
McBer's approach--to sample (using Behavioral Event Interviews) high
performers and average performers and to train people to do those things that separate
high performers from their peers--had both scientific and practical appeal to the Navy.
In January 1976, after abandoning internal efforts to develop a new leadership training
program, and under high-level pressure to produce tangible results, several civilian
contractors were asked for proposals. The unconventional approach of McBer was
selected. Using the same technique employed in the Navy HRM Project, McBer
analyzed the results of interviews with Navy supervisory personnel previously
categorized by their commanding officers as either superior or average leaders.
[Ref. 1: pp. 204 & 205] In 1976, McBer began sampling Navy Leading Petty Officers,
Chief Petty Officers, and Commissioned Officers, first on the West Coast and then on
the East Coast. Their first model included twenty-seven competencies. In 1978 and
1979, pilot courses were taught by Navy instructors and evaluated by System
Development Corporation [Ref. 16]. Evidently, these early courses were based on all
twenty-seven competencies. To validate their findings, McBer later sampled 1,000
Navy officers and enlisted personnel using nine tests to measure competency elements.
Behavioral Event Interviews were also conducted on a subset of 100 testees. Sixteen of
the original twenty-seven competencies were found to be significantly related to
superior leadership in the validation phase. These sixteen competencies, listed in Table
1, are now the backbone of most of the current LMET courses.
The premise behind LMET is that the sixteen competencies can be learned, and
increased use of the competencies will lead to better leadership and management and
hence improved effectiveness. LMET competencies are acquired through a five-step
process:
1. Recognition (identifying knowledge, skills, values, etc., present in
cases/incidents)
2. Understanding (integration and connection with one's own experience)
3. Self Assessment in Relation to the Competency (discovery of one's own level in
each competency and identification of areas for specific improvement)
4. Skill Acquisition and Practice (practical exercises, classroom applications)
TABLE 1
LMET COMPETENCIES
a variety of situations, i.e., when to employ which competencies. [Ref. 2: pp. 50-51]
More recently, the Command Effectiveness Study9 results have had an impact on
LMET content, particularly the courses for Prospective Commanding and Executive
Officers (now replaced by the Command Effectiveness Seminar) and the LCPO course.
Essentially LMET has been shaped not only by the initial research done by McBer, but
also by feedback from participants, evaluations made by observers, and subsequent
research.
LMET is now the approved method for Naval leadership and management
training. The course components have not, however, been systematically included in
Naval Academy curricula, Officer Candidate School classes, or Navy Reserve Officer
Training Corps (NROTC) requirements.
9 The results of the Command Effectiveness Study are discussed in Chapter III.
III. LITERATURE REVIEW
* factor analysis using too few cases (Morrison)
* significance testing at the .10 level (King)10
After such sharp criticisms, it is surprising that LMET ever got off the ground.
It would appear that McBer's competencies, however arrived at, intuitively appealed to
reviewers and are similar to other characteristics of successful leaders such as those
found in the Handbook of Leadership by Stogdill.11
2. SDC (1979)
Between May 1978 and May 1979, System Development Corporation (SDC)
evaluated LMET pilot courses for Leading Petty Officers (LPOs), Leading Chief Petty
Officers (LCPOs), Prospective Commanding and Executive Officers (PCO/PXO) and
LMET instructors. Objectives of the assessment were: (1) to perform on-site
evaluations of the delivery of the courses; (2) to review instructor guides and student
journals; and (3) to provide specific recommendations for management decisions
concerning the assignment of Navy instructors to deliver the courses.
The SDC assessment was not intended to measure impact of the training on
subsequent performance, but did attempt to discover whether students were receptive
to the course material and absorbing any of it. A sample of SDC's findings:
* Navy instructors were in need of training in facilitation techniques.
* Course materials needed to be "de-civilianized", that is, made more suited to
military needs and situations.
* Participants enjoyed the courses.
* Time boundaries limited the ability to use very many practical exercises.
* SDC recommended courses be standardized and offered to all targeted levels of
Navy personnel. [Ref. 16]
It is interesting to note that SDC also participated with McBer in the data
gathering phase of the LMET project [Ref. 3]. Their expertise in LMET design is
useful; however, one might question their complete objectivity as they may have had
some stake in the success of LMET.
10 Excerpted from memoranda written by Dr. Morrison, Dr. King, Dr. Elster,
and Dr. Eoyang in response to requests to review McBer's research. These memos
were supplied by Dr. Carson K. Eoyang, PERSEREC, from his personal files.
11 These characteristics include but are not limited to: intelligence, energy,
judgement/decisiveness, integrity, achievement drive, dominance, drive for
responsibility, initiative, sociability, assertiveness, emotional balance and control, and
ability to enlist cooperation. [Ref. 18: pp. 74-75]
3. Davies (1980)
In his thesis, Davies discusses the need for the Army to evaluate its leadership
training. He presents an extensive review of the leadership theories contributing to the
Army's organizational leadership model, its training programs, and the leadership
training of the other services.
In his discussion of the development of the Navy's LMET program, Davies
traces the evolution from the early 1970's, when the Navy had 157 different leadership
courses, through the research by the McBer Company which identified the sixteen
competencies which form the basis of the current LMET courses. He also notes that
there has been no formal evaluation of LMET, although at the time of his work (1980)
the Chief of Naval Education and Training (CNET) was "progressing toward an
internal evaluation plan to determine whether the course is actually teaching what it
was designed to teach." [Ref. 19]
As discussed by Davies and elsewhere in this thesis, the evaluation of LMET
is complicated by the lack of a control group within the Navy. This is because the
Navy has adopted the policy of not denying LMET training to any personnel for the
purpose of establishing a control group.
In presenting his recommendations, Davies proposed two separate plans: the
organizational leadership training evaluation plan and the Army's Leadership and
Management Development Course evaluation plan. Within each of these areas he
offered specific objectives and steps to measure the achievement of the objectives. One
of his lower-echelon divisions is organizational performance, for which the stated
objective is: to determine if the leadership training program is reflected in changes in
the operational performance of the units to which the newly trained leaders are
assigned. Although this objective roughly parallels the purpose of this thesis, there are
numerous and significant differences between the Army's program and the Navy's
LMET program, thus precluding further development along the path which Davies has
laid out. For example, the Army takes a decentralized approach for its Leadership and
Management Development Course, allowing these experience-based workshops to flow
according to the needs and backgrounds of the individuals attending, whereas the
Navy has adopted a strict, centrally controlled format for its LMET courses.
In summary, Davies achieves his stated purpose of raising the question of
evaluation of the Army's leadership program and of offering a plan whereby the issue
of training effectiveness can be studied. That such an ambitious evaluation scheme as
he proposes will ever be undertaken by the Army or any other agency is questionable
because such an evaluation would require a high level of commitment including several
million dollars for the research effort alone.
4. Parker (1981)
Donald F. Parker is a retired Navy Captain who (in 1981, when this reference
was published) was Assistant Professor of Organizational Behavior and Industrial
Relations in the Graduate School of Business Administration at the University of
Michigan. Immediately preceding his retirement from the Navy he was CO of the
Navy Personnel Research and Development Center. In a chapter he wrote for Military
Leadership [Ref. 1], Parker reviews the events leading up to LMET development, the
research upon which LMET is based, and LMET course design and delivery.
The following are some of the findings and conclusions made by Parker:
* LMET could have been developed with internal expertise.
* LMET was not designed with a clear, comprehensive definition of requirements,
as was recommended by a panel headed by Chaplain Auel [Ref. 9].
* LMET was not developed under the Interservice Procedures for Instructional
Systems Development (ISD), which Parker believes led to inadequate learning
objectives; inconsistencies between tests, instruction material, and stated
objectives; and difficulty in measuring the success of the training.
* Analysis for LMET design was "deficient with respect to concept definition,
research design, data collection, data analysis and interpretation" [Ref. 1: p. 198].
* LMET courses don't include the concept of contingency, i.e., how to select
appropriate behaviors in differing situations.
From his observations of classroom training, Parker found that the flow from
lecture to discussion to small group activities helped to maintain student interest and
provided frequent opportunities for students to express their opinions and trade ideas
with peers. He found that in practice LMET instruction differed somewhat from class
to class and location to location as instructors sought to motivate each group. He also
found that the course was well accepted by students. [Ref. 1: pp. 207-208]
5. Vandover and Villarosa (1981)
In 1981 Vandover and Villarosa, two Naval Postgraduate School students,
interviewed a cross section of 51 LMET graduates and their immediate supervisors and
subordinates from 13 operational commands. In their pilot study for evaluation they
sought to discover any improvements over non-graduates in the knowledge or behavior
of LMET graduates. They found no systematic link between LMET and leadership-
related behavior changes. However, some of the trends they discovered include:
* Leadership example as set by superior
* Communications flow
* Attitude towards inspections (short-sightedness)
* Lack of emphasis on subordinate development
* No support by senior members of the command
* Lack of a reward system for competency use
Those who had demonstrated behavioral changes that they attributed to LMET
exhibited these characteristics:
* A strong desire to change their behavior
* Felt they had room for improvement in leadership and management
* Were more likely to be junior with some leadership experience
* Returned directly to management positions after graduating
* Had some initial success in practicing the competencies
* An immediate superior or peer had served as a good role model
* They were more likely to be assigned to a command noted for its organizational
effectiveness and that stresses subordinate development
She recommended that LMET be continued and reinforced at the unit level
and that, through the HRM program, commands improve communication, stress
subordinate development, and improve problem-solving techniques. [Ref. 22]
8. Glenn (1987)
Mike Glenn, a former Navy Organizational Effectiveness Consultant presently
working at the Naval Training Systems Center in Norfolk, VA, is finishing a doctoral
dissertation on "Senior Management Perceptions of Actions to Support Post-Training
Utilization of (LMET)". Of specific concern to Mr. Glenn are the following:
* Which management and supervisory actions in support of Job Linkage and
Follow-up12 do senior Naval managers perceive to be important for their
subordinate managers/supervisors?
* To what extent do senior Naval managers perceive that the important
management and supervisory actions in support of Job Linkage and Follow-up
are practiced in their organizations by their subordinate managers/supervisors?
[Ref. 23]
12 Job Linkage and Follow-up refer to specific activities in this dissertation. Job
Linkage, as Glenn defines it, is re-entry of the trainee into the workplace. He defines
Follow-up as on-going support of learned behaviors on the job.
He surveyed 106 Navy operational commands, asking senior managers to rate
the importance of various actions which may be taken by management to support the
full use by personnel, on the job, of behaviors learned in LMET. He also asked them
to indicate whether they had ever observed each action in any organization and in their
present command. So far he has found that senior Navy managers believe assignment
of a role model (supervisor or co-worker), trainee goal setting for job performance, and
trainee environment are important to Job Linkage.
9. LMET Sites
At least two LMET sites are gathering data from Commanding Officers (COs)
of Temporary Additional Duty (TAD) attendees about six months after graduation.13
Their purpose is to gauge whether the COs are pleased with the "results" of LMET.
Their questions include:
* Has the individual's leadership and management performance improved after
completing LMET?
* What improvements in performance, if any, have been seen since attending
LMET?
* What improvements have you seen in the work group?
One site found that about 64 percent of COs responding noticed an
improvement in the performance of graduates. The sample size is very small thus far
(n = 18), but the effort shows considerable promise in specifying what LMET does for
graduates and their commands.
10. Command Effectiveness Study (1985)
Although not specifically related to LMET, this study, done by McBer and
Co., turned from a focus on individual performance to characteristics distinguishing
superior from average commands. After identifying criteria/indicators of superior
command performance, a sample of outstanding and average operational commands
was observed, interviewed, and surveyed.
Four survey instruments were used in the study:
* Navy Competency Assessment Profile (NCAP) - asked respondents to rate
themselves on the sixteen fleet competencies and whether they had attended
LMET.
Although the overall competency rating scores did not differ significantly as a
function of LMET training, LMET-trained individuals were better able to
differentiate their abilities in using the competencies. Persons without LMET
training tended to rate themselves similarly on all 16 competencies. [Ref. 2: p.
50]
Despite the drawback of using a self-assessment instrument such as the NCAP
to measure the effect of LMET, these statements are supported by some of the studies
discussed previously.
The primary product of the Command Effectiveness Study was a model of an
organizational system whose "parts" are interrelated. These components have certain
characteristics which distinguish the superior organization from an average one. The
model consists of three areas which are further broken down into thirteen levels:
* People
  - CO
  - XO
  - Wardroom
  - Chief's Quarters
  - Crew
* Relationships
  - CO - XO
  - Chain of Command
  - External
* Activities
  - Planning
  - Maintaining Standards
  - Communicating
  - Building Esprit de Corps
  - Training and Development
The results and lessons learned through the Command Effectiveness Study
(CES) are the basis for the Command Effectiveness Seminar (also known as the
Command Excellence Seminar). CES results have also been incorporated into most of
the LMET courses. [Refs. 24, 25]
11. Command Excellence Seminar Feedback Analysis (1987)
This Caliber Associates report summarizes an analysis of feedback (course
critique) sheets filled out by 215 Command Excellence Seminar attendees. The
objective of the report was to identify results related to improved mission readiness
experienced by seminar attendees by conducting a content analysis of two items from
the feedback sheets. These items asked the respondent to give examples where the
seminar helped them or their commands do the job better.
The responses were fairly homogeneous in that virtually everyone reported
some type of improved performance in, or greater awareness of, some dimension of
leadership. Overall, the report indicated that the Command Excellence Seminar is
beneficial. Participants responded enthusiastically and attribute considerable personal
success to the course. The data did not support any specific conclusions regarding
"outcomes" in the form of organizational impact from the course. This was due in part
to the lack of specificity in the feedback questions and the exploratory nature of the
analysis. [Ref. 26]
12. Summary
What can be learned about the effectiveness of LMET from the studies
presented thus far? Training evaluation is often categorized into four types of
measures:
1) Participant Reactions
2) Evidence of Learning
3) Evidence of Behavioral Change
4) Results in Operations (impact on organizational effectiveness). [Ref. 27]
a. Participant Reactions
In reading the studies discussed earlier and in corresponding with former
and current LMET instructors we found that students generally react positively to
LMET. Many have stated that they wish they could have attended earlier in their
careers.
b. Evidence of Learning
Pre- and post-testing of students by SDC during their assessments of pilot
courses indicated that students were gaining expected knowledge levels. But Vandover
and Villarosa's [Ref. 20: p. 87] observation regarding deterioration of knowledge after
as little as six months leads one to conclude that much of the material learned is short-
lived.
c. Evidence of Behavioral Change
This was the primary focus of three of the Naval Postgraduate School
theses [Refs. 20, 21, 22]. None of them found a systematic link between LMET
attendance and improvement in leadership behavior. They were able to discover some
individual and organizational factors that intervene in behavioral changes. Glenn's
approach in his dissertation is to discover these factors through the perceptions of
senior managers [Ref. 23].
d. Results in Operations
As stated earlier, this ground is as yet uncovered in LMET evaluation. There are a
number of reasons why such research has not been done yet, including:
* Lack of a "control group" (a command or unit completely unaffected by
LMET)
* No baseline research on effectiveness levels prior to LMET implementation
* Many uncontrolled intervening variables
* Instability of measures of effectiveness (MOEs)
B. STUDIES ON MEASURES OF EFFECTIVENESS
1. Horowitz (1986)
Dr. Horowitz found at least fifteen studies identifying quantitative links
between Manpower, Personnel, and Training (MPT) factors and unit performance.
Several of these studies address the payoff from training. He suggests such measures as:
* Operational Propulsion Plant Examinations (OPPEs)
* Operational Readiness Examinations (OREs)
* Selected Exercises (including live firing exercises)
* Excellence Awards (such as the Battle Efficiency "E")
* Bombing scores (for aviation units)
* Air Combat Maneuvering Ranges
* Simulator performance
* Casualty Reports (CASREPs)
* Unit Status and Identity Reports (UNITREPs)
* Board of Inspection and Survey (INSURV)
* Maintenance and Material Management (3M)
Dr. Horowitz found shortcomings among most of the measures, yet he did not see them
as insurmountable barriers to research in the area of relating MPT factors and unit
MOEs. For example:
* Training exercises such as OREs and OPPEs are prepared for, and would
therefore reflect an "upper bound" on performance.
* For selected exercises, failing grades may not be numerically recorded.
* Only one ship per squadron can be awarded the Battle "E".
* CASREPs and UNITREPs are self-reported, not objective, and criteria vary
widely among commands.
* 3M data suffers from reporting errors and differences from ship class to ship
class and over time. [Ref. 28]
2. Davilli and Schenzel (1986)
Davilli and Schenzel, two Naval Postgraduate School students, used Refresher
Training (REFTRA) ORE and Battle Problem scores as dependent variables in creating
multiple regression models of the relationship between readiness and a number of
manpower, training, and other evaluative measures. They used a small sample of ships
(n = 44), however, and had to obtain much of the data by physically searching through
REFTRA files in Guantanamo Bay, Cuba.
Assuming ORE is a universally accepted measure of readiness, Davilli and
Schenzel's results indicate that a multivariate approach to predicting readiness is feasible.
The variables with the greatest predictive power (Beta coefficient and significance level)
in their model were billet vacancies at 90 days and 180 days prior to ORE. Variables
such as average drill periodicity, average school qualification, and average watch
qualification 30 days prior to ORE were poor predictors (low significance and Beta
coefficients, and, for one variable, the opposite sign from what was expected). [Ref. 29]
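The multivariate approach Davilli and Schenzel used can be sketched in a few lines. The sketch below fits an ordinary-least-squares multiple regression of ORE score on billet vacancies and drill periodicity; every data value and variable name here is invented for illustration and does not come from their study.

```python
import numpy as np

# Hypothetical readiness data: each row is one ship.
# Predictors: billet vacancies 90 and 180 days before ORE, and
# average drill periodicity 30 days prior (all values illustrative).
X = np.array([
    [4.0, 6.0, 2.1],
    [1.0, 2.0, 1.8],
    [7.0, 9.0, 2.5],
    [2.0, 3.0, 1.9],
    [5.0, 8.0, 2.2],
    [0.0, 1.0, 1.7],
])
y = np.array([78.0, 92.0, 65.0, 88.0, 72.0, 95.0])  # hypothetical ORE scores

# Add an intercept column and fit by ordinary least squares --
# the core computation behind a multiple regression model.
A = np.column_stack([np.ones(len(X)), X])
coef, _, rank, _ = np.linalg.lstsq(A, y, rcond=None)

intercept, b_vac90, b_vac180, b_drill = coef
predicted = A @ coef  # fitted ORE scores for each ship
```

In practice one would also examine the significance level of each coefficient, which is how Davilli and Schenzel distinguished strong predictors (billet vacancies) from weak ones (drill periodicity, school and watch qualification).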
3. Chatfield and Morrison (1987)
Researchers at NPRDC assessed the consistency and stability of 20 surface
ship measures of effectiveness from fiscal year 1982 to 1984 [Ref. 30]. Their purpose
was to create a pre-change baseline that could be used later in evaluating the effect of
the new Surface Warfare Officer (SWO) career path on readiness and performance.
They proposed using a multiple-measure approach, assuming that no single measure
was an appropriate evaluation standard. The unit measures they examined included:
* PEB (Propulsion Examining Board)
* PMS (Preventive Maintenance System)
* NWTI (Nuclear Weapons Technical Inspection)
* REFTRA (Refresher Training) Quick Look
* Post-TRE (Training Readiness Evaluation)
* CASREPs (Casualty Reports)
* Personnel Retention
* TRA (Training Readiness Assessment)
* CSRT (Combat System Readiness Test)
* Safety Inspection
* Command Inspection (done by immediate superior command)
* INSURV
* ARE (Aviation Readiness Evaluation)
* Battle "E" competition
* PQS (Personnel Qualification System)
* UNITREP
Chatfield and Morrison found that data on ship performance had poor year-to-year
stability and were inconsistent among different MOEs for the same ship. They
concluded that the measures were too unstable to use as a baseline for evaluating
policy revisions. They did not recommend collection or analysis of other measures, as it
would likely lead to similar results. Instead, Chatfield and Morrison recommended
review and revision of MOEs to improve their reliability and validity.
IV. METHODOLOGY
A. SELECTION OF VARIABLES
After reviewing the studies cited in Chapter III and holding phone conversations
with type commander staff members about what the admirals consider in determining which
ships get the Battle "E", the following measures of shipboard effectiveness were
selected:
* Percent of time (within the eighteen-month cycle) spent in each of four C-ratings, as
  reported on UNITREPs
* REFTRA scores
* 3M/PMS inspection scores
* OPPE scores
* Supply Management Inspection (SMI) results
* Personnel retention rates
Because a secondary goal of this study was to create a framework for future
evaluation of LMET, variables were selected not only on the basis of face validity but
also on data availability and potential for quantification. The objective was to obtain
data on as many ships as possible through fairly routine reports and records.
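The first variable, percent of time in each C-rating, can be derived from the sequence of UNITREP rating changes and their effective dates. A minimal sketch, with invented dates (the actual tape format from OP-64 is not described in this thesis):

```python
from datetime import date

# Turn a ship's sequence of UNITREP C-rating changes into the percent of
# the eighteen-month competitive cycle spent in each rating.
cycle_start, cycle_end = date(1985, 1, 1), date(1986, 6, 30)
history = [                       # (rating, effective date) pairs
    ("C-2", date(1985, 1, 1)),
    ("C-1", date(1985, 7, 1)),
    ("C-3", date(1986, 1, 1)),
]

days_in = {}
for i, (rating, start) in enumerate(history):
    # Each rating runs until the next change, or until the cycle ends.
    end = history[i + 1][1] if i + 1 < len(history) else cycle_end
    days_in[rating] = days_in.get(rating, 0) + (end - start).days

total_days = (cycle_end - cycle_start).days
pct_time = {r: 100.0 * d / total_days for r, d in days_in.items()}
print(pct_time)
```

Because every day of the cycle falls in exactly one rating, the percentages sum to 100, which is a useful consistency check on the tape data.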
B. SCOPE
The sample consisted of 285 surface ships: all surface ships under
Commander Naval Surface Forces, Atlantic and Pacific, less those that were in
overhaul or other programmed repair for more than half of the eighteen-month cycle
(January 1985 - June 1986). The eighteen-month period chosen was the latest complete
competitive cycle.
Since MOEs differ so much among surface, aviation, submarine, and shore
activities, it was decided that the analysis should be limited to only the surface
community in this preliminary study.
18 Since the first LMET courses were taught in 1978, "years since attendance"
was set at eight years for those who had never attended. The maximum number
of years since attending could not be greater than seven (1985 - 1978) for those who
had attended. Eight years was chosen so as not to give non-attendance too much
weight in this variable.
C. DATA SOURCES
Chief of Naval Operations (OP-64) supplied a tape of C-ratings and
corresponding dates for all surface ships. 19 The remaining data on MOEs were
collected from type commanders (Commander Naval Surface Forces, Atlantic and
Pacific). 20 A list of Unit Identification Codes (UICs) was supplied to Defense
Manpower Data Center (DMDC) which in turn provided a tape containing social
security numbers of all E5s and above onboard these ships who show up on at least
three quarterly manpower reports. This tape was forwarded to Naval Education and
Training Command (CNET) in Pensacola, FL where the social security numbers were
matched with Navy Integrated Training Resources Administrative System (NITRAS)
files of LMET graduates providing data on who had attended LMET and when they
attended. Courses numbers and titles supplied to CNET are listed in Appendix B.
DMDC cleansed the data of cases in which an individual had attended a school which
did not include LMET during the time he/she attended and grouped the data by unit
yielding the LMET variables discussed earlier. Further information regarding the data
received from NITRAS is included in Appendices C and D.
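The matching and aggregation step above can be sketched in miniature. Every SSN, UIC, and year below is an invented placeholder; the real processing was performed by DMDC on the tape data.

```python
# Join shipboard personnel (SSN -> ship UIC, from the DMDC tape) against
# LMET graduation years (from NITRAS), then aggregate by unit.
crew = {
    "111": "R20001", "222": "R20001", "333": "R20001",
    "444": "R20002", "555": "R20002",
}
grads = {"111": 1982, "333": 1984, "555": 1980}

CYCLE_START_YEAR = 1985   # start of the competitive cycle
NEVER_ATTENDED = 8        # the thesis assigns eight "years since attendance"

by_unit = {}
for ssn, uic in crew.items():
    stats = by_unit.setdefault(uic, {"crew": 0, "lmet": 0, "years": []})
    stats["crew"] += 1
    if ssn in grads:
        stats["lmet"] += 1
        stats["years"].append(CYCLE_START_YEAR - grads[ssn])
    else:
        stats["years"].append(NEVER_ATTENDED)

for uic, s in sorted(by_unit.items()):
    pct_lmet = 100.0 * s["lmet"] / s["crew"]
    avg_years = sum(s["years"]) / len(s["years"])
    print(uic, pct_lmet, avg_years)
```

The per-unit percent-attended and average-years-since-attendance values produced this way are the LMET variables correlated against the MOEs in Chapter V.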
D. ANALYSIS TECHNIQUE
Because so many variables affecting each ship's readiness and performance are
unknown or unavailable, statistical techniques and conclusions are limited to describing
hypothetical association between LMET and effectiveness. To estimate the degree of
association between LMET variables and MOE variables, the Spearman Coefficient of
Rank Correlation (P.) was computed. Like other measures of correlation, ps varies
from -I (a perfect negative relationship between two variabi:s) to + I (a perfect
positive relationship between two variables). This non-parametric test was chosen
because of the ordinal nature of much of our data and the unsuitability of the data for
more classical procedures (randomness, normal distribution, independent observations,
etc.). [Ref. 321
This method of analysis does not lend itself to profound conclusions regarding
LMET's effect on unit performance or combat readiness. However, this preliminary
study is considered the first step in designing and testing a model for evaluating
LMET against unit effectiveness criteria.
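Spearman's ρs is simply Pearson's correlation computed on ranks rather than raw values. A self-contained sketch, using invented ship scores for illustration:

```python
# Spearman's rank correlation: rank both variables (midranks for ties),
# then compute Pearson's r on the ranks.
def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied values, then assign the midrank.
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0          # ranks are 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

pct_lmet = [40, 55, 62, 35, 70, 48]   # invented: percent of crew with LMET
reftra   = [78, 81, 90, 70, 93, 76]   # invented: REFTRA scores
print(spearman(pct_lmet, reftra))     # near +1: strongly concordant ranks
```

Because only rank order matters, the statistic is unaffected by grade inflation or nonlinear scoring scales, which is why it suits the ordinal, inconsistently scaled fleet measures used here.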
19 The information obtained on C-ratings was classified "Secret".
20 Much of the data obtained from the type commanders were classified
"confidential".
V. RESULTS
Results of Spearman's Rho Rank Correlation tests are shown in Tables 2 and 3.
They are separated by fleet because of the often extreme differences in scoring,
reporting, and standards between the fleets. In fact, correlation between many of the
variables and fleet was significantly high (ρs ranged from .0009 to .4309). Eight out of
eighteen were significant at the α = .05 level.
In interpreting the results of this correlation, as listed in Tables 2 and 3, one
must use caution. First, what do the numbers themselves mean?
The top number is the coefficient of correlation (ρs), which ranges from negative
one (-1) to positive one (+1). This ρs measures the strength of the relationship
between the two variables adjacent to it in the matrix. A ρs of negative one would
indicate perfect negative correlation between the two variables, i.e., as one variable
increases the other always decreases, while a ρs of positive one would indicate that
ships with high values of one variable also possess high values of the other variable.
Such extreme values indicate the strongest possible relationships between
variables. A ρs closer to zero shows a weaker relationship between
variables, with zero indicating no relationship between the variables. [Ref. 32: p. 562]
"1 second number in each of the "cells* of Tables 2 and 3 indicates the n~umber
of cast mailable to compute that particular p5 value. This figure ranges from 24, in
the cas, f the number of ships in the Pacific Fleet for which OPPE scores were
available, to 153, the total number of ships in the study from the Atlantic Fleet. As is
obvious in c serving these figures, not all ships had scores or other measures available
for this qtxudy. This variation in number of' ships sampled for any given
inspection/,examination reflects differing operating schedules, availability of inspection
teams, emergent fleet requirementi (which occasionally require cancellation or
rescheduling of inspections), and other factors beyond the scope of this study.
The third and final statistic captured in each "cell" of the matrices in Tables 2
and 3 is the significance level. This provides the reader an estimate of whether the
relationship indicated by ρs is probably due to chance or to some systematic relationship
between the two variables tested. Significance levels as shown in the tables indicate the
likelihood that the relationship is merely due to chance; therefore, values closer to zero
TABLE 2
SPEARMAN'S RHO RANK CORRELATION COEFFICIENTS
ATLANTIC FLEET
[Correlation coefficients, case counts, and significance levels not legible in this copy.]
TABLE 3
SPEARMAN'S RHO RANK CORRELATION COEFFICIENTS
PACIFIC FLEET
[Correlation coefficients, case counts, and significance levels not legible in this copy.]
allow the researcher to reject the null hypothesis (that the two variables are unrelated)
and conclude that the two variables are related.
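The chance probability reported as a significance level can be approximated directly with a permutation test: shuffling one variable destroys any real association, so the fraction of shuffles producing a coefficient at least as extreme as the observed one estimates the probability the relationship is due to chance. The ship measures below are invented; for tie-free data, ρs reduces to the familiar 1 - 6Σd²/(n(n² - 1)) shortcut used here.

```python
import random

def rho(x, y):
    # Tie-free shortcut: rho = 1 - 6*sum(d^2) / (n*(n^2 - 1)).
    n = len(x)
    rank = lambda v: [sorted(v).index(e) + 1 for e in v]
    d2 = sum((a - b) ** 2 for a, b in zip(rank(x), rank(y)))
    return 1 - 6 * d2 / (n * (n * n - 1))

random.seed(0)                        # reproducible shuffles
x = [35, 40, 48, 55, 62, 70]          # invented ship measures
y = [70, 78, 76, 81, 90, 93]
observed = rho(x, y)

trials = 2000
extreme = sum(1 for _ in range(trials)
              if abs(rho(x, random.sample(y, len(y)))) >= abs(observed))
print(observed, extreme / trials)     # a small fraction argues against chance
```

The tabulated significance levels play the same role: a value near zero means a shuffled, unrelated pairing would rarely produce a ρs that strong.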
Once it appears that the two variables are related to each other, the sign (- or +)
of ρs should be reexamined. This is the point at which extreme caution should be
taken with regard to interpretation. Many of the variables indicate poorer performance
as their values increase. For example, "percent time in C-4 (not combat ready)": as it
increases, the readiness of the ship decreases; "average years since attending LMET": as
this value increases, the LMET training (if any) of personnel onboard is "rustier".
Ships with higher
percentages of officers who had attended at least one LMET course tended to have
higher rates of retention of first-term personnel. Reinforcing this finding is the
negative ρs between first-term retention and average years since attendance for officers.
C. SUMMARY
As shown in Tables 2 and 3, the majority of "cells" did not indicate a significant
relationship between MOE and LMET variables. In fact, correlations which did
appear significant were not present in both fleets. Many of the significant correlations
were counterintuitive and so deserve closer scrutiny. Results were, at best, mixed and
did not point to any strong relationships between LMET and fleet effectiveness,
although weak relationships were shown for some measures, usually on the order of ρs =
.19 to .27. These results can lead to tempered conclusions and several suggestions
for improved research in the area of LMET evaluation and fleet measures of
effectiveness.
VI. CONCLUSIONS AND RECOMMENDATIONS
A. CONCLUSIONS
There is a measurable relationship between LMET and several fleet MOEs;
however, this relationship was not consistent across fleets and was mixed or
counterintuitive in some cases. For the Atlantic Fleet, only two measures had
significant relationships to any LMET variables: 3M inspection scores and first-term
retention. The relationships were opposite for officer and enlisted LMET, the latter
having a counterintuitive relationship with 3M.
For the Pacific Fleet ships, there were many more significant relationships, but
they were primarily in two areas: C-ratings and retention. Once again the results were
often the opposite of those expected: the more time the ship spent in C-1 or C-2 (combat ready
or substantially combat ready), the fewer personnel had attended LMET. Officer
LMET did correlate positively with career retention, indicating some benefit from
LMET.
Why were the relationships between LMET and fleet effectiveness scant and in
some respects, counterintuitive? There are several possibilities:
* The data on LMET attendance may not be accurate. The system for reporting
  course attendance to NITRAS contains a number of "holes" and "bugs", leading
  to a reputation for unreliable data, especially with regard to older data (LMET
  records were scanned as far back as 1978, when the course was first taught).
* The data on measures of effectiveness lack clear reliability and validity. High
  year-to-year variability and instability of ship performance measures were found
  by Chatfield and Morrison of NPRDC [Ref. 30]. Some measures used could be
  considered parochial (OPPE, SMI) and thus were not truly indicative of
  shipwide performance.
* LMET may not be having the desired effect on attendees' subsequent
performance. It may in fact be counterproductive.
* LMET may sensitize graduates to imperfections in the fleet environment
causing them to be less likely to reenlist (see Chapter V, Table 2).
* Competencies and behaviors learned in LMET may not be reinforced
  (rewarded) in the fleet. Behaviors not at least intermittently rewarded (through
  recognition and approval) tend to extinguish rapidly. No measure of the degree of
  command support for LMET could be made; only the number of people in
  each command with LMET was measured, not the extent to which LMET
  competency use is rewarded and reinforced. As stated by George Eggert in an
  article on management development, "It is unwise to 'develop' behavior in
  training programs that will not be reinforced back on the job" [Ref. 33].
* Selection bias regarding who attends LMET (especially TAD) may be occurring.
  Supervisors and department heads might tend to send those whom the ship can
  best afford to lose for two weeks rather than reward their best performers with
  two weeks of leadership training. LMET may also be given to higher
  proportions of junior versus senior supervisors.
* Sending people TAD to LMET may leave those ships that send more personnel
  shorthanded, temporarily compromising readiness.
* The differences in the results for the Pacific and Atlantic Fleets may reflect
  variation in:
  * inspection standards and differences in inspection teams
  * operating schedules and missions
  * frequency of inspections, drills, and distinguished visitors
  * reporting methods and criteria
  * fleet "climate" (prevailing attitudes and priorities)
  * LMET sites, instructors, etc. (East vs. West Coast)
Though essentially speculative, the possible reasons behind the trends in the data
point to a need for further evaluation. Not only should the method and content of
LMET be examined but also the methods by which the Navy measures the
effectiveness of its fleets.
The ultimate standard that must be used in judging the usefulness of any
organizational program is whether or not that program is making an impact on "the
bottom line." For the U.S. Navy, the bottom line is not a profit figure, but something
called readiness or effectiveness. Measuring this effectiveness is something the Navy
attempts to do in many ways--only a few of which were included in this study. These
data are collected for a myriad of purposes, among which are:
* to ensure units are meeting minimum standards
* to aid in operational planning
* to gauge whether a unit is prepared for deployment or operational assignments
* to provide input for unit awards
Often secondary uses of the information, such as input to the Commanding
Officer's fitness report or unit award nominations, can obstruct the accurate
recording/reporting of data. Inspections often become an end unto themselves. The
objective becomes passing the inspection, instead of maximizing overall effectiveness
and capability. A CO may delay the transfer of key supervisors onboard until after a
major inspection and then lose a substantial proportion of his key people just prior to
deployment. Many ships use their "first teams" for inspection/training drills as much
as possible, allowing other watch sections to lag in their proficiency.
All of the measures available for this thesis have at least some limitations, ranging
from inconsistent reporting methods, to ship-to-ship differences in standards, to
inflation of grades, to "fudged" reports. Even the data regarding LMET attendance are
regarded as potentially unreliable, probably in the direction of not including everyone
who had attended LMET (the technicians at NITRAS could not assure the authors
that all LMET attendance had been recorded for all courses since 1978).
Some additional considerations and limitations in applying and interpreting the
results of this study:
* The technical nature of 3M inspections and OPPEs: these measures are
  influenced by the seniority and skill of personnel in certain
  divisions/departments.
* Wide variation in ship type, which in turn means differences in mission,
  operational schedule, homeport, command climate, etc.
* Variations in ships' age, maintenance status, and crew mix (senior/junior,
  officer/enlisted).
* No control group: as Davies mentioned, the Navy's policy is not to
  systematically deny LMET training to any targeted group [Ref. 19]. In fact,
  Navy policy is to send all personnel headed for sea duty to LMET (see Chapter
  I).
* The results of past studies regarding LMET effects on behavior change
  [Refs. 20, 21, 22] indicated no systematic relationship between LMET and
  improved leader performance.
The LMET program has gone unevaluated with regard to its effect on
operational unit performance for nearly ten years now. To date (including this study),
no evidence has been found that the training helped a single ship. If nothing else, this
study should provoke more extensive efforts to evaluate LMET against effectiveness
measures on a Navy-wide basis.
B. RECOMMENDATIONS
1. Improvements to LMET
The following recommendations for improvements to LMET are based primarily on
the literature review:
* Continue current efforts to include aspects of the Model for Command
  Effectiveness [Ref. 24] in LMET curricula. This will give the LMET graduate
  more of a "systems" view of an organization: how the components and forces
  within and external to the organization interact and impact upon the overall
  effectiveness of the unit.
* Make further efforts to tailor the courses to the developmental needs of officers
  and NCOs by stage (early or mid-career) and by field (warfare or staff
  community).
* Instructors should be closely screened. They should be volunteers, have
  superior training/facilitation skills, and be proven in leadership performance.
  This recommendation may be difficult to implement given the reputation of
  LMET and HRM billets as "not career enhancing".
* Reexamine the methodology: does LMET include the best known state-of-the-art
  methods for teaching the objectives of LMET? If time is limited, are
  learning objectives prioritized and the most effort spent on the most
  important/difficult areas?
* Given improvements in LMET delivery and content over the last few years,
  provide more opportunities for attendance by NCOs. Officer training has been
  systematically included in the transfer/training pipeline (SWOS Basic, SWOS
  Department Head School, PCO/PXO School). Soon, if not already, many
  officers will have received LMET at all three levels--not as a redundant course,
  but at appropriate depth and emphasis for each target group. Enlisted
  supervisors in this study received LMET in consistently lower proportions than
  their officer counterparts.
* One promising program is the "mobile" LMET team that can conduct the
  training for all levels (in appropriate groups) within a single unit. The teams
  are often able to incorporate unit-specific issues into the course. This technique
  may be one of the best methods for achieving "critical mass" [Ref. 23], a
  sufficient proportion of LMET-trained supervisors within the unit to assure
  agreement with and support of LMET competencies.
2. Areas for Future Research
* Use other measures of effectiveness in a similar study:
  * Monthly Training Reports
  * PQS accomplishment
  * Departmental Excellence Awards
  * INSURV
  * NWTI (Nuclear Weapons Technical Inspections)
* Develop a control group: a group of ships on which no members have had
  LMET; then measure their accomplishments over a period of time compared to
  similar ships with LMET-trained personnel.
* Expand the study to other Navy communities: aviation, submarine, special
  warfare, shore, etc.
* The authors also agree with, and offer, several recommendations made by
  Chatfield and Morrison regarding Navy MOEs:
  * Commit resources for improving measures of readiness and develop new,
    carefully constructed indices of ship readiness.
  * Measures should assess true operational readiness rather than
    administrative procedures and equipment status.
  * Assessments should be perceived to contribute to the capability of the unit
    and not as inspections for inspection's sake.
  * Centralize and automate all readiness rating recording and analysis.
    Eliminate redundancy and weight measures according to their importance
    to the fleet.
  * Use only external assessment teams or individuals.
  * "Surprise" inspections should be random and live up to their name.
  * Standardize the content, administration, and analysis of assessments.
    [Ref. 30]
* Leadership training should be included in studies using multivariate models to
predict organizational effectiveness and productivity.
* Experiment with the optimum form of LMET: maximum
  benefit-to-cost ratio, optimal career points for training, optimum course length
  for each career point.
* Implement surveys of TAD graduates' COs at all LMET sites (see Chapter III,
  LMET sites).
C. SUMMARY
The Navy, like other military organizations, faces complex leadership problems
and therefore complex training issues. Leaders must be capable of converting their
methods and styles to combat situations as quickly as the need arises.
Without a doubt, the quality of available leadership at all levels determines the
character of an organization and the effectiveness with which it accomplishes its
objectives. Accordingly, the development of individuals who occupy leadership
positions is one of the most critical functions in any organization.
Although difficult when conducted properly, effective training for leadership is
feasible. Despite the fact that the field is in a state of disarray and many
programs are not very effective, there is sufficient evidence to conclude that
leadership can be taught when training is sincerely deemed important by
management and when it is thoughtfully designed and carefully implemented.
[Ref. 34]
APPENDIX A
VARIABLE LIST
APPENDIX B
Indicated only for courses that were not strictly LMET courses
"APPENDIXC
L• ATTENDANCE FIGURES FOR LMET COURSES
Course Name Number Attended % Attended
Attended no LMET 26,774 51.65
Recruit Company Commander 200 .39
LMET Instructor 37 .07
SWOS Department Head 972 1.88
Prospective Commanding Officer 310 .60
Prospective Executive Officer 373 .72
SWOS Basic 3,622 6.99
BT/MM Six Year Obligor LMET 511 .99
LMET for Leading Petty Officers 14,441 27.86
LMET for Leading Chief Petty Officers 3,453 6.66
LMET for Division Officers 323 .62
LMET for Aviation Division Officers 98 .19
Supply Corps Officer Basic 643 1.24
Officer Indoctrination School 76 .15
Of those who attended LMET 88.2 percent attended only one course; 10.2
percent attended two courses; 1.5 percent attended three; and .1 percent attended four
or more courses.
APPENDIX D
DISTRIBUTION OF YEAR OF LMET ATTENDANCE
Year     Frequency    Percent
1979           505        2.1
1980         1,170        4.8
1981         2,372        9.7
1982         4,913       20.0
1983         5,797       23.7
1984         4,994       20.4
Total       24,505      100.0
[Rows for some years are not legible in this copy.]
LIST OF REFERENCES
2. Ecker, George, A History of LMET, 4th ed., McBer and Company in conjunction
with Naval Military Personnel Command (NMPC-62), July 1987.
4. "Navy Seen Falling Short in POs' Leadership Training", Navy Times, pp. 1 and
26, January 26, 1987.
8. Klemp, George 0., Munger, M. T., and Spencer, L. M., Analysis of Leadership
and Management Competencies of Commissioned and Non-commissioned Officers in
the Pacific and Atlantic Fleets. McBer and Company, Boston, MA, 1977.
9. Auel, C., "Leadership and Management Education and Training Long Range
Study Proposal," Pensacola, FL: Navy Education and Training Program
Development Center (NETPDC), February 4, 1975.
10. Goleman, Daniel, "The New Competency Tests: Matching the Right People to
the Right Jobs", Psychology Today, pp. 35-46, January 1981.
11. McClelland, David C., "A Competency Model for Human Resource Management
Specialists to be Used in the Delivery of the Human Resource Management
Cycle", McBer and Co., Boston, MA, June 19, 1975.
12. McClelland, David C., The Achieving Society, Van Nostrand Reinhold, New
York, 1961.
13. McClelland, David C., "Testing for Competence Rather Than for 'Intelligence'",
American Psychologist, pp. 1-14, January 1973.
14. Flanagan, J. C., "The Critical Incident Technique", Psychological Bulletin, pp.
327-358, 1954.
15. Winter, D. G., "An Introduction to LMET Theory and Research", McBer and
Co., Boston, MA, August 8, 1979.
16. SDC (System Development Corporation) reports:
Minton, Margaret E., Novelli, Luke Jr., and Saad, Katherine J., "Results of an
Assessment of the Navy Leadership and Management Education and Training
(LMET) Leading Chief Petty Officer (LCPO) Course", February 9, 1979.
Grace, Gloria L. et. al., "Results of an Assessment of the Navy LMET Instructor
Course", March 15, 1979.
Minton, Margaret E., Saad, Katherine J., and Grace, Gloria L., "Results of an
Assessment of the Navy LMET Leading Petty Officer (LPO) Course", March 23,
1979.
Grace, Gloria L. and Meinken, Joan L., "Final Report on Assessment Task
Order EG.04: Navy LMET Project Development", March 16, 1979.
Minton, Margaret E., Saad, Katherine J., and Grace, Gloria L., "Results of an
Assessment of the Navy LMET Prospective Commanding Officer / Prospective
Executive Officer (PCO/PXO) Course", June 4, 1979.
17. Magnus, Gary S., YNC, USN, "LMET The Change!!!!", The Navy Human
Resource Management Journal, Fall '79/Winter '80.
19. Davies, John E., A Plan for the Evaluation of Leadership Training in the United
States Army, M.S. Thesis, Naval Postgraduate School, Monterey, CA, June 1980.
21. Abe, Gary K. and Babylon, William T., Delegation: A Competency of Superior
Performers?, M.S. Thesis, Naval Postgraduate School, Monterey, CA, December
1982.
22. Foley, Patricia G., From Classroom to Wardroom: Internalizing, Integrating and
Reinforcing Leadership and Management Education and Training (LMET) Skills in
the Wardroom, M.S. Thesis, Naval Postgraduate School, Monterey, CA, December
1983.
24. McBer and Company, Command Effectiveness in the United States Navy: Final
Report, prepared under Navy Contract #N00600-81-D-3498 in conjunction with
Naval Military Personnel Command (NMPC-62), October 1985.
25. McBer and Company, Command Effectiveness in the United States Navy: Survey
Data Addendum, prepared under Navy Contract #N00600-81-D-3498 in
conjunction with NMPC-62, October 1986.
27. Brethower, Karen S. and Rummler, Geary A., "Evaluating Training", Training and
Development Journal, pp. 14-22, May 1979.
28. Institute for Defense Analysis, Evaluating Navy Manpower, Personnel and
Training Policies in Terms of Performance, by Stanley A. Horowitz, March 1986.
29. Davilli, Thomas B. and Schenzel, William P., Predicting Shipboard Readiness
Utilizing Information and Proxies Currently Available: Reports, Exercises and
Statistics, M.S. Thesis, Naval Postgraduate School, Monterey, CA, December
1986.
30. Navy Personnel Research and Development Center (NPRDC), San Diego,
TN 87-37, Preparing to Evaluate Officer Career Path Changes: Pre-Change
Database Development, by Robert E. Chatfield and Robert F. Morrison, August
1987.
31. General Accounting Office (GAO), The Unit Status and Identity Report
(UNITREP) System: What It Does and Does Not Measure, Government Printing
Office, Washington, D.C., March 12, 1984.
32. Berenson, Mark L. and Levine, David M., Basic Business Statistics: Concepts
and Applications, 3d ed., Prentice Hall, Englewood Cliffs, NJ, 1986.
33. Eggert, George L., "Management Development: Destroying Myths, Solving
Problems", Data Management, pp. 48-59, October 1977; cited by Taylor, Robert
M., Air Force Professional Military Education and Executive
Leadership Development: A Summary and Annotated Bibliography, Colorado: U.S.
Air Force Academy, January 1980.
34. Human Resources Research Organization, Alexandria, VA, TR 80-2, Leadership
Training: The State of the Art, by Joseph A. Olmstead, October 1980.
BIBLIOGRAPHY
Brown, Ralph J. and Somerville, James D., "Evaluation of Management Development
Programs...An Innovative Approach", Personnel, July-August 1977.
Chief of Naval Operations, LMET Navy Training Plan, Washington, DC, 12 February
1979.
Center for Naval Analyses, Crew Characteristics and Ship Condition (Maintenance
Personnel Effectiveness Study), by Stanley A. Horowitz and Allan Sherman, CDR,
USN, March 1987.
McBer and Company, Letter to Navy Personnel Research and Development Center,
Subject: LMET Pilot Study Design, January 8, 1981.
McBer and Company, Boston, MA, A Profile of Exemplary Commanding Officers and
Executive Officers, prepared under Navy contract #N00600-81-D-3498, December 1983.
McBer and Company, Boston, MA, An Interim Report on the Pilot Command
Effectiveness Study, prepared under US Navy contract #N00600-81-D-3498, July 1984.
McClelland, David C. and Burnham, David H., "Power is the Great Motivator",
Harvard Business Review, March-April 1976.
Malone, Dandridge M. and McGee, Michael, "Jazz Musicians and Algonquin Indians",
Military Review, December 1985.
Navy Personnel Research and Development Center, Human Resource Management and
Operational Readiness as Measured by Refresher Training on Navy Ships by Sandra J.
Mumford, February 1976.
Student Journal for Leadership and Management Education and Training for Aviation
Division Officers, U.S. Government Printing Office, Washington, D.C., 1981.
Student Journal for Leadership and Management Education and Training for Division
Officers, Supply School, obtained from Naval Supply Corps School in Athens, GA,
revised January 1986.
Turabian, Kate L., A Manual for Writers of Term Papers, Theses, and Dissertations, 4th
ed., Chicago: University of Chicago Press, 1973.
INITIAL DISTRIBUTION LIST
No. Copies
1. Defense Technical Information Center 2
Cameron Station
Alexandria, VA 22304-6145
2. Library, Code 0142 2
Naval Postgraduate School
Monterey, CA 93943-5002
3. Naval Military Personnel Command (NMPC-62) 2
Washington, D.C. 20370-5000
4. Dr. Carson K. Eoyang 2
Department of Administrative Science (Code 54)
Naval Postgraduate School
Monterey, CA 93943
5. Dr. Stephen Mehay
Department of Administrative Science (Code 54)
Naval Postgraduate School
Monterey, CA 93943
6. Dr. David R. Whipple
Department of Administrative Science (Code 54)
Naval Postgraduate School
Monterey, CA 93943
7. Dr. Ben Roberts
Department of Administrative Science (Code 54)
Naval Postgraduate School
Monterey, CA 93943
8. Naval Training Systems Center 1
Representative Atlantic, CEP 160
Naval Base
Norfolk, VA 23511-5121
Attn: Mike Glenn
9. Commander 1
Naval Surface Force U. S. Atlantic Fleet
Norfolk, VA 23511
10. Chief of Naval Operations (OP-64) 1
Washington, D.C. 20350
11. Commander 1
Naval Surface Force U. S. Pacific Fleet
San Diego, CA 92125-5035
12. Chief of Naval Education and Training
Naval Air Station
Pensacola, FL 32508
13. Navy Personnel Research and Development Center (NPRDC)
San Diego, CA 92152-6800
Attn: Dr. Robert Morrison
14. LCDR David P. Polley
USS Fresno (LST 1182)
FPO San Francisco, CA 96665-1803
15. LT Teresa C. Cissell
Naval Military Personnel Command
NMPC-60
Washington, D.C. 20370-5000