Assessment of quality indicators in Spanish higher military education


Evaluation indicators can be conveniently adapted for application in the military education system. These evaluation processes should be supported by reliable data obtained through self-evaluation, regular evaluation and specific evaluation, so that the system's structure and functioning can be developed and a continuous improvement plan implemented.

Keywords: evaluation; higher military education; content analysis; assessment quality indicators

Theoretical framework
The emergence of quality as a management science dates back to the 1920s, within the business and administrative world. Companies coined different ways to measure and assess quality. Over time, this phenomenon passed through different paradigms, such as quality inspection, quality control and quality assurance, each proposing models for the problems that arose, until arriving at the total quality management of modern companies, [8].
We must understand quality as an "abstract concept, so wide in definition and application that each organization must understand it from its own interests", [11]. Therefore, the concept of quality is the result of an agreement between those who intend to establish features of an object or matter in a certain place, time and conditions. As this is a construction, it cannot be univocal or monolithic, [10], and it generates controversies, as there will always be opposed interests in such a construction. Traba, Barletta and Velázquez [11] cite Armand Feigenbaum, considered a seminal author on quality, whose conceptualization presents two essential aspects: quality must be defined in terms of client satisfaction, and quality is multidimensional, so it must be comprehensively defined.
To approach the idea of quality we resort to evaluation as a way to discern it, even if we cannot claim to have fully understood the concept just by using certain measurements, [9]. Quality assessment is subject to multiple interpretations regarding its content and goals, and also with regard to the methods and techniques to be used. Even if assessment purposes vary widely, most of them pursue a triple objective, [1]: improvement, accountability and information.
Sverdlick [10] points out that quality in education management was extrapolated from the business sphere and linked to the idea of control in the production process. When we apply the concept of quality to the educational sphere, we need to consider the position from which we do it, which involves the purpose, sense and functions of education. That is, we put at stake its functions, learning-teaching processes, teachers' action and assessment, considering the education system as a whole.
At present, the concept of quality evaluation has been replaced by the management of educational quality. According to Mateo, [6]: "the new culture of evaluation is no longer directed at penalty, classification and selection, but at providing reasoned and reasonable information aimed at guiding educational improvement management". This change of conception has emerged because we live in a changing society in which training and education have become an essential objective in all countries, [7].
These considerations can also be applied to the Military Education System and its assessment, whose main objective is "facilitating an assessment process to improve military education quality, obtaining reliable data on the suitability of its structure and operation for the objectives assigned and making the subsequent implementation of continuous improvement plans possible". According to the fifth paragraph of the annex of Ministerial Order 51/2004, the assessment of military education shall be carried out through self-evaluation, external evaluation and specific evaluation:
• Self-evaluation: an internal teaching reflection process carried out by the centre's Self-evaluation Team, appointed for that purpose. The centre, following the action patterns recorded in the Self-evaluation Guide, describes and evaluates its situation with regard to the criteria established, identifying the necessary improvement proposals. The result of this work is the Self-evaluation Report.
• External evaluation: a group of evaluators external to the assessed education centre is appointed to form the Unit of Experts which, following the guidelines contained in the Guide for External Evaluation and supported by the Self-evaluation Team, analyses the level of compliance with the proposals recorded in due course in the Self-evaluation Report. The results of the external evaluations are recorded in the External Evaluation Report.
• Specific evaluation: its object is "assessing the training received to fulfil the functions typical of each command, rank and speciality". Such evaluation covers both the specific evaluation of the curriculum of officers and the specific evaluation of the curricula of professional troop and marine soldiers.
External evaluation is the assessment process of Higher Military Centres in which the Unit of Experts on External Assessment identifies strengths and weaknesses in the aspects relevant to the self-evaluation process, such as the organization of the centre, the development of curricula, the teachers' regime, the students' regime, and the assessment, qualification and classification systems, and prepares the appropriate recommendations. The resulting report becomes a channel to establish mechanisms that facilitate continuous improvement and lend transparency and independence to the assessment system.

Objectives
The objectives of this research work are:
• Conceptualizing evaluation as an element of judgement in the decision-making process to change and improve Higher Military Education.
• Delimiting the evaluative dimensions and elements, which will have an effect on the questionnaires to guarantee quality education in Higher Military Education.

Methodology
The technique used to collect the information was content analysis, understood as "the whole of analysis techniques of communications designed to obtain indicators, whether quantitative or not, by systematic and objective procedures of description of the content of messages, allowing the inference of knowledge related to the production/reception conditions of the context of such messages", [2, 4 and 3], which facilitated the extraction of relevant indicators to evaluate higher military education.
Therefore, we started from a comprehensive analysis of the indicator system suggested by ANECA (National Agency for Quality Assessment and Accreditation of Spain) as applied to Higher Military Education. In addition, we analysed the quality indicator systems of the following national Higher Education evaluation institutions and agencies: the University of Chile, the University of Paraguay, the Canary Agency for Quality Assessment and University Accreditation, the Agency for the Quality of the University Education System of Castilla y León, and the National Agency for Quality Assessment and Accreditation regarding Higher Education.
In this way, we selected all the indicators related to the different dimensions assessed in Higher Military Education and proceeded to the subsequent analysis and comparison of these indicators in order to complete the information necessary to improve the assessment process.

Data analysis
The data analysis compares, for each dimension considered, the quality indicators of Higher Military Education suggested by ANECA with those suggested by the other assessment agencies. On this basis, we added or removed indicators depending on their advisability, correct wording or purpose, suppressing those that were poorly written, repeated or attached to more than one dimension. Table 1 below shows a global synthesis of the different dimensions analysed in the quality assessment indicators of Higher Military Education.
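The consolidation rules described above (suppressing indicators that are repeated or attached to more than one dimension) can be sketched programmatically. The following is a minimal illustrative sketch only: the actual study is a qualitative content analysis, and all dimension and indicator names below are hypothetical examples, not items from the ANECA system.

```python
def consolidate(indicators):
    """indicators: list of (dimension, indicator_text) pairs.

    Returns the indicators kept per dimension after suppressing exact
    repetitions and indicators assigned to more than one dimension,
    mirroring the decision rules described in the text."""
    # Record the set of distinct dimensions each indicator appears under.
    dims_per_indicator = {}
    for dim, text in indicators:
        dims_per_indicator.setdefault(text, set()).add(dim)

    kept = {}
    seen = set()
    for dim, text in indicators:
        if len(dims_per_indicator[text]) > 1:
            continue  # attached to more than one dimension: suppress
        if text in seen:
            continue  # exact repetition: suppress
        seen.add(text)
        kept.setdefault(dim, []).append(text)
    return kept

# Hypothetical sample data for illustration only.
sample = [
    ("Centre organization", "graduate feedback on professional training"),
    ("Centre organization", "graduate feedback on professional training"),  # repeated
    ("Curriculum structure", "external internship credits"),
    ("Students system", "scholarship programme information"),
    ("Curriculum structure", "scholarship programme information"),  # two dimensions
]
result = consolidate(sample)
```

Here `result` retains one copy of the repeated indicator and drops the indicator attached to two dimensions, which is the behaviour the consolidation criteria describe.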
Table 1. Synthesis of the analysed dimensions (columns: Dimension, Categories, Decision-making explanation)

1. Centre organization

Teaching organization.
• It is necessary to include in this section indicators related to informative feedback from the opinions of graduates with regard to their professional training, guaranteeing a suitable adaptation of training processes to the needs detected by graduates.
• On the other hand, it would be advisable to define and make public the goals of the degree with clarity, both from a basic viewpoint related to education and research, and as a declaration of ethical and philosophical principles.
• The system of indicators suggested to evaluate higher military education has not established any section regarding the acquisition of an appropriate level of competence in a second language, in accordance with graduates' needs.
• Likewise, it would be advisable to favour the participation of the different actors of the educational community to include them in the management mechanisms of the different resources.
• Finally, the system of indicators suggested does not mention the importance of applying the ECTS system to assess the dedication required of students.

Internal rules
• This section does not include any indicators related to the satisfaction of the different actors intervening in the processes.

2. Curriculum organization

2.1. Objectives of the curriculum
• It is advisable to include a consistent monitoring between the curriculum proposed and the graduates' profile.
• Likewise, no patterns have been established to associate the degrees proposed with the socioeconomic profile of the environment on the basis of the demands and/or suggestions posed, even if this may not be necessary in certain cases.
• It is necessary to explain the revision mechanisms of the degree, mainly in order to adapt it to possible changes in social demand or in the diversification of the educational offer.

Curriculum structure.
• The indicators suggested to evaluate higher military education do not consider general training areas, speciality training, professional training and practical training in order to integrate them in the degree.
• There is no assessment of possible internships outside the education centres during the training period. It would be advisable to include an evaluation of the number of credits of such internships, as well as of the evidence that such internships contribute to graduates' education.

Faculty system
• The mechanisms available for professors to work with other members of the professional community in design and renovation activities of study programmes and teaching quality improvement have not been described.

Administration and Services staff
• After the content analysis, this field does not require the inclusion of any new indicator.

4. Material resources

4.1. Classrooms and other teaching spaces
• The indicators proposed to assess higher military education are more complete than those suggested by the evaluation agencies consulted; therefore, it is not necessary to include any additional indicator for this criterion.

4.2. Work spaces

5. Curriculum development

5.1. Students' system
• No information has been established with regard to financial or academic scholarship programmes.

Development of the teaching-learning process
• It would be advisable to include the term "updated" in the use of teaching strategies, methodologies and techniques applied in the degree.

Evaluation, assessment and classification system
• In the evaluation process, it is necessary to include a systematic analysis of the efficiency of the learning-teaching process, so that it can be adjusted with a view to possible improvement.
• Likewise, it is necessary to show the existing correlations between the methods used and the nature of the expected learning.
• As stated above, it is important to include indicators establishing the importance of evaluation as an essential part of learning, from which students receive feedback.

Results
• No indicators have been included on the commitment to obtaining results and social return in R&D matters.

Conclusions, difficulties and prospective
After the analysis performed, we can conclude that, in general, the indicators under assessment are not based on evidence of degree users' satisfaction. For example, to evaluate the students' system, no indicators based on users' own satisfaction were suggested for this criterion, which prevents the proper feedback that would allow a training evaluation process oriented to continuous improvement. Likewise, we think that such feedback, [5], should have the widest perspective possible, covering all educational agents: management and administration staff, teachers, students and graduates.
On the other hand, considering the data analysed, we have noticed an evident disproportion between certain evaluation dimensions and others, especially with regard to assessment indicators on spaces and premises intended for teaching, which are much more numerous and specific than others of higher qualitative relevance in the global training process, according to the analysis of the indicators offered by the different agencies and institutions with evaluative functions. Finally, it is worth noting the scarcity of information, in terms of indicators, drawn from graduate students. It would be relevant to implement a long-term evaluation process including this collective, to facilitate, in a well-founded manner, the development of continuous education throughout their professional careers.
Regarding the content analysis of the documents and indicators proposed by ANECA to assess Higher Military Education, we suggest incorporating indicators related to the aspects that are most underrepresented in the evaluative and data-collection processes. We therefore suggest reformulating some of the data-collection instruments, especially the questionnaires used for each of the agents involved, in order to obtain a more complete vision of the training process covering a greater number of evaluative dimensions.