Evaluation of Clinical Engineering Department Services in Riyadh City Hospitals

BACKGROUND: Since its establishment, clinical engineering in healthcare facilities has evolved rapidly owing to the increased employment of highly trained staff. The clinical engineering department is a critical factor in successful healthcare management.
OBJECTIVE: This study developed an integrated evaluation method for the services rendered by clinical engineering departments, using two questionnaires supplied to governmental hospitals in Riyadh, Saudi Arabia.
METHODS: One questionnaire is an evaluation by the end users (medical department staff), while the other is an evaluation questionnaire for the clinical engineering department staff.
RESULTS: The overall evaluation of the administrative, training, and technical skills of clinical engineering department staff by medical department staff was very good, with means of 4.07±1.09, 3.98±0.74, and 3.8±1.14, respectively. Hospital size affects the technical and training skills of the clinical engineering department's staff, with a p-value less than 0.05 at the 95% confidence interval. It also affects the maintenance management system. The professional role of medical department staff had no effect on their satisfaction with the services provided by the clinical engineering department.
CONCLUSION: The procedures, standards, and basic requirements established by the Saudi Arabian Ministry of Health for clinical engineering department services were applied in all hospitals, but the maintenance management system should be fixed regardless of hospital size.


Introduction
Healthcare management at most hospitals comprises, first, human resources and, second, medical equipment. Both these resources carry equal weight; they are equally prioritized for a hospital to function at maximum capacity.
Medical technology is the foundation of healthcare services. Today, nearly all diagnostics and treatment depend on technology. Developments within this industry have delivered monumental innovations in medical equipment, much of it more advanced and complex than before. However, increased complexity in technology also entails increased difficulty in equipment management. In medical practice, the safe use of medical equipment requires proper maintenance and management. In the last decade, this has given rise to a new field in healthcare known as medical equipment management. This type of management is a vital parameter that determines the improvement of healthcare outcomes [1].
The maintenance process for medical equipment depends on the size of a hospital and the nature of its operations. The quantity of medical equipment in hospitals, as well as its complexity and cost,¹ is expected to increase periodically. As a result, hospitals see a greater need to establish dedicated management for equipment maintenance. Such management requires both technological capabilities and trained staff to handle and maintain the equipment according to modern methods of clinical engineering management [3].
Medical equipment management is currently moving away from separate management within individual departments toward a more centralized management represented by a clinical engineering department (CED). Based on the use rate and management efficiency of equipment, "a clinical engineer is defined as a specialist who maintains and improves patient care by directing engineering and managerial services to health care technology" [1][4]. Importantly, a clinical engineer is distinguished from a biomedical engineer by her or his working environment in a hospital as well as by a managerial role [4]. Earlier, health staff found it difficult to clarify the role of the CED because of the absence of evidence on the suitability and benefits of such a service in a hospital [5].
In the late sixties and early seventies, patient safety was primarily a concern for clinical engineers [6][7]. Thereafter, engineers began to take interest in equipment procurement, product testing, and user training. Over time, the approach of equipment management came to dominate the field, from initial selection of equipment to approval in all stages of equipment life cycle [7].
The use of inappropriately managed equipment or faulty equipment could cause injury to the patient, staff, as well as visitors. Such hazards are not uncommon in daily hospital management and are often attributed to an absence of an appropriate hospital equipment management plan [8][9]. Thus, the role and responsibilities of the CED have increased to help efficiently manage medical technology use as well as to ensure seamless synergy between technology and clinical practice [6].
In 1985, Pacela presented quantitative data, surveying facilities, staffing, wages, benefits, computer equipment, and quality control in clinical engineering departments [10]. Many studies have evaluated the role of the clinical engineering department in the training of medical staff and in the management of equipment and maintenance. A questionnaire survey was carried out in late 1987 and early 1988 regarding equipment preparation, maintenance, and final disposition. This questionnaire tried to determine to what degree these departments were involved in their function, their level of resources and assignment volumes, their kinds of technology, whether their oversight authority satisfied them, and whether their institution's position was understood [11].
At the same time, clinical engineers began assessing the efficacy of metrics in hospitals. For example, Yadin and Rohe introduced a model to measure the effectiveness of productivity measurements [12].
¹ In the Saudi Arabian context (2019), the market for medical equipment is estimated to be just under US$2 billion [2].
Glouhova et al. conducted a medical engineering effectiveness survey using the Frize model [13]. The authors profiled the results of the examined clinical engineer services by region, mission, structure, staff, and resources. Similarly, there exist other countrywise reports on clinical engineer services in developing countries [14][15][16][17].
In the early twenty-first century, the Association for the Advancement of Medical Instrumentation organized a subcommittee to develop a measure combining quality and financial metrics to support a standardized clinical engineering benchmark [18]. However, to date, no such metric has been adopted, although Wang et al. did suggest a global failure rate as an efficiency benchmark [19], calculated as the percentage of completed repairs relative to the overall number of devices managed by clinical engineering. Nevertheless, researchers have used questionnaires to determine suitable performance indicators that could establish a scoring system for evaluating selected clinical engineering tasks within the hospital [20][21][22][23][24].
This study conducted a direct and an indirect assessment of CED services within hospitals. A questionnaire was written based on previous surveys in studies by Glouhova et al., Frize, and Frize et al. [11][12][25] to map the status of current services and their relationship with factors such as hospital size, level of education, and experience. To the best of my knowledge, this is the first such study on Saudi Arabia, which is acknowledged to be the second most advanced Arab country and the thirtieth most advanced country globally in terms of human development [26].

Material
In this study, two structured questionnaires were used as the research instruments, designed around the examined population and their professional roles in the hospital. Each questionnaire was designed and written in English and then translated into Arabic. Mostly closed questions were used to elicit quick responses, with room for additional information. Thus, the questionnaire layout was structured to optimize answers. The questionnaire was kept brief (estimated to be completed in 20 minutes) without a confidence appeal.
The first questionnaire was designed to target the CED's staff, which included senior specialists, specialists, and technicians of medical equipment. This questionnaire represented a direct evaluation and was structured into four parts. The first part contained general information (e.g., hospital size, position, experience, and number of internal and external training sessions); the second part evaluated the medical equipment maintenance system (e.g., maintenance strategies and maintenance plan); the third part evaluated the software for maintenance management; and the fourth part evaluated the agents or medical equipment companies (local or international).
The second questionnaire targeted medical equipment users, such as physicians, nursing staff, and medical laboratory staff. This indirect evaluation was structured into two parts: one for general information (e.g., hospital size and position) and another for evaluation of the CED services based on administration, training, and technical skills.

Method
At the time of the study, the Saudi Arabian Ministry of Health (MOH) managed 44 hospitals within Riyadh, the capital city. Most hospitals had a capacity of 50 to over 250 beds, with a few hospitals having over 1,000 beds. Hospital size was used to determine the study population and sample size, dividing hospitals into three categories: hospitals with fewer than 50 beds, hospitals with 50 to 250 beds, and hospitals with more than 250 beds.
Nine hospitals were selected, that is, three hospitals for each category. The sample size was determined using the Yamane method, represented by the following equation [27]: n = N / (1 + N·e²), where n is the sample size, N is the population size, and e is the margin of error at the 95% confidence level. Next, to test the content's effectiveness and to refine the questions relevant to this study, the questionnaires were sent to advisors. After the questionnaires were finalized in a suitable form, each was tested for reliability on a trial sample of 30 respondents [28].
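As an illustration, the Yamane formula can be computed as follows (a minimal sketch; the population figure used here is hypothetical, and e = 0.05 is assumed for the 95% confidence level):

```python
def yamane_sample_size(population: int, margin_of_error: float = 0.05) -> float:
    """Yamane sample-size formula: n = N / (1 + N * e**2)."""
    return population / (1 + population * margin_of_error ** 2)

# Hypothetical population of 600 staff members (illustration only)
print(yamane_sample_size(600))  # prints 240.0
```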
The questionnaires were distributed after they were approved by the MOH (No: 1441-30038 via fax; see Appendix 1); the responses were received via mail, through an electronic link, or through interviews with healthcare providers at different levels. A total of 527 responses were received (358 for the medical department staff questionnaire and 169 for the CED staff questionnaire).
Statistical Package for the Social Sciences (SPSS) version 25 was used for the quantitative data analysis, and the principal investigator was responsible for the data entry. The accuracy of the data was ensured through validity checks, re-entry, and comparison of a selected sub-sample with the original data set. Frequency distribution tables provided the basic descriptive statistics. To assess the significance of the results, appropriate statistical tests were applied.

Results
The current study presents the findings of the two specially designed pretested questionnaires. The questionnaires were labeled the "medical department staff questionnaire" (MDSQ) and the "CED staff questionnaire" (CESQ). A reliability analysis was carried out on the perceived task values scale comprising 30 items; Cronbach's α was 0.939 and 0.845 for the two questionnaires, respectively. Tables 1 and 2 report the data on hospital information and personal characteristics of the participants for both questionnaires. The largest number of participants for the MDSQ (indirect evaluation) came from hospitals with more than 250 beds (N=128; 35.75%), and for the CESQ (direct evaluation) from hospitals with 50 to 250 beds (N=69; 40.83%).
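Cronbach's α for a multi-item scale such as this can be computed from a respondents-by-items score matrix. The sketch below uses toy Likert data, not the study's actual responses:

```python
from statistics import variance  # sample variance (ddof = 1)

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(scores[0])                                  # number of items
    item_vars = [variance(col) for col in zip(*scores)]  # per-item variance
    totals = [sum(row) for row in scores]                # each respondent's total
    return k / (k - 1) * (1 - sum(item_vars) / variance(totals))

# Toy data: 4 respondents answering 3 Likert items (illustration only)
ratings = [[4, 5, 4], [3, 3, 3], [5, 5, 4], [2, 3, 2]]
print(round(cronbach_alpha(ratings), 3))  # prints 0.962
```

Values above roughly 0.8, such as the 0.939 and 0.845 reported here, are conventionally read as good internal consistency.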
Most clinical engineering staff (N=103; 60.95%) had five or fewer years of experience, and most had undergone internal training (N=136; 80.47%) or even external training (N=111; 65.68%). Table 3 reports the descriptive and frequency analysis of the MDSQ responses. The MDSQ is subdivided into three domains; the overall evaluations based on the calculated means are 4.07±1.09, 3.89±0.74, and 3.80±1.14, respectively. Table 4 (a and b) reports the descriptive and frequency analysis of the CESQ responses. Table 4(a) reports the evaluation of the first domain, maintenance management, which consists of 10 binomial (Yes/No) questions. A score was generated to determine the performance level of the maintenance management system; performance was classified into five levels:
• "Lowest performance": fewer than 2 "Yes" answers.
• "Low performance": 2 to fewer than 4 "Yes" answers.
• "Moderate": 4 to fewer than 6 "Yes" answers.
• "Good performance": 6 to fewer than 8 "Yes" answers.
• "Highest performance": 8 to 10 "Yes" answers.
Table 4(b) reports the evaluation of the maintenance management software as well as the performance of the international and local medical equipment agents. The overall evaluations are 4.11±0.72, 3.88±0.61, and 3.73±0.75, respectively.
Normality was tested using the Kolmogorov-Smirnov and Shapiro-Wilk tests; because the measured variables did not fulfil the normality assumption of a two-way analysis of variance, the Kruskal-Wallis test was applied to both questionnaires. Table 5 reports the test results for the MDSQ responses, checking for a significant difference between hospital size groups in participants' evaluations for each domain. The findings revealed no significant difference for the administration skills of the CED staff. However, there exists a significant difference between hospital size groups for training skills, technical skills, and the overall evaluation, with p-values less than 0.05 at the 95% confidence interval. To determine which groups differed on each item, a Mann-Whitney test was applied between the groups as a post hoc test. The results for the training skills evaluation show a difference between hospitals with fewer than 50 beds and hospitals with more than 250 beds at a p-value of 0.001; the mean rank of the hospital group with more than 250 beds is 136.32.
For the technical skills evaluation, the only non-significant difference was between hospital groups with fewer than 50 beds and those with more than 250 beds (p = 0.417), while all other combinations differed significantly (p < 0.05). The overall evaluation showed no difference between hospital size groups, except between the group with fewer than 50 beds and that with 50 to 250 beds, with a p-value of 0.006 and a mean rank of 127.3 for the 50-to-250-beds group. Table 6 reports the results of the Kruskal-Wallis test for the MDSQ responses, checking for a significant difference between professional role groups in participants' evaluations for each domain. There is no significant difference between professional role groups in the evaluation of the CED staff for any domain, with p-values greater than 0.05 at the 95% confidence interval.
Tables 7, 8, and 9 report the Kruskal-Wallis test results for the CESQ responses, checking for a significant difference between hospital size, job position, and experience groups, respectively, in the evaluation of each domain. As shown in Tables 7 and 9, a significant difference was found only for the maintenance management (MM) evaluation domain, with a p-value less than 0.05. To locate the source of this difference, a Mann-Whitney test was applied between the groups as a post hoc test. The results showed no significant difference, except between hospitals with fewer than 50 beds and those with 50 to 250 beds, and, in Table 9, between participants with five or fewer years of experience and those with 16 or more years of experience.
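The analysis pipeline described above (Kruskal-Wallis across groups, then pairwise Mann-Whitney U tests as post hoc comparisons) can be sketched with SciPy; the scores below are hypothetical Likert ratings, not the study's data:

```python
from itertools import combinations
from scipy import stats

# Hypothetical 5-point Likert evaluations from three hospital-size groups
groups = {
    "<50 beds":    [3, 4, 3, 5, 4, 3, 4],
    "50-250 beds": [4, 5, 4, 4, 5, 5, 4],
    ">250 beds":   [5, 4, 5, 5, 4, 5, 5],
}

# Kruskal-Wallis H test across all three groups
h_stat, p_value = stats.kruskal(*groups.values())
print(f"Kruskal-Wallis: H={h_stat:.2f}, p={p_value:.3f}")

# If significant, pairwise Mann-Whitney U tests locate the differing groups
if p_value < 0.05:
    for (name_a, a), (name_b, b) in combinations(groups.items(), 2):
        u_stat, p_pair = stats.mannwhitneyu(a, b, alternative="two-sided")
        print(f"{name_a} vs {name_b}: U={u_stat:.1f}, p={p_pair:.3f}")
```

Running pairwise tests only after a significant omnibus result, as the study does, limits the number of comparisons and hence the inflation of the familywise error rate.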

Discussion
CED management has improved in hospitals, since most hospitals now have an organizational structure for the clinical engineering department. In Saudi Arabia, clinical engineering departments have been introduced recently to undertake training and maintenance tasks in hospitals. According to the available literature, only a few sporadic studies have been conducted on topics related to clinical engineering, particularly in Saudi Arabia; for instance, the studies of Muhammad [29] and Hesham et al. [30] tackled issues related to maintenance in hospitals. The current study is deemed the first to deal with the evaluation of the services provided by the clinical engineering department. One of the important factors affecting the medical equipment management system was hospital size; as mentioned before, this study was conducted in the 44 public hospitals of Riyadh city. The hospitals were classified according to size, or number of beds [31], into three categories. Samples were taken from each category, and most of these hospitals are in the range of 50 to 250 beds [32].
Clinical engineering department services are classified into two main parts in terms of responsibility. The first comes from the clinical engineering department staff in hospitals, who provide administration, training, and technical services. The second comes from outside companies or agencies, which provide additional needed services to the hospitals. In the current study, the results of the two questionnaires were addressed: the first (MDSQ) focused on the evaluation of the three main services (administration, training, and technical) provided by the CED staff, whereas the second (CESQ) mainly focused on the evaluation of maintenance procedures inside the hospital and the maintenance management software program provided by the clinical engineering department and local or international agencies.
As aforementioned, the first questionnaire addressed the evaluation of the administration, training, and technical services of the CED staff. The findings for both administration and training skills were not affected by hospital size or professional role. In terms of administration skills, this might be attributed to the great interest of the MOH in managerial and administrative skills in all health facilities, while for training skills it might be because training is deemed one of the essential characteristics of clinical engineering department staff [33][34][35]. Regarding the technical or performance skills of the CED staff, in terms of the maintenance management system, there was a significant difference with hospital size and experience. This might be due to two reasons. The first is the ratio of CED engineers to technician staff in the hospital; Frize [36] and Eisler [37] reported that the complexity of medical equipment will decrease the need for technicians. The second is the lack of fixed guidelines for the maintenance management system from the Ministry of Health, which makes the established maintenance management system dependent on CED staff experience [38][39].
The second questionnaire focused on the self-evaluation of the CED staff, presented in Tables 7 and 9. The findings revealed no significant difference for the maintenance management software programs of either international or local agencies. This might be attributed to the standards and mandatory requirements established by the MOH for purchasing any equipment or software [40]. All the above evaluations were confirmed through reliability and credibility tests, as shown in Table 9, and all measures of CED satisfaction remained the same irrespective of the job position of the participants.

Conclusion
Based on the current study on government hospitals in Riyadh, Saudi Arabia, the roles of CED staff could be more clearly established. The hospital size affects the technical and training skills of CED staff as well as the established maintenance management. Further, most end users should be trained in safety procedures when handling and managing medical equipment. Hospital size did not influence the MOH procedure, standards, and basic requirements for maintenance management software and contracts with international or local agencies in hospitals. However, some MOH recommendations were not implemented in all hospitals, such as the classification of medical equipment.
The recommendations based on this study for the Saudi Arabian Ministry of Health are as follows:
• Provide a maintenance management manual for medical equipment and keep it up to date.
• Provide specialized training for CED staff, such as updated maintenance procedures and medical equipment infection control.
• Provide central maintenance management software that can be used to monitor all maintenance procedures applied in all hospitals.

Acknowledgement
I would like to thank the Deanship of Scientific Research at Majmaah University for supporting this work.