This study used a qualitative research methodology within an empirical framework, as this allows close examination of practices in real environments [33]. This approach enabled us to engage deeply with current practices and challenges in web system reuse. We used the qualitative data analysis software ATLAS.ti for meticulous documentation and analysis of the collected data [34]. All methods were carried out in accordance with relevant guidelines and regulations. All experimental protocols were approved by the Faculty of Computers and Information at Mansoura University, Egypt, and informed consent was obtained from all subjects and their legal guardians.
We adopted interviews, focus groups, and observation as qualitative data collection methods to gather information about current practices. Subsequently, we applied grounded theory [35] as our qualitative data analysis methodology to understand the present state, identify major challenges, and determine their causes.
Our initial step involved reaching out to influential companies that play a significant role in our study. We targeted the top hundred software companies heavily involved in web development across various sectors, including organizations, institutions, universities, and government entities, in Egypt and Saudi Arabia. These companies were selected based on three primary criteria:
-
Market Influence: We assessed the market influence of these companies based on their market share and reputation within the industry, determined through publicly available market reports and rankings in industry publications, ensuring that our study focused on key players in the market.
-
Project Volume and Product Delivery: The number of projects and the types of products delivered by these companies played a crucial role in their selection. We analyzed historical data on their completed projects and ongoing engagements, which provided insights into their operational scale and the complexity of the systems they handle.
-
Technical Workforce Size: The number of technical staff employed by a company indicated its capacity to undertake substantial software development projects. Companies were segmented into small (fewer than 50 employees), medium (51 to 150 employees), and large (more than 150 employees) categories based on their technical staff. This categorization helped in understanding the different challenges faced by companies of varying sizes.
We communicated the aims and benefits of our study through personalized emails, explaining how participation could help companies identify and tackle the root causes of their challenges and thereby enhance their competitive edge in the market. The response rates reflected the interest in and relevance of our study: small companies showed a 90.4% response rate, medium-sized companies 92.5%, and large companies 50%, for an overall response rate of 84%, as shown in Table 1. These rates are consistent with software engineering research standards and comparable to those of similar studies conducted in the region [36] [37].
Table 1
The Number of Emailed Companies, Number of Responses, and Response Rate
Companies Size | Number of emailed companies | Number of response companies | Response Rate |
Small (< 50) | 42 | 38 | 90.4% |
Medium (51–150) | 40 | 37 | 92.5% |
Large (> 150) | 18 | 9 | 50% |
Total | 100 | 84 | 84% |
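The rates in Table 1 follow directly from the emailed/response counts. A minimal sketch reproduces them (the counts are taken directly from the table; the variable names are ours):

```python
# Response-rate calculation for Table 1.
# Each entry maps a cohort to (emailed, responded) counts from the table.
cohorts = {
    "Small (< 50)": (42, 38),
    "Medium (51-150)": (40, 37),
    "Large (> 150)": (18, 9),
}

total_emailed = sum(e for e, _ in cohorts.values())
total_responded = sum(r for _, r in cohorts.values())

for name, (emailed, responded) in cohorts.items():
    print(f"{name}: {100 * responded / emailed:.2f}%")

# Overall rate: total responses over total emailed companies.
print(f"Overall: {100 * total_responded / total_emailed:.2f}%")
```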
We conducted exploratory interviews with 354 technologists from the companies that agreed to collaborate with us. The interviews were conducted remotely via video conference using Microsoft Teams, Skype [38], Zoom [39], or Webex [40], and each interview lasted at most 30 minutes. The exploratory interviews focused on how far web system reuse methodologies and practices are applied and identified, at a high level, the main challenges of web system reuse in real environments. During the first round of interviews, we consolidated the definitions of various terms used during the study to set expectations and remove any terminology conflicts among participants arising from the different technical backgrounds and cultures of the stakeholders involved from the various companies. We therefore described and agreed on the most commonly used terminology during the study as follows:
We defined three levels of satisfaction with the current reuse practices: fully satisfied, partially satisfied, and not satisfied. Fully satisfied indicated that the stakeholder is satisfied with the current practices and does not seek to improve them. Partially satisfied indicated that the stakeholder is satisfied but has concerns and seeks improvements to the current practices. Not satisfied indicated that the stakeholder is not at all satisfied with the current practices and requests that they be improved.
We agreed with participants to define company size based on the number of technical employees (analysts, development managers, technical team leads, project managers, software developers, and quality control engineers). Companies with fewer than 50 technical employees were considered small-sized, those with between 51 and 150 employees medium-sized, and those with more than 150 employees large-sized.
We agreed with participants to define project size based on project duration, measured from the analysis phase through the testing of the whole project. Projects requiring less than five months were considered small-sized, those between five months and one year medium-sized, and those longer than one year large-sized.
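Taken together, the two size definitions amount to simple threshold rules, which can be sketched as follows (the function names are ours; the agreed company-size definition leaves exactly 50 employees unclassified, and we treat it as medium here purely for illustration):

```python
def company_size(technical_employees: int) -> str:
    """Classify a company by its number of technical employees.

    Agreed definition: < 50 small, 51-150 medium, > 150 large.
    Exactly 50 is unspecified in the definition; treated as medium here.
    """
    if technical_employees < 50:
        return "small"
    if technical_employees <= 150:
        return "medium"
    return "large"


def project_size(duration_months: float) -> str:
    """Classify a project by its duration from analysis through testing.

    Agreed definition: < 5 months small, 5 months to 1 year medium,
    > 1 year large.
    """
    if duration_months < 5:
        return "small"
    if duration_months <= 12:
        return "medium"
    return "large"
```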
Regardless of the development methodology (waterfall, Scrum, agile, or spiral) the participants followed inside their companies, we agreed with them on four main traditional development phases: analysis, design, implementation, and testing.
We defined reusable assets as any document, piece of code, software component, or part of a web system that can be reused in multiple projects under specific conditions.
According to the data collected during exploratory interviews, we developed a list of open-ended questions to conduct semi-structured interviews, such as:
-
Q1: Can you list the existing reusable components, and give a short description of them?
-
Q2: How do you integrate an existing component with a new web system?
-
Q3: What is the impact of using a reusable component on development time, cost, and productivity?
-
Q4: What types of software assets are reused? Do you reuse documents and test cases, or only code?
-
Q5: Is there a standard process you follow to build a reusable component? If yes, please describe it.
The purpose of each semi-structured interview depended on the respondent's professional role and objectives inside the company, for example:
-
CEOs and software development managers to discover the impact of web systems reuse on productivity, quality and delivery commitments.
-
Project Managers to explore the impact of web systems reuse on projects plan, quality, modification efforts, and maintenance cost.
-
Software development managers and technical team leaders to understand how they determined which component should be reused, and how to reuse it.
-
Software team leaders to identify the software reuse methodology which they were using, and examine the result of applying it in the development process.
-
Software developers to highlight the impact of applying software reuse on their productivity, software quality, modification and maintenance time.
-
Software quality control engineers to check the effect of applying software reuse on software quality.
Accordingly, we categorized the semi-structured interview questions based on the interviewee's role, since the interviews followed a conversational flow [36]. The average years of experience of the interviewees, categorized by role, are shown in Fig. 1.
Table 2 shows the number of exploratory and semi-structured interviews conducted with the various software industry stakeholders in Egypt and Saudi Arabia.
Table 2
Number of Conducted Interviews with various stakeholders.
Respondents' Role | Exploratory Interviews | Semi-Structured Interviews | Total Number of Interviews |
Chief executive officer (CEOs) | 33 | 28 | 61 |
Project Manager | 35 | 45 | 80 |
Development Manager | 40 | 50 | 90 |
System Analyst | 42 | 52 | 94 |
Software Team Leader | 45 | 63 | 108 |
Senior Software Engineer | 54 | 101 | 155 |
Web Designer | 23 | 35 | 58 |
UI Developer | 37 | 56 | 93 |
Quality control | 45 | 65 | 110 |
Total | 354 | 495 | 849 |
Based on the data collected during the interviews, we decided to conduct focus groups with different stakeholders in order to gain a deep understanding of current web system reuse practices from different perspectives, either within the same company or externally. We created a categorized contact list containing the stakeholders who contributed to the interviews; this list helped us organize and conduct the focus groups. We divided the stakeholders into two groups based on their roles:
-
The management group: CEOs, Development Managers (DM), Project Managers (PM), System Analysts (SA), and Team Leaders (TL).
-
The technical group: Development Managers (DM), System Analysts (SA), Team Leaders (TL), Software Developers (SD), Web Designers (WD), User Interface Developers (UID), and Quality Control engineers (QC).
We also divided the focus groups into two types: homogeneous, whose participants share the same role, and heterogeneous, whose participants hold different roles.
We also identified a list of steps to follow during the study in order to gain a deep understanding of the reasons behind the current challenges, as follows:
-
Defined which role reported the challenge.
-
Identified the causes of the challenge and linked them to practices.
-
For each practice, we specified the following points:
-
in which phase of the development process this practice occurred.
-
which role performed this practice.
-
Consolidated the reason for this practice during the interviews conducted with the stakeholders who performed it.
-
Conducted both homogeneous and heterogeneous focus groups with the stakeholders who performed and who reported these practices, to verify our findings during the study.
Table 3 shows the number of focus groups of each type conducted. We followed the steps below to prepare each focus group before it was held:
We defined the main objectives of the focus group.
We listed the main topics and, sometimes, open-ended questions.
We determined the required roles that must attend the focus group, and always kept the total number of participants between five and ten.
We determined the expected duration and, for focus groups held in one spot, the proposed location, or, for those held remotely, the proposed application (Skype or Zoom).
We agreed with the selected attendees on the expected time and the topics to be discussed in the meeting.
We sent an email to all attendees confirming the meeting time, location, and topics.
Table 3
Number of conducted various types of focus groups
Homogeneous/Heterogeneous | | Management Group | Technical Group | Total Number |
Homogeneous | | 23 | 46 | 69 |
Heterogeneous | In one spot | 17 | 50 | 67 |
| Remotely | 28 | 63 | 91 |
Total | | 68 | 159 | 227 |
Based on the analysis of the data collected during the focus groups and interviews, we recognized the importance of conducting participant observation to validate the analysis results and ensure that the challenges and practices collected during interviews and focus groups are consistent with the real environment. We agreed with seven medium-sized companies and thirteen small-sized companies to conduct participant observation, observing and contributing to the development process from the analysis phase (documenting requirements), through the development phase (designing web system layouts, editing them as HTML, and implementing system functionality in code), to final delivery of the web system to the customer. During the observation, we contributed to the full development process for 13 web systems and partially for 32 web systems. We participated in various types of web systems, such as electronic services for government sectors (E-Services), Enterprise Resource Planning (ERP), archiving web systems, Enterprise Project Management (EPM), and Content Management Systems (CMS) for small organizations. Table 4 shows the number and types of the observed web systems. The participant observation took approximately three months for some systems and extended to more than one year for others; this period was calculated from the start of the development phase, after the customer approved at least one requirements document, until delivery of the first release to the customer.
Table 4
Number of observed web systems and their types
Web System Type | Fully Observed | Partially Observed | Total Number |
E-Services for Government sectors | 2 | 6 | 8 |
ERP Web-based System | 1 | 3 | 4 |
Archiving web-based system | 1 | 4 | 5 |
Management System for Organization | 2 | 6 | 8 |
EPM | 1 | 4 | 5 |
CMS | 6 | 8 | 14 |
Online Shopping | 0 | 1 | 1 |
Total | 13 | 32 | 45 |