Software engineering teams are constantly seeking ways to improve their performance in producing software products while also increasing customer satisfaction. With the advent of agile methodologies in software development, software teams and companies have taken these objectives more seriously. By their very nature, agile methodologies can be integrated with other methodologies or specific managerial approaches defined in line with agility objectives. One such approach is Six Sigma, which organizations use to drive organizational change and process improvement. The present study presents a hybrid software development approach that combines Scrum, the most common agile methodology, with Six Sigma. This approach was applied in a case study and the results were analyzed. The evaluation showed that the hybrid method can increase both team performance and customer satisfaction. Alongside these two gains, however, increases in the number of re-works, the number of defects discovered, and the duration of project implementation were also observed. These outcomes follow directly from the main objectives of Scrum and Six Sigma and are therefore justifiable and acceptable.
From IT service management to IT service governance: An ontological approach ... (IJECEIAES)
Some companies have achieved better performance as a result of their IT investments, while others have not, so organizations are interested in quantifying the value added by their IT. A wide range of literature agrees that the best practices used by organizations promote continuous improvement in service delivery. Nevertheless, overuse of these practices can have undesirable effects and lead to unquantified investments. This paper proposes a practical tool formally developed according to the design science research (DSR) approach. It addresses a domain relevant to both practitioners and academics by providing an IT service governance (ITSG) domain model ontology, concerned with maximizing the clarity and veracity of the concepts within it. The results revealed that the proposed ontology resolved key barriers to ITSG process adoption in organizations, and that combining COBIT and ITIL practices would help organizations better manage their IT services and achieve better business-IT alignment.
An Integrated Framework for IT Infrastructure Management by Work Flow Mana... (idescitation)
Information Technology (IT) is one of the fastest-emerging fields in today's Internet-driven world. IT can be defined in various ways, but it is broadly considered to encompass the use of computers and telecommunications equipment to store, retrieve, transmit, and manipulate data. Every system rests on an infrastructure, and IT is no exception: its infrastructure must be managed and maintained properly. For an organization's Information Technology, Infrastructure Management (IM) is the management of essential operational components, such as policies, processes, equipment, data, human resources, and external contacts, for overall effectiveness.
In this paper, we propose a methodology for managing IT infrastructure more effectively. Our methodology uses a tree-structured architecture to manage the infrastructure with less manual effort. We discuss the management process step by step, together with the supporting algorithm, and we also describe how workflow management applies to IT infrastructure management.
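The paper's tree-structured architecture is not reproduced here; as a minimal sketch, infrastructure components might be organized as a tree and looked up by name (all component names and kinds below are hypothetical):

```python
class InfraNode:
    """A node in a tree of IT infrastructure components (illustrative model)."""
    def __init__(self, name, kind):
        self.name = name
        self.kind = kind          # e.g. "policy", "process", "equipment"
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def find(self, name):
        """Depth-first search for a component by name; None if absent."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit is not None:
                return hit
        return None

# Build a small infrastructure tree and look up a component.
root = InfraNode("datacenter", "site")
rack = root.add(InfraNode("rack-01", "equipment"))
rack.add(InfraNode("server-01", "equipment"))
print(root.find("server-01").kind)
```

A tree like this lets one operator manage a subtree (a rack, a site) as a unit, which is one plausible reading of the paper's "less manual power" claim.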
A data warehouse generally contains both historical and current data drawn from various data sources. In computing, a data warehouse can be defined as a system created for the analysis and reporting of both types of data. The resulting analysis reports are then used by an organization to make decisions that support its growth. Constructing a data warehouse appears simple: collect data from the sources into one place (after extraction, transformation, and loading). In practice, however, construction involves several issues, such as inconsistent data, logic conflicts, user acceptance, cost, quality, security, stakeholder contradictions, and REST alignment. These issues must be overcome, or they will lead to unfortunate consequences that affect the organization's growth. The proposed model addresses issues such as REST alignment and stakeholder contradictions by involving experts from various domains (technical, analytical, decision makers, management representatives) during the initialization phase to better understand the requirements, and by mapping these requirements to data sources during the design phase of the data warehouse.
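The extract-transform-load step mentioned in the abstract can be illustrated with a minimal sketch; the source systems, field names, and schema mapping below are hypothetical, not taken from the paper:

```python
# Minimal ETL sketch: pull rows from two hypothetical source systems,
# normalize their differing schemas, and load into one warehouse table.
sources = {
    "crm":   [{"CustName": "Acme", "Rev": "1000"}],
    "sales": [{"customer": "Beta Ltd", "revenue": 2500}],
}

def transform(system, row):
    """Map each source's schema onto the common warehouse schema."""
    if system == "crm":
        return {"customer": row["CustName"], "revenue": float(row["Rev"])}
    return {"customer": row["customer"], "revenue": float(row["revenue"])}

warehouse = []                                    # stand-in for the warehouse table
for system, rows in sources.items():             # extract
    for row in rows:
        warehouse.append(transform(system, row))  # transform + load

print(warehouse)
```

Even this toy version shows where the abstract's "inconsistent data" issue arises: each source spells the same fact differently, and the transform step is where those conflicts must be reconciled.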
CHALLENGES FOR MANAGING COMPLEX APPLICATION PORTFOLIOS: A CASE STUDY OF SOUTH... (IJMIT JOURNAL)
This research explores the management challenges and root causes of complex application portfolios in the public sector. It examines Australian public sector organisations through the case of South Australia Police (SAPOL), one of the most significant and mission-critical state government agencies. The exploratory research surfaces key challenges using interviews as the primary data collection source, along with archive records, documentation, and direct observation as secondary sources. This paper reports on the analysed information, surfacing eight key issues. It highlights that the organic growth of the technology portfolios, combined with mission criticality, has resulted in many quick fixes that are not aligned with long-term enterprise architectural stability. The integration of mismatched technologies, together with constant business pressure to keep the lights on, leaves no opportunity for the portfolios to be rationalised in an ongoing way. Other issues and areas for further study are explored at the end.
A Glimpse into Software Defined Data Center (Fung Ping)
Note: This article is not yet published; it is provided for preview purposes. Interested publishers, please contact hpfung1@gmail.com or hanping.fung@aeu.edu.my
Abstract – Existing data centers are not ready to support IT organizations in meeting ever-changing business demands. Hence, the next generation of data center, the Software Defined Data Center (SDDC), is being explored and is expected to come to the rescue. However, SDDC is relatively new since its inception in 2012, and there are differing early interpretations of its definition, criteria, reference architecture, and the values it brings. There is also limited literature on how an SDDC works. The objective of this study is to shed some light on an operational definition of SDDC, its criteria and reference architecture, a depiction of how an SDDC works in three scenarios, and a standardized view of the values it brings. Moreover, some factors to guide IT organizations in adopting SDDC are also discussed. The study takes a qualitative approach in which the SDDC literature is reviewed and several SDDC IT professionals are interviewed. Lastly, the study's limitations, future research directions, and conclusion are provided.
Agent-SSSN: a strategic scanning system network based on multiagent intellige... (IJERA Editor)
The document describes an Agent-SSSN system that uses a multi-agent approach and ontology to develop a strategic scanning system for business intelligence. The system aims to integrate expert knowledge through cooperative information gathering from the web. It uses various agent roles like information retrieval agents, mediator agents, and notification agents. Ontologies are used to represent shared domain concepts and expert knowledge to enable knowledge sharing between agents. The system is modeled using the O-MaSE methodology, with goals, roles, and capabilities defined for each agent.
CRESUS-T: A COLLABORATIVE REQUIREMENTS ELICITATION SUPPORT TOOL (ijseajournal)
Communicating an organisation's requirements in a semantically consistent and understandable manner
and then reflecting the potential impact of those requirements on the IT infrastructure presents a major
challenge among stakeholders. Initial research findings indicate a desire among business executives for a
tool that allows them to communicate organisational changes using natural language and a model of the IT
infrastructure that supports those changes. Building on a detailed analysis and evaluation of these findings,
the innovative CRESUS-T support tool was designed and implemented. The purpose of this research was to
investigate to what extent CRESUS-T both aids communication in the development of a shared
understanding and supports collaborative requirements elicitation to bring about organisational, and
associated IT infrastructural, change. In order to determine the extent shared understanding was fostered,
the support tool was evaluated in a case study of a business process for the roll out of the IT software
image at a third level educational institution. Statistical analysis showed that the CRESUS-T support tool
fostered shared understanding in the case study, through increased communication. Shared understanding
is also manifested in the creation of two knowledge representation artefacts namely, a requirements model
and the IT infrastructure model. The CRESUS-T support tool will be useful to requirements engineers and business analysts who have to gather requirements asynchronously.
MEASURING TECHNOLOGICAL, ORGANIZATIONAL AND ENVIRONMENTAL FACTORS INFLUENCING... (csandit)
This document summarizes a research study that examined factors influencing the adoption intentions of public cloud computing among private sector firms. The study used a proposed integrated model to analyze technological, organizational, and environmental factors. A survey of IT decision makers at 40 firms across various industries received 122 responses. The results found that compatibility, cost savings, trialability, and external support were the most influential factors in adoption intentions. The study provides recommendations to help increase cloud adoption rates among firms and improve cloud services.
EMPLOYEES CHARACTERISTICS IN KNOWLEDGE TRANSFER AND PERFORMANCE (csandit)
While most studies focus on industry, non-profit organizations have received little attention. Various studies have highlighted knowledge transfer (KT) as a creator of value; however, obstacles among employees still exist. The core problem remains that employees are reluctant to share their knowledge. This study investigates the factors that influence KT among employees of non-profit organizations in Indonesia. A survey was distributed to 364 respondents; 325 questionnaires were returned and 39 were not. A Likert scale and SmartPLS were used to confirm the constructs. The paper concludes that helping others, trust, soft rewards, and employees' personality and motivation are the factors influencing KT behaviour. Finally, the findings are discussed.
AN OVERVIEW OF EXISTING FRAMEWORKS FOR INTEGRATING FRAGMENTED INFORMATION SYS... (ijistjournal)
The literature shows that several structured integration frameworks have emerged with the aim of facilitating application integration, but the weaknesses and strengths of these frameworks are not known. This paper reviews these frameworks with a focus on identifying their weaknesses and strengths. To accomplish this, recommended comparison factors were identified and used to compare the frameworks. The findings show that most of these structured frameworks are custom-built around their original motives: they focus on integrating applications from different sectors within an organization in order to eliminate communication inefficiencies. No framework guides application integrators on the goals of integration, its expected outcomes and outputs, or the skills required for the types of applications to be integrated. The study recommends further work on integration frameworks, especially on designing an unstructured framework that supports and guides application integrators while taking the consumer's surrounding environment into consideration.
Strategic alignment is a concept considered extremely important in understanding how organizations can translate their deployment of information technology (IT) into substantial gains in performance. To attain the benefits of alignment, the Information Technology Infrastructure Library (ITIL) provides a framework of best-practice approaches for IT service management worldwide, while Control Objectives for Information and Related Technology (COBIT) is an IT governance framework and supporting toolset that lets managers bridge the gap between control requirements, technical issues, and business risks. The purpose of this paper is to show how COBIT can complement ITIL to attain business-IT alignment.
An Approach to Automate the Relational Database Design Process (ijdms)
Information and Communication Technology (ICT) improves business competitiveness in large-scale enterprises as well as in small and medium-scale enterprises. A lack of technical knowledge of ICT, together with its cost, has been identified as a challenge for small and medium enterprises adopting ICT for their businesses. They can overcome this problem by using freely available tools and systems that generate information systems automatically. However, such tools require the database structure; it is therefore desirable to have a tool that automates the relational database design process. In the proposed approach, business forms were chosen as the database requirement input source from among: learning from examples, natural language, structured input/output definition, schema definition, and forms. The approach applies a functional dependency algorithm to the un-normalized data captured through the business form, and then applies a normalization algorithm to the discovered functional dependencies to obtain the normalized database structure. User intervention is needed to supply the domain knowledge for this approach. Finally, it produces the normalized database with all keys and relationships; the accuracy of the outcome depends entirely on the data supplied by end users.
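The paper's functional dependency algorithm is not reproduced here; as a naive sketch, candidate single-column dependencies can be checked against sample form data like this (the rows and column names are hypothetical):

```python
from itertools import permutations

# Un-normalized rows as they might be captured from a business form.
rows = [
    {"order_id": 1, "customer": "Acme", "city": "Oslo"},
    {"order_id": 2, "customer": "Acme", "city": "Oslo"},
    {"order_id": 3, "customer": "Beta", "city": "Pune"},
]

def holds(rows, lhs, rhs):
    """True if the dependency lhs -> rhs holds in the sample:
    equal lhs values never map to different rhs values."""
    seen = {}
    for r in rows:
        if r[lhs] in seen and seen[r[lhs]] != r[rhs]:
            return False
        seen[r[lhs]] = r[rhs]
    return True

columns = rows[0].keys()
fds = [(a, b) for a, b in permutations(columns, 2) if holds(rows, a, b)]
print(fds)
```

A caveat worth noting: sample data can only refute a dependency, never prove it, which is one reason the paper requires user intervention to supply domain knowledge before normalization.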
This document proposes developing a competence-based model for securing the internet of things (IoT) in organizations. It aims to define the required set of organizational competences for IoT products and services regarding information security. The research will use a design science methodology to develop and empirically test a model of organizational competence for IoT security based on an existing model. This will help managers assess their competences against future requirements and properly align their IoT approaches regarding customer information security.
The document proposes a domain-driven data mining methodology to process tickets efficiently in an IT organization. The methodology involves classifying tickets by category, identifying categories with high issue rates, applying root cause analysis (RCA) to determine the root cause of issues, and applying continuous improvement (CI) to identify and implement solutions. An experiment applying the methodology in the banking sector showed that it improved processing rates and reduced tickets with issues compared to processing tickets independently, without categorization or RCA/CI. The methodology aims to solve ticket issues efficiently, increase customer satisfaction and requests, and improve processing without waiting on service level agreements.
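The first two steps of that methodology, classifying tickets by category and flagging high-issue categories for RCA, can be sketched as follows (the ticket data and the 0.5 threshold are hypothetical, not from the paper):

```python
from collections import defaultdict

# Hypothetical ticket stream: (category, had_issue)
tickets = [
    ("login", True), ("login", True), ("login", False),
    ("billing", False), ("billing", False),
    ("network", True), ("network", False), ("network", False), ("network", False),
]

# Step 1: classify tickets by category.
by_cat = defaultdict(list)
for cat, had_issue in tickets:
    by_cat[cat].append(had_issue)

# Step 2: flag categories whose issue rate exceeds a threshold;
# those categories would then go through RCA and CI.
THRESHOLD = 0.5
rca_candidates = {
    cat: sum(flags) / len(flags)
    for cat, flags in by_cat.items()
    if sum(flags) / len(flags) > THRESHOLD
}
print(rca_candidates)
```

Grouping first means RCA effort is spent once per problematic category rather than once per ticket, which matches the abstract's claim of improved processing rates.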
Full Paper: Analytics: Key to go from generating big data to deriving busines... (Piyush Malik)
This document discusses how analytics can help organizations derive business value from big data. It describes how statistical analysis, machine learning, optimization and text mining can extract meaningful insights from social media, online commerce, telecommunications, smart utility meters, and improve security. While tools exist to analyze big data, challenges remain around data security, privacy, and developing skilled talent. The paper aims to illustrate how existing algorithms can generate value from different industry use cases.
The effect of technology-organization-environment on adoption decision of bi... (IJECEIAES)
Big data technology (BDT) is being actively adopted by world-leading organizations due to its expected benefits. However, most organizations in Thailand are still at the decision or planning stage of adopting BDT, and many challenges exist in encouraging BDT diffusion in businesses. This study therefore develops a research model that investigates the determinants of BDT adoption in the Thai context, based on the technology-organization-environment (TOE) framework and diffusion of innovation (DOI) theory. Data were collected through an online questionnaire from a sample of three hundred IT employees in different organizations in Thailand. Structural equation modeling (SEM) was conducted to test the hypotheses. The results indicated that the research model fitted the empirical data well: Normed Chi-Square=1.651, GFI=0.895, AGFI=0.863, NFI=0.930, TLI=0.964, CFI=0.971, SRMR=0.0392, and RMSEA=0.046. The model explained 52% of the variance in the decision to adopt BDT. Relative advantage, top management support, competitive pressure, and trading partner pressure showed significant positive relationships with BDT adoption, while security negatively influenced BDT adoption.
AN ITERATIVE HYBRID AGILE METHODOLOGY FOR DEVELOPING ARCHIVING SYSTEMS (ijseajournal)
With the massive growth of organizations' files, an archiving system has become a must. A great deal of time is consumed in collecting requirements from the organization to build an archiving system, and sometimes the resulting system does not meet the organization's needs. This paper proposes a domain-based requirements engineering system that efficiently and effectively develops different archiving systems using a newly suggested technique that merges the two most widely used agile methodologies: extreme programming (XP) and Scrum. The technique was tested on a real case study. The results show that the time and effort consumed in analyzing and designing the archiving systems decreased significantly. The proposed methodology also reduces the system errors that may occur in the early stages of system development.
- GCC countries are working to develop an interoperable federated e-identity system to facilitate citizens' mobility across GCC borders and enhance economic cooperation.
- Each GCC country has established a national digital identity program based on smart ID cards with biometric identifiers and credentials stored securely.
- The initiative aims to create a trusted cross-border infrastructure where citizens can authenticate and validate their identities across GCC countries using their national e-ID.
- This would allow citizens to securely access online services in other GCC countries using their home country-issued e-ID credentials.
An effective pre-processing algorithm for information retrieval systems (ijdms)
The Internet is probably the most successful distributed computing system ever. However, our capabilities for querying and manipulating data on the Internet are primitive at best. User expectations have risen over time, along with the amount of operational data accumulated over the past few decades: users expect deeper, more exact, and more detailed results. Retrieval results for a user query are always relative to the pattern of data storage and indexing. In information retrieval systems, tokenization is an integral part whose prime objective is to identify the tokens and their counts. In this paper, we propose an effective tokenization approach based on a training vector, and the results show the efficiency and effectiveness of the proposed algorithm. Tokenizing documents helps satisfy the user's information need more precisely and sharply reduces the search space. Pre-processing of the input document is an integral part of tokenization: the document is pre-processed and its tokens are generated, and on the basis of these tokens the probabilistic IR model produces its scoring over the reduced search space. The comparative analysis is based on two parameters: the number of tokens generated and the pre-processing time.
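A basic version of the pre-processing the abstract describes, tokenizing a document and counting tokens, could be sketched as below; the stop-word list is an illustrative placeholder, and the paper's training-vector step is not reproduced:

```python
import re
from collections import Counter

STOP_WORDS = {"the", "is", "a", "of", "and"}  # illustrative list only

def tokenize(document):
    """Lowercase the document, split on non-alphanumeric characters,
    drop stop words, and return each remaining token with its count."""
    tokens = re.findall(r"[a-z0-9]+", document.lower())
    return Counter(t for t in tokens if t not in STOP_WORDS)

doc = ("Tokenization is the first step of the retrieval pipeline; "
       "tokenization reduces the search space.")
print(tokenize(doc))
```

The token counts this produces are exactly what a probabilistic IR scorer consumes, and dropping stop words is one simple way pre-processing "reduces the search space" as the abstract claims.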
Selecting Experts Using Data Quality Concepts (ijdms)
Personal networks are not always diverse or large enough to reach those with the right information. This
problem increases when assembling a group of experts from around the world, something which is a
challenge in Future-oriented Technology Analysis (FTA). In this work, we address the formation of a panel
of experts, specifically how to select a group of experts from a huge group of people. We propose an
approach which uses data quality dimensions to improve expert selection quality and provide quality
metrics to the forecaster. We performed a case study and successfully showed that it is possible to use data
quality methods to support the expert search process.
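The abstract does not detail which quality dimensions it uses; two common ones, completeness and recency, might be scored per candidate expert as sketched below (the profile fields, decay horizon, and weights are all hypothetical):

```python
REQUIRED_FIELDS = ["name", "affiliation", "topics", "last_publication_year"]

def completeness(profile):
    """Fraction of required profile fields that are filled in."""
    return sum(1 for f in REQUIRED_FIELDS if profile.get(f)) / len(REQUIRED_FIELDS)

def recency(profile, current_year=2024, horizon=10):
    """1.0 for a publication this year, decaying linearly to 0 over `horizon` years."""
    year = profile.get("last_publication_year")
    if year is None:
        return 0.0
    return max(0.0, 1 - (current_year - year) / horizon)

def quality_score(profile, w_complete=0.5, w_recent=0.5):
    """Weighted combination of the two dimensions, reported to the forecaster."""
    return w_complete * completeness(profile) + w_recent * recency(profile)

expert = {"name": "A. Smith", "affiliation": "Uni X",
          "topics": ["foresight"], "last_publication_year": 2022}
print(round(quality_score(expert), 2))
```

Exposing the per-dimension scores, rather than only the combined one, is what lets the forecaster see why a candidate ranked where they did.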
Small and medium enterprise business solutions using data visualization (journalBEEI)
Small and medium enterprise (SME) companies optimize performance using different automated systems to highlight operational concerns. However, a lack of efficient visualization in reporting results in slow feedback, difficulty in extracting root causes, and minimal corrective action. To complicate matters, data heterogeneity has increased sharply, and data is produced so quickly that it becomes unmanageable under traditional methods of analytics. Hence, we propose a dashboard that summarizes operational events using real-time data, based on a data visualization approach. The proposed solution summarizes the raw data, allowing the user to make informed decisions that can positively impact business performance. As a proof of concept, an interactive intelligent dashboard for SMEs (iid-SME) was developed to tackle issues such as measuring completed cases, the time needed to solve a case, and individual performance in handling cases and other tasks. The results show that the iid-SME approach simplifies the conveyance of the message and helps SME personnel make decisions. Given the positive feedback obtained, it is envisaged that such a solution can be further employed to improve SME profit and decision making.
Applying Classification Technique using DID3 Algorithm to improve Decision Su... (IJMER)
The International Journal of Modern Engineering Research (IJMER) is a peer-reviewed online journal. It serves as an international archival forum for scholarly research related to engineering and science education.
Improvement of IT Governance Case Study Government Institution Region X (ijtsrd)
The use of information technology in government processes increases the efficiency, effectiveness, transparency, and accountability of government administration. Utilizing IT within an organization requires a system to manage IT well, together with an audit of information technology to verify that it runs as expected. An IT audit is an important activity for any organization, including Government Institution Region X, which uses information technology to support its public services. The IT audit was carried out to address the critical points and recurring problems in the institution's processes. Using the COBIT 5 framework, the study shows that the capability levels of the five selected IT processes are low: APO07 at level 2, and EDM04, DSS01, BAI01, and APO08 at level 1, whereas the organization's leadership expects level 4. The audit results thus reveal a gap between the current capability level and the level expected by the organization's leadership. Based on these findings, the study provides suggestions and improvement recommendations according to COBIT 5 and ITIL 2011. Agus Ade Muliyana Krisna | Gusti Made Arya Sasmita | Gusti Agung Ayu Putri, "Improvement of IT Governance (Case Study: Government Institution Region X)", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-4, Issue-6, October 2020. URL: https://www.ijtsrd.com/papers/ijtsrd33496.pdf Paper URL: https://www.ijtsrd.com/engineering/information-technology/33496/improvement-of-it-governance-case-study-government-institution-region-x/agus-ade-muliyana-krisna
Analysis and Design of Information Systems Financial Reports with Object Orie... (ijceronline)
Micro, small, and medium enterprises (SMEs) are a business sector that has proved resilient to a wide range of economic crisis shocks. In practice, however, their financial management is often not transparent, and business finances are still mixed with personal finances, so good financial management is needed. This research analyzes and designs a financial reporting information system as a basis for system development. The software development life cycle (SDLC) follows an object-oriented model, with Unified Modelling Language (UML) notation as the tool. In the object-oriented approach, all system applications are viewed as a collection of interlocking objects, which allows organizations and end users to easily understand the logical entities. The object-oriented approach also provides the benefit of code reuse and saves time in developing a quality product.
Employee Engagement within the IT Industry Momo Scott
This document is a 2288 word consulting report that addresses employee engagement within a multinational IT business. The report aims to provide recommendations on whether the client should participate in an industry-wide employee engagement survey and what engagement means within their context. Through a literature review, the report finds that participation in the survey could help the client better understand their workforce and contribute to research in the field. Key factors that influence engagement are identified, and it is recommended that the client participate conditionally to address issues like absenteeism and turnover.
Service-oriented computing has created new requirements for information systems development processes and methods. Adopting service-oriented development requires service identification methods that match the challenges enterprises face. A wide variety of service identification methods (SIMs) have been proposed, but less attention has been paid to the actual requirements placed on these methods. This paper provides an ethnographic look at the challenges of service identification based on data from 14 service identification sessions, giving insight into the practice of service identification. The findings identify two types of service identification session, and the results can be used to select the appropriate SIM based on the session type.
This document discusses enterprise content management (ECM) solutions, including their typical architecture and key challenges in implementation. It describes the four main components of an ECM architecture: (1) the user interface, (2) information governance, (3) attributes like data archiving and workflow, and (4) the repository for secure storage. The document also outlines stages in an ECM implementation roadmap strategy, highlighting the need to specify information governance over the lifecycle and establish interoperability between systems.
This document summarizes the results of an online survey about factors influencing the adoption of agile practices. The survey received 103 responses. Based on the results, the authors hypothesized that: 1) Extreme Programming is the most popular adopted agile methodology. 2) Agile methodologies are chosen more by small teams. 3) Direct interaction with stakeholders is the most effective way to gather requirements. 4) User stories and diagrams are commonly used to capture requirements in agile. 5) Agile gives higher access to stakeholders to resolve problems quickly. 6) Adopting agile increases customer satisfaction and productivity. The survey results provide preliminary support for these hypotheses.
A study of critical success factors for adaption of agile methodology (IAEME Publication)
This document discusses critical success factors for adopting agile methodology. It conducted a survey of 200 IT professionals, though only 40 responded. The survey found that management involvement, team member readiness, and organization size significantly impact the adoption of agile software methodology. The document also reviews various agile methods like extreme programming, scrum, crystal, and feature driven development, and compares them to traditional waterfall methods. It notes that advantages of agile include short cycles, test-first development, and empowered teams, while the waterfall model struggles with changing requirements.
Guidelines to minimize the cost of software quality in agile scrum process (ijseajournal)
This document discusses guidelines for minimizing the cost of software quality in an agile scrum process based on a case study. The case study analyzed a retail domain project that failed to properly adopt the agile scrum process. Key findings include teams being overloaded with sprint backlog items, inability to deliver quality due to tight timelines, and high costs associated with defects discovered late in the process or after production. The document identifies gaps in how the scrum process was implemented and provides guidelines to address them, such as involving all team members in pre-envisioning meetings to improve understanding before sprint planning.
EMPLOYEES CHARACTERISTICS IN KNOWLEDGE TRANSFER AND PERFORMANCE (csandit)
While most studies focus on industry, non-profit organizations have not received much attention. Various studies have highlighted that knowledge transfer (KT) creates value; however, obstacles among employees still exist. The main problem remains that employees are reluctant to share their knowledge. This study investigated the factors that influence KT among employees of non-profit organizations in Indonesia. A survey was distributed to 364 respondents; 325 responses were returned and 39 were not. A Likert scale and SmartPLS were used to confirm the constructs. The paper concludes that helping others, trust, soft rewards, and employee personality and motivation are the factors influencing KT behaviour. Finally, the findings are discussed.
AN OVERVIEW OF EXISTING FRAMEWORKS FOR INTEGRATING FRAGMENTED INFORMATION SYS... (ijistjournal)
The literature shows that several structured integration frameworks have emerged with the aim of facilitating application integration, but the weaknesses and strengths of these frameworks are not known. This paper reviews these frameworks with a focus on identifying their weaknesses and strengths. To accomplish this, recommended comparison factors were identified and used to compare the frameworks. The findings show that most of these structured frameworks are custom-built around their own motives: they focus on integrating applications from different sectors within an organization in order to eliminate communication inefficiencies. No framework guides application integrators on the goals of integration, the outcomes and outputs of integration, or the skills required for the types of applications expected to be integrated. The study recommends further work on integration frameworks, especially on designing an unstructured framework that supports and guides application integrators while considering the consumer's surrounding environment.
Strategic alignment is a conviction considered extremely important in understanding how organizations can turn their information technology (IT) arrangements into substantial gains in achievement. To attain the advantages of alignment, the Information Technology Infrastructure Library (ITIL) provides a framework of best-practice approaches for IT service management worldwide, while Control Objectives for Information and Related Technology (COBIT) is an IT governance framework and supporting toolset that permits managers to bridge the gap between control requirements, technical matters, and business risks. The purpose of this paper is to examine how COBIT can complement ITIL to attain business-IT alignment.
An Approach to Automate the Relational Database Design Process ijdms
Information and communication technology (ICT) improves business competitiveness in both large-scale enterprises and small and medium-scale enterprises. Lack of technical knowledge of ICT, together with its cost, has been identified as a challenge for small and medium enterprises in adopting ICT for their businesses. They can overcome this problem by using freely available tools and systems that generate information systems automatically. However, these tools require the database structure; it is therefore desirable to have a tool that automates the relational database design process. Among the possible database requirement input sources (learning from examples, natural language, structured input/output definition, schema definition, and forms), the proposed approach uses business forms. The approach applies a functional dependency algorithm to the un-normalized data fed through the business form, and then applies a normalization algorithm to the discovered functional dependencies to obtain the normalized database structure. User intervention is needed to supply the domain knowledge for this approach. Finally, it develops the normalized database with all keys and relationships; the accuracy of the outcome depends entirely on the data entered by end users.
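The normalization step described above, grouping attributes under the determinants of discovered functional dependencies, can be illustrated with a much-simplified 3NF-style synthesis. The example form fields below are hypothetical, and the paper's full algorithm also covers FD discovery and key generation:

```python
def synthesize(attributes, fds):
    """Very simplified 3NF-style synthesis: one relation per determinant,
    grouping all attributes it determines. Assumes the FDs already form a
    minimal cover (the full approach also derives the FDs from form data)."""
    relations = {}
    for lhs, rhs in fds:
        key = tuple(sorted(lhs))
        relations.setdefault(key, set(lhs)).update(rhs)
    # Any attribute not covered by an FD is reported separately.
    covered = set().union(*relations.values()) if relations else set()
    leftover = set(attributes) - covered
    return [sorted(r) for r in relations.values()], sorted(leftover)

# Hypothetical un-normalized order form:
#   order_id -> customer_id, order_date;  customer_id -> customer_name
fds = [({"order_id"}, {"customer_id", "order_date"}),
       ({"customer_id"}, {"customer_name"})]
tables, rest = synthesize(
    {"order_id", "customer_id", "order_date", "customer_name"}, fds)
print(tables, rest)
```

The two resulting relations (orders keyed by order_id, customers keyed by customer_id) remove the transitive dependency present in the flat form.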
This document proposes developing a competence-based model for securing the internet of things (IoT) in organizations. It aims to define the required set of organizational competences for IoT products and services regarding information security. The research will use a design science methodology to develop and empirically test a model of organizational competence for IoT security based on an existing model. This will help managers assess their competences against future requirements and properly align their IoT approaches regarding customer information security.
The document proposes a domain-driven data mining methodology to efficiently process tickets in an IT organization. The methodology involves classifying tickets by category, identifying tickets with high issue rates, applying root cause analysis (RCA) to determine the root cause of issues, and applying continuous improvement (CI) to identify and implement solutions. An experiment applying the methodology to a banking sector showed it improved processing rates and reduced tickets with issues compared to processing tickets independently without categorization or RCA/CI. The methodology aims to efficiently solve ticket issues, increase customer satisfaction and requests, and improve processing without waiting for service level agreements.
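The first two steps of the methodology, classifying tickets by category and flagging high issue-rate categories for root cause analysis, might be sketched as follows (the ticket data and threshold are illustrative assumptions, not from the paper):

```python
from collections import Counter

# Hypothetical ticket log: (category, had_issue) pairs.
tickets = [("login", True), ("login", True), ("login", False),
           ("payment", False), ("payment", True), ("report", False)]

def high_issue_categories(tickets, threshold=0.5):
    """Classify tickets by category, then flag categories whose issue
    rate exceeds the threshold as candidates for RCA and improvement."""
    totals, issues = Counter(), Counter()
    for category, had_issue in tickets:
        totals[category] += 1
        if had_issue:
            issues[category] += 1
    return {c: issues[c] / totals[c]
            for c in totals if issues[c] / totals[c] > threshold}

print(high_issue_categories(tickets))
```

Categories flagged here would then feed the RCA and continuous-improvement steps of the methodology.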
Full Paper: Analytics: Key to go from generating big data to deriving busines... (Piyush Malik)
This document discusses how analytics can help organizations derive business value from big data. It describes how statistical analysis, machine learning, optimization and text mining can extract meaningful insights from social media, online commerce, telecommunications, smart utility meters, and improve security. While tools exist to analyze big data, challenges remain around data security, privacy, and developing skilled talent. The paper aims to illustrate how existing algorithms can generate value from different industry use cases.
The effect of technology-organization-environment on adoption decision of bi... (IJECEIAES)
Big data technology (BDT) is being actively adopted by world-leading organizations due to its expected benefits. However, most organizations in Thailand are still at the decision or planning stage of BDT adoption, and many challenges exist in encouraging BDT diffusion in businesses. Thus, this study develops a research model that investigates the determinants of BDT adoption in the Thai context based on the technology-organization-environment (TOE) framework and diffusion of innovation (DOI) theory. Data were collected through an online questionnaire, with three hundred IT employees in different organizations in Thailand as the sample group. Structural equation modeling (SEM) was conducted to test the hypotheses. The results indicated that the research model fit the empirical data, with the statistics: Normed Chi-Square=1.651, GFI=0.895, AGFI=0.863, NFI=0.930, TLI=0.964, CFI=0.971, SRMR=0.0392, and RMSEA=0.046. The research model explained 52% of the decision to adopt BDT. Relative advantage, top management support, competitive pressure, and trading partner pressure show a significant positive relation with BDT adoption, while security negatively influences BDT adoption.
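The reported fit indices follow from the model chi-square and degrees of freedom via standard formulas. The chi-square and df values below are illustrative, chosen only to show the arithmetic, since the abstract reports the indices rather than the raw values:

```python
import math

def normed_chi_square(chi2, df):
    """Normed chi-square: model chi-square divided by degrees of freedom."""
    return chi2 / df

def rmsea(chi2, df, n):
    """Standard RMSEA formula: sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Illustrative values only; n = 300 matches the reported sample size.
chi2, df, n = 165.1, 100, 300
print(normed_chi_square(chi2, df))
print(round(rmsea(chi2, df, n), 3))
```

A normed chi-square below about 3 and an RMSEA below about 0.06 are commonly cited thresholds for acceptable model fit, which the reported values satisfy.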
AN ITERATIVE HYBRID AGILE METHODOLOGY FOR DEVELOPING ARCHIVING SYSTEMS (ijseajournal)
With the massive growth of organizational files, an archiving system has become a necessity. A great deal of time is consumed in collecting requirements from the organization to build an archiving system, and sometimes the resulting system does not meet the organization's needs. This paper proposes a domain-based requirement engineering system that efficiently and effectively develops different archiving systems, based on a newly suggested technique that merges the two most widely used agile methodologies: extreme programming (XP) and Scrum. The technique is tested on a real case study. The results show that the time and effort consumed in analyzing and designing the archiving systems decreased significantly. The proposed methodology also reduces the system errors that may occur in the early stages of system development.
- GCC countries are working to develop an interoperable federated e-identity system to facilitate citizens' mobility across GCC borders and enhance economic cooperation.
- Each GCC country has established a national digital identity program based on smart ID cards with biometric identifiers and credentials stored securely.
- The initiative aims to create a trusted cross-border infrastructure where citizens can authenticate and validate their identities across GCC countries using their national e-ID.
- This would allow citizens to securely access online services in other GCC countries using their home country-issued e-ID credentials.
An effective pre-processing algorithm for information retrieval systems (ijdms)
The Internet is probably the most successful distributed computing system ever. However, our capabilities for querying and manipulating data on the Internet remain rudimentary at best. User expectations have risen over time, along with the growth in operational data over the past few decades: users expect deeper, more exact, and more detailed results. Result retrieval for a user query is always relative to the pattern of data storage and indexing. In information retrieval systems, tokenization is an integral part whose prime objective is to identify the tokens and their counts. In this paper, we propose an effective tokenization approach based on a training vector, and the results show the efficiency and effectiveness of the proposed algorithm. Tokenization of documents helps satisfy the user's information need more precisely and sharply reduces the search space, and is considered a core part of information retrieval. Pre-processing of the input document is an integral part of tokenization: it processes the documents and generates their respective tokens, on the basis of which probabilistic IR generates its scoring and yields a reduced search space. The comparative analysis is based on two parameters: the number of tokens generated and the pre-processing time.
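A minimal version of the pre-processing pipeline described above (lowercasing, splitting into tokens, removing stopwords, and counting token frequencies for the index) could look like this. The stopword list and splitting rule are illustrative assumptions, not the authors' training-vector approach:

```python
import re
from collections import Counter

# Illustrative stopword list; real systems use much larger curated lists.
STOPWORDS = {"the", "a", "of", "is", "and", "to", "in"}

def tokenize(document):
    """Basic pre-processing: lowercase, split on non-letter runs,
    drop stopwords, then count token frequencies."""
    tokens = [t for t in re.split(r"[^a-z]+", document.lower()) if t]
    tokens = [t for t in tokens if t not in STOPWORDS]
    return Counter(tokens)

counts = tokenize("The retrieval of information is the goal of information retrieval.")
print(counts)
```

The resulting token counts are exactly what a probabilistic IR scorer consumes, and dropping stopwords is one way the search space is reduced.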
Selecting Experts Using Data Quality Concepts (ijdms)
Personal networks are not always diverse or large enough to reach those with the right information. This
problem increases when assembling a group of experts from around the world, something which is a
challenge in Future-oriented Technology Analysis (FTA). In this work, we address the formation of a panel
of experts, specifically how to select a group of experts from a huge group of people. We propose an
approach which uses data quality dimensions to improve expert selection quality and provide quality
metrics to the forecaster. We performed a case study and successfully showed that it is possible to use data
quality methods to support the expert search process.
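One way to read "using data quality dimensions to improve expert selection" is as a weighted score over per-candidate quality dimensions. The dimensions, weights, and profiles below are assumptions for illustration, not the paper's actual metrics:

```python
# Hypothetical candidate profiles scored on data-quality dimensions.
candidates = [
    {"name": "A", "completeness": 0.9, "accuracy": 0.8, "timeliness": 0.6},
    {"name": "B", "completeness": 0.5, "accuracy": 0.9, "timeliness": 0.9},
]
# Assumed weights; a forecaster would tune these per FTA exercise.
WEIGHTS = {"completeness": 0.4, "accuracy": 0.4, "timeliness": 0.2}

def quality_score(profile):
    """Weighted sum of the data-quality dimensions for one candidate."""
    return sum(WEIGHTS[d] * profile[d] for d in WEIGHTS)

# Rank candidates by quality score, best first.
ranked = sorted(candidates, key=quality_score, reverse=True)
print([c["name"] for c in ranked])
```

The scores themselves can also be reported to the forecaster as the quality metrics the abstract mentions.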
International Journal of Engineering Research and Development (IJERD) (IJERD Editor)
This document discusses applying agile software development methodology in a dynamic business environment. It begins by defining the traditional software development life cycle and some common development methodologies. It then discusses the principles of agile development, focusing on the Agile Manifesto and Scrum methodology. Some key benefits of agile development discussed include continuous customer feedback, developing products faster through iterative releases, managing change through prioritized backlogs, and continuous risk management through short iterations. Overall, the document argues that agile methods allow for more flexibility and rapid response to changes that are needed in dynamic business environments.
International journal of computer science and innovation vol 2015-n2-paper3 (sophiabelthome)
This document provides an overview and comparison of the Scrum and FDD agile methodologies. It finds that Scrum is more iterative, delivers working software frequently in sprints, emphasizes teamwork and adaptability. FDD focuses more on upfront design, code ownership, and quality control through inspections. Both have benefits and limitations. Overall, combining practices from different agile methods may produce the most effective software development process.
Perspectives on the adherance to scrum rules in software project management (nooriasukmaningtyas)
Adapting to users' needs to fulfill their requirements, and delivering products on time and within the planned cost, is a critical matter to which all software project managers (SPMs) give the highest priority, while considering user satisfaction at the same time. Agile methodology is one of the solutions provided by software engineers to get customers involved in the system development life cycle (SDLC) and avoid the risk of nonconformance cost. Yet SPMs still face nonconformance costs and dynamic changes, and the root cause of the issue has not been pinpointed so that a solution can be found. This research aimed to determine whether software developers understand Scrum rules, how this knowledge gap affects software project success from a project management perspective, and what impact the lack of knowledge on the topic has on project delivery. The data collected through qualitative and quantitative methods, from Scrum teams working on health information systems (HIS), educational solutions, and governmental solutions, showed deviations in organizational practices, team conflict, competition, and pressure, as well as declined product quality.
A novel risk management model in the Scrum and extreme programming hybrid me... (IJECEIAES)
Risk management in software development has always been one of the necessities of software project management. The logical nature of software projects and products has caused several challenges and risks in these projects. Moreover, with the emergence of agile methodologies in recent years, especially Scrum and extreme programming (XP), this issue has become more serious. This is mainly because the emphasis on limited documentation in these methodologies causes them to pay little attention to some aspects of project management, particularly risk management. Focusing on this challenge, the current study proposes a risk management model for the hybrid methodology combining Scrum and XP. Using this model in a case study shows its success in achieving risk management purposes. The results indicate an appropriate reduction in the number of reworks, change requests, identified risks, and risks that occurred, while the number of eliminated risks and team productivity have increased.
Taloring A Clouded Data Security Life Cycle Essay (Marisela Stone)
The document discusses the pros and cons of using an agile methodology for software development projects. It begins by stating that there are many different software development methodologies to choose from, each with their own advantages and disadvantages. It goes on to specifically examine the pros and cons of the agile methodology. Some benefits mentioned are its ability to adapt to changing requirements and provide working software frequently. Potential downsides include higher initial costs and more complex planning. The document concludes by noting agile may be best suited for environments where requirements are uncertain or likely to change.
A CRITICAL ANALYSIS AND COMPARISON OF AGILE WITH TRADITIONAL SOFTWARE DEVELOP... (Brooke Heidt)
This document provides an overview and comparison of agile and traditional software development processes. It discusses popular agile methodologies like Scrum, Extreme Programming (XP), and Dynamic Systems Development Method (DSDM). Scrum is described in more detail, outlining its roles of Product Owner, Development Team, and Scrum Master. Benefits seen in two case studies using agile/Scrum include higher customer satisfaction, flexibility, and improved communication. While agile allows for evolving requirements, its sprints cannot be modified once started. Overall, agile focuses on iterative development, collaboration, and rapid adaptation to change versus traditional sequential processes.
Integrating Scrum development process with UX design flow (journalBEEI)
Nowadays, Agile software development practices are widely adopted all over the world. Scrum is one of the best-known Agile models; it satisfies business needs and puts the main focus on the product. One common challenge in developing customer-facing products is delivering a good user experience. This paper presents the integration of the Scrum development process with a user experience (UX) design flow. Papers relating to integrating the UX process with Agile development, and to measuring and improving it, from the year 2010 onwards are reviewed to identify how organizations can combine a UX design flow with Scrum development and gain the benefits of both. The review identifies a number of limitations in existing integration efforts. A proposed process model to resolve these limitations is presented, along with our experience in applying it to an ongoing software development project. The results of applying this process, its impact on the quality of project outcomes, and the employees' satisfaction with the process are discussed. The goal of this study is to aid organizations in integrating UX design into their development process.
An Agile Software Development Framework (Waqas Tariq)
Agility in software projects can be attained when software development methodologies attend to external factors and provide an internal framework for keeping software development projects focused. Developer practices are the most important factor in coping with these challenges. Agile development assumes a project context in which the customer actively collaborates with the development team; the greatest problem agile teams face is too little involvement from the customer, so for a project to be agile, the developers have to cope with this lack of collaboration. Embracing changing requirements is not enough to make agile methods cope with business and technology changes. This paper provides a conceptual framework for tailoring agile methodologies to face different challenges. The framework comprises three factors: developer practices, customer collaboration, and predicting change.
This document provides an overview and comparison of agile processes for software development. It defines agile processes as iterative and incremental approaches that emphasize adaptability, customer collaboration, and rapid delivery of working software. The document then discusses characteristics of agile projects, such as being iterative, modular, and people-oriented. It also describes popular agile methodologies like extreme programming, scrum, and feature-driven development. The paper aims to serve as a guide for understanding agile processes, outlining their advantages and disadvantages and comparing them to other software development life cycle models.
For years, software projects have mostly exceeded budget, been delivered late, and failed to satisfy customers. In the past, many traditional development models, such as waterfall, spiral, iterative, and prototyping methods, were used to build software systems. In recent years, agile models have been widely used in developing software products. The major reasons are simplicity, the ability to incorporate requirement changes at any time, a light-weight approach, and delivery of a working product early and in short durations. Whatever development model is used, it remains a challenge for software engineers to accurately estimate the size, effort, and time required to develop a software system. This survey focuses on the existing estimation models used in traditional as well as agile software development.
The performance of an algorithm can be improved using a parallel programming approach. In this study, the performance of the bubble sort algorithm was measured on various computer specifications. Experimental results showed that parallel programming can reduce execution time by 61%-65% compared to serial programming.
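The abstract above does not include the authors' implementation; as a rough illustration, the sketch below (with hypothetical data) contrasts serial bubble sort with odd-even transposition sort, the standard restructuring that makes bubble sort parallelizable, since every compare-swap within a phase touches a disjoint pair:

```python
def bubble_sort(a):
    """Classic serial bubble sort: O(n^2) strictly sequential compare-swaps."""
    a = list(a)
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def odd_even_sort(a):
    """Odd-even transposition sort: n phases of independent compare-swaps."""
    a = list(a)
    n = len(a)
    for phase in range(n):
        start = phase % 2  # alternate between even and odd pair offsets
        # every (j, j+1) pair below is disjoint -> safe to run concurrently
        for j in range(start, n - 1, 2):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

data = [5, 1, 4, 2, 8, 0, 3]
print(bubble_sort(data))    # [0, 1, 2, 3, 4, 5, 8]
print(odd_even_sort(data))  # [0, 1, 2, 3, 4, 5, 8]
```

A true parallel run would dispatch each phase's compare-swaps to separate workers; the 61%-65% savings reported above come from such parallel execution on the authors' hardware, not from this serial sketch.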
XP concentrates on the development rather than the managerial aspects of software projects.
XP was designed so that organizations would be free to adopt all or part of the methodology.
XP projects start with a release planning phase, followed by several iterations, each of which concludes with user acceptance testing.
When the product has enough features to satisfy users, the team terminates iteration and releases the software.
Comparative Analysis of Agile Software Development Methodologies-A Review (IJERA Editor)
This document provides a review and comparison of several agile software development methodologies, including Scrum, Extreme Programming (XP), Dynamic Systems Development Method (DSDM), Feature-Driven Development (FDD), and Adaptive Software Development (ASD). It finds that while all agile methods emphasize iterative development, customer collaboration, and responsiveness to change, they differ in their documentation requirements, level of customer involvement, use of meetings, and suitability for small versus large projects. For example, XP and Scrum involve customers most heavily while FDD relies more on documentation, and XP and ASD generally work best for smaller projects compared to Scrum, FDD, and DSDM. A table compares the key characteristics of each methodology.
Agile techniques that utilize iterative development are broadly used in various industry projects as a lightweight development approach that can accommodate continuous changes in requirements. Short iterations are used for efficient product delivery. Traditional software development methods are not efficient or effective at handling rapid changes in requirements. Despite the benefits of Agile, critics of agile methodology state that it has failed to pay attention to architectural and design issues and is therefore bound to produce small design decisions. The past decade has seen numerous changes in systems development, with many organizations accepting agile techniques as a viable methodology for developing systems. An increase in the number of research studies reveals the growing demand for and acceptance of agile methodologies. While most research has focused on the adoption rate and adaptation of agile practices, there is very limited knowledge of their post-adoption usage and incorporation within organizations. Several factors explain the effective usage of agile methodologies. A combination of previous research in agile methodologies, diffusion of innovations, information systems implementation, and systems development has been carried out to develop a research model that identifies the main factors relevant to the propagation and effective usage of agile methodologies in organizations.
Implementation Of Incremental Development Process (Sherry Bailey)
The document discusses the implementation of an incremental development process compared to a waterfall methodology. It notes that agile methodologies are open to change throughout the project, unlike waterfall which has rigid requirements. The author provides examples of how Google and a satellite manufacturing company approach development differently due to their product types. Google uses agile methods for software while hardware requires waterfall.
Many software organizations have moved from traditional methods for software development, such as the waterfall method, to agile methods. Agile methods are used especially in software development, where initial plans are constantly refined and improved along the way, since systems usually require frequent changes during the development process. These methods are very suitable for small projects and organizations, but they are very hard to implement in large organizations with large projects and teams. This paper aims to identify the weaknesses of two existing Scrum frameworks used in large organizations and to present a proposed hybrid framework scaled from both. It highlights the importance of a predefined team life cycle, which is a key factor in achieving better timelines and avoiding mistakes that affect the timing of release deployment.
Similar to A case study of using the hybrid model of scrum and six sigma in software development (20)
Redefining brain tumor segmentation: a cutting-edge convolutional neural netw... (IJECEIAES)
Medical image analysis has witnessed significant advancements with deep learning techniques. In the domain of brain tumor segmentation, the ability to precisely delineate tumor boundaries from magnetic resonance imaging (MRI) scans holds profound implications for diagnosis. This study presents an ensemble convolutional neural network (CNN) with transfer learning, integrating the state-of-the-art DeepLabv3+ architecture with the ResNet18 backbone. The model is rigorously trained and evaluated, exhibiting remarkable performance metrics, including an impressive global accuracy of 99.286%, a high class accuracy of 82.191%, a mean intersection over union (IoU) of 79.900%, a weighted IoU of 98.620%, and a boundary F1 (BF) score of 83.303%. Notably, a detailed comparative analysis with existing methods showcases the superiority of our proposed model. These findings underscore the model's competence in precise brain tumor localization, underscoring its potential to revolutionize medical image analysis and enhance healthcare outcomes. This research paves the way for future exploration and optimization of advanced CNN models in medical imaging, emphasizing addressing false positives and resource efficiency.
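The metrics quoted above can be made concrete with a small sketch; the functions below (hypothetical helpers, not the paper's evaluation code) compute IoU and global pixel accuracy for flattened binary masks:

```python
def iou(pred, truth):
    """Intersection over union of two binary masks given as flat 0/1 lists."""
    inter = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 1)
    union = sum(1 for p, t in zip(pred, truth) if p == 1 or t == 1)
    return inter / union if union else 1.0  # two empty masks match perfectly

def pixel_accuracy(pred, truth):
    """Fraction of pixels labeled correctly (the 'global accuracy' metric)."""
    return sum(1 for p, t in zip(pred, truth) if p == t) / len(pred)

pred  = [0, 1, 1, 1, 0, 0]   # illustrative predicted mask
truth = [0, 0, 1, 1, 1, 0]   # illustrative ground-truth mask
print(iou(pred, truth))             # 2 overlapping / 4 in union = 0.5
print(pixel_accuracy(pred, truth))  # 4 of 6 pixels agree
```

Mean IoU, as reported above, is simply this per-class IoU averaged over all classes.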
Embedded machine learning-based road conditions and driving behavior monitoring (IJECEIAES)
Car accident rates have increased in recent years, resulting in losses in human lives, properties, and other financial costs. An embedded machine learning-based system is developed to address this critical issue. The system can monitor road conditions, detect driving patterns, and identify aggressive driving behaviors. The system is based on neural networks trained on a comprehensive dataset of driving events, driving styles, and road conditions. The system effectively detects potential risks and helps mitigate the frequency and impact of accidents. The primary goal is to ensure the safety of drivers and vehicles. Collecting data involved gathering information on three key road events: normal street and normal drive, speed bumps, circular yellow speed bumps, and three aggressive driving actions: sudden start, sudden stop, and sudden entry. The gathered data is processed and analyzed using a machine learning system designed for limited power and memory devices. The developed system resulted in 91.9% accuracy, 93.6% precision, and 92% recall. The achieved inference time on an Arduino Nano 33 BLE Sense with a 32-bit CPU running at 64 MHz is 34 ms and requires 2.6 kB peak RAM and 139.9 kB program flash memory, making it suitable for resource-constrained embedded systems.
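For readers unfamiliar with the reported metrics, the following minimal sketch (using hypothetical confusion-matrix counts, not the paper's data) shows how accuracy, precision, and recall are derived:

```python
def metrics(tp, fp, fn, tn):
    """Accuracy, precision, and recall from confusion-matrix counts."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp)  # of predicted positives, how many are real
    recall = tp / (tp + fn)     # of real positives, how many were found
    return accuracy, precision, recall

# Illustrative counts only, not from the paper:
acc, prec, rec = metrics(tp=46, fp=3, fn=4, tn=47)
print(acc)   # 0.93
print(rec)   # 0.92
```

High precision with slightly lower recall, as reported above, means the system rarely raises false alarms but misses a few genuine events.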
Advanced control scheme of doubly fed induction generator for wind turbine us... (IJECEIAES)
This paper describes a speed control device for generating electrical energy on an electricity network based on the doubly fed induction generator (DFIG) used for wind power conversion systems. At first, a double-fed induction generator model was constructed. A control law is formulated to govern the flow of energy between the stator of a DFIG and the energy network using three types of controllers: proportional integral (PI), sliding mode controller (SMC) and second order sliding mode controller (SOSMC). Their different results in terms of power reference tracking, reaction to unexpected speed fluctuations, sensitivity to perturbations, and resilience against machine parameter alterations are compared. MATLAB/Simulink was used to conduct the simulations for the preceding study. Multiple simulations have shown very satisfying results, and the investigations demonstrate the efficacy and power-enhancing capabilities of the suggested control system.
Neural network optimizer of proportional-integral-differential controller par... (IJECEIAES)
Wide application of proportional-integral-differential (PID)-regulator in industry requires constant improvement of methods of its parameters adjustment. The paper deals with the issues of optimization of PID-regulator parameters with the use of neural network technology methods. A methodology for choosing the architecture (structure) of neural network optimizer is proposed, which consists in determining the number of layers, the number of neurons in each layer, as well as the form and type of activation function. Algorithms of neural network training based on the application of the method of minimizing the mismatch between the regulated value and the target value are developed. The method of back propagation of gradients is proposed to select the optimal training rate of neurons of the neural network. The neural network optimizer, which is a superstructure of the linear PID controller, allows increasing the regulation accuracy from 0.23 to 0.09, thus reducing the power consumption from 65% to 53%. The results of the conducted experiments allow us to conclude that the created neural superstructure may well become a prototype of an automatic voltage regulator (AVR)-type industrial controller for tuning the parameters of the PID controller.
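The PID control law whose gains the neural superstructure tunes can be sketched in a few lines; the toy first-order plant and the gains below are illustrative assumptions, not the paper's industrial AVR model:

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=2000):
    """Discrete PID loop driving a toy first-order plant dy/dt = u - y."""
    y, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv  # PID control law
        y += dt * (u - y)                          # Euler step of the plant
        prev_err = err
    return y

# With hypothetical gains kp=2, ki=1, kd=0 the output settles at the setpoint:
print(simulate_pid(kp=2.0, ki=1.0, kd=0.0))
```

An optimizer, neural or otherwise, searches over (kp, ki, kd) to minimize the mismatch between the regulated value and the target, which is exactly the training criterion the paper describes.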
An improved modulation technique suitable for a three level flying capacitor ... (IJECEIAES)
This research paper introduces an innovative modulation technique for controlling a 3-level flying capacitor multilevel inverter (FCMLI), aiming to streamline the modulation process in contrast to conventional methods. The proposed simplified modulation technique paves the way for more straightforward and efficient control of multilevel inverters, enabling their widespread adoption and integration into modern power electronic systems. Through the amalgamation of sinusoidal pulse width modulation (SPWM) with a high-frequency square wave pulse, this controlling technique attains energy equilibrium across the coupling capacitor. The modulation scheme incorporates a simplified switching pattern and a decreased count of voltage references, thereby simplifying the control algorithm.
A review on features and methods of potential fishing zone (IJECEIAES)
This review focuses on the importance of identifying potential fishing zones in seawater for sustainable fishing practices. It explores features such as sea surface temperature (SST) and sea surface height (SSH), along with the classifiers used to classify the data. The study underscores the importance of examining potential fishing zones using advanced analytical techniques and thoroughly explores the methodologies employed by researchers, covering both past and current approaches. The examination centers on data characteristics and the application of classification algorithms to potential fishing zones, whose prediction relies significantly on the effectiveness of those algorithms. Previous research has assessed the performance of models like support vector machines (SVM), naïve Bayes, and artificial neural networks (ANN); in one earlier result, an SVM was more accurate (97.6%) than naïve Bayes (94.2%) in classifying test data for fisheries classification. Considering recent works in this area, several recommendations for future work are presented to further improve the performance of potential fishing zone models, which is important to the fisheries community.
Electrical signal interference minimization using appropriate core material f... (IJECEIAES)
As demand for smaller, quicker, and more powerful devices rises, Moore's law is strictly followed. The industry has worked hard to make small devices that boost productivity, with the goal of optimizing device density. Scientists are reducing interconnect delays to improve circuit performance. This led to three-dimensional integrated circuit (3D IC) concepts, which stack active devices and create vertical connections to diminish latency and shorten interconnects. Electrical coupling is a major concern with 3D integrated circuits. Researchers have developed and tested through-silicon vias (TSVs) and substrates to decrease electrical coupling. This study illustrates a novel noise-coupling reduction method using several coupling models. A 22% drop in coupling from wave-carrying to victim TSVs introduces this new paradigm and improves system performance even at higher THz frequencies.
Electric vehicle and photovoltaic advanced roles in enhancing the financial p... (IJECEIAES)
Climate change's impact on the planet has forced the United Nations and governments to promote green energies and electric transportation. The deployment of photovoltaic (PV) and electric vehicle (EV) systems gained stronger momentum due to their numerous advantages over fossil fuel types. The advantages go beyond sustainability to reach financial support and stability. The work in this paper introduces a hybrid system between PV and EV to support industrial and commercial plants. This paper covers the theoretical framework of the proposed hybrid system, including the equations required to complete the cost analysis when PV and EV are present. In addition, the proposed design diagram, which sets the priorities and requirements of the system, is presented. The proposed approach allows plants to improve their power stability, especially during power outages. The presented information supports researchers and plant owners in completing the necessary analysis while promoting the deployment of clean energy. The result of a case study representing a dairy milk farmer supports the theoretical work and highlights its benefits to existing plants. The short return on investment of the proposed approach supports the paper's novel approach to a sustainable electrical system. In addition, the proposed system allows for an isolated power setup without the need for a transmission line, which enhances the safety of the electrical network.
Bibliometric analysis highlighting the role of women in addressing climate ch... (IJECEIAES)
Fossil fuel consumption increased quickly, contributing to climate change that is evident in unusual flooding, droughts, and global warming. Over the past ten years, women's involvement in society has grown dramatically, and they have succeeded in playing a noticeable role in reducing climate change. A bibliometric analysis of data from the last ten years has been carried out to examine the role of women in addressing climate change. The analysis's findings discuss relevance to the sustainable development goals (SDGs), particularly SDG 7 and SDG 13. The results consider contributions made by women in various sectors while taking geographic dispersion into account. The bibliometric analysis delves into topics including women's leadership in environmental groups, their involvement in policymaking, their contributions to sustainable development projects, and the influence of gender diversity on attempts to mitigate climate change. This study's results highlight how women have influenced policies and actions related to climate change, point out areas of research deficiency, and offer recommendations on how to increase the role of women in addressing climate change and achieving sustainability. To achieve more successful results, this initiative aims to highlight the significance of gender equality and encourage inclusivity in climate change decision-making processes.
Voltage and frequency control of microgrid in presence of micro-turbine inter... (IJECEIAES)
The active and reactive load changes have a significant impact on voltage and frequency. In this paper, in order to stabilize the microgrid (MG) against load variations in islanding mode, the active and reactive power of all distributed generators (DGs), including energy storage (battery), diesel generator, and micro-turbine, are controlled. The micro-turbine generator is connected to the MG through a three-phase to three-phase matrix converter, and the droop control method is applied for controlling the voltage and frequency of the MG. In addition, a method is introduced for voltage and frequency control of micro-turbines in the transition from grid-connected mode to islanding mode. A novel switching strategy of the matrix converter is used for converting the high-frequency output voltage of the micro-turbine to the grid-side frequency of the utility system. Moreover, using the switching strategy, the low-order harmonics in the output current and voltage are not produced, and consequently, the size of the output filter would be reduced. In fact, the suggested control strategy is load-independent and has no frequency conversion restrictions. The proposed approach for voltage and frequency regulation demonstrates exceptional performance and favorable response across various load alteration scenarios. The suggested strategy is examined in several scenarios in the MG test systems, and the simulation results are addressed.
Enhancing battery system identification: nonlinear autoregressive modeling fo... (IJECEIAES)
Precisely characterizing Li-ion batteries is essential for optimizing their performance, enhancing safety, and prolonging their lifespan across various applications, such as electric vehicles and renewable energy systems. This article introduces an innovative nonlinear methodology for system identification of a Li-ion battery, employing a nonlinear autoregressive with exogenous inputs (NARX) model. The proposed approach integrates the benefits of nonlinear modeling with the adaptability of the NARX structure, facilitating a more comprehensive representation of the intricate electrochemical processes within the battery. Experimental data collected from a Li-ion battery operating under diverse scenarios are employed to validate the effectiveness of the proposed methodology. The identified NARX model exhibits superior accuracy in predicting the battery's behavior compared to traditional linear models. This study underscores the importance of accounting for nonlinearities in battery modeling, providing insights into the intricate relationships between state-of-charge, voltage, and current under dynamic conditions.
Smart grid deployment: from a bibliometric analysis to a survey (IJECEIAES)
Smart grids are one of the last decades' innovations in electrical energy. They bring relevant advantages compared to the traditional grid and attract significant interest from the research community. Assessing the field's evolution is essential to propose guidelines for facing new and future smart grid challenges. In addition, knowing the main technologies involved in the deployment of smart grids (SGs) is important to highlight possible shortcomings that can be mitigated by developing new tools. This paper contributes to the research trends mentioned above by focusing on two objectives. First, a bibliometric analysis is presented to give an overview of the current research level about smart grid deployment. Second, a survey of the main technological approaches used for smart grid implementation and their contributions is highlighted. To that effect, we searched the Web of Science (WoS) and the Scopus databases. We obtained 5,663 documents from WoS and 7,215 from Scopus on smart grid implementation or deployment. With the extraction limitation in the Scopus database, 5,872 of the 7,215 documents were extracted using a multi-step process. These two datasets have been analyzed using a bibliometric tool called bibliometrix. The main outputs are presented with some recommendations for future research.
Use of analytical hierarchy process for selecting and prioritizing islanding ... (IJECEIAES)
One of the problems associated with power systems is the islanding condition, which must be rapidly and properly detected to prevent any negative consequences for the system's protection, stability, and security. This paper offers a thorough overview of several islanding detection strategies, which are divided into two categories: classic approaches, including local and remote approaches, and modern techniques, including techniques based on signal processing and computational intelligence. Additionally, each approach is compared and assessed based on several factors, including implementation costs, non-detected zones, declining power quality, and response times, using the analytical hierarchy process (AHP). The multi-criteria decision-making analysis gives overall weights of 24.7% for passive methods, 7.8% for active methods, 5.6% for hybrid methods, 14.5% for remote methods, 26.6% for signal processing-based methods, and 20.8% for computational intelligence-based methods when all criteria are compared together. Thus, it can be seen from the total weights that hybrid approaches are the least suitable choice, while signal processing-based methods are the most appropriate islanding detection method to select and implement in a power system with respect to the aforementioned factors. Using Expert Choice software, the proposed hierarchy model is studied and examined.
Enhancing of single-stage grid-connected photovoltaic system using fuzzy logi... (IJECEIAES)
The power generated by photovoltaic (PV) systems is influenced by environmental factors. This variability hampers the control and utilization of solar cells' peak output. In this study, a single-stage grid-connected PV system is designed to enhance power quality. Our approach employs fuzzy logic in the direct power control (DPC) of a three-phase voltage source inverter (VSI), enabling seamless integration of the PV connected to the grid. Additionally, a fuzzy logic-based maximum power point tracking (MPPT) controller is adopted, which outperforms traditional methods like incremental conductance (INC) in enhancing solar cell efficiency and minimizing the response time. Moreover, the inverter's real-time active and reactive power is directly managed to achieve a unity power factor (UPF). The system's performance is assessed through MATLAB/Simulink implementation, showing marked improvement over conventional methods, particularly in steady-state and varying weather conditions. For solar irradiances of 500 and 1,000 W/m2, the results show that the proposed method reduces the total harmonic distortion (THD) of the injected current to the grid by approximately 46% and 38% compared to conventional methods, respectively. Furthermore, we compare the simulation results with IEEE standards to evaluate the system's grid compatibility.
Enhancing photovoltaic system maximum power point tracking with fuzzy logic-b... (IJECEIAES)
Photovoltaic systems have emerged as a promising energy resource that caters to the future needs of society, owing to their renewable, inexhaustible, and cost-free nature. The power output of these systems relies on solar cell radiation and temperature. In order to mitigate the dependence on atmospheric conditions and enhance power tracking, a conventional approach has been improved by integrating various methods. To optimize the generation of electricity from solar systems, the maximum power point tracking (MPPT) technique is employed. To overcome limitations such as steady-state voltage oscillations and improve transient response, two traditional MPPT methods, namely fuzzy logic controller (FLC) and perturb and observe (P&O), have been modified. This research paper aims to simulate and validate the step size of the proposed modified P&O and FLC techniques within the MPPT algorithm using MATLAB/Simulink for efficient power tracking in photovoltaic systems.
Adaptive synchronous sliding control for a robot manipulator based on neural ... (IJECEIAES)
Robot manipulators have become important equipment in production lines, medical fields, and transportation. Improving the quality of trajectory tracking for robot hands is always an attractive topic in the research community. This is a challenging problem because robot manipulators are complex nonlinear systems and are often subject to fluctuations in loads and external disturbances. This article proposes an adaptive synchronous sliding control scheme to improve trajectory tracking performance for a robot manipulator. The proposed controller ensures that the positions of the joints track the desired trajectory, synchronizes the errors, and significantly reduces chattering. First, the synchronous tracking errors and synchronous sliding surfaces are presented. Second, the synchronous tracking error dynamics are determined. Third, a robust adaptive control law is designed, the unknown components of the model are estimated online by the neural network, and the parameters of the switching elements are selected by fuzzy logic. The built algorithm ensures that the tracking and approximation errors are ultimately uniformly bounded (UUB). Finally, the effectiveness of the constructed algorithm is demonstrated through simulation and experimental results. Simulation and experimental results show that the proposed controller is effective, with small synchronous tracking errors, and the chattering phenomenon is significantly reduced.
Remote field-programmable gate array laboratory for signal acquisition and de... (IJECEIAES)
A remote laboratory utilizing field-programmable gate array (FPGA) technologies enhances students' learning experience anywhere and anytime in embedded system design. Existing remote laboratories prioritize hardware access and visual feedback for observing board behavior after programming, neglecting comprehensive debugging tools to resolve errors that require internal signal acquisition. This paper proposes a novel remote embedded-system design approach targeting FPGA technologies that is fully interactive via a web-based platform. Our solution provides FPGA board access and debugging capabilities beyond the visual feedback provided by existing remote laboratories. We implemented a lab module that users can seamlessly incorporate into their FPGA designs. The module minimizes hardware resource utilization while enabling the acquisition of a large number of data samples from the signal during experiments by adaptively compressing the signal prior to data transmission. The results demonstrate an average compression ratio of 2.90 across three benchmark signals, indicating efficient signal acquisition and effective debugging and analysis. This method allows users to acquire more data samples than conventional methods. The proposed lab allows students to remotely test and debug their designs, bridging the gap between theory and practice in embedded system design.
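The paper's adaptive compression scheme is not detailed in the abstract; as a hedged illustration of how a compression ratio over captured samples might be computed, here is a minimal run-length encoding sketch (slowly changing debug signals compress well under it):

```python
def rle_encode(samples):
    """Run-length encode a sample stream into (value, run_length) pairs."""
    pairs = []
    for s in samples:
        if pairs and pairs[-1][0] == s:
            pairs[-1] = (s, pairs[-1][1] + 1)  # extend the current run
        else:
            pairs.append((s, 1))               # start a new run
    return pairs

def compression_ratio(samples):
    """Original sample count versus stored values plus run counts."""
    return len(samples) / (2 * len(rle_encode(samples)))

signal = [0, 0, 0, 0, 1, 1, 2, 2, 2, 2]  # a hypothetical slowly changing signal
print(rle_encode(signal))                # [(0, 4), (1, 2), (2, 4)]
print(compression_ratio(signal))         # 10 samples stored as 6 numbers
```

The reported average ratio of 2.90 comes from the paper's own adaptive scheme on its benchmark signals; this sketch only shows the shape of the calculation.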
Detecting and resolving feature envy through automated machine learning and m...IJECEIAES
Efficiently identifying and resolving code smells enhances software project quality. This paper presents a novel solution, utilizing automated machine learning (AutoML) techniques, to detect code smells and apply move method refactoring. By evaluating code metrics before and after refactoring, we assessed its impact on coupling, complexity, and cohesion. Key contributions of this research include a unique dataset for code smell classification and the development of models using AutoGluon for optimal performance. Furthermore, the study identifies the top 20 influential features in classifying feature envy, a well-known code smell, stemming from excessive reliance on external classes. We also explored how move method refactoring addresses feature envy, revealing reduced coupling and complexity, and improved cohesion, ultimately enhancing code quality. In summary, this research offers an empirical, data-driven approach, integrating AutoML and move method refactoring to optimize software project quality. Insights gained shed light on the benefits of refactoring on code quality and the significance of specific features in detecting feature envy. Future research can expand to explore additional refactoring techniques and a broader range of code metrics, advancing software engineering practices and standards.
Smart monitoring technique for solar cell systems using internet of things ba...IJECEIAES
Rapid, remote monitoring of solar cell system status parameters (solar irradiance, temperature, and humidity) is critical to enhancing system efficiency. Hence, in the present article an improved smart prototype of an internet of things (IoT) technique based on an embedded system built around the NodeMCU ESP8266 (ESP-12E) was realized experimentally. Three regions of Egypt (the cities of Luxor, Cairo, and El-Beheira) were chosen to study their solar irradiance profiles, temperature, and humidity with the proposed IoT system. The monitored solar irradiance, temperature, and humidity data were visualized live on Ubidots via the hypertext transfer protocol (HTTP). The measured solar power radiation in Luxor, Cairo, and El-Beheira ranged between 216-1000, 245-958, and 187-692 W/m2, respectively, during the solar day. The accuracy and rapidity of the monitoring results obtained with the proposed IoT system make it a strong candidate for application in monitoring solar cell systems. Moreover, the solar power radiation results for the three regions strongly favor Luxor and Cairo, rather than El-Beheira, as suitable places to build a solar cell system station.
An efficient security framework for intrusion detection and prevention in int...IJECEIAES
Over the past few years, the internet of things (IoT) has advanced to connect billions of smart devices to improve quality of life. However, anomalies or malicious intrusions pose several security loopholes, leading to performance degradation and threats to data security in IoT operations. Therefore, IoT security systems must monitor and restrict unwanted events in the IoT network. Recently, various technical solutions based on machine learning (ML) models have been derived for identifying and restricting unwanted events in IoT. However, most ML-based approaches are prone to misclassification due to inappropriate feature selection. Additionally, most ML approaches applied to intrusion detection and prevention rely on supervised learning, which requires a large amount of labeled data for training; such datasets are difficult to source in a large network like IoT. To address this problem, this study introduces an efficient learning mechanism to strengthen the IoT security aspects. The proposed algorithm incorporates supervised and unsupervised approaches to improve the learning models for intrusion detection and mitigation. Compared with related works, the experimental outcome shows that the model performs well on a benchmark dataset, accomplishing an improved detection accuracy of approximately 99.21%.
The CBC machine is a common diagnostic tool used by doctors to measure a patient's red blood cell count, white blood cell count and platelet count. The machine uses a small sample of the patient's blood, which is then placed into special tubes and analyzed. The results of the analysis are then displayed on a screen for the doctor to review. The CBC machine is an important tool for diagnosing various conditions, such as anemia, infection and leukemia. It can also help to monitor a patient's response to treatment.
CHINA’S GEO-ECONOMIC OUTREACH IN CENTRAL ASIAN COUNTRIES AND FUTURE PROSPECTjpsjournal1
The rivalry between prominent international actors for dominance over Central Asia's hydrocarbon reserves and the ancient silk trade route, along with China's diplomatic endeavours in the area, has been referred to as the "New Great Game." This research centres on the power struggle, considering geopolitical, geostrategic, and geoeconomic variables. Topics including trade, political hegemony, oil politics, and conventional and nontraditional security are all explored and explained by the researcher. Using Mackinder's Heartland, Spykman's Rimland, and Hegemonic Stability theories, the study examines China's role in Central Asia. This study adheres to the empirical epistemological method and has taken care of objectivity. It critically analyzes primary and secondary research documents to elaborate the role of China's geo-economic outreach in Central Asian countries and its future prospects. According to this study, China is seeing significant success in commerce, pipeline politics, and gaining influence over other governments, a success that may be attributed to the effective utilisation of key tools such as the Shanghai Cooperation Organisation and the Belt and Road Economic Initiative.
Batteries: Introduction; types of batteries; discharging and charging of a battery; characteristics of a battery; battery rating; various tests on a battery. Primary battery: silver button cell. Secondary battery: Ni-Cd battery. Modern battery: lithium-ion battery. Maintenance of batteries; choice of batteries for electric vehicle applications.
Fuel Cells: Introduction; importance and classification of fuel cells; description, principle, components, and applications of fuel cells: H2-O2 fuel cell, alkaline fuel cell, molten carbonate fuel cell, and direct methanol fuel cells.
International Conference on NLP, Artificial Intelligence, Machine Learning an...gerogepatton
The International Conference on NLP, Artificial Intelligence, Machine Learning and Applications (NLAIM 2024) offers a premier global platform for exchanging insights and findings in the theory, methodology, and applications of NLP, artificial intelligence, and machine learning. The conference seeks substantial contributions across all key domains of these fields, aiming to foster both theoretical advancements and real-world implementations. With a focus on facilitating collaboration between researchers and practitioners from academia and industry, the conference serves as a nexus for sharing the latest developments in the field.
Comparative analysis between traditional aquaponics and reconstructed aquapon...bijceesjournal
The aquaponic system of planting is a method that does not require soil. It needs only water, fish, lava rocks (a substitute for soil), and plants. Aquaponic systems are sustainable and environmentally friendly. Their use not only enables planting in small spaces but also helps reduce artificial chemical use and minimizes excess water use, as aquaponics consumes 90% less water than soil-based gardening. The study applied a descriptive and experimental design to assess and compare conventional and reconstructed aquaponic methods for growing tomatoes. The researchers created an observation checklist to determine the significant factors of the study. The study aims to determine the significant difference between traditional and reconstructed aquaponics systems in propagating tomatoes in terms of height, weight, girth, and number of fruits. The reconstructed aquaponics system’s higher growth yield results in a much more nourished crop than the traditional aquaponics system; it is superior in number of fruits, height, weight, and girth. Moreover, the reconstructed aquaponics system is proven to eliminate all the hindrances present in the traditional aquaponics system: overcrowding of fish, algae growth, pest problems, contaminated water, and dead fish.
Use PyCharm for remote debugging of WSL on a Windows machineshadow0702a
This document serves as a comprehensive step-by-step guide on how to effectively use PyCharm for remote debugging of the Windows Subsystem for Linux (WSL) on a local Windows machine. It meticulously outlines several critical steps in the process, starting with the crucial task of enabling permissions, followed by the installation and configuration of WSL.
The guide then proceeds to explain how to set up the SSH service within the WSL environment, an integral part of the process. Alongside this, it also provides detailed instructions on how to modify the inbound rules of the Windows firewall to facilitate the process, ensuring that there are no connectivity issues that could potentially hinder the debugging process.
The document further emphasizes the importance of checking the connection between the Windows and WSL environments, providing instructions on how to ensure that the connection is optimal and ready for remote debugging.
It also offers an in-depth guide on how to configure the WSL interpreter and files within the PyCharm environment. This is essential for ensuring that the debugging process is set up correctly and that the program can be run effectively within the WSL terminal.
Additionally, the document provides guidance on how to set up breakpoints for debugging, a fundamental aspect of the debugging process which allows the developer to stop the execution of their code at certain points and inspect their program at those stages.
Finally, the document concludes by providing a link to a reference blog. This blog offers additional information and guidance on configuring the remote Python interpreter in PyCharm, providing the reader with a well-rounded understanding of the process.
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressionsVictor Morales
K8sGPT is a tool that analyzes and diagnoses Kubernetes clusters. This presentation was used to share the requirements and dependencies to deploy K8sGPT in a local environment.
Null Bangalore | Pentesters Approach to AWS IAMDivyanshu
#Abstract:
- Learn real-world methods for auditing AWS IAM (Identity and Access Management) as a pentester. We begin with a brief discussion of IAM, followed by typical misconfigurations and their potential exploits, to reinforce understanding of IAM security best practices.
- Gain actionable insights into AWS IAM policies and roles, using hands on approach.
#Prerequisites:
- Basic understanding of AWS services and architecture
- Familiarity with cloud security concepts
- Experience using the AWS Management Console or AWS CLI.
- For hands on lab create account on [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
# Scenario Covered:
- Basics of IAM in AWS
- Implementing IAM Policies with Least Privilege to Manage S3 Bucket
- Objective: Create an S3 bucket with least privilege IAM policy and validate access.
- Steps:
- Create S3 bucket.
- Attach least privilege policy to IAM user.
- Validate access.
- Exploiting IAM PassRole Misconfiguration
- Allows a user to pass a specific IAM role to an AWS service (EC2), typically used for service access delegation. Then exploit the PassRole misconfiguration to gain unauthorized access to sensitive resources.
- Objective: Demonstrate how a PassRole misconfiguration can grant unauthorized access.
- Steps:
- Allow user to pass IAM role to EC2.
- Exploit misconfiguration for unauthorized access.
- Access sensitive resources.
- Exploiting IAM AssumeRole Misconfiguration with Overly Permissive Role
- An overly permissive IAM role configuration can lead to privilege escalation by creating a role with administrative privileges and allow a user to assume this role.
- Objective: Show how overly permissive IAM roles can lead to privilege escalation.
- Steps:
- Create role with administrative privileges.
- Allow user to assume the role.
- Perform administrative actions.
- Differentiation between PassRole vs AssumeRole
Try at [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
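The least-privilege S3 scenario above can be sketched as a policy document. Below is a minimal sketch in Python; the bucket name is hypothetical, and while the actions and policy grammar are standard AWS IAM, the exact policy your lab requires may differ:

```python
import json

def least_privilege_s3_policy(bucket_name: str) -> dict:
    """Build a least-privilege IAM policy allowing only object read/write
    in a single S3 bucket (no delete, no bucket administration)."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ObjectReadWriteOnly",
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            }
        ],
    }

# Hypothetical bucket name for the lab exercise
policy = least_privilege_s3_policy("demo-bucket")
print(json.dumps(policy, indent=2))
```

Attaching this policy to the IAM user (step 2 of the scenario) and then attempting a delete is one quick way to validate that access really is restricted.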
A case study of using the hybrid model of scrum and six sigma in software development
International Journal of Electrical and Computer Engineering (IJECE)
Vol. 11, No. 6, December 2021, pp. 5342~5350
ISSN: 2088-8708, DOI: 10.11591/ijece.v11i6.pp5342-5350
Journal homepage: http://ijece.iaescore.com
A case study of using the hybrid model of scrum and six sigma
in software development
Mona Najafi Sarpiri1, Taghi Javdani Gandomani2
1 Department of Computer Eng., Isfahan Branch, Islamic Azad University, Isfahan, Iran
2 Department of Computer Science, Shahrekord University, Shahrekord, Iran
Article Info

Article history:
Received Oct 16, 2020
Revised Apr 11, 2021
Accepted Jun 12, 2021

Keywords:
Agile methodologies
Agile software development
Hybrid methodologies
Scrum
Six sigma

ABSTRACT

The world of software engineering is constantly discovering new ways to increase team performance in the production of software products while further improving customer satisfaction. With the advent of agile methodologies in software development, these objectives have been taken more seriously by software teams and companies. Due to their very nature, agile methodologies have the potential to be integrated with other methodologies or specific managerial approaches defined in line with agility objectives. One such case is Six Sigma, which is used in organizations with a focus on organizational change and process improvement. The present study presents a hybrid software development approach combining Scrum, the most common agile methodology, with Six Sigma. This approach was applied in a case study, and the obtained results were analyzed. The results of this evaluation showed that this hybrid method can lead to increased team performance and customer satisfaction. However, besides these two achievements, increases in the amount of re-work, the number of defects discovered, and the duration of project implementation were also observed. These effects are in line with the main objectives of Scrum and Six Sigma and are justifiable and acceptable given the achievement of those objectives.
This is an open access article under the CC BY-SA license.
Corresponding Author:
Taghi Javdani Gandomani
Department of Computer Science
Shahrekord University
Rahbar Blvd., Shahrekord, Iran
Email: javdani@sku.ac.ir
1. INTRODUCTION
Software specialists have always sought solutions and methods to improve software development. In
this path, the use of appropriate methodologies, new solutions, emerging technologies, and tools is
considered [1]. Over the past years, agile methodologies have been the first option of software companies and teams in software project implementation. By focusing on values such as attention to customers and their changing needs, individuals and human interactions, avoidance of development overheads, and frequent and early delivery of developed parts of the software, these methodologies seek to achieve higher performance in software projects [2], [3].
Agile methodologies in software development include a set of methods, each of which considers a
specific part of the product development process. Among the most important agile software development
methodologies are Scrum, XP, Crystal, Kanban, feature-driven development (FDD), and test-driven
development (TDD) [4]. Meanwhile, the use of Scrum combined with other methodologies, such as XP or
Kanban, is the most popular and most common option considered by agile teams [3].
Literature review shows that the combination of software development methodologies has, in many
cases, helped software teams achieve specific objectives. In fact, software teams combine software
methodologies to use their strengths and enjoy their benefits together [5]-[7]. Meanwhile, sometimes
software methodologies are combined with new standards and thoughts of productivity and management,
which have resulted in increasing team productivity and customer satisfaction and improving the software
development process [8]-[10].
Six Sigma is one of the successful strategies in the industry that pays special attention to achieving
specific objectives, such as process performance, customer satisfaction, cost reduction, and quality increase
[11]. Six Sigma is regarded as an organizational transformation strategy leading to the expansion and
development of managerial methods in an organization. In this strategy, the measurement and evaluation of
the improvement rate is a necessity to help managers make decisions about process improvement. Over
recent years, this thinking has been used in software project implementation [12]-[14]. The objectives and
values on which Six Sigma has been defined and developed seem to be aligned with the values and objectives
pursued in agile software development processes. The combination of these two approaches can thus have
positive effects. However, this may have overheads for agile development that partially take agile processes
away from their ideals.
In the present case study, Scrum and Six Sigma frameworks were combined to investigate the
effectiveness of using Six Sigma thinking and strategy in performing software projects. The rest of this
article is organized as follows. In the second section, the agile approach and Scrum are introduced briefly.
The third section is allocated to the brief introduction of Six Sigma. Section four shows how to use and
integrate Scrum and Six Sigma. In Section five, the result of combining these two processes is evaluated in a
case study. Finally, the last section concludes the paper.
2. SCRUM AND AGILE SOFTWARE DEVELOPMENT
The use of software development methodologies is regarded as an initial infrastructure for software
project implementation. A software methodology shows the software development stages, how to develop a
software product, and how to manage the development process. Over the last two decades, agile
methodologies have taken the place of disciplined or plan-driven methodologies in the software world. In
agile methodologies, the development method, values, and principles governing the development process and
the project management method have undergone a serious change. That is why today, the past methodologies
are known as traditional methodologies.
In software development, agile methodologies were formally introduced with the creation of the
agile manifesto in 2001 [2]. However, in practice, it took the software industry some time to adapt to these
new methodologies and use them in the software project implementation. What makes these methodologies
popular is the values and principles on which these methods are defined. These values and principles are, in
most cases, in contrast with the values and principles that have received attention in disciplined
methodologies.
Though several agile methodologies have been proposed, all are a function of values and principles
highlighted in the agile manifesto. The most common agile methodology is Scrum, whose special role is in managing software projects. This methodology emphasizes the iterative development of the product over short-term intervals called Sprints. In this methodology, the three key roles of Scrum master, product owner, and development team members are the only roles considered during software product development [15].
The traditional responsibilities in a software development team are assigned to these roles [16]. Owing to the
proper agility of this methodology and its main role in managing the project, many aspects of product
development are not specified in this methodology, and software teams are free to consider them based on
their needs and capabilities [17]. That is why this methodology has a high potential to be combined with
other methodologies, standards, and management approaches. The existence of various hybrid solutions with
regard to this methodology is rooted in this fact [5], [7], [18]-[20].
As shown in Figure 1, the Scrum process consists of three steps. The team's focus in the first step, called the pre-game phase, is on preparing the project implementation environment and defining the generalities and
initial architecture of the product. The second phase, called game or development, is allocated to the iterative
process of the product development. In each iteration, a part of the product is prepared and can be delivered.
The third step, called the post-game phase, emphasizes the integration of the developed parts and the final
product testing and delivery to the customer.
Over recent years, Scrum has been so widely used in software companies that it is known as the most popular agile methodology [3]. This popularity may be driven by the several problems that software teams and companies often experience in managing software projects, problems that have made the use of Scrum a managerial requirement in these projects.
Figure 1. Scrum process in software development [4]
3. SIX SIGMA
In most companies and organizations, Six Sigma means a measure of quality that tends toward the ultimate quality; the concept is regarded as perfection, or complete satisfaction of the customer. More precisely, Six Sigma is a disciplined, data-driven quality-management methodology used for the elimination and reduction of defects in executive processes. These processes can embrace all industrial, manufacturing, and service processes [18], [21].
The statistics-based Six Sigma approach quantitatively shows how well a process is implemented. To meet the Six Sigma objective, the maximum number of defects and failures in a process should be no more than 3.4 defects per million [18]. In Six Sigma, a defect is anything that falls outside the customer's needs and requirements. The final objective of Six Sigma is to implement a measurement-based strategy that focuses on process improvement and the reduction of qualitative deviations through the implementation of the Six Sigma recovery mechanisms. To this end, Six Sigma uses two sub-processes: DMAIC and DMADV [9], [18].
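The 3.4-defects-per-million criterion above is conventionally expressed as defects per million opportunities (DPMO). A minimal sketch in Python; the counts used for illustration are hypothetical:

```python
def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities (DPMO), the standard Six Sigma metric."""
    return defects * 1_000_000 / (units * opportunities_per_unit)

def meets_six_sigma(defects: int, units: int, opportunities_per_unit: int) -> bool:
    # Six Sigma target: at most 3.4 defects per million opportunities
    return dpmo(defects, units, opportunities_per_unit) <= 3.4

# Hypothetical illustration: 3 defects found across 1,000 units,
# each unit having 500 defect opportunities
print(dpmo(3, 1000, 500))             # 6.0
print(meets_six_sigma(3, 1000, 500))  # False: 6.0 exceeds the 3.4 target
```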
As shown in Figure 2, the Six Sigma DMAIC process is a system for improving the existing process
[9]. This process includes define, measure, analyze, improve, and control stages. DMADV process is an
improvement process that focuses on new processes and products based on the expected quality levels of Six
Sigma [9]. As shown in Figure 3, this process includes define, measure, analyze, design, and verify stages.
The reason why Six Sigma emphasizes measurement is that improvement is impossible without measuring and analyzing its results [19]. That is why these steps appear in both sub-processes of Six Sigma.
Figure 2. Six Sigma DMAIC process
Figure 3. Six Sigma DMADV process
4. COMBINATION OF SCRUM AND SIX SIGMA
The simultaneous use of Scrum and Six Sigma seems to be a logical option for software companies
to enjoy the benefits and values promised in both thoughts simultaneously. The important point is how one
can use Six Sigma in a company or software team that uses Scrum [22], [23]. To this purpose, the
requirements of Six Sigma somehow needed to be used in the Scrum framework.
As the basis of Six Sigma is measurement and measurement-based improvement, this requirement should be reflected in the Scrum process steps. The first step of Six Sigma is the define phase, which can be applied transparently to the contract and the initial requirements of the project implementation, project objectives, financial effects assessment, and the team members' performance. Additionally, the productivity-index-related criteria and the analytical statistics required to determine the scope of the project should be defined simply and clearly [24]. The measure phase is the second phase of Six Sigma. In this phase, the SIPOC diagram is used, prior to the project onset, to determine the relevant elements of the project that need improvement [25]. Each executive element in a Sprint should be documented and
considered based on their inputs and outputs, tasks to be done in them, the persons in charge, and the time
required for each activity [24]. In addition, the cause-effect diagram and analysis should be used for the
problems occurring during Sprint, daily Scrum, review meeting, and retrospective meeting. The input and
output of the development process should also be considered properly. In this section, the appropriate
matrices are required to be used to analyze the profit and cost, and the priorities required by the customer.
Finally, an appropriate measurement and inspection system is needed to be defined to ensure the accuracy of
the data collection process.
The third step of Six Sigma is the analyze phase. The fault tree analysis diagram can be used in this
step to find the root cause of a fault [24]. Moreover, regression analyses can be used to model the relation
between and dependence of the process input and output variables. This can be used to analyze the positive
or negative events that occurred in the development process and make it possible to discover defects and
eliminate them. In the improve phase of Six Sigma, an executive plan is created in which the measures
required to be taken during daily Scrum, review meeting, and retrospective meeting have been specified.
Updating SIPOC and the initial process map by considering the executive plan implementation can be a great
help in this section. In this section, the effectiveness of the taken improvement measures should be evaluated
and analyzed.
In the control phase, a control plan is required to be created based on the critical inputs analyses,
which properly show the result of the performance indices and can be used as a control tool to monitor the
probable improvements occurred in processes and individuals. For Six Sigma to be deployed properly, the
following can be used as executive principles during the scrum development process [13], [14], [24], [25]:
- Implementing DMAIC process in each Sprint
- Using process maps to create Sprint backlogs
- Determining the reason for defects or faults in the review and retrospective meetings
- Prioritizing the product backlog given the improvements expected by the customer
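One way to make these principles concrete is to record which DMAIC stage backs each Scrum event. The mapping below is an illustrative sketch of the integration described above, not a prescription from the paper; all names and descriptions are our own:

```python
# Illustrative mapping of DMAIC stages onto Scrum events within a Sprint,
# loosely following the executive principles listed above.
DMAIC_IN_SPRINT = {
    "define":  "Sprint planning: scope, objectives, and SIPOC-based backlog items",
    "measure": "During the Sprint: record inputs/outputs, owners, and time per task",
    "analyze": "Review/retrospective: fault tree and regression analysis of defects",
    "improve": "Executive plan applied in daily Scrum, review, and retrospective",
    "control": "Control plan monitoring performance indices across Sprints",
}

def dmaic_stages_for(event: str) -> list[str]:
    """Return the DMAIC stages whose activities mention a given Scrum event."""
    return [stage for stage, activity in DMAIC_IN_SPRINT.items()
            if event.lower() in activity.lower()]

print(dmaic_stages_for("retrospective"))  # ['analyze', 'improve']
```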
Overall, it seems that with the use of Six Sigma in the Scrum process, qualitative improvements and Scrum objectives can be properly addressed, and the synergy of these two approaches leads to better achievement of these objectives. There may, however, be overheads to be considered in practical investigations.
5. CASE STUDY
One of the appropriate options to evaluate the combination of methodologies is to use case studies.
In the present study, the hybrid Scrum with Six Sigma was used in a case study. In this case study, the hybrid
method was used in a project in the health domain. The above-mentioned project was implemented in the
form of three separate teams. Table 1 shows the teams' characteristics.
Table 1. Team’s specifications
Team Size Avg. work experience Avg. cooperation in team
Team1 6 4.5 3.1
Team2 5 4.3 3.3
Team3 6 4.7 3.4
The composition of these teams was considered by the senior management of the company before
the project onset. In this composition, attempts were made to keep teams at the same level of workability and
engineering. To reduce the impact of selecting the volunteering team, a draw was made between the three
teams for the hybrid method implementation, and Team1 was selected as the team to be studied. The other
two teams were regarded as the control teams in this research. Team 1 was briefed on the rationale for using the hybrid methodology, its members were provided with the required training, and the forms and procedures required to implement Six Sigma were prepared.
To evaluate the effectiveness of the hybrid methodology, the following variables were considered
and gathered in all three teams during the Sprints implementation.
- Number of discovered defects in each Sprint
- Amount of re-works (the user stories transferred from the previous to the current Sprint)
- Team performance (the ratio of user stories completed in each Sprint to those committed)
- Amount of customer satisfaction (customer self-reporting in the Sprint review meeting)
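As a minimal sketch, the performance and re-work metrics defined above can be computed per Sprint as follows; the Sprint numbers used here are hypothetical, not taken from the case study:

```python
def team_performance(completed_stories: int, committed_stories: int) -> float:
    """Team performance as defined in the study: percentage of user stories
    completed in a Sprint relative to those committed (selected)."""
    return 100.0 * completed_stories / committed_stories

def rework(carried_over_stories: int) -> int:
    """Re-work: user stories transferred from the previous Sprint to the current one."""
    return carried_over_stories

# Hypothetical Sprint: 12 stories committed, 9 completed, 3 carried over
print(team_performance(9, 12))  # 75.0
print(rework(3))                # 3
```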
Considering the contract negotiations with the customer, it should be noted that the length of the
Sprints of this project was considered to be one week. The product backlog was the same in this project, and
each team selected user stories from this backlog based on customer preferences. The nature of the project
was thus the same for all three teams, so the research and its results were not compromised. In the following, the data collected for the research variables are presented.
5.1. Detected defects
From one point of view, the defects discovered in each Sprint indicate the team's accuracy in performing its tasks. The more a team achieves a level of executive quality in which fewer defects are observed, the higher the quality of the development process implementation. Table 2 shows the defects observed during the project Sprints.
The statistics above show that the total number of defects observed in Team 1 is less than that of the other two teams. Moreover, the defects observed in Team 1 followed a downward trend, indicating that this team outperformed the other two. Note that, overall, Team 1 implemented one more Sprint than the other two teams. Figure 4 shows the statistics of the defects discovered in the three teams.
Table 2. Defects observed during the project (in each Sprint)
Team SP1 SP2 SP3 SP4 SP5 SP6 SP7 SP8 SP9 Sum
Team1 5 5 4 3 4 3 3 2 2 31
Team2 4 5 5 4 5 4 4 4 - 35
Team3 5 6 5 3 4 5 6 5 - 39
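The totals in Table 2 can be checked directly, and a simple least-squares slope over the Sprint index confirms the downward trend reported for Team 1:

```python
defects = {  # Per-Sprint defect counts from Table 2 ('-' entries omitted)
    "Team1": [5, 5, 4, 3, 4, 3, 3, 2, 2],
    "Team2": [4, 5, 5, 4, 5, 4, 4, 4],
    "Team3": [5, 6, 5, 3, 4, 5, 6, 5],
}

def slope(ys):
    """Least-squares slope of ys against Sprint index; negative => downward trend."""
    n = len(ys)
    xs = range(n)
    mx, my = (n - 1) / 2, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

totals = {team: sum(ys) for team, ys in defects.items()}
print(totals)                       # {'Team1': 31, 'Team2': 35, 'Team3': 39}
print(slope(defects["Team1"]) < 0)  # True: Team1's defects trend downward
```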
Figure 4. Number of defects detected in the teams
5.2. Re-works
This criterion shows the extent to which teams are required to redo work. Such re-work can result from inspections and from applying the evaluation criteria. In calculating re-work in this research, the number of user stories that a team has to implement again in the next Sprint is considered. The lower the volume of such user stories, the higher the team's accuracy in performing its tasks. Table 3 shows the statistics for this criterion.
As can be observed, the amount of re-work in Team 1 is greater than in the other two teams. This is probably caused by the measurements and evaluations required by Six Sigma for the team under study. In fact, to achieve the Six Sigma objectives, the team sometimes appears to be forced to do additional work. This
6. Int J Elec & Comp Eng ISSN: 2088-8708
A case study of using the hybrid model of scrum and six sigma in … (Mona Najafi Sarpiri)
5347
may be regarded as a negative point, but is justifiable with regard to achieving higher quality and customer
satisfaction. Figure 5 shows the number of re-works in the case study teams.
Table 3. Number of user stories carried over from the previous Sprint (re-work)
Team SP1 SP2 SP3 SP4 SP5 SP6 SP7 SP8 SP9 Sum
Team 1 0 18 25 20 17 25 30 22 0 157
Team 2 0 20 20 10 15 16 14 10 0 105
Team 3 0 15 17 15 10 20 15 14 0 106
Figure 5. Number of re-works in the case study teams
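The re-work totals can be recomputed from the per-Sprint figures in Table 3; a minimal Python sketch using the reported data:

```python
# User stories carried over to the next Sprint (Table 3).
rework = {
    "Team 1": [0, 18, 25, 20, 17, 25, 30, 22, 0],
    "Team 2": [0, 20, 20, 10, 15, 16, 14, 10, 0],
    "Team 3": [0, 15, 17, 15, 10, 20, 15, 14, 0],
}
totals = {team: sum(values) for team, values in rework.items()}
print(totals)  # Team 1 carries the largest re-work volume
```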
5.3. Team performance
One of the other comparison criteria in this research is the team performance. In this research, the
amount of the work done (end-user stories in each Sprint) to the committed work (selected user stories in
each Sprint) ratio was used. The statistics are shown in Table 4.
As can be observed, Team 1's performance is higher than that of the other two teams. This increase suggests that the combined use of Scrum and Six Sigma improves team performance, and the synergy of the two approaches appears effective in pursuing their common objectives. Figure 6 presents these statistics.
Table 4. Team performance of the teams under study (percentage)
Team SP1 SP2 SP3 SP4 SP5 SP6 SP7 SP8 Average
Team 1 78 82 83 84 83 89 87 87 84.1
Team 2 80 76 76 80 78 85 85 - 80.0
Team 3 77 79 83 82 80 80 80 - 80.1
Figure 6. Team performance of the teams under study
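The performance metric described above can be sketched as a small function. The committed and completed counts in the example are hypothetical, since the paper reports only the resulting percentages:

```python
def sprint_performance(completed: int, committed: int) -> float:
    """Percentage of committed user stories actually finished in a Sprint."""
    if committed <= 0:
        raise ValueError("a Sprint must have committed work")
    return 100.0 * completed / committed

# Hypothetical Sprint: 39 of 50 committed user stories were completed.
print(sprint_performance(39, 50))  # 78.0
```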
Int J Elec & Comp Eng, ISSN: 2088-8708, Vol. 11, No. 6, December 2021: 5342-5350
5.4. Customer satisfaction
In this research, the customer was asked about customer satisfaction, which is one of the main
objectives of Scrum and Six Sigma, by several questions via a self-reporting form. Finally, the customer was
asked to express his absolute satisfaction in percentage. Table 5 shows these statistics.
Table 5. Customer satisfaction with the teams under study (percentage)
Team SP1 SP2 SP3 SP4 SP5 SP6 SP7 SP8 SP9 Average
Team 1 85 85 85 85 90 95 90 90 95 88.9
Team 2 85 80 80 85 80 85 85 85 - 83.1
Team 3 75 75 70 80 80 90 85 80 - 79.4
As can be observed, in almost all Sprints the customer satisfaction reported for Team 1 was higher than for the other two teams. This criterion indicates the success of the hybrid methodology in achieving its particular objective of increasing customer satisfaction. The customer satisfaction trend is shown in Figure 7.
Figure 7. Customer satisfaction with the teams under study
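The per-team averages in Table 5 can be reproduced directly from the per-Sprint satisfaction scores (a minimal Python check using the reported data):

```python
# Customer satisfaction reported per Sprint (Table 5, percentages).
satisfaction = {
    "Team 1": [85, 85, 85, 85, 90, 95, 90, 90, 95],
    "Team 2": [85, 80, 80, 85, 80, 85, 85, 85],
    "Team 3": [75, 75, 70, 80, 80, 90, 85, 80],
}
averages = {t: round(sum(v) / len(v), 1) for t, v in satisfaction.items()}
print(averages)  # {'Team 1': 88.9, 'Team 2': 83.1, 'Team 3': 79.4}
```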
6. CONCLUSION
The use of hybrid methodologies in software development is one of the solutions used by software
engineers to achieve greater competitiveness and higher quality of software products. This combination can
include methodologies, approaches, and quality and managerial standards. Today, the use of Scrum has a
particular place in most software teams. However, these teams also attempt to use the strengths of other
standards and methodologies. The Six Sigma methodology is one of the managerial approaches that helps companies achieve higher quality and greater customer satisfaction by improving manufacturing and service processes. In the present study, a combination of Scrum and Six Sigma was proposed and
investigated. This hybrid method was used in a real-world case study and evaluated experimentally. The
results obtained from this case study showed that the above hybrid method led to an increase in team
performance and customer satisfaction. However, the number of discovered defects, the amount of re-work, and the duration of project implementation were higher in the team that used the hybrid method than in the teams that used only Scrum. The increased defects and re-work appear to be caused by the greater accuracy and rigor applied in meeting the quality criteria, and are justifiable from this perspective. Likewise, the overheads of the additional work and team responsibilities introduced by Six Sigma appear to explain the longer project implementation time.
ACKNOWLEDGEMENTS
The authors would like to thank the teams that voluntarily participated in this study. Without their help and support, this study could not have been completed.
BIOGRAPHIES OF AUTHORS
Mona Najafi Sarpiri is a Ph.D. student in software engineering at the Isfahan Branch, Islamic Azad University, Isfahan, Iran. Her research focuses on agile software development and hybrid methodologies.
Taghi Javdani Gandomani, Senior Member, IEEE, received his Ph.D. degree from Universiti Putra Malaysia (UPM), Malaysia, in 2014. He is currently an Assistant Professor of Software Engineering at Shahrekord University, Iran. His research interests include agile software development, software process improvement, software project management, and empirical software engineering. He also leads the Data Science Lab at Shahrekord University.