User Experience-Based Evaluation of Open Source Workflow Systems:
The Cases of Bonita, Activiti, jBPM, and Intalio
Karim Baïna and Salah Baïna
Alqualsadi research team of Enterprise Architecture,
ENSIAS, Université Mohammed V-Souissi, BP 713 Agdal,
Rabat, Morocco
baina@ensias.ma
Abstract— The task of selecting a workflow system is becoming
increasingly complex and risky. For this reason, organisations
require a broad and clear vision of which workflow engines are,
and will continue to be, suitable for changing requirements. This paper
presents a user experience-based evaluation model for workflow systems
to analyse, compare, and select business process management modelling
and enactment systems according to user-specific requirements. After
describing the underlying workflow system evaluation model
itself, we apply it to assess criteria satisfaction on four open source
workflow systems: Bonita, Activiti, jBPM, and Intalio.
Keywords— Business Process Management Systems,
Workflow Management Systems, evaluation criteria, user-centric
evaluation.
I. INTRODUCTION
Enterprise business models are becoming more and more
complex, involving numerous interacting applications within rich
business and technical contexts. Enterprise business processes
therefore inherit the growing complexity of these business models. Since
those business processes sit at the core of enterprise business
strategy, they need workflow engines that suitably handle the
continuously growing complexity of business process management.
With their low capital expenditure (CAPEX), Free and Open
Source Software (FOSS) is a very serious opportunity for
emerging countries. More than an IT strategy, FOSS is a real
business strategy with well-established business models and
organisation. FOSS will enable emerging countries to develop
knowledge through open innovation, brand equity, and a real local
software industry, and to save foreign currency. Without
FOSS, emerging countries will remain dependent on the same IT
software suppliers, which represents a concrete risk for their autonomy
and sustainable, durable development [1].
We have been observing and studying many interesting FOSS
tools that represent an increasing opportunity, among others open source
enterprise architecture tools [2] and open source workflow systems
[3], [4]. We have been working with and observing the evolution
of the first open source workflow systems for 15 years, when neither
maturity nor executability were really comfortable.
A few years ago, jBPM and Intalio BPMS became two pioneering
examples of open source products gaining more and more attention
and popularity [5]. The perseverance and innovation of open source
communities around the world on the one hand, and the involvement and
partnership of software development companies on the other,
made those two examples only the first of a very valuable
list of experiences that led to very competitive tools and
environments.
Nowadays, open source workflow management systems
(WfMS)¹ are becoming numerous and interesting in terms of
functionality, which makes the task of selecting an open source
WfMS complex and risky for the enterprise. For this reason,
organisations require a broad and clear vision of which
workflow engines are, and will continue to be, suitable for
changing requirements. The selection of such workflow engines is
strategic for today's organisations.
Our contribution is to present our workflow system evaluation
model and to apply it to four open source workflow
systems: Bonita, Activiti, jBPM, and Intalio. The paper is organised
as follows: section II discusses related work, section III
presents our workflow engine evaluation model, section IV
illustrates the approach through the results of our workflow
engine evaluation experiment, and section V concludes
and gives an outlook for this work.
II. RELATED WORKS
Many other works have studied the problem of producing a
generic workflow engine evaluation model, but none of them
has proposed as rich a workflow evaluation model as we propose
(35 evaluation sub-criteria). [6] compares workflow engines
according to their process definition metamodels through eight
criteria: Granularity, Control Flow, Data Flow, Organisational
Model, Role Binding, Exception Handling, Transaction
Support, and Commitment Support. [7] compares workflow
engines according to the richness and expressivity of their
organisational and process meta models. [8] tackles the comparison
problem from the point of view of the ability of workflow
engines to support workflow patterns. [9] bases its evaluation
of scientific workflow engines on four criteria:
¹ In the remainder of this paper, the terms workflow engine (WFE),
workflow management system (WfMS), and business process management
system (BPMS) will be used interchangeably.
Workflow Design, Workflow Scheduling, Fault Tolerance,
and Data Movement. [10] focuses its evaluation on a subset
of five sub-criteria of the software quality criteria of [11]:
Reliability, Usability, Efficiency, Maintainability, and
Portability.
We ourselves conducted an evaluation experiment in
2007 that highlighted the top three WFEs at that time, jBPM,
OpenWFE, and Enhydra Shark, among a set of 35 studied open
source workflow engines [3], [4], which was profitable for both
the BPM open source community and BPM research. In fact, our
work influenced and drove for many years a large
BPM research community (e.g. [12], [13], [14], and their
citing papers) to be more attracted by BPM open source
benchmarking within such a large open source community,
and to focus later only on the top three WFEs, jBPM, OpenWFE,
and Enhydra Shark, with newer visions, instead of
being lost within such a fertile but not yet mature production at that
time.
III. WORKFLOW ENGINES EVALUATION MODEL
Our workflow engine evaluation model is presented in
three steps: first, section III-A presents the chosen criteria for
workflow engine evaluation, then section III-B shows how key
performance indicators are used to evaluate the presented criteria, and
finally section III-C summarises our approach through the
presentation of a workflow engine evaluation metamodel.
A. Workflow engines evaluation criteria model
We have been working for many years on an evaluation model
built upon 31 criteria and sub-criteria grouped in three
axes: (1) 12 executability criteria clustered in 3 sub-axes, (1.1)
API & GUI support, (1.2) Execution Properties, and (1.3)
Software Quality based Factors; (2) 10 vision criteria clustered in
2 sub-axes, (2.1) Model Richness, and (2.2) Operations & statistics
support; and (3) 9 contextual criteria clustered in 3 sub-axes, (3.1)
Applicative architecture, (3.2) Technical architecture, and (3.3)
Licensing and Affiliation [4]. Those criteria and their sub-criteria
were built with a bottom-up approach, gathering criteria proposed by
WARIA (Workflow And Reengineering International Association)
[15] or highlighted by workflow research [16], [17], [18], [19],
[20], [21], software quality factors defined by McCall et al.
[11], and finally auxiliary contextual properties of workflow
engines.
Our new evaluation model aims to be more user oriented,
addressing both end-users and developers, with a top-down approach.
It is built upon 34 new criteria clustered in three sub-classes:
(1) 21 executability criteria, (2) 12 vision criteria, and (3) 14
contextual criteria². These new criteria are summarised in figure
1 and described below.
We would like to emphasise that our criteria are a synthesis of
the most used workflow system evaluation criteria, and that our user
experience-based evaluation marks result both from our own
experience and from user experience (developers and end-users)
reported on the blogosphere [22], [23], [5], [24].
² As you may notice from figure 4, some sub-criteria fit more than
one evaluation axis.
1) Evaluation Axes: Figure 1 shows evaluation model axes (i.e.
criteria classification).
2) Evaluation Criteria:
• I1 - Overall Functionality : User appreciation of
the overall completeness of the functionalities of the
platform.
• I2 - Overall Usability of Tools : User appreciation
of the overall usability and reliability of the platform.
• I3 - License : Openness of the license of the tool, based on
a comparison of various features of the licenses of the four
tools, taking into account the conditions of each license.
• I4 - Standard Respect : Overall appreciation of standards
respect.
• I5 - Process Modelling Language (BPMN) : Richness
and completeness of the modelling language and full
compliance with the BPMN standard.
• I6 - Process Execution Language : Compliance with
standards and formats for process execution.
• I7 - Support of Wf-XML : Implements (or not) the Wf-XML
interaction format.
• I8 - Support of XPDL : Import and export functionalities
in the standard XPDL format.
• I9 - Process Engine (PVM) : Embeds a Process Virtual
Machine with an adequate API.
• I10 - Database, Versioning and Transactions Management :
Functionalities and components of the embedded
database management solution.
• I11 - Dynamic Deployment : Enables dynamic deployment
of new versions of processes with automatic
upgrade of existing instances.
• I12 - Process Designer for dev : Technical environment
and tools for advanced developers.
• I13 - Integration with Development Frameworks :
Facilities to integrate processes with development frameworks,
application servers, and portals.
• I14 - Application Development (API, Debug & Tests) :
Development environment, API, and facilities for implementing
processes.
• I15 - Form Designer : Tools to easily define human forms
for web applications.
• I16 - Process Design for Business users : High abstraction
level tools to define and design processes.
• I17 - Process Administration & Monitoring tools :
Quality of the provided tools for process instantiation and
instance management from an administration view.
• I18 - BAM : Embeds or indicates tools to handle business
activity monitoring.
• I19 - BRE : Embeds or indicates an engine to execute and
manage business rules.
Fig. 1. Workflow Engine Evaluation Criteria Axes
• I21 - WS/Human Task Support : Implements the
WS-HumanTask standard.
• I22 - Content Management Integration : Offers connectors
to interoperate with the most popular content management
tools (CRM, ERP, CMS, reporting, etc.).
• I23 - Connectivity : Offers connectors to integrate
processes with other components of enterprise information
systems.
• I24 - Simulation : Offers tools to simulate the execution of
processes.
• I25 - Rapid Dev : Offers an environment for agile
development (prototyping, forms generation, application
generation).
• I26 - SOA Friendly : Ready-to-use interface to communicate
with an ESB.
• I27 - RESTful : Implements the REST protocol for
machine-to-machine interaction.
• I28 - Resource Management (LDAP Friendly) : Ready-to-use
interface to communicate with an LDAP repository.
• I29 - Maturity : User-centric appreciation of the maturity
of the environment and its components.
• I30 - Performance : Appreciation of performance and
resource usage (CPU/memory).
• I31 - Adoption/Popularity : Number of downloads and
user satisfaction.
• I32 - Release Activity : Frequency of bug correction and
release of new versions.
• I33 - Resources and Community : References, books,
case studies.
• I34 - Success Stories : Projects that have resulted in
substantial achievements with regard to the tool.
B. Workflow engines evaluation key performance indicators model
We can divide the presented criteria into two categories:
those one may evaluate through a mark, and those which are
contextual and less subject to an effective mark. For example,
what would a mark assigned to Licensing and Affiliation mean?
We consider that contextual criteria need not be marked,
while execution properties and vision criteria must be. Each
high-level criterion is seen as a hierarchical marked criterion that
aggregates the marks of all its marked sub-criteria through a
criterion formula (in the same vein as the roll-up aggregation
operation of multidimensional databases). In the following, the
aggregation formulas of each hierarchical marked criterion given in
section III-A are presented and explained. The criterion
mark of a hierarchical marked criterion is computed as the
uniform arithmetic average of its marked sub-criteria (it has been
chosen to assign all atomic marked sub-criteria the same
weight = 1).
Fig. 3 shows the evaluation criteria Axis/Criteria aggregation
formulas.
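The uniform-average roll-up just described can be sketched as follows; only the aggregation rule (equal weight = 1 for every atomic marked sub-criterion, averaged up the hierarchy) comes from the model, while the example criteria and marks are purely hypothetical.

```python
# Sketch of the hierarchical marked-criterion roll-up described above.
# A criterion is either atomic (a 0-5 mark) or hierarchical (a list of
# sub-criteria); a hierarchical mark is the uniform arithmetic average
# (weight = 1) of its direct sub-criterion marks.

def criterion_mark(criterion):
    """Return the mark of a criterion, rolling up sub-criteria if needed."""
    if isinstance(criterion, (int, float)):      # atomic marked criterion
        return float(criterion)
    sub_marks = [criterion_mark(sub) for sub in criterion]
    return sum(sub_marks) / len(sub_marks)       # uniform average, weight = 1

# Hypothetical example: an axis with two sub-criteria, one of which is
# itself hierarchical (marks are illustrative only, not from the paper).
executability = [4.0, [3.0, 5.0]]
print(criterion_mark(executability))             # -> 4.0
```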
C. Workflow engines evaluation metamodel
Now that we have presented the criteria and their evaluation, we
can present our workflow engine evaluation meta-model,
which is composed of 6 meta-concepts, as shown in Fig. 2.
Fig. 2. Evaluation criteria metamodel
Fig. 3. Evaluation criteria Axis/Criteria aggregation formulas
• WFE meta-concept : represents a workflow engine entity
with its WFE_ID and WFE_NAME;
• CRITERION meta-concept : represents an abstract workflow
engine criterion entity with its CRITERION_ID,
CRITERION_NAME, CRITERION_DESCRIPTION,
and given evaluation CRITERION_COMMENT;
• CONTEXTUAL_CRITERION meta-concept : represents
a concrete workflow engine textual criterion entity that
describes the evaluation result of a WfMS according to a
contextual property;
• HIERARCHICAL_CONTEXTUAL_CRITERION meta-concept :
represents a hierarchical concrete workflow
engine contextual criterion entity that is composed of
many CONTEXTUAL_CRITERION entities;
• MARKED_CRITERION meta-concept : represents a
concrete workflow engine criterion entity with its
given floating-point evaluation CRITERION_MARK;
• HIERARCHICAL_MARKED_CRITERION meta-concept :
represents a hierarchical concrete workflow engine marked
criterion entity that is composed of many
MARKED_CRITERION entities. This hierarchy aggregates
the set of all its SUB_MARKED_CRITERION.CRITERION_MARK
values through a CRITERION_FORMULA (in the same vein as
the roll-up aggregation operation of multidimensional
databases).
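As a minimal illustration of the six meta-concepts, the metamodel could be sketched as Python dataclasses; the attribute names follow the metamodel (WFE_ID, CRITERION_MARK, ...), while the class bodies and example values are assumptions of ours, not the paper's implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class WFE:                                  # a workflow engine under evaluation
    wfe_id: str
    wfe_name: str

@dataclass
class Criterion:                            # abstract criterion
    criterion_id: str
    criterion_name: str
    criterion_description: str = ""
    criterion_comment: str = ""

@dataclass
class ContextualCriterion(Criterion):       # textual, unmarked evaluation
    text: str = ""

@dataclass
class HierarchicalContextualCriterion(Criterion):
    children: List[ContextualCriterion] = field(default_factory=list)

@dataclass
class MarkedCriterion(Criterion):           # carries a floating 0-5 mark
    criterion_mark: float = 0.0

@dataclass
class HierarchicalMarkedCriterion(Criterion):
    children: List[MarkedCriterion] = field(default_factory=list)

    @property
    def criterion_mark(self) -> float:
        # CRITERION_FORMULA: uniform average of sub-criterion marks,
        # analogous to a multidimensional roll-up aggregation.
        return sum(c.criterion_mark for c in self.children) / len(self.children)

# Hypothetical usage: an executability axis rolled up from two marks.
axis = HierarchicalMarkedCriterion("A1", "Executability", children=[
    MarkedCriterion("I1", "Overall Functionality", criterion_mark=4.0),
    MarkedCriterion("I2", "Overall Usability of Tools", criterion_mark=3.0),
])
print(axis.criterion_mark)                  # -> 3.5
```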
D. Evaluation Axis/Criteria Correspondence:
Fig. 4 shows the evaluation axis/criteria correspondence matrix.
IV. WORKFLOW ENGINES EVALUATION MODEL EXPERIMENT
A. Workflow engines evaluation process
The evaluation process assigns to each studied workflow
engine its evaluation information (i.e. the set of contextual and
marked criteria). The evaluation information is based upon
studying their related research papers, marketing white papers,
and slides, and setting up and technically testing all engines against
a complete functional case study. The case study was
based on six different processes :
(P1) ISO 9002 preventive, and corrective actions process
(P2) purchase management process
(P3) recruitment process
(P4) expense report reimbursement process
(P5) loan management process
(P6) leave application process
The evaluation of a workflow engine is achieved by giving
subjective marks: for each WFE and CRITERION, a floating-point
evaluation MARK is given from 0 to 5 (0 is the worst mark and 5
the best), then aggregated marks and key performance
indicators are computed on the basis of this basic evaluation
information.
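The marking and aggregation step above could be sketched as follows; the 0-5 range and the uniform-average aggregation come from the model, but all marks in the example are hypothetical placeholders, not the paper's actual results.

```python
# Sketch of the marking step: each (WFE, criterion) pair receives a
# floating mark in [0, 5]; engines are then ranked by the uniform
# average of their criterion marks.

def validate_mark(mark: float) -> float:
    """Reject marks outside the 0-5 scale used in the evaluation."""
    if not 0.0 <= mark <= 5.0:
        raise ValueError("marks must lie in [0, 5]")
    return mark

def rank(marks_per_wfe: dict) -> list:
    """Rank engines by the uniform average of their criterion marks."""
    averages = {
        wfe: sum(validate_mark(m) for m in marks) / len(marks)
        for wfe, marks in marks_per_wfe.items()
    }
    return sorted(averages, key=averages.get, reverse=True)

# Hypothetical marks for two criteria per engine (illustrative only).
hypothetical = {"Bonita": [4, 5], "jBPM": [4, 4],
                "Activiti": [3, 4], "Intalio": [2, 3]}
print(rank(hypothetical))   # -> ['Bonita', 'jBPM', 'Activiti', 'Intalio']
```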
B. Workflow engines subject of evaluation
Without loss of generality, we present an experiment of
our evaluation model on four open source workflow systems :
Bonita, Activiti, jBPM, and Intalio.
1. jBPM [25] : under Apache/Eclipse/MIT licences, jBPM is
the first historical reference in the domain of open
source BPM and workflow tools; it has been promoted
by JBoss as the leader of open source business process
engines. In the latest version, JBoss made huge efforts to
integrate a business rules management system into jBPM;
these efforts led jBPM to meet Drools.
jBPM is based on the BPMN 2.0 specification and supports
the entire life cycle of the business process (from
authoring through execution to monitoring and
management). jBPM offers open-source business process
execution and management, including (i) an embeddable,
lightweight Java process engine supporting native BPMN
2.0 execution, (ii) BPMN 2.0 process modelling in
Eclipse (developers) and the web (business users), (iii)
process collaboration, monitoring, and management through
the Guvnor repository and the web console, (iv) human
interaction using an independent WS-HT task service, and (v)
tight, powerful integration with business rules and event
processing.
In order to continue implementing their own philosophy
independently from JBoss directives, the initial creators of jBPM
made a fork called Activiti, based on the fourth version of jBPM (jBPM4).
Fig. 5. Experimenting the (P5) loan management process on jBPM4 (integrated with
Alfresco) [26]
2. Activiti [27] : under an Apache licence, Activiti is a
lightweight workflow and Business Process Management
(BPM) platform targeted at business people, developers, and
system administrators. Its core is a fast and solid BPMN 2 process
engine for Java. It is a full open-source solution
distributed under the Apache license. Activiti runs in any Java
application, on a server, on a cluster, or in the cloud. It
integrates perfectly with Spring, is extremely lightweight,
and is based on simple concepts. Activiti acknowledges that
executable business processes must be applicable as a
component in everyday software development; that is
why Activiti spends a lot of effort making sure it can be
used very easily in *every* Java environment, including
the cloud, as many applications will soon be written for the
cloud. Activiti's primary purpose and focus is to implement
the general purpose process language BPMN 2.0, but also to
support any custom process languages built on top of it.
Fig. 6. Experimenting the (P4) expense report reimbursement process on Activiti.
3. Bonita [28] : under a GPL licence for the Bonita Open Solution (BOS)
studio and an LGPL licence for the BOS execution engine, Bonita was
initially developed by Bull, and it has recently been
spun off under the control of an independent company,
BonitaSoft. Bonita is available in two packages: an
open source community edition and a commercial
one. Bonita's purpose is to offer a simple, intuitive, and
graphical solution that supports the development of ready-to-use
BPMN process-based applications.
Fig. 7. Experimenting the (P4) expense report reimbursement process on Bonita.
4. Intalio [29] : under Intalio|BPMS Community &
Commercial Edition licences, the last tool in this selection is
Intalio, also available in two packages (community and
commercial). Intalio's creators present the tool as the modern
way to build business applications. The aim of Intalio is
to provide tools and technologies that businesses use to
build engaging, agile business applications with the
modern capabilities that users demand, including
collaboration, process automation, mobility, and cloud
enablement. It is clear that the commercial strategies of
Intalio make it less and less compatible with open source
licenses and philosophy. Involving such a tool in our
experiment helps us compare full open source solutions
supported only by a community with commercially oriented
tools.
Fig. 8. Experimenting the (P2) purchase management process on Intalio.
The study presented in this paper is the result of a detailed and
time-consuming analysis of the functionalities and features of each one
of these tools. Now that the comparison criteria have been defined,
future work could easily integrate other BPM open source and
open-source-like tools (ProcessMaker, Enhydra Shark, Orchestra, Runa,
etc.).
C. Workflow engines evaluation results
Figures 9 and 10 and the following table classify graphically,
and describe textually, the four WfMS in detail.
Our evaluation experiment results indicate that, generally, the
priorities taken into account by open source workflow engine
software editors are nowadays: (1) API & GUI support is the
most common functionality; then (2) model richness; then (3) the
execution properties; then (4) software quality factors; and finally
(5) operations and statistics.
The interpretation of our results can be made along the three
main axes of the analysis: Executability, Vision, and Context.
Axis I - Executability: after analysing the results of the
comparison process, the following ranking in the executability axis
has been validated: (1) Bonita, (2) jBPM, (3) Activiti, and finally (4)
Intalio. Unsurprisingly, the very high quality of the BonitaStudio
designer and its high usability put it at the top of our
ranking. At the other extremity of the ranking, Intalio's choice not
to implement some of the most important standards (e.g. XPDL),
and also its very limited connectivity in the
community edition (connectors are meant to be non-free and also
very expensive), placed it in the last position of this analysis.
Axis II - Vision & Innovation: the results achieved in this axis
give the following ranking: (1) Bonita, (2) Activiti, (3) jBPM, (4)
Intalio. The evaluation marks obtained by the four tools are very
similar; what should particularly be noticed is that Activiti has a good
vision ranking, which could make it a strong challenger in
the next few years. Indeed, despite its low executability mark (in
comparison to Bonita and jBPM in the Axis I analysis), Activiti's vision
benefits from its developers' orientation towards being as
standard compliant as possible; Activiti also benefits from a
clear orientation towards SOA in order to achieve a BPM/SOA
convergence solution. At the end of the list, we find Intalio again.
Intalio suffers from its very constraining license and its lack of
willingness to integrate with other components to build a full BPM tool
instead of being simply a workflow engine.
Axis III - Context: due to their wide communities,
documentation sources, and books, but also the numerous forums and
blogs dedicated to their issues and features, Bonita and jBPM seem to
be very popular and very well adopted by open source
developers. Many years of development and improvements have made
Bonita, jBPM, and Intalio recognised as very mature tools. However,
as we stated before, Activiti has all the characteristics of an outsider
that could change the status quo in the next few years.
CONCLUSION
This paper has presented our workflow system evaluation
model and an experiment on four open source workflow engines :
Bonita, Activiti, jBPM, and Intalio. Our evaluation experiment
has revealed preliminary user experience-based classifications
among the top four open source workflow systems.
In summary, we believe that, once the research and development
work on the aspects described above has been completed, this
approach will result in a comprehensive platform that can
substantially reduce (i) the WfMS presentation and understanding effort
for both editors and customers, and (ii) the WfMS selection effort, and
therefore foster the widespread adoption of either open source or
commercial workflow technology.
Fig. 9. Aggregation Table, and 2D Bull evaluation results examples
Fig. 10. Radar Graph evaluation results examples
REFERENCES
[1] K. Baïna, “FOSS as IT & Biz strategy for emergent countries - Moroccan
case study, at Software Freedom Day Casablanca, Morocco,” September 22,
2012. [Online]. Available: http://fr.slideshare.net/kbaina/software-
freedomdayfossasitbizstrategy
[2] K. Baïna, “Quels Outils pour l’Architecture d’Entreprise dans le
monde du Logiciel Libre ?” May, 7 2010. [Online].
http://fr.slideshare.net/kbaina/enterprise-architecture-and-open-source
[3] K. Baïna, F. Z. Azayite, N. Belakbir, H. Srir, and K. Benali, “Workflow
engines comparison model,” in Proceedings of the First International
Conference on Research Challenges in Information Science (RCIS’07),
Ouarzazate, Morocco, C. Rolland, O. Pastor, and J.-L. Cavarero, Eds.,
April 23-26, 2007, pp. 289–296.
[4] K. Baïna, “WFESelector - A tool for comparing and selecting workflow
engines,” in Proceedings of the Ninth International Conference on
Enterprise Information Systems (ICEIS’07) (1), Volume DISI, Funchal,
Madeira, Portugal, J. Cardoso, J. Cordeiro, and J. Filipe, Eds., June 12-
16, 2007, pp. 330–337.
[5] P. Nie, R. Seppälä, and M. Hafrén, “Open Source Power on BPM - A
Comparison of JBoss jBPM and Intalio BPMS, T-86.5161 Special
Course in Information Systems Integration,” 2011,
http://jannekorhonen.fi/project_report_final_BPMS.pdf.
[6] K. Lei and M. Singh, “A comparison of workflow metamodels,” 1997.
[Online]. http://citeseer.ist.psu.edu/lei97comparison.html
[7] M. Rosemann and M. zur Muehlen, “Evaluation of Workflow
Management Systems - a Meta Model Approach,” Australian Journal of
Information Systems, vol. 6, no. 1, pp. 103–116, 1998.
[8] W.M.P. van der Aalst, A.H.M. ter Hofstede, B. Kiepuszewski, and A.P.
Barros, “Workflow patterns,” Distrib. Parallel Databases, vol. 14, no. 1,
pp. 5–51, 2003.
[9] J. Yu and R. Buyya, “A taxonomy of scientific workflow systems for
grid computing,” SIGMOD Record, vol. 34, no. 3, pp. 44–49, 2005.
[10] K. Stoilova and T. Stoilov, “Comparison of workflow software products,”
in International Conference on Computer Systems and Technologies,
CompSysTech’2006, 2006.
[11] J. McCall, P. Richards, and G. Walters, “Factors of software quality,”
NTIS, vol. 3, Nov. 1977.
[12] P. Wohed, N. Russell, A. H. M. ter Hofstede, B. Andersson, and W. M. P.
van der Aalst, “Open source workflow: A viable direction for BPM?” in
Advanced Information Systems Engineering, 20th International
Conference, CAiSE 2008, Montpellier, France, June 16-20, 2008,
Proceedings, ser. Lecture Notes in Computer Science, Z. Bellahsene and
M. Léonard, Eds., vol. 5074. Springer, 2008, pp. 583–586.
[13] P. Wohed, N. Russell, A.H.M. ter Hofstede, B. Andersson, and
W.M.P. van der Aalst, “Patterns-based evaluation of open source BPM
systems: The cases of jBPM, OpenWFE, and Enhydra Shark,” Information
& Software Technology, vol. 51, no. 8, pp. 1187–1216, 2009.
[14] P. Wohed, B. Andersson, and P. Johannesson, “Open source workflow
systems,” in Modern Business Process Automation - YAWL and its
Support Environment, A. H. M. ter Hofstede, W. M. P. van der Aalst,
M. Adams, and N. Russell, Eds. Springer, 2010, pp. 401–434.
[15] “WARIA workflow comparative study,”
http://www.waria.com/books/study-2004.htm.
[16] M. Reichert and P. Dadam, “A Framework for Dynamic Changes in
Workflow Management Systems,” in 8th International Workshop on
Database and Expert Systems Applications (DEXA’97), R. Wagner, Ed.
Toulouse, France: IEEE Computer Society Press, September 1-2, 1997, pp.
42–48.
[17] A. Bernstein, C. Dellarocas, and M. Klein, “Towards adaptive workflow
systems: CSCW-98 workshop report,” SIGGROUP Bull., vol. 20, no. 2, pp.
54–56, 1999.
[18] N. Russell, W.M.P. van der Aalst, A. H. M. ter Hofstede, and D.
Edmond, “Workflow resource patterns: Identification, representation and
tool support,” in CAiSE, 2005, pp. 216–232.
[19] W. Gaaloul, K. Baïna, and C. Godart, “Towards Mining Structural
Workflow Patterns,” in 16th International Conference on Database and
Expert Systems Applications (DEXA’05), K. V. Andersen, J. K.
Debenham, and R. Wagner, Eds., vol. 3588. Copenhagen, Denmark:
Springer-Verlag, August 22-26, 2005, pp. 24–33.
[20] K. Baïna, K. Benali, and C. Godart, “DISCOBOLE: A service
Architecture for Interconnecting Workflow Processes,” Computers in
Industry - Special issue on Collaborative Environments for Concurrent
Engineering, Elsevier Science Publisher, pp. 768–777, Dec. 2006.
[21] W. Gaaloul and C. Godart, “Mining workflow recovery from event
based logs,” in Business Process Management, W. M. P. van der Aalst, B.
Benatallah, F. Casati, and F. Curbera, Eds., vol. 3649, 2005, pp. 169–185.
[22] J. A. Maaref, “Evaluation des systèmes de gestion de règles et de flux de
travail pour la Chaire de logiciel libre - Finance sociale et solidaire,”
Master’s thesis, Université du Québec à Montréal, 2010, maîtrise en
informatique.
[23] J. Carhuatocto, “jBPM, Bonita, Intalio, ProcessMaker,
Activiti. Qué BPM Suite uso?,” July 2011,
http://holisticsecurity.wordpress.com/2011/07/21/jbpm-bonita-intalio-
processmaker-activiti-que-bpm-suite-uso/.
[24] J. Denanot, “Panorama des moteurs BPM/workflow open source,” June 2012,
http://www.improve-technologies.com/2012/06/27/panorama-des-moteurs-
bpmworkflow-open-source/.
[25] “jBPM,” http://www.jbpm.org.
[26] “Alfresco.” [Online]. Available : http://www.alfresco.com/fr
[27] “Activiti,” http://www.activiti.org.
[28] “Bonita Open Solution,” http://www.bonitasoft.com.
[29] “Intalio,” http://www.intalio.com/.