The SysML modeling tools reviewed on this site for Model-Based Systems Engineering (MBSE) applications are evaluated using weighted evaluation criteria. This article describes the MBSE modeling tool evaluation criteria applied, and shows how they can be weighted to customize tool evaluations to meet team- and project-specific needs.
The following evaluation criteria separate functional from non-functional tool features. Among functional features, they distinguish diagram drawing functions, which can be supported by templates for popular drawing tools (e.g., Visio, OpenOffice Draw, iDraw), from modeling-tool-specific functions, such as Simulation & Execution functions.
We evaluate modeling tool Usability while exercising the tool Functional Features described below. Usability factors that we evaluate include, but are not limited to: User Interface (UI) design, UI learnability, UI efficiency (measured as the number of UI gestures per function), UI response times, context-sensitive help, and technical documentation.
Functional features that we evaluate are categorized into Drawing features and Simulation & Execution features:
We evaluate basic, intermediate, and advanced drawing features of modeling tools. Basic drawing features that we assess include, but are not limited to: opening/saving models, opening/saving diagrams, drawing model elements and relationships, editing properties of elements and relationships, and multiple undo/redo capabilities. Intermediate drawing features that we assess include, but are not limited to: support for recursive design techniques, requirements traceability across diagram types, generating Allocation tables from diagrams, and automated document generation. Advanced drawing features that we assess include, but are not limited to: model management and tool customizability via profiles, options, and scripts.
Simulation and execution features that we evaluate for SysML modeling tools include, but are not limited to: simulating Parametric diagrams, enforcing constraints and well-formedness rules, and code generation from behavioral diagrams (Activity, State Machine, Sequence).
We evaluate tool compliance with SysML notation (syntax) and semantics as defined by the current SysML 1.x specification. We also evaluate the interoperability of XMI files generated for model import/export, and the interoperability of SysML diagrams with UML diagrams for modeling projects where Systems and Software Engineers require mixed-language usage.
We evaluate general tool support to include installation, set up, update mechanisms, and technical support. We also evaluate team modeling support for project sharing, versioning, and user permissions.
We calculate Value as a function of feature completeness and quality divided by price, in order to rank tools in relative order.
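One way to operationalize this Value criterion is sketched below. The combining function and the figures are illustrative assumptions, not the site's actual formula: it treats the feature score as the product of completeness and quality (each on a 0-1 scale) and divides by price.

```python
def value_score(completeness, quality, price):
    """Hypothetical Value metric: a feature score (completeness x quality,
    each on a 0-1 scale) divided by price. Higher is better; intended
    only for relative ranking of tools, not as an absolute measure."""
    if price <= 0:
        raise ValueError("price must be positive")
    return (completeness * quality) / price

# Illustrative comparison of two hypothetical tools:
tool_a = value_score(completeness=0.9, quality=0.8, price=1500.0)
tool_b = value_score(completeness=0.7, quality=0.9, price=500.0)
print(tool_a < tool_b)  # the cheaper tool ranks higher here
```

Under this sketch, a feature-rich but expensive tool can rank below a slightly less capable but much cheaper one, which matches the intent of a relative value ranking.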
After you have specified your selection criteria, you should assign numerical weights to them to quantify their relative importance. Assigning relative numerical weights to your criteria allows you to tailor your tool selection to specific team and project needs, and it can also reduce evaluator bias. For example, consider weighting the evaluation criteria described above as follows:
\[Score_{tool} = \overline{x} = \frac{\sum_{i=1}^{n} w_i \, c_i}{\sum_{i=1}^{n} w_i} \]
where the Tool Evaluation Score (\(Score_{tool}\)) is the weighted mean (\(\overline{x}\)) of the following six (6) evaluation criteria (\(c_{1}\) = Usability, \(c_{2}\) = Functionality: Drawing, \(c_{3}\) = Functionality: Simulation & Execution, \(c_{4}\) = Standards & Interoperability, \(c_{5}\) = Technical & Team Modeling Support, \(c_{6}\) = Value) with corresponding weights (\(w_{1}\) = 15%, \(w_{2}\) = 15%, \(w_{3}\) = 15%, \(w_{4}\) = 15%, \(w_{5}\) = 10%, \(w_{6}\) = 15%).
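The weighted mean above can be computed with a short script. This is a minimal sketch: the criterion scores (on an assumed 0-10 scale) are hypothetical placeholders, not actual tool ratings from this site.

```python
# Sketch: weighted-mean tool evaluation score.
# Weights follow the example in the text; criterion scores are
# hypothetical placeholders on an assumed 0-10 scale.

weights = {
    "Usability": 0.15,
    "Functionality: Drawing": 0.15,
    "Functionality: Simulation & Execution": 0.15,
    "Standards & Interoperability": 0.15,
    "Technical & Team Modeling Support": 0.10,
    "Value": 0.15,
}

scores = {
    "Usability": 7.0,
    "Functionality: Drawing": 8.0,
    "Functionality: Simulation & Execution": 6.0,
    "Standards & Interoperability": 9.0,
    "Technical & Team Modeling Support": 7.0,
    "Value": 8.0,
}

def tool_score(weights, scores):
    """Weighted mean of criterion scores; dividing by the weight
    total keeps the result on the same 0-10 scale even when the
    weights do not sum to exactly 1.0."""
    total_weight = sum(weights.values())
    return sum(w * scores[c] for c, w in weights.items()) / total_weight

print(round(tool_score(weights, scores), 2))
```

Normalizing by the weight total is a design choice: it lets you assign any convenient relative weights and still obtain a score comparable across tools.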
The weights that you apply to your own MBSE SysML tool evaluations should be adjusted for your team and project modeling needs.