
Model-driven approach in the lifecycle

On the V-diagram it is convenient to discuss the classic slogan of systems engineering: as much work as possible should be moved from the right side of the diagram to the left. Everything that can be done at the system-description stage should be done exactly at that stage: it is much cheaper to operate with bits than with atoms, especially for complex, expensive systems such as an airplane or a nuclear power plant. Here are INCOSE's data on the cost of correcting an error depending on the life-cycle stage at which it is detected[1]:


Stage of error detection    Correction cost
Requirements                ×1 (reference unit)
Design                      ×5
Construction                ×12
Verification                ×40
Operation                   ×250


Considering that the leading practice of system description is modeling, the conclusion is: model as much as possible, in as many different ways as possible, before committing to actual system realization, so that inevitable mistakes are caught before they require expensive rework. Think, model many times, and only then implement.

Today two points are added to this:

  • Modeling needs to be done many times, not just once under firmly fixed requirements: the system concept keeps changing during the project, and you cannot stop life. "Freezing requirements" is now considered bad practice; good practice is to test hypotheses about the usage concept, adjust based on the test results, test again, and thus continuously evolve the system after the MVP release. Before that, the system also needs to be broken down into modules using architectural knowledge (architecture has become a discipline separate from development, with its own modeling, whose point is to break development into maximally independent parts).
  • And modeling needs to be done many times not only for the whole system but also for its individual parts. If the system is correctly divided into parts (maximally independent, yet still interacting to achieve the emergent properties of the system as a whole) and their interaction is well organized, then rework can be confined to just a part of the system. The system then needs to be developed continuously, reworking its individual parts in the physical world, which takes time ("continuous everything": continuous development, continuous manufacturing, continuous testing, continuous deployment, continuous commissioning of new versions of the system with new features).

The amount of modeling has thus not decreased but increased: "model a lot, implement once" is now simply followed many times over.

"Modeling in the broad sense is the cost-effective use of something in place of something else for some cognitive purpose. It allows us to use something that is simpler, safer or cheaper than reality instead of reality for some purpose. A model represents reality for the given purpose; the model is an abstraction of reality in the sense that it cannot represent all aspects of reality. This allows us to deal with the world in a simplified manner, avoiding the complexity, danger and irreversibility of reality."[2].

We do not waste energy on discussing and processing unnecessary details of the modeled object. Models are "proper simplifications." We discuss only what has been modeled, what is important, what is needed.

A formal (mathematics-based) model can be checked for formal correctness, manually or even by computer; the latter is called model checking. For instance, for a radio circuit one can formally verify that all its components are connected, none are left floating, and every connection is continuous (i.e., does not lead to nowhere).
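A minimal sketch of such a check in Python, using a hypothetical netlist format (component name mapped to the nets its pins attach to). Real EDA tools do far more, but these two checks are exactly of this formal kind: no net "leading to nowhere," and no isolated islands of components.

```python
from collections import defaultdict, deque

# Hypothetical netlist: component -> list of nets its pins attach to.
netlist = {
    "V1": ["n1", "gnd"],
    "R1": ["n1", "n2"],
    "C1": ["n2", "gnd"],
}

def check_netlist(netlist):
    """Two formal checks: no dangling nets, and one connected circuit."""
    net_users = defaultdict(list)
    for comp, nets in netlist.items():
        for net in nets:
            net_users[net].append(comp)
    # 1. Every net must connect at least two pins (no connection "to nowhere").
    dangling = [net for net, users in net_users.items() if len(users) < 2]
    # 2. The component graph must be connected (no isolated islands);
    #    breadth-first search from an arbitrary starting component.
    comps = list(netlist)
    seen, queue = {comps[0]}, deque([comps[0]])
    while queue:
        comp = queue.popleft()
        for net in netlist[comp]:
            for other in net_users[net]:
                if other not in seen:
                    seen.add(other)
                    queue.append(other)
    isolated = [c for c in comps if c not in seen]
    return dangling, isolated

print(check_netlist(netlist))  # ([], []) -> the model passes both checks
```

The point is that both properties are checked against the model, before any physical circuit exists.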

A formal model can be optimized (including computationally) against a variety of criteria, including many ranked criteria. Search can work over a formal model: the optimal solution is sought computationally in the solution space (the model can even be deformed in the course of the search; this is what so-called differentiable architectures do[3]).
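A toy illustration of computational search over a formal model with ranked criteria: a hypothetical beam model (the formulas are illustrative placeholders, not real mechanics), where the top-ranked criterion is a hard deflection limit and the second-ranked criterion is mass minimization.

```python
from itertools import product

# Hypothetical formal model of a beam: mass and deflection as functions
# of cross-section width w and height h (illustrative, not real mechanics).
def mass(w, h):
    return w * h                   # proportional to cross-section area

def deflection(w, h):
    return 1000 / (w * h ** 3)     # stiffer cross-sections deflect less

# Search the (discretized) solution space with ranked criteria:
# first satisfy the deflection limit, then minimize mass.
candidates = [
    (w, h)
    for w, h in product(range(1, 11), range(1, 11))
    if deflection(w, h) <= 2.0     # hard constraint, top-ranked criterion
]
best = min(candidates, key=lambda wh: mass(*wh))  # second-ranked criterion
print(best, mass(*best))  # → (1, 8) 8
```

The optimum is found by computing over the model; no physical beam is ever built during the search.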

Where manufacturing a physical object (a system) is time-consuming and expensive, it may be sufficient to use a quickly developed information model and still get an answer.

A formal model can be created/generated by a computer, which solves the complexity problem. For example, it is impossible to draw by hand the 2.6 trillion transistors of a modern chip on a silicon wafer; it is impossible even to draw the schematic of such a chip by hand. However, from models at a higher level of abstraction, both the schematic and the lithographic masks of such a chip can be derived/generated.
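The idea of deriving a lower-level description from a higher abstraction can be sketched as follows: a single high-level parameter (the adder width) generates an entire gate-level netlist of a ripple-carry adder, with no gate drawn by hand (the netlist tuple format here is hypothetical).

```python
def ripple_carry_adder(n_bits):
    """Generate a gate-level netlist for an n-bit ripple-carry adder
    from one high-level parameter; no gate is drawn by hand."""
    gates = []
    carry = "c0"
    for i in range(n_bits):
        a, b, s = f"a{i}", f"b{i}", f"s{i}"
        gates += [
            ("XOR", (a, b), f"p{i}"),                # propagate signal
            ("AND", (a, b), f"g{i}"),                # generate signal
            ("XOR", (f"p{i}", carry), s),            # sum bit
            ("AND", (f"p{i}", carry), f"t{i}"),      # carry-propagate term
            ("OR",  (f"g{i}", f"t{i}"), f"c{i+1}"),  # carry out
        ]
        carry = f"c{i+1}"
    return gates

netlist = ripple_carry_adder(32)
print(len(netlist))  # 160 gates generated from a single parameter
```

Scaling the parameter scales the generated description for free, which is exactly how schematics and masks with trillions of elements become producible at all.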

The use of computer modeling has significantly increased the accuracy and reduced the errors of design based on verifiable models, and has also significantly increased the precision with which parts are manufactured. Computer-based configuration and change management has greatly reduced the number of configuration collisions.

As a result of this redistribution of work across the life cycle (more resources go to system description, hence fewer to system realization; and since the description is done with greater accuracy and scrutiny, the system is manufactured more precisely and tested more), the very first airplane of a new model now flies, which used to be impossible. The traditional joke about having to "file each part to fit" has lost its relevance: carefully thought-out and modeled parts, often not made by hand, are simply assembled into a whole system, and it works as the designers described. This matches one of the slogans of systems engineering, "get it right the first time": the first implementation of the system should be operational and need no rework; the rework should fall on the descriptions/models. This does not happen always, in all teams, or in all areas of activity yet, but it happens more and more often. For instance, of the roughly forty missions to Mars to date, only about half have been successful, yet a team of systems engineers from India managed to launch a mission to Mars orbit on the first attempt in 2013[4]. Modernity only adds to this: neither a Mars rocket nor any other implemented system will be your last. "Getting it right the first time" must be done all the time, so modeling must happen all the time.

The practice of integration (assembling a system from poorly fitting parts, together with solving the resulting system-level problems) used to be one of the main practices of systems engineering. Now integration has disappeared from the set of main practices, and assembly has become a routine, non-creative operation. This happened mainly thanks to rigorous system modeling (followed by manufacturing of parts that precisely match the modeled dimensions and other physical properties). Configuration management made it possible to start work on a new configuration even while the previous configuration has not yet been manufactured, tested, or put into operation, as soon as the development team knows how to make the next version of the system better. And they do: with "continuous development," the one-time creation of a system has turned into creation plus ongoing growth of the system. Well-established configuration management within the adopted development method prevents confusion between different versions of the project/system design and the system embodiment, avoiding configuration collisions.

Modeling has thus expanded beyond the left side of the V-diagram (system description/design) into the right side: models are now used in system manufacturing and, more recently, in operation. For a system in operation this is called a "digital twin," and the physical system itself has accordingly become the "physical twin." All the various computers holding the models, sensors, and actuators of the realized system are connected by a "digital thread" (previously called "lifecycle integration": data from one stage were passed "downstream" to the next stage, i.e., "lifecycle data integration"). "Digital thread" turned out to be a successful marketing term: managers immediately understood it as "interconnecting different computers," whereas "data integration" sounded abstract to them.

"Modeling thinking" in engineering is supported by modeling tools (modelers), and the practice of creating a production platform (an internal development platform) for modeling in engineering is called digital engineering. This will be covered in detail in the textbook "Systems Engineering"; here we only show a new variant of the V-diagram that appeared in 2018 in connection with the focus on modeling in engineering[5]:

This new variant of the V-diagram from Boeing mentions another synonym for digital engineering: model-based engineering/MBE. Moreover, it is no longer a V but an MBE diamond.

Here it is acknowledged that modeling (design/modeling and simulation) goes on throughout the entire "life cycle," but the life cycle itself is still presented as a single creation of the system; continuous development is left merely implicit.

Of course, systems engineers of "physical" systems have long understood that system development takes time, yet they keep to the V-diagram and its various other variants, which are reminiscent of a one-shot waterfall. In software engineering, by contrast, such diagrams have long been abandoned: software diagrams always show some "cycles" that indicate development, such as proposing hypotheses, testing them, adjusting, and adapting to changed conditions.

In any case, modern diagrammatic representations of the life cycle differ dramatically from the one-dimensional "sausage" representations of the recent past; they usually look more like a fundamental scheme, a functional diagram of alpha flow through practices, than a "sequence of steps." The method of development/the life cycle and the schedule are not the same thing!


  1. https://www.bristol.ac.uk/media-library/sites/eng-systems-centre/migrated/documents/pdavies-blockley-lecture.pdf ↩︎

  2. "The Nature of Modeling," Jeff Rothenberg, in Artificial Intelligence, Simulation, and Modeling, L.E. Widman, K.A. Loparo, N.R. Nielsen, eds., New York: John Wiley and Sons, Inc., 1989, pp. 75-92, http://poweredge.stanford.edu/BioinformaticsArchive/PrimarySite/NIHpanelModeling/RothenbergNatureModeling.pdf ↩︎

  3. https://ailev.livejournal.com/1464563.html ↩︎

  4. https://en.wikipedia.org/wiki/Mars_Orbiter_Mission ↩︎

  5. https://www.incose.org/docs/default-source/midwest-gateway/events/incose-mg_2018-11-13_scheurer_presentation.pdf ↩︎