Questions for those among you with decades of embedded software engineering experience:

- Do you have a rough estimate of the percentage of complex embedded software projects that have failed badly (e.g., died, ran out of budget, were too late to market, were unmaintainable, had too many defects) because they were not using Model Driven Architecture (MDA) / Model Driven Development (MDD) approaches and the right modeling tools?
- Aren't we reaching a point where NOT using Model Based Engineering and proper software modeling tools will become synonymous with a more or less quick death of the project?

Introduction

There are some monster companies out there that can afford the luxury of spending hundreds of millions of dollars building a new embedded product; for them there seems to be no money limit. At most their CEO may step down to make investors happy, but the company and the project will continue: they are too big to fail. For smaller, more human-sized companies, that luxury does not exist. Embedded products have also become far more complex, and time to market has shrunk significantly.

Can an embedded software project really succeed, and then survive, without a reasonable amount of modeling techniques and tools capable of translating model elements into running software, the same way a C/C++/Java/Ada compiler translates lines of code into executable software? Twenty-five years after the inception of UML, most modeling tools still cannot translate model elements into executable software. As a result, many software engineers do not even want to use modeling tools (unless forced to by their management): it means a lot of extra work for them, for instance maintaining the model after coding has started, and some tools are painful to use even for drawing a simple diagram.
The consequence is that, unlike in many other engineering disciplines (e.g., electronics, mechanical, hydraulics), software engineers keep using the old approaches: imagining everything in their heads, or doodling a few drawings on a whiteboard before erasing them for something else.

Consequently, adding even ONE new feature to existing software often requires a lot of effort (human, time, money), along with many regressions and additional testing and bug-fixing work. With UML and a capable modeling tool (like Rhapsody), the same change could be done by simply adding a new class to the model, with some attributes and methods and possibly more advanced behavioral diagrams such as a state machine or an activity diagram, and letting the tool generate the source code from them. I have seen developers push back on simple change requests because they did not even know where to make the changes: the source code was impossible to understand. A model-based engineering approach, with a good structural definition of the software, naturally shows where the changes should be made.

Unfortunately, I believe software modeling has not become a mainstream practice essentially because of poor modeling tools that are just drawing tools. Or is it a matter of job security: why should a software developer finish his work X times faster, at the risk of losing his job X times faster too?

Thanks for sharing your experience and thoughts.