Systems Engineering and the META Program

post by ryan_b · 2018-12-20T20:19:25.819Z · score: 31 (11 votes)
I periodically look for information on systems engineering. This time I came across a PowerPoint presentation from the MIT OpenCourseWare course Fundamentals of Systems Engineering. Professor de Weck, who taught the course, had done some research on state-of-the-art methods developed as part of DARPA's META Program.
A few years ago DARPA wrapped up the program, which was designed to speed up delivery of cyber-electro-mechanical systems (war machines) by 5x. Since the parent program, Adaptive Vehicle Make, seems to have concluded without producing a vehicle, I infer the META Program lost its funding at the same time.
The work it produced appears to be adjacent to our interests along several dimensions, though, so I thought I would bring it to the community's attention. The pitch for the program, taken from the abstract of de Weck's paper:
The method claims to achieve this speedup by a combination of three main mechanisms:
1. The deliberate use of layers of abstraction. High-level functional requirements are used to explore architectures immediately rather than waiting for downstream level 2,3,4 ... requirements to be defined.
2. The development and use of an extensive and trusted component (C2M2L) model library. Rather than designing all components from scratch, the META process allows importing component models directly from a library in order to quickly compose functional designs.
3. The ability to find emergent behaviors and problems ahead of time during virtual Verification and Validation (V&V), and to generate designs that are correct-by-construction, allows a more streamlined design process and avoids costly design iterations that often lead to expensive design changes.
Which is to say they very carefully architect the system, use known-to-be-good components, and employ formal verification to catch problems early. In the paper a simulation of the META workflow successfully achieved a 4.4x development speedup compared to the same project's actual development using traditional methods.
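The library-composition and early-verification ideas can be sketched in a few lines of code. This is only an illustration of the workflow described above, not the actual META toolchain: the component names, numbers, and requirement checks below are all invented stand-ins for a trusted model library (the C2M2L analogue) and virtual V&V.

```python
# Toy sketch of the META-style workflow: import component models from a
# trusted library, compose a candidate architecture, and verify top-level
# requirements against the model before any detailed design work.
# All names and figures here are invented for illustration.

from dataclasses import dataclass

@dataclass(frozen=True)
class Component:
    name: str
    mass_kg: float   # contribution to system mass
    power_w: float   # power draw

# Stand-in for a trusted component model library.
LIBRARY = {
    "engine_a": Component("engine_a", mass_kg=900.0, power_w=500.0),
    "engine_b": Component("engine_b", mass_kg=650.0, power_w=700.0),
    "radio":    Component("radio",    mass_kg=12.0,  power_w=80.0),
}

def compose(names):
    """Build a candidate architecture from library models, not from scratch."""
    return [LIBRARY[n] for n in names]

def meets_requirements(system, max_mass_kg, max_power_w):
    """Virtual V&V stand-in: check system-level requirements on the model."""
    total_mass = sum(c.mass_kg for c in system)
    total_power = sum(c.power_w for c in system)
    return total_mass <= max_mass_kg and total_power <= max_power_w

candidate = compose(["engine_b", "radio"])
print(meets_requirements(candidate, max_mass_kg=700.0, max_power_w=800.0))  # True
```

The point of the toy is the ordering: the requirement check runs against library models at architecture-exploration time, before any component is designed in detail, which is where the claimed speedup comes from.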
There are a bunch of individual directions explored which are of interest. Some that struck me were:
- A metric for complexity.
- A metric for adaptability.
- Work on quantitative verification methods.
- Work on complexity reduction and verification.
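To give a flavor of what a complexity metric for architectures might look like, here is a deliberately simple structural metric over a component graph. This is my own toy construction, not the metric defined in the META publications; the weights and the component graph are invented.

```python
# Toy structural-complexity metric: score grows with the number of components
# and, more steeply, with the number of interfaces between them. Purely
# illustrative; the actual META complexity metric is defined in the program's
# own publications.

def structural_complexity(components, interfaces, alpha=1.0, beta=2.0):
    """Return a scalar score: alpha per component plus beta per interface.

    components: iterable of component names
    interfaces: iterable of (a, b) pairs of connected components
    """
    return alpha * len(list(components)) + beta * len(list(interfaces))

# Two architectures with the same parts but different wiring:
parts = ["engine", "controller", "sensor", "radio"]
chain = [("engine", "controller"), ("controller", "sensor"), ("sensor", "radio")]
mesh = [(a, b) for i, a in enumerate(parts) for b in parts[i + 1:]]

print(structural_complexity(parts, chain))  # 10.0 (4 parts + 3 interfaces)
print(structural_complexity(parts, mesh))   # 16.0 (4 parts + 6 interfaces)
```

Even this crude version captures the idea that such a metric lets you compare candidate architectures quantitatively before committing to one.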
In a nutshell, it packages together a whole bunch of things we have long discussed, and now a few people are out there trying to get parts of it into practice.