For the first time, Cybernet decided to coordinate two more major meetings with the TechnoForum – essentially creating a 2-day multiconference. In contrast to the Maplesoft-centric sessions of the Maple Techno Forum on Day 1, the second day was focused on industry problems and the technological and scientific solutions emerging from various sources.

Day 2: The IAV Design of Experiments Conference and the Plant Modeling Consortium General Meeting

The Design of Experiments (DoE) conference was presented by IAV – a German consulting firm specializing in advanced modeling and simulation technologies and services for the automotive industry. DoE offers a statistical framework for coordinating the data-collection and parameter-study processes in engineering design and other fields. For engineering modeling, effective use of DoE techniques is vital for model calibration and validation. IAV is well known for its leadership in this field, and it invited a group of experts to speak on DoE and its role relative to other analytical techniques in design.
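To make the idea a little more concrete, here is a minimal sketch (my own illustration in Python, not anything IAV presented) of the starting point of a DoE study: a full-factorial design that enumerates every combination of a few hypothetical engine-calibration factors. Real DoE work would typically go further, using fractional or optimal designs to keep the number of test-bench runs manageable.

```python
# Minimal sketch of a DoE-style parameter study (illustrative only).
# The factor names and levels are hypothetical engine-calibration parameters.
from itertools import product

factors = {
    "spark_advance_deg": [10, 20, 30],    # assumed levels
    "egr_fraction":      [0.0, 0.1, 0.2],
    "rail_pressure_bar": [80, 120, 160],
}

# Full-factorial design: every combination of factor levels becomes one experiment.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for run_id, point in enumerate(design, start=1):
    print(run_id, point)   # in practice, each point is run on the engine or test bench
```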

Following the DoE conference was the third general meeting of the Plant Modeling Consortium – an industry think tank led by automotive OEMs and their vendors (including Maplesoft). Established as the “Physical” Modeling Consortium in 2007, it recently narrowed its focus to plant modeling (in the context of control system design), where symbolic computation has found a very receptive audience drawn to its techniques for automatic model equation generation and model simplification. Both techniques have proven valuable for accelerating the control plant modeling process. As with the DoE conference, the PMC meeting offered a collection of speakers.
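For readers who haven't seen what “automatic model equation generation and model simplification” look like in practice, here is a toy sketch. It uses SymPy rather than Maple or MapleSim, and the mass-spring-damper example and all the names in it are mine, chosen only to illustrate the idea of assembling governing equations from symbolic component relations and then simplifying them.

```python
# Toy illustration of symbolic plant modeling: the governing equation is
# assembled from symbolic component relations and then simplified.
import sympy as sp

t = sp.symbols("t")
m, c, k = sp.symbols("m c k", positive=True)   # hypothetical mass, damping, stiffness
x = sp.Function("x")(t)                        # displacement
F = sp.Function("F")(t)                        # external force

# Component relations (Newton's second law for a mass-spring-damper):
inertia = m * x.diff(t, 2)
damper  = c * x.diff(t)
spring  = k * x

eom = sp.Eq(inertia + damper + spring, F)      # assembled model equation
print(sp.simplify(eom))

# Symbolic manipulation also yields closed forms such as the natural frequency:
wn = sp.sqrt(k / m)
print(wn)
```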



One unique session within the PMC meeting was the panel discussion. Led by Dr. Shigeru Oho of Hitachi, the panel was peppered by the audience with a range of engaging and challenging questions. A particular highlight of this session for me was the thread on statistical modeling as it relates to symbolic computation. Historically, the models that most users of symbolic computation dealt with were principally deterministic and ordinary-differential-equation (ODE) based: as long as you had enough basic relations from the laws of physics, and you configured your formulation appropriately, you would get good results. Well, as it turns out, the real world is a bit sneakier than Isaac Newton ever thought it would be, and many physical phenomena are difficult to capture mathematically. In the context of engine modeling, for example, anything involving heat calculations seems to present no end of headaches.

This is where people like our IAV friends have made a name for themselves. An important part of DoE techniques is the use of “statistical models” – mathematical models developed from empirical data. The term is broad enough to encompass a range of approaches, from lookup tables of data to system identification techniques to stochastic techniques, and more. The main point, though, is to recognize dynamics in your model that simply cannot be captured accurately with a purely physical approach. Furthermore, any effective treatment of real-world data within a modeling process is likely to help the validation and calibration process. Both steps help build confidence that the model sufficiently reflects reality.
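As a small, purely illustrative example of such a statistical model (the operating points and measurements below are made up, not data from IAV or anyone else), one common approach is to fit a quadratic response surface to measured data by least squares:

```python
# A minimal sketch of a "statistical model": fit a quadratic response surface
# to hypothetical measured data with ordinary least squares.
import numpy as np

# Measured operating points (e.g. load, speed) and a measured response (e.g. exhaust temperature).
X = np.array([[0.2, 1000], [0.5, 1000], [0.8, 1000],
              [0.2, 3000], [0.5, 3000], [0.8, 3000]])
y = np.array([420.0, 510.0, 600.0, 450.0, 560.0, 680.0])

# Quadratic-in-the-factors design matrix: 1, x1, x2, x1*x2, x1^2, x2^2
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coeffs)   # empirical model: y ≈ A @ coeffs
```

The fitted polynomial then stands in for the part of the system's behavior that resists a clean first-principles description.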

During the panel, several leading figures called for better integration between the techniques of physical (plant) modeling and statistical modeling. Mr. Ohata and Dr. Karsten Röpke, Head of Development Methods at IAV, both agreed that symbolic computation would be an effective platform for merging the respective mathematical frameworks. Nods from most of the audience suggested that this is something that will become very important for the future of symbolic computation in the engineering world.
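To give a rough feel for what such a merger might look like (this is speculative on my part, not something the panel prescribed), here is a sketch in which a symbolic energy balance carries an empirically fitted heat-loss term; every name and coefficient below is hypothetical.

```python
# Sketch of mixing physical and statistical modeling: a symbolic energy balance
# whose hard-to-derive term (a heat-loss correction) is an empirically fitted polynomial.
import sympy as sp

t = sp.symbols("t")
m_cp, Q_in = sp.symbols("m_cp Q_in", positive=True)   # thermal capacity, heat input
T = sp.Function("T")(t)                                # temperature

# Empirically identified loss term, e.g. coefficients from a least-squares fit of test data.
a0, a1 = 12.0, 0.08                  # assumed fitted coefficients
Q_loss = a0 + a1 * T                 # statistical sub-model embedded symbolically

# Physical energy balance with the empirical term spliced in; the combined model
# stays fully symbolic, so it can still be simplified, linearized, or exported.
energy_balance = sp.Eq(m_cp * T.diff(t), Q_in - Q_loss)
print(sp.simplify(energy_balance))
```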

It’s often easy to bathe ourselves in the comfort of traditional, integral-calculus-based deterministic modeling techniques and to not soil our hands with the real world – after all, if it was good enough for Newton, Leibniz, Laplace, Fourier, et al., it should be good enough for us! But once again, reality is showing us that, with a bit of imagination, these same techniques could become better modeling tools, better equipped to provide real insights into real systems.

The meeting ended with a lot of interesting technical banter flowing in an odd mix of Japanese, English, and German, but through it all it was clear that the unique technical focus of the Plant Modeling Consortium, and the foresight of a collaborative organization of colleagues and competitors, are starting to produce some real results. Chalk one up for the PMC …
