Engineering in the Age of Agile
In our previous blog post, Why Hardware Needs a New Foundation, we looked at how legacy engineering tools in the physical realm struggle to keep up with modern demands. So far, they have remained siloed and disconnected from agile development practices like CI/CD, Git-based collaboration, and cloud-native workflows. Integrating high-fidelity simulations with system-level models is still cumbersome, often relying on black-box co-simulation that hampers scalability, accuracy, and performance.
Meanwhile, software engineering has leapt ahead, powered by AI and machine learning. But in safety-critical domains like aerospace and automotive, the need for trust, transparency, and domain-specific accuracy makes naïve adoption of AI risky. The result: costly, manual, iterative modeling processes that slow innovation and prevent engineers from fully leveraging today’s computational advantages.
In this blog, we explore how Dyad, the Julia ecosystem, and SciML are solving these problems, merging software agility, scientific accuracy, and multi-scale modeling into a single, unified workflow.
Breaking the Silos: Bringing Engineers and Developers onto a Single Source of Truth
One of the fundamental challenges with older engineering software was the divide between engineers and developers. This is the “two-culture” problem. Engineers, focused on physics and design, gravitate to GUI-based workflows that surface the information they care about quickly and accurately. Developers, tasked with deployment, work in low-level programming languages like C or Rust for embedded systems. Tools that auto-generate embedded code try to bridge the gap, but workflows remain fractured. Often auto-generated code is not good enough, leading to full re-implementations that cost time and money.
Over the last 10 years, software teams have thrived on CI/CD automation, diffs, version-controlled workflows, and structured code reviews. These practices make it easy to ship small changes rapidly and reliably.
GUIs have not yet made this leap. Though they make hardware design much easier and more approachable than text-based tools, it has historically been difficult to run agile workflows and continuous deployment with GUI-based tools.
Modern devices, however, demand that both sides of the house work together. Over-the-air (OTA) updates are now standard, letting cars or aircraft evolve with new features long after delivery (see Rivian’s suspension update as an example). This shift brings massive customer value, but also massive responsibility. A bad OTA update to an autonomy stack isn’t just a bug; it can cause a car crash.
Dyad resolves this “two cultures” problem by creating a single source of truth:
- Models that are both visual and textual, intuitive for engineers and diff-able for developers.
- CI/CD, testing, and deployment pipelines embedded in engineering workflows.
This is the engineering equivalent of what software teams have enjoyed for years: faster iteration, reproducible builds, and reliable deployment.
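As a rough sketch of what that looks like in practice (plain Julia rather than Dyad's own test tooling), here is the kind of version-controlled regression test a CI pipeline can run on every commit: a hypothetical first-order plant under proportional control, with made-up acceptance criteria on its step response.

```julia
# test/closed_loop_step.jl -- the kind of plain-text, diff-able test a CI job runs
# on every commit. The plant, gain, and tolerances here are invented for illustration.
using OrdinaryDiffEq, Test

# First-order plant x' = -x + u with proportional control u = Kp * (setpoint - x).
function closed_loop!(dx, x, p, t)
    Kp, setpoint = p
    u = Kp * (setpoint - x[1])
    dx[1] = -x[1] + u
end

@testset "step response meets the (hypothetical) spec" begin
    Kp, setpoint = 4.0, 1.0
    prob = ODEProblem(closed_loop!, [0.0], (0.0, 5.0), (Kp, setpoint))
    sol  = solve(prob, Tsit5(); abstol = 1e-8, reltol = 1e-8)

    steady_state = setpoint * Kp / (1 + Kp)                 # 0.8 for these values
    @test maximum(first.(sol.u)) <= steady_state + 0.05     # no overshoot beyond tolerance
    @test isapprox(sol(5.0)[1], steady_state; atol = 1e-3)  # settled by end of window
end
```

Because the test is just text, it diffs, reviews, and runs in the same CI systems software teams already use.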
Multi-Scale Modeling Without the Pain
Traditionally, connecting fast system or control models with slow, high-fidelity physics models required painful compromises, such as exporting the detailed model as a black-boxed FMU. That forces slow and expensive co-simulation, in which each model runs behind an opaque interface with its own solver and timestep.
This coarse co-simulation pattern introduces problems with numerical stability, performance, and accuracy, and it prevents the modeling compiler from optimizing the simulation across the multi-scale boundary, which limits how far the combined model can scale.
Dyad, with its deep integration with SciML and native access to the Julia ecosystem, allows you to connect granular simulations and PDEs directly into an ODE/DAE model. This then allows you to, for example, tune controllers on the direct output of your PDEs rather than going through a black box.
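To illustrate the difference (in plain Julia/SciML rather than Dyad's own syntax), here is a minimal sketch in which a finite-difference discretization of a 1-D heat equation and a proportional controller live in one ODE system and are advanced by a single solver, with no co-simulation boundary in between; the geometry, gain, and boundary conditions are all made up for the example.

```julia
# One solver, no co-simulation boundary: a discretized 1-D heat equation and a
# feedback controller integrated together as a single ODE system.
using OrdinaryDiffEq

const N  = 50               # interior grid points on [0, 1]
const dx = 1.0 / (N + 1)
const α  = 0.01             # thermal diffusivity (illustrative)
const Kp = 20.0             # proportional gain (illustrative)
const T_target = 1.0        # desired temperature at the midpoint sensor

function rod_with_controller!(du, u, p, t)
    # Controller: heat flux at the left boundary, proportional to the tracking error.
    T_mid = u[N ÷ 2]
    q_in  = Kp * (T_target - T_mid)

    # Finite-difference stencil: flux (Neumann) condition on the left via a ghost
    # point, fixed temperature (Dirichlet, T = 0) on the right.
    ghost = u[1] + dx * q_in
    du[1] = α * (ghost - 2u[1] + u[2]) / dx^2
    for i in 2:N-1
        du[i] = α * (u[i-1] - 2u[i] + u[i+1]) / dx^2
    end
    du[N] = α * (u[N-1] - 2u[N] + 0.0) / dx^2
end

prob = ODEProblem(rod_with_controller!, zeros(N), (0.0, 200.0))
sol  = solve(prob, Rodas5P())          # one stiff solver handles both "scales"
@show sol[end][N ÷ 2]                  # approaches T_target, modulo proportional offset
```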
It gets even better: neural surrogates can compress long, expensive simulations into lightweight, accurate approximations, which you can embed in control loops for orders-of-magnitude faster runtime during development. For the final stage, you can always switch back to the more exact solvers.
- Native SciML integration: SciML and the Julia ecosystem make highly detailed physics simulations and system-level models available directly to your Dyad model, in the same language.
- Neural surrogates: Neural surrogates compress long, expensive simulations into lightweight, accurate approximations, so you can embed them in control loops without losing fidelity. You can read more about this in our Defining Surrogates for Industrial Use white paper.
This means you can model the full system and retain the accuracy of component-level physics, something older tools can’t do efficiently.
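As a toy example of the surrogate idea (the damped-oscillator "truth" model, parameter sweep, and network size are all invented here; the white paper covers the production tooling), the sketch below trains a small network to reproduce one quantity of interest from an ODE solve, so the expensive simulation can be replaced by a cheap function call during iteration.

```julia
# A minimal neural-surrogate sketch: learn the map from a damping parameter to the
# oscillator's peak displacement, then query the network instead of re-solving the ODE.
using OrdinaryDiffEq, Flux

# "Expensive" ground truth: damped oscillator released with unit initial velocity;
# the quantity of interest is the peak displacement over the trajectory.
function peak_displacement(ζ)
    f!(du, u, p, t) = (du[1] = u[2]; du[2] = -u[1] - 2ζ * u[2])
    sol = solve(ODEProblem(f!, [0.0, 1.0], (0.0, 20.0)), Tsit5(); saveat = 0.01)
    return maximum(abs.(first.(sol.u)))
end

# Training data from a parameter sweep.
ζs = range(0.05, 1.0; length = 200)
xs = reshape(Float32.(collect(ζs)), 1, :)
ys = reshape(Float32.(peak_displacement.(ζs)), 1, :)

# Small MLP surrogate, trained with full-batch gradient steps.
model = Chain(Dense(1 => 32, tanh), Dense(32 => 32, tanh), Dense(32 => 1))
opt_state = Flux.setup(Adam(1e-3), model)
for _ in 1:2000
    grads = Flux.gradient(m -> Flux.Losses.mse(m(xs), ys), model)
    Flux.update!(opt_state, model, grads[1])
end

@show model(Float32[0.3;;])    # ≈ peak_displacement(0.3), at a fraction of the cost
```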
Scientific AI: Trustworthy by Design
Despite the promise of Generative AI and machine learning to improve the accuracy and reliability of system-level modeling, adoption remains limited: current tooling is too complex, requires deep ML expertise and specialized hardware, and is poorly integrated with the workflows and training of practicing engineers, creating a gap between academic innovation and real-world engineering application. And while Generative AI tooling is useful, black-box generative models can’t guarantee physically meaningful results, and they often overfit or hallucinate, producing wildly incorrect answers.
Scientific Machine Learning (SciML) is our solution to this problem. By combining physical laws with data, SciML can build models that are both predictive and trustworthy:
- Living digital twins that evolve with real-world sensor data.
- Individualized models for predictive maintenance and component health tracking.
- Model management tools to track, validate, and deploy surrogates with known performance envelopes.
Techniques like component-level universal differential equations (UDEs) let you discover physical phenomena that are missing from your model directly from real-world data, as sketched below.
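Here is a bare-bones structural sketch of that idea in plain Julia (not Dyad syntax): the spring-mass "known physics" and the tiny friction network are placeholders, and the step that actually fits the network to measured trajectories (e.g., via adjoint sensitivities) is left out for brevity.

```julia
# Structural sketch of a component-level universal differential equation (UDE):
# known spring-mass dynamics plus a small neural network standing in for an
# unmodeled friction force that would be learned from real-world data.
using OrdinaryDiffEq, Flux

friction_nn = Chain(Dense(2 => 16, tanh), Dense(16 => 1))   # maps (x, v) to a force

function ude_rhs!(du, u, p, t)
    x, v = u
    f_learned = friction_nn(Float32[x, v])[1]   # learned correction term
    du[1] = v
    du[2] = -x + f_learned                      # known physics + learned physics
end

prob = ODEProblem(ude_rhs!, [1.0, 0.0], (0.0, 10.0))
sol  = solve(prob, Tsit5())    # with trained weights, sol would match measured data
```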
From Models to Value
SciML enables models that live beyond the engineer’s laptop. They can integrate with streaming sensor data, identify missing dynamics (e.g., friction terms in a suspension), and suggest improvements. Engineers validate and interpret these insights, focusing on design value rather than manual coding.
This opens the door to individualization: models that adapt to specific engines or vehicles, capturing real-world wear and tear. Predictive maintenance becomes proactive, safety improves, and design cycles accelerate.
Finally, engineers will spend less time retraining models from scratch and more time managing an organized library of surrogates, each with defined performance envelopes, validation histories, and traceability. Instead of starting over, they’ll select the right model for the job, much like developers use well-tested libraries today.
Engineering is entering a new era. By combining SciML, cloud-native workflows, and integrated modeling, tools like Dyad break the barriers between engineers and developers, between fast system models and detailed physics, and between static digital models and evolving digital twins.
The result: safer, faster, more trustworthy innovation in the physical world.
About the Author

Anshul Singhvi
Anshul Singhvi is a contributor to Julia's plotting (Makie.jl), geospatial (JuliaGeo) and documentation ecosystems, and a developer on the Dyad team.