I've always been of the view that a workflow language should be a proper, Turing-complete functional language, which gives you all the usual flexibility for transformations on intermediate data while also supporting things like automatic parallelisation of external, compute-intensive tasks.
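To make that concrete, here's a rough sketch of what I mean (purely illustrative, not any particular engine's API): the workflow is an ordinary program, intermediate data is transformed with plain functions, and the expensive external steps are parallelised with a standard executor.

```python
import os
import subprocess
from concurrent.futures import ThreadPoolExecutor

def align(sample: str) -> str:
    """Compute-intensive external task: shells out to a hypothetical `aligner` binary."""
    out = f"{sample}.bam"
    subprocess.run(["aligner", sample, "-o", out], check=True)
    return out

def summarise(outputs: list[str]) -> dict[str, int]:
    # Ordinary in-language transformation of the intermediate data
    # (here just the output sizes, as a stand-in for real analysis).
    return {path: os.path.getsize(path) for path in outputs}

def workflow(samples: list[str]) -> dict[str, int]:
    # The graph is implicit in function composition; independent
    # external tasks run in parallel via a standard executor.
    with ThreadPoolExecutor() as pool:
        outputs = list(pool.map(align, samples))
    return summarise(outputs)
```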
I guess that ship has sailed, and maybe it's nitpicking, but I find it a bit unfortunate to call a new programming language "Rex" when "Rexx" has already existed for several decades.
Do you implement a DAG within your system to act as a kind of well-defined backbone for analysis and execution, or do you dispense with (explicit) DAGs entirely?
YAML as a programming language is something I consider an anti-pattern (see AWS Step Functions). It's very difficult to read, debug, and test. It's better to use a real programming language that compiles into a DAG (e.g. Temporal, Dagger.io).
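For anyone who hasn't seen the workflow-as-code style: in Temporal's Python SDK it looks roughly like the following (written from memory, so treat the details as approximate and check their docs). Branching and iteration are ordinary control flow rather than a YAML schema.

```python
from datetime import timedelta
from temporalio import activity, workflow

@activity.defn
async def fetch_report(region: str) -> str:
    # Side-effecting work lives in activities; the engine handles retries.
    return f"report for {region}"

@workflow.defn
class NightlyReport:
    @workflow.run
    async def run(self, regions: list[str]) -> list[str]:
        results = []
        for region in regions:  # ordinary loop, no YAML branching/iteration syntax
            results.append(
                await workflow.execute_activity(
                    fetch_report,
                    region,
                    start_to_close_timeout=timedelta(minutes=5),
                )
            )
        return results
```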
It’s interesting to see something new in this space, especially since some people claim that flowcharts will be replaced by AI automation or AI-generated code.
I'm working on something similar as a side project, and I keep running into a lack of repeatability in my LLM flows. 90% of my code is AI-written, but most of my guidance to LLMs is not particularly specific. It's "make sure you've read this file", "how does that match against existing patterns", "what's the performance like".
I've ended up building my workflow engine directly in Python, despite YAML being the default choice for LLMs.
I found that YAML had some drawbacks:
* LLMs don't have an inherent understanding of YAML conventions. They tend to be overly verbose. Python code solved this because "good" code is generally as short as you need.
* YAML isn't really composable. Yes, you can technically compose it, but you'll be fighting the LLM the entire time. Python solved this because the LLM knows how to decouple code.
* I want _some_ things to remain programmatic. Having Python solves that (rough sketch after this list).
* Pretty much any programming language would do. Python just feels like the default for LLM-centric code.
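A rough sketch of the shape mine has ended up with (the names are illustrative, not my actual code): steps are plain decorated functions, so the LLM can compose and refactor them the same way it refactors any other Python.

```python
from typing import Any, Callable

REGISTRY: dict[str, Callable[..., Any]] = {}

def step(name: str):
    """Register a plain function as a named workflow step."""
    def decorate(fn: Callable[..., Any]) -> Callable[..., Any]:
        REGISTRY[name] = fn
        return fn
    return decorate

@step("read_file")
def read_file(path: str) -> str:
    with open(path) as f:
        return f.read()

@step("check_patterns")
def check_patterns(text: str) -> list[str]:
    # Vague guidance ("how does this match existing patterns?") becomes
    # an explicit, reviewable function instead of a YAML blob.
    return [line for line in text.splitlines() if "TODO" in line]

def run(pipeline: list[str], seed: Any) -> Any:
    """Run named steps in order, feeding each result into the next."""
    value = seed
    for name in pipeline:
        value = REGISTRY[name](value)
    return value

# Example: run(["read_file", "check_patterns"], "notes.md")
```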
Here's a different kind of workflow engine with a proper DSL. It turns out config management is the same problem as workflow engines, if you use my modern definition of config management.
I was expecting to see some verbose LLM output, but actually the code has a distinctly hand-crafted feel. Nice to see! I'm not sure if "production ready" is a safe claim 7 commits into a project ;)
This is a good exercise, but IMHO, when you really start using a workflow engine for production use cases, you need a proper, Turing-complete programming language as the DSL.
There used to be a project called Benthos (since acquired and rebranded by Redpanda in 2024) that was amazing, that you might want to gain some inspiration from.
However, durable workflows have also gained popular acceptance as functional design reaches a wider audience.
While Temporal is the most popular choice when it comes to durable workflows, DBOS (cofounded by the father of PostgreSQL) is my personal favorite.
At the moment, orchestration in DBOS has certain gaps - you might very well consider spending your effort on closing those gaps. The value there would be phenomenal!
Hi Felipe! Just point your agent at https://docs.dbos.dev/python/prompting and give it a go - you can play around with it as much as you want and solve real problems you care about, rather than have me lecture you about it :)
That said, DBOS really makes durable workflows accessible and approachable. Having already used Temporal, I think you'll really appreciate how quickly you can get started with DBOS. I forget if they support SQLite, but if you have a PostgreSQL server set up, you really don't need anything else to write your first few DBOS durable workflows (vs. needing a Temporal server or cluster).
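To give a feel for how little ceremony there is, a minimal DBOS Python workflow looks roughly like this - written from memory, so defer to the docs linked above for the exact API:

```python
from dbos import DBOS

DBOS()  # picks up your Postgres connection settings from the DBOS config

@DBOS.step()
def charge_card(order_id: str) -> str:
    # Each completed step is checkpointed in Postgres.
    return f"charged:{order_id}"

@DBOS.step()
def send_receipt(order_id: str) -> str:
    return f"receipt:{order_id}"

@DBOS.workflow()
def checkout(order_id: str) -> str:
    # If the process dies between steps, the workflow resumes after the
    # last completed step instead of re-running it from scratch.
    charge_card(order_id)
    return send_receipt(order_id)

DBOS.launch()
```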
Let me know if I've got you interested enough to try it out. I first learned about Temporal from Mitchell Hashimoto as they were using it for HashiCorp Cloud. Eventually I discovered DBOS and now all my personal projects are on DBOS.
Was going to ask the same thing. The orchestration space already has some very well established frameworks like Airflow and Dagster. Would be curious to see the pros and cons.
I think the future of replacements for well-established frameworks written in Python etc. is zero-dependency binaries (in Rust or Go) that require so little configuration and tuning that they "just work".
Agreed. Right now, if I needed "workflow" for a greenfield that could tolerate some risk, I'd look at https://www.restate.dev/ which matches your model of a self contained binary.
Type-safe code. Workflows are not configuration! If I wanted YAML hell I could stick to GitHub Actions.
But that's only the start. There are a lot of other things I would expect of a new workflow orchestrator in 2026 so if you are not comparing yourself to the competition you probably don't know what you're getting yourself into.
Yeah, that makes sense. I looked at a few workflow orchestrators and I'm building something that I will release soon, but my thinking is that the "workflow engine" should be an abstraction that takes the input and executes the steps. "What" you use to define that workflow is probably the SDK layer, but I can certainly see the value in using type-safe code to define it as opposed to a YAML file.
I'm mainly focusing on the portability aspect of it (e.g. use TS/Python/etc. to define the workflow/steps, or just a simple YAML file).
Sort of. My thinking is that the input to define the workflow should be anything you prefer to use (TS, Go, YAML, etc.) and the orchestrator's job is to model that and execute the job, given your deployment model.
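Roughly what I have in mind, with hypothetical names just to show the layering: every frontend (a TS SDK, a Go SDK, a plain YAML loader) compiles down to the same small intermediate representation, and the engine only ever executes that.

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    action: str                                   # which pluggable action to invoke
    params: dict = field(default_factory=dict)
    depends_on: list[str] = field(default_factory=list)

@dataclass
class WorkflowIR:
    """The engine's only input: every frontend compiles to this."""
    steps: list[Step]

def from_yaml_dict(doc: dict) -> WorkflowIR:
    # One frontend among many (keys here are made up, not this project's schema);
    # a TS or Python SDK would build the same IR programmatically.
    return WorkflowIR(steps=[
        Step(
            name=s["name"],
            action=s["action"],
            params=s.get("params", {}),
            depends_on=s.get("depends_on", []),
        )
        for s in doc.get("steps", [])
    ])

def execute(ir: WorkflowIR) -> None:
    # Toy scheduler: run any step whose dependencies have all finished.
    done: set[str] = set()
    pending = list(ir.steps)
    while pending:
        ready = [s for s in pending if set(s.depends_on) <= done]
        if not ready:
            raise RuntimeError("cycle or unsatisfied dependency")
        for s in ready:
            print(f"running {s.name} via {s.action}")
            done.add(s.name)
            pending.remove(s)
```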
DAG Workflow Engine
A production-ready DAG (Directed Acyclic Graph) workflow engine driven by a YAML DSL. Validates, executes, and visualizes workflows with support for parallel execution, retries, conditional branching, batch iteration, and pluggable actions.
I recommend checking out https://github.com/peterkelly/rex and also my PhD thesis on the topic https://www.pmkelly.net/publications/thesis.pdf.
The gap in flexibility between DAG-only and a full language designed for the task is a significant one.
https://insitro.github.io/redun/
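redun is a good example of the "full language" side of that gap: the workflow is ordinary Python, so data-dependent fan-out, which is awkward to express in a static DAG DSL, is just a loop. A purely illustrative sketch (not redun's actual API):

```python
import os

def list_chunks(dataset_dir: str) -> list[str]:
    # How many chunks exist is only known at runtime, by inspecting the data.
    return sorted(
        os.path.join(dataset_dir, f)
        for f in os.listdir(dataset_dir)
        if f.endswith(".csv")
    )

def process(chunk: str) -> int:
    # Stand-in for a real per-chunk task (here: count lines in the file).
    with open(chunk) as fh:
        return sum(1 for _ in fh)

def workflow(dataset_dir: str) -> int:
    chunks = list_chunks(dataset_dir)          # fan-out width decided by the data
    results = [process(c) for c in chunks]     # one task per discovered chunk
    return sum(results)
```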
P.S. I'm the author of a similar solution:
* https://github.com/nocode-js/sequential-workflow-designer
* https://github.com/nocode-js/sequential-workflow-machine
https://github.com/purpleidea/mgmt/
https://github.com/swetjen/daggo
That being said, that's not this project.
Just seeing YAML used for workflows in this age makes me automatically nope out.