# warpforge
A build tool that you put other build tools in, to universalize things.
Collaborative, decentralized -- secured by hash trees.
With the power of the hash: information designed to be shared.
Work together, but not in lockstep.
# Hi!
I like to build stuff to last.
I like to build stuff that other people can get into their hands and use easily.
And I want that process to be good.
Some of the problems we're about to try to solve are difficult, and are doubly tricky because they have half-solutions that... aren't.
# Norms
# Ikke Enkelt
- Blood, sweat, and tears -- lots of manual work.
- Bake giant monolithic images?
- Give up -- we just accept that none of our work can be shared or reproduced easily.
(lol)
# Ikke Containers
There's a script to build stuff.
And it is isolated.
That's progress!
# Ikke Enkelt
Having a cleanroom is great.
Building a cleanroom, then getting a bucket of mud from outside the door, and dumping it directly into the cleanroom...
...defeats the purpose.
# Ikke Containers
Except... now we get to have... baking huge images together.
This image baking idea actually made it harder to work together.
# Ikke Enkelt
We're just slamming things together
like cavepeople.
# Stopgaps Suck
Containers, CI...
None of them fix the underlying problems.
And they make it harder for people to collaborate!?!
All of this should indicate to us: ...
* (and also, "layers" are totally useless.)
# Dreams of Better
Give me total supply chain control.
Give me isolated cleanrooms.
Every step: deterministic.
Every input must be versioned.
Predictable results are important.
Predictable results ain't gonna happen
if we can't get predictable inputs first.
Bring together other tools.
Let them do what they do best.
I have source code.
I have compilers.
I have shells and environments.
Let me assemble them,
without baking them together forever.
Give me something decentralized (like git).
Something that can push and pull,
without centralized hubs.
I want to exchange work with others.
In the open.
Not just in monorepos.
Not just hucking images over the wall.
If you can't measure it,
you can't improve it.
So let's start measuring!
# Case Studies
git
(it's not even a build tool; you're kidding, right?)
Git is beautifully local-first.
It's hugely productive because of it.
Git is also the clearest pointer for why
a good system needs hashes.
Hashes mean you can always talk about exact contents, even without names.
You can add names *onto* this later,
and that doesn't take anything away;
but hashes first makes it all stronger.
make
Can we take the best parts of these,
do it with container isolation,
make a nice programmable API,
and get something awesome?
# Problem Solving!
$ cd myprojectdir
$ warpforge run
$ cat module.wf | grep "action" -A9
...
"action": {
  "script": {
    "interpreter": "/bin/bash",
    "script": [
      "echo whee",
      "echo woo"
    ]
  }
}
$ cat module.wf | grep "multi" -A9 -B1
...
"action": {
  "multi": {
    "stepA": { ...another action... },
    "stepB": { ...another action... },
    "stepC": { ...another action... }
  }
}
$ cat module.wf | grep "input" -A9
...
"inputs": {
  "/app/bash": "tar:2nSYg68pkhwmpBfYBrGt6bAAzAGbtUSjGJbYNFiJxkRgJX6dQdJQWrA68FWaSWg2zD",
  "/app/go": "tar:6ATn28CVxaUH1nXGC3QGspqjexYh1rRjyMYByMVi7HqZ2wexuF9RVk6otMPjnvjJXa",
  "/task/src": "git:b951b7a3388ea4fad7ca5cb9d177ff02b7e9039a",
  "/mount/anything": "tar:XGC3QGspqjexYh1rRjyMYByMVi7HqZ2wexuF9RVk6otMPjnvjJXa",
  "/compose/easily": "tar:HqZ2wexuF9RVk6otMPjnvjJXaXGC3QGspqjexYh1rRjyMYByMVi7"
}
$ cat module.wf | grep "input" -A9
...
"inputs": {
  "/app/bash": "catalog:bash:v4.12:linux-amd64-zapp",
  "/app/go": "catalog:go:v1.19:linux-amd64",
  "/task/src": "catalog:myproject:v1000:src"
}
... and fear not: these names still resolve
reproducibly, decentralized, and local-first.
tl;dr: think like git branch names referring to hashes...
except we put that in a merkle tree itself, so it's versioned and you can fork it.
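To make that tl;dr concrete, here's a toy sketch (illustrative names and truncated hashes only; the real thing is a merkle tree of files, shown later): a catalog is a map from human-readable names down to hashes, and a `catalog:` reference is just an index into that map.

```python
# A toy in-memory catalog: names on the outside, hashes at the leaves.
catalog = {
    "bash": {"v4.12": {"linux-amd64-zapp": "tar:2nSYg68pkhwmpBfYBrGt..."}},
    "go": {"v1.19": {"linux-amd64": "tar:6ATn28CVxaUH1nXGC3QG..."}},
}

def resolve(ref: str) -> str:
    # "catalog:<module>:<release>:<item>" -> a WareID (a hash).
    _, module, release, item = ref.split(":")
    return catalog[module][release][item]

assert resolve("catalog:go:v1.19:linux-amd64").startswith("tar:")
```

Because the whole map is itself hashed, you can version it, fork it, and pull someone else's copy, exactly like a git branch.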
$ cat module.wf | grep "outputs" -A9
...
"outputs": {
  "yourlabel": {
    "from": "/this/path",
    "packtype": "tar"
  },
  "several": {
    "from": "/sure/youbetcha",
    "packtype": "tar"
  }
}
you're not stuck with full system images!
# Tech time
[ WareID ]
At the bottom, I'm going to need a filesystem snapshot hash.
[ Formula ]
Then, some task declarations.
These can use only hashes.
[ Workflow ]
Now, to really get things done, we need to refer to other work.
So it's time for relationship declarations...
... and these now use names.
[ Catalog ]
Names are tricky. We want reproducible resolve.
... so we're going to make a decentralized name database, using merkle trees.
[ buildplugs ]
Template reusable parts of workflows!
Use any templating language!
It just has to emit plain JSON objects.
To recap:
[ WareID ] Duh; of course we need data snapshots.
[ Formula ] These are rebuild instructions. You can save these. You can hash 'em. Perfect memoization; and a debugging dream.
[ Workflow ] Is where the user has power.
[ Catalog ] Solves name lookup. Is how we organize sharing!
[ buildplugs ]
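"Perfect memoization" deserves a concrete picture. A minimal sketch, assuming nothing beyond what these slides show (formulas hash to a formulaID; run records map that ID to output WareIDs):

```python
import hashlib
import json

def formula_id(formula: dict) -> str:
    # Canonical serialization (sorted keys, fixed separators), so the
    # same formula always hashes to the same ID.
    canonical = json.dumps(formula, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

run_records: dict[str, dict] = {}  # formulaID -> {output label: WareID}

def run_memoized(formula: dict, execute) -> dict:
    fid = formula_id(formula)
    if fid in run_records:
        return run_records[fid]  # same hashes in -> same hashes out; skip the work
    results = execute(formula)   # otherwise actually run the action
    run_records[fid] = results
    return results
```

Since every input is a hash, two identical formulas are guaranteed to mean identical work, so a cache hit is always safe.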
# It's a deep stack
L0: [ WareID ]
L1: [ Formula ]
L2: [ Workflow ]
L3+: [ buildplugs ]
(with [ Catalog ] alongside, solving name lookup.)
# Tech time
Let's see it: Formulas
{
  "formula": {
    "inputs": {
      "/": "ware:tar:4z9DCTxoKkStqXQRwtf9nimpfQQ36dbndDsAPCQgECfbXt3edanUrsVKCjE9TkX2v9"
    },
    "action": {
      "script": {
        "interpreter": "/bin/sh",
        "contents": [
          "MESSAGE='hello, this is a script action'",
          "echo $MESSAGE",
          "mkdir /out && echo $MESSAGE > /out/log",
          "echo done!"
        ]
      }
    },
    "outputs": {
      "test": {
        "from": "/out",
        "packtype": "tar"
      }
    }
  },
  "context": {
    "context.v1": {
      "warehouses": {
        "tar:4z9DCTxoKkStqXQRwtf9nimpfQQ36dbndDsAPCQgECfbXt3edanUrsVKCjE9TkX2v9": "https://warpsys.s3.amazonaws.com/warehouse/4z9/DCT/4z9DCTxoKkStqXQRwtf9nimpfQQ36dbndDsAPCQgECfbXt3edanUrsVKCjE9TkX2v9"
      }
    }
  }
}
This is the bottom.
It's just JSON.
You can see hashes going in...
{
  "guid": "cb351d5f-9b85-4404-aec9-b54cb71d249c",
  "time": 1634850353,
  "formulaID": "zM5K3Xv31xGHCtyjFXjV4kNvdMoXCdamJq5t7DvHhSuKomTuQ85QampAi4kayHuA7sRbWNh",
  "exitcode": 0,
  "results": {
    "test": "ware:tar:3vmwry1wdxQjTaCjmoJnvGbdpg9ucTvCpWzGzvtujbLQSwvPPAECTm3YxrsHnERtzg"
  }
}
And here's the JSON you get out!
Let's see it: Workflows
{
  "inputs": {
    "glibc": "catalog:warpsys.org/bootstrap/glibc:v2.35:amd64",
    "ld": "catalog:warpsys.org/bootstrap/glibc:v2.35:ld-amd64",
    "ldshim": "catalog:warpsys.org/bootstrap/ldshim:v1.0:amd64",
    "make": "catalog:warpsys.org/bootstrap/make:v4.3:amd64",
    "gcc": "catalog:warpsys.org/bootstrap/gcc:v11.2.0:amd64",
    "grep": "catalog:warpsys.org/bootstrap/grep:v3.7:amd64",
    "coreutils": "catalog:warpsys.org/bootstrap/coreutils:v9.1:amd64",
    "binutils": "catalog:warpsys.org/bootstrap/binutils:v2.38:amd64",
    "sed": "catalog:warpsys.org/bootstrap/sed:v4.8:amd64",
    "gawk": "catalog:warpsys.org/bootstrap/gawk:v5.1.1:amd64",
    "busybox": "catalog:warpsys.org/bootstrap/busybox:v1.35.0:amd64",
    "src": "catalog:warpsys.org/bash:v5.1.16:src"
  },
  "steps": {
    "build": {
      "protoformula": {
        "inputs": {
          "/src": "pipe::src",
          "/lib64": "pipe::ld",
          "/pkg/glibc": "pipe::glibc",
          "/pkg/make": "pipe::make",
          "/pkg/coreutils": "pipe::coreutils",
          "/pkg/binutils": "pipe::binutils",
          "/pkg/gcc": "pipe::gcc",
          "/pkg/sed": "pipe::sed",
          "/pkg/grep": "pipe::grep",
          "/pkg/gawk": "pipe::gawk",
          "/pkg/busybox": "pipe::busybox",
          "$PATH": "literal:/pkg/make/bin:/pkg/gcc/bin:/pkg/coreutils/bin:/pkg/binutils/bin:/pkg/sed/bin:/pkg/grep/bin:/pkg/gawk/bin:/pkg/busybox/bin",
          "$CPATH": "literal:/pkg/glibc/include:/pkg/glibc/include/x86_64-linux-gnu"
        },
        "action": {
          "script": {
            "interpreter": "/pkg/busybox/bin/sh",
            "contents": [
              "mkdir -p /bin /tmp /prefix /usr/include/",
              "ln -s /pkg/glibc/lib /prefix/lib",
              "ln -s /pkg/glibc/lib /lib",
              "ln -s /pkg/busybox/bin/sh /bin/sh",
              "ln -s /pkg/gcc/bin/cpp /lib/cpp",
              "cd /src/*",
              "mkdir -v build",
              "cd build",
              "export SOURCE_DATE_EPOCH=1262304000",
              "../configure --prefix=/warpsys-placeholder-prefix LDFLAGS=-Wl,-rpath=XORIGIN/../lib ARFLAGS=rvD",
              "make",
              "make DESTDIR=/out install",
              "sed -i '0,/XORIGIN/{s/XORIGIN/$ORIGIN/}' /out/warpsys-placeholder-prefix/bin/*"
            ]
          }
        },
        "outputs": {
          "out": {
            "from": "/out/warpsys-placeholder-prefix",
            "packtype": "tar"
          }
        }
      }
    },
    "pack": {
      "protoformula": {
        "inputs": {
          "/pack": "pipe:build:out",
          "/pkg/glibc": "pipe::glibc",
          "/pkg/ldshim": "pipe::ldshim",
          "/pkg/busybox": "pipe::busybox",
          "$PATH": "literal:/pkg/busybox/bin"
        },
        "action": {
          "script": {
            "interpreter": "/pkg/busybox/bin/sh",
            "contents": [
              "mkdir -vp /pack/lib",
              "mkdir -vp /pack/dynbin",
              "cp /pkg/glibc/lib/libc.so.6 /pack/lib",
              "cp /pkg/glibc/lib/libdl.so.2 /pack/lib",
              "cp /pkg/glibc/lib/libm.so.6 /pack/lib",
              "mv /pack/bin/bash /pack/dynbin",
              "cp /pkg/ldshim/ldshim /pack/bin/bash",
              "cp /pkg/glibc/lib/ld-linux-x86-64.so.2 /pack/lib",
              "rm -rf /pack/lib/bash /pack/lib/pkgconfig /pack/include /pack/share"
            ]
          }
        },
        "outputs": {
          "out": {
            "from": "/pack",
            "packtype": "tar"
          }
        }
      }
    },
    "test-run": {
      "protoformula": {
        "inputs": {
          "/pkg/bash": "pipe:pack:out"
        },
        "action": {
          "exec": {
            "command": [
              "/pkg/bash/bin/bash",
              "--version"
            ]
          }
        },
        "outputs": {}
      }
    }
  },
  "outputs": {
    "amd64": "pipe:pack:out"
  }
}
This is the midlevel.
It's just JSON.
(This is a real workflow, for compiling and packaging bash.)
If we zoom in a bit...
The `catalog:` strings are what catalog references look like.
The `pipe:` strings are referencing other workflow steps' outputs.
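A sketch of what that `pipe:` wiring means (a hypothetical helper, not warpforge's real API): an empty step name refers to a workflow-level input, and a non-empty one refers to a named output of another step.

```python
def resolve_pipe(ref: str, workflow_inputs: dict, step_results: dict) -> str:
    # "pipe:<step>:<label>": an empty <step> means a workflow-level input.
    _, step, label = ref.split(":")
    if step == "":
        return workflow_inputs[label]  # e.g. "pipe::src"
    return step_results[step][label]   # e.g. "pipe:build:out"
```

This is also what gives the engine its execution order: steps simply run after the steps their pipes name.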
Let's see it: Buildplugs
load("../lib.star", "plot", "gnu_build_step", "zapp_pack_step")

step_build = gnu_build_step(
    src=("warpsys.org/bash", "v5.1.16", "src"),
    script="""cd /src/*
./configure --prefix=/warpsys-placeholder-prefix
make
make DESTDIR=/out install""")

step_pack = zapp_pack_step(
    binaries=["bash"],
    libraries=[
        ("warpsys.org/bootstrap/glibc", "libc.so.6"),
        ("warpsys.org/bootstrap/glibc", "libdl.so.2"),
        ("warpsys.org/bootstrap/glibc", "libm.so.6"),
    ],
    extra_script="rm -rf /pack/lib/bash /pack/lib/pkgconfig /pack/include /pack/share")

result = plot(steps={"build": step_build, "pack": step_pack})
This is a buildplug, using Starlark.
It generates the workflow on the last slide.
You can see how this is leverage!
Let's see it: Catalogs
./warpsys.org/texinfo
./warpsys.org/texinfo/_releases
./warpsys.org/texinfo/_releases/v6.8.json
./warpsys.org/texinfo/_module.json
./warpsys.org/texinfo/_mirrors.json
./warpsys.org/texinfo/_replays
./warpsys.org/texinfo/_replays/zM5K3YjS3FJ3KTXgLoa1bpjRLCth134M1AkrDCXw5VPjq9bKgSrkSsGzAocLfELYVH6sAii.json
./warpsys.org/bash
./warpsys.org/bash/_releases
./warpsys.org/bash/_releases/v5.1.16-2.json
./warpsys.org/bash/_releases/v5.1.16.json
./warpsys.org/bash/_module.json
./warpsys.org/bash/_mirrors.json
./warpsys.org/bash/_replays
./warpsys.org/bash/_replays/zM5K3Vgei44et6RzTA785sEZGwuFV75vCazjhR11RH5veFdMTx7F5cg2c4NA5HXPK8Zv5TQ.json
./warpsys.org/bash/_replays/zM5K3aMARrWToyXjaFxxxWmYU7dZUmYp7ir5hDQtzDi2LCGPtw9PNVch9DTts9ApRyPSacJ.json
./warpsys.org/make
./warpsys.org/make/_releases
./warpsys.org/make/_releases/v4.3.json
Catalogs are a series of files.
Catalog -> modules -> releases -> items.
A "c:m:r:i" tuple resolves to a WareID. Or to metadata about it.
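Given that layout, resolution is just path construction plus one file read. A minimal sketch (the `"items"` field name is my assumption; check the real schema):

```python
import json
from pathlib import Path

def resolve(catalog_root: Path, module: str, release: str, item: str) -> str:
    # "c:m:r:i": module and release name the file; item indexes into it.
    release_file = catalog_root / module / "_releases" / f"{release}.json"
    release_doc = json.loads(release_file.read_text())
    return release_doc["items"][item]  # assumed field name -> a WareID

# e.g. resolve(Path("."), "warpsys.org/bash", "v5.1.16", "src")
```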
Catalogs are a merkle tree.
(Reminiscent of git.)
The logical structure looks sorta like this:
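A minimal sketch of the merkle idea, git-style: each file hashes to an ID, and each directory hashes the names and IDs of its entries, so one root hash pins the entire catalog, and any change anywhere produces a new root.

```python
import hashlib
from pathlib import Path

def tree_hash(path: Path) -> str:
    # Files hash their bytes; directories hash their (sorted) entries'
    # names plus each entry's hash. Same trick git uses for trees.
    if path.is_file():
        return hashlib.sha256(path.read_bytes()).hexdigest()
    entries = sorted(f"{p.name}:{tree_hash(p)}" for p in path.iterdir())
    return hashlib.sha256("\n".join(entries).encode()).hexdigest()
```

That root hash is what you can sign, mirror, fork, and compare, exactly like a git commit.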
Catalogs point at the content snapshot ID (the "WareID")...
... but they also point at a copy of the Workflow used to build it.
# Sharing & Caring
warpforge
I want to work in public.
I want to share work and collaborate.
Even outside my team.
Outside my org and workplace.
How?
# Sharing & Caring
warpforge
# Sharing & Caring
warpforge
# Sharing & Caring
warpforge
Build instructions point at Catalogs,
and Catalogs point at both content IDs and rebuild instructions.
That means building your dependencies recursively is optional --
it's possible, but not required.
# Zapps
Programs don't work that way.
You can't just mount them wherever.
Long story short:
Zapps work anywhere. Any path.
No install hooks.
Even on read-only mounts.
This means you can solve building & packaging once...
and ship anywhere.
Okay, sorry, this talk isn't about that.
Not enough time.
Check out the website.
Or prior talks!
# Comparable projs
Okay, there's a few almost-similar projects.
Except... Remember this diagram?
Catalogs and content-addressing make Warpforge distinctive.
Warpforge can choose to build or fetch... on the fly.
In other systems... that's a rewrite of your build instructions.
Bazel, Nix, Guix, etc...
can't do this,
can't even describe this.
Their build instructions only depend on other build instructions,
not on actual content.
We hope Warpforge can succeed in building bigger ecosystems, where Bazel tends to only survive in monorepos.
Warpforge:
- Content-addressed checkpoints -> shareable.
- Rebuild instructions rigorous but optional.
- Simply executes other things.
Bazel:
- Based only around building, so every team using it does their own bootstrapping.
- Encourages monorepos.
- Magic built-ins for what it supports.
Warpforge aims for easier use, and more decentralization.
Warpforge:
- Plain JSON API. Or: Python-like syntax, if you want it. Or: bring your own templating.
- Content-addressed checkpoints -> shareable.
- Rebuild instructions rigorous but optional.
Nix:
- Based only around building. (Similar story to Bazel.)
- Uses the Nix language. I cannot comment on this. (But many other people have...)
Warpforge aims to end the distro wars, not just build another one.
Warpforge introduces Zapps: these make packages you build with Warpforge+Zapps instantly usable in any other Linux environment.
Nix builds NixOS. NixOS has a series of paths starting with `/nix/`, and is a whole distro.
We've tried to learn from this in Warpforge, and earnestly believe we offer improvements.
# Future Work
# Getting Involved
Gosh, I'm glad you asked.
We have to:
Ware/filesystem packing/hashing/transporting is pluggable.
In a WareID, the string before the ":" is the discriminator.
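In other words (a sketch with stand-in functions, not warpforge's real plugin API):

```python
def parse_ware_id(ware_id: str) -> tuple[str, str]:
    # "tar:6ATn28..." -> ("tar", "6ATn28...").
    # The packtype discriminator selects the pack/unpack/transport plugin.
    packtype, digest = ware_id.split(":", 1)
    return packtype, digest

# Stand-ins for real plugins, keyed by discriminator:
unpackers = {
    "tar": lambda digest: print(f"fetch + unpack tarball {digest}"),
    "git": lambda digest: print(f"check out git tree {digest}"),
}

packtype, digest = parse_ware_id("git:b951b7a3388ea4fad7ca5cb9d177ff02b7e9039a")
unpackers[packtype](digest)
```

Adding a new filesystem format means registering one more discriminator; no other part of the system has to care.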
The "script" keyword in an action is another plugin point.
We can introduce other "executors" and trigger them with other keywords in this position.
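The shape of that plugin point, sketched (stand-in function names; "script" and "exec" are the actions shown earlier):

```python
def run_script(spec): ...  # runs a list of lines under an interpreter
def run_exec(spec): ...    # runs a single command

# One keyword per executor; new executors just register new keywords.
executors = {"script": run_script, "exec": run_exec}

def run_action(action: dict):
    # An action object carries exactly one keyword; dispatch on it.
    (keyword, spec), = action.items()
    return executors[keyword](spec)
```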
The Starlark system we're shipping can load libraries.
Also, you can bring your own entire templating engine. Can it emit JSON on stdout? Then it'll do.
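"Bring your own templating engine" really is this simple, sketched (hypothetical template script name):

```python
import json
import subprocess

def render(cmd: list[str]) -> dict:
    # Run any program; its stdout must be a plain JSON object.
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return json.loads(out.stdout)

# e.g. render(["python3", "my_template.py"]) -- or jsonnet, cue, m4, whatever.
```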
Many ecosystems already exist and have lots of content.
Let's build tools to import their content IDs into catalogs.
Let's build tools to import their lockfiles into workflows.
Just by joining us in building stuff
with Warpforge,
you're helping!
Autorebuilders,
CI integrations,
dashboards,
etc!
If you've got time and the will to hack:
Now you see the extension points!
Come! Build a bifrost!
Or hack on the core!
Or package stuff!
Or help write docs!
If you've got money but not time:
Warpforge is a 501(c)(3).
We can accept charitable contributions, and make sure they support other people contributing their time and skills.
Reminder:
Docs:
http://warpforge.io/
Code:
https://github.com/warptools/warpforge/
Catalogs:
https://catalog.warpsys.org/
https://github.com/warptools/catalog/
Find links to our chat
in the Community page
on the website!
Zapps:
https://zapps.app/
lil' ol me: @warpfork
# Trashheap
# Dreams of Better
But we actually have some of the upsides of that.
When I read travis.yml or the .github dir:
that's the upside.
I just want that... on localhost.
This isn't just a problem of where you run it.
It's a problem of how much that "where" means.
GitHub CI, Travis, etc... they also have their own caches.
That's why a Dockerfile on localhost isn't really helpful progress: distinct cache.
Nobody's API talks about this, but it's essential to real behaviors.
# Ikke Containers
Putting an @{hash} at the end of an image name is a version lock for shipping the image.
It's not a lockfile for anything that goes into building that image.
# Tech time
[ Workflow ] -- relationship declarations that use names.
[ Catalog ] -- decentralized name database.
[your-templating-here] -- generates Workflows, referencing Catalogs.
This is where version selection happens.
This is where you plug in linters and other reusable process decorations.
# Dreams of Better
"Containers, but more like legos,
less like balls-of-mud"
I want to build a whole decentralized,
low-coordination,
high-sharing
ecosystem.
Local-first.
Offline-ready.
High-productivity.
Works for anyone.
Reproducibly.
... and I'm going to need a lot of hashes to do it.
It's just that we need to break the problem down well to get there.
# Hi!
This is a story of my journey
to try to bring sanity to my world
to make things comprehensible
to make anything I built once...
be buildable again.
And also, just maybe, do it in a system I can use for anything.
Okay, also: be sharable. With anyone.
# Case Studies
- there's a file that says what you want
- the order of that file generally doesn't matter; it's declarative, and all requests are commutative
- (in good systems) there's a lockfile
tl;dr: baking big balls of mud suxors.
isolation is good, but getting a bucket of mud from outside and dumping it in as the first step... kinda ruins the point.
in neither of these does signing show up as important.
it's there somewhere. but it's tacked on way at the end.
this is correct. signing becomes a lot less necessary when you do hashing throughout the protocol.