Build Anything

Decentralized supply chain management to

with

Warpforge

and the power of the hash

a build tool

collaborative, decentralized --
information designed to be shared.

Work together, but not in lockstep.

and supply chains

... that you put other build tools in, to universalize things.

for everyone.

... secured by hash trees.

Warpforge

Hi, I'm Eric

# Hi!
  • I build software.
  • A lot of software.

And I want that process to be good.

I like to build stuff to last.

I like to build stuff that other people can get into their hands and use easily.

In this talk, I'm going to:

# Hi!
  • Introduce a piece of software that solves problems
    around building and distributing software.
     
  • ...but first, spend a lot of time on the motivations.

Some of the problems we're about to try to solve are difficult, and are double tricky because they have half-solutions that... aren't.

Outline

# Hi!
  • Norms now
  • What should things be like?
  • Existing inspirational projects
  • Problem solving time!
  • Compare to other projects
  • The future
  • And you!
# Norms
warpforge

Norms now

# Ikke Enkelt
warpforge
  • You just... need to know what the tools are.
    • Totally fine, when you work on lots of stuff!
      • ... right?

Building software is easy, ...right?

# Ikke Enkelt
warpforge
  • You just... need to know what the tools are.
     
  • All these tools... have lockfiles, right?
    • Yeah, definitely, everything's versioned!
      • ... Riiiiight?

Building software is easy, ...right?

# Ikke Enkelt
warpforge
  • You just... need to know what the tools are.
     
  • All these tools... have lockfiles, right?
     
  • Prereqs might include system packages...
    can't automate; like a massive global variable
    • ... okay jokes aside, surely we all admit we're hosed here?

Building software is easy, ...right?

# Ikke Enkelt
warpforge

Knowing what you just built,

that's easy though...




...right?

(lol)

# Ikke Enkelt
warpforge
  • Are you getting logs of what artifact hashes came out?
  • Can you map that back to source hashes?
  • Can you map that back to all the other dependencies and compilers used?
  • Can you reproduce any of that setup?
  • Can you even tell if any of it is deterministic?

Knowing what you just built
is a madhouse

# Ikke Enkelt
warpforge

This is out of control.

...</sarcasm>

# Ikke Enkelt
warpforge

This is out of control.

It's unproductive.

And it's insecure.

# Ikke Enkelt
warpforge

What are our best (current) defenses?

- Blood, sweat, and tears -- lots of manual work.

 

- Bake giant monolithic images?

 

- Give up -- we just accept that none of our work can be shared or reproduced easily.

# Ikke Containers
warpforge

Containers solved some of these problems.

... sorta.

# Ikke Containers
warpforge

Containers solved some of these problems.

... sorta.

There's a script to build stuff.
And it is isolated.
That's progress!

# Ikke Enkelt
warpforge

Having a cleanroom is great.

Building a cleanroom, then getting a bucket of mud from outside the door, and dumping it directly into the cleanroom...

...defeats the purpose.

# Ikke Containers
warpforge

Containers help freeze one scene,
once it's built.

They don't help you reproduce it.

They don't help you update it.

They don't help you maintain it.

# Ikke Containers
warpforge

Containers "solved some of these problems".

Except... now we get to have...
 

  • huge images
  • "container registries" -- hooray, a new SPOF.
  • no real progress -- no lockfiles in the melange of systems used to build them.
  • the endless quest for a "perfect base image" (spoiler, you can't win this one).
# Ikke Containers
warpforge

Containers might have actually made things

worse.

Baking huge images together is:

 

  • Hard to explain afterwards
  • Hard to upstream fixes
  • Hard to share

 

This image baking idea actually made it harder to work together.

# Ikke Containers
warpforge

People slap signing on top of these things...

What are you signing?

# Ikke Enkelt
warpforge

Our norms are:

We're just slamming things together
like cavepeople.

# Stopgaps Suck
warpforge

Our best (current) defenses...

Containers, CI...

None of them fix the underlying problems.

And they make it harder for people to collaborate!?!

# Stopgaps Suck
warpforge

Our best (current) defenses...

should indicate to us:

solving this piecemeal is not an option.

# Stopgaps Suck
warpforge

Our best (current) defenses...

should indicate to us:

we need tools that guide
good behaviors.

# Stopgaps Suck
warpforge

Our best (current) defenses...

should indicate to us:

snapshotting is necessary
but not sufficient.

* (and also, "layers" are totally useless.)

# Dreams of Better
warpforge

What should things
be like?

# Dreams of Better
warpforge

I want:

A tool that's intensely predictable.

Give me total supply chain control.

Give me isolated cleanrooms.

Every step: deterministic.

# Dreams of Better
warpforge

I want:

A tool that controls the supply chain.

Every input must be versioned.
Predictable results are important.
Predictable results ain't gonna happen
if we can't get predictable inputs first.

# Dreams of Better
warpforge

I want:

A tool that works for anything.

Bring together other tools.
Let them do what they do best.

# Dreams of Better
warpforge

I want:

A tool that can assemble things freely.

I have source code.
I have compilers.
I have shells and environments.

Let me assemble them,
without baking them together forever.

# Dreams of Better
warpforge

I want:

A tool that's local-first.

Give me something decentralized (like git).
 

Something that can push and pull,
without centralized hubs.

# Dreams of Better
warpforge

I want:

A tool that's social.

I want to exchange work with others.

In the open.

Not just in monorepos.
Not just hucking images over the wall.

# Dreams of Better
warpforge

I want:

a determinism-detector.

If you can't measure it,
you can't improve it.

So let's start measuring!

I want hashes everywhere,
so I can see if things are stable.

# Dreams of Better
warpforge

I want:

Enough ease-of-use

that this tooling can become

the path of least resistance.

# Case Studies
warpforge

Let's talk about
some stuff that's
good

# Case Studies
warpforge

Two influences I want to riff from:

  • Git
     
  • Make

Let's talk about
some stuff that's good

# Case Studies
warpforge

let's reflect on
git.

What can inspire us?

# Case Studies
warpforge

let's reflect on
git.

(it's not even a build tool; you're kidding, right?)

# Case Studies
warpforge

Git is beautiful local-first.
It's hugely productive because of it.

 

Git is also the clearest pointer for why
a good system needs hashes.

 

Hashes mean you can always talk about exact contents, even without names.
 

You can add names *onto* this later,
and that doesn't take anything away;
but hashes first makes it all stronger.
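Git's trick fits in a few lines. This really is how git names a blob: SHA-1 over a tiny header plus the exact bytes, so identical content always gets the identical ID, names or no names:

```python
import hashlib

def blob_id(content: bytes) -> str:
    # git prepends "blob <length>\0" and SHA-1s the result
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

blob_id(b"hello world\n")
# -> "3b18e512dba79e4c8300dd08aeb37f8e728b8dad"  (same as `git hash-object`)
```

Branch names and tags are just labels layered on top of these IDs, which is exactly the "names onto hashes, not instead of hashes" structure.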

# Case Studies
warpforge

let's reflect on
make.

What can inspire us?

# Case Studies
warpforge

let's reflect on
make.

  • It works with anything.
     
  • It's a graph of targets.
     
  • If we did it again today:
    • Maybe not a bespoke DSL...
    • External dependency
      management is needed.
# Case Studies
warpforge

So from these two good influences:

  • Git
  • Make

Can we take the best parts of these,

do it with container isolation,

make a nice programmable API,

and get something awesome?

# Problem Solving!
warpforge

Let's do it!

# Problem Solving!
warpforge

Let's make a build and
supply chain manager

that makes our lives
easier

by being predictable.

# Problem Solving!
warpforge

Warpforge is a build tool.

$ cd myprojectdir
$ warpforge run
# Problem Solving!
warpforge

Warpforge runs declarative scripts

$ cat module.wf | grep "action" -A9
    ...
    "action": {
      "script": {
        "interpreter": "/bin/bash"
        "script": [
          "echo whee",
          "echo woo"
        ]
      }
    }
# Problem Solving!
warpforge

Warpforge runs graphs of scripts

$ cat module.wf | grep "multi" -A9 -B1
    ...
    "action": {
      "multi": {
        "stepA": { ...another action... },
        "stepB": { ...another action... },
        "stepC": { ...another action... }
      }
    }
# Problem Solving!
warpforge

Warpforge gets
inputs by hash

$ cat module.wf | grep "input" -A9
    ...
    "inputs": {
      "/app/bash": "tar:2nSYg68pkhwmpBfYBrGt6bAAzAGbtUSjGJbYNFiJxkRgJX6dQdJQWrA68FWaSWg2zD",
      "/app/go":   "tar:6ATn28CVxaUH1nXGC3QGspqjexYh1rRjyMYByMVi7HqZ2wexuF9RVk6otMPjnvjJXa",
      "/task/src": "git:b951b7a3388ea4fad7ca5cb9d177ff02b7e9039a",
      "/mount/anything": "tar:XGC3QGspqjexYh1rRjyMYByMVi7HqZ2wexuF9RVk6otMPjnvjJXa",
      "/compose/easily": "tar:HqZ2wexuF9RVk6otMPjnvjJXaXGC3QGspqjexYh1rRjyMYByMVi7",
    }
# Problem Solving!
warpforge

Warpforge gets
inputs by hash

$ cat module.wf | grep "input" -A9
    ...
    "inputs": {
      "/app/bash": "tar:2nSYg68pkhwmpBfYBrGt6bAAzAGbtUSjGJbYNFiJxkRgJX6dQdJQWrA68FWaSWg2zD",
      "/app/go":   "tar:6ATn28CVxaUH1nXGC3QGspqjexYh1rRjyMYByMVi7HqZ2wexuF9RVk6otMPjnvjJXa",
      "/task/src": "git:b951b7a3388ea4fad7ca5cb9d177ff02b7e9039a",
      "/mount/anything": "tar:XGC3QGspqjexYh1rRjyMYByMVi7HqZ2wexuF9RVk6otMPjnvjJXa",
      "/compose/easily": "tar:HqZ2wexuF9RVk6otMPjnvjJXaXGC3QGspqjexYh1rRjyMYByMVi7",
    }

Wait wut

i thought you said you wanted
"easy to use"

# Problem Solving!
warpforge

... and yes, that
creates some ease-of-use
challenges.

Warpforge gets
inputs by hash

# Problem Solving!
warpforge

... and yes, that
creates some ease-of-use
challenges.

So we have to bridge that gap.
We need some API layers:
easy<->precise.

Warpforge gets
inputs by hash

# Problem Solving!
warpforge

Warpforge supports name lookups too

$ cat module.wf | grep "input" -A9
  ...
  "inputs": {
    "/app/bash": "catalog:bash:v4.12:linux-amd64-zapp",
    "/app/go":   "catalog:go:v1.19:linux-amd64",
    "/task/src": "catalog:myproject:v1000:src",
  }

... and fear not:
reproducible, decentralized, local-first.

# Problem Solving!
warpforge

Warpforge supports name lookups too

... and fear not:
reproducible, decentralized, local-first.

How?!

tl;dr: think like git branch names referring to hashes...
except we put that in a merkle tree itself, so it's versioned and you can fork it.
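A toy sketch of that idea, with made-up WareIDs and a plain JSON hash standing in for warpforge's real codec:

```python
import hashlib
import json

def h(obj) -> str:
    # hash a canonical JSON encoding of obj
    # (illustrative codec; warpforge's real hashing is its own scheme)
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

# leaf level: release names -> content hashes (hypothetical WareIDs)
releases = {"v1.19": "tar:6ATn28...", "v1.20": "tar:9QmWx..."}

# interior node: module names -> the hash of their release table
catalog = {"go": h(releases)}

# one root hash identifies the entire name table, so lookups are
# verifiable, and "forking" is just publishing a different root
catalog_id = h(catalog)
```

Because every lookup can be checked against the root hash, name resolution stays reproducible even though it's name-based.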

# Problem Solving!
warpforge

Warpforge saves whatever you want

$ cat module.wf | grep "outputs" -A9
    ...
    "outputs": {
      "yourlabel": {
        "from": "/this/path",
        "packtype": "tar"
      },
      "several": {
        "from": "/sure/youbetcha",
        "packtype": "tar"
      }
    }

you're not stuck with full system images!

# Tech time
warpforge

So let's get technical.

# Tech time
warpforge

So let's get technical.

[ WareID ]

at the bottom I'm going to need a

filesystem snapshot hash

# Tech time
warpforge

So let's get technical.

[ WareID ]

filesystem snapshot hash

[ Formula ]

Then, some
task declarations.

These can use
only hashes.

# Tech time
warpforge

So let's get technical.

[ WareID ]

filesystem snapshot hash

[ Formula ]

task declarations
... with only hashes.

[ Workflow ]

Now, to really get things done, we need to refer to other work.

So it's time for
relationship declarations,

and these now
use names.

# Tech time
warpforge

So let's get technical.

[ WareID ]

filesystem snapshot hash

[ Formula ]

task declarations
... with only hashes.

[ Workflow ]

relationship declarations
that use names.

# Tech time
warpforge

So let's get technical.

[ WareID ]

filesystem snapshot hash

[ Formula ]

task declarations
... with only hashes.

[ Workflow ]

relationship declarations
that use names.

[ Catalog ]

Names are tricky.
We want reproducible resolve.

... so we're going to make a
decentralized name database
using merkle trees.

# Tech time
warpforge

So let's get technical.

[ WareID ]

filesystem snapshot hash

[ Formula ]

task declarations
... with only hashes.

[ Workflow ]

relationship declarations
that use names.

[ Catalog ]

decentralized name database

[ buildplugs ]

Template reusable parts of workflows!
Use any templating language!
It just has to emit plain JSON objects.

# Tech time
warpforge

Why so many layers?

[ WareID ]

Duh; of course we need data snapshots.

[ Formula ]

You can hash 'em.  Perfect memoization;
and a debugging dream.

[ Workflow ]

You can save these.
These are rebuild instructions.

[ Catalog ]

Solves name lookup.
Is how we organize sharing!

[ buildplugs ]

Is where the user has power.

# Tech time
warpforge

Why so many layers?

L0: [ WareID ]

Duh; of course we need data snapshots.

L1: [ Formula ]

You can hash 'em.  Perfect memoization;
and a debugging dream.

L2: [ Workflow ]

You can save these.
These are rebuild instructions.

L3+: [ Catalog ]

Solves name lookup.
Is how we organize sharing!

[ buildplugs ]

Is where the user has power.

# It's a deep stack
warpforge

Warpforge lets you write high level stuff

And produces tons of hashed,
auditable checkpoints
for every step described.
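That "perfect memoization" claim can be sketched concretely. The hash function and cache shape here are illustrative stand-ins, not warpforge's actual internals:

```python
import hashlib
import json

def formula_id(formula: dict) -> str:
    # hash the formula's canonical JSON form
    # (sketch; warpforge's real IDs come from its own codec)
    return hashlib.sha256(json.dumps(formula, sort_keys=True).encode()).hexdigest()

runs = 0

def execute(formula: dict) -> dict:
    # stand-in for actually running the containerized action
    global runs
    runs += 1
    return {"out": "tar:..."}

cache: dict = {}

def run(formula: dict) -> dict:
    fid = formula_id(formula)
    if fid not in cache:               # same hashed inputs => same outputs,
        cache[fid] = execute(formula)  # so a cache hit means no rebuild at all
    return cache[fid]

f = {"inputs": {"/": "ware:tar:abc..."}, "action": {"script": {}}}
run(f)
run(f)  # second call is a cache hit; execute() ran only once
```

Because formulas contain only hashes, the memo key covers *everything* that went into the build, which is also what makes the checkpoints auditable.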

# Tech time
warpforge
Let's see it:
Formulas
{
  "formula": {
    "inputs": {
      "/": "ware:tar:4z9DCTxoKkStqXQRwtf9nimpfQQ36dbndDsAPCQgECfbXt3edaE9TkX2v9"
    },
    "action": {
      "script": {
        "interpreter": "/bin/sh",
        "contents": [
          "MESSAGE='hello, this is a script action'",
          "echo $MESSAGE",
          "mkdir /out && echo $MESSAGE > /out/log"
          "echo done!"
        ]
      }
    },
    "outputs": {
      "test": {
        "from": "/out",
        "packtype": "tar"
      }
    }
  },
  "context": {
    "context.v1": {
      "warehouses": {
        "tar:4z9DCTxoKkStqXQRwtf9nimpfQQ36dbndDsAPCQgECfbXt3edanUrsVKCjE9TkX2v9": "https://warpsys.s3.amazonaws.com/warehouse/4z9/DCT/4z9DCTxoKkStqXQRwtf9nimpfQQ36dbndDsAPCQgECfbXt3edanUrsVKCjE9TkX2v9"
      }
    }
  }
}

This is the bottom.

It's just JSON.

You can see hashes going in...

{
  "guid": "cb351d5f-9b85-4404-aec9-b54cb71d249c",
  "time": 1634850353,
  "formulaID": "zM5K3Xv31xGHCtyjFXjV4kNvdMoXCdamJq5t7DvHhSuKomTuQ85QampAi4kayHuA7sRbWNh",
  "exitcode": 0,
  "results": {
    "test": "ware:tar:3vmwry1wdxQjTaCjmoJnvGbdpg9ucTvCpWzGzvtujbLQSwvPPAECTm3YxrsHnERtzg"
  }
}

And here's the JSON you get out!

# Tech time
warpforge
Let's see it:
Workflows
{
  "inputs": {
    "glibc": "catalog:warpsys.org/bootstrap/glibc:v2.35:amd64",
    "ld": "catalog:warpsys.org/bootstrap/glibc:v2.35:ld-amd64",
    "ldshim": "catalog:warpsys.org/bootstrap/ldshim:v1.0:amd64",
    "make": "catalog:warpsys.org/bootstrap/make:v4.3:amd64",
    "gcc": "catalog:warpsys.org/bootstrap/gcc:v11.2.0:amd64",
    "grep": "catalog:warpsys.org/bootstrap/grep:v3.7:amd64",
    "coreutils": "catalog:warpsys.org/bootstrap/coreutils:v9.1:amd64",
    "binutils": "catalog:warpsys.org/bootstrap/binutils:v2.38:amd64",
    "sed": "catalog:warpsys.org/bootstrap/sed:v4.8:amd64",
    "gawk": "catalog:warpsys.org/bootstrap/gawk:v5.1.1:amd64",
    "busybox": "catalog:warpsys.org/bootstrap/busybox:v1.35.0:amd64",
    "src": "catalog:warpsys.org/bash:v5.1.16:src"
  },
  "steps": {
    "build": {
      "protoformula": {
        "inputs": {
          "/src": "pipe::src",
          "/lib64": "pipe::ld",
          "/pkg/glibc": "pipe::glibc",
          "/pkg/make": "pipe::make",
          "/pkg/coreutils": "pipe::coreutils",
          "/pkg/binutils": "pipe::binutils",
          "/pkg/gcc": "pipe::gcc",
          "/pkg/sed": "pipe::sed",
          "/pkg/grep": "pipe::grep",
          "/pkg/gawk": "pipe::gawk",
          "/pkg/busybox": "pipe::busybox",
          "$PATH": "literal:/pkg/make/bin:/pkg/gcc/bin:/pkg/coreutils/bin:/pkg/binutils/bin:/pkg/sed/bin:/pkg/grep/bin:/pkg/gawk/bin:/pkg/busybox/bin",
          "$CPATH": "literal:/pkg/glibc/include:/pkg/glibc/include/x86_64-linux-gnu"
        },
        "action": {
          "script": {
            "interpreter": "/pkg/busybox/bin/sh",
            "contents": [
              "mkdir -p /bin /tmp /prefix /usr/include/",
              "ln -s /pkg/glibc/lib /prefix/lib",
              "ln -s /pkg/glibc/lib /lib",
              "ln -s /pkg/busybox/bin/sh /bin/sh",
              "ln -s /pkg/gcc/bin/cpp /lib/cpp",
              "cd /src/*",
              "mkdir -v build",
              "cd build",
              "export SOURCE_DATE_EPOCH=1262304000",
              "../configure --prefix=/warpsys-placeholder-prefix LDFLAGS=-Wl,-rpath=XORIGIN/../lib ARFLAGS=rvD",
              "make",
              "make DESTDIR=/out install",
              "sed -i '0,/XORIGIN/{s/XORIGIN/$ORIGIN/}' /out/warpsys-placeholder-prefix/bin/*"
            ]
          }
        },
        "outputs": {
          "out": {
            "from": "/out/warpsys-placeholder-prefix",
            "packtype": "tar"
          }
        }
      }
    },
    "pack": {
      "protoformula": {
        "inputs": {
          "/pack": "pipe:build:out",
          "/pkg/glibc": "pipe::glibc",
          "/pkg/ldshim": "pipe::ldshim",
          "/pkg/busybox": "pipe::busybox",
          "$PATH": "literal:/pkg/busybox/bin"
        },
        "action": {
          "script": {
            "interpreter": "/pkg/busybox/bin/sh",
            "contents": [
              "mkdir -vp /pack/lib",
              "mkdir -vp /pack/dynbin",
              "cp /pkg/glibc/lib/libc.so.6 /pack/lib",
              "cp /pkg/glibc/lib/libdl.so.2 /pack/lib",
              "cp /pkg/glibc/lib/libm.so.6 /pack/lib",
              "mv /pack/bin/bash /pack/dynbin",
              "cp /pkg/ldshim/ldshim /pack/bin/bash",
              "cp /pkg/glibc/lib/ld-linux-x86-64.so.2 /pack/lib",
              "rm -rf /pack/lib/bash /pack/lib/pkgconfig /pack/include /pack/share"
            ]
          }
        },
        "outputs": {
          "out": {
            "from": "/pack",
            "packtype": "tar"
          }
        }
      }
    },
    "test-run": {
      "protoformula": {
        "inputs": {
          "/pkg/bash": "pipe:pack:out"
        },
        "action": {
          "exec": {
            "command": [
              "/pkg/bash/bin/bash",
              "--version"
            ]
          }
        },
        "outputs": {}
      }
    }
  },
  "outputs": {
    "amd64": "pipe:pack:out"
  }
}

This is the midlevel.

It's just JSON.

(This is a real workflow, for compiling and packaging bash.)

# Tech time
warpforge
Let's see it:
Workflows
{
  "inputs": {
    "glibc": "catalog:warpsys.org/bootstrap/glibc:v2.35:amd64",
    "ld": "catalog:warpsys.org/bootstrap/glibc:v2.35:ld-amd64",
    "ldshim": "catalog:warpsys.org/bootstrap/ldshim:v1.0:amd64",
    "make": "catalog:warpsys.org/bootstrap/make:v4.3:amd64",
    "gcc": "catalog:warpsys.org/bootstrap/gcc:v11.2.0:amd64",
    "grep": "catalog:warpsys.org/bootstrap/grep:v3.7:amd64",
    "coreutils": "catalog:warpsys.org/bootstrap/coreutils:v9.1:amd64",
    "binutils": "catalog:warpsys.org/bootstrap/binutils:v2.38:amd64",
    "sed": "catalog:warpsys.org/bootstrap/sed:v4.8:amd64",
    "gawk": "catalog:warpsys.org/bootstrap/gawk:v5.1.1:amd64",
    "busybox": "catalog:warpsys.org/bootstrap/busybox:v1.35.0:amd64",
    "src": "catalog:warpsys.org/bash:v5.1.16:src"
  },
  "steps": {
    "build": {
      "protoformula": {
        "inputs": {
          "/src": "pipe::src",
          "/lib64": "pipe::ld",
          "/pkg/glibc": "pipe::glibc",
          "/pkg/make": "pipe::make",
          "/pkg/coreutils": "pipe::coreutils",
          "/pkg/binutils": "pipe::binutils",
          "/pkg/gcc": "pipe::gcc",
          "/pkg/sed": "pipe::sed",
          "/pkg/grep": "pipe::grep",
          "/pkg/gawk": "pipe::gawk",
          "/pkg/busybox": "pipe::busybox",
          "$PATH": "literal:/pkg/make/bin:/pkg/gcc/bin:/pkg/coreutils/bin:/pkg/binutils/bin:/pkg/sed/bin:/pkg/grep/bin:/pkg/gawk/bin:/pkg/busybox/bin",
          "$CPATH": "literal:/pkg/glibc/include:/pkg/glibc/include/x86_64-linux-gnu"
        },
        "action": {
          "script": {
            "interpreter": "/pkg/busybox/bin/sh",
            "contents": [
              "mkdir -p /bin /tmp /prefix /usr/include/",
              "ln -s /pkg/glibc/lib /prefix/lib",
              "ln -s /pkg/glibc/lib /lib",
              "ln -s /pkg/busybox/bin/sh /bin/sh",
              "ln -s /pkg/gcc/bin/cpp /lib/cpp",
              "cd /src/*",
              "mkdir -v build",
              "cd build",
              "export SOURCE_DATE_EPOCH=1262304000",
              "../configure --prefix=/warpsys-placeholder-prefix LDFLAGS=-Wl,-rpath=XORIGIN/../lib ARFLAGS=rvD",
              "make",
              "make DESTDIR=/out install",
              "sed -i '0,/XORIGIN/{s/XORIGIN/$ORIGIN/}' /out/warpsys-placeholder-prefix/bin/*"
            ]
          }
        },
        "outputs": {
          "out": {
            "from": "/out/warpsys-placeholder-prefix",
            "packtype": "tar"
          }
        }
      }
    },
    "pack": {
      "protoformula": {
        "inputs": {
          "/pack": "pipe:build:out",
          "/pkg/glibc": "pipe::glibc",
          "/pkg/ldshim": "pipe::ldshim",
          "/pkg/busybox": "pipe::busybox",
          "$PATH": "literal:/pkg/busybox/bin"
        },
        "action": {
          "script": {
            "interpreter": "/pkg/busybox/bin/sh",
            "contents": [
              "mkdir -vp /pack/lib",
              "mkdir -vp /pack/dynbin",
              "cp /pkg/glibc/lib/libc.so.6 /pack/lib",
              "cp /pkg/glibc/lib/libdl.so.2 /pack/lib",
              "cp /pkg/glibc/lib/libm.so.6 /pack/lib",
              "mv /pack/bin/bash /pack/dynbin",
              "cp /pkg/ldshim/ldshim /pack/bin/bash",
              "cp /pkg/glibc/lib/ld-linux-x86-64.so.2 /pack/lib",
              "rm -rf /pack/lib/bash /pack/lib/pkgconfig /pack/include /pack/share"
            ]
          }
        },
        "outputs": {
          "out": {
            "from": "/pack",
            "packtype": "tar"
          }
        }
      }
    },
    "test-run": {
      "protoformula": {
        "inputs": {
          "/pkg/bash": "pipe:pack:out"
        },
        "action": {
          "exec": {
            "command": [
              "/pkg/bash/bin/bash",
              "--version"
            ]
          }
        },
        "outputs": {}
      }
    }
  },
  "outputs": {
    "amd64": "pipe:pack:out"
  }
}

If we zoom in a bit...

This is what catalog references look like.

This is referencing other workflow step outputs.

# Tech time
warpforge
Let's see it:
Buildplugs
load("../lib.star", "plot")
load("../lib.star", "gnu_build_step")
load("../lib.star", "zapp_pack_step")

step_build = gnu_build_step(
    src=("warpsys.org/bash", "v5.1.16", "src"),
    script="""cd /src/*
    ./configure --prefix=/warpsys-placeholder-prefix 
    make
    make DESTDIR=/out install""")
step_pack = zapp_pack_step(
    binaries=["bash"], 
    libraries=[
        ("warpsys.org/bootstrap/glibc", "libc.so.6"),
        ("warpsys.org/bootstrap/glibc", "libdl.so.2"),
        ("warpsys.org/bootstrap/glibc", "libm.so.6"),
    ],
    extra_script="rm -rf /pack/lib/bash /pack/lib/pkgconfig /pack/include /pack/share")

result = plot(steps={"build": step_build, "pack": step_pack})

This is a buildplug, using Starlark.

It generates the workflow on the last slide.

You can see how this is leverage!

# Tech time
warpforge
Let's see it:
Catalogs
./warpsys.org/texinfo
./warpsys.org/texinfo/_releases
./warpsys.org/texinfo/_releases/v6.8.json
./warpsys.org/texinfo/_module.json
./warpsys.org/texinfo/_mirrors.json
./warpsys.org/texinfo/_replays
./warpsys.org/texinfo/_replays/zM5K3YjS3FJ3KTXgLoa1bpjRLCth134M1AkrDCXw5VPjq9bKgSrkSsGzAocLfELYVH6sAii.json
./warpsys.org/bash
./warpsys.org/bash/_releases
./warpsys.org/bash/_releases/v5.1.16-2.json
./warpsys.org/bash/_releases/v5.1.16.json
./warpsys.org/bash/_module.json
./warpsys.org/bash/_mirrors.json
./warpsys.org/bash/_replays
./warpsys.org/bash/_replays/zM5K3Vgei44et6RzTA785sEZGwuFV75vCazjhR11RH5veFdMTx7F5cg2c4NA5HXPK8Zv5TQ.json
./warpsys.org/bash/_replays/zM5K3aMARrWToyXjaFxxxWmYU7dZUmYp7ir5hDQtzDi2LCGPtw9PNVch9DTts9ApRyPSacJ.json
./warpsys.org/make
./warpsys.org/make/_releases
./warpsys.org/make/_releases/v4.3.json

Catalogs are a series of files.

Catalog ->
modules ->
releases ->
items.

A "c:m:r:i" tuple
resolves to a WareID.  Or to metadata about it.

📦️
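A resolver over that on-disk layout fits in a few lines. Note the `items` field name is a guess for illustration, not the real release-file schema:

```python
import json
import pathlib

def resolve(catalog_root: str, module: str, release: str, item: str) -> str:
    # follow the layout shown above:
    # <root>/<module>/_releases/<release>.json holds that release's items
    release_file = (pathlib.Path(catalog_root) / module
                    / "_releases" / (release + ".json"))
    data = json.loads(release_file.read_text())
    return data["items"][item]  # -> a WareID
```

So a "c:m:r:i" tuple is really just a path into plain files, which is part of why catalogs are easy to mirror and fork.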

# Tech time
warpforge
Let's see it:
Catalogs

Catalogs are a merkle tree.

The logical structure looks sorta like this.

📦️

(Reminiscent of git)

# Tech time
warpforge
Let's see it:
Catalogs

Catalogs point at the content snapshot ID
(the "WareID")

... but they also point at a copy of the Workflow used to build it.

📦️

# Sharing & Caring
warpforge

So about that social angle.

I want to work in public.

I want to share work and collaborate.

Even outside my team.
Outside my org and workplace.

How?

# Sharing & Caring
warpforge

Catalogs are checkpoints for collaboration.

# Sharing & Caring
warpforge

Catalogs are checkpoints for collaboration.

# Sharing & Caring
warpforge

Catalogs are checkpoints for collaboration.

Build instructions point at Catalogs,
and Catalogs point at both content IDs,
and rebuild instructions.

That means building your dependencies recursively is optional --
It's possible, but not required.

# Zapps
warpforge

Hang on.

How are you mounting programs at any random path you want?

Programs don't work that way.

You can't just mount them wherever.

# Zapps
warpforge

Hang on.

How are you mounting programs at any random path you want?

Programs don't work that way.

You can't just mount them wherever.

Yes, we can.

# Zapps
warpforge

Let me tell you about my plan
to destroy all distros

bring about the year
of Linux on the Desktop


Make things easier
by making them less entangled.

# Zapps
warpforge

They work anywhere.
Any path.
No install hooks.
Even on read-only mounts.
 

This means you can solve building&packaging once...

And ship anywhere.

Long story short:

Zapps are a way to package linux software.

# Zapps
warpforge

Okay, sorry, this talk isn't about that.
Not enough time.

Check out the website.
Or prior talks!

# Comparable projs
warpforge

Surely this isn't the first
time this is tried.

Okay, there's a few almost-similar projects.

# Comparable projs
warpforge

Warpforge is a bit like Bazel.

And a bit like Nix/Guix.

# Comparable projs
warpforge

Warpforge is a bit like Bazel.

Except... Remember this diagram?

And a bit like Nix/Guix.

# Comparable projs
warpforge

Here's what that diagram is like in
Nix/Bazel/Guix/etc:

# Comparable projs
warpforge

Catalogs and
Content-Addressing
make Warpforge
distinctive.

Warpforge can
choose to build or fetch ... on the fly.

In other systems...
that's a rewrite of your build instructions.

# Comparable projs
warpforge

Warpforge can describe things that build themselves.

Bazel, Nix, Guix, etc...
can't do this,
can't even describe this.

Their build instructions only depend on other build instructions,
not on actual content.

# Comparable projs
warpforge

Warpforge vs Bazel

We hope Warpforge can succeed in building bigger ecosystems, where Bazel tends to only survive in monorepos.

Warpforge:
  • Content-addressed checkpoints -> Shareable.
    Rebuild instructions rigorous but optional.
  • Simply executes other things.

Bazel:
  • Based only around building,
    so every team using it does their own bootstrapping.
    Encourages monorepos.
  • Magic built-ins for what it supports.

# Comparable projs
warpforge

Warpforge vs Nix

Warpforge aims for easier use, and more decentralization.

Warpforge:
  • Plain JSON API.
    Or: Python-like syntax, if you want it.
    Or: Bring your own templating.
  • Content-addressed checkpoints -> Shareable.
    Rebuild instructions rigorous but optional.

Nix:
  • Based only around building.
    (Similar story to Bazel.)
  • Nix uses the Nix language.
    I cannot comment on this.
    (But many other people have...)

# Comparable projs
warpforge

Warpforge vs Nix

Warpforge aims to end the distro wars.

Not just build another one.

Warpforge introduces Zapps.

These make packages you build with Warpforge+Zapps instantly usable in any other Linux environment.

Nix builds NixOS.


NixOS has a series of paths starting with `/nix/` and
is a whole distro.

# Comparable projs
warpforge

What can we see from these projects?

  1. It is possible to take an ambitious stance,
    and deliver on it.
  2. Offering a plain JSON API is (surprisingly)
    going to be a novel feature in this space.
  3. The real UX challenge is making it possible
    for people to collaborate... without ending
    up stuck working together in lockstep.

We've tried to learn from this in Warpforge, and earnestly believe we offer improvements.

# Future Work
warpforge

The Future

# Future Work
warpforge

We've already shipped:

  • Core tools that work!
     
  • Several dozen real packages!
     
  • A prototype of
    Starlark, as a batteries-included templating system!
     
  • Catalogsite!

We're gonna need:

  • many more packages!
     
  • bridges to existing systems
     
  • ecosystem tools like auto-rebuilders and version bumpers
# Getting Involved
warpforge

Okay, I'm in!
How can I help?

Gosh, I'm glad you asked.

# Getting Involved
warpforge

Technology is great

We have to:

  • Meet people where they're at.
  • Be easy to extend when we're
    not there yet!
  • Build a community that pushes
    forward!

but it's not enough

# Getting Involved
warpforge

Here's where we made Warpforge ready to extend

# Getting Involved
warpforge

"tar:asdfqwer"

^ Ware/filesystem packing/hashing/transporting is pluggable.

The string before the ":" is the discriminator.
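Parsing that discriminator is the easy part; a minimal sketch (handling just the simple `packtype:hash` form shown here):

```python
def parse_ware_id(ware_id: str) -> tuple:
    # the string before the first ":" selects the
    # packing/hashing/transporting plugin
    packtype, sep, digest = ware_id.partition(":")
    if not sep or not digest:
        raise ValueError("not a WareID: " + repr(ware_id))
    return (packtype, digest)

parse_ware_id("tar:asdfqwer")  # -> ("tar", "asdfqwer")
parse_ware_id("git:b951b7a3")  # -> ("git", "b951b7a3")
```

A new pack format just claims a new discriminator string; nothing upstream of the plugin has to change.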

# Getting Involved
warpforge

{"action":{"script":{...

^ The "script" keyword here is another plugin point.

We can introduce other "executors" and trigger them with other keywords in this position.

# Getting Involved
warpforge

Buildplugs!

The Starlark system we're shipping can load libraries.


Also, you can bring your own entire templating engine.  Can it emit JSON on stdout?  Then it'll do.
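For instance, any little program that prints a workflow object qualifies as a generator. This sketch copies the field shapes from the workflow examples earlier in the talk; treat the helper and its exact fields as illustrative:

```python
import json
import sys

def step(script_lines, inputs, outputs):
    # hypothetical helper producing one protoformula-shaped step
    return {"protoformula": {
        "inputs": inputs,
        "action": {"script": {
            "interpreter": "/pkg/busybox/bin/sh",
            "contents": script_lines}},
        "outputs": outputs}}

workflow = {
    "inputs": {"src": "catalog:warpsys.org/bash:v5.1.16:src"},
    "steps": {"build": step(
        ["cd /src/*", "make"],
        {"/src": "pipe::src"},
        {"out": {"from": "/out", "packtype": "tar"}})},
    "outputs": {"result": "pipe:build:out"}}

# plain JSON on stdout: that's the whole contract
json.dump(workflow, sys.stdout, indent=2)
```

Jinja, a shell script with heredocs, or a compiled program would satisfy the same contract equally well.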

# Getting Involved
warpforge

Bifrosts!

Many existing ecosystems exist and have lots of content.

Let's build tools to import their content IDs into catalogs.

Let's build tools to import their lockfiles into workflows.

# Getting Involved
warpforge

Packaging!

Just by joining us in building stuff
with Warpforge,
you're helping!

# Getting Involved
warpforge

Other ecosystem tools!

Autorebuilders,
CI integrations,
dashboards,
etc!

# Getting Involved
warpforge

Here's how you can help:

If you've got time and will hack:

Now you see the extension points!

Come!  Build a bifrost!
Or hack on the core!
Or package stuff!
Or help write docs!

If you've got money but not time:

Warpforge is a 501c3.
We can accept charitable contributions and make sure it supports other people to contribute their time and skills.

# Getting Involved
warpforge

If you build in Warpforge,
and package with Zapps...

You can immediately use the results anywhere else.

Reminder:

# Getting Involved
warpforge

Here's all the links:

Docs:
http://warpforge.io/

Code:
https://github.com/warptools/warpforge/

Catalogs:

https://catalog.warpsys.org/
https://github.com/warptools/catalog/

Find links to our chat
in the Community page
on the website!

Zapps:
https://zapps.app/

lil' ol me: @warpfork

trashheap

 

# Dreams of Better
warpforge

"one tool the rule them all"
... is probably a bit silly.

But we actually have some of the upsides of that.

When I read .travis.yml or the .github dir:
that's the upside.

I just want that... on localhost.

# Dreams of Better
warpforge

This isn't just a problem of where you run it.

It's a problem of how much that "where" means.

Github CI, travis, etc... they also have their own caches.

That's why a Dockerfile on localhost isn't really helpful progress.  Distinct cache.

Nobody's API talks about this, but it's essential to real behaviors.

# Ikke Containers
warpforge

(Hang on -- what's a "lockfile"?)

  • Contains specific versions -- it's after version selection processes.
  • Usually with hashes -- not meant to change.
  • Typically exists in tandem with some other spec that says what to select.
# Ikke Containers
warpforge

(Hang on -- what's a "lockfile"?)

Putting an @{hash} at the end of an image name is a version lock for shipping the image.
 

It's not a lockfile for anything that goes into building that image.

And what do you mean containers don't have them?

# Tech time
warpforge

So let's get technical.

[your-templating-here]

Generate

[ Workflow ]

relationship declarations
that use names,

referencing

[ Catalog ]

decentralized name database

This is where version selection happens.

This is where you plug in linters and other reusable process decorations.

# Dreams of Better
warpforge

"Containers, but more like legos,
less like balls-of-mud"

# Dreams of Better
warpforge

I want to build a whole decentralized,

low-coordination,

high-sharing,

ecosystem.

Local-first.

Offline-ready.

High-productivity.

Works for anyone.

Reproducibly.

# Dreams of Better
warpforge

I want to build a whole decentralized,

low-coordination,

high-sharing,

ecosystem.

... and I'm going to need a lot of hashes to do it.

It's just that we need to break the problem down well to get there.

# Ikke Enkelt
warpforge

What can be unpredictable in these scenarios?

  • How online are your tools?
    • Online content fetch?
    • Online name resolution?
  • Are all the tools you use using lockfiles?
  • If there are multiple tools: are those (and their lockfiles) themselves versioned?
  • Are your compilers deterministic?
    • Are you sure it's not doing something conditional upon host libraries or config?
# Tech time
warpforge

What's the scope?

Do:

Don't:

  • build anything.
  • lockfiles for all!
  • focus on collab.
  • give people tools
    to break down
    their problems.
  • work offline,
    and predictably.
  • don't try to replace existing tools -- just give them a sandbox.
  • don't try to replace version selection algos -- too many opinions; we don't have one.

Hi

I'm a software engineer

# Hi!

This is a story of my journey
to try to bring sanity to my world
to make things comprehensible

to make anything I built once...
be buildable again.

And also, just maybe, do it in a system I can use for anything.

Okay, also, and be sharable.  With anyone.

# Case Studies
warpforge

Let's do a quick
case-study of

other build/package tools.

# Case Studies
warpforge

there's a file that says what you want

the order of that file generally doesn't matter; it's declarative, and all requests are commutative

(in good systems) there's a lockfile

# Case Studies
warpforge

And let's do a quick
study of

existing container tools.

# Case Studies
warpforge

tl;dr baking big balls of mud suxors

isolation is good but getting a bucket of mud from outside and dumping it in as the first step... kinda ruins the point.

# Case Studies
warpforge

in neither of these does signing show up as important.

it's there somewhere.  but it's tacked on way at the end.

this is correct.  signing becomes a lot less necessary when you do hashing throughout the protocol.

(sidenote: signing)