warpforge

Build Anything


{fast, repeatedly, hermetically, reliably, with friends}

--

hi
i'm warpfork

github.com/warpfork

twitter.com/warpfork

slides.com/warpfork

# I want...

I have software I want to build


# I want...

I have software I want to build

use!
distribute!  install!

─────

compile!

okay, patch!

then compile and use!

and share..!

# I want...

I want to Build Anything

and

I want to Install Anywhere.

(right now) Everything sucks

Fixing builds
so we can share them

Fixing packaging so we can share things

Fixing software supply chains


warpforge tools
intervene here

warpsys conventions
have hints here

# I want...

Building software is (weirdly) difficult

it shouldn't be

it is
 

# I want...

Building software is (weirdly) difficult

  • it's typical to go look at a project readme...
     
  • manually install dependencies on your computer...
     
  • issue incantations, and pray for success

a fundamentally unscientific process

# I want...

Building software is (weirdly) difficult

it shouldn't be

it is

and god help you when it comes to pursuing reproducibility...

because current tools sure won't help you.

Anecdata

i try to build unigraph from source...

i need to install gcc for one part...

there's a makefile that tries to get jemalloc...

it asks for root??

if you apt install it ahead of time, it's fine

(but this isn't really documented or clear -- you just peek-and-poke try shit and hope)

i try to build unigraph from source...

and i need .... in addition to the source ...

  1. nodejs
  2. ...nvm?  maybe?
  3. yarn
  4. several hundred packages within yarn
  5. wait where did yarn come from?  npm lol
  6. no, kidding, yarn from corepack; corepack from npm
  7. jemalloc
  8. make
  9. gcc
  10. go
  11. another source repo (dgraph)
  12. bash, implicitly
  13. you def used a system pkgman for gcc
  14. how many other system c libraries, implicitly...

(p.s. i'm not picking on unigraph -- it's a super cool project.  this is unfortunately _normal_.)

How many different things that SHOULD be version controlled TOGETHER did I just interact with SEPARATELY?

# I want...

even Installing software is (weirdly) difficult

it shouldn't be

it is

let's take a practical walk through real experiences...

i try to install emacs on ubuntu...

and then i find either...

apt can give me an outdated version...

or flatpak can give me something...

but it's in weird sandboxes that i didn't ask for and don't want...

or i can install a ppa...

from some random person...

and then actually that doesn't work anyway...

also it somehow broke my apt??  oh wow okay, guess i'll reinstall, holy @#$% cool

okay so maybe i'll use guix, that's declarative and should be good right?

first i'll learn a lisp dialect...

then i will CRASH AND BURN and FAIL TO BOOT
because there aren't even placeholder graphics drivers for any of the hardware i have, lol...

(i appreciate the freedom-maxi attitudes but sometimes they really are not helpful, and yes, i gave up guix bc of this, i'm sorry, i really am)

and i try to find solutions on the mailing list...

and they are all deleted and suppressed...

because it would involve discussion of non-free software...

Why can't I just install things?

Why is this harder than unpacking a tarball?

Why are all these install systems trying to own my whole host and be my whole world?

we need

composable systems

# I want...

distros tried to solve this

.... but I don't think any of them really did.

  • Distros have opinions and require buy-in.
    • Something cannot be a universal build tool if it needs to assume command of the whole machine.
  • Practically: library paths are often a "global variable" that leads to conflicts and collisions... then, balkanization.
  • Questionable even as install tools...
    • Often try to make sure you only have one of a thing.  ...Uh, I don't want that?
# I want...

distros tried to solve this

.... but I don't think any of them really did.

There's a difference between
{trying to make everything fit into your pocket universe}
vs
{trying to make things anyone can use in any universe}.

Distros generally do the former.
I want the latter.

# Everything Sucks

But containers...?

Nope.

# Everything Sucks

But containers...?

Having a clean room is great.

Building a clean room, then getting a bucket of mud from outside the door, and upending the whole thing on the floor...

...defeats the purpose.

Containers

are totally uncomposable as an install story.

(at least as they have been mass-popularized, so far)

(unless the thing you're installing is a microservice connected only by networking)

and i'm sorry, that doesn't really work for me

we need

composable systems

Quick Compare

1. notdocker

Docker -- but not

Warpforge uses containers.

 

But it allows granular and flexible constructions -- not just big unwieldy monolithic images.

(And it doesn't try to centralize your image storage.)

2. notbazel

Bazel -- but not

Warpforge is a build sandbox.

Warpforge is also always hermetic, and it has a lot of tools for sharing snapshots -- both of data and build instructions.  Even if you don't have a monorepo!  And it lets you choose whether to build or to fetch.

3. notnix

Nix -- but not

Warpforge is for functional, declarative builds.

Warpforge is JSON APIs.  (If you want a language: Warpforge supports Starlark!)

Warpforge is content-addressed.

# COMPARE

(let's just get this out of the way...)

# Obligatory XKCD Diffusal

XKCD#927 is nice and all.

But sometimes we try to make things better, anyway.

i am going to offer you

# THE OFFER

> warpforge

A tool for building.

and

warpsys <

A style for packaging.

Neither demands the other.

... or anything else, either.

i am going to offer you

# THE OFFER

> warpforge

Solves "build anything" and "version together"

and

warpsys <

Solves "install anywhere" and "composable systems"

Neither demands the other.

... or anything else, either.

What we're solving

> Builds

Packaging <

# THE OFFER

Both.  At the same time.

(There's a reason for this, which will become clear shortly.)

# CHAPTER 2

Let's talk about builds, first.

That means: Warpforge.

Let's see it go!

Goals

# GOALS
  1. Zero parameter, total environments — run, done
  2. SBOMs for everyone — and they must be loadbearing
  3. Completely decentralized — hashes hashes hashes
  4. Granular inputs — not just monolithic images
  5. Input assembly — put things where you want them
  6. Granular outputs — save what you need, not more
  7. Communicable results — readable names <-> hashes; and share those mappings easily
  8. Be friendly to private work, public work, and teamwork — "communicable results" must be decentralized too
  9. Graphs and pipelines — complex multi-step processes
  10. Widely usable — could appear cozily in every git repo
  11. Unobtrusive — use it if you want to; or, don't

{{goals slide repeated, annotated with brackets: formulas, plots, catalogs, and replays are the mechanisms that cover these goals}}

{{goals slide again, with brackets mapping clusters of goals to formulas, plots, catalogs, and replays -- plus: and a fractal of "don't fuck up" (also, warpsys conventions)}}

Goals

# GOALS
  1. Everything must be API driven -- messages, not proglangs.
    • Should be easy to give instructions via API messages
    • Should be easy to get results via API messages
    • Should be easy to debug via printing API messages
    • Should be easy to make audit logs of API messages

Goals

# GOALS

... due credit to every other project in this general space:

even if you don't pick all of those, it's a LOT of goals.  This work is HARD work.

# WFAPI
Warpforge API Layers

... are probably the best way to understand it.

  • L1: Formula, RunRecord
  • L2: Plot; Catalog, Release; Replay
  • L3+: (your templating here) -- Starlark, whatever

 <- Simple ---- Fun ->

Formulas
# WFAPI:L1

an API concept for
hashes->build->hashes

Formulas
{
  "formula": {
    "inputs": {
      "/": "ware:tar:57j2Ee9HEtDxRLE6uHA1xvmNB2LgqL3HeT5pCXr7EcXkjcoYiGHSBkFyKqQuHFyGPN"
    },
    "action": {
      "script": {
        "interpreter": "/bin/sh",
        "contents": [
          "mkdir /out && echo 'heyy' > /out/log"
          "echo done!"
        ]
      }
    },
    "outputs": {
      "yourlabel": {"from": "/out", "packtype": "tar"},
    }
  },
  "context": {
    "warehouses": {
      "tar:57j2Ee9HEtDxRLE6uHA1xvmNB2LgqL3HeT5pCXr7EcXkjcoYiGHSBkFyKqQuHFyGPN": "https://dl-cdn.alpinelinux.org/alpine/v3.15/releases/x86_64/alpine-minirootfs-3.15.0-x86_64.tar.gz"
    }
  }
}
# WFAPI:L1
Formulas: you run them
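A rough sketch of the loop (the exact CLI invocation is an assumption -- check the warpforge helptext; `formula.json` is just an example filename; the memos path comes from a later slide):

$ warpforge run formula.json
# on success you get back a RunRecord (next slide),
# which also gets memoized under .warpforge/memos/, keyed by formulaID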
# WFAPI:L1
Formulas -> RunRecord
{
    "guid": "2fa27e0a-63ab-4924-9b8a-6e2740b983a6",
    "time": 1652997810,
    "formulaID": "zM5K3SBybw7WHvL8sgFNu8NbL67Qh9nEv7u7C22grAiF4QhriUUK8UMEHjesYW14XrHwywL",
    "exitcode": 0,
    "results": {
        "out": "ware:tar:5bv6aKWCUrYzyCzjXKqzT4VkPwwnhadWP1ZMWXvSFKpZGxgZFGvfaHsVEvkgmWzQD"
    }
}
# WFAPI:L1

The "results" map has keys according to your formula's output section.

Some timestamps make each runrecord unique.

(This uniqueness is useful if you want to, for example, store many of them, to use as evidence of reproducibility.)

You can find these also in the
`.warpforge/memos` dir,
indexed by formulaID!

Formulas -> RunRecord
# WFAPI:L1

Formulas and RunRecords are always going to be at the bottom of everything you do in Warpforge.

That means everything you do is:
1) introspectable
2) repeatable
3) easy to check for determinism & reproducibility
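For instance, a crude reproducibility check (the filenames and the use of `jq` are my own example, not a warpforge feature): two RunRecords for the same formulaID should carry identical "results" WareIDs.

$ diff <(jq .results runrecord-a.json) <(jq .results runrecord-b.json) \
    && echo "same WareIDs: looks reproducible" \
    || echo "WareIDs differ: nondeterminism somewhere"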

Where did that run?
# WFAPI:L1

We used containers here!

> I thought you said containers are lame?

No, monolithic images are lame.
Containers are great!

We gave containers a composable input system.

Formulas: one step at a time
# WFAPI:L1

Formulas are the atom of reproducibility and of action in this system.


They're also verbose and limited:

  • they do one thing at a time;
  • copypasting hashes is painful.

So we need another layer of abstraction to get work done...

# WFAPI:L*

sidenote: about being API driven:

Plots
# WFAPI:L2

for when you need
multiple steps, related by pipes

Plots
# WFAPI:L2
  • We need something that can do multiple steps, and graphs of steps.
     
  • We also want better usability than copypasting hashes, which means we need to introduce ways to use human-readable names for inputs.

Plots do both.

(and a bit more)
((ingests))

Plots
{
  "inputs": {
    "thingy": "ware:tar:qwerasdf"
  },
  "steps": {
    "one": {"protoformula": {
        "inputs": {
          "/": "pipe::thingy"
        },
        "action": {
          "exec": {"command": ["/bin/echo", "hi"]}
        },
        "outputs": {
          "stuff": {"from": "/", "packtype": "tar"}
        }
    }},
    "two": {"protoformula": {
        "inputs": {
          "/": "pipe::thingy"
          "/prev": "pipe:one:stuff"
        },
        "action": {
          "exec": {"command": ["/bin/echo", "hi"]}
        },
        "outputs": {
          "stuff": {"from": "/", "packtype": "tar"}
        }
    }}
  },
  "outputs": {
    "foo": "pipe:one:stuff"
    "bar": "pipe:two:stuff"
  }
}
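How to read those pipe references (as inferred from the example above; not a full grammar):

"pipe::thingy"     # empty middle segment -> the plot-level input named "thingy"
"pipe:one:stuff"   # -> output "stuff" of step "one"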
# WFAPI:L2
... provide graphs and pipes:
Plots: you run them
# WFAPI:L2

shown above: output of first step used as input to second step -- see the same hash?

Plots
# WFAPI:L2
... can import from catalog refs:

Our first plot example still used wareIDs:

{
  "inputs": {
    "localname": "ware:tar:9fttDVHncJnoU9gLHdRybpz9eSw12xau9AsJzZTNoQamxFNjX6s6HNTj7a8wmYjCvg"
  },

But you can also use human readable names!
This uses the "catalog" system (more on that shortly):

{
  "inputs": {
    "localname": "catalog:warpsys.org/glibc:v2.35:ld-amd64"
  },
Plots
{
  "inputs": {
    "localname": "ware:tar:9fttDVHncJnoU9gLHdRybpz9eSw12xau9AsJzZTNoQamxFNjX6s6HNTj7a8wmYjCvg",
    "othername": "catalog:warpsys.org/glibc:v2.35:ld-amd64"
  },
# WFAPI:L2
... can use ingests (!!):

Now we've seen wareIDs and catalog lookups...

Plots can also ingest data from the host context:

{
  "inputs": {
    "veryhandy": "ingest:git:.:HEAD"
  },

This will get converted to a WareID, just like everything else.
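Roughly, illustrative only (the resolved hash is whatever your repo's HEAD happens to be):

"veryhandy": "ingest:git:.:HEAD"
    ...resolves at run time to something like...
"veryhandy": "ware:git:<hash-of-HEAD>"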

...it's time to handle the communication parts!

harken back to our goals slide:

Goals

  1. Zero parameter, total environments — run, done
  2. SBOMs for everyone — and they must be loadbearing
  3. Completely decentralized — hashes hashes hashes
  4. Granular inputs — not just monolithic images
  5. Input assembly — put things where you want them
  6. Granular outputs — save what you need, not more
  7. Communicable results — readable names <-> hashes; and share those mappings easily
  8. Be friendly to private work, public work, and teamwork — "communicable results" must be decentralized too
  9. Graphs and pipelines — complex multi-step processes
  10. Widely usable — could appear cozily in every git repo
  11. Unobtrusive — use it if you want to; or, don't

{{brackets on the goals slide, mapping clusters of goals to formulas, plots, catalogs, and replays}}

Catalogs
# WFAPI:L2

for communicating releases!

a plot's best friend

Catalogs
# WFAPI:L2
  • We need to communicate the identity of data we've produced, after giving it human-readable names.
     
  • We need to communicate the rebuild instructions for that data.
     
  • We need to provide a story for what "updating" is like.
     
  • Probably some metadata is nice, too.

Catalogs do all this.

Catalogs
# WFAPI:L2
{
    "name": "warpsys.org/python",
    "releases": {
        "v3.10.2": "bafyrgqg6ckjrdt4d4duvdyzz2yz2gdsupo4isplfyy36yd6zfinucp2o7ixcwpbhlzlndrvaa2i6kzavliqpb3v7fmbkrazbcfwvmjx3e4awk"
    },
    "metadata": {}
}
  • They're a merkle tree!
  • Root document...
    • Name pt1: Module name
    • Name pt2: Release name
Catalogs
# WFAPI:L2
{
    "name": "v3.10.2",
    "items": {
        "src": "tar:3JnLd5GX5S4ndghJvKAxghhLiheCap7X4dkEDW9FTH17cM1GuDAJRWpEgmPKCQqHbZ",
        "amd64": "tar:6iZRkZrFzkTzV7VxkH6bjNrEcjv7scL9Co24Z1eHg8mSvVWBhJyFUs5DJVrJuWGTkg"
    },
    "metadata": {
        "replay": "bafyrgqfn5pn46zswpq57qseoon7xwwa3shhbtbszdccnxgkox7tfyelb5opwtpvwymzvajkojcwf2ibfciniiqgtmk4gthbfztbn5uibl7s44"
    }
}
  • Release document:
    • Name pt3: Item labels (src, bin-amd64, etc)
    • Content hashes!  ("WareID")
    • More metadata... including... Replay?  Replay!

(we'll come back to replays in a moment)

Catalogs
# WFAPI:L2
{
    "byWare": {
        "tar:3JnLd5GX5S4ndghJvKAxghhLiheCap7X4dkEDW9FTH17cM1GuDAJRWpEgmPKCQqHbZ": [
            "https://www.python.org/ftp/python/3.10.2/Python-3.10.2.tar.xz"
        ]
    }
}
  • Mirrors document:
    • Gives hints about where you can fetch stuff!
    • Can ignore this if (re)building everything yourself.
    • Can use URLs, or content-addressed buckets...

boring, but practical.

Plots Using Catalogs
# WFAPI:L2

So remember seeing catalog references in plots?
Now you can see how those are looked up.

{
  "inputs": {
    "localname": "catalog:warpsys.org/glibc:v2.35:ld-amd64"
  },

Each part of the "catalog:name:version:label" string turns into one step of lookup in this structure.
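For instance (an illustrative breakdown; the file layout is the one shown on the next slide):

"catalog:warpsys.org/glibc:v2.35:ld-amd64"
  -> module "warpsys.org/glibc"  (its module.json)
  -> release "v2.35"             (its releases/v2.35.json)
  -> item "ld-amd64"             (a label in that release's "items" map)
  -> a WareID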

Catalogs on the filesystem
# WFAPI:L2
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils/releases
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils/releases/v2.38.json
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils/replays
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils/replays/v2.38.json
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils/module.json
./.warpforge/catalogs/warpsys/warpsys.org/python
./.warpforge/catalogs/warpsys/warpsys.org/python/module.json
./.warpforge/catalogs/warpsys/warpsys.org/python/replays
./.warpforge/catalogs/warpsys/warpsys.org/python/replays/v3.10.2.json
./.warpforge/catalogs/warpsys/warpsys.org/python/mirrors.json
./.warpforge/catalogs/warpsys/warpsys.org/python/releases
./.warpforge/catalogs/warpsys/warpsys.org/python/releases/v3.10.2.json

Straightforward.

Easy to put in, say, a git repo.

(Also has its own canonical merklehash.)

Catalogs from existing data
# WFAPI:L2

Getting existing stuff into the warpforge universe is important -- and easy.

Gadgets like `warpforge catalog ingest-git-tags` help:

Recap so far...
# WFAPI:L2
  • Formula -- do one specific thing
     
  • Plot -- define a graph of stuff to do;
    produces a series of formulas to evaluate
     
  • Catalog -- handles communication;
    lets you stitch plots together!
Replays
# WFAPI:L2

when you want to explain what you did

or do it again
and again
and again
and again

Replays
# WFAPI:L2
  • We need to communicate the rebuild instructions for things we've done.

Replays (stuffed in a Catalog) do this.

Replays: we already had them
# WFAPI:L2

This is kinda cheating: we already had this.


Our original build instructions -- the plot format -- are the replay instructions.

Replays: we already had them
# WFAPI:L2

This is kinda cheating: we already had this.

Sorta.
 

  1. Freeze any ingests.
  2. Okay, now you have it.
Replays: we already had them
# WFAPI:L2
{
    "inputs": {
        "rootfs": "catalog:warpsys.org/debian-bootstrap:bullseye-1646092800:amd64",
        "src": "ingest:git:.:HEAD",
        "glibc": "catalog:warpsys.org/glibc:v2.35:amd64",
        "ld": "catalog:warpsys.org/glibc:v2.35:ld-amd64",
        "ldshim": "catalog:warpsys.org/ldshim:v1.0:amd64"
    },
    "steps": {
         // ...
    },
    "outputs": {
        "src": "pipe::src"
        "amd64": "pipe:build:out"
    }
}
{
    "inputs": {
        "rootfs": "catalog:warpsys.org/debian-bootstrap:bullseye-1646092800:amd64",
        "src": "catalog:example.org/this:v3.10.4:src",
        "glibc": "catalog:warpsys.org/glibc:v2.35:amd64",
        "ld": "catalog:warpsys.org/glibc:v2.35:ld-amd64",
        "ldshim": "catalog:warpsys.org/ldshim:v1.0:amd64"
    },
    "steps": {
         // ...
    },
    "outputs": {
        "src": "pipe::src"
        "amd64": "pipe:build:out"
    }
}

Can even handle ingests by rewriting them to a self-reference.

Replays on the filesystem
# WFAPI:L2
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils/releases
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils/releases/v2.38.json
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils/replays
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils/replays/v2.38.json
./.warpforge/catalogs/warpsys/warpsys.org/gnu/binutils/module.json
./.warpforge/catalogs/warpsys/warpsys.org/python
./.warpforge/catalogs/warpsys/warpsys.org/python/module.json
./.warpforge/catalogs/warpsys/warpsys.org/python/replays
./.warpforge/catalogs/warpsys/warpsys.org/python/replays/v3.10.2.json
./.warpforge/catalogs/warpsys/warpsys.org/python/mirrors.json
./.warpforge/catalogs/warpsys/warpsys.org/python/releases
./.warpforge/catalogs/warpsys/warpsys.org/python/releases/v3.10.2.json

You already saw where replays go, in catalogs:

The hash of the replay is also in the release metadata.

Replays: multiple routes are fine
# WFAPI:L2

Notice that replays contain the whole plot...including its inputs, including if they're catalog references.

Impact?

We can do "recursive explain" of how something (and its dependencies, and granddeps, etc) is built...

... and it's defined, even if there's more than one way to produce the same content.

Replays
# WFAPI:L2
...mean you can ask for recursive build

If you `wf run`, and the required inputs aren't local, and can't be fetched...

If there's a replay?  No problem, we can just
`wf run -r`

It'll rebuild the required inputs from the replay instructions.

Workspaces
# WFAPI:WORKSPACES

The scope in which stuff happens!

Workspaces
# WFAPI:WORKSPACES

sharing is important

 

being able to have separation, even locally,
is important too

The scope in which stuff happens!

Workspaces
# WFAPI:WORKSPACES
  • A workspace is any dir that contains a `.warpforge/` dir.
     
  • A workspace contains a catalog!
    Workspaces are the name lookup scope.
     
  • Every plot needs a workspace.
  • A workspace can contain many plots.
     
  • Each workspace should be independent -- contain its own copy of catalog data for any plots within it.
Workspaces
# WFAPI:WORKSPACES
  • Workspaces can be nested:
    • parent workspaces can supply catalog data to children (but you'll still be nudged to vendor).
  • Root workspace can contain multiple catalogs...
    • This is where you fetch public published catalogs!
       
  • Declare a new root workspace if you want to work in a team/company environment and not let your home workspace pollute it (or vice versa).
    • How?  `touch .warpforge/root`!
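A tiny sketch of that (the directory name is just an example):

$ mkdir -p myproject/.warpforge     # any dir containing a .warpforge/ dir is a workspace
$ touch myproject/.warpforge/root   # ...and this marks it as a root workspace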
Workspaces (shown by `status`)
# WFAPI:WORKSPACES
$ wf status
Module "warpsys.org/python":
    Plot has 5 inputs, 2 steps, and 1 outputs.
    Plot contains 5 catalog inputs.
      5/5 catalog inputs resolved successfully.

Workspace:
    /projects/warpsys/python (pwd, module)
    /projects/warpsys (root workspace, git repo)
# WFAPI:WORKSPACES
Workspaces vs Source Control
  • Norm: Put a workspace in the root of every git repo.
    • Why?  You need a merkle tree that includes the root of the catalog data tree, as well as your plots.
       
    • Warpforge will warn you if it has to search above a git repo root to find the nearest workspace.
       
  • More than one workspace per repo?  Sure.
  • Workspaces not at the repo root?  Sure.
  • Non git VCS?  Sure.  (You just don't get warnings against questionable choices.)
# WFAPI:L3+

going further:

yes, you probably want to do LOTS of builds

yes, you'll probably want to hammer the keyboard less --
so, you want templating systems,
and ways to compose content logically

so...

Yes, you probably want another layer, yet.
Something that generates these graphs.

Bring your own.
These objects are easy to generate.

Templating
# WFAPI:L3+

it's time for templating

(there can be more than one right answer here)
((but we're going to provide at least one))

Larks
# WFAPI:L3+
  • tl;dr Starlark
    • ... the same language as Bazel uses.
    • It's essentially a python dialect.
  • Warpforge will provide this as a "batteries included" option for composing plots.
     
  • It still just emits plots.
    • Replays work the same.
    • We're not undoing any strictures.
    • We're not using any magic.
      (It's just "batteries included".)
Larks
# WFAPI:L3+

Integrated starlark features are a work-in-progress.

create_plot("warpsys.org/example/thingy", wfapi.Plot(_={
	inputs: #...
	outputs: #...
})
	+ mylib.partial_plot(...)
	+ mylib.other_stuff(...)
)

def reusable_stuff(patchme):
	return patchme.action.script.append("/some/hook --do-stuff")

# much to evolve here

Might look a bit like this?
(Check back soon.)

# WFAPI
(api layers recap)

  • L1: Formula, RunRecord -- do stuff
  • L2: Plot; Catalog, Release; Replay -- ...graphs of it
  • L3+: (your templating here: Starlark, whatever) -- ...however you want!

# WARPFORGE ROCKS

Cool stuff time

  • "ferk"
     
  • localhost CI?!
     
  • btw rootless
# WARPFORGE ROCKS

ferk

tl;dr: it's a mode for quick, cheap, interactive containers.  For play!

# WARPFORGE ROCKS

localhost CI

tl;dr: this functionality comes "for free" once the whole plot and ingests system was done...
but gosh is it handy

{{no picture found}}
{{ask for demo later}}

# WARPFORGE ROCKS

rootless containers

Did I mention every time we're invoking containers, they're rootless containers?

You can run `warpforge` without ever invoking sudo.

That was warpforge.

# THE OFFER

> warpforge

A tool for building.

warpsys <

A style for packaging.

Next: time for some words on packaging...

What we're solving

> Builds

Packaging <

# CHAPTER 3

Both.  At the same time.
 

Yes, we are starting to package things...

Packaging <

# PACKAGING

We don't actually care much...

Except for one thing:

# PACKAGING

Thou shalt

BE

PATH-AGNOSTIC.

# Everything Sucks

Let's talk about the current world:

  • Package management is a mess.
  • Package installation is a mess.
  • Everything is held together by global variables (the paths under `/lib/*` !).
  • Every package management system is completely balkanized from all others.
  • Language pkgman and OS pkgman aren't on speaking terms.

Nobody can collaborate in these systems.

not builders.

not users.

<-

let's fix this

# Everything Sucks

Library paths are a global variable.

As a global variable, they're a source of contentions and collisions.

These contentions result in distros
-- inevitably, no matter how much they might value collaboration --

trending towards balkanization.

Removing the global variables of library paths...

Fixes this.
It lets us collaborate again.
It restores balance to the force.

Also, practical confession...

We want the following style of input assembly to be easy.

"Put stuff in whatever path you want"
just makes sense.

{
  "formula": {
    "inputs": {
      "/app/gawk": "ware:tar:1xvmNB2LgqL3HeT5pC57j2Ee9HEtDxRLE6uHAXr7EcXkjcoYiGHSBkFyKqQuHFyGPN"
      "/app/gcc": "ware:tar:57j2EL3HeT5pCXr7EcXkjcoYiGHSBkFe9HEtDxRLE6uHA1xvmNB2LgqyKqQuHFyGPN"
      "/app/python": "ware:tar:RLE657j2Ee9HEtDxuHA1xvmNXkjcoYiGHSBkFyB2LgqL3HeT5pCXr7EcKqQuHFyGPN"
    },
    "action": {
      "script": {
        "interpreter": "/bin/sh",
        "contents": [
          "export PATH=/app/*/bin/"
          "awk    # should work"
          "gcc    # should work"
          "python # should work"
        ]

"path agnostic"

  • means: doesn't matter where you install it: it just works.
     
  • implies: zero-step installs.
     
  • ideally: no post-install hooks.  works RO.
    • ^ very good for containers: bind mount and go!  (no overlayfs,
      no costly invisible copying, etc)

"path agnostic"

tar xf pagno-python.tgz

mv python /opt/python
/opt/python/bin/python # WORKS

mv /opt/python /media/my-jump-drive/python
/media/my-jump-drive/python/bin/python # WORKS

mount /ipfs/
/ipfs/QmREL6gfCuZrTyUftfxPyfwPjyGANrZRBSXJGjhRXFHoFQ/bin/python # WORKS

Literally...

"path agnostic"

tar xf pagno-python.tgz

mv python /opt/python
/opt/python/bin/python # WORKS

mv python /media/my-jump-drive/
/media/my-jump-drive/bin/python # WORKS

mount /ipfs/
/ipfs/QmREL6gfCuZrTyUftfxPyfwPjyGANrZRBSXJGjhRXFHoFQ/bin/python # WORKS

Deeply convenient for users.

Also: helps an awful lot if you want to
use path prefixes with content hashes.

(^ Flat out Impossible otherwise!)

"path agnostic"

Why is this hard?

Long story short: dynamic linking...
and how ELF headers work in linux.

Briefly:

  • ELF rpath header = $ORIGIN/../lib
  • fs layout of {./bin/*, ./lib/*}, etc
  • dirty hijinx to avoid abspath ELF interp
  • ...victory!
  • symlink farm in ./lib/* if you want
    (this lets you dedup shared objects)
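A minimal illustration of that first bullet (plain gcc/ld usage, nothing warpforge-specific; the names are examples):

$ gcc -o bin/mytool main.c -Llib -lmydep -Wl,-rpath,'$ORIGIN/../lib'
# now the pair {bin/mytool, lib/libmydep.so} can be moved anywhere as a unit and still run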

How does one pagno?

More info: http://warpforge.io -> then nav to the ecosystem conventions pages about linking!

Pagno != static linking

Static linking has limitations:

  1. Sometimes it requires patching upstreams!  (yikes!!)
     
  2. Obstructs possibility of lib file dedup.

This pagno technique, based on "rpath $ORIGIN", is easier to deploy -- works in more scenarios, with less patching.

You can also find out more about this in a talk entirely about Path Agnostic Binaries at ASG2018 !

(Yes, I've been harping about this for a while)

Even more about pagno

path-agnostic $PATH

How do we get a useful $PATH, when each application is mounted in its own dir tree?

`linkwarp` tool.

It produces a symlink farm.

Foreach executable in `*/bin/*`:
make a symlink to it in `../megabin/`;
add `megabin` to your $PATH.
Done.
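A rough sketch of the idea (not linkwarp's actual implementation; paths are examples):

$ mkdir -p megabin
$ for exe in */bin/*; do ln -sf "$PWD/$exe" "megabin/$(basename "$exe")"; done
$ export PATH="$PWD/megabin:$PATH"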

> Packaging <

# PACKAGING

That's it.

That's all the opinions we really have.

Packaging should be simple.

Status

# Status

Overall:

Pounding in the tentpoles.


Trying to create a rendezvous space --
where more people can gather to build a bigger system.

We have usable tools --
but that's only the start.

# Future Work

Future Work -->>>

1. Starlark!  Tighter integration; packages of reusable templating conventions?

2. Better tools for catalog sharing.

3. More packages!  More, more more.

4. Autobumpers for dependency versioning.

 

5. ???

# Future Work

Future Work -->>>

Yes, this slowly trends towards building a new distro.

We're currently resisting that.

(barely.)

 

Ideally, warpforge becomes the seed for a new genre of un-distro:

  • package things once, in warpsys style,
  • built reproducibly and hermetically with warpforge,
  • and be usable everywhere, without constraint.

# Future Work

Future Work -->>>

Your packages are welcome!

Your transport plugins are welcome!

Your execution sandbox plugins are welcome!

Your PRs with better helptext are welcome!

It will take a village to build this!

# PACE LAYERING

Ideal impact

Cultural shift!~

 

Reproducible builds won't become prevalent until they're

  1. Easy; and
  2. Obvious when there's a failure.

 

Warpforge is meant to produce those conditions.

# PACE LAYERING

Ideal impact

Cultural shift to reproducible builds, as a baseline expectation, everywhere.

 


Computers become easier to use,
because they're built out of more composable systems.


Everyone can use the same tools -- small projects, distros, companies -- so we can get more done, together.

Questions?

# WHERETO

(Let's hang out)

More docs:

Code!:

Chat:

overflow slides
and
probable Q&A topics

# PACE LAYERING

Ideal impact... realized

I pulled in a repo that a collaborator pushed gcc build plots into...

... I launched `warpforge run`...

 

and I went to the bar.

 

When I came back, I had a working gcc... And because i had done `wf run -r` on some examples, those were built too!

# PACE LAYERING

 Ouroboros Mesh

Build graph with a cycle in it.

(N.b. warpforge can represent this!  Most can't!)

 

Now join two cycles.  Then more.

High power solution to Trusting Trust:
what kind of adversary can get a viral quine farther back in history... than you can stitch together an Ouroboros Mesh?

# Langpkgs?

About language pkgman

Hypothesis: language package managers aren't reused because:

  • Differing selection algorithms;
  • Not enough value-added in sharing transport and snapshot mechanisms.

We might be able to shift the ROI curves.  Worth a shot?

# Langpkgs?

About language pkgman

LPMs do approx 4 things:

  • package suggestion
  • version selection
  • lockfiles (hopefully!)
  • transport & local placement
     

Notice how warpforge has set its boundaries very carefully: it only does the last two -- where being opinionated/special has low/no value.

# Formulas

A Labs Metaphor

When you're doing science, you write a Standard Operating Procedure ("SOP") first; then you execute it.

Warpforge is like that!  You write a formula first; then, you execute it.

Repeatedly.

 

# Q&A: dynamic graph?

Does warpforge support dynamic graphs?

Not really.

You can do it at the L3/"templating" layers, though.

(The focus on having outlines of computations before starting to operate is necessary for having a feasible replay system (and thence the explain and audit features around it) -- so this is not a design choice made lightly.)

(iiuc, this is roughly the same choice bazel makes)

“we want to set someone up to become a hero if they put some business process inside warpforge... and just hand tarballs off to their downstream consumers and colleagues.”

 (That means not just emitting tarballs, but also having a strategy for making their contents...
you know,
work.)

Y u no Bazel?

# Y U No X?

First of all: yes, it's close!

However:

  1. Big gnarly deps.
  2. Bad at communication.  Sorta presumes it lives in a monorepo.  I've never seen cross-organization bazel collab.
  3. Users remark on general sense of complexity.  (I hope we manage to keep warpforge feeling simpler; we'll see :))
  4. Way too easy to just say "sandbox=nah".
  5. Doesn't really have a concept of letting user choose between fetch-vs-build.  You hardcode that.
    (Have heard speculations you could hack it together with (ab)use of 'toolchains' and/or 'platforms'; unclear, nonobvious, would be hacky.)

Y u no Bazel?

# Y U No X?

way too many primitives.

.... and this didn't even get to the file fetching ones!

Y u no Nix?

# Y U No X?

First of all: yes, it's got heart in the right place!
However:

  1. That language.
  2. Want: API layer that shows up more clearly / is legible.
  3. Not content addressed.  Game over.  (I hear it's supported, now?  But not default?  This needs to be default, and then *non-CA* needs to be forbidden.  Nothing less.)
  4. Tends towards monolithic the same way as bazel does.  (Maybe now improved by flakes?  Maybe?)
  5. Very recursive.  Cannot tell how much work one expression will cause.
  6. Warpsys style ELF headers are superior to Nix's style.

Holistically: maybe a new project with slightly different choices will be able to reach better UX and more adoption faster.


Y u no Nix/Bazel?

# Y U No X?

Both have too many primitives...

...and an ingest instruction can appear almost anywhere.

Even someplace deep in a callstack.

(I think bazel can at least analyze this; I'm not sure Nix can.)

Contrast: Warpforge ingests are syntactically extremely obvious.

(And usually, you don't have them at all, because catalogs.)

crap slides

# Everything Sucks

Why?

  • No (or bad) environment reproduction tools.

     
  • The global variables -- the library paths.
    • ^ Way bigger deal than you'd think!

       
  • Way, way too much reliance on post-install hooks.

All of these things

(flatpak, snap, nix, guix, ubuntu, every other distro, etc, etc, etc)

are trying to

Be the World

and i'm sorry, that doesn't really work for me

Formulas (and their limits)
# WFAPI:L1

A bit like one “RUN” line in a dockerfile ...

  • A bit more powerful because of input assembly...
  • (okay and you can do multi-line step stuff)
  • but also, less than a dockerfile, because you’re probably gonna need more than one of these

We're only producing outputs at the end of this; it's all one execution cycle.

So: we'll want something to let us compose multiple steps.  Up next: Plots!

# PACKAGING

Path-agno ("pagno"?) builds matter for several reasons:

1. Spiritually: lack of pagno dynamic linking conventions is a (foolish, avoidable) wedge which drives distros apart.  We should not let it waste our time and obstruct our ability to share.

2. Practically: I really want my assembly config...

"/mount/here": "ware:tar:qwerutoeirt"
"/other/things": "ware:git:12345678abcd"
"/yet/more": "ware:tar:zbkweitguFHgjeek"

... to work, and be easy to get joy out of.

selfcontained packages:

 

sort of obvious, right?

MacOS has been like this for years
chromeOS is similar (i've heard)

 

things that work and get adoption are working and being adopted because they are moving this way.

selfcontained packages:

AppImage and friends do this too: bundle everything.  It works!

they just do a more heavyweight intervention:
they use squashfs and mount it.

works.  but dedup is impossible.