Thomas Bradford
I'm Thomas Bradford
package main

import "fmt"

func main() {
	fmt.Println("Hello, World!")
}
package main

import "fmt"

func main() {
	h := make(chan string)
	go func() {
		h <- "World"
	}()
	fmt.Printf("Hello, %s!\n", <-h)
}
// lexer holds the state of the scanner.
type lexer struct {
	name  string    // used only for error reports.
	input string    // the string being scanned.
	start int       // start position of this item.
	pos   int       // current position in the input.
	width int       // width of last rune read.
	items chan item // channel of scanned items.
}
// item represents a token returned from the scanner.
type item struct {
	typ itemType // Type, such as itemNumber.
	val string   // Value, such as "23.2".
}

// itemType identifies the type of lex items.
type itemType int

const (
	itemError itemType = iota // error occurred; value is text of error
	itemDot                   // the cursor, spelled "."
	...
$ cloc .
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Go                             179           1911            610           9868
Markdown                        87            282              0           1342
Scheme                          14            139             17            844
XML                             14              0              0            667
JSON                             1              0              0             24
YAML                             3              4              0             18
make                             1              6              0             17
-------------------------------------------------------------------------------
SUM:                           299           2342            627          12780
-------------------------------------------------------------------------------
Whaaaaa?!?
Yes I did! And you should, too!
(define nth!
  (let-rec [scan
            (lambda (coll pos missing)
              (if (seq coll)
                  (if (> pos 0)
                      (scan (rest coll) (dec pos) missing)
                      (first coll))
                  (missing)))]
    (lambda-rec nth!
      [(coll pos)
       (if (indexed? coll)
           (nth coll pos)
           (scan coll pos (lambda () (raise "index out of bounds"))))]
      [(coll pos default)
       (if (indexed? coll)
           (nth coll pos default)
           (scan coll pos (lambda () default)))])))
Lisp is the originator of many things 'functional'
Perfect candidate for an incremental approach to developing a compiler
Lexical Analysis: Scan for the individual tokens of the language (numbers, ids, braces, etc.)
Syntax Analysis: Assure those tokens are appearing in adherence to the language's grammar
Semantic Analysis: Assure that tokens are semantically valid (in-scope, correct type, etc.)
Intermediate Code Generator: Describes an abstract working program that the optimizer can rewrite
Code Optimizer: Perform local and global rewriting of the abstract representation
Machine Code Generator: Emit a final set of instructions that a machine can execute directly
Lisp breaks the pipeline into two distinct steps:
The lexical and syntax analyzers are first-class elements of the language and runtime
Together they are a function called read
They are available to the programmer!
(read
  (open-input-string
    "(display \"hello\")"))

This evaluates to

(display "hello")
;; A List with two elements: a Symbol named
;; "display" and a String containing "hello"
(eval (read
        (open-input-string
          "(display \"hello\")")))

This displays

hello
You can also evaluate a synthesized list

(eval (cons 'display (cons "hello" '())))

(eval '(display "hello"))
Lexical Analysis: Scan for the individual tokens of the language (numbers, ids, braces, etc.)
Syntax Analysis: Assure those tokens are appearing in adherence to the language's grammar
Interpreter: Take the syntax tree that was created and attempt to brute force a result by performing a lot of run-time checks
[Diagram: a call stack of eight Stack Frames, from Base (First Function Called) at the bottom to Top (Stack Pointer)]

[Diagram: the same call stack with only four Stack Frames, from Base (First Function Called) to Top (Stack Pointer)]
(define (to-zero x)
  (cond
    [(> x 1000) (to-zero (- x 1))]
    [(> x 0)    (to-zero (- x 1))]
    [:else      0]))

(to-zero 9999)
At first, I didn't fully understand the beauty of Go's type system
I was unnecessarily boxing primitives
type Number interface {
	Cmp(Number) Comparison
	Add(Number) Number
	Sub(Number) Number
	//...
}

type Integer struct {
	value int64
}

// Cmp compares this Integer to another Number
func (l *Integer) Cmp(r Number) Comparison {
	if ri, ok := r.(*Integer); ok {
		if l.value > ri.value {
			return GreaterThan
		}
		...
type Number interface {
	Cmp(Number) Comparison
	Add(Number) Number
	Sub(Number) Number
	//...
}

type Integer int64

// Cmp compares this Integer to another Number
func (l Integer) Cmp(r Number) Comparison {
	if ri, ok := r.(Integer); ok {
		if l > ri {
			return GreaterThan
		}
		...
I loved to panic and rescue, and I stuttered a lot
import "github.com/kode4food/ale/internal/sequence"
...
v := sequence.SequenceToVector(s)
// Why am I saying "sequence" over and over?

import "github.com/kode4food/ale/internal/sequence"
...
v := sequence.ToVector(s)
// "sequence to vector" makes WAY more sense than
// "sequence sequence to vector".
I took the idiomatic guidelines on local variable names way too seriously
func evalBuffer(ns env.Namespace, src []byte) data.Value {
	r := read.FromString(data.String(src))
	return eval.Block(ns, r)
}

func evalBuffer(ns env.Namespace, src []byte) data.Value {
	foo := read.FromString(data.String(src))
	return eval.Block(ns, foo)
}
I used unqualified imports more than I should have
import "github.com/kode4food/ale/internal/sequence"
...
v := sequence.SequenceToVector(s)

import . "github.com/kode4food/ale/internal/sequence"
...
v := SequenceToVector(s)

import "github.com/kode4food/ale/internal/sequence"
...
v := sequence.ToVector(s)