Saturday, February 21, 2009

Pizza Points - story estimation made round!

There are two commonly used methods for agile estimation: Story Points and Ideal Programming Somethings, commonly days or hours. Both methods have their merits, with some bias towards Story Points (SP) from Mike Cohn and various other well-known names, although Ideal Programming Somethings seem to be more commonly used by the teams I've spoken to.

There seems to be quite a bit of confusion regarding how the two relate to each other and how both relate to hours. To remedy some of this I want to propose a new system, Pizza Points, driven by an easy-to-explain, intuitive and rich metaphor.

Let's look at the similarities between work and pizza, as used in the following discussion.

  • Pizza is round, work tends to be circular.
  • Pizza can be filling and deeply satisfying, as can work.
  • Pizza comes in different sizes, as do stories and tasks.
  • Pizza can have lots of varying and interesting fillings; work can be filled with many interesting things.
  • As we mature we can eat more pizza and as we learn a domain and tools we can tackle bigger tasks.

Depending on the peculiarities of your favourite pizza parlour, the sizes and shapes may vary: maybe you have children, normal and family; maybe the range is small, medium, large and extra-large; often it's round, and sometimes you get oddly square bites. There's no guarantee that a small pizza is the same size between two different places, and there's likewise no sense in assuming that a pizza point is equally sized between two teams. That said, if you stick to one place and keep your team intact, any given size will over time be quite consistent.

So how do we get started using Pizza Points?

We have to start by establishing some sort of baseline size, no different from the initial sizing of story points. Find a fairly small, easily graspable story, discuss the criteria for done, and label it "children", "small" or, why not, one. Continue estimation by thinking about the relative *size*, not the filling, and give the sizes descriptive names: standard, family, 2, 3, 5, 8, 13, 20, 40, 100, xxx-large.
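Just to make the idea concrete, here's a minimal sketch of what such a scale could look like in F# (the language of choice around here). The size labels, point values and the PointsFor helper are purely hypothetical examples, not part of any prescribed method; use whatever your own pizza parlour serves.

// A hypothetical Pizza Point scale; the labels and values are only an example.
let Scale =
    [ "children", 1
      "small", 2
      "standard", 3
      "family", 5
      "party", 8
      "xxx-large", 13 ]

// Look up the points for a named size when estimating a story.
let PointsFor label =
    Scale |> List.tryFind (fun (name, _) -> name = label) |> Option.map snd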

It's as easy as that. The thing to remember is that size is actually a constant, but the filling might greatly influence how much we can eat. I like pizza and can eat quite a lot given toppings like different cheeses, ham or pineapple. Give me anchovies and you'll have me struggling all evening to come close to finishing even a child-size bite. The size hasn't changed; my aptitude and motivation did.

If you're the one placing orders and want to get as much pizza eaten as possible during any period of time, it can be wise to ask your team for their taste preferences. But real life sometimes dictates that we put anchovies on their plate; that's a big responsibility.

To summarize: think size, not filling; match fillings to the team; expect sizes to vary between teams. Also expect mature, adult teams to eat more than children.
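If you like to keep score, a rough sketch of tracking how much pizza actually gets eaten per iteration (velocity, in Pizza Point terms) could look like this; the story names and sizes below are made up purely for illustration.

// Velocity in Pizza Points: the sum of the sizes of the stories finished in one iteration.
let Velocity (finished : (string * int) list) = finished |> List.sumBy snd

// A made-up iteration; the same stories could well yield a different total for another team.
let lastIteration = [ "login page", 2; "report export", 5; "anchovy integration", 8 ]
printfn "Ate %d pizza points last iteration" (Velocity lastIteration)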

And don't forget to order planning pizza as a reminder during long estimation sessions.

Monday, February 9, 2009

The story about TypeMock.

To mock or not. That's the question. Here's how I think some BDDers and Mockists labelled their Kool-Aid before drinking it.

Given TypeMock
When I want to test
Then everything looks like a mock object.

Tuesday, February 3, 2009

Pencil.Unit and Micro Lightweight Unit Testing

Joe Armstrong of Erlang fame has the following to say on how he writes unit tests: Micro Lightweight Unit Testing.
Here's a condensed retrace of his steps using F# and Pencil.Unit.
Step 1) Write Micro Unit Test
Theory "Fib should work for known values from Wikipedia"
    [(0,0); (1, 1); (2, 1); (3, 2); (20, 6765)]
    (fun (n, e) -> Fib n |> Should Equal e)
Step 2) Implement Fib
let rec Fib = function
    | 0 -> 0
    | 1 -> 1
    | _ as n -> Fib(n - 1) + Fib(n - 2)
Step 3) Theorize about FastFib
Theory "FastFib should give same result as Fib"
    [0; 1; 2; 3; 25]
    (fun n -> FastFib n |> Should Equal (Fib n))
Step 4) Implement FastFib
let FastFib n =
    let rec loop n a b =
        match n with
        | 0 -> a
        | _ -> loop (n - 1) b (a + b)
    loop n 0 1

Monday, February 2, 2009

Taking Pencil.Unit for a spin, F# syntax highlighting.

Decided to take the current iteration of the testing code posted earlier for a spin by trying to actually build something useful with it. Since I'm yet to find a decent syntax highlighter that supports F# and doesn't generate utterly disgusting HTML, I decided to test and hack my way to one that at least suits my quite humble needs. The result was this:
(* Building a (very simple) syntax highlighter with Pencil.Unit *)
#light

open System
open System.Text
open System.IO
open Pencil.Unit

type Token =
    | Comment of string
    | Keyword of string
    | Preprocessor of string
    | String of string array
    | Text of string
    | WhiteSpace of string
    | NewLine
    | Operator of string

let Classify x =
    match x with
    | "abstract" | "and" | "as" | "assert"
    | "base" | "begin"
    | "class"
    | "default" | "delegate" | "do" | "done" | "downcast" | "downto"
    | "elif" | "else" | "end" | "exception" | "extern"
    | "false" | "finally" | "for" | "fun" | "function"
    | "if" | "in" | "inherit" | "inline" | "interface" | "internal"
    | "lazy" | "let"
    | "match" | "member" | "module" | "mutable"
    | "namespace" | "new" | "null"
    | "of" | "open" | "or" | "override"
    | "private" | "public"
    | "rec" | "return"
    | "static" | "struct"
    | "then" | "to" | "true" | "try" | "type"
    | "upcast" | "use"
    | "val" | "void"
    | "when" | "while" | "with"
    | "yield" -> Keyword x
    | _ when x.[0] = '#' -> Preprocessor x
    | _ -> Text x

let IsKeyword = function
    | Keyword _ -> true
    | _ -> false

let IsPreprocessor = function
    | Preprocessor _ -> true
    | _ -> false

Theory "Classify should support all F# keywords"

    ("abstract and as assert base begin class default delegate do done
    downcast downto elif else end exception extern false finally for
    fun function if in inherit inline interface internal lazy let
    match member module mutable namespace new null of open or
    override private public rec return static struct then to
    true try type upcast use val void when while with yield"
    .Split([|' ';'\t';'\r';'\n'|], StringSplitOptions.RemoveEmptyEntries))

    (fun x -> Classify x |> IsKeyword |> Should Equal true)

Fact "Classify should treat leading # as Preprocessor"
    (Classify "#light" |> IsPreprocessor |> Should Equal true)

let Tokenize (s:string) =
    let start = ref 0
    let p = ref 0
    let next() = p := !p + 1
    and hasMore() = !p < s.Length
    and sub() = s.Substring(!start, !p - !start)
    and current() = s.[!p]
    and prev() = s.[!p - 1]
    let peek() = if (!p + 1) < s.Length then
                    s.[!p + 1]
                 else
                    (char)0
    and isWhite() =
        match current() with
        | ' ' | '\t' -> true
        | _ -> false
    and isOperator() = "_(){}<>,.=|-+:;[]".Contains(string (current()))
    and isNewLine() = current() = '\r' || current() = '\n'
    let notNewline() = not (isNewLine())
    and notBlockEnd() = not(current() = ')' && prev() = '*')
    let inWord() = not (isWhite() || isNewLine() || isOperator())
    let read p eatLast =
        while hasMore() && p() do
            next()
        if eatLast then
            next()
        sub()
    let readWhite() = WhiteSpace(read isWhite false)
    and readNewLine() =
        next()
        if isNewLine() then
            next()
        NewLine
    and readWord() = Classify(read inWord false)
    and readOperator() = Operator(read isOperator false)
    and readString() =
        let isEscaped() = prev() = '\\'
        let inString() = isEscaped() || current() <> '\"'
        next()
        let s = read inString true
        String(s.Split([|'\r';'\n'|], StringSplitOptions.RemoveEmptyEntries))
    seq {
        while hasMore() do
            start := !p
            let token =
                match current() with
                | '\"' -> readString()
                | '/' when peek() = '/' -> Comment(read notNewline false)
                | '(' when peek() = '*' -> Comment(read notBlockEnd true)
                | _ when isWhite() -> readWhite()
                | _ when isOperator() -> readOperator()
                | _ when isNewLine() -> readNewLine()
                | _ -> readWord()
            yield token}

let ToString x =
    let encode = function
        | Comment _ -> "c"
        | Keyword _ -> "k"
        | Preprocessor _ -> "p"
        | String _ -> "s"
        | Text _ -> "t"
        | WhiteSpace _ -> "w"
        | NewLine -> "n"
        | Operator _ -> "o"
    x |> Seq.fold (fun (r:StringBuilder) -> encode >> r.Append) (StringBuilder())
    |> string

Fact "Tokenize should categorize"(
    Tokenize "#light let foo" |> ToString |> Should Equal "pwkwt")

Fact "Tokenize should handle string"(
    Tokenize "\"Hello World\"" |> ToString |> Should Equal "s")

Fact "Tokenize should split string into lines"(
    let lines = function
        | String x -> x
        | _ -> [||]
    Tokenize "\"Hello\r\nWorld\"" |> Seq.hd |> lines |> Seq.length |> Should Equal 2)

Theory "Tokenize should separate start on operators"
    ("_ ( ) { } < > [ ] , = | - + : ; .".Split([|' '|]))
    (fun x -> Tokenize x |> ToString |> Should Equal "o")

Fact "Tokenize should end on separators"(
    Tokenize "foo)" |> ToString |> Should Equal "to")

Fact "Tokenize should handle escaped char in string"(
    Tokenize "\"\\\"\"" |>  ToString |> Should Equal "s")

Fact "Tokenize should handle //line comment"(
    Tokenize "//line comment" |> ToString |> Should Equal "c")

Fact "Tokenize should handle (* block comments *)"(
    Tokenize "(* block comment )*) " |> ToString |> Should Equal "cw")

Fact "Tokenize should handle newline"(
    Tokenize "\r\n" |> ToString |> Should Equal "n")

Fact "Tokenize should separate whitespace and newline"(
    Tokenize "    \r\n" |> ToString |> Should Equal "wn")

let Sanitize (s:string) = s.Replace("&", "&amp;").Replace("<", "&lt;").Replace(" ", "&nbsp;")

Fact "Sanitize should replace < with &lt;"(
    Sanitize "<" |> Should Equal "&lt;")

Fact "Sanitize should repalce & with &amp;"(
    Sanitize "&" |> Should Equal "&amp;")

Fact "Sanitize should repalce ' ' with &nbsp;"(
    Sanitize " " |> Should Equal "&nbsp;")

type IHtmlWriter =
    abstract Literal : string -> unit
    abstract Span : string -> string -> unit
    abstract NewLine : unit -> unit

let HtmlEncode (w:IHtmlWriter) =
    let span style s =
        w.Span style s
    function
    | Comment x -> span "c" x
    | Keyword x -> span "kw" x
    | Preprocessor x -> span "pp" x
    | String x ->
        span "tx" x.[0]
        for i = 1 to x.Length - 1 do
            w.NewLine()
            span "tx" x.[i]
    | Operator x -> span "op" x
    | Text x | WhiteSpace x -> w.Literal x
    | NewLine -> w.NewLine()

let AsHtml s =
    let r = StringBuilder("<div class='f-sharp'>")
    let encode = HtmlEncode {new IHtmlWriter with
        member this.Literal s = r.Append(Sanitize s) |> ignore
        member this.Span c s = r.AppendFormat("<span class='{0}'>{1}</span>", c, Sanitize s) |> ignore
        member this.NewLine() = r.Append("<br>") |> ignore}
    Tokenize s |> Seq.iter encode
    string (r.Append("</div>"))

Fact "AsHtml sample"(
    let sample = "#light\r\nlet numbers = [1..10]"
    let expected =
        String.Concat [|"<div class='f-sharp'><span class='pp'>#light</span><br>"
        ;"<span class='kw'>let</span>&nbsp;numbers&nbsp;<span class='op'>=</span>&nbsp;"
        ;"<span class='op'>[</span>1<span class='op'>..</span>"
        ;"10<span class='op'>]</span></div>"|]
    sample |> AsHtml |> Should Equal expected)

//Render myself.
File.ReadAllText(__SOURCE_FILE__)
|> AsHtml |> (fun x -> File.WriteAllText(__SOURCE_FILE__ + ".html", x))
As output from itself. I really like how "Fact" and "Theory" turned out, and it seems to suit my current needs just fine.