Differentiating in Python

From: "andrew cooke" <andrew@...>

Date: Sun, 31 Dec 2006 20:21:52 -0300 (CLST)

I read this post -
- and realised I had written the same code in Python.  I hunted around and
found it on my (password protected) diary -
http://www.acooke.org/andrew/diary/2004/mar/4.html - so here it is in public.

It parses, differentiates, and then simplifies simple numerical terms.



the original expression is a+3*b*a
it was parsed to (a+(3*(b*a)))
the differential wrt a is (1+(3*b))

the original expression is a/b
it was parsed to (a/b)
the differential wrt b is (0-(a*(1/(b*b))))

the original expression is a+b+c+d
it was parsed to (a+(b+(c+d)))
the differential wrt b is 1

the original expression is a+b*c+d
it was parsed to (a+((b*c)+d))
the differential wrt b is c


# calculate first derivatives of an arithmetic expression
# basic code using just +,-,/,*,(), integers and variables (lowercase),
# but functions aren't any harder conceptually

# here's the interesting bit
# walk the ast to calculate the derivative wrt some variable

# you'd add functions in the normal way - for example sin(...) would map
# to diff(...) * cos(...) by the chain rule (a sketch follows diffwrt below)

def diffwrt(n,var):
    def ifn(n): return 0
    def vfn(v):
        if v == var: return 1
        else: return 0
    def nfn(op, n1, n2, dn1, dn2):
        if op == '+' or op == '-':
            return (op, dn1, dn2)
        elif op == '*':
            return ('+',
                    ('*', dn1, n2),
                    ('*', n1, dn2))
        elif op == '/':
            return ('-',
                    ('/', dn1, n2),
                    ('*', n1,
                     ('/', dn2, ('*', n2, n2))))
    return folddown(ifn, vfn, nfn, n)
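
# a possible extension (not in the code above, just a sketch): if the ast
# also allowed unary nodes like ('sin', n), the chain rule clause would
# look something like this - folddown would need a matching unary case too

def diffunary(fn, n, dn):
    # d(sin(n)) = dn * cos(n);  d(cos(n)) = 0 - dn * sin(n)
    if fn == 'sin': return ('*', dn, ('cos', n))
    elif fn == 'cos': return ('-', 0, ('*', dn, ('sin', n)))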

# simplify: drop multiplications by 1 and additions of 0, fold integer arithmetic

def tidy(n):
    def ifn(n): return n
    def vfn(v): return v
    def nfn(op, n1, n2, dn1, dn2):
        if isinstance(dn1, int):
            if dn1 == 0:
                if op == '+': return dn2
                elif op == '*': return 0
                elif op == '/': return 0
            elif dn1 == 1 and op == '*': return dn2
        if isinstance(dn2, int):
            if dn2 == 0:
                if op == '+': return dn1
                elif op == '*': return 0
                elif op == '/': raise ZeroDivisionError("division by zero")
            elif dn2 == 1 and op == '*': return dn1
        if isinstance(dn1, int) and isinstance(dn2, int):
            if op == '+': return dn1 + dn2
            elif op == '-': return dn1 - dn2
            elif op == '*': return dn1 * dn2
            elif op == '/': return dn1 / dn2
        return (op, dn1, dn2)
    return folddown(ifn, vfn, nfn, n)
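
# for example (a hypothetical call, not in the original):
#   tidy(('+', 0, ('*', 1, 'c')))  ->  'c'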

def folddown(ifn, vfn, nfn, n):
    # bottom-up fold over the ast: ints go to ifn, variables to vfn, and
    # (op, n1, n2) nodes to nfn along with the already-folded subtrees
    if isinstance(n, int): return ifn(n)
    elif isinstance(n, str): return vfn(n)
    else:
        (op, n1, n2) = n
        (dn1, dn2) = (folddown(ifn,vfn,nfn,n1), folddown(ifn,vfn,nfn,n2))
        return nfn(op, n1, n2, dn1, dn2)

# a "simple" recursive descent parser

# grammar:
# expr: term ((+|-) term)*
# term: fact ((*|/) fact)*
# fact: '(' expr ')' | var | num
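
# for example "a+b*c" is a single expr: the term "a", then '+', then the
# term "b*c" (which is the fact "b", '*', the fact "c")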

# the ast is just (operator, node, node) tuples

# utilities

def dropspace(s):
    if s and s[0] == ' ': return dropspace(s[1:])
    else: return s

def empty(s): return dropspace(s) == ""

# so these are a bunch of 'recognisers' (eg cousineau + mauny)
# (you can think of them as tokenizers - they either return a match plus
# the remaining text or None)

def add(s): return mkonechar('+')(s)
def subtract(s): return mkonechar('-')(s)
def multiply(s): return mkonechar('*')(s)
def divide(s): return mkonechar('/')(s)
def openbracket(s): return mkonechar('(')(s)
def closebracket(s): return mkonechar(')')(s)

def variable(s): return mkmanychar(lambda c : c >= 'a' and c <= 'z')(s)
def number(s): return mkmanychar(lambda c : c >= '0' and c <= '9')(s)
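
# for example (hypothetical calls, not in the original):
#   variable("ab + c")  ->  ('ab', ' + c')
#   add(" + c")         ->  ('+', ' c')
#   number("xyz")       ->  None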

def mkonechar(c):
    def localonechar(s):
        ss = dropspace(s)
        if ss and ss[0] == c: return (c,ss[1:])
        else: return None
    return lambda s: localonechar(s)

def mkmanychar(p):
    def accum(s,id=""):
        if s and p(s[0]): return accum(s[1:],id+s[0])
        elif id != "": return (id,s)
        else: return None
    return lambda s: accum(dropspace(s))

# this handles the "nxt ((p1|p2) nxt)*" structure in the grammar

def mkextend(p1,p2,nxt):
    def localextend(n1,s):
        if p1(s):
            (x,s) = p1(s)
            (n2,s) = nxt(s)
            return ((x,n1,n2),s)
        elif p2(s):
            (x,s) = p2(s)
            (n2,s) = nxt(s)
            return ((x,n1,n2),s)
        else: return (n1,s)
    return lambda n,s: localextend(n,s)
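
# for example (a hypothetical call, not in the original):
#   mkextend(add, subtract, expr)('a', '+b')  ->  (('+', 'a', 'b'), '')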

# and this is the parser itself

def expr(s):
    if term(s):
        (n,s) = term(s)
        return extendexpr(n,s)
    else: raise ValueError("cannot parse " + s)

def extendexpr(n,s): return mkextend(add,subtract,expr)(n,s)

def term(s):
    if fact(s):
        (n,s) = fact(s)
        return extendterm(n,s)
    else: raise ValueError("cannot parse " + s)

def extendterm(n,s): return mkextend(multiply,divide,term)(n,s)

def fact(s):
    if openbracket(s):
        (x,s) = openbracket(s)
        if expr(s):
            (n,s) = expr(s)
            if closebracket(s):
                (x,s) = closebracket(s)
                return (n,s)
            else: raise ValueError("missing ): " + s)
        else: raise ValueError("cannot parse: " + s)
    elif variable(s):
        (x,s) = variable(s)
        return (x,s)
    elif number(s):
        (x,s) = number(s)
        return (int(x),s)
    else: raise ValueError("cannot parse: " + s)

# finally, a pretty printer

def asttostring(n):
    if isinstance(n, int): return str(n)
    elif isinstance(n, str): return n
    else: return astoptostring(n)

def astoptostring((op,n1,n2)):
    return "("+asttostring(n1)+op+asttostring(n2)+")"

# and test

def demo(text, var):
    (ast,x) = expr(text)
    ast2 = tidy(ast)
    print "the original expression is", text
    print "it was parsed to", asttostring(ast2)
    diff = tidy(diffwrt(ast2, var))
    print "the differential wrt", var, "is", asttostring(diff)

demo("a+3*b*a", "a")
demo("a/b", "b")
demo("a+b+c+d", "b")
demo("a+b*c+d", "b")

This Differentiates Strings

From: "andrew cooke" <andrew@...>

Date: Sun, 31 Dec 2006 21:53:52 -0300 (CLST)

I just realised that the code I linked to differentiates actual functions
(although it's not quite as cool as that sounds): rather than using
reflection to handle expressions directly (which is probably impossible in
OCaml, but would work in Lisp, I suppose), it *appears* (I don't fully
understand the code) to define its own datatypes for the different kinds
of values.

Since I don't fully understand things I'm not sure, but I think that means
it's equivalent to my code less the parsing.  If that's right I'm
surprised at how much code is needed (most of my code is parsing, although
I don't handle as wide a range of functions).

Suspect I'm missing something.  Maybe the idea is that one can import the
defined module rather than some standard modules, without changing any
code, and get differentiation "for free".  Wish there was more
explanation with the code...


Differentiating Functions is Important

From: Will M Farr <farr@...>

Date: Thu, 4 Jan 2007 10:42:26 -0500


I think you have it correct in your comment above, but I'd like to  
emphasize that differentiating functions (as I do) is very  
important.  Using reflection (or lisp-style macros) to obtain a  
textual representation of a function (as you do) doesn't work well  
when you compose functions together, or use non-mathematical  
operators in a function.  For example, what's the derivative of

fun x ->
	if x > 0 then
		0 - x

at x = 3?  (You could do this textually if your text processor knew  
enough to avoid processing the if statement, but in order to make  
this work in general, you would have to process the whole language.)   
How about this:

fun x ->
	let y = 0 - x in
	if y < 0 then
		0 - y

It gets even worse if you define a function like

fun x ->
	let y = other_fun x in
	let z = another_fun y in
	x *. y *. z

To process this textually, you need a database which stores the text  
of other_fun and another_fun so they can be re-differentiated w.r.t x  
and then y (respectively).  Don't forget to apply the chain rule to  
the derivative of another_fun (since the argument is y)!  It's really  
a mess.

By the way, your last paragraph is entirely correct---I just take  
some existing code, add "open Deriv" to the top, place (C ...) in  
front of all numerical constants, and then I get derivatives for free.
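
(To make that concrete, here is a rough python sketch of the same idea -
a number type that carries its derivative along and overloads the
arithmetic operators.  This is not the Deriv code, just an illustration,
and all the names below are invented:)

class Dual(object):
    # a value paired with its derivative wrt some chosen input
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __add__(self, o):
        o = todual(o)
        return Dual(self.value + o.value, self.deriv + o.deriv)
    __radd__ = __add__
    def __sub__(self, o):
        o = todual(o)
        return Dual(self.value - o.value, self.deriv - o.deriv)
    def __rsub__(self, o):
        return todual(o) - self
    def __mul__(self, o):
        o = todual(o)
        return Dual(self.value * o.value,
                    self.deriv * o.value + self.value * o.deriv)
    __rmul__ = __mul__
    def __gt__(self, o): return self.value > todual(o).value
    def __lt__(self, o): return self.value < todual(o).value

def todual(x):
    return x if isinstance(x, Dual) else Dual(x)

def deriv(f, x):
    # seed x with derivative 1, run the unchanged function, read it back
    return f(Dual(x, 1.0)).deriv

def g(x):
    # the first example above, with an else branch so it is total
    if x > 0: return 0 - x
    else: return x

print(deriv(g, 3.0))   # -1.0, with no text processing anywhere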

A disadvantage of my method, which yours doesn't share, is that it  
would compute the derivative of this function:

fun x ->
	x +. 3.0 -. x

as

fun x ->
	1.0 +. 0.0 -. 1.0

I don't do any simplification, because I don't have the text of the  
expressions available at all.

Thanks for the interesting post---it's fun to see other people doing  
this kind of stuff!


Re: Differentiating Functions is Important

From: "andrew cooke" <andrew@...>

Date: Thu, 4 Jan 2007 13:10:12 -0300 (CLST)


Thanks for replying!  I just re-read my comment and it sounded more
negative than I intended, because when I wrote the first half I still
hadn't worked out what I wrote in the second half (if you see what I
mean!).  Since that final guess is correct, this really is pretty sweet.

The only thing we may disagree on is that I think Lisp could combine the
best of both worlds, in that a Lisp macro effectively gives you both the
text and the "real function" (but I don't have enough experience to know
how the library you refer to would be handled/avoided - perhaps it
cannot).  Maybe MetaML would allow something similar to Lisp in ML, but
then you probably wouldn't be able to integrate the new functions into
existing code so easily (I think that's an excellent example of how cool
ML's module system is).  And of course, with Lisp you don't have a decent
type system :o)


Caseless (test)

From: "andrew cooke" <andrew@...>

Date: Thu, 4 Jan 2007 14:11:00 -0300 (CLST)

Just found a bug in the blog system - emails can be forced to lower case,
so I have changed things to make IDs caseless.  This is to test whether it
works.

