
C[omp]ute

Welcome to my blog, which was once a mailing list of the same name and is still generated by mail. Please reply via the "comment" links.

Always interested in offers/projects/new ideas. Eclectic experience in fields like: numerical computing; Python web; Java enterprise; functional languages; GPGPU; SQL databases; etc. Based in Santiago, Chile; telecommute worldwide. CV; email.


© 2006-2017 Andrew Cooke (site) / post authors (content).

Generating Docs from a GitHub Wiki

From: andrew cooke <andrew@...>

Date: Fri, 30 May 2014 16:01:01 -0400

I am working on a project that I have been documenting in a GitHub wiki, using
markdown.  Now we want to provide the client with a "paper" document.  Rather
than copy and paste everything, I am automating the document generation from
the Wiki.

(Obviously the code below isn't perfect - in particular, there's no real order
to the pages - but it's a good start and can be extended as required.)
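If page ordering does become necessary, one simple extension is an explicit order list that the driver consults before concatenating pages. This is a sketch of one possible approach, not part of the scripts that follow; the page names are made up:

```python
# sort wiki pages by their position in an explicit order list;
# pages not listed fall to the end, alphabetically

def order_pages(pages, order):
    index = {name: i for i, name in enumerate(order)}
    return sorted(pages, key=lambda p: (index.get(p, len(order)), p))

# hypothetical example: Introduction first, then Design; the rest alphabetical
pages = ['Api.md', 'Design.md', 'Introduction.md', 'Build.md']
print(order_pages(pages, ['Introduction.md', 'Design.md']))
# ['Introduction.md', 'Design.md', 'Api.md', 'Build.md']
```

The order list could live in a plain text file in the wiki repo itself, so the client-facing structure is versioned alongside the content.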


First, the driver script that generates the docs (change WIKI-DIR and WIKI-URL
as required):

  #!/bin/bash

  if [ ! -e WIKI-DIR ]; then
      git clone WIKI-URL
  else
      pushd WIKI-DIR
      git pull
      popd
  fi

  REPORT=report.md
  rm -f $REPORT
  cp header.md $REPORT

  for WIKIPAGE in `ls -1 WIKI-DIR | grep -v "~" | grep -iv "home.md"`
  do
      TITLE=`echo $WIKIPAGE | sed 's/-/ /g' | sed 's/\.md$//'`
      echo "" >> $REPORT
      echo "# $TITLE" >> $REPORT
      echo "" >> $REPORT
      ./fix-md.py WIKI-DIR/$WIKIPAGE | sed 's/^% .*//' >> $REPORT
  done

  cat footer.md >> $REPORT
  pandoc -s --toc -f markdown $REPORT -o report.pdf
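The TITLE line above relies on how GitHub names wiki files: hyphens stand in for spaces, plus a .md extension. The same transformation, written as a Python helper for clarity (a sketch equivalent to the two sed calls; the filename is an invented example):

```python
import re

def title_from_filename(name):
    # drop the trailing .md extension, then turn hyphens into spaces
    return re.sub(r'\.md$', '', name).replace('-', ' ')

print(title_from_filename('Getting-Started.md'))  # Getting Started
```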


Second, a Python script that you can use to modify each page as required:

  #!/usr/bin/python

  # convert wiki pages so that they are suitable for inclusion in the report

  # reads from stdin (or args), writes to stdout. i can't remember enough
  # awk...

  from __future__ import print_function
  import fileinput
  import re

  HEADER = re.compile(r'^(#+)\s+(.*)')

  for line in fileinput.input():

      # replace wikilinks by normal links
      line = re.sub(r'\[\[([^]]*)\]\]', r'[\1](\1)', line)

      # headers are increased one level and auto-capped
      header = HEADER.match(line)
      if header:
          print('#%s ' % header.group(1), end='')
          for word in header.group(2).split():
              print(word[0] + word[1:].lower() + ' ', end='')
          print()

      # anything else is copied to stdout
      else:
          print(line, end='')
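As a quick check of the two transformations, here is how they behave on made-up input (the sample strings are mine, not from the wiki):

```python
import re

# the same wikilink substitution the script applies:
# [[Page]] becomes a normal markdown link [Page](Page)
line = re.sub(r'\[\[([^]]*)\]\]', r'[\1](\1)', 'see [[Some-Page]] for details')
print(line)  # see [Some-Page](Some-Page) for details

# headers gain one level and are re-capitalised word by word
header = re.match(r'^(#+)\s+(.*)', '# USING THE API')
words = [w[0] + w[1:].lower() for w in header.group(2).split()]
print('#%s %s' % (header.group(1), ' '.join(words)))  # ## Using The Api
```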


Also, you need to define your own footer.md and header.md.

Andrew

Comment on this post