Interactive Programming
for Artificial Intelligence

Dragan Djuric

[email protected]

Dragan Djuric

A cure for theory-phobia

A cure for math-phobia

More books will follow!

Interactive GPU Programming with CUDA

Based on Clojure, of course!

Interactive GPU Programming with OpenCL

Helped by Clojure, certainly!

Bayesian Data Analysis for Programmers

Anyone doubt that Clojure will power this?

Let's not look that far into the future

But I'm not short of ideas!

What?

Artificial Intelligence

  • field of study
    • artificial devices
    • perceive environment
    • take actions towards achieving goals
  • Mimic cognitive functions
    • "learning"
    • "problem solving"
    • feelings? (Do Androids Dream of Electric Sheep?)

Magic?

AI Effect

  • Deep Blue plays chess
  • Siri talks with humans
  • Waymo car drives itself
  • Algorithms trade by themselves

Tesler's theorem: "AI is whatever hasn't been done yet"

AI Winter

  • Boom/bust cycles
  • grand objectives vs intractability
  • connectionist vs. logic-based
  • symbolic vs. statistical learning
  • integrative

Even Lisp was AI once

Machine Learning

  • popular and successful
  • subset of AI (vs. logic-based approaches)
  • probability & statistics

Deep Learning

  • a fancy name for Neural Networks
  • just multi-layered ones
  • and with novel structures

Neural Networks

  • NN ≠ neurons in human brain
  • linear layers + non-linear activation functions
  • matrix multiplication + vectorized functions
  • + a way to find useful numbers to put in these matrices
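
The bullets above can be sketched in a few lines of plain Clojure (a toy illustration only; the real implementations use optimized BLAS routines, and the `layer`, `dot`, and `relu` names here are made up for this sketch):

```clojure
;; A fully connected layer is a matrix-vector product followed by
;; an element-wise (non-linear) activation function.
(defn relu [x] (max 0.0 x))

(defn dot [xs ys] (reduce + (map * xs ys)))

(defn layer [weights biases activation input]
  ;; weights: one row of numbers per output neuron
  (mapv (fn [row b] (activation (+ (dot row input) b)))
        weights biases))

(layer [[1.0 -1.0] [0.5 0.5]] [0.0 0.0] relu [2.0 1.0])
;; => [1.0 1.5]
```

"Finding useful numbers to put in these matrices" is then the job of the training algorithm (backpropagation), shown later.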

Programming

  • About Clojure… or not?
  • Implementation of AI techniques
  • Integration into other software
  • Often an afterthought in AI

Interactive

  • Python: only the model is interactive
  • Clojure: the whole system
  • Frameworks vs libraries
  • REPL
  • Calling pre-built code vs development

Who?

Data scientists & Analysts?

  • Interactive 😄 ✓
  • Artificial Intelligence 😲
  • Programming? 😕

AI researchers?

  • Artificial Intelligence 😁 ✓
  • Interactive 😪
  • Programming? 😴

Programmers!

  • Programming! 😋 ✓
  • Interactive! 😎 ✓
  • Artificial Intelligence! 😉

Why?

… against all odds…

LISP: The Original AI Language

Clojure: The AI Language!

Why not?

Cure the math-phobia

  • Learn by implementing AI techniques from scratch
  • integrate AI into the software toolbox
  • A great hobby!

How?

Uncomplicate

https://uncomplicate.org

Clojure on top of hardware optimized routines

Neanderthal

Deep Diamond

Clojurists Together

Bayadera

ClojureCUDA

ClojureCL

Visualization

ClojureScript on top of mainstream JS visualization libraries

Oz

Data visualizations in Clojure

Saite/Hanami

Interactive arts and charts plotting

Java-based tools

Useful? Yes.

But at this point, it stops being fun

MXNet

Clojure bindings for MXNet Deep Learning framework

However:

  • Clojure
  • on top of Scala,
  • on top of Java,
  • on top of C++,
  • on top of CUDA or Intel binaries…

Deeplearning4J

  • Clojure,
  • on top of raw Java interop
  • on top of an unwieldy Java/C++ mix
  • on top of CUDA / Intel binaries

When?

We have already started and you are welcome to join right away!

From Scratch!

  • Learn by coding from scratch
  • Try it on increasingly sophisticated DL tasks
  • It doesn't have to be a Skynet to be useful!

Load the data

(def boston-housing-raw
  (-> (io/resource "boston-housing-prices/boston-housing.csv")
      (slurp)
      (csv/read-csv)))

(take 2 boston-housing-raw)
(["crim" "zn" "indus" "chas" "nox" "rm" "age"
  "dis" "rad" "tax" "ptratio" "b" "lstat" "medv"]
 ["0.00632" "18" "2.31" "0" "0.538" "6.575" "65.2"
  "4.09" "1" "296" "15.3" "396.9" "4.98" "24"])

Convert it to numbers

(def boston-housing
  (->> (drop 1 boston-housing-raw)
       (map #(mapv (fn [^String x] (Double/valueOf x)) %))
       (shuffle)
       (doall)))

(take 2 boston-housing)
=>
([2.14918 0.0 19.58 0.0 0.871 5.709 98.5 1.6232 5.0 403.0 14.7 261.95 15.79 19.4]
 [9.18702 0.0 18.1 0.0 0.7 5.536 100.0 1.5804 24.0 666.0 20.2 396.9 23.6 11.3])

Transfer it to tensors

(def x-train (->> (take 404 boston-housing)
                  (map (partial take 13))
                  (transfer native-float)))
#RealGEMatrix[float, mxn:13x404, layout:column, offset:0]
   ▥       ↓       ↓       ↓       ↓       ↓       ┓
   →       2.15    9.19    ⁙       0.52    1.63
   →       0.00    0.00    ⁙       0.00    0.00
   →       ⁙       ⁙       ⁙       ⁙       ⁙
   →     261.95  396.90    ⁙     388.45  396.90
   →      15.79   23.60    ⁙       9.54   34.41
   ┗                                               ┛
(def y-train (->> (take 404 boston-housing)
                  (map (partial drop 13))
                  (transfer native-float)))
#RealGEMatrix[float, mxn:1x404, layout:column, offset:0]
   ▥       ↓       ↓       ↓       ↓       ↓       ┓
   →      19.40   11.30    ⁙      25.10   14.40
   ┗                                               ┛

Create the Neural Network

(def inference
  (inference-network native-float 13
                     [(fully-connected 64 relu)
                      (fully-connected 64 relu)
                      (fully-connected 1 linear)]))
(init! inference)

13 × 64 + 64 × 64 + 64 × 1 = 4992 weights
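
That weight count can be checked with a one-line helper (a plain-Clojure sketch; `weight-count` is illustrative, not a library function):

```clojure
;; Each fully connected layer has in × out weights.
(defn weight-count [layer-sizes]
  (reduce + (map * layer-sizes (rest layer-sizes))))

(weight-count [13 64 64 1])
;; => 4992
```

The biases add another 64 + 64 + 1 = 129 parameters on top of the 4992 weights.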

Prepare it for training

(def x-minibatch (ge x-train (mrows x-train) 16))
(def adam (training-network inference x-minibatch adam-layer))

Train!

(time (sgd-train adam x-train y-train quadratic-cost! 80 [0.005]))
"Elapsed time: 260.730503 msecs"
=> 3.2175311257890655

Provide new, unseen data

(def x-test (->> (drop 404 boston-housing)
                 (map (partial take 13))
                 (transfer native-float)))
#RealGEMatrix[float, mxn:13x102, layout:column, offset:0]
   ▥       ↓       ↓       ↓       ↓       ↓       ┓
   →      38.35    2.31    ⁙       0.64    7.05
   →       0.00    0.00    ⁙       0.00    0.00
   →       ⁙       ⁙       ⁙       ⁙       ⁙
   →     396.90  348.13    ⁙     380.02    2.52
   →      30.59   12.03    ⁙      10.26   23.29
   ┗                                               ┛
(def y-test (->> (drop 404 boston-housing)
                 (map (partial drop 13))
                 (transfer native-float)))
#RealGEMatrix[float, mxn:1x102, layout:column, offset:0]
   ▥       ↓       ↓       ↓       ↓       ↓       ┓
   →       5.00   19.10    ⁙      18.20   13.40
   ┗                                               ┛

Test it on unseen data

(mean-absolute-cost! (axpy! -1 y-test (inference x-test)))
=> 2.790548885569853
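
For reference, the mean absolute cost is just the mean of |prediction − target| over the test set; a plain-Clojure sketch (the `mean-absolute-error` helper is illustrative, not the library's implementation):

```clojure
;; Mean of the absolute differences between predictions and targets.
(defn mean-absolute-error [predictions targets]
  (/ (reduce + (map (fn [p t] (Math/abs (- (double p) (double t))))
                    predictions targets))
     (count predictions)))

(mean-absolute-error [21.0 15.0] [19.0 14.0])
;; => 1.5
```

So the trained network misses the true median house price by about $2,790 on average (prices are in thousands of dollars).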

6 lines of Code

(backward [_ [t eta lambda rho1 rho2 epsilon]]
    (let [t (inc (long t))                      ;; time step
          eta (double (or eta 0.001))           ;; learning rate
          lambda (double (or lambda 0.0))       ;; L2 regularization
          rho1 (double (or rho1 0.9))           ;; 1st moment decay rate
          rho2 (double (or rho2 0.999))         ;; 2nd moment decay rate
          epsilon (double (or epsilon 1e-6))    ;; numerical stability term
          eta-avg (- (/ (double eta) (dim ones)))]
      (mm! (/ 1.0 (dim ones)) z (trans a-1) 0.0 g)    ;; average gradient g
      (axpby! (- 1.0 rho1) g rho1 s)                  ;; biased 1st moment estimate s
      (axpby! (- 1.0 rho2) (sqr! g) rho2 r)           ;; biased 2nd moment estimate r
      (linear-frac! (/ (- eta) (- 1.0 (pow rho1 t))) s 0.0
                    (/ 1.0 (sqrt (- 1.0 (pow rho2 t))))
                    (sqrt! r g) epsilon g)            ;; bias-corrected update, in g
      (when-not first? (mm! 1.0 (trans w) z 0.0 a-1)) ;; propagate error backwards
      (mv! eta-avg z ones 1.0 b)                      ;; update biases b
      (axpby! 1.0 g (inc (* eta-avg lambda)) w)))     ;; apply update + weight decay to w
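
Line by line, this is the Adam optimizer; in standard notation (where g is the averaged gradient, and s and r are the first and second moment estimates):

```latex
s_t = \rho_1 s_{t-1} + (1 - \rho_1)\, g_t \\
r_t = \rho_2 r_{t-1} + (1 - \rho_2)\, g_t \odot g_t \\
\Delta\theta = -\eta\, \frac{s_t / (1 - \rho_1^t)}{\sqrt{r_t / (1 - \rho_2^t)} + \epsilon}
```

Each equation maps to one vectorized call: the two `axpby!` calls maintain s and r, while `linear-frac!` fuses both bias corrections and the division into a single operation.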

the only AI book series for programmers

  • interactive & dynamic
  • step-by-step implementation
  • incredible performance, yet no C++ hell (!)
  • Intel & AMD CPUs (DNNL)
  • Nvidia GPUs (CUDA and cuDNN)
  • AMD GPUs (yes, OpenCL too!)
  • Clojure (it's magic!)
  • Java Virtual Machine (without Java boilerplate!)
  • complete source code
  • beautiful typesetting

no middleman!

Continuous improvement - always up-to-date

100% of the revenue goes towards my open-source work!

the only AI book that walks the walk

complete, 100% executable code

step-by-step instructions

full path from theory to implementation in actual code

superfast implementation

learn DL by implementing it from scratch

classic neural networks using fast linear algebra

build an optimized backpropagation algorithm step-by-step

explore it on the CPU

run it on the GPU!

design an elegant neural network API

add tensor support

integrate with Intel's DNNL and Nvidia's cuDNN performance libraries

learn the nuts and bolts

build convolutional layers

build RNN support

understand how to use it to solve practical problems

…and much more!

Software + Learning

  • Simple yet powerful libraries
  • Full learning path: from theory to code

Subscribe now at: https://aiprobook.com