Based on Clojure, of course!
Helped by Clojure, certainly!
Anyone doubt that Clojure will power this?
But I'm not short of ideas!
Magic?
Tesler's theorem: "AI is whatever hasn't been done yet"
Even Lisp was AI once
… against all odds…
Why not?
Clojure on top of hardware-optimized routines
Clojurists Together
ClojureScript on top of mainstream JS visualization libraries
✓
Data visualizations in Clojure
Interactive art and chart plotting
Useful? Yes.
But at this point it stops being fun
Clojure bindings for the MXNet deep learning framework
However:
We have already started and you are welcome to join right away!
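The snippets below assume requires roughly like these (a sketch; the network functions such as inference-network, fully-connected, sgd-train, and the cost functions come from the code developed in the book and are not listed here):

;; A sketch of the requires the following snippets assume.
;; inference-network, fully-connected, relu, linear, init!, training-network,
;; adam-layer, sgd-train, quadratic-cost!, and mean-absolute-cost! come from
;; the code we develop in the book.
(require '[clojure.java.io :as io]
         '[clojure.data.csv :as csv]
         '[uncomplicate.neanderthal.core
           :refer [ge mrows ncols dim trans transfer mm! mv! axpy axpy! axpby! asum]]
         '[uncomplicate.neanderthal.native :refer [native-float]]
         '[uncomplicate.neanderthal.math :refer [pow sqrt]]
         '[uncomplicate.neanderthal.vect-math :refer [sqr! sqrt! linear-frac!]])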
(def boston-housing-raw
  (-> (io/resource "boston-housing-prices/boston-housing.csv")
      (slurp)
      (csv/read-csv)))
(take 2 boston-housing-raw)
(["crim" "zn" "indus" "chas" "nox" "rm" "age"
"dis" "rad" "tax" "ptratio" "b" "lstat" "medv"]
["0.00632" "18" "2.31" "0" "0.538" "6.575" "65.2"
"4.09" "1" "296" "15.3" "396.9" "4.98" "24"])
(def boston-housing
  (->> (drop 1 boston-housing-raw)
       (map #(mapv (fn [^String x] (Double/valueOf x)) %))
       (shuffle)
       (doall)))
(take 2 boston-housing)
=>
([2.14918 0.0 19.58 0.0 0.871 5.709 98.5 1.6232 5.0 403.0 14.7 261.95 15.79 19.4]
[9.18702 0.0 18.1 0.0 0.7 5.536 100.0 1.5804 24.0 666.0 20.2 396.9 23.6 11.3])
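The 404/102 split in the next steps covers the whole dataset (the Boston housing data has 506 rows: 404 for training and 102 for testing, matching the matrix dimensions below):

(count boston-housing)
=> 506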
(def x-train (->> (take 404 boston-housing)
                  (map (partial take 13))
                  (transfer native-float)))
#RealGEMatrix[float, mxn:13x404, layout:column, offset:0]
▥ ↓ ↓ ↓ ↓ ↓ ┓
→ 2.15 9.19 ⁙ 0.52 1.63
→ 0.00 0.00 ⁙ 0.00 0.00
→ ⁙ ⁙ ⁙ ⁙ ⁙
→ 261.95 396.90 ⁙ 388.45 396.90
→ 15.79 23.60 ⁙ 9.54 34.41
┗ ┛
(def y-train (->> (take 404 boston-housing)
                  (map (partial drop 13))
                  (transfer native-float)))
#RealGEMatrix[float, mxn:1x404, layout:column, offset:0]
▥ ↓ ↓ ↓ ↓ ↓ ┓
→ 19.40 11.30 ⁙ 25.10 14.40
┗ ┛
(def inference
  (inference-network native-float 13
                     [(fully-connected 64 relu)
                      (fully-connected 64 relu)
                      (fully-connected 1 linear)]))
(init! inference)
13 × 64 + 64 × 64 + 64 × 1 = 4992 weights (not counting the bias vectors)
(def x-minibatch (ge x-train (mrows x-train) 16))
(def adam (training-network inference x-minibatch adam-layer))
(time (sgd-train adam x-train y-train quadratic-cost! 80 [0.005]))
"Elapsed time: 260.730503 msecs"
=> 3.2175311257890655
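Since the training network shares its weights with inference, we can keep training simply by calling sgd-train again; a hypothetical follow-up run with a smaller learning rate, reusing the signature shown above:

;; hypothetical continuation of training, same signature as above
(sgd-train adam x-train y-train quadratic-cost! 20 [0.001])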
(def x-test (->> (drop 404 boston-housing)
                 (map (partial take 13))
                 (transfer native-float)))
#RealGEMatrix[float, mxn:13x102, layout:column, offset:0]
▥ ↓ ↓ ↓ ↓ ↓ ┓
→ 38.35 2.31 ⁙ 0.64 7.05
→ 0.00 0.00 ⁙ 0.00 0.00
→ ⁙ ⁙ ⁙ ⁙ ⁙
→ 396.90 348.13 ⁙ 380.02 2.52
→ 30.59 12.03 ⁙ 10.26 23.29
┗ ┛
(def y-test (->> (drop 404 boston-housing)
                 (map (partial drop 13))
                 (transfer native-float)))
#RealGEMatrix[float, mxn:1x102, layout:column, offset:0]
▥ ↓ ↓ ↓ ↓ ↓ ┓
→ 5.00 19.10 ⁙ 18.20 13.40
┗ ┛
(mean-absolute-cost! (axpy! -1 y-test (inference x-test)))
=> 2.790548885569853
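As a sanity check, assuming mean-absolute-cost! is simply the mean of the absolute residuals, the same number can be computed by hand:

;; a hand-rolled check, assuming mean-absolute-cost! = mean |prediction - target|
(let [diff (axpy -1 y-test (inference x-test))]
  (/ (asum diff) (ncols diff)))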
The Adam backward pass:
(backward [_ [t eta lambda rho1 rho2 epsilon]]
  (let [t (inc (long t))
        eta (double (or eta 0.001))
        lambda (double (or lambda 0.0))
        rho1 (double (or rho1 0.9))
        rho2 (double (or rho2 0.999))
        epsilon (double (or epsilon 1e-6))
        eta-avg (- (/ (double eta) (dim ones)))]
    ;; weight gradient, averaged over the minibatch
    (mm! (/ 1.0 (dim ones)) z (trans a-1) 0.0 g)
    ;; biased first moment: s = rho1*s + (1-rho1)*g
    (axpby! (- 1.0 rho1) g rho1 s)
    ;; biased second moment: r = rho2*r + (1-rho2)*g^2
    (axpby! (- 1.0 rho2) (sqr! g) rho2 r)
    ;; bias-corrected step: g = -eta*s-hat / (sqrt(r-hat) + epsilon)
    (linear-frac! (/ (- eta) (- 1.0 (pow rho1 t))) s 0.0
                  (/ 1.0 (sqrt (- 1.0 (pow rho2 t))))
                  (sqrt! r g) epsilon g)
    ;; propagate the error signal to the previous layer
    (when-not first? (mm! 1.0 (trans w) z 0.0 a-1))
    ;; update biases and weights (with L2 weight decay via lambda)
    (mv! eta-avg z ones 1.0 b)
    (axpby! 1.0 g (inc (* eta-avg lambda)) w)))
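For reference, here is the same Adam update written out for a single scalar weight in plain Clojure; a minimal sketch of the rule that the Neanderthal code above applies to whole matrices (the names are illustrative, not the book's API):

;; One Adam step for a single weight w with gradient grad (illustrative only).
(defn adam-step [{:keys [w s r t]} grad
                 {:keys [eta rho1 rho2 epsilon]
                  :or {eta 0.001 rho1 0.9 rho2 0.999 epsilon 1e-6}}]
  (let [t (inc t)
        s (+ (* rho1 s) (* (- 1.0 rho1) grad))            ;; first moment
        r (+ (* rho2 r) (* (- 1.0 rho2) (* grad grad)))   ;; second moment
        s-hat (/ s (- 1.0 (Math/pow rho1 t)))             ;; bias corrections
        r-hat (/ r (- 1.0 (Math/pow rho2 t)))]
    {:w (- w (/ (* eta s-hat) (+ (Math/sqrt r-hat) epsilon)))
     :s s :r r :t t}))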
Subscribe now at: https://aiprobook.com