( news )

We'd like to thank Rich Hickey for a great presentation here in New York City on November 10th, 2016

Rich Hickey is best known for creating Clojure, a functional programming language that runs on the Java virtual machine. Rich is also the inventor of ClojureScript, a compiler for Clojure that targets JavaScript, as well as Datomic, a fully transactional, cloud-ready, distributed database.

Rich presented Clojure Spec, a new Clojure library that helps automate validation, error reporting, destructuring, instrumentation, test-data generation, and generative testing.
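As a flavor of what the library does, here is a minimal sketch of spec-based validation and error reporting. The `::person` spec and its keys are illustrative names, not examples from the talk; in current Clojure releases (1.9+) the namespace is `clojure.spec.alpha`:

```clojure
(require '[clojure.spec.alpha :as s])

;; Specs are named by namespaced keywords and built from ordinary predicates.
(s/def ::name string?)
(s/def ::age (s/and int? pos?))
(s/def ::person (s/keys :req-un [::name ::age]))

;; Validation: does a value conform to the spec?
(s/valid? ::person {:name "Ada" :age 36})  ;=> true
(s/valid? ::person {:name "Ada" :age -1})  ;=> false

;; Error reporting: explain *why* a value fails.
(s/explain-str ::person {:name "Ada" :age -1})
```

The same specs also drive destructuring (`s/conform`), instrumentation of function arguments, and test-data generation via `clojure.spec.gen.alpha`.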

Our sincerest gratitude for an informative and delightful evening!

( meeting - Tuesday, June 13, 7:00 PM - Pierre de Lacaze on Deep Learning )


This presentation is Part 2 of my September Lisp NYC presentation on Reinforcement Learning and Artificial Neural Nets. We will continue from where we left off by covering Convolutional Neural Nets (CNN) and Recurrent Neural Nets (RNN) in depth. We will also touch upon Deep-Q Networks (DQN) and, time permitting, briefly describe DeepMind's latest Nature paper on Differentiable Neural Computers (DNC). Code examples will again be provided in Clojure.

After a very brief recap of Part 1 (ANN & RL), we will jump right into CNNs and their suitability for image recognition. We will start by covering the convolution operator, then explain feature maps and pooling operations, and then walk through the LeNet-5 architecture. The MNIST dataset will be used to illustrate a fully functioning CNN. Next we cover Recurrent Neural Nets in depth and describe how they have been used in Natural Language Processing.
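To make the convolution operator concrete, here is a minimal "valid-mode" 2-D convolution in plain Clojure (strictly speaking a cross-correlation, as most CNN frameworks implement it): slide the kernel over the image and sum the element-wise products. This is an illustrative sketch using plain vectors, not code from the talk:

```clojure
(defn conv2d
  "Valid-mode 2-D cross-correlation of a nested-vector image with a kernel."
  [image kernel]
  (let [kh (count kernel)
        kw (count (first kernel))
        oh (inc (- (count image) kh))          ; output height
        ow (inc (- (count (first image)) kw))] ; output width
    (vec
     (for [i (range oh)]
       (vec
        (for [j (range ow)]
          ;; sum of element-wise products over the kh x kw window at (i, j)
          (reduce +
                  (for [di (range kh), dj (range kw)]
                    (* (get-in image  [(+ i di) (+ j dj)])
                       (get-in kernel [di dj]))))))))))

;; A 3x3 vertical-edge kernel applied to a 4x4 image whose left half is bright:
(conv2d [[1 1 0 0]
         [1 1 0 0]
         [1 1 0 0]
         [1 1 0 0]]
        [[1 0 -1]
         [1 0 -1]
         [1 0 -1]])
;=> [[3 3] [3 3]]
```

A feature map is exactly the output of such a convolution after a nonlinearity; pooling then downsamples it (e.g. taking the max over each 2x2 block).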

The remainder of the talk will essentially describe two papers by Google's DeepMind team. The first paper (the Atari paper) will be used to describe Deep-Q Networks and the bridging of two different fields in Machine Learning, Reinforcement Learning (RL) and Deep Learning (DL), to produce Deep Reinforcement Learning. The second and very recent paper will be used to illustrate a new type of computer architecture: a neural network augmented with an external memory, referred to as a Differentiable Neural Computer (DNC).

Please note that some exposure to or familiarity with the following concepts and algorithms will be assumed: Gradient Descent, Backpropagation, ANN, MLP, Activation Functions, Differentiation & Integration, Q-Learning, and TD-Learning.
These are covered in the first part of the talk for which both video and slides are available online.
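As a refresher on the assumed Q-Learning background, the tabular update rule is Q(s,a) ← Q(s,a) + α·(r + γ·max_a′ Q(s′,a′) − Q(s,a)). A minimal Clojure sketch, with the Q-table as a map from [state action] pairs to values (all names here are illustrative, not from the talk):

```clojure
(defn q-update
  "One tabular Q-learning step: taking action a in state s yielded reward r
   and next state s'. alpha is the learning rate, gamma the discount factor."
  [q [s a] r s' actions alpha gamma]
  (let [old       (get q [s a] 0.0)
        ;; value of the best available action in the next state
        best-next (apply max (map #(get q [s' %] 0.0) actions))
        target    (+ r (* gamma best-next))]
    (assoc q [s a] (+ old (* alpha (- target old))))))

;; One update from an empty table: action :right in :s1 gave reward 1.0
;; and led to :s2, with alpha = 0.5 and gamma = 0.9.
(q-update {} [:s1 :right] 1.0 :s2 [:left :right] 0.5 0.9)
;=> {[:s1 :right] 0.5}
```

Deep-Q Networks replace this lookup table with a neural network that approximates Q(s, a), which is the bridge between RL and DL discussed in the talk.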

A lot of material will be drawn from the new Deep Learning book. 


Pierre de Lacaze has over 20 years of industry experience with Lisp and AI-based technologies. He holds a Bachelor of Science in Applied Mathematics and a Master's Degree in Computer Science. He is the President of LispNYC and Director of Machine Intelligence at Shareablee.

Ladders Inc.
55 Water Street, 51st Floor


LispNYC is a nonprofit unincorporated association dedicated to the advocacy and advancement of Lisp-based software and development technologies such as Common Lisp, Clojure and Scheme.

We focus on education, outreach, regular monthly meetings and development projects.

Meetings are the second Tuesday of every month, are free and open to all.

Providing parentheses to NYC since 2002