<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>AI on Squid's Blog</title><link>https://gigasquidsoftware.com/categories/ai/</link><description>Recent content in AI on Squid's Blog</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Sun, 02 Jul 2023 12:31:00 +0000</lastBuildDate><atom:link href="https://gigasquidsoftware.com/categories/ai/atom.xml" rel="self" type="application/rss+xml"/><item><title>Ciphers with Vector Symbolic Architectures</title><link>https://gigasquidsoftware.com/blog/2023/07/02/ciphers-with-vector-symbolic-architectures/</link><pubDate>Sun, 02 Jul 2023 12:31:00 +0000</pubDate><guid>https://gigasquidsoftware.com/blog/2023/07/02/ciphers-with-vector-symbolic-architectures/</guid><description>&lt;p&gt;&lt;img loading="lazy" src="https://raw.githubusercontent.com/gigasquid/vsa-clj/main/examples/secret-message.png"&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;A secret message inside a 10,000-dimensional hypervector&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;We&amp;rsquo;ve seen in previous posts how we can encode data structures using &lt;a href="http://gigasquidsoftware.com/blog/2022/12/31/vector-symbolic-architectures-in-clojure/"&gt;Vector Symbolic Architectures in Clojure&lt;/a&gt;. This post explores how we can use them to develop a cipher for transmitting a secret message between two parties.&lt;/p&gt;
&lt;h3 id="a-hyperdimensional-cipher"&gt;A Hyperdimensional Cipher&lt;/h3&gt;
&lt;p&gt;Usually, we would build a dictionary/cleanup memory of randomly chosen hyperdimensional vectors to represent each symbol. We could do that here, but then the dictionary itself would be the key, and sharing something that large in order to decode messages would be unwieldy. Instead, we can share a single hyperdimensional vector and then use the protect/rotation operator to create a dictionary of the alphabet, along with some numbers to order the letters. Think of this shared vector as the initial seed symbol, with each subsequent symbol defined as &lt;code&gt;n+1&lt;/code&gt;.&lt;/p&gt;</description></item><item><title>Vector Symbolic Architectures in Clojure</title><link>https://gigasquidsoftware.com/blog/2022/12/31/vector-symbolic-architectures-in-clojure/</link><pubDate>Sat, 31 Dec 2022 15:41:00 +0000</pubDate><guid>https://gigasquidsoftware.com/blog/2022/12/31/vector-symbolic-architectures-in-clojure/</guid><description>&lt;p&gt;&lt;img loading="lazy" src="https://live.staticflickr.com/65535/52596142860_c4cf8642b0_z.jpg"&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;generated with Stable Diffusion&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Before diving into the details of what Vector Symbolic Architectures are and what it means to implement Clojure data structures in them, I&amp;rsquo;d like to start with some of my motivation in this space.&lt;/p&gt;
&lt;h2 id="small-ai-for-more-personal-enjoyment"&gt;Small AI for More Personal Enjoyment&lt;/h2&gt;
&lt;p&gt;Over the last few years, I&amp;rsquo;ve spent time learning, exploring, and contributing to open source deep learning. It continues to amaze me with its rapid movement and achievements at scale. However, that scale has grown too big and too slow for me to enjoy anymore.&lt;/p&gt;</description></item><item><title>Breakfast with Zero-Shot NLP</title><link>https://gigasquidsoftware.com/blog/2021/03/15/breakfast-with-zero-shot-nlp/</link><pubDate>Mon, 15 Mar 2021 09:07:00 +0000</pubDate><guid>https://gigasquidsoftware.com/blog/2021/03/15/breakfast-with-zero-shot-nlp/</guid><description>&lt;p&gt;&lt;img loading="lazy" src="https://i.imgflip.com/51ror1.jpg"&gt;&lt;/p&gt;
&lt;p&gt;What if I told you that you could pick up a library model and instantly classify text with arbitrary categories, without any training or fine-tuning?&lt;/p&gt;
&lt;p&gt;That is exactly what we are going to do with &lt;a href="https://joeddav.github.io/blog/2020/05/29/ZSL.html"&gt;Hugging Face&amp;rsquo;s zero-shot learning model&lt;/a&gt;. We will also be using &lt;a href="https://github.com/clj-python/libpython-clj"&gt;libpython-clj&lt;/a&gt; to do this exploration without leaving the comfort of our trusty Clojure REPL.&lt;/p&gt;
&lt;h3 id="whats-for-breakfast"&gt;What&amp;rsquo;s for breakfast?&lt;/h3&gt;
&lt;p&gt;We&amp;rsquo;ll start off by taking some text from a recipe description and trying to decide if it&amp;rsquo;s for breakfast, lunch or dinner:&lt;/p&gt;</description></item><item><title>Thoughts on AI Debate 2</title><link>https://gigasquidsoftware.com/blog/2020/12/24/thoughts-on-ai-debate-2/</link><pubDate>Thu, 24 Dec 2020 10:59:00 +0000</pubDate><guid>https://gigasquidsoftware.com/blog/2020/12/24/thoughts-on-ai-debate-2/</guid><description>&lt;p&gt;&lt;img loading="lazy" src="https://montrealartificialintelligence.com/aidebate2mosaic1440x720v8.jpg"&gt;&lt;/p&gt;
&lt;h2 id="ai-debate-2-from-montrealai"&gt;AI Debate 2 from Montreal.AI&lt;/h2&gt;
&lt;p&gt;I had the pleasure of watching the second AI debate from Montreal.AI last night. The first AI debate occurred last year between &lt;a href="https://yoshuabengio.org/"&gt;Yoshua Bengio&lt;/a&gt; and &lt;a href="https://en.wikipedia.org/wiki/Gary_Marcus"&gt;Gary Marcus&lt;/a&gt; entitled &lt;a href="https://montrealartificialintelligence.com/aidebate.html"&gt;“The Best Way Forward for AI”&lt;/a&gt; in which Yoshua argued that Deep Learning could achieve General AI through its own paradigm, while Marcus argued that Deep Learning alone was not sufficient and needed a hybrid approach involving symbolics and inspiration from other disciplines.&lt;/p&gt;</description></item><item><title>Cats and Dogs with Cortex Redux</title><link>https://gigasquidsoftware.com/blog/2017/11/07/cats-and-dogs-with-cortex-redux/</link><pubDate>Tue, 07 Nov 2017 18:51:00 +0000</pubDate><guid>https://gigasquidsoftware.com/blog/2017/11/07/cats-and-dogs-with-cortex-redux/</guid><description>&lt;p&gt;I wrote a &lt;a href="http://gigasquidsoftware.com/blog/2016/12/27/deep-learning-in-clojure-with-cortex/"&gt;blog post&lt;/a&gt; a while back about using a Clojure machine learning library called &lt;a href="https://github.com/thinktopic/cortex"&gt;Cortex&lt;/a&gt; to do the Kaggle Cats and Dogs classification challenge.&lt;/p&gt;
&lt;p&gt;I wanted to revisit it for a few reasons. The first is that the Cortex library has progressed and improved considerably over the last year. It&amp;rsquo;s still not at version 1.0, but in my eyes, it&amp;rsquo;s really starting to shine. The second reason is that they recently published an &lt;a href="https://github.com/thinktopic/cortex/tree/master/examples/resnet-retrain"&gt;example&lt;/a&gt; of using the RESNET50 model (I&amp;rsquo;ll explain later on) to do fine-tuning or transfer learning. The third reason is that there is a great new plugin for Leiningen that supports using &lt;a href="https://github.com/didiercrunch/lein-jupyter"&gt;Jupyter notebooks with Clojure projects&lt;/a&gt;. These notebooks are a great way of doing walkthroughs and tutorials.&lt;/p&gt;</description></item><item><title>Deep Learning in Clojure with Cortex</title><link>https://gigasquidsoftware.com/blog/2016/12/27/deep-learning-in-clojure-with-cortex/</link><pubDate>Tue, 27 Dec 2016 10:44:00 +0000</pubDate><guid>https://gigasquidsoftware.com/blog/2016/12/27/deep-learning-in-clojure-with-cortex/</guid><description>&lt;p&gt;&lt;strong&gt;Update: Cortex has moved along since I first wrote this blog post, so if you are looking to run the examples, please go and clone the &lt;a href="https://github.com/thinktopic/cortex"&gt;Cortex&lt;/a&gt; repo and look for the cats and dogs code in the examples directory.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;There is an awesome new &lt;em&gt;Clojure-first&lt;/em&gt; machine learning library called &lt;a href="https://github.com/thinktopic/cortex"&gt;Cortex&lt;/a&gt; that was open sourced recently. I&amp;rsquo;ve been exploring it lately and wanted to share my discoveries so far in this post. In our exploration, we are going to tackle one of the classic classification problems of the internet: how do you tell the difference between a cat pic and a dog pic?&lt;/p&gt;</description></item></channel></rss>