07 October 2022

AI Arrives Pt. 2


OPEN SOURCE

Open Source and The Transformers

There are two huge drivers propelling AI right now: 1) Access to free open-source AI models and 2) Huge technical breakthroughs in how AI models work.

We’re breaking down both, starting with the center of the free and open AI world: Hugging Face.

Hugging Face

One Liner: The home for open source AI

Raised: $100M Series C at a $2 billion valuation from Lux Capital

This is Lux's second time leading a round in 🤗, which is quite a vote of confidence. After diving into the company, I'm jealous.

Investors: Lux Capital, Sequoia, Coatue, Addition, SV Angel, Olivier Pomel (founder of Datadog)

Traction: 10,000+ paying enterprise customers, 70k+ developers

Why it Matters: Hugging Face is the leading platform for AI developers and researchers. It hosts over 75,000 pre-trained AI models and 11,000 datasets that anyone can build on top of. Literally named 🤗, it's something between GitHub and AWS for AI.

Critically, with Hugging Face you don't have to know how to build AI models or even how they work, just like you don't need to understand an engine to drive a car. Hugging Face's tooling empowers any developer to build on AI models from Facebook, Google, OpenAI, and dozens of others.

It provides the models, the datasets, and the cloud compute to train new AI, plus a home for the web apps people have built. You can check out some of the demos on the Spaces page.
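To give a sense of how little code that takes, here's a minimal sketch using Hugging Face's open-source transformers library (assuming it's installed; the pre-trained model downloads from the Hub on first run):

```python
# Run a pre-trained sentiment model without knowing anything about how it was built.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default model from the Hub

print(classifier("Hugging Face makes this almost embarrassingly easy."))
# -> [{'label': 'POSITIVE', 'score': 0.99...}]
```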

Armed with these tools, developers can start building real products and startups on AI.

Note: When I wrote Own the Pipes about companies like AWS and TSMC and Stripe that created critical infrastructure to unlock entire industries, I wondered where the next pipes companies might come from. Biotech? AI?

Hugging Face looks a lot like the AI industry’s pipes company.

Community? It’s also a case for investing in community. Hugging Face didn’t generate any revenue for the first 5 years, instead focusing entirely on building the destination for AI researchers and developers. Now it’s positioned to become the leading provider of AI services to enterprises.


Product Spotlight

I stumbled on a few early products this week that are taking these tools and putting them to work. This is a Photoshop in-painting tool from Christian Cantrell:

Because Jack Sparrow deserved a parrot.

Soon in-painting and image generation will be productized and built into most design tools.


HOW IT WORKS

AI By AI

Just as important as access to AI for builders is the speed at which AI is evolving.

This week DeepMind revealed a new AI model for discovering new algorithms at the edges of mathematics. The first target is matrix multiplication, a decades-old challenge in math and a key component of AI itself.

Translation: AI is now coming up with algorithms that advance mathematics, which will in turn be used to build better AI models.
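For context, here's the textbook matrix multiplication algorithm (the baseline, not DeepMind's method). It uses n³ scalar multiplications; Strassen's 1969 algorithm gets a 2×2 block done in 7 multiplications instead of 8, and DeepMind's AlphaTensor hunts for schemes that use even fewer:

```python
# The naive algorithm: n*n*n scalar multiplications (8 for 2x2 matrices).
# Every multiplication an AI-discovered scheme can shave off makes the
# matrix math inside AI models themselves cheaper to run.
def matmul(A, B):
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]  # one scalar multiply per step
    return C

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```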

The Transformers

At the heart of this AI wave is a new class of AI models called Transformers.

Transformers are the architecture behind today's large language models and our best attempt so far at teaching computers language.

Using a method called self-attention, they deconstruct sentences and use math to map each word's relationship to the words around it. The model learns meaning through a word's context and its connections to other words, not unlike how we as humans wrap our heads around ideas.
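Here's a toy sketch of the self-attention idea in plain numpy. It leaves out the learned query/key/value projections, multiple heads, and stacked layers that real Transformers add on top, but it shows the core move: every word gets re-expressed as a weighted mix of the words around it.

```python
import numpy as np

def self_attention(X):
    """X: one row per word, each row an embedding vector."""
    scores = X @ X.T / np.sqrt(X.shape[1])         # how strongly does each word relate to each other word?
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax: each row sums to 1
    return weights @ X                             # each word becomes a context-weighted blend of the sentence

# Three toy "word" embeddings; real models learn these vectors from data.
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
print(self_attention(X))
```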

In the industry, these models are so important that researchers have started calling them Foundation Models. They can do a whole bunch of language-related tasks well:

From NVIDIA’s What is a Transformer Model?

That's how GPT-3 learned to generate text and how image generators like Stable Diffusion learned to understand prompts. Since bigger models perform better, there's currently an arms race for the biggest possible models.

GPT-3 has 175 billion parameters. NVIDIA and Microsoft hit a high-water mark in November with the Megatron-Turing Natural Language Generation model, at 530 billion parameters.

From Hugging Face's fantastic course on Transformers. Highlighted is 🤗's model DistilBERT, which aimed to make the model cheaper to run rather than bigger and more powerful.

Critically, once they're trained, these models generalize: zero-shot, they can handle tasks they were never explicitly trained on, and with just one or two examples of a new object, say your cat, they can be taught to generate images of it in all sorts of poses and scenes.
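Zero-shot is easy to see with Hugging Face's zero-shot-classification pipeline: the labels below are invented at call time, and the model was never trained on them (again assuming the transformers library is installed):

```python
from transformers import pipeline

classifier = pipeline("zero-shot-classification")

print(classifier(
    "I just adopted a kitten and she already rules the apartment.",
    candidate_labels=["pets", "finance", "sports"],
))
# -> scores ranking "pets" far above the other labels
```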

Artificial Scientists

The most promising area for AI might be science. If transformer models continue to get better at understanding language, they may soon be able to read and analyze scientific papers, connect the dots between research, and propose new theories and experiments.

There are big hurdles to cross, but Josh Nicholson, the founder of Scite, broke down the vision in Future:

Because scientific papers are not easily accessible, we can’t easily use the data to train generative models like GPT-3 or DALL-E.

Can you imagine if a researcher could propose an experiment and an AI model could instantly tell them if it had been done before (and better yet, give them the result)?

Then, once they have data from a novel experiment, the AI could suggest a follow-up experiment based on the result.

Finally, imagine the time that could be saved if the researcher could upload their results and the AI model could write the resulting manuscript for them.

The Bear Case

What are the biggest barriers to AI taking off?

Massive Amounts of Data → There are some things that aren't cataloged on the web. The founder of Scite points out that access to scientific research papers is highly regulated by journals and institutions. Medical data is the same. Text is the most common dataset on the web. After that might be video, which is why ByteDance's TikTok algo is so good. Other data is harder to come by in massive quantities.

We don't know what we want → AI is excellent at tightly defined tasks with clear victory conditions, which means that to use an AI you need to be ridiculously, frustratingly precise. If you don't believe me, try having a complex conversation with Alexa or Siri. Even if AI can generate anything, how will it know what to generate for us?

Bias → AI models learn from the datasets you give them. If you were to, say, train a model on a dataset of movies made before 1990, you can bet it's not going to generate a lot of strong female leads. Diversity would probably be even more lacking. Scraping the entire web to train a model means the model will come with all of the web's biases; it will learn to mimic all of our flaws. Researchers are working hard at filtering those biases out, but it's challenging. This unfortunately isn't a barrier to the technology taking off, but it is scary.

Request for Startups: Instead of making everyone get a specialty in prompt engineering, there's an opportunity for a company to sit between the AI and the user. For instance, an image generator startup could take a simple prompt from a user, "a cat on a bicycle," and then layer on dozens of other details to give it a specific style, as in the sketch below.
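Here's a hypothetical sketch of that intermediary layer. The preset names and wording are made up for illustration, not any real product's prompts:

```python
# A made-up "prompt intermediary": the user types a plain request, and the
# product expands it into a detailed prompt for an image model.
STYLE_PRESETS = {
    "storybook": "children's book illustration, soft watercolor, warm lighting",
    "product":   "studio photo, 85mm lens, shallow depth of field, clean background",
    "retro":     "1970s film poster, grainy, bold typography, saturated colors",
}

def expand_prompt(user_prompt: str, style: str = "storybook") -> str:
    """Turn a simple user prompt into a detailed, style-specific one."""
    return f"{user_prompt}, {STYLE_PRESETS[style]}, highly detailed"

print(expand_prompt("a cat on a bicycle"))
# -> "a cat on a bicycle, children's book illustration, soft watercolor, ..."
```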


VC FUNDING

Golden

Raise: $40M from Andreessen Horowitz

Notable: Marc Andreessen is on the board

One Liner: A new way to organize all knowledge on the internet

Golden is an AI company building a kind of hybrid between Google and Wikipedia. But unlike Wikipedia, which relies on an army of contributors to write articles, Golden uses AI to compile data from the web and generate articles. For instance, here's its page on synthetic biology.

Everything is AI-generated, which means it can cover any conceivable topic, and it's programmable, so you can use it to collect data on whatever you'd like. See the query table below:

But the UX isn't great. Personally, I'd rather use a combination of Google and Wikipedia than Golden. On the other hand, the front end is the easy part. An AI-generated catalog of all information is the hard part, and if it really is a decentralized protocol, maybe someone else can build a better UX on top of it?