OpenAssistant: The Open-Source ChatGPT Alternative, with Dr. Yannic Kilcher


In this episode of SuperDataScience, hosted by our Chief Data Scientist, Jon Krohn, Yannic Kilcher shares where the biggest A.I. opportunities are in the coming years. Yannic is a famed machine learning YouTuber and the creator of OpenAssistant, the best-known open-source conversational A.I.

If you’re not already aware of him, Dr. Yannic:
• Has over 230,000 subscribers on his machine learning YouTube channel.
• Is the CTO of DeepJudge, a Swiss startup that is revolutionizing the legal profession with A.I. tools.
• Led the development of OpenAssistant, a leading open-source alternative to ChatGPT that has over 37,000 stars on GitHub (see the sketch after this list).
• Holds a PhD in A.I. from the outstanding Swiss technical university, ETH Zürich.
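
To give a feel for what the OpenAssistant project produced, here is a minimal sketch of chatting with one of its released models. The Hugging Face Hub ID and the prompter/assistant prompt tokens below follow the oasst-sft Pythia model cards; the generation settings are illustrative assumptions, and running a 12B-parameter model comfortably assumes a large GPU:

```python
# A minimal sketch of querying an OpenAssistant SFT model from the
# Hugging Face Hub; model ID and settings are illustrative assumptions.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5",
)

# The OpenAssistant SFT models wrap conversation turns in special tokens.
prompt = "<|prompter|>What is an open-source LLM?<|endoftext|><|assistant|>"
reply = generator(prompt, max_new_tokens=100, do_sample=True, temperature=0.7)
print(reply[0]["generated_text"])
```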

Despite Yannic's deep technical expertise, most of this episode should be accessible to anyone who's interested in A.I., whether you're a hands-on practitioner or not.

In this episode, Yannic details:
• The behind-the-scenes stories and lasting impact of his OpenAssistant project.
• The technical and commercial lessons he’s learned while growing his A.I. startup.
• How he stays up to date on ML research.
• The important, broad implications of adversarial examples in ML.
• Where the biggest opportunities are in A.I. in the coming years.

The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.

Getting Value From A.I.

In February 2023, our Chief Data Scientist, Jon Krohn, delivered this keynote on “Getting Value from A.I.” to open the second day of Hg Capital’s “Digital Forum” in London.

read full post

The Chinchilla Scaling Laws

The Chinchilla Scaling Laws dictate the amount of training data needed to optimally train a Large Language Model (LLM) of a given size: roughly 20 training tokens per model parameter. For Five-Minute Friday, our Chief Data Scientist, Jon Krohn, covers this ratio and the LLMs that have arisen from it.
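
As a back-of-the-envelope illustration, here is that rule of thumb in code. The ~20-tokens-per-parameter ratio is the commonly cited takeaway from the Chinchilla paper (Hoffmann et al., 2022); the helper function and example model sizes are our own:

```python
# Back-of-the-envelope Chinchilla estimate: roughly 20 training tokens
# per model parameter for compute-optimal training (Hoffmann et al., 2022).
TOKENS_PER_PARAMETER = 20  # approximate ratio from the Chinchilla paper

def chinchilla_optimal_tokens(n_parameters: float) -> float:
    """Estimate the compute-optimal training-token budget for a model size."""
    return TOKENS_PER_PARAMETER * n_parameters

for n_parameters in (7e9, 70e9):
    tokens = chinchilla_optimal_tokens(n_parameters)
    print(f"{n_parameters / 1e9:.0f}B parameters -> ~{tokens / 1e12:.2f}T tokens")
# Chinchilla itself fits this ratio: 70B parameters trained on ~1.4T tokens.
```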

read full post

StableLM: Open-Source “ChatGPT”-Like LLMs You Can Fit on One GPU

The folks who open-sourced Stable Diffusion have now released “StableLM”, their first language models. Pre-trained on an unprecedented amount of data for single-GPU LLMs (1.5 trillion tokens!), these models are small but mighty.
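
As a minimal sketch of trying one of these models on a single GPU, assuming the Hugging Face Hub ID stabilityai/stablelm-base-alpha-7b and an environment with the transformers and accelerate libraries installed:

```python
# A minimal sketch of loading a StableLM-Alpha checkpoint on one GPU;
# the Hub ID and generation settings are assumptions, not official guidance.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-base-alpha-7b"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit on a single GPU
    device_map="auto",          # requires the accelerate library
)

inputs = tokenizer("Large language models are", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```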

read full post