Computational Mathematics and Fluid Dynamics, with Prof. Margot Gerritsen


In this SuperDataScience episode hosted by our Chief Data Scientist, Jon Krohn, the extremely intelligent and super delightful Prof. Margot Gerritsen returns to the show to introduce what computational mathematics is, detail countless real-world applications of it, and relate it to the field of data science.

Margot:
• Has been faculty at Stanford University for more than 20 years, including eight years as Director of the Institute for Computational and Mathematical Engineering.
• In 2015, co-founded Women in Data Science (WiDS) Worldwide, an organization that supports, inspires, and lowers barriers to entry for women through over 200 chapters in more than 160 countries.
• Hosts the corresponding Women in Data Science podcast.
• Holds a PhD from Stanford in which she focused on Computational Fluid Dynamics — a passion she has retained throughout her academic career.

In this episode, Margot details:
• What computational mathematics is.
• How computational math is used to study fluid dynamics, with fascinating in-depth examples across traffic, water, oil, sailing, F1 racing, the flight of pterodactyls and more.
• Synaesthesia, a rare perceptual phenomenon that in her case means she sees numbers in specific colors, and how this relates to her lifelong interest in math.
• The genesis of her Women in Data Science organization and the impressive breadth of its global impact today.


The SuperDataScience podcast is available on all major podcasting platforms, YouTube, and at SuperDataScience.com.


Getting Value From A.I.

In February 2023, our Chief Data Scientist, Jon Krohn, delivered this keynote on “Getting Value from A.I.” to open the second day of Hg Capital’s “Digital Forum” in London.


The Chinchilla Scaling Laws

The Chinchilla Scaling Laws dictate the amount of training data needed to optimally train a Large Language Model (LLM) of a given size. For Five-Minute Friday, our Chief Data Scientist, Jon Krohn, covers this ratio and the LLMs that have arisen from it.
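
As a rough illustration of that ratio, here is a minimal Python sketch of the widely cited Chinchilla rule of thumb of roughly 20 training tokens per model parameter (Hoffmann et al., 2022). The function name and the flat 20:1 constant are illustrative simplifications of the paper's fitted scaling laws, not a definitive implementation.

```python
# A minimal sketch of the Chinchilla rule of thumb: a compute-optimal
# LLM is trained on roughly 20 tokens per parameter (Hoffmann et al., 2022).
# The flat 20:1 ratio approximates the paper's fitted scaling laws.
TOKENS_PER_PARAMETER = 20  # approximate compute-optimal ratio

def chinchilla_optimal_tokens(n_parameters: float) -> float:
    """Approximate compute-optimal training-set size, in tokens."""
    return TOKENS_PER_PARAMETER * n_parameters

# Chinchilla itself: a 70B-parameter model trained on ~1.4T tokens (20:1)
print(f"{chinchilla_optimal_tokens(70e9):.1e} tokens")  # -> 1.4e+12 tokens
```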


StableLM: Open-Source “ChatGPT”-Like LLMs You Can Fit on One GPU

The folks who open-sourced Stable Diffusion have now released “StableLM”, their first language models. Pre-trained on an unprecedented amount of data for single-GPU LLMs (1.5 trillion tokens!), these models are small but mighty.
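
For readers who want to try one of these models themselves, here is a hedged sketch using the Hugging Face Transformers library. The model ID below is assumed from Stability AI's initial alpha release (check https://huggingface.co/stabilityai for current names), and `device_map="auto"` additionally requires the `accelerate` package.

```python
# A hedged sketch of running a StableLM alpha model with Hugging Face
# Transformers. The model ID is assumed from Stability AI's initial
# open-source release; verify it at https://huggingface.co/stabilityai.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-base-alpha-7b"  # assumed repo name

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" places weights on the available GPU (needs `accelerate`)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Data science is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```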
