The Build - Ai dev and product show.
Cameron Rohn and Tom Spencer
19 episodes
2 days ago
Weekly deep dives on the most interesting dev, AI, and product releases, research updates, and emerging trends in AI engineering, agent development, and the software industry.
Technology
EP 11 - OpenAI OSS via OpenCode, LangChain Open SWE, local inference with Ollama Turbo, virtual audience content testing
The Build - Ai dev and product show.
1 hour 39 minutes 15 seconds
3 months ago

We recorded on August 7th, right before the ChatGPT launch.

We dove into GPT-OSS (OpenAI's open-weight models), OpenCode, Ollama Turbo, and deep agent setups.

I wanted to see LangChain's Open SWE and test agent environments.


OpenCode stood out for its flexibility — multiple model providers, easy local setup, works with Ollama Turbo for $20/month.

LM Studio runs similarly.
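
For anyone following along at home: both Ollama and LM Studio expose OpenAI-compatible local endpoints, so pointing an existing client at a local model is mostly a base-URL swap. Here is a minimal sketch in Python, assuming Ollama's default port (11434), LM Studio's usual default (1234), and a gpt-oss model tag you have already pulled; adjust for your own setup.

```python
# Minimal sketch: talk to a local model through Ollama's OpenAI-compatible API.
# Assumes `ollama pull gpt-oss:20b` has been run; swap base_url to
# http://localhost:1234/v1 to hit LM Studio's local server instead.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's default local endpoint
    api_key="ollama",                      # placeholder; local servers ignore it
)

resp = client.chat.completions.create(
    model="gpt-oss:20b",  # whatever model tag you pulled locally
    messages=[{"role": "user", "content": "Summarize what OpenCode does in one line."}],
)
print(resp.choices[0].message.content)
```

The same call works against LM Studio by swapping the base URL, which is part of why the two felt interchangeable in the demo.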

I’m considering a high-spec NVIDIA rig and DGX Spark for local inference.


GPT-OSS is cheap, fast, and excellent for coding and tool-calling, but weaker on general knowledge.

Running it locally means more setup work but more control.

Hybrid local-plus-cloud routing feels inevitable.
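
As a rough illustration of what that routing could look like, here is a toy sketch: coding-flavored prompts go to a local GPT-OSS instance, everything else to a hosted model. The helper names (route_chat, is_code_task) and the keyword heuristic are invented for this example, not taken from any particular framework.

```python
# Toy sketch of hybrid local-plus-cloud routing (illustrative names only).
from openai import OpenAI

local = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")  # local GPT-OSS via Ollama
cloud = OpenAI()  # hosted provider; reads OPENAI_API_KEY from the environment

def is_code_task(prompt: str) -> bool:
    # Crude heuristic for the demo; in practice you might use a classifier or explicit flags.
    return any(k in prompt.lower() for k in ("refactor", "write a function", "stack trace", "unit test"))

def route_chat(prompt: str) -> str:
    if is_code_task(prompt):
        client, model = local, "gpt-oss:20b"   # cheap, fast, strong at coding and tool use
    else:
        client, model = cloud, "gpt-4o-mini"   # fall back to the cloud for general knowledge
    resp = client.chat.completions.create(model=model, messages=[{"role": "user", "content": prompt}])
    return resp.choices[0].message.content

print(route_chat("Write a function that parses RSS feeds."))
```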


We demoed the Open Agent Platform: fast, multi-provider agents without writing code.

Then we explored LangChain's Open SWE: an open-source, multi-threaded coding agent with planner/programmer loops, GitHub integration, Daytona sandboxes, and detailed token-cost tracking.
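
The planner/programmer split is the interesting bit, so here is our own stripped-down sketch of the pattern: one model call drafts a step list, then a second loop executes the steps one at a time with the running context. This is an illustration of the idea, not Open SWE's actual code, and the helper names (ask, plan, run) are made up for it.

```python
# Simplified planner/programmer loop (illustrative only; Open SWE's real implementation
# adds GitHub integration, Daytona sandboxes, and per-step token/cost tracking).
from openai import OpenAI

client = OpenAI()

def ask(system: str, user: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "system", "content": system}, {"role": "user", "content": user}],
    )
    return resp.choices[0].message.content

def plan(task: str) -> list[str]:
    # Planner pass: break the task into short, self-contained steps, one per line.
    text = ask("You are a planner. Return one step per line, nothing else.", task)
    return [line.strip() for line in text.splitlines() if line.strip()]

def run(task: str) -> None:
    history = ""
    for step in plan(task):
        # Programmer pass: execute one step at a time, carrying the running context forward.
        result = ask("You are a programmer. Produce code or a diff for this step.",
                     f"Task: {task}\nDone so far:\n{history}\nCurrent step: {step}")
        history += f"\n- {step}: {result[:200]}"
        print(f"== {step} ==\n{result}\n")

run("Add retry logic to the HTTP client in utils/http.py")
```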


We looked at Vercel’s v0 API for quick generative UI, and the potential to run it privately for internal teams.
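
For the v0 piece, Vercel serves the v0 model behind an OpenAI-compatible API, so the same client pattern works. The base URL and model name below (api.v0.dev, v0-1.0-md) and the V0_API_KEY variable are our assumptions based on Vercel's docs; check the current v0 API documentation before relying on them.

```python
# Sketch: asking v0 for a quick generative UI component through its
# OpenAI-compatible endpoint. Base URL, model name, and env var are assumptions;
# verify against Vercel's current v0 API docs.
import os
from openai import OpenAI

v0 = OpenAI(base_url="https://api.v0.dev/v1", api_key=os.environ["V0_API_KEY"])

resp = v0.chat.completions.create(
    model="v0-1.0-md",
    messages=[{"role": "user", "content": "A pricing table with three tiers, Tailwind + React."}],
)
print(resp.choices[0].message.content)  # returns React/Tailwind code you can drop into a project
```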

I closed with Google’s upcoming AI-mode ads and Societies.io — a virtual audience simulation tool for testing and optimizing content before publishing.


Chapters


00:00 Introduction to ChatGPT Launch and Demos

01:40 Exploring OpenCode and LangChain

04:37 Local Inference and Ollama Integration

07:25 Cloud Acceleration with Turbo Service

10:11 Open Source Model Benchmarks and Feedback
