
In this episode, we cover:
(00:00) Introduction to the AI Silicon Valley podcast and hosts Eric and Rajesh
(01:00) Highlights from NVIDIA GTC and Jensen Huang’s keynote
(04:15) The rise of NVIDIA as the “Intel of AI” and GPU-to-GPU communication with Dynamo
(07:40) DeepSeek’s cost-efficiency breakthrough and implications for AI infrastructure
(11:30) The Jevons paradox and why cheaper compute may increase AI demand
(15:20) Introduction to the Model Context Protocol (MCP)
(17:02) How MCP differs from traditional APIs and the role of natural language interfaces
(21:48) Zapier’s shift toward MCP and protocol standardization
(24:10) Anthropic’s open-source approach to MCP and its origin
(27:05) Practical enterprise applications of MCP
(30:22) PipeIQ use cases and integrations with services like Salesforce, Gong, and Snowflake
(35:01) Internal vs. external MCP servers and the rise of natural language endpoints
(38:35) The future of low-code/no-code powered by MCP
(41:00) VMware and infrastructure orchestration via natural language
(44:44) The shift from software coders to software architects
(47:10) Enterprise potential: enabling citizen developers through MCP
(50:05) Emerging platforms like Mintlify and automatic MCP generation from documentation
(53:00) The future of marketing and AI-driven content creation
(56:40) Comparing MCP to APIs and the role of context-aware AI coordination
(01:00:08) The Silicon Valley gold rush moment for MCP adoption
(01:03:00) Wrapping up and what’s next for MCP demos