Another great session with Bill Price talking about Silent Sufferers and the future of Customer-Managed Relationships (which is way more accurate than CRM or Customer Service)
Another test live show, but this time with some other interesting content: how I use Claude to do research.
This was more of a test podcast than anything else, where we are now streaming on LinkedIn Live, YouTube Live, and Twitch with rebroadcast on LinkedIn, YouTube, and the Podcast world.
But Seb does a good job talking about what it's like to be in college now, with Generative AI everywhere.
OK, an oldie but a goodie: this is from way back in May 2025, where we see how good our predictions really were back then.
Steve and Rich had a long chat way back in February, and I completely forgot to make it a podcast. So here it is; we will be catching up on episodes from the last six months, so sorry about that!
Life has been crazy busy!
Another great session. Thoughts on AI in 2024 and how well we did with predictions, then a look at the new innovations of 2025 and what's in store. See https://tongfamily.com/2025/01/26/pod-london-ai-hub-january-13-2025/ for more details
There's been a hiatus mainly because we've been shipping products, but more because I lost all my skills at making new videos. I was stuck for a long time on Drop Zones and doing a better intro and outro, but that's a digression for Final Cut Pro nerds.
In this episode, we cover the latest AI trends as of August 8, 2024. The big news has been the shipment of so many Large Language Models (LLMs) and what that means for AI: way more choice and confusion.
Plus the emergence of a much better set of tools that are also much smaller, called Small Language Models (SLMs), and Agents that chop the problem into many small pieces.
I also wanted to introduce Steven to the mix. We are going to have a rotating set of co-hosts and solo episodes as well, so we can get the content out on time and not take six weeks for post-production. Thanks to the new intro and outro, that should be easier. I'm playing with these a bunch, but check out https://tongfamily.com and https://tne.ai where I hang out a bunch!
This is another sort of nerdy side note. If anyone is still watching, this section is just to give intuition on the basics of the hardware. There are lots of assumptions about GPUs and CPUs that I wanted to make sure people understood. But the basics are that CPUs are tuned for lots of branches and different workflows, while GPUs are tuned for doing lots of the same thing, like matrix math. And because they are so fast, most of the job of the computer folks is "feeding the beast", that is, caching the most frequently used information so the processor doesn't have to wait. There are some errors, I think, in the levels of CPU and GPU performance, particularly the cache performance, as it is not very clear how this works and the results of course vary depending on the models of processors, so these are all approximations. Put better sources in the comments. I have all the sources listed in a spreadsheet that is part of this, and we are happy to send it to anyone who wants the source data. I'll fix these errors in later editions (as I'm obsessive that way).
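For the curious, here is a rough back-of-the-envelope sketch (not from the talk's spreadsheet) of where intuitions like "years to a single disk access" come from: a few lines of Python that scale memory-hierarchy latencies so that one CPU cycle counts as one second. The latency figures are ballpark assumptions, so the output won't match the episode's exact numbers.

```python
# Rough intuition only: scale memory-hierarchy latencies so one CPU cycle = 1 second.
# All nanosecond figures below are ballpark assumptions, not measured values.

CYCLE_NS = 0.3  # roughly one cycle on a ~3 GHz CPU
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

latencies_ns = {
    "L1 cache hit": 1,
    "L2 cache hit": 4,
    "L3 cache hit": 40,
    "Main memory (DRAM)": 100,
    "NVMe SSD read": 100_000,          # ~100 microseconds
    "Spinning disk seek": 10_000_000,  # ~10 milliseconds
}

for name, ns in latencies_ns.items():
    cycles = ns / CYCLE_NS  # how many cycles the CPU spends waiting
    scaled = cycles         # in the analogy, each cycle becomes one second
    if scaled < 60:
        human = f"{scaled:.0f} seconds"
    elif scaled < 3600:
        human = f"{scaled / 60:.1f} minutes"
    elif scaled < 86400:
        human = f"{scaled / 3600:.1f} hours"
    elif scaled < SECONDS_PER_YEAR:
        human = f"{scaled / 86400:.1f} days"
    else:
        human = f"{scaled / SECONDS_PER_YEAR:.1f} years"
    print(f"{name}: ~{cycles:,.0f} cycles of waiting ({human} if a cycle were a second)")
```

The point is just the shape of the gap: a cache hit is a few "seconds", DRAM is "minutes", and a spinning disk is on the order of "a year" at this scale, which is why so much of hardware design is about caching and keeping the beast fed.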
Also, I'm quite proud of the HDR mix: using the latest OBS settings, producing in HDR in Final Cut Pro, and adjusting the video scope levels all help. The audio is a little hot and I'm sorry about that; I'll turn it down next time, I spent too much time in the red. My Scarlett needs to be at about 1 o'clock and it works.
See https://youtu.be/FupclouzYTI for a video version. And more details at https://tongfamily.com/2024/03/08/pod-rt3-ai-hardware-introduction/
Chapters:
00:00 AI Hardware Introduction
00:42 Computer Engineering in Two Slides
05:40 It's 165 Years to a Single Disk Access?!!
14:12 Intel Architecture CPU
17:03 What's all this about NVidia
25:24 And now for something completely different, Apple
29:45 Introduction Summary
2024-03-08. Shot as UHD HEVC HDR PQ 10-bit using OBS Studio and Final Cut Pro. © 2024 Iron Snow Technologies, LLC. All Rights Reserved.
OK, we are not experts nor PhDs, so most of this is probably not technically correct, but the math and the concepts are so complicated that we thought it would be good to just get some intuition on what is happening. So this is a quick talk that summarizes readings from many different sources about the history of AI, from the 1950s all the way to January 2024 or so. It is really hard to keep up, but much harder without some intuition about what is happening. We cover expert systems, the emergence of neural networks, Convolutional Neural Networks, and Recurrent Neural Networks. The Attention Is All You Need paper led to Transformers, then some intuition on how such a simple idea, that is, training on examples, can lead to emergent and unexpected behaviors, and finally, some intuition on how generative images work.
You can go to YouTube to see the slides we are using, and find more information at Tongfamily.com
Chapters:
OK, this is a reboot of systems. And getting ready for next year, the new pipeline is ready and the most important thing is that the audio finally sounds better. Stay tuned for more soon!
The AI Adventurers return! We are back after two months off, retooling our brains and various ventures to be AI-Native. It's not easy, and we talk about why. What does it mean to look at your software people and decide if they are type 1, 2, or 3? And how hard it is to make it all work.
Show notes at Tongfamily.com
A conversation with Devindra on the possibilities of ChatGPT and other Large Language Models for education in India and other middle- and low-income countries. While these large language models today mainly work in the cloud and require high-speed internet connectivity, the promise is that smaller, more specialized models have already been ported to the MacBook Pro M1 and other laptops. And with advances like 4-bit quantization, there is the possibility they can even run on modern smartphones.
The experience in China and in India is that parents will save inordinately to give their children a better life than theirs. Both Devindra and I are beneficiaries of that thinking, and we are forever grateful to our parents for their sacrifice. So this means there is the possibility of a commercial incentive, which is much better than constantly asking for charitable dollars.
The new LLMs promise interactivity and individualized instruction that was impossible before. Early demonstrations of learning a foreign language are promising, as is the student being able to ask the "why" of how Python is structured, not just the "get it done" without understanding that online education often creates.
Finally, having an eye toward inclusivity for everyone of any gender, race, or social status is something that needs to be baked in. We look forward to helping!
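As a rough illustration of why 4-bit quantization matters for on-device models (this is my own back-of-the-envelope math, not something worked through in the episode, and the 7-billion-parameter model size is an illustrative assumption), here are a few lines of Python comparing how much memory a model's weights need at different precisions.

```python
# Back-of-the-envelope: memory needed for a model's weights at different precisions.
# The 7B parameter count is an illustrative assumption, not a specific model.

PARAMS = 7_000_000_000  # a 7-billion-parameter model

BITS_PER_WEIGHT = {
    "FP16 (typical cloud serving)": 16,
    "INT8 quantized": 8,
    "4-bit quantized": 4,
}

for name, bits in BITS_PER_WEIGHT.items():
    gigabytes = PARAMS * bits / 8 / 1e9  # bits -> bytes -> gigabytes
    print(f"{name}: ~{gigabytes:.1f} GB of weights")

# FP16 needs ~14 GB, which rules out most phones, while 4-bit is ~3.5 GB,
# close to what a modern smartphone with 8 GB of RAM could plausibly hold.
```

That roughly 4x shrink in weight memory is the intuition behind the claim that specialized models could run on laptops today and smartphones soon.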
This time, we bumble through an episode featuring the amazing Mike Conte, and we talk about what's wrong with this podcast technically (see below), about ChatGPT, and about how you don't want to show others the Money. Mike is, as always, hilarious! OK, our exploration of podcasting continues with tech glitches galore. First of all, apologies to our 15 viewers. Please use this version, as the original audio had terrible cutouts because I was messing with the audio; this should work better (and it has the brightness turned down from 400 nits to 300 nits, which helps a little on TV-type HDR systems).
a) The segment producer problem, and a move to a news section followed by a special topic; the news will be on cool things in tech, from AI to gadgets to smart home (think an opinionated Verge).
b) Audio: yes, there are lots of glitches here. I had not realized that OBS really requires all manual controls; I just want Automatic Gain Control and radio voice, but I'm learning. The audio in this one is very quiet because I had the gain stage turned up but didn't realize the compressor also had gain, and the Final Cut Pro gain is only 12 dB. Also, I didn't realize how the noise suppressor and gain work on low signals (it cuts them out), and Zoom is very low.
c) Video: I've been mastering in HDR, and the default in OBS is 400 nits peak brightness. While this looks awesome on iPhone, Mac, and iPad (let's call them "real" HDR devices with 1000 peak nits), it looks dreadful on lesser HDR devices like my LG B9, which is at most 400 nits. So I'll probably settle at, say, 300 nits next time, but most people, I think, are just getting the transcoded SDR Rec.709 output.
See https://tongfamily.com/2023/03/27/pt5-chatgpt-and-mike-conte-special-guest-and-tech-glitches-galore/ for show notes
Well, in these incredibly turbulent times, what's the best way to deal with venture capitalists and other investors, particularly if you are early stage? In places like Europe, there is a real zone between $50K and $1M that is hard to cross, so should you spray and pray or be more targeted? Well, who knows what the right answer is, but there are probably a couple of things that help.
Answer the basic questions in your presentations and you can use the Sequoia template as an example. See https://perfectpitchdeck.com/2018/01/30/sequoia-capital-pitch-deck-template
But the harder part is realizing that you are not trying to sell an investor, but asking, "What does the customer want to buy?" In that way, this is no different from any marketing exercise. I can't find the exact quote anymore, but Peter Drucker, the father of modern management, did say that marketing does not ask, "What do we want to sell?" It asks, "What does the customer want to buy?" So when you are meeting an investor, that's the key question.
The good news is that the investor, unlike almost any other customer, is completely activated: they want to buy your product, and you just have to find out what's in the way. The easiest way to do that is just to ask them. So, for instance, if you have a construction tech company, just ask, "So what do you think about construction tech, is it a good category?" And you will be amazed at what they will tell you. Don't ignore it; make their points the crux of your pitch. There will be lots of objections, and the mark of a good plan is that you've thought about the same things, so what are you doing about it?
Show notes at https://tongfamily.com/2023/03/10/dt2-fueling-your-startup-with-great-investors/