April 6, 2025

48 thoughts on “@Asianometry & Dylan Patel – How the Semiconductor Industry Actually Works”

  1. This is some sage career advice! "It's really about: what are you good at, where can you vibe, enjoy your work, and where can you make an impact in society…what engages you? Because if you're interested in it, you'll work harder" – Dylan

  2. What the heck is half Taiwanese and half Chinese? Taiwanese are ethnically Chinese. You speak Mandarin. That makes you ethnically Chinese. Hell, the Dais, the Miao, the Uighurs and the Inner Mongols all don't speak Mandarin but they are sure as hell Chinese.

  3. Big cheers to Dwarkesh: throughout the interview with these two giants he's clearly done his research, yet he's constantly pushed to the edge of his knowledge, while still keeping them on semiconductor cloud nine.

  4. Heard of "Terence Tao", you dumbos? He's ethnically Chinese. Have the CIA/FBI harass him a few times and he'll be off to China, joining the rest of the Chinese Americans who were treated like WW2 Japanese Americans living in America.

    Your 'speculation' is weak, sons. Level up. There are much more insightful podcasts in China with more realistic analysis of the situation.

  5. Dylan Patel is way too smart to accidentally repeat Chinese propaganda…
    I really wonder why he is rooting for China, downplaying them as a threat while playing up their capabilities…

  6. I still don't understand. The massive amounts of GPUs are used to build base LLMs. Now the focus is on RLHF, and GPT o1's reasoning is a loop back from level 4 (according to Andrej Karpathy's 2023 Microsoft presentation). Groq, with their LPUs, works on inference with a different design. There is too much hype in building out thousands of GPUs and pumping AI and chip stocks.
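The training-versus-inference asymmetry this comment is gesturing at can be put on a napkin. Below is a minimal sketch using the standard rough approximations (training ≈ 6 × params × tokens FLOPs; inference ≈ 2 × params FLOPs per generated token); the model size and token count are illustrative assumptions, not figures from the episode.

```python
# Back-of-the-envelope compute comparison. The 6*N*D and 2*N rules of thumb
# are standard approximations; the numbers below are illustrative only.

params = 70e9          # hypothetical 70B-parameter model
train_tokens = 2e12    # hypothetically trained on 2T tokens

train_flops = 6 * params * train_tokens   # one-time training cost (~8.4e23)
flops_per_token = 2 * params              # per-token inference cost (~1.4e11)

# How many generated tokens equal the one-time training cost:
breakeven = train_flops / flops_per_token  # = 3 * train_tokens = 6e12
print(f"{breakeven:.1e} generated tokens match the training compute")
```

The point of the arithmetic: training is a huge one-time cost, but a popular model can serve far more than 3× its training tokens over its lifetime, which is why inference-specialized hardware like Groq's LPUs exists alongside GPU training clusters.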

  7. I kinda feel lucky to have the background that I have. Power Systems/Electronics, Digital IC design, Analog circuits, Control/Signals Systems, and lastly AI design. This episode is such an overload that it is giving me anxiety.

  8. Okay, dumb question: would a "folding at home" style of distributed compute among consumer hardware work for AI training? I don't know if there's any way to break a large training run down into packet-like segments, or if the model has to be trained all at once.
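A minimal sketch of the data-parallel idea the question is reaching for: split the dataset into shards, compute gradients on each shard independently (as separate volunteer machines would), then average them. The toy linear model and data here are made up; the math is exact for equal-size shards, and the real obstacle for a folding@home setup is the synchronization round needed after every step, not the splitting itself.

```python
# Toy data-parallel training step: the gradient of the total loss equals the
# average of per-shard gradients, so shards can be computed on separate
# machines and merged. The communication each step is the hard part.

def grad_mse(w, shard):
    # Gradient of mean squared error for a 1-D linear model y ~ w * x.
    n = len(shard)
    return sum(2 * (w * x - y) * x for x, y in shard) / n

data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2)]
w = 0.5

# Full-batch gradient (what one big machine would compute).
full = grad_mse(w, data)

# Split across two "volunteer" machines and average their gradients.
shards = [data[:2], data[2:]]
avg = sum(grad_mse(w, s) for s in shards) / len(shards)

assert abs(full - avg) < 1e-12  # identical: data parallelism is exact here
```

So the training set *can* be broken into packets; what consumer hardware over the internet struggles with is exchanging gigabytes of gradients every few seconds, which datacenter interconnects are built for.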

  9. I would think an AI recommender system consisting of, say, 7B SLMs would be able to split large problems into smaller ones. If you can pull in an AI for essentially any PhD-level specialized discipline, you could use a bunch of SLM agents to reduce your compute overhead instead of one massive model trying to encompass all information.

    As humans we work as a bunch of experts collaborating with other human experts to accomplish a complex task. I would think AI systems that mimic this behavior would be much more effective than one large model trained to do everything. So instead of one model responding to a prompt, the system would analyze the prompt for specific keywords and offload the response to a domain-specific model. For example, any coding-related question would be answered by a model trained only on code rather than natural language, hence the need for a massive number of smaller, domain-specific models.
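The keyword-dispatch step described above can be sketched in a few lines. This is a hypothetical illustration: the expert names (`code`, `math`, `general`) and keyword sets are invented, and a real system would use a learned router rather than keyword counting.

```python
import re

# Hypothetical keyword router: scan a prompt for domain keywords and hand it
# to the matching "expert" model instead of one giant generalist model.

ROUTES = {
    "code": {"python", "function", "compile", "bug", "api"},
    "math": {"integral", "proof", "equation", "theorem"},
}

def route(prompt: str) -> str:
    """Return the expert with the most keyword hits, else the general model."""
    words = set(re.findall(r"[a-z]+", prompt.lower()))
    best, hits = "general", 0
    for expert, keywords in ROUTES.items():
        n = len(words & keywords)
        if n > hits:
            best, hits = expert, n
    return best

print(route("Why does my Python function raise this bug?"))  # prints "code"
```

This is essentially a hand-rolled cousin of the mixture-of-experts idea, where the routing decision is itself learned end-to-end instead of keyword-based.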

  10. What is this fucking channel and how is this motherfucker getting these insane interviewees but not cracking 1m views on more than one video??? Bruh, is it thumbnails or something? Interview someone who intimately knows about the YT algos or something and fix your shit because this is wild stuff.

  11. @zerbaxa599 It's total misinformation. The dam is already completed and it doesn't have any security issue, unless Egypt plans to be wiped off the map; and to your surprise, there are billions of dollars coming in.
