Content provided by Stewart Alsop. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Stewart Alsop or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://vi.player.fm/legal.

Episode #510: Open Source, Open Minds: a Conversation with Dax Raad on the Future of Coding

57:32

On this episode of Crazy Wisdom, I, Stewart Alsop, sit down with Dax Raad, co-founder of OpenCode, for a wide-ranging conversation about open-source development, command-line interfaces, the rise of coding agents, how LLMs change software workflows, the tension between centralization and decentralization in tech, and even what it’s like to push the limits of the terminal itself. We talk about the future of interfaces, fast-feedback programming, model switching, and why open-source momentum—especially from China—is reshaping the landscape. You can find Dax on Twitter and check an example of what can be done using OpenCode in this tweet.

Check out this GPT we trained on the conversation

Timestamps

00:00 Stewart Alsop and Dax Raad open with the origins of OpenCode, the value of open source, and the long-tail problem in coding agents.
05:00 They explore why command line interfaces keep winning, the universality of the terminal, and early adoption of agentic workflows.
10:00 Dax explains pushing the terminal with TUI frameworks, rich interactions, and constraints that improve UX.
15:00 They contrast CLI vs. chat UIs, discuss voice-driven reviews, and refining prompt-review workflows.
20:00 Dax lays out fast feedback loops, slow vs. fast models, and why autonomy isn’t the goal.
25:00 Conversation turns to model switching, open-source competitiveness, and real developer behavior.
30:00 They examine inference economics, Chinese open-source labs, and emerging U.S. efforts.
35:00 Dax breaks down incumbents like Google and Microsoft and why scale advantages endure.
40:00 They debate centralization vs. decentralization, choice, and the email analogy.
45:00 Stewart reflects on building products; Dax argues for healthy creative destruction.
50:00 Hardware talk emerges—Raspberry Pi, robotics, and LLMs as learning accelerators.
55:00 Dax shares insights on terminal internals, text-as-canvas rendering, and the elegance of the medium.

Key Insights

  1. Open source thrives where the long tail matters. Dax explains that OpenCode exists because coding agents must integrate with countless models, environments, and providers. That complexity naturally favors open source, since a small team can’t cover every edge case—but a community can. This creates a collaborative ecosystem where users meaningfully shape the tool.
  2. The command line is winning because it’s universal, not nostalgic. Many misunderstand the surge of CLI-based AI tools, assuming it’s aesthetic or retro. Dax argues it’s simply the easiest, most flexible, least opinionated surface that works everywhere—from enterprise laptops to personal dev setups—making adoption frictionless.
  3. Terminal interfaces can be richer than assumed. The team is pushing TUI frameworks far beyond scrolling text, introducing mouse support, dialogs, hover states, and structured interactivity. Despite constraints, the terminal becomes a powerful “text canvas,” capable of UI complexity normally reserved for GUIs.
  4. Fast feedback loops beat “autonomous” long-running agents. Dax rejects the trend of hour-long AI tasks, viewing it as optimizing around model slowness rather than user needs. He prefers rapid iteration with faster models, reviewing diffs continuously, and reserving slower models only when necessary.
  5. Open-source LLMs are improving quickly—and economics matter. Many open models now approach the quality of top proprietary systems while being far cheaper and faster to serve. Because inference is capital-intensive, competition pushes prices down, creating real incentives for developers and companies to reconsider model choices.
  6. Centralization isn’t the enemy—lack of choice is. Dax frames the landscape like email: centralized providers dominate through convenience and scale, but the open protocols underneath protect users’ ability to choose alternatives. The real danger is ecosystems where leaving becomes impossible.
  7. LLMs dramatically expand what individuals can learn and build. Both Stewart and Dax highlight that AI enables people to tackle domains previously too opaque or slow to learn—from terminal internals to hardware tinkering. This accelerates creativity and lowers barriers, shifting agency back to small teams and individuals.
