Takeaways from the AI Engineer Summit
Last weekend, Oct 8-11, the AI Engineer Summit took place in SF. It was a unique, well-organized event that drew many high-caliber people.
Here are some surprises, hot takes, and learnings I took away from the event.
Hot Take: TypeScript will become the AI programming language. I’ve heard this from multiple people. Some think that even AI model development will happen in TypeScript. Even if that doesn’t come to pass, it’s already clear that more AI apps are built in TypeScript than in Python: JavaScript/TypeScript is needed to build a web app anyway, and writing everything in one language makes the codebase easier to maintain.
Many folks use LangChain for prototyping but still roll their own solution in production. That’s for a few reasons:
The abstractions might not yet be right
They want more control
AutoGPT is so far the jack of all trades, master of none. While it’s very promising and exciting, folks apparently haven’t been able to get good results yet for the tasks they tried. The AutoGPT team and other agent builders realized they were each creating their own internal protocols, which made it hard for other agents or AI tools to interoperate. That’s why they launched the Agent Protocol: an interoperability layer for all agents and the tools around them. I have full respect for the AutoGPT folks; they know they’re trying to solve a tough problem, AGI. I’m curious to see where this is headed, given how many labs are racing to get there first.
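To make the interoperability idea concrete, here is a minimal TypeScript sketch of a client driving an agent through an Agent Protocol-style REST interface. The base URL is hypothetical, and the endpoint paths and field names reflect my reading of the spec, so treat this as an illustration rather than a reference client.

```typescript
// Minimal sketch of driving an agent over an Agent Protocol-style REST API.
// The base URL is hypothetical; endpoint paths and field names follow my
// reading of the spec and may differ from the current version.

interface Task {
  task_id: string;
}

interface Step {
  step_id: string;
  output?: string;
  is_last: boolean;
}

const AGENT_URL = "http://localhost:8000"; // wherever the agent is served

async function runTask(input: string): Promise<void> {
  // Create a task for the agent to work on.
  const taskRes = await fetch(`${AGENT_URL}/ap/v1/agent/tasks`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ input }),
  });
  const task = (await taskRes.json()) as Task;

  // Execute steps until the agent reports it is done.
  let done = false;
  while (!done) {
    const stepRes = await fetch(
      `${AGENT_URL}/ap/v1/agent/tasks/${task.task_id}/steps`,
      { method: "POST", headers: { "Content-Type": "application/json" }, body: "{}" }
    );
    const step = (await stepRes.json()) as Step;
    console.log(step.output);
    done = step.is_last;
  }
}

runTask("Plan a weekend itinerary for SF.");
```

The appeal is that any tool or UI speaking this shape of API could swap one agent for another without custom glue code.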
Some folks argued that it might be too early to build AI dev tools, as we don’t yet know the real use cases in businesses. Instead, go solve problems for businesses today and then look for the commonalities. In that spirit, rather than trying to build a generic mega-agent, a few folks I met are building agents for specific use cases now, such as customer support.
I asked folks what their favorite talk was, and a few people mentioned Climbing the Ladder of Abstraction by Amelia Wattenberger. She introduced two quite exciting concepts:
Augmentation through automation: Say our job is to deal with Excel and we need to manually copy and fill specific cells. A tool that automates those micro-tasks for us would, on a higher level, augment us.
Ladder of Abstraction: If you zoom out in Google Maps, you can no longer see every street and house; you’re at a different level of abstraction. The same applies to text: Amelia showed a demo where you could “zoom out” of a document by choosing higher and higher levels of abstraction (I sketch the idea in code below).
From there, she showed a fascinating example of how an Airbnb booking experience could look if we could zoom out on the things that matter to us. You can watch her full talk here.
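To make the zooming idea a bit more concrete, here is a toy TypeScript sketch of re-rendering the same text at progressively higher levels of abstraction with an LLM. It’s my own reconstruction of the concept, not Amelia’s implementation, and it assumes the OpenAI Node SDK with an API key in the environment.

```typescript
// A rough sketch of "zooming out" on text: ask an LLM to re-render the same
// document at progressively higher levels of abstraction. This is my own toy
// reconstruction of the idea from the talk, not Amelia's implementation.
import OpenAI from "openai";

const client = new OpenAI(); // assumes OPENAI_API_KEY is set in the environment

// Level 0 = original text; higher levels = shorter, more abstract renderings.
const LEVELS = [
  "one paragraph that keeps the key details",
  "three bullet points capturing the main ideas",
  "a single headline-length sentence",
];

async function zoomOut(text: string, level: number): Promise<string> {
  if (level === 0) return text;
  const instruction = LEVELS[Math.min(level, LEVELS.length) - 1];
  const res = await client.chat.completions.create({
    model: "gpt-4",
    messages: [
      { role: "system", content: "You compress text without changing its meaning." },
      { role: "user", content: `Rewrite the following as ${instruction}:\n\n${text}` },
    ],
  });
  return res.choices[0].message.content ?? "";
}

// Usage: slide the "zoom level" up to move up the ladder of abstraction.
const doc = "A long Airbnb listing description goes here...";
zoomOut(doc, 2).then(console.log);
```

In a UI, a slider could map directly to the zoom level, which is roughly the interaction the demo showed for text.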
These are just some of the takeaways from the event; much, much more happened. Charlie Guo did a great job of giving a more comprehensive summary of Day 1 and Day 2. The livestreams of Day 1 and Day 2 are also available.
TL;DR: It was very much worth attending, and I’m looking forward to the next event in the spring!