June 2nd, 2023 × #AI #GitHub #Developers #Engineering #Innovation
Supper Club × Matt Rothenberg and Idan Gazit on GitHub Next
GitHub's R&D team discusses pioneering AI developer tools like Copilot and shares insights on creating reliable, intuitive experiences. They examine challenges like latency, trust, feedback and rethinking workflows in light of AI capabilities.
- GitHub Next started as the Office of the CTO to explore risky, long-term technology bets for GitHub.
- GitHub Next explores technologies that engineering teams avoid due to risk, like AI that can write code.
- Copilot started small but exploded in popularity when released publicly in June 2021.
- The team was initially skeptical Copilot would be useful but were convinced after solving problems like reliability and latency.
- Larger AI models enable conversation-like persistence of context and information.
- Fast response time was critical for Copilot adoption as people wouldn't wait for suggestions.
- Code structuring like hoisting is done to prompt models effectively.
- Prompt crafting is an art with trial and error, not pure deterministic programming.
- Smaller models can be better for some uses like Copilot where latency is critical.
- The team tries not to take negative feedback personally but sees it as input to improve.
- New VS Code APIs were needed to enable Copilot features like ghost text.
- AI will augment developers, not replace them, by assisting on tedious tasks.
- Copilot for pull requests summarizes changes and generates descriptions to help collaboration.
- Reliability remains a challenge for more ambitious AI tasks like full PR generation.
- Workflow improvements may require rethinking existing software practices in light of new AI capabilities.
- GitHub Next establishes trust and delivers value to bring other teams on board with their ideas.
- They use a variety of keyboards, themes and terminals but value speed and usability.
- They highlighted accessibility improvements and collaborative editing as areas for AI assistance.
- Many are innovating in CLI AI space and sharing ideas openly.
Transcript
Announcer
I sure hope you're hungry.
Announcer
Hoo. I'm starving.
Announcer
Wash those hands, pull up a chair, and secure that feed bag, because it's time to listen to Scott Tolinski and Wes Bos attempt to use human language to converse with and pick the brains of other developers. I thought there was gonna be food, so buckle up and grab that old handle because this ride is going to get wild.
Announcer
This is the Syntax Supper Club.
Wes Bos
Welcome to Syntax, the podcast with the tastiest treats out there. Today, we've got a show we have been working on getting for probably six months now, and I am so stoked to have it on. We are talking to the folks from GitHub Next, specifically around the big projects like GitHub Copilot and all the other crazy stuff that they've been cranking out. I've been really enjoying all the kind of interesting projects that they've been working on. So we're going to talk to them about that today.
Wes Bos
Welcome. We have Matt Rothenberg and Idan Gazit. Hey. Hey. Welcome. Welcome, guys.
Wes Bos
Why don't we do a quick introduction of who you folks are? We'll start with Matt and then go to Idan, and then we'll get into all the different stuff we wanna talk about.
Guest 3
Awesome. So, yeah, I'm Matt. I've been on the team now for about two and a half years, based in New York City, and I come from more of a design and front end engineering background, not so much an ML background. And as you'll hear throughout the conversation, we have quite a diverse set of skills on the team, and I definitely round out more of the visual, sort of experiential stuff. Hey. I'm Idan, and
Guest 4
I have been at Next since... since Next started, more or less. I joined very early in its history. And I also come originally from sort of a full stack, front end, back end background.
Guest 4
Prior to GitHub, I was at Heroku.
Guest 4
I used to do Python, and now I do JavaScript. Or, really, I just do email, because that's what managers do.
Guest 4
But, yeah, I've sort of been along for the entire ride of GitHub Next from its inception.
Scott Tolinski
So that's a good lead into, I guess, our first question, which is GitHub Next. How did it start? Do you wanna maybe get into some of the history of this project? When, how, why does this thing exist?
Guest 4
Yeah, I'm happy to. So, it didn't start out life as GitHub Next. It started out as the Office of the CTO. Jason Warner was the SVP of engineering. He ran all of engineering at GitHub, so, like, you know, a thousand reports, responsible for day-to-day execution in engineering, but he wanted to focus more on sort of future-facing bets. Like, what's going to have impact on the business beyond the adjacent possible? And so he traded his SVP role for the CTO role and started what was the Office of the CTO.
GitHub Next started as the Office of the CTO to explore risky, long-term technology bets for GitHub.
Guest 4
And at the time, it was, you know, on the order of five people: Oege de Moor and a few others.
Guest 4
And the idea was to chase bets, to chase risky bets. Like, engineering orgs everywhere are incentivized to avoid risk. Right? So, like, every project should succeed, we should know exactly how we're gonna do it, we should know exactly how long it's gonna take, yada yada.
GitHub Next explores technologies that engineering teams avoid due to risk, like AI that can write code.
Guest 4
But the problem is that as an org grows, eventually you sort of weed out the desire to make bets from the business, and the risky things get passed over, over and over. So that was the idea of the Office of the CTO: to be sort of like a counterweight to that. In Microsoft terms, they call these, like, horizon-three opportunities, things that are sort of over the horizon.
Guest 4
Ridiculous things. Things where it's like, we're not even sure if it's possible, and if it is possible, how we would go about doing it. Like, what if we had an AI that could write code that doesn't suck? You know? That was ridiculous when we started it out. So, yeah, that's where it started. And over time, the team grew, but not that much. We're about 20 people now. It's not huge. And along the way, Jason departed. He's now a venture capitalist slash thought leader. And because we don't have a CTO, we can't be the Office of the CTO. They were gonna call us Innovation Lab, but that doesn't fit into a Twitter handle. So, thankfully, they let us stick with GitHub Next. And, yeah, now we're still firing on all cylinders, exploring stuff, hopefully supplying a map of the territory to the rest of the business. Our function is to qualify bets, to prove them out, to prototype them, not just write papers about what might be, but to actually build proofs of concept and hand them over to engineering, ideally with proof that we found some kind of product-market fit. It's a little bit like a startup inside GitHub that's constantly trying stuff out, handing it off, and going back to the drawing board. Awesome. How do you even come up with an idea of something that
Guest 3
might pop up? Like, do you guys just sit around throwing a basketball at a hoop and riffing on ideas? I think to some degree, yeah. It's pretty bottom-up at GitHub Next. I wouldn't say there's any real top-down influence where an executive or a leader says you must do X, Y, Z thing that has AI or ML, because the team has such a diverse set of skill sets on it. Right? You have people who have done PhDs in natural language processing, people that have done ML research for years. You have this really strong wealth of ideas swimming around in anyone's head at any given time. Right? So we do a lot of informal brainstorming, where if someone has an idea, they might just reach out on Slack and say, I'm gonna put a session together, we're gonna get a FigJam going, and we're just gonna riff and get ideas down. And I think the natural desire is that out of that brainstorm session comes some discrete marching orders for really figuring out, okay, I have this super wild idea, how can I quickly test how feasible this is? Because, really, we wanna just get a sense of: is this something that we feel confident about spending our time on? And do we feel like this might have the kind of impact that we're being looked to provide for the business? But it's extremely organic. And as we talk, you'll hear stories about how, like, the CLI came together, which was a really good example of that very organic "oh my gosh, I have an idea" to "here's a wait list of people waiting to actually use our project," and it happened extremely quickly. Yeah. That's something I heard about GitHub Copilot, and you can tell me if this is true. I just heard it from somebody who'd worked there. They said that GitHub Copilot blew up, like it was just a little project
Wes Bos
that was going on. And before you know it, half the world is using it. Is that true? Yes. I mean,
Copilot started small but exploded in popularity when released publicly in June 2021.
Guest 4
I mean, it started out, like, you know, in the very early days. We obviously had access, through the Microsoft OpenAI partnership, to Codex, the sort of original granddaddy of LLM models, pretty early on. And what I think a lot of people are realizing as they try to build products on top of generative AI now is that it's not like an API you call and get a response and then paste the response directly to the user and good luck. There's all this sort of under-the-waterline work that goes into how do you prompt the model, and then how do you take what comes back and decide whether or not it's good. And how do you even expose it? Like, what's the right user experience? What's the right user interface for showing this stuff? The simplicity, the magic of Copilot, is underpinned by all this work to make it that simple, that magic. So at the time, none of that existed. At the time, it's like, we have this API. You know? Good luck. Try to make something. And hand on heart, I was a skeptic.
Guest 4
Like, when we first started using this, I was like, I don't know. Everything I've ever used in AI has never been that great.
Guest 4
Yeah. But bit by bit, the team hammered away at this constellation of problems: like, how do we make it fast? How do we make it ignorable if it's not right? How do we make it mutable if it's almost right, but not exactly what you want? And, you know, we explored, like, five different ways of showing this experience. Matt and I hacked on, like, a Tinder-style accept/reject kind of UI.
Guest 4
So it started small, and then I think it was June 2021 that was the first, like, "Here's Copilot." The world first saw it. And, yeah, it blew up. We had, I don't even remember how many people on the wait list, but it was a lot. Half the Internet.
Guest 4
Yeah. And the rest, as they say, is history. Wow. So when this first
The team was initially skeptical Copilot would be useful but were convinced after solving problems like reliability and latency.
Scott Tolinski
worked for you, were you all, when you were using it internally and seeing that this was feeling really nice, did you instantly get this feeling of,
Guest 3
everyone is going to be wanting to use these AI tools very soon? I think a little bit. I remember back to my interview process, actually. I remember getting this demo of Copilot at the time, which was like an in-browser prototype. There was a text area on the left and a blank box on the right, and Oege was writing a Python function, started to write a docstring, pressed a submit button, and, like, out came the AI-generated thing on the other end. And I was like, okay, this is cool, but I couldn't quite see how that fit into my workflow. Like, as a developer, I don't think about: here's a problem I wanna solve, let me open Arc or Chrome, open a tab, go to a URL, paste the function in, start writing a comment. This is not how we work. Right? It's not seamless.
Guest 3
But as we started to do prototyping in VS Code and extension land and started to investigate some of these APIs, like the inline completion API, the ghost text as we refer to it, it really started to solidify in my mind. Like, oh my gosh, this is actually an interface that fits my workflow. It has tremendous legs, provided that VS Code can give us more API things to provide additional metadata. You know, Idan mentioned, like, how can you trust the completions? How can you accept them, reject them, get a slightly refined completion? These are, like, problems that we're still thinking about solving, but knowing that this is the core experience gives me strong confidence that this will be a pretty game-changing workflow for developers. It's wild how the design
Wes Bos
is so important in a product like this because, like, before ChatGPT, we had that tech and it wasn't blowing up. And ChatGPT came out and then, boom, everybody's freaking out over it. And it's because some designers came around and said, maybe this is a good way to use this. Yeah. Absolutely. I think, I think ChatGPT
Guest 4
was made possible by the bigger models, like GPT-3.5 and then later what came to be known as GPT-4. You know, the original Codex didn't have that sort of conversational ability. And in fact, on some level, I'd personally argue that the real magic of the bigger models is not how much better they are at answering things, and they are categorically better at answering questions, and the code that they generate, they're able to deal with more context. There's a lot of things about them that make them more powerful. But the magic to me, having used them for a while, is the persistence of knowledge in a conversation.
Larger AI models enable conversation-like persistence of context and information.
Guest 4
Mhmm. It makes me feel like I'm talking with a human. And I think to the broad audience of, like, everyone on the Internet, not developers, that to them is something that's just like, oh my god, I'm talking to something that feels almost human, because it remembers what I said a minute ago. Like, when I ask my phone to set a timer and then I talk with it about something else, it's not gonna remember that I asked to set a timer a minute ago. I won't be able to reference that. But with chat, I think everybody's responding to the fact, like, wow, there's persistence of the conversation, persistence of information,
Guest 3
and people really, really vibe with that because it feels like another person. Mhmm. I think also that was kind of my first wow with Copilot. You know, it definitely has that sort of retention-of-context thing that you're talking about, Idan, but it also has this recognition-of-patterns thing that feels super powerful to me, where if I'm in a file writing a TypeScript interface and it's starting to autocomplete fields based on things I'm accessing in an API getter or a function below, it really feels like it understands what I'm trying to do. And, of course, it'll hallucinate things. Of course, it'll make mistakes. But I feel like there is some, like, humanoid partner on the other end of the wire offering things to me, and it feels relevant. I don't wanna say human, but it feels like something's there that I can actually rely on. Totally. I think one of the core
Scott Tolinski
pieces of Copilot is just how fast it is in usage, and the speed really makes it feel like a streamlined part of your workflow. Right? It's just inline with what you're doing. It pops up with the ghost text like you said, and you can just choose to accept it or not.
Fast response time was critical for Copilot adoption as people wouldn't wait for suggestions.
Wes Bos
How important was it for you all that it had to be very fast, and can you talk about some of the things that you did to make it that fast? Yeah, I wanna know how it's so fast, because I was working with it yesterday and I'm sitting here on my hands for 30 seconds for the reply to come back. Oh, yeah. Right. You guys are just boom, boom, boom, 80 times a second. How do you get it so fast?
Guest 4
Well, okay, it's not a totally fair comparison, because remember, OG Copilot, the one that's in your editor, not Copilot X, not Copilot Chat, I'm talking about the ghost text Copilot, is using a smaller model. Right? It's using Codex.
Guest 4
Part of the trade-off of this: like, with great power comes super long latency.
Guest 4
The more powerful models also stream in their responses, like, one token at a time. And behind the scenes, they're using a ridonculous amount of GPU compute to synthesize those better suggestions.
Guest 4
And then, you know, the question that naturally follows is, well, do we need the more powerful models for all situations? The answer is not always. It turns out that the less powerful models have much lower compute costs and much lower latency to respond.
Guest 4
And it's not just like, oh, it's nice that it's fast.
Guest 4
This is one of the things that we tracked early on with Copilot. As we were rolling out, we didn't have GPUs in every region. And we saw that completion acceptance rates in regions outside of, like, the US or whatever were way lower. And when we actually dug into why, it was because people just weren't waiting long enough to see the suggestions, because we had to ship the prompt across the Atlantic, then generate the thing, then ship it back, and that's, you know, a few hundred millis. And by that point, they kept typing, and so they never saw the suggestion.
Guest 4
So latency is not just a nice-to-have here. It is a critical aspect, and part of the sort of explosion of charts of stuff that we looked at as we were building: acceptance rates by language, by geo, by length. You can already imagine the number of charts that we're looking at, sort of trying to read the tea leaves and figure out, like, how is the product doing?
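To make that concrete, here's a minimal sketch of the race Idan is describing: if a completion request takes a few hundred milliseconds and the user types again in the meantime, the suggestion is thrown away before it's ever shown. This is illustrative TypeScript, not GitHub's implementation; fetchCompletion and showGhostText are hypothetical stand-ins.

```typescript
// Illustrative sketch only: why round-trip latency tanks acceptance rates.
// Any new keystroke invalidates the suggestion we were still waiting on.
type Completion = { text: string };

function makeCompletionRequester(
  fetchCompletion: (prefix: string, signal: AbortSignal) => Promise<Completion>,
  showGhostText: (c: Completion) => void,
) {
  let inflight: AbortController | null = null;

  return function onKeystroke(prefix: string) {
    // The user typed again: cancel whatever suggestion was in flight.
    inflight?.abort();
    inflight = new AbortController();
    const mine = inflight;

    fetchCompletion(prefix, mine.signal)
      .then((completion) => {
        // Only show ghost text if the user hasn't typed since we asked.
        if (!mine.signal.aborted) showGhostText(completion);
      })
      .catch(() => {
        /* aborted or failed; nothing to show */
      });
  };
}
```

The longer fetchCompletion takes, the more likely the abort wins, which is exactly the acceptance-rate drop they saw in regions far from the GPUs.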
Scott Tolinski
Was there, like, a specific time that you were trying to get everything underneath, like, every response time needs to be X milliseconds, or was it just, it needs to feel fast? I wanna say 100 milliseconds,
Guest 4
but I don't remember.
Guest 4
And, obviously, I also have to give credit where credit's due. Copilot started in GitHub Next, but now there are, like, three teams in engineering that are responsible for this product and a lot of this stuff, and they've made great strides improving it. So I can't speak to what it is today, and I don't remember what it was at the very, very beginning. A lot of it was bound by how many GPUs we were able to throw at the problem. So I wanna say 100 milliseconds, but I wouldn't bank on that. So I've got some questions on, like, how Copilot
Wes Bos
works. Like, obviously, we know there's an ML model behind it. But what special sauce are you putting on top of those APIs? Because it knows context for Syntax, it knows if I'm using semicolons, it knows my TypeScript types.
Wes Bos
So I'm curious, both how much of my code base are you looking at? Are you looking at what's in my clipboard? Are you looking at what's in the file? Are you looking at what's in other files that may be imported, that type of thing? And then what do you have to do to
Guest 3
parse that and inject it into my own code? Yeah. So admittedly, Idan and I haven't done much production engineering on Copilot. And in point of fact, a blog post actually just came out today. I'm not sure if you've seen it on the GitHub blog or not. I'll share a link. It is amazing, and I think it answers your question pretty directly, Wes, around what magic Copilot is doing to give you accurate, quick suggestions, completions. I think what you'll see is that there's some prompt trickery, things like putting the name of the file at the top of the prompt that gets sent along, pulling in context from open tabs in your VS Code. And those prompt hints seem to be enough to give you pretty accurate completions for the task at hand. Now, I'm greatly oversimplifying that, and I'm not qualified to answer that, but in that blog post that we'll share, I think you'll get a lot of answers to that question. Yeah, I'll draft on to that and say,
Code structuring like hoisting is done to prompt models effectively.
Guest 4
it's a lot of trial and error. Like, I can't speak to the early days of Copilot, but it's still a lot of trial and error when we try to build new ML things. The dark art slash science of prompt crafting and engineering is very often, well, we'll try it and let's see if it's good. But I think a great illustration, which is still true today as far as I know, is how we restructure code that we're sending as the prompt. Like, the models are machines that say what comes next. Like, statistically, I believe the next character is gonna be this one.
Guest 4
You know? Whatever.
Guest 4
And it does that over and over again, and it appears to be writing, you know, legit code.
Guest 4
But what if your cursor is, like, somewhere in the middle of the code? Right? Originally, we didn't have the fill-in-the-middle model, which knows how to do that. And so what did we have to do? We had to AST the source code, meaning, like, create what's called an abstract syntax tree, independent of language, whatever language you're using, so that we can restructure your code in a way that's semantically sound.
Guest 4
So, like, hoist. Say I've got some code at the top that's referencing code below the cursor. I'm gonna wanna hoist that code below the cursor up top, so that the cursor is effectively at the end of the file, so that the model can say, okay, here's what should come next. And so there's all this work to restructure the code, but in semantically sound, legal ways, because if you're putting garbage into the model, you're gonna get garbage out. If you wanna get good results, you need to do that sort of thing. And then get back the response, splice it in at the right place, and undo the hoisting. That sounds like a real headache to develop. And that's exactly what I mean about the below-the-waterline, real-headache kind of work that goes into it. And all of this is totally transparent to users. You don't see this. It happens in an eyeblink, but that's what goes into it. That's the level of effort. The secret sauce is just effort. Yeah. Lots of fine-tuning. It's true.
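As a rough illustration of the hoisting Idan describes, here's a simplified, line-based TypeScript sketch. The real pipeline works on an abstract syntax tree and only moves semantically sound units; the file-name hint and the splice-back helper here are stand-ins, not GitHub's actual prompt format.

```typescript
// Simplified sketch of "hoisting" for a completion prompt: move the text
// below the cursor above the prefix so the prompt ends exactly where the
// model should continue. A real implementation restructures an AST.
interface Prompt {
  text: string;     // what gets sent to the model
  insertAt: number; // where the completion is spliced back into the source
}

function buildPrompt(filePath: string, source: string, cursorOffset: number): Prompt {
  const prefix = source.slice(0, cursorOffset);
  const suffix = source.slice(cursorOffset);

  const text = [
    `// File: ${filePath}`, // prompt hint: file name at the top
    suffix,                 // hoisted: the code from below the cursor
    prefix,                 // the prompt now ends exactly at the cursor
  ].join("\n");

  return { text, insertAt: cursorOffset };
}

// Splice the model's completion back at the original cursor position,
// undoing the hoist from the user's point of view.
function applyCompletion(source: string, prompt: Prompt, completion: string): string {
  return source.slice(0, prompt.insertAt) + completion + source.slice(prompt.insertAt);
}
```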
Wes Bos
So we're building a bunch of AI stuff for the Syntax website right now. And it's funny, I went through $200 worth of prompts in one day, just being like, what happens if I change this word from "condensed" to "smaller," you know, just kind of trying to... And it's a weird world, because
Guest 4
programmers are used to pure functions, same input, same output, and ML is not pure at all. No, it's not deterministic. You can't bank on it being the same, and you can't even know if what you're doing... Like, okay, it seems to work on my laptop. Is it gonna work for everybody else? You won't know until you put it in front of them. You have to try things out at scale, and we have these large smoke test suites that run at scale against, like, all the code on GitHub, to give us early warnings about whether or not the changes that we're making are good.
Guest 3
I think this is definitely a theme in our work, not to interrupt, Wes. No, no. Prompt crafting really does feel like an art, not a science. And while there are levers you can pull to increase the likelihood of consistent responses, things like temperature, you're really just stabbing in the dark a lot of the time. And I think that's a lot of the joy we get in our work: when we find a prompt that yields the kind of thing that we want over and over and over again, and we can work to make sure some consistency comes through that.
Prompt crafting is an art with trial and error, not pure deterministic programming.
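A small sketch of what that trial and error can look like in practice: run the same prompt a few times at different temperatures and eyeball how much the output drifts. This assumes the OpenAI Node SDK and an OPENAI_API_KEY in the environment; the model name and prompt are placeholders, not anything GitHub uses.

```typescript
// Rough prompt-crafting loop: sample the same prompt at a couple of
// temperatures and compare how consistent the responses are.
import OpenAI from "openai";

const client = new OpenAI();

async function sample(prompt: string, temperature: number, runs = 3): Promise<string[]> {
  const outputs: string[] = [];
  for (let i = 0; i < runs; i++) {
    const res = await client.chat.completions.create({
      model: "gpt-3.5-turbo",
      temperature, // lower = more deterministic, higher = more varied
      messages: [{ role: "user", content: prompt }],
    });
    outputs.push(res.choices[0].message.content ?? "");
  }
  return outputs;
}

// Compare how much the wording drifts at each temperature setting.
for (const t of [0.0, 0.7]) {
  const runs = await sample("Summarize this diff in one sentence: ...", t);
  console.log(`temperature=${t}`, runs);
}
```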
Wes Bos
So Codex is behind what most of us are using right now.
Wes Bos
However, Codex is being deprecated, if I'm right. And you guys are rolling out Copilot X. Can you talk to how that is changing? Only a little bit, because it's not really GitHub Next anymore. I mean, Copilot X, so Copilot X is
Guest 4
a few different things. Call it sort of GitHub's brand for the next evolution of Copilot. It's not just Copilot in your editor. It's Copilot in your pull request. It's Copilot in chat. It's Copilot in all these different things. Because, ultimately, what we wanna do is bring AI assistance to developers throughout the development life cycle.
Guest 4
Chat was actually not done... like, the new Copilot Chat was not done by us. That was done by the actual Copilot team that owns Copilot.
Guest 4
I believe it's underpinned by GPT-4. I might be wrong and it's 3.5, but I'm pretty sure it's GPT-4 behind the scenes. And that's great for the chat thing, because there it's tolerable to have that sort of chat speed of streaming responses, where it's like somebody on the other end is typing back at you. Mhmm. Or whatever.
Guest 4
And then the question of, like, what's mainline Copilot gonna use as its underlying model? I can't speak to that.
Guest 4
There's 3.5 Turbo, which is fast. Yeah, I'm not sure if it's fast enough, but I really can't speak to their plans. That's their plans. Oh, yeah. Yeah. We've been going back and forth between
Smaller models can be better for some uses like Copilot where latency is critical.
Wes Bos
summarizing our notes with 4, because it takes longer but it has a larger token window. You can send it way more content.
Wes Bos
And then for, like, the chat stuff, I've been going back to 3.5 because it's, like, 16 times cheaper and much faster. So it's not always that the latest one is better, if you can figure out how to make it work. And you said that as well with the Codex one: it's much smaller and faster.
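A hypothetical sketch of that routing decision, picking a smaller model for latency-sensitive chat and a bigger one for long-context summarization. The model names, the token heuristic, and the cutoff are illustrative only.

```typescript
// Toy model router: cheap and fast for interactive chat, bigger context
// window for long, non-interactive summarization work.
type Task = { kind: "summarize" | "chat"; text: string };

function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4); // rough heuristic: ~4 characters per token
}

function pickModel(task: Task): string {
  const tokens = estimateTokens(task.text);
  // Interactive chat: favor low latency and low cost.
  if (task.kind === "chat" && tokens < 12_000) return "gpt-3.5-turbo";
  // Long documents, or anything that needs the bigger context window.
  return "gpt-4";
}

console.log(pickModel({ kind: "chat", text: "short question" }));          // gpt-3.5-turbo
console.log(pickModel({ kind: "summarize", text: "...long show notes..." })); // gpt-4
```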
Guest 4
Yeah. I think also just generally, as an industry trend. I mean, everybody saw this leaked memo from Google a couple weeks back. Yeah.
Guest 4
The future might not be, like, always bigger models. We're already struggling to get enough GPUs to do the inference that we need to power these products. Maybe the future is using multiple smaller models in conjunction, or other techniques to squeeze better results out of smaller, cheaper models that are also faster.
Wes Bos
I don't know. This is a very active area of research right now at Next as well. Yeah. I was thinking the other day, I wonder how long until we see, you know how you have, like, green hosts? Like, you can host your website on a host that's 100% renewable. When do we see the solar-powered GPT come out? It will be there. Like, mark my words, we'll see it. But it's certainly something to think about, how much power these things are using. So much power. Yeah. That feels like you're gonna have to figure out a new nuclear
Scott Tolinski
fission for that to get us there. So when you all have a hit like this, right, you come up with an idea.
Scott Tolinski
It evolves into what it has become, and it's seemingly a huge hit. Everybody's using it. It turns into probably a moneymaker for GitHub, considering it's a paid product. Like, what's the celebration in your team like for something like that? Just as a camaraderie type of thing,
Guest 3
what's that like, to ship such a big hit? I mean, it feels really good. Right? Like, it's really great to go on Twitter, to go on Hacker News, to go on all these sources where developers get their news, and see people finding use cases and finding examples of where it's making their lives better. Granted, that's all tempered with a lot of negative feedback, some of which is constructive, some of which isn't, but that's just the nature of the job. No, but I think we take a lot of joy in that. I think we are mindful of, okay, we've had this hit. How do we keep this up? What do we need to deliver next to feel like we're producing strong, valuable things, not just for GitHub, but for software developers largely? Because I think that's kind of our meta mission. Right? It sounds vague and hand-wavy, but I really do believe that everyone on the team is interested in making software developers' lives better. That's the through line of everything we do. And whether that manifests as, like, an AI-powered VS Code extension or something completely different, I think that's the actual goal. So getting the kind of feedback that Copilot saves me time, or makes me feel like a better developer as a junior coming into a new project, that's great feedback for us, and it really fuels the fire that I think is gonna produce the next idea from GitHub Next that might come out of one of these brainstorming sessions that I talked about at the beginning.
Scott Tolinski
Yeah. And so when you get some hate on this stuff, does that feel personal? Or do you even personally get hate on any of this stuff?
Guest 3
Yeah. I mean, I think you'll get a different answer from everyone on the team. It's hard not to take it personally. Like, I feel personally responsible when someone says, you know, Copilot generated something highly inappropriate or highly irrelevant to my code base, but it's not productive for me to harbor that feeling for too long. People are gonna have different experiences. To Idan's point, you know, the responses are nondeterministic.
The team tries not to take negative feedback personally but sees it as input to improve.
Guest 3
Like, there's only so much we can do to ensure quality of output. But, no, I think the team is keenly aware of the fact that if we're getting negative feedback, there's likely an area for improvement, and we take that extremely seriously. We don't dismiss it, but we're also judicious about not indulging too much of the, you know, you've seen the comments on Hacker News. I don't have to explain it to you guys. So... Yeah. You have to ignore some of this stuff.
Wes Bos
Well, I have one question, slightly hateful, and you can say "I don't know" to this, but I promised myself I would ask it, as I have to so many people: in VS Code, when you have a completion, it does the double quote, double parenthesis thing. What's the deal with that?
Guest 3
I wish I had a good answer for you. I know exactly what you're talking about. How about this, Wes? I'll take it as an action item to go talk to the team right after this call.
Wes Bos
That's good. Because I opened a thing on the Copilot repo, whatever, and they said they'll bring it to the team, but that was months ago. And I'm just curious, like, is it an incompatibility between the API that's available to you in VS Code and the API... Like, I'm sure you are also limited in terms of how you display these things to the user, because you only have so many ways to show stuff in VS Code, let alone style them or something. You know? Yeah. I'm willing to place a gentleman's wager of one donut that it is,
Guest 4
like, probably something to do with the way that prompts are formulated and the fact that sometimes it is wrapped in quotes or parens and sometimes it isn't. And then we would need some kind of heuristic for detecting when it's that or not that. But making that reliable enough that we should ship that heuristic is hard, and we haven't yet. And so that's why it's there, or something like that.
Wes Bos
Seems like edge cases all the way down with that type of stuff.
Guest 3
Yeah. And I think you're putting your finger on an interesting dynamic as well. Something that's really exciting about working at GitHub, and by extension Microsoft, is that we get to collaborate with some really interesting teams, like the VS Code team. Right? And oftentimes, you know, me, Idan, other people on the team wanna do something really wild in VS Code, something that the API doesn't accommodate, and we can prototype it. We can fake it in another environment that looks VS Code-y, but we can actually go to these teams and say, like, what would it take? What would it cost? What can we do? Pretty, pretty please with sugar on top, can you make this API for us? And we've had limited success in doing that. But I think our goal as a team is to really come up with compelling use cases that can ultimately result in new additions to these different APIs that are being used to power the Copilot chats, the Copilot
Guest 4
ghost text stuff across the board. Yeah. Case in point: ghost text itself was not a capability that VS Code had when we first built it. Like, the decorations API didn't have an affordance for doing something like ghost text. And we actually needed help from the VS Code team to give us the capabilities so that we could do the ghost text thing. And the multiline ghost text thing: it wasn't just, oh, I'd like to decorate to the end of this line. It's like, I might wanna stick several lines in that are ghosted.
Guest 4
And so, yeah, we're somewhat frequently knocking on their doors in supplication, asking for... it's like the Bernie Sanders meme: I am once again asking for an API.
New VS Code APIs were needed to enable Copilot features like ghost text.
Guest 4
Yeah.
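For the curious, this is roughly what the ghost-text surface looks like from extension land today, using the inline completion API that VS Code eventually shipped. A minimal sketch: the hard-coded suggestion stands in for a real call to a completion model.

```typescript
// Minimal sketch of VS Code's inline completion ("ghost text") API.
// The canned multi-line suggestion stands in for a model call.
import * as vscode from "vscode";

export function activate(context: vscode.ExtensionContext) {
  const provider: vscode.InlineCompletionItemProvider = {
    provideInlineCompletionItems(document, position) {
      const linePrefix = document.lineAt(position).text.slice(0, position.character);
      if (!linePrefix.trimEnd().endsWith("function greet(")) return [];

      // Multi-line ghost text rendered at the cursor until accepted or dismissed.
      const suggestion = new vscode.InlineCompletionItem(
        'name: string) {\n  return `Hello, ${name}!`;\n}',
        new vscode.Range(position, position),
      );
      return [suggestion];
    },
  };

  context.subscriptions.push(
    vscode.languages.registerInlineCompletionItemProvider({ pattern: "**" }, provider),
  );
}
```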
Scott Tolinski
So would you say that the partnership, or the acquisition by Microsoft, was, like, really important to making GitHub Copilot exist as it is today? I
Guest 4
conjecture.
Guest 4
I think yes, by dint of the fact that the partnership between OpenAI and... it's between OpenAI and Microsoft. It's not between OpenAI and GitHub.
Guest 4
So, obviously. But I think more broadly, it's just that Microsoft, in the last, I don't know, like, under Satya, has really made conscious choices to come back to its roots as a developer company.
Guest 4
And I think GitHub being a part of Microsoft is clearly a part of that strategy.
Guest 4
And it's exciting to work in an environment where even the mothership, the big corporate entity, is just like, no, actually, we want to make developers' lives better, not just the ones who are building for Windows or whatever, but all of them.
Guest 4
And so, you know? Yeah. Obviously, that laid the groundwork for that partnership with OpenAI, I'm sure. But I can't speak to anything deeper than that. What do you think about, in general? There's a lot of developers,
Wes Bos
specifically. Like, I teach coding. A lot of people are coming to me and saying, hey, do I even learn coding, or do I go pick up plumbing or something like that?
Wes Bos
What do you think the future of... and I know that this is a total guess, because I don't know that anybody really knows. But you've been in this world. What are your thoughts on what the future for developers looks like? I think what's really interesting about these models is that
Guest 3
it's easy to fall into the trap of thinking, oh, because they're so capable, they can outright replace us. And to your point, Wes, there may come a day where that is true, unequivocally true, and that's just going to happen. But right now, from my perspective, not as an authoritative expert here, they really do feel like something that augments my ability to do software development. Now, I have a lot of experience, and I am very different than, like, a new junior developer. But I can imagine, as a junior developer coming into the industry, these models might be able to replace a whole class of problems that are just really annoying and really sucky as a developer, like scaffolding projects, or getting TS configs to do what you want. Like, what if there's a different modality that a developer can use to specify what it is they're trying to solve, and an LLM can figure out those nitty-gritty details that otherwise we have to write code today to do? So in that case, you're still a software developer, you're still writing code, but you're deferring or offloading some of that work to this model that's just better at some of that more guts, nitty-gritty type work that you would otherwise have to do. So in that case, it still feels like an augmentation rather than a replacement. But, yeah, as they get more powerful, it's unclear to me where that point will be where... yeah, you know what, maybe being a developer isn't the right thing. But I don't see that day coming soon, personally. Yeah. I don't see that day coming ever. It's like our computers have gotten faster.
Guest 4
Right? Does it mean that we don't computer as much, because we just get the job done and then, yeah, shut them down and go outside? No. It's like, now my video games come in ultra HD and have more effects. The more power we have, the more we use that power. It's not like we do less. So I definitely see that. I think the metaphor that I use most frequently is this notion of giving developers superpowers, giving them, like, a mech suit that they step into that allows them to jump over buildings and do stuff they couldn't otherwise do. The future is about, you know, lowering the floor, maybe. Like, those junior developers that Matt's talking about, they're coming in, we're handing them a guitar, and we're being like, play a song for us. You know? And then they're like, I don't know, everything I try, it doesn't sound good. What if the mech suit was just like, here, I'll move your fingers for you a little bit, just to help you form the basic chords? And then you still construct the song, but you don't need to struggle with the basics of, like, how do I fit my fingers on the frets? Mhmm. And that'll repeat itself over and over again, maybe at higher levels: oh, I'm trying to do this architectural thing, okay, it'll fill in more of the blanks for you. I think the most interesting comment I saw, I don't remember, on Hacker News or in some blog post recently, somebody was just like, having this makes me want to hire more developers, not fewer, because I know that every one of them is going to be far more productive than they used to be in the past. And that really resonates with me.
AI will augment developers, not replace them, by assisting on tedious tasks.
Scott Tolinski
And that also jibes with our research. Right? Like, the research that we've done into developer productivity and happiness. Totally. In my personal experience, that's dead-on to how I've seen it being used and how I'm using it personally. It makes me a faster developer. I still need to understand the things that I'm outputting. I still need to understand if they're correct. I need to understand if it's even in the neighborhood of what I want. But it helps me get there faster, and I can confirm visually, looking at it: it is what I want. Let's go forward. Let's do this.
Scott Tolinski
So let's actually talk about some more projects y'all are working on at GitHub Next. Are you working on any more AI-based projects specifically?
Guest 4
Yes.
Guest 4
I mean, is the short answer. I don't know, Matt, if you wanna jump in and talk
Guest 3
a little bit about some of the things, or I can. We can chat about Copilot for pull requests. That was a really fun project. You know, Idan mentioned earlier, we're looking for ways to introduce AI at different parts of the developer life cycle. So writing code is obviously one of those natural insertion points. Right? And the AI is actually really, really good at helping you at that juncture.
Guest 3
But what we realized, in looking at research and talking to users, is that reading code and reviewing code is actually a great point at which an AI can be of service to you. Right? And Copilot for pull requests, formerly known as PR bot, really started as a project to understand: okay, you're writing code in the IDE, but eventually you have to go to GitHub and, let's say, open a pull request to collaborate, to share this piece of code with the rest of your team.
Guest 3
Is there a point there where we can help you as a developer communicate what it is you're trying to do with this pull request? And what we found was there's academic research that shows that, you know, X number of pull requests actually get submitted without descriptions outright. I think the number is 34% of open source projects don't have descriptions on their pull requests, which is fairly staggering. And Irene on our team, who's a great user researcher, can check me on that. But in the face of that observation, we wondered, is this a place where the AI could look at the diff, try to understand what it is you're trying to do, and actually generate a description? And through tinkering, through trial and error, we actually came up with a GitHub app that does just that. So when you create your pull request, it's gonna look at the diff, and it's gonna give you a handful of things: a human-readable summary of the changes; a walkthrough that's semi line by line, but more a change-by-change walkthrough of what happened; and, in trying to add some surprise and delight, a poem that sort of encapsulates what the changes were about. And they're really fun. I think we have haikus, we have limericks, and we have, I've gotten heavy metal lyrics. I don't know how that one came up, but suffice it to say we have that. But it was super cool, and we've gotten really, really good feedback on it. But as we can talk about and dive deeper, there's lots of technical challenges here, and there's lots of serious user experience challenges latent in this. It's not as simple as, okay, make an API request with a diff and get a description back. There's a lot that goes into this.
Copilot for pull requests summarizes changes and generates descriptions to help collaboration.
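A hedged sketch of the overall shape Matt describes, turning a diff into a summary, a walkthrough, and a poem, with the model call left as a placeholder. It is not the Copilot for Pull Requests implementation.

```typescript
// Sketch: prompt a model to draft a PR description from a unified diff.
// `complete` is a placeholder for whatever model client you use.
type Complete = (prompt: string) => Promise<string>;

export async function describePullRequest(diff: string, complete: Complete): Promise<string> {
  const prompt = [
    "You are drafting a pull request description for reviewers.",
    "Given the unified diff below, produce:",
    "1. A short human-readable summary of the changes.",
    "2. A change-by-change walkthrough, grouped by file.",
    "3. A haiku that captures the spirit of the change.",
    "",
    "--- DIFF START ---",
    diff,
    "--- DIFF END ---",
  ].join("\n");

  return complete(prompt);
}
```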
Guest 4
I'll add that a lot of the things that we sort of debuted as part of Copilot X in March
Guest 4
were all explorations of what we can do with the additional power of GPT-4.
Guest 4
We had early access to, like, early versions of it. It wasn't even GPT-4 as we know it today. It was sort of like proto-GPT-4.
Guest 4
And we were just trying out different things to see, well, what is this capable of that we weren't capable of before? With Copilot for PRs, we also explored a thing that was like, you know, what if I open an issue that describes what I wanna do? Can it synthesize the entire PR for me? What if I didn't have to do that? And kind of. The answer is kind of. The problem with these things is that when they don't work well... it's not that they don't work well. It's that they work well some of the time, but not enough of the time for you to treat it as reliable. One of the things about Copilot is that, by and large, we did all this mountain of effort to raise the reliability of it to a point where you can sort of generally trust that, if you supply it with enough context, it's gonna say something useful back to you.
Guest 4
But with these sort of, like, more ambitious things, it's like, okay. Now I don't wanna synthesize a couple lines. I wanna synthesize an entire pull request.
Guest 4
Like, getting that to a level of reliability, well, we aren't there yet. We haven't cracked it.
Guest 4
But that's a great example of the stuff that we have cooking and simmering in the kitchen. Maybe we haven't figured out an approach, and we need to try a different one. Sometimes prompting is not one-shot. It's multiple round trips.
Reliability remains a challenge for more ambitious AI tasks like full PR generation.
Guest 4
Like, we'll send a thing and say, what are the files that I should have in order to do this? Okay, I get back a list of files. We'll take that, and we'll send it to the model, and we'll say, okay, given this list of files, tell me what's in them. Right? And then we'll get that back, and then we'll send all of that back again, and we'll be like, okay, given these things, write some tests for it. And then we'll get that back, and then that'll be the PR. But does that work? Not well enough. Otherwise, we would be putting it in front of you.
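Here's a rough sketch of that multi-round-trip chaining, with ask and readFile as placeholders for a model client and repository access; it illustrates the shape of the loop, not GitHub's pipeline.

```typescript
// Sketch of chained prompting: issue -> relevant files -> proposed diff -> tests.
type Ask = (prompt: string) => Promise<string>;
type ReadFile = (path: string) => Promise<string>;

export async function draftPullRequest(issue: string, ask: Ask, readFile: ReadFile) {
  // Round 1: which files are relevant to this issue?
  const fileList = await ask(
    `Issue:\n${issue}\n\nList the repository files likely needed to implement this, one path per line.`,
  );
  const paths = fileList.split("\n").map((p) => p.trim()).filter(Boolean);

  // Round 2: feed the file contents back and ask for the change.
  const contents = await Promise.all(
    paths.map(async (p) => `--- ${p} ---\n${await readFile(p)}`),
  );
  const patch = await ask(
    `Issue:\n${issue}\n\nRelevant files:\n${contents.join("\n\n")}\n\nPropose the code changes as a unified diff.`,
  );

  // Round 3: ask for tests covering the proposed change.
  const tests = await ask(`Given this diff:\n${patch}\n\nWrite tests that cover the change.`);

  return { patch, tests };
}
```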
Guest 3
There's also a fun interaction question in that, where, okay, to Idan's point, if the results aren't reliable enough, you're gonna have to have a human intervene and sort of shape the PR to how they want it. And we're aware that that interaction might not necessarily need to happen in the browser. That probably should happen in the IDE, where you're writing code. So how do you create a workflow that allows you to go from "here's an issue summarizing what I want" to "here's a draft PR," maybe open it up in a Codespace, start to tinker on it, start to make it look like how you want? Making sure we build a seamless workflow from GitHub.com to wherever you write your code is actually a really thorny user experience problem, and one that we're aware of as we try to figure out how AI can manifest as you write code, as you describe code, as you review code. There's a lot of really fun challenges latent in that. Yeah.
Guest 4
I think one of the most interesting things, where we have a very privileged vantage by dint of working with this stuff since the early days, is this: there's an instinct, you know, we're all sort of wowed by the abilities of chat, there's an instinct to bolt AI onto the side of products. And that's not always what they want to be. Like, what should the pull request be in the age of AI? We don't know. Nobody else knows either. But figuring out what that natively should be, with these capabilities, requires maybe changing how PRs work. And so that's exactly the kind of risky exploration that Next loves to get its hands dirty with. That's the fun stuff. It's like, oh, let's take this thing that we've been using for a decade and redesign it from first principles around this thing. What should the workflow be like? How do we bounce back and forth between suggestion and creation? How do we make it work for teams of people? Back when PRs were invented, real-time multiplayer was not a thing. Now it is.
Guest 4
You know, all of this is stuff that we're curious about: what should it be? I love that type of stuff, where it's not necessarily,
Wes Bos
how do we fix this problem that we have? It's, how do we step back even further and just not have that problem at all? There's lots of software projects that have been like that as well. But it's interesting with AI where, like you say, you're not just bolting it on top. You may be reimagining the entire thing. 100%.
Workflow improvements may require rethinking existing software practices in light of new AI capabilities.
Wes Bos
Have there been any, like, flops that you haven't even come to the public with, that you could share with us? Like, stuff that just totally didn't pan out?
Guest 4
No. Never. All of our projects succeed. They all hit. All hit. A hundred percent. All hits.
Guest 4
That's never happened to us ever.
Guest 4
A lot of them, but I don't know. Arguably, almost everything we do. Like, we try 10 things, and one of them is promising. And then, mhmm, we don't work on the other nine. It's not that they flopped. It's that they never made it past the embryo stage. We're just working on the things that we believe in, and sometimes
Guest 3
that's subjective. I don't know. Matt, can you remember, like, a thing that we've explicitly killed? I'm trying to think of one. No, I think for us, the observation is that success has just been very relative. And once you've shipped a Copilot, it's really hard to hit that watermark with other projects, to be very honest. And I think, personally, to share on behalf of the team, it's a struggle sometimes to feel the pressure of, okay, Matt, you gotta go into a brainstorm, and we need you to come up with six Copilots. Go. Like, I may. Give me, like, a couple weeks. But it's really challenging. It's also really inspiring, because being part of these conversations, people are articulating ideas that have that same level of, like, holy moly as Copilot, and the test will be, okay, can we take this from idea on a FigJam to something working that we can ship to the world? Totally.
Scott Tolinski
Does a Copilot buy you more wins? Like, buy you more freedom, so to say.
Scott Tolinski
In turn, does it buy you more freedom or more restriction? Maybe that, I guess, is a good question.
Guest 3
No, it's a good question. Idan, your perspective might be different.
Guest 3
I feel personally, as a member of the team, as an individual contributor, that getting Copilot out the door bought us a lot of trust within the organization. Trust that we can be left to our own devices, use our diverse skill sets, and come up with something really, really valuable, but also be judicious about spending too much time on one thing. And if we're observing that maybe this isn't going in the right direction, like, we should probably abandon this, hit the brakes, and move on to something else. So I think it's definitely given us that sort of long leash in the organization.
Guest 3
But on the flip side of that, a lot of pressure of, okay, you've done this once. Could you please rev up the hit factory and jam one out? But, Idan, I'm particularly curious how you see this as someone on sort of a different
Guest 4
food chain, so to speak, at the business. Yeah. Definitely, when I look back at the history of Next, you know, it's all good and well to be like, ah, we're gonna explore the future.
Guest 4
You know? But that's quite the claim. You have to walk the walk and build that trust with the rest of the business. I think this is a problem faced by exploration teams at a lot of different companies, where it's not enough to just make bold assertions. You gotta actually produce, and build that trust and that relationship, and build that interface with the rest of the business. It's not enough for us to just do cool stuff. Like, you know, Kevin Costner, Field of Dreams: we'll build it, and they'll come. No. It's not enough for us to do that. We have to go out and do research and measure and come up with the proof.
GitHub Next establishes trust and delivers value to bring other teams on board with their ideas.
Guest 4
And that's why a lot of the terms I use to describe Next are, like, startup terms, because it's like we gotta do the legwork, pound pavement, and figure out what's gonna persuade the rest of the business to throw down with us on a thing. And that varies project to project. It varies on, like, who's gonna eventually own it. In the case of Copilot, an entire team, several teams, got stood up to own this new wing of the business. Right? But if we're gonna do something about PRs, I need to figure out what the PRs team cares about, how they see things, and what's gonna persuade them and their management chain.
Guest 4
So that interface with the rest of the business is super critical.
Guest 4
And, yeah, step one is trust. Like, first, we gotta have standing in a relationship with them where they're willing to entertain talking with us, instead of us being seen as a distraction, right? Because they have operational fires, they have a backlog, they have all this stuff, and here I am rolling up to them with, like, hey guys, check out my flying car. Don't you want a piece of this? And they're like, my house is on fire. Leave me alone. You know? It's like, oh, trust is our number one value. That sounds super corporate and cheesy, but it really does matter to build that relationship so that we can have that handoff to the rest of the business. Because everything we do... like, you asked earlier, what does success look like? What's our success ceremony? Our success ceremony is like waving bye-bye to our kids at the dock while they sail off into engineering and grow up, mhmm, and go to college and lead fruitful lives as products. It's like,
Wes Bos
yeah, that's great. I do have one more question about products maybe you've thought about. Even as I was thinking of this question, I thought about how this is, if you've ever looked into the linter and formatter space.
Wes Bos
But I guess that's what brushes does. Right? Like, you can make a brush that does anything. Have you looked into that even more, like an automatic formatter where you don't need Prettier? You could just be like, make it look nice. I don't know. Prettier is so good, but... Yeah. It is. I've never really thought
Guest 4
about it. I think of brushes more as sort of a modality, in the same way that ghost text is a modality and chat is a modality. I think brushes is about transformation. Like, you know, you select text in a word processor, and you click the bold button, and it makes the text bold.
Guest 4
But brushes is exactly that mechanic, with a more flexible sort of, what can it do? Like, I would like to make this HTML markup more accessible.
Guest 4
You know? Like, there's no button for that in the toolbar.
Guest 4
Yeah. But that's exactly what it gives you: a modality for transformation of existing code.
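A hypothetical sketch of that brush mechanic: selected code plus an instruction in, transformed code out. The accessibility instruction is just an example, complete stands in for whatever model call you'd use, and none of this is Code Brushes' actual prompt.

```typescript
// Sketch of a "brush"-style transform over a selection.
type Complete = (prompt: string) => Promise<string>;

async function applyBrush(selection: string, instruction: string, complete: Complete): Promise<string> {
  const prompt = [
    `Rewrite the following code so that: ${instruction}`,
    "Only change what the instruction requires; preserve behavior and formatting otherwise.",
    "Return only the rewritten code.",
    "",
    selection,
  ].join("\n");
  return complete(prompt);
}

// Example: an "accessibility" brush over selected HTML.
// const fixed = await applyBrush(selectedHtml, "the HTML markup is more accessible (labels, alt text, ARIA where needed)", complete);
```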
Scott Tolinski
Which is great, and it's time-saving, considering, you know, you could have to research this. There's these things, like, you mentioned even accessibility.
Scott Tolinski
Why put the onus on the user to have to memorize all of the accessibility tools when you could wave your magic wand in front of some existing code? Yeah. There's, like, a million ARIA directives. I don't remember all those. You know? Totally. Yeah.
Scott Tolinski
Yeah. I mean, there's actual experts who dedicate their entire careers to understanding those things. So let us utilize their brain space. Yep. Exactly. It's hard because you all are working on so many awesome things. And if you all would be down for it, I'm sure we would love to have you back, because this hour flew by on us. Like, we probably scratched the surface on cool stuff. So now we're gonna get into the part of the show that we call the supper club questions, where we just ask you general questions that we ask everybody. So just general programming stuff that our audience is interested in. For instance, like, what computer and keyboard and mouse do you use? Anything exciting? I keep it very simple. I use a MacBook Pro trackpad and built-in keyboard. I am, like, very, very simple. I don't do external monitors. Like, I'm that guy with, like, headphones on and, like, 6 cups of coffee at a cafe, and that's all I need to be productive.
Wes Bos
Every New Yorker we've ever had on the podcast, that's their answer.
Scott Tolinski
Yeah. It works.
Guest 4
Well, I'm here to break that, because I'm originally a New Yorker, and yet I'm, like, totally, like, a mechanical keyboard nerd.
Guest 4
Oh, nice. Oh, you've got the split. An Afternoon Labs Breeze. And I'm, like, you know, I'm that unfortunate soul who is just like, none of these existing.
Guest 4
Oh, that's so cool. What is that? It's a Breeze by Afternoon Labs, but I modded it with nice!nano, like, microcontrollers that can do wireless and ZMK. So it's a wireless split Breeze. And I, like, went so far down the rabbit hole. I started learning how to design PCBs, because I have a keyboard in my head that doesn't exist, and all the existing ones out there are almost it, but not quite. And that, for you, is the keyboard community in a nutshell. Yeah. That was me until I got you have the Dactyl, like, the one with the curved
Scott Tolinski
And it's a low profile.
Scott Tolinski
Chocs for life. I know. That was the whole thing for me, is that it needed to be split. It needed to be low profile,
Guest 4
and it needed to be wireless. Those were my, like, big three things. And then, yeah, this is the perfect keyboard for me. If split, like, you know, breaks your brain just to go from, like, regular to that my wife says that in order to use the split keyboards, you need, like, a lobotomy or something to, like, split your brain then that curved 3D thing seems just, like, next level. It's like we're taking it literally to the next dimension. Like, you know. Especially when you get into, like, layers and macros.
Scott Tolinski
Yeah. I've got, like, a Raycast button on this thing so I don't have to do command shift. I just hit the button for Raycast.
Scott Tolinski
That's a really good idea. Yeah. Oh, yeah. I use it all day, every day. Okay.
Wes Bos
I'm gonna steal that. That's super interesting. We're having a couple of people on the show who are really into hardware, specifically running QuickJS on hardware, which is, like, a tiny JavaScript engine. Like, it's not, like, an 80 meg thing. It just runs on an actual microcontroller
Guest 4
so that you're printing your own PCBs is very interesting to me. It's like the tools are out there. It's a lot of fun. Hardware in general is a lot of fun, and it's getting easier day by day. Like, now there's a class of microcontrollers that can run Rust directly. And so I've been, like, messing around with Rust lately, because, like, previously, you'd need to write C or Arduino, and that's still C. And it sucks, and I don't love it. But, like, Rust is actually pretty nice and pretty civilized. And now you can buy, for, like, you know, $10 on Amazon, these little controllers that you can just natively target with Rust and do cool stuff with, so I really love that stuff. It's a fun time to be in hardware. It's not as painful as it used to be. Is that the ESP32 that you can run Rust on? Yeah. Yeah. It's specific ones. It's the ESP32-C3. Okay. Not the S or the H or there's a few others, whatever, but, like, specifically the C one. You can look it up if you like. There's a whole GitHub org. I have a few on my desk here, actually, so I'm looking for them right now. I wanna see if I have it. That's cool. I didn't know you could do that. You should look it up, like, Rust on ESP. There's, like, a mini book and, like, you know, a whole GitHub org. Like, it's supported by the manufacturer of the microcontroller they have, like, some people working on Rust support for that controller. So it's Wow. That's pretty solid.
Scott Tolinski
I'll post that link in the show notes for the Rust on ESP book.
Wes Bos
Sorry. I'm 8 tabs deep in researching hardware right now. I lost
Guest 3
I lost my supper club questions. Alright. Here we are. What text editor and font are you all using? Let me pull up my VS Code. I can tell you it's definitely VS Code, and I'm obsessed these days with Recursive Mono. I think the website's recursive.design. It's this really, really sick variable font that can go from, like, casual to non-casual. It's super sick. I can guarantee you it's polarizing, so apologies in advance.
Guest 3
Oh, yeah. It's got a vibe. Yes. It's got a vibe for sure. Theme-wise, oh, man. Now I need to pull this up. Give me a second. Idan, maybe you can jump in while I Matt is my theme guru. Like, every time I pair with Matt, you know, every time he has, like, a different theme, and I'm like, damn, that looks good. The right person has it. And then, by that point, he's already, like, 3 themes past it. Yeah. Yeah. Oh, yeah. So this is what I'm rocking these days. I found this group of themes called Bearded. Like, I don't know why it's called Bearded, but there's, like, 12 or 14 themes, and they have some really sick sort of, like, Monokai-inspired ones that have, like, a faded color tone. I can paste the link in the chat, obviously, but it's super sick. So shout out to Bearded.
Guest 4
I'm still rocking Andromeda, which is what Matt showed me, like, 4 themes ago.
Guest 4
You know, and every time, I'm always just chasing, chasing, chasing Matt's theme preferences.
Guest 4
I also use Recursive.
Guest 4
I've used every monospace thing in the book.
Guest 4
I've recently started playing around with Zed, the editor, zed.dev.
Guest 4
Very, very interesting work from one of the original authors of the Atom editor, who is an ex-GitHubber. Yeah.
Guest 4
Split off, and they built the entire thing in Rust. And it's fast. Like, you can feel it. It's so good. Yeah. It feels a little bit like you know, like, remember when Firefox came along and disrupted everything, and then Chrome came along and disrupted Firefox? And, like, the early days of, like, both of those, it was sort of, like it was small. It didn't have, like, you know, all the features. Like, you know, VS Code is now a pretty powerful beast, and it's still really fast. But, like, you know, Zed feels next-level fast. It just doesn't have all the features. So I'm really curious to see, like, where it goes. I think they have a lot of really interesting ideas.
Guest 4
They ship with, like, a variant of Iosevka Iosevka? I don't know how to say it. Whatever. It's, like, one of the ones based on it, which actually looks pretty attractive, but,
Scott Tolinski
but I'm definitely playing around with it. I haven't used it for anything really yet. But yeah. And to me, like, coming from Sublime Text, right, it always felt like VS Code was slower than Sublime Text to start, but you always picked it because it was so powerful,
Guest 4
to me. See, I came from Atom. And so by comparison, like, you know, VS Code felt so fast, like, you know?
Scott Tolinski
Oh, yeah. Yeah. I know. I never got on the Atom train because I was so you know, I went, like, from TextMate to Sublime, and then just stuck on Sublime until VS Code shipped. But, yeah, Zed, I can't wait for them to get plugins. That's, like, the one thing holding me back there, I think.
Guest 4
Same. Also, themes. They're working on a theme engine, and they're working on I think they're working on plugins I haven't seen. There's a Discord. I should probably, like, lurk there and see what's up. Yeah. What about terminal and shell? I'm a Zsh guy through and through. So I have iTerm2 with Zsh, and that's just always been mine. Although I will say,
Guest 3
I'm super fascinated by what the folks at Fig and Warp are doing. Like, I really do think there's so much value in exploring the CLI as, like, a space for more enhanced interaction.
Guest 3
So super curious about what they're doing, and I love to play with that tool as well. Check out Warp. Definitely. I use their AI tool all the time, because it's just like,
Scott Tolinski
I need to do this right now. In the command line app, it gives me the command. You wanna do this? Okay. Yes. This is what I want to do. Okay. Let's go. I'm stuck between
Wes Bos
Copilot for CLI and Warp, because Copilot is better at explaining things and finding things, but Warp has its own UI. Like, you're not limited by the terminal. It's integrated. Yeah. Right? So Warp can
Guest 3
put circles anywhere they want. Yeah. Like, I'm contractually obligated by Idan to tell you that Copilot for CLI is the better product. Yeah. That's it. But no, I'm totally just kidding. Like, this is a fun, like, you know, interface question. Like, were we gonna do something that's, like, more Warp-inspired, or something that's a bit more, like, limited, but more shell-y? So, yeah, that's a different episode altogether right there. Yeah. Comrade Matthew speaks great truth in service of.
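For anyone curious about the interaction pattern being described here, a minimal sketch of that suggest-confirm-run loop might look like the following. suggestCommand is a hypothetical stand-in for whatever model call a tool like Warp AI or Copilot for CLI makes under the hood; everything else is just Node built-ins.

```ts
// Describe what you want, review the suggested command, confirm, then run.
import { createInterface } from "node:readline/promises";
import { execSync } from "node:child_process";

async function suggestCommand(request: string): Promise<string> {
  // A real implementation would prompt a model with `request` here.
  return "git log --oneline --graph --decorate -n 20";
}

async function main() {
  const rl = createInterface({ input: process.stdin, output: process.stdout });
  const request = await rl.question("What do you want to do? ");
  const command = await suggestCommand(request);

  console.log(`\nSuggested command:\n  ${command}\n`);
  const answer = await rl.question("Run it? (y/N) ");
  rl.close();

  if (answer.trim().toLowerCase() === "y") {
    // Only runs after explicit confirmation, which is the whole point.
    execSync(command, { stdio: "inherit" });
  }
}

main();
```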
Guest 4
Yes.
Guest 4
But, yeah.
Guest 4
I think there's like, you know, it's an open field. Like, you know, it's exciting to see, like, what Warp is up to, what Fig is up to. There's so much greenfield in the space.
Guest 4
Like, we could be trying things from here until next year, and they can be trying things at the same time, and there isn't necessarily, like, overlap. There's so much stuff to be explored, so many things to be checked out. Everybody's riffing on everybody else's stuff, and that's the exciting time we live in. Like, you know, to see all of us collectively grappling with, like, you know, how should this technology fit into our lives, and what like I was saying earlier, what does it wanna be? Copilot CLI, like, you know, in its current incarnation, you know, hand on heart, I look at it, and it feels a little bit like, you know, the bolted-on thing that I was saying earlier. Not in a bad way. It's like, you know what? We had to start somewhere. It was easiest to bolt it onto the side of the terminal through, like, shell aliases. But, like, what does it want to be? Like, I don't know. I have a bunch of ideas. Maybe we'll prototype them. Maybe you'll see them next month. So,
Wes Bos
like, you know, we'll see where that goes. And these supper club questions have been fantastic with you guys. Sometimes with my friends, I have a question that will always get a good conversation going, and it's, have you ever had a raccoon in your house? And there's always amazing stories that come out of that. And, like,
Scott Tolinski
Y'all have different problems in Canada. Yeah. We do.
Wes Bos
But if you just say something like, oh, raccoon in my house, or squirrel in my house, somebody has the craziest story about a squirrel getting into their house. And, like, this is like that, where you just ask one little question and you get into all kinds of amazing stuff. Like, I could probably do a whole hour just lobbing these generic questions and getting answers from you. But we'll wrap it up now. The last thing we have here is shameless plug and sick pick.
Wes Bos
A shameless plug is literally anything you'd like to plug. And a sick pick is anything from a keyboard to a chocolate bar to anything that you would like to
Guest 4
share with the audience. I mean, shameless plug, githubnext.com.
Guest 4
It's my job. You should check it out, and that's a great place to start to follow along. We also have a Twitter account.
Guest 4
Do not yet have a Mastodon account or a Bluesky account. Sorry. I got an invite for you if you need it.
Guest 4
I don't want more apps. I just want one.
Guest 4
Like, you know, it's that's hard.
Guest 4
But, yeah, I think that's my shameless plug. That's, like, where we are, because we do so much of our work in the open.
Guest 4
We also have a Discord, a GitHub Next Discord. I'll share the link. We could post it in the show notes for folks that are interested in sort of, like, following along, and also, like, asking questions specifically about Next explorations and experiments.
Guest 4
It's a healthy community of folks who are like, you know, yeah, this is cool, and it's also somewhat broken. Let's play around.
Guest 3
So that's my shameless plug right there. Nice. Yeah. I think I'm contractually obligated to plug GitHub Next as well, but I do have something different for my, like, you should check this out. So we can segue into that if you want. Yeah. Let's do it.
Guest 3
So there's 2 things I'm gonna plug here or shout out here. One that's making my life super easy as a front end developer. I've been following his work for a long time. The creator of Ariakit, formerly known as Reakit. I believe his name is Diego Haz.
Guest 3
Ariakit.org, some of the best-written, accessible React components I've ever used. So if you're building a front end, check this out and join the community, because I know there's a lot of future ahead there.
Guest 3
And then secondly, I wanna shout out the tool Liveblocks, which I've been using for a few of my side projects. Yes.
Guest 3
Liveblocks is so good. Honestly, it so well harnesses this idea of don't make me think. The product is intuitive. It's easy to set up, and I think it's a real game changer if you're building any app that has collaboration. So shout out to Steven and the team. Thank you. You all are up to some seriously sick work. Yeah. That's awesome that you said that. I'm working on a Liveblocks episode right now, because it just feels like, to me I was a long-time Meteor user, so real time was, like, everything baked in there. And it feels like they really figured out the business model for real time. Right? We take the data. We hold the data, but we make everything super easy. Exactly. It's so good, and it's so intuitive. And I think, you know, like, there's such a proliferation of amazing tools out there. But let's be honest, they're not all cut from the same cloth when it comes to usability and developer experience, and I think the folks at Liveblocks have really invested in making that DX incredible.
Guest 4
So, yeah, huge shout out to them. Yeah. It's a really cool space. It's really easy to be like, oh, you know, just sprinkle some CRDTs on top and, like, multiplayer will happen. But in reality, like, you know, if you actually start reading about how CRDTs work, you still need to figure out the problem of, like, how do you deal with merges? How do you deal with, like, the sort of, I have conflicting sets of state across the wire? And then all the different like, you know, whether it's Firebase or Supabase or Liveblocks or PartyKit or Yjs and you roll it yourself over WebRTC. Like, you know, there's so much stuff happening in this space where everybody's sort of figuring it out. And I feel like, probably, you know, stuff like Liveblocks, or anybody that's pushing on this developer experience, is gonna become sort of fundamental to building like, a lot of web apps could use this. Like, not everything, but quite a lot will. And as the floor gets lower, more and more things will sort of jump on the bandwagon, and all these companies are trying to figure out how to lower that floor. And Liveblocks, I think PartyKit, Supabase.
Guest 4
Definitely at the top of my list when it comes to, like, you know, don't make me think. Love that stuff. They lower the floor. Totally. I built, like, a collaborative document editor in about, you know, 5 hours or something, and that's absurd. Yeah. Especially when you think about, like, you know, what mountains would you have needed to move even 5 years ago? Like, you know, it's like, you wanna compete with Google Docs? Well, good luck.
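As a concrete illustration of the merge problem being discussed here, below is a minimal sketch using Yjs, one of the CRDT libraries mentioned above: two replicas make concurrent edits, exchange updates, and converge without any central authority. Hosted products like Liveblocks or PartyKit layer networking, persistence, and presence on top of this kind of primitive; the snippet only shows the merge behavior itself.

```ts
// npm install yjs
import * as Y from "yjs";

// Two docs stand in for two collaborators editing independently.
const alice = new Y.Doc();
const bob = new Y.Doc();

alice.getText("note").insert(0, "Alice was here. ");
bob.getText("note").insert(0, "Bob was here. ");

// Exchange updates in both directions. In a real app these bytes travel
// over WebRTC, a WebSocket server, or a hosted provider.
Y.applyUpdate(bob, Y.encodeStateAsUpdate(alice));
Y.applyUpdate(alice, Y.encodeStateAsUpdate(bob));

// Both replicas converge to the same merged text.
console.log(alice.getText("note").toString());
console.log(alice.getText("note").toString() === bob.getText("note").toString()); // true
```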
Wes Bos
Like, you know but no, now it's just like, yeah, man, I can spin that stuff up in an afternoon. Awesome. Well, thank you so much for coming on. We appreciate all your time and all the insights you've given on such an exciting space right now, so I appreciate it. Yeah. Thanks for having us. We really appreciate it. Our pleasure. Alright. You're welcome. Peace. Peace.
Scott Tolinski
Head on over to syntax.fm for a full archive of all of our shows.
Scott Tolinski
And don't forget to subscribe in your podcast player or drop a review if you like this show.