00:00 2022 was a breakout year for AI. In fact, many have even claimed that ChatGPT is the fastest-growing app of all time. With so much opportunity on the table, AI is the topic of conversation in every boardroom, as CEOs figure out how to best integrate this new superpower. But they're also asking really important questions around data privacy, competition, cost, and accuracy, and doing all of this really quickly. Because just like your customers really don't care whether your product is built with Angular or React, or runs on AWS or Heroku, there will be a whole host of ways that companies differentiate as they look to cleverly embed AI.
00:41 As a reminder, the content here is for informational purposes only; it should not be taken as legal, business, tax, or investment advice, or be used to evaluate any investment or security, and is not directed at any investors or potential investors in any a16z fund. For more details, please see a16z.com/disclosures.
01:09 Now, before we get into the weeds of how each company is implementing AI, we wanted to start our conversation with Zayd Enam, co-founder and CEO of Cresta. Cresta is bringing real-time intelligence and AI to the contact center, and here Zayd reminds us of the upside of AI and what it can truly unlock.
01:27 So, I read that contact centers have 30 to 45 percent attrition with agents in the first year, and in some cases that can even be as high as 80 percent. What is going on in these contact centers? Why is attrition so high?
01:41 Yeah, that's a great question. The employee net promoter score for contact centers is often less than zero, so generally it's not a role that folks end up sticking with for a long time, for a few different factors. You have seasonal demand for increased contact center volume, you have relatively low wages in a high-inflation environment, and you have overall a pretty high-stress job: taking phone calls from frustrated customers in that kind of environment is a pretty stressful, demanding job for relatively low wages. All of those things add up and lead to a relatively high-turnover environment.
02:19 Absolutely. I feel like most people don't need a reminder of how brutal that job can get, because they've been on the other side. Now, many people jump to thinking that AI should replace the contact center and customer service agents, but Cresta instead has its eye on transforming a historically low-NPS job
into one of mastery and creativity.
02:39 When you look at artificial intelligence, I think there are really two ways to look at it. One way is lazy artificial intelligence: I have an existing process that my business does right now, I'm going to take that process and automate it end to end. And AI is really good at that; you can take an existing process and automate it end to end. The other way to leverage AI is a more creative approach, which is understanding how AI can be used as a building block. AI is a fundamentally new capability, it unlocks new things, and I can reimagine my business, my processes, my products based on this new capability. How do I combine humans, products, and AI together to result in something that wasn't possible before? So when you look at the difference between lazy versus creative AI, the lazy approach basically says, hey, this is the same process, let me just go automate it end to end. The creative way to look at it says, hey, my role as a business is to deliver the best possible customer experience or the best possible product experience, and the information and knowledge in my conversations with my customers is, one, just a really great way to build strong relationships with my customers, but two, a gold mine of data about my product, my market, my competitors, what's happening, and how the world is changing.
03:49 So, can you give a couple of examples of that? What can we do with this technology that a human alone could not?
03:54 Yeah, a great example is when you have hundreds of thousands of conversations with customers, there are nuggets of product feedback you can use to improve your product, or improve how you position your products in the market, or see how the competition is changing. We work with large telecommunications companies that are constantly evaluating how they price and package their various phone and cable bundles, and there's a lot of nuance in that data: okay, how does my customer perceive it when I make this offer? What is the context they're coming in with? That really informs their market strategy, in terms of what new product they should launch, or what new package or pricing they should bring to market. That's a way a company can accelerate its development, based on having a very close ear to the ground and a real strong pulse on what your customers are saying, and that's ultimately the best way to build companies. Artificial intelligence is amazing at that, because what's possible now with large language models and very advanced deep learning is that you can summarize, synthesize, and pull together information and context in ways that take huge amounts of data, make it super simple to understand, and get to insight really quickly. That wasn't possible when the conversations were unstructured, sitting in legacy, on-premise audio files that you'd have to listen to one by one, for twenty hours, to figure out what's going on. Now you have super advanced, human-level speech transcription, human-level summarization, and human-level question answering that helps these companies get to the next level of iteration as a business.
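The kind of conversation mining Zayd describes can be sketched at toy scale. Here, simple keyword matching stands in for the LLM-based summarization and question answering he mentions, and the transcripts and theme lists are invented for illustration:

```python
from collections import Counter

# Hypothetical call transcripts (in practice, the output of speech-to-text).
transcripts = [
    "the new cable bundle is too expensive compared to the competitor offer",
    "i love the phone bundle but the app keeps crashing",
    "your competitor gave me a cheaper offer so i want to cancel",
]

# Themes a business might track. A real system would have an LLM summarize
# and tag each conversation instead of this keyword matching.
themes = {
    "pricing": ["expensive", "cheaper", "price"],
    "competition": ["competitor"],
    "product_issues": ["crashing", "bug", "broken"],
}

def tag_conversation(text: str) -> list[str]:
    """Return the themes mentioned in one conversation."""
    return [t for t, kws in themes.items() if any(k in text for k in kws)]

# Aggregate across all conversations to get a market-level pulse.
counts = Counter(t for tx in transcripts for t in tag_conversation(tx))
print(counts.most_common())
```

The aggregation step is the point: one conversation is an anecdote, but counts across thousands of conversations become the kind of market signal Zayd describes.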
05:33 What you're saying is actually really fascinating. When I thought of customer service, I thought, okay, this AI will actually be able to help converse with the customer. But it sounds like you're saying it also provides the business with an additional layer of understanding, because it can basically go through all of this unstructured data and gain insights from it. Like you said, if someone, or some cohort of customers, is on one plan, and they notice something about the conversations, they can now parse through this large bucket of unstructured data, go back to the company, and basically say, hey, maybe you should start a plan of this nature, maybe you should offer this cohort of customers this deal, maybe you should talk to them in this way. Is that basically what you're saying?
06:14 Yeah, and that's the missing piece of all this: it's a bidirectional thing, right? You're serving the customer, but you're also serving the business, because there's a lot of information in all of this. For me, the most inspiring example of this is Andy Grove. Intel was originally a memory business, and they stayed a memory business for a very long time. Then there was what Andy calls a strategic inflection point, where there was a 10x difference in the cost of memory production: all these Japanese manufacturers were able to produce memory at that 10x difference, and it completely changed the dynamics of that market. It's funny the way he phrased it, but the salespeople at Intel came back to headquarters and said the Japanese customers are no longer as respectful as they were. That was the first signal he got that something was different in the market. It's just interesting: "the customers are no longer as respectful as they used to be" was an input to him that ultimately led to him dramatically pivoting the strategy of the company to become a microprocessor company, because they realized the market was completely changing, customers were changing, customer reception to their products was changing, and they needed to hard-pivot the company to become a microprocessor company. It's a hard decision to make, but when you have the data, when you understand what's going on with your customers, you're informed to make the right product and company strategy decisions, and that ultimately saved the company.
07:35 So it turns out that these AIs
won't just support your customers; they can actually be an ear, or a thousand ears, to the ground, feeding data back to your company and informing how you can better serve them. We'll come back to the contact center of the future, but now let's introduce Barry McCardell, co-founder and CEO of Hex. Hex is like a sculptor's tool for data: molding, shaping, and refining data into valuable insights and visualizations in various formats. Or, in Barry's words, it's a platform for collaborative data science and analytics. Let's hear a little bit more about what the Hex team is building into their product.
08:08 Earlier this year we launched the closed beta of what we call Hex Magic. Magic is basically a set of AI tools built right into the Hex UI that our users already know and love. It lets you generate and edit code. For example, this morning I asked it, you know, what's our count of paying customers, broken down by pricing tier, with sum of revenue? And it wrote me a SQL query that did that. Or you can say, refactor this Python code to be a function, and it will do that for you. It also has features to document your code, which is really useful if you're staring at a super complex query, or something that someone else wrote, and wondering, what's going on with this? You can ask it, and it'll tell you.
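Features like the SQL generation Barry describes generally come down to assembling a prompt that pairs the user's question with schema context. Here's a minimal sketch of that general pattern, not Hex's actual implementation; the schema, the question, and the `call_llm` name are assumptions for illustration:

```python
# Hypothetical schema context the app already has from the connected database.
schema = {
    "customers": ["id", "name", "pricing_tier", "is_paying"],
    "invoices": ["id", "customer_id", "amount"],
}

def build_sql_prompt(question: str) -> str:
    """Combine the user's question with schema context for the model."""
    tables = "\n".join(
        f"- {table}({', '.join(cols)})" for table, cols in schema.items()
    )
    return (
        "You are a SQL assistant. Using only these tables:\n"
        f"{tables}\n"
        f"Write one SQL query that answers: {question}"
    )

prompt = build_sql_prompt(
    "count of paying customers by pricing tier, with sum of revenue"
)
# The prompt would then be sent to a model API, e.g. sql = call_llm(prompt),
# where call_llm stands in for whatever model client the product uses.
```

The model never sees the database itself, only the prompt; so what the product chooses to put in that prompt is what determines whether the generated SQL references real tables and columns.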
08:41 Sounds pretty magical, if you ask me. But it's one thing to dream up these new AI features, and another to successfully roll them out, and as companies like Hex, Cresta, and, as you'll soon hear, Sourcegraph do exactly that, they're inevitably running into challenges. One issue we've probably all experienced at some point is when technology just does not have the right contextual understanding, like being caught in a customer chatbot loop that just is not addressing your query, something that Cresta knows all too well.
09:08 We've all been there, when we're interfacing with a virtual agent and it's asking you the same question over and over, and you're like, that is not my problem, and clearly my problem has not been coded into the potential set of solutions. I find myself in those cases just writing "agent, agent, agent," because I need someone with more of that broad context, or the ability to interface in a more dynamic way. But I could also see how that could introduce maybe some unexpected responses. I actually saw someone on LinkedIn today post that he was talking to a virtual service agent, and it called him "beautiful soul," and he's like, who would do that? What human would do that?
09:51 Soon we'll hear from Barry, and also our third guest, Beyang, about how we can actually improve the outputs of these models. But first, here's Zayd commenting on how the use of these models, if implemented incorrectly, could really impact your brand.
10:04 What are you seeing in the field, in terms of how different companies are gut-checking what's coming out of these LLMs and interfacing with their customers? And I'll also tack on a question: what are people doing in terms of ascertaining whether they should actually disclose whether it's a bot or not? Because as we're getting closer to that idea of human-level quality, should we be telling people it's still a bot, or is it actually good enough that we don't have to anymore?
10:34 Yeah, that's a good question. I think at the end of the day, if you are a brand, you're fundamentally building trust with your subscriber or customer base, right? Your brand value is, hey, I can call up this company and they'll take care of me; I can trust them. In that case, I do think leaning toward transparency is the best thing, because long term you want to be known as a company that doesn't mess around, that's direct, that tells customers how it is. It becomes an interesting topic on accent masking as well, because that's another use case of this technology that's coming to market, where folks can use it to mask accents or change accents, these kinds of things. If you're not upfront about it, and some edge case comes up and it becomes clear, do you want to take that risk to your brand reputation or not?
11:24 So, in addition
to the right disclosures, we can actually improve the performance of these models relative to competitors, even if we're using the same underlying models. How do we do that? Through fetching the right data. Because it turns out that if AI doesn't have access to the right data, it'll just make something up. Garbage in, garbage out; value in, value out. Now, here is Beyang Liu, co-founder and CTO of Sourcegraph, a code search and navigation tool for dev teams.
11:53 Sourcegraph is kind of a general-purpose source code understanding engine. Our first major push, I would say, is this editor extension called Cody. We actually had an earlier effort called Code Search AI, but that was more of a prototype, just demonstrating some of the underlying capabilities. Where we're focusing right now is Cody, which is an editor extension, available today for VS Code. Essentially, it's a chat-based interface, but it also allows you to search for stuff and context in the code, and it's going to do autocomplete fairly soon. The idea is that we wanted something in our editors that took full advantage of the power of language models, but also addressed a lot of the challenges people have encountered with large language models, namely the tendency to hallucinate facts when they don't really know the answer. That's a place where we thought we could be uniquely positioned to help, because with all the pieces of context Sourcegraph has around searching for code, finding references, and verifying that things actually exist, we are kind of the perfect fact checker, if you will, for the language model, and the perfect relevant-context provider to the language model. So we're really focusing on this in-your-editor experience right now: pulling in all the context that we can provide, combining that with the power of the language model, and solving all those questions, challenges, and sources of tedium and toil that developers encounter daily, hourly, minutely, when they're trying to build software.
13:21 Yeah. If everyone is
trying to implement AI, then how are you going to differentiate relative to your competitors? It sounds like something you're pointing to is that your focus for so many years at Sourcegraph on really good code search is actually somewhat of a differentiator, because what you put into the model matters a lot in terms of what you actually get out of the model, even if everyone is using the same model. So maybe you could speak a little more to the importance of having code search be integral to your product, and how that may be a moat relative to competitors.
13:59 Totally. So, internally
we've started saying this phrase: context is king. The context that you provide to a language model really dictates the quality of the output of the model, especially if you're asking about things that are outside the training corpus of that model. A lot of the big language models we know about today, ChatGPT, Claude, Bard, were trained on very large public data sets, and probably some amount of private data, but their training data stops at, you know, 2021, and it's not going to include the private code inside your organization. So they work very well when you're trying to do something kind of generic, like, hey, build me this app with some standard React components. They do generally well in those cases, though there's still a bit of hallucination, because the language model's memory is kind of like human memory: it's not precise the way we expect computer memory to be, it's a little bit fuzzy, and it fills in the gaps and hallucinates some. But if you ask it about something that's entirely outside of its training set, it just doesn't know. And worse than that, it's not going to tell you it doesn't know; it's going to pretend it does know, and hallucinate something that looks right but is completely off. I think everyone who's tried to actually play around with these models in their day-to-day workflow has hit this issue. That's a place where Sourcegraph can come in and essentially fill the missing knowledge gap. Instead of talking to this language model, which is kind of like asking a reasonably smart person on the street a question and having them invent an answer from what they remember, you now arm them with, say, Google search, and they can not only use their memory to answer your question, but also go look things up on the internet, or in our case, in the wide, wide world of open source code, with the recent changes that have happened there, as well as all the private code inside your organization. That makes the answers much more precise. When we think about the issues that Cody still has today, and it's obviously not perfect, almost all the quality issues come down to: oh, we didn't fetch the right context into the context window.
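The fix Beyang describes, fetching relevant context first and letting the model answer from concrete references, is the pattern commonly called retrieval-augmented generation. Here's a toy sketch of it; the snippet store and the simple word-overlap scoring are made-up stand-ins for what he describes Sourcegraph doing with real code search:

```python
# Toy corpus: file path -> snippet. A real system would index a code base.
snippets = {
    "auth/saml.go": "func NewSAMLAuthProvider(cfg Config) *Provider { ... }",
    "graphql/search.go": "func (r *Resolver) Search(q string) ([]Result, error)",
    "docs/auth.md": "SAML auth providers are configured in auth/saml.go.",
}

def retrieve(question: str, k: int = 2) -> list[tuple[str, str]]:
    """Rank snippets by word overlap with the question; keep the top k."""
    q_words = set(question.lower().split())
    ranked = sorted(
        snippets.items(),
        key=lambda item: -len(q_words & set(item[1].lower().split())),
    )
    return ranked[:k]

def build_prompt(question: str) -> str:
    """Put retrieved snippets into the context window, citing their files."""
    context = "\n".join(f"[{path}]\n{code}" for path, code in retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("where is the SAML auth provider defined?")
```

Citing the source file next to each snippet is what makes the result introspectable: when the answer is wrong, you can see which context the model was handed.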
16:06 Interesting. And let's double-click on that, because there are other tools that help you build code using these language models; we'll just throw out a couple. A lot of people are familiar with GitHub Copilot, and a lot of people are familiar with what Replit is doing with Ghostwriter. But maybe you could actually speak to this idea of fetching the right information: how would something like Copilot do that, and how would something like Cody actually differentiate in its ability to fetch the right information?
16:34 The way we think about it, as far as I know, Cody is the only AI-enabled editor assistant or coding tool today that fetches context as broadly as we do. You can ask Cody a question like, hey, where is the SAML auth provider defined in my code base, or where is the GraphQL search API defined? And Cody will actually go and convert that to a couple of search queries to Sourcegraph, surface the relevant snippets of both code and documentation, and use those as concrete references to answer the user's question. That's in contrast to the way Copilot works today, where, number one, they don't do Q&A yet; I think that's on the way with Copilot X, but currently it's purely autocomplete-driven. And the context they fetch to do that autocompletion is kind of the recent files that you've opened in your editor, so it's this very local context, which works amazingly well; I mean, huge credit to that team, they've built an awesome user experience. We think the next evolution of that is providing more relevant context, and essentially emulating what a human does when you're trying to write code. As a human, you might go back through some recent history in your editor to see, okay, how does that code work, how did that code work, and use that as a pattern-matching reference point for the thing that you're currently writing. But more often than not, I think you're doing stuff like go to definition and find references: let me see a couple of examples of how to use this particular API that I just imported. And that's Sourcegraph's bread and butter; that's the code graph that we spent the past ten years building, and we can do that in essentially any language, any code base, at compiler-level accuracy, automatically. So Cody, with the language model, will essentially do the toilsome task of fetching context, clicking through the code graph to acquire the necessary context, and then take your user prompt and synthesize an answer from the actual code snippets that it thinks are relevant. I think that's going to lead to much better results. I think it's also going to lead to much more introspectable results, getting beyond this "LLMs are magic, how do they work, is it AGI?" Cody will actually tell you, hey, I read these files, and these are the files I'm using to generate an answer. And if it completely returns a lie, or is wrong, you can usually tell by looking at the context that it read: why are you reading that file, Cody? That's dumb. You can thumbs-down that, and we'll take it as a reference point to improve the product later.
19:05 Another key question that founders are asking is how they'll build a moat, and Sourcegraph thinks that open source is part of that answer.
19:14 The other major distinction I would draw from a lot of other offerings on the market is that we are trying to build this as much in the open as possible. Cody, the editor extension we've released, is open source under the Apache 2 license, and we think that's the right mentality for building, you know, the standard AI editor assistant that every dev should use. We think there's a natural inclination among developers, and I'm a developer myself, to prefer open tools, and I think it'll make it possible for us to build a much more pluggable ecosystem. When you think about the wide, wide world of context that you might want to pull in to answer a question in your mind as a developer, it's not just the source code or the markdown documentation: maybe you're searching through your issue tracker, maybe you're searching through chat messages, maybe you're searching through Google Docs or Notion for the latest product spec. By virtue of being open source, we make it a much better and friendlier ecosystem into which we can plug other developer tools and other pieces of context that aren't necessarily tied to a proprietary compute platform.
20:21 Yeah. Something that's jumping out to me, for people who aren't developers: think about writing an article. Imagine writing that article based on just your last five tabs that you had open. That's going to be very different from being able to actually channel your goal and search the web yourself, and search your own notes that are specifically relevant to the task at hand. So I think this is a really interesting perspective: we know that what goes into the model matters in terms of what you get out of the model, and basically what you're establishing is two different approaches, both the open source approach, which allows more people to help you extend it, but also this approach of really finding the most relevant information that's going to help the model give you the best answer.
21:08 All right, so we learned
21:08the best answer all right so we learn
21:10from Beyond that what you feed models
21:11really matters and that may be an
21:13opening for differentiation relative to
21:15competitors but part of that equation is
21:17just the data that you have access to
21:19here's Ed commenting on the value of
21:22creating proprietary data sets the
21:24opportunity that we see or the
21:25opportunity I think that exists in the
21:26market is that you can do that exact
21:28same thing if you can collect
21:29proprietary data sets that sort of are
21:31unique and at that same scale and you
21:33can actually sort of in some ways sort
21:35of actually sort of train Foundation
21:36models that are like sort of able to do
21:38even more so like the internet or the
21:40web web pages are like a fairly sort of
21:42like static sort of language modeling
21:44task where like you're sort of doing a
21:45task and you're trying to complete the
21:47next token or the next word the kind of
21:49things we interact with in the contact
21:50center which is like what dialogue and
21:51action on sort of you're interacting
21:53with someone and then you're sort of
21:55working with Enterprise software and
21:56systems of record to like fill out basic
21:58things it's both sort of this
22:00intersection of like two sets of data
22:01sets that don't really exist together
22:03but and we sort of are able to collect
22:05through through all our work with a lot
22:07of different companies and customers
22:08like a really really big data set of
22:09like this kind of this type of work and
22:12then that sort of enables us to then
22:13train over time these large Foundation
22:15models that sort of use that
22:17intersection of that to data set to sort
22:19of do things that aren't possible just
22:21with web pages another wedge may be
customization. How can you actually use the information from past user behavior to support the user in creating better prompts? Here's Barry's take.
22:33 Under the hood, we are doing a ton in terms of constructing the right prompts and parsing responses back from the model APIs we're using. And again, we have thousands of people already writing SQL, writing Python, and doing data work in Hex every day, so we have a ton of information. We're connected to their database schemas, so we see the structure of their data. We see past queries and past code they've written, so we know which tables and columns are most frequently referenced. We have information about the project they're building, so we know, oh, this project is already referencing this part of the schema, and that's probably the relevant part. You can even look at things like, this is the typical way this organization formats their charts, and infer how they might want their visualizations to look. So there's a ton of context we get, because we're incorporating this into an existing set of workflows, and that can help us create the right context and the right prompt to pass to the model. And with all of this, right now our focus really is on the data professionals. This is not trying to be a black-box thing. We actually have a rule as we're building these features: we don't run the code for you. We will generate the code, we will show it to you, but we want to keep the human in the loop.
AI sure can
23:45feel like a black box so designing a UI
23:47that actually guides the user is also
23:49becoming a differentiator at least for
23:51the time being where you're effectively
23:53nudging users to want to interface with
23:55these new tools but also come back any
23:59learnings in terms of what is increasing
24:01completion rates what is getting people
24:03to actually interface with this new in
24:06some ways superpower within the app
24:08because there is the flip side where as
24:10you said if it's hallucinating or if
24:12it's not helping them or in some cases
24:14it may even be hindering their workflow
24:16they're not going to return to it so how
24:18are you thinking about designing that
24:20there's like some UI things we've done
24:22as an example like in HEX you know we
24:24kind of have these code cells that you
24:26use and so we put a lot of thought into
24:28like what's the right way to interact
24:29with that is like a comment completion
24:31we have this cool thing that um seems to
24:34have been inspirational to other people
24:36now um a little drawer that kind of pops
24:38out of the top and like allows you to
24:39like kind of edit what's been alone and
24:41it kind of talks itself away when you're
24:42done with it you can keep using it I
24:44think giving people feedback on what's
24:47Happening um some of these models like
24:48the latency is still really high so even
24:50something small like just being really
24:51thoughtful on like how you're exposing
24:53to the user what's going on and why
24:55they're waiting one of the things that
24:56we've observed sort of more on the back
24:58end in terms of increasing completion
25:01when you're building prompts I think
25:04we were tempted early on to
25:05try to shove as much context as we could
25:07in you kind of figure like the more I
25:08can tell this model the better and you
25:10realize you can pretty easily confuse a
25:13model in terms of the amount of context
25:15you're passing and so we've been really
25:17thoughtful on iterating through and
25:19finding what's the right level of
25:21context to pass through between what
25:22else is going on in the project or the
25:24underlying data schemas or or past work
25:26they've done that's going to best
25:28encourage completion and there's a lot of
25:29techniques you can start to sort
25:31of ladder up and use in terms of working
25:32with chains through LangChain through
25:34starting to apply embeddings to better
25:37allow you to sort of craft the right
25:39context to be able to pass through that
25:40model and I think that is really
25:41actually the hard work in many ways of
25:44incorporating AI into your product right
25:46like it feels really simple like you can
25:48go get maybe anyone listening to this
25:50today could go to the OpenAI website but
25:53it's everything you're doing before
25:55you're sending requests to that API of
25:57like building the right context
25:58understanding how you're iterating on
26:00how different models are going to
26:01respond to different types of prompts
26:02and different amounts of context how
26:04you're potentially even chaining
26:05together different types of models or
26:07different modalities of models together
26:09in one sequence in ux and then how
26:12you're giving that feedback to the user
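The context-budgeting idea described above can be sketched minimally: rank candidate context snippets against the user's task and pack only what fits a budget before building the prompt. Everything here is a hypothetical stand-in, not Hex's implementation; the word-overlap scoring is a deliberately simple substitute for a real relevance model.

```python
import re

def tokens(text: str) -> set[str]:
    """Lowercased word set, punctuation stripped."""
    return set(re.findall(r"\w+", text.lower()))

def score(snippet: str, query: str) -> float:
    """Fraction of query words that also appear in the snippet."""
    q = tokens(query)
    return len(q & tokens(snippet)) / max(len(q), 1)

def build_prompt(query: str, candidates: list[str], budget_words: int = 12) -> str:
    """Pack the highest-scoring snippets that fit the budget, then append the task."""
    chosen, used = [], 0
    for snip in sorted(candidates, key=lambda s: score(s, query), reverse=True):
        n = len(snip.split())
        if used + n <= budget_words:
            chosen.append(snip)
            used += n
    return "Context:\n" + "\n".join(chosen) + f"\n\nTask: {query}"

candidates = [
    "schema: orders(order_id, user_id, total, created_at)",
    "schema: users(user_id, email, signup_date)",
    "notes from last week's unrelated marketing meeting",
]
prompt = build_prompt("total orders per user", candidates)
print(prompt)
```

The budget forces the irrelevant meeting notes out of the prompt, which is the point: more context is not always better, and trimming what you pass to the model is part of the iteration work described here.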
26:13I think again it feels like a really
26:16exciting time because it's very early
26:17days and figuring all of this out yeah I
26:20mean I spoke to someone yesterday
26:21and what you're saying really resonates
26:23with what they said they started off by
26:25creating like a Lensa-like product but
26:28at the end of the day there wasn't
26:28really a moat there because anyone could
26:30kind of hook up to DreamBooth and
26:32create the same thing yeah right but
26:34today I I asked him like how he's
26:36thinking about differentiating and at
26:37least in his scenario he was saying you
26:39know I now link 15 different models
26:41and so maybe that isn't much of a
26:44moat either but the point is it's not as
26:46simple as just hooking into one API so
26:48we've touched on different aspects of
26:50differentiation from customization to
26:52context fetching but is any of that
26:56defensible where does value really accrue when your
26:58competitors can copy your UI or your
27:00latest feature let's attack the question
27:02head on where is the competitive
27:05advantage where is the moat is it in the
27:07UI it feels like that can be replicated
27:09very easily is it in the cleaning of the
27:12data the linking of the data is it
27:14something else like how are you thinking
27:15about the fact that basically you can
27:17kind of depend on all your competitors
27:19to replicate what you do if it's working
27:22yeah we've already seen that
27:26down to some UI elements that look very
27:28familiar you have to take that
27:31as a given I think the models
27:33themselves and being able to send a
27:34request to one of them is is already a
27:36commodity I think there's a few layers
27:38that we think about this at one is we
27:40do have a pretty great data advantage
27:42you know already having hundreds of
27:44customers thousands of users writing
27:46millions of lines worth of SQL and
27:49we can use that sort of rich information
27:52um we're not using it to train models I
27:54should be very clear like that's one
27:56thing we've been very upfront with our
27:57customers about like we're not training
27:59models where they would ever expect to
28:00have some code they've written show up via
28:02completion for someone else which is a
28:04problem in other places but it's more
28:05like using that actually to personalize
28:08for that person and their team you know
28:10again like which schemas you're using
28:11what has their code style been like
28:13in the past like having all of that
28:15information and people who are already
28:17doing that work in Hex gives us a big
28:18advantage the second thing is I talk about this with the
28:21team all the time like it's kind of what
28:22we've already been doing which is
28:24building really thoughtful
28:25well-constructed user experiences and
28:27uis like you could say that there's very
28:29little moat for a lot of things we or a
28:31lot of other SaaS companies do inherently
28:33in their products it's really how do
28:35you put these pieces together and how do
28:38you build a really great user experience
28:41you know from the pixels on the screen to
28:43thinking about performance behind the
28:46scenes to things like docs like that all
28:49sort of combines into a really
28:50superior product experience and it's
28:52stuff that we're you know three and a
28:54half years into taking super seriously
28:56and so that that is a sort of advantage
28:58and almost like infrastructure we can
29:00already stand on in terms of how we
29:02start incorporating these things in
29:04I do think that long-term generative AI
29:07large language models they will change a
29:09lot of like fundamental assumptions and
29:11I think that there will be an advantage
29:12to the companies that can be the first
29:14ones to figure out what those things are
29:16just like social apps benefit from
29:18Network effects it's actually not crazy
29:20to imagine that even if certain products
29:22are built on the same models first
29:24movers that incorporate customization
29:26may maintain an advantage because if you
29:29spent months or even years personalizing
29:31a model to your specific needs would you
29:34want to do that again but as these
29:36platforms collect more information about
29:38your preferences and potentially even
29:40train models on your data certain
29:42companies may build a data moat but
29:44that's also where privacy comes into
29:46play the future of privacy and Security
29:48in AI is a complex and evolving issue
29:50especially since your competitors may be
29:52on the other side of that API call
29:54companies are rightfully asking
29:56questions about data collection and data
29:58storage a lot of these companies that
30:01are integrating AI are building off of
30:03just a few models right a lot of people
30:05are familiar with OpenAI's API that
30:07came out recently but there's also that
30:10very interesting Dynamic that a lot of
30:12the same companies that might even
30:14consider themselves competitors are
30:16using similar models and so how did you
30:19think about that and also there's this
30:21kind of layered question as it relates
30:22to security and privacy because
30:24depending on the company that you are
30:26your code is actually potentially
30:29somewhat all the way to extremely
30:31proprietary right if you're talking
30:33about like a self-driving car company
30:34and so how did you think about that
30:37given that again this code for the the
30:40customers that work with you is
30:41extremely important yeah that's an
30:43excellent question and it's especially
30:45pertinent to us because we have a lot of
30:47enterprise customers that are very
30:49security and privacy sensitive to the
30:52point where you know with Sourcegraph one
30:54of the reasons we made it self-hostable
30:56is because we wanted to enable companies
30:59that didn't want to put their code bases
31:00in the cloud to still have like awesome
31:03code understanding and code search tools
31:06I think for us like the way we think
31:07about it like we love OpenAI
31:11Cohere is also great I'm looking
31:13forward to you know what Google releases
31:15in in this Arena too and there's just
31:17like amazing developments happening in
31:19open source as well like there's a
31:22team at Salesforce that produced a
31:24really good code generation model called CodeGen and
31:27it's kind of competitive from what
31:30I can tell with kind of the
31:31Codex-based model that you know other
31:33tools use at the moment for
31:35autocomplete and so the space is kind
31:37of like fast-evolving and our mentality
31:40is like look we have
31:41a wide range of customers from like very
31:44conservative large Enterprises to like
31:46fast-moving startups that have different
31:48risk and security profiles the language
31:50model in in our like overall
31:53architecture is just one component it's
31:55one component of the the bigger Cody
31:56picture the the other you know
31:58components being the The Source graph
31:59code graph and and the various other
32:01developer tools that we want to
32:03integrate in in kind of an open way and
32:06so we want to make it possible to kind
32:08of like bring your own language model to
32:09the table you know some of our customers
32:11have negotiated separate deals with
32:13model providers they want to use that
32:14model some have in-house models that
32:16they built that they want to use and
32:19some are like hey can you give us
32:20something that we can self-host and
32:22while we don't have one that's like
32:23completely self-hostable yet I think you
32:25know with all the interesting things
32:26happening around you know LLaMA Alpaca
32:28and that sort of thing I think like
32:29within six months we'll have like a
32:31viable language model that can do like
32:33really good Q&A-based interactions and
32:36so our mentality has just been like look
32:38we want to work with all the model
32:39providers they're all innovating at just
32:41a crazy like breakneck pace and our
32:45desire as a business is to do what's
32:47best for our customers and users and and
32:48right now that just means like whatever
32:51is the latest and greatest on the market
32:52like let's try to plug it in and and see
32:54how it does so you're basically saying
32:56that you give them the selection or the
32:58option am I understanding that correctly
33:00or are you still building off of one
33:02model like when someone uses Kodi is
33:04that integrated with one or are you
33:06basically saying again you're going to
33:07give them the option to select how they
33:10want to integrate we'll give
33:12you the option so right now yeah you can
33:14use Claude which is Anthropic's
33:17kind of flagship model you can use
33:19ChatGPT which is kind of the OpenAI
33:22model and we're looking to integrate
33:23additional models too and there's also
33:25kind of like different models that we
33:26plug in in different pieces of Cody
33:29right so there's kind of like the chat
33:30based models which are often like the
33:31largest ones but there's also things
33:34like the embeddings model that's what we
33:36use to generate the
33:37embeddings vectors that we
33:40use to do really good kind of like
33:42fuzzy semantic level code search and we
33:45have an open source model that we
33:47fine-tuned and that's actually like the
33:49best embeddings model that we have now
33:51if you want to use like one of the
33:53proprietary embeddings offerings too we
33:54can definitely do that but I think our
33:56our mentality is just like the language
33:58model aspect of this we want to make as
34:00kind of like pluggable as possible
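The pluggable-model idea just described can be sketched as an interface the product codes against, with interchangeable backends. Below, a deliberately toy bag-of-words embedder stands in for a real embeddings model (a fine-tuned open source model, a proprietary API, etc.); all names here are illustrative assumptions, not Sourcegraph's actual architecture.

```python
# Hypothetical sketch: fuzzy semantic search over code snippets behind
# a pluggable embeddings interface. Swap in any backend implementing
# EmbeddingModel without touching the search logic.
import math
import re
from abc import ABC, abstractmethod

class EmbeddingModel(ABC):
    @abstractmethod
    def embed(self, text: str) -> list[float]: ...

class ToyBagOfWordsEmbedder(EmbeddingModel):
    """Counts a tiny fixed vocabulary -- illustration only, not a real model."""
    VOCAB = ["yaml", "file", "config", "email", "smtp", "message", "dict", "settings"]

    def embed(self, text: str) -> list[float]:
        words = re.findall(r"\w+", text.lower())
        return [float(words.count(v)) for v in self.VOCAB]

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors (0.0 if either is all zeros)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def search(query: str, snippets: list[str], model: EmbeddingModel) -> str:
    """Return the snippet whose embedding is closest to the query's."""
    q = model.embed(query)
    return max(snippets, key=lambda s: cosine(q, model.embed(s)))

snippets = [
    "def parse_config(path): open the yaml file and return a dict",
    "def send_email(to, subject, body): deliver a message over smtp",
]
best = search("read settings from a yaml file", snippets, ToyBagOfWordsEmbedder())
print(best)
```

A hosted or self-hosted embeddings backend would just be another `EmbeddingModel` implementation; `search` stays unchanged, which is what keeping the model layer pluggable buys you.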
34:01that's amazing because something that
34:03that also relates to is cost right like
34:05each of these different models has a
34:07different cost I think a couple weeks
34:09ago Bing like 5x'd their pricing
34:11overnight right like you have a
34:14dependency as well both Sourcegraph but
34:16also like that ends up filtering down to
34:18your customers and so every one of
34:20these models I mean I think we're still
34:22in the early Innings and there's going
34:23to be so many more developed and each
34:25one well to your point it'll have a
34:27different security posture it'll have
34:28different pricing scheme it'll probably
34:30you know there will be a range in terms
34:32of its efficacy or specialty in certain
34:34uh areas and so that's actually you know
34:37it never dawned on me that actually you
34:39know you could offer the access across
34:42across the board to all these models but
34:44also kind of relay the transparent pros
34:47and cons to the customer base is that
34:49how you're kind of thinking about it
34:50that's exactly how we're thinking about
34:52it and I I think for us it's kind of
34:54like there's so much Innovation
34:55happening in that space and and we don't
34:57want to be kind of tied to any one
35:00provider and and also I think a lot of
35:02the value that we can provide in terms
35:04of Cody and in terms of like integrating
35:06AI deeper into Source graph is really
35:08about combining the language model with
35:10the pieces of context and the structured
35:12uh understanding of code that we have
35:14and so it's really the combination of
35:15those two I think that that uh make
35:17something truly valuable and it's funny
35:18that you mentioned the kind of Bing
35:20price hike I thought that was like a big
35:23proof point that people noticed because
35:24like when ChatGPT first came out
35:27I think a lot of people said like hey
35:29you know this kind of replaces search
35:31engines right like I could just chat
35:32with this thing and it'll just tell me
35:34the answer instead of me having to go
35:35and like click through a bunch of
35:37different results uh and figure out the
35:39answer myself but then as people started
35:41to use language models a bit more they
35:43started to run into more hallucinations
35:45and I think it was like the release of
35:47Bing where people finally realized
35:49when Bing released they integrated
35:52ChatGPT or GPT-4 you know one of
35:55those like awesome OpenAI models
35:57in but they combined it with the
36:00fundamental Bing search they
36:02didn't just like ship a white label
36:04ChatGPT they combined that with Bing
36:06search on the back end and a lot of the
36:09other AI enabled search engines were
36:11actually using the Bing back end as well
36:12because that was as far as I know like
36:14Google doesn't have like a search API
36:15that you can use so everyone was
36:17using Bing and that just spoke to the
36:20the necessity I think of combining kind
36:23of the language model as sort of like
36:25the reasoning engine but you still
36:27need kind of like an information
36:28retrieval engine to make that truly
36:30powerful and it's that unison that really
36:32is valuable and that's maybe I'm
36:35speculating here but like that maybe had
36:37something to do with the Bing price
36:39hike like it is not true that
36:42language models make search engines
36:44unnecessary if anything they make the
36:45search engines more valuable because now
36:47all that data that you can search
36:48becomes like 10x more powerful because
36:51you can use that to you know get to get
36:53to your answer with like you know one
36:55tenth the effort or in one tenth of time
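The pairing just described — a retrieval engine narrowing the corpus, and a language model reasoning over only what was retrieved — is essentially retrieval-augmented generation. A toy sketch with a stub model call (nothing here reflects Bing's or any provider's internals; a real system would replace `stub_model` with an actual API call):

```python
import re

# Tiny document store standing in for a search engine's index.
DOCS = [
    "Bing pairs a web search index with a language model on the back end.",
    "Espresso is brewed by forcing hot water through ground coffee.",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Rank documents by word overlap with the query (retrieval-engine stand-in)."""
    q = set(re.findall(r"\w+", query.lower()))
    def overlap(doc: str) -> int:
        return len(q & set(re.findall(r"\w+", doc.lower())))
    return sorted(DOCS, key=overlap, reverse=True)[:k]

def stub_model(prompt: str) -> str:
    """Stand-in for the LLM 'reasoning engine' -- echoes the retrieved context."""
    return "Based on the retrieved context: " + prompt.split("Context: ", 1)[1]

def answer(query: str) -> str:
    """Retrieve relevant documents, then hand only those to the model."""
    context = " ".join(retrieve(query))
    return stub_model(f"Question: {query}\nContext: {context}")

print(answer("How does Bing use a language model?"))
```

The retrieval step is why the search index stays valuable: the model never sees the whole corpus, only the slice the search engine judged relevant.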
36:57yeah I mean that's such a great point
36:59that the search engine itself
37:01still has utility to find all the
37:04information that at the end of the day
37:05feeds into the language model Beyang
37:08brings up a really important point
37:11these models are not free and we're
37:13still in the early Innings of figuring
37:15out the right business models for LLMs
37:17how much do they cost what value add
37:19would allow certain providers to charge
37:21way more and is it worth integrating
37:24into multiple models to reduce price or
37:26platform risk but something else that
37:28feeds this equation is the cost to
37:30maintain data which is also not free
37:33here is Barry reflecting on the deep
37:35relationship between dollars and data
37:38one interesting facet of all this is
37:40that there's kind of this as we just
37:42discussed this realization of the value
37:44of data and I think we're going to see a
37:47lot more companies retain their data for
37:49longer collect more data from their
37:52products or from their users and so
37:54there's this again this privacy security
37:56posture of like okay we want to have
37:58this data we're probably going to keep
37:59it for longer how do we keep it safe but
38:02then there's also a cost element right
38:04it's not necessarily free to retain data
38:06it's also not free to run these models
38:08it's also not zero risk a lot of
38:10companies intentionally delete their
38:13data like people will have retention
38:14policies on emails or Slack but by doing
38:19that you're eliminating knowledge and that knowledge
38:21could be useful sort of like in the
38:23first instance just like I want to go
38:25back and search that's the thing I think
38:26a lot of big companies deal with as they
38:28start to like have record retention
38:30things but there's also like second
38:31order value to that in terms of could
38:34that inform a model or
38:37you know potentially different
38:37applications for how you can learn from
38:39what your organization has done
38:42I don't know where we're going to wind
38:43up on that I do think you're right that
38:45a lot of companies are realizing the
38:47value of their data and their IP and
38:48thinking about that in a deeper way
38:49already operating in the data space both
38:52at hex and previously we already have
38:54this like appreciation for people being
38:56really paranoid about their data people
38:57really caring about where their data
38:59goes and so in some ways there's nothing
39:01new for us here I think there's all
39:03sorts of really exciting opportunities
39:05on how you could use data that customers
39:07are trusting you with making sure you're
39:08continuing to earn that trust with all
39:11these considerations it's really easy to
39:13get stuck in the weeds so let's take a
39:15step back and imagine a future where
39:17companies have successfully deployed AI
39:20not as a gimmick but in a way that
39:22successfully gets their customers closer
39:24to their goals what differentiates
39:26software is like can you build the
39:27workflows that get to the business
39:29outcomes and as we close out let's
39:31return back to where we started
39:33the contact center let's say you know
39:35we're in 2023 let's jump to 2028 five
39:38years from now given the technology that
39:41we have today also the change in the
39:43cost curve that you mentioned which
39:44sounds really fundamental what does the
39:46contact center look like in 2028 what I
39:48think it looks like is this concept of
39:50like oh I remember when we
39:52started the company Tim and I we got
39:54this video from our first customer that
39:56like we both had not
39:59dropped out of the PhD yet but like we
40:01were working with our first customers
40:02sort of like getting our first sort of
40:03thing and they showed us a video
40:05about what their vision for the contact
40:06center of the future looked like
40:08this is five years ago and when we saw
40:10that we're like holy like okay
40:11everyone has the same video but it's
40:13like we're going to drop out tomorrow
40:14and we're going to go to this company
40:17but the vision was basically this
40:18idea of like a touchless close where
40:20it was one of our first
40:22customers who basically had this idea
40:23that they had a contact center of like
40:26people that didn't sit at
40:28computers anymore they just like would
40:29walk around and have a dialogue with
40:31sort of just an earpiece that they
40:33were using to talk to customers and all
40:35they were focusing on was the
40:36interaction with the customer and the
40:38relationship building aspect and on the
40:40back end like the system is doing
40:41the sort of
40:44automatic data entry the
40:46filling of the forms all these things
40:47it's like taking care of all the stuff
40:49on the back end and like the agent's not
40:51typing anything they're not doing any
40:52data entry they're not doing any
40:53summarization they're not looking up the
40:55customer record like all that stuff is
40:57like handled by the machine and it's
40:58handled in a way that the human can just
41:01like focus on how do I build a good
41:03relationship and resolve this customer's
41:04problem or like help finding the right
41:06product for their solution and like
41:08that's naturally what humans are great
41:10at which is that sort of empathy and
41:11connection and sort of relationship
41:13building and I think that's what's going
41:14to be we'll have a much less regimented
41:16contact center like much less sort of
41:18structured and like these are your times
41:20you do after call work this is your
41:21average handle time goal these are the
41:23things it'll be just much more focused
41:24on hey how do we use this Center as a
41:27proactive way to strategically build
41:28relationships with key customers and how
41:30do we build those relationships with
41:32with our folks and like take away all
41:34the mundane busy work from it and then
41:36for the stuff that people want like
41:37frictionless experiences like there will
41:39be things out there don't want to
41:40interact with the human about at all and
41:42like those are like we can provide those
41:43in a fully digital sort of frictionless
41:45manner that they can deal with if they
41:46want to deal with the bot for those like
41:48strategic things that they want to
41:49interact they want to trust someone and
41:50have an interaction with someone for
41:51like a critical decision and like feel
41:53like that they can trust something
41:54that's when we have humans available and
41:56like instead of having those humans tied
41:57up in busy work and mundane work we'll
41:59have those humans tied up in how they
42:00actually can help the customer and
42:02they're not measuring their average
42:03handle time and they're not measuring
42:04like like I have to get to the next call
42:06I have to fill out those notes for this
42:07call they're able to focus really
42:09on the customer interaction that's a very
42:10nice picture that's painted and in that
42:12world it's it's almost as if the
42:15customer service agent is actually like
42:17a strategic role right it's not just
42:19like the the day-to-day let me plug in
42:21plug out let me solve this customer's
42:22request it's almost like like an account
42:24manager or something like that yeah and
42:26I think that's where it's headed because
42:27once you have systems that handle the
42:31like reactive things and you move to
42:33like sort of becoming strategic and
42:35proactive it's like how do I proactively
42:37build businesses like when I build
42:38relationships I'm proactively reaching
42:40out to build relationships with large
42:41Enterprise customers right and that's
42:43like a very strategic use of my time to
42:45like go spend time with them and like
42:47sort of build those relationships I
42:48think the same concept applies to like
42:50even if you whether you have like a
42:52hundred Enterprise customers or thousand
42:54Enterprise customers or like 100 million
42:55consumer customers it's like those folks
42:58are people at the end of the day and
42:59like building relationships with them is
43:00just as strategic as it is
43:03with your large number of
43:05Enterprise customers and so now
43:07technology can enable you to build those
43:08kinds of relationships in a very human
43:10way at scale yeah and on the customer
43:12side I could see how it could actually
43:13be you know you could even say fun or
43:16entertaining with the integration of
43:17Technology imagine that you know you're
43:19you're talking to a bot and not only
43:21does it serve your problem but it's also
43:23in the voice of like your favorite
43:24cartoon character and it's telling you
43:27answers in a way that you're actually
43:28like following a story or something like
43:30that and actually because
43:32they've done all this back-end research
43:34based on these customer conversations
43:35that actually you leave the conversation
43:37with a better deal than you had before
43:39and you didn't have to spend three hours
43:41on the phone to get there like I
43:42actually as we're talking about this I'm
43:44like wow you know you go from as you
43:46said a negative NPS industry you can
43:49actually imagine that these things could
43:51be fun and frictionless yep negative to
43:54positive negative to positive what a
43:57great place to end off ai's rapid
43:59evolution is constantly challenging
44:01companies to stay ahead and navigating
44:03the implementation of AI is bringing up
44:06issues of personalization design cost
44:08and privacy to center stage a big thanks
44:11to Hex Sourcegraph and Cresta for
44:13sharing their insights and experiences
44:14in this ever-changing landscape and if
44:17you'd like to learn more about these
44:18companies or what they're doing with AI
44:20links can be found in the show notes
44:22where you can also find links to a bunch
44:24of our recent AI coverage
44:27thank you so much for listening and
44:28we'll see you next time
44:32thanks for listening to the a16z podcast
44:34if you like this episode don't forget to
44:36subscribe here on YouTube to get our
44:37exclusive video content we'll see you