Hi everyone, welcome to the a16z podcast. I'm Sonal, and today we have a podcast on the heels of Apple's Worldwide Developers Conference announcements yesterday, talking about some of the announcements and, more broadly, some of the tech trends they signify. Hosting that conversation, and kicking things off, we have Kyle Russell moderating.

So yesterday Apple showed off a bunch of new developments across their several different operating systems: iOS on your phone and iPad; macOS, renamed from OS X, on desktop and laptop computers; and watchOS. Looking across that spectrum of software, there were a couple of big themes that we saw. There's this kind of platformization of all the apps that Apple makes available on iOS. There was also a lot of focus on AI, whether it was identifying faces of people you know in photos from your photo library, or suggesting which emoji to use when you're chatting with your friends. And there was another thing, related to how they bring software and interface elements between their platforms: a kind of widgetification of different interface elements, where parts of the lock screen look like the widgets you use within apps, which look like the suggestions that Siri brings up. It's making the interface work in multiple contexts while keeping it all familiar no matter what angle you're approaching a task from.

To jump into all three of those things, I'm joined by Frank and Benedict, experts on AI and mobile respectively. So Benedict, what did you think of this platformization of Apple's apps? Frankly, it sounded a lot like what Facebook and Google have been showing off: taking apps that people use every day and introducing ways for developers to become part of that experience.
Yes, I think there are a bunch of different ways to think about this. First of all, one should remind people that most iOS users actually use Apple Maps, and almost all iOS users use iMessage a lot. So even though everyone in the Valley is using Google Maps, Apple Maps is kind of a big deal. And it's interesting that they've turned Maps, particularly, into a platform almost before Google has, really. That was actually something missing from Google I/O: there wasn't much sense of developers being able to put their own layers into Google Maps, whereas here OpenTable or Hotels.com or Uber can integrate directly into Maps and start making suggestions or providing services embedded within it. And then the same thing in Messages, the same thing in Siri.

I think there are two ways to look at this. One is that this is Apple creating new entry points and new ways you can interact and engage and get people to use your app. Another way to look at it is that it's another thing Apple's making developers do, and it's not clear whether all developers will go for it. There was clearly huge pent-up demand for keyboards and app extensions; whether there's huge pent-up demand for people to move into messaging applications, particularly for people who aren't just doing GIFs, whether there's pent-up demand for OpenTable to do a messaging plugin, is kind of unclear.
I think there's a third way to think about this, which is to contrast it with Google. Take the use case: I'm trying to arrange where to have dinner with somebody using a messaging app. Google's version would be Google watching you, and it says, hey, you should go to this restaurant. Apple's version is that you have the OpenTable plugin in Messages, you've got the OpenTable app on your phone, so you just have the OpenTable icon in Messages, and that's suggesting restaurants. So there's an interesting split here: Google feels like it can subsume all the world's knowledge into Google and then tell you what you need, whereas Apple is pursuing more of a third-party, developer-centric focus. I think that manifests in a lot of the ways AI is being used, which we'll talk about later. A lot of the way Apple is deploying AI, as opposed to the way Google is applying it, feels like Google wants to be the all-knowing oracle: you can ask it something and it will just know the answer, it will even suggest it before you ask. Whereas Apple is more like, I don't know, the servant that follows you around and hands you things as you need them, without knowing that you needed them. It's a slightly different approach to what kinds of problems you're trying to use AI to solve. And I think the platformization is interesting again because it's kind of a bottom-up approach, as opposed to the Google approach, which is a top-down approach to addressing these kinds of problems.
Yeah, one of the ways to think about it is: how do you find applications and content? We've gone from the Start button on your desktop OS controlling that (you hit the Start button and applications show up) to mobile, where your home screen controls it: whatever set of icons appears when I press the home button on my iPhone, that's how I find content. And then what Apple is trying to enable is a third way, where Siri is the front end: I can just do a natural-language query and it'll get me to the right app and then shoot it the parameters. Apple has always been very consistent, because it comes from this desktop OS world: there's a way somebody discovers the set of things you can do, and then there's a set of providers who provide that. We used to call them application providers, then we called them mobile providers, and now we'll call them chat providers: I'm inside iMessage, and they're on the other side of this conversation.
So, comparing and contrasting approaches from these big incumbents, look at Apple's focus on privacy. They talked about how they want to enable you to see things like faces and locations pulled from across your photo library, to be able to ask Siri, hey, show me photos from last summer at the beach that I took, whether those are in Photos or, with Siri opened up to other applications, those same photos in Instagram. Apple for the last couple of years has been heavily pushing this emphasis on privacy, in addition to getting smarter all the time. And when you look at the amount of data you have to look at in order to get those interesting insights, it seems like those two goals are pulling against each other: you want to be able to look at everyone's data and pick it apart for all the common features you might learn from it. But what Apple presented today hints that they think they've found a way to do that without invading your privacy, without saying, we know that Kyle was at this beach on this day, which is kind of creepy.

Right. Apple and Google couldn't be more different on this, and Apple loves to emphasize the difference. Apple says: we make money when you buy iPhones, so we don't have to monetize you in any other way. We don't have to look in your email, we don't have to look at your search history, we don't have to look at your pictures; we don't need to look at any of those things. But it turns out that the way artificial intelligence is working, in particular there's a technique of AI called deep learning, the way deep learning algorithms work is you actually do need a lot of data. You need to look at people's photos to recognize the faces in them, and to tell whether you're at a beach or a work offsite or in a forest. So you need a ton of data. What Apple announced is that they're using this very interesting computer science technique called differential privacy, and the idea is that you can collect the data in a way that mathematically guarantees the privacy of the individuals in it. In other words, they're not just saying, we promise not to mix your user ID in with this data; they're trying to do something stronger, where there are techniques you can use to guarantee a separation between the personal identification and the data itself, so that they can learn from my photos without knowing that I took them. Now, there's a trade-off between how much privacy you can guarantee and the accuracy of the predictions. So it'll be interesting to see if they can get to predictions as accurate as something like Google Photos, which doesn't use this technique.
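The flavor of guarantee being described can be illustrated with randomized response, the textbook mechanism that differential privacy generalizes. This is a generic sketch, not Apple's actual implementation (which they have not published): each report is randomized on the device, so any individual answer is plausibly deniable, yet the server can still estimate the aggregate.

```python
import random

def randomized_response(truth: bool) -> bool:
    """With probability 1/2 report the truth; otherwise report a coin flip.

    No single report can be trusted, which is what protects the individual.
    """
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_fraction(reports) -> float:
    """Invert the noise: P(report True) = 0.5*p + 0.25, so p = 2*(mean - 0.25)."""
    mean = sum(reports) / len(reports)
    return 2 * (mean - 0.25)

# Simulate 100,000 users, 30% of whom truly have some sensitive property.
random.seed(0)
truths = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(estimate_true_fraction(reports))  # close to 0.3, though no report is trustworthy alone
```

The trade-off mentioned above is visible here: more randomness means stronger deniability but noisier estimates, which is exactly why it is an open question whether this approach can match the accuracy of a system trained on raw data.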
So, another element of Apple's AI strategy: when they were talking about looking at your photos, they said that they do the learning on device. And if you look at some of the tools they released for developers, they included a set of APIs for basic neural network subroutines, for constructing the different layers of a neural network, essentially the tools they used to build the photo learning on top of the new version of iOS. What is the difference between learning in the cloud, which Google is going for given their massive investment in data and server farms, and learning on device? Are there certain pros to doing it on device, or why do you think Apple would be moving that way?

Yeah, it's not exactly clear yet what developers can do with it, so we'll need more details from them. But very generally speaking, the way deep learning works is that you set up a neural network, which has a bunch of nodes with connection strengths between them, and then you feed it a bunch of data. What the neural network is doing is adjusting the strengths of the connections between these nodes so that, collectively, the nodes can make predictions: you feed it a picture and it tells you it's got a dog in it, or a cat, or you're in a forest, or your friend Harry is in it. Essentially, what these neural networks are doing is making predictions based on input. Now, the way these networks get to high accuracy is that you need a ton of data: you get a ton of data, you train the network, and the result is a trained network that will make predictions. What is clear from the Apple neural network stuff is that you can take trained networks and then run the classifiers ("make inferences" is what they call it, if you read the developer documentation) on device. That makes a bunch of sense: once you have a trained network, in other words you've taken a million pictures and trained on them, the resulting trained network will run on device very, very quickly. So that's great. The benefit of doing it on device is that you don't need a round trip to the cloud.
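The split described here, heavy training offline and cheap inference on the device, can be sketched with a toy one-node network. This is plain Python for illustration, not Apple's actual neural network API: training repeatedly adjusts connection strengths against labeled examples, and the resulting weights are a tiny artifact that can make fast predictions anywhere.

```python
import math
import random

def predict(weights, bias, x):
    """Inference: a weighted sum pushed through a sigmoid, cheap enough for a phone."""
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=500, lr=0.5):
    """'Cloud-side' training: nudge each connection strength to shrink the error."""
    random.seed(1)
    weights = [random.uniform(-0.1, 0.1) for _ in range(2)]
    bias = 0.0
    for _ in range(epochs):
        for x, label in data:
            err = predict(weights, bias, x) - label
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
            bias -= lr * err
    return weights, bias

# Learn a trivial rule (output 1 only when both inputs are 1) from examples.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias = train(data)

# "On-device" side: the trained weights alone suffice; no round trip needed.
print(predict(weights, bias, [1, 1]) > 0.5)
print(predict(weights, bias, [0, 1]) > 0.5)
```

The point of the sketch is the asymmetry: `train` loops over the data thousands of times, while `predict` is a handful of multiplications, which is why shipping trained weights to the phone works so well.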
You make the point that that's how Google Translate works on the phone: Google Translate runs on the device when you have it on your smartphone, and that's effectively what they've done? Yeah. There's the data center side of it, which is analyzing and training the networks on all these language pairs, and the result of that training goes onto the device, so that you don't need to be connected to the internet for it to do the translation. The exact same approach will work for Apple. What will be interesting is to see whether they extend those APIs so some rudimentary training can actually take place on the phone itself. You can imagine: maybe I just want to train a neural network on my photos, and maybe that's enough of a data set to find pictures of my friends. Now, it might not be good at finding pictures of beaches and forests and that type of thing, but it'll probably do a pretty good job on the faces of the people I take pictures of all the time. So it'll be interesting to see if they take that next step, which is rudimentary training on the phone as well. The reason you wouldn't do that today is that maybe you don't have enough data, and it consumes battery; so let's leave the training to servers that are plugged in.
Then, to move to something a little bit less esoteric and back into the world of things we'll all be touching on our phones every day: there was something running throughout the presentation with this widgetified interface. Whether you're looking at the lock screen, where there are now interactive notifications that you respond to not just with touch but with 3D Touch (the pressure-sensing feature Apple showed off last year), or the widgets activated with that same mechanism, or the things that show up in your notification screen, a lot of interface elements seemed to derive their fundamental thinking from what we see on the Apple Watch. Did you think that was particularly interesting, or what did you find notable about it?

Well, I think there's always been this question of bundling and unbundling on the device. Before bots, people were talking a lot, around this time last year or maybe earlier, about how notifications were going to be like the new runtime, and everything would happen inside the notification, outside of the messaging app, in your notification feed. What Apple has done is create this format whereby an app can contain a little bit of content and a couple of buttons, and that can be in Siri, it can be in Messages, it can be in a pop-up if you press on an app, it can be in your widget screen, where you kind of build your swipe-left screen. It's kind of interesting; I'm not sure what the broader implications are. It feels more like Apple just said, let's tidy this up, because we've got all these different bubbles all over the place, and let's make them all work the same way. What I think it does point to, as I mentioned earlier, is this question of endlessly shifting interaction models. We have so many different ones; the point I would make is, if I were to say "I installed an app on my smartphone" in five years' time, I don't quite know what that would mean, and I think we're seeing some of that from Apple now.
I come at that question from an oblique angle. One thing we haven't mentioned so far is Apple enabling Apple Pay on the web, which raises an interesting question of where Apple chooses to put the aggregation layer. The obvious narrative would be: Apple wants everyone to make apps so that you can't leave the iPhone, because you've got all your great apps, and Apple Pay is a way of making apps on Apple devices better. But really, Apple Pay is about making the device better. So now you say: here you are, e-commerce vendor. There's a whole category of people for whom it really doesn't make much sense to make an app, because people aren't going to buy from them or use the service often enough, but now they have access to Apple Pay. So even for a retailer you buy from maybe every six months, it's still better to buy from them on an iPhone or an iPad, or indeed on a Mac, because now you can just use Apple Pay instead of typing in your credit card number.
I think there are a couple of ways to pull back from this. At every one of these events, and the same goes for Google I/O and Apple, there's stuff that's just been on the roadmap for the last two or three years; it makes sense, and it's nice to have if you've got the device. Some of that requires developer support; some of it will get developer support and some of it won't; some of it was on the other platform last year, and some of it will be on the other platform next year. That's just the general, continuous process of sustaining innovation on these platforms. And then there's the stuff that points to more fundamental shifts in how you use the device, and in how they want to do this stuff. I think the platformization we talked about earlier is interesting as an attempt to solve some of the problems that Google is solving with its pervasive, all-knowing AI, but by getting third-party developers to solve those problems instead. So instead of Google sucking in every cinema listing and being able to tell you which cinema is showing the film you want, Apple would get a cinema ticketing app to plug that information into Apple Maps, for example, or restaurants, or whatever it is. They're trying to use developers to compete with AI, you could almost argue. And then, as I said, the other thing is that Apple themselves are using AI to make the existing apps better, rather than to make fundamentally new experiences, whereas the argument would be that Google is trying to shift the whole experience up the stack and make a lot of those apps slightly irrelevant, in the way that Google web search subsumes some web pages' content into Google search. These are very different approaches to thinking about how you should be doing this stuff and where the aggregation points are. It's not quite clear where this is going: we're at a point where Apple and Google have kind of converged with each other, and now they're heading off in different directions, but it's not quite clear which directions they're going in.
Another example of that is the SiriKit integration. Rather than you just saying, "Get me tickets to Captain America at 7 p.m. tonight," and Google saying, okay, I'll figure out how to route that request, it's more like Alexa, where they're reliant on particular developers to introduce new skills. So what you say is, "Get me tickets to Captain America at 7:00 from Fandango." Yeah, exactly: it routes that request through a developer who's been thinking about those problems a little bit more, who already has an app that solves that problem.
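The skills-style routing described here can be sketched as a simple registry: apps declare which intents they handle, and the assistant dispatches a parsed request to the provider the user named. This is a toy illustration, not the actual SiriKit API, and the intent and provider names are made up for the example.

```python
# Toy model of skills-style intent routing.
handlers = {}

def register(provider, intent, handler):
    """A developer's app declares which intent it can fulfill."""
    handlers[(provider, intent)] = handler

def dispatch(provider, intent, **params):
    """The assistant routes the request to a provider instead of answering itself."""
    handler = handlers.get((provider, intent))
    if handler is None:
        return f"No app handles {intent!r} via {provider!r}"
    return handler(**params)

# A hypothetical ticketing app registers a "book_tickets" skill.
register("Fandango", "book_tickets",
         lambda movie, time: f"Booked {movie} at {time} via Fandango")

# "Get me tickets to Captain America at 7:00 from Fandango"
print(dispatch("Fandango", "book_tickets", movie="Captain America", time="7:00"))
# prints "Booked Captain America at 7:00 via Fandango"
```

The contrast with the Google model is that the assistant here needs no knowledge of cinemas itself, only a registry of providers, which is exactly the developer-centric trade-off discussed above.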
Yeah, I've been thinking about the deployment phase of AI, so to speak, and it strikes me that there's an old saying in computer science that a computer should never ask a question it should be able to work out the answer to. Clearly that's one way of thinking about everything Google showed at Google I/O. It's also a way of thinking about a lot of these small, interesting, useful little bits of AI scattered all the way through iOS, but they're trying to answer completely different kinds of questions. The best Apple example is the thing you mentioned earlier: you sign into your Mac by walking up to it wearing an Apple Watch, so the computer doesn't have to ask you for a password anymore. It doesn't have to ask, is that Kyle? It can just work it out, because the watch is there. Whereas Google would say: we should tell Kyle that he should leave for home half an hour early today, because he's meeting a friend in Berkeley and there's bad traffic, so we'll pop a message up on his device that says, hey, you should leave early. Those are very different kinds of things, but they're both things the computer should be able to work out; they're just very different, philosophically, in terms of how you look at the user.
One last thing I'd like to bring up: something you tweeted as the news was unfolding was how nothing came up specifically for the iPad when it comes to iOS 10. Do you think that's because the last couple of point updates to iOS 9 focused on the iPad Pro, or does that say something about where Apple is focusing its attention?

I think there are two answers to this. One is that Apple has kind of worked its way through a reboot of the iPad. The iPad Pro really changes the experience of using it; the keyboard changes the experience of using it; if you have a use case for the Pencil, that's a big deal (I don't, but the Pencil works really well); and the split-screen multitasking makes a big difference. So they've rebooted the experience of having the thing. And then the pricing: the introduction of new subscription pricing options last week potentially changes the economics for a developer. Now, if you're going to make a really interesting productivity app, instead of having to charge $50 for it, or give it away for free (and those are basically your only options), or charge $5 for it and hope you get a million people using it, you can make it a dollar a month or five dollars a month, or $5 a month and $15 a month for different tiers of functionality. So there's much more scope for people to invest in creating more sophisticated, more powerful applications, because you have this scope for recurring revenue. I think all of that was actually iPad-focused, but then they didn't do anything else. For example, one of the remaining pain points is opening files: if you want to open something from Box or iCloud or Dropbox in PowerPoint or Excel or something, you have to jump through about five hoops. That's the sort of stuff they could have cleaned up and haven't, which is slightly perplexing, because there clearly is a very strong story in Apple's mind about the iPad as a route to replacing PCs, and they've done a bunch of it, but they didn't announce anything else other than the pricing. And yeah, that was a surprise to me.
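The economics just described can be put in back-of-envelope form. All the numbers here are hypothetical (the price points come from the conversation, and the monthly churn rate is an assumption): the point is simply that a modest subscription can out-earn a large one-time price if enough users stick around.

```python
def one_time_revenue(price, users):
    """Paid-up-front model: each user pays exactly once."""
    return price * users

def subscription_revenue(monthly_price, users, monthly_churn, months):
    """Recurring model: each month, the surviving subscribers pay again."""
    total, active = 0.0, float(users)
    for _ in range(months):
        total += active * monthly_price
        active *= 1.0 - monthly_churn  # some fraction of subscribers cancels
    return total

users = 10_000
upfront = one_time_revenue(50, users)                 # $50 once, the old model
recurring = subscription_revenue(5, users, 0.05, 36)  # $5/month, assumed 5% churn, 3 years
print(upfront, round(recurring))
```

Under these assumed numbers the subscription earns more over three years, which is the "scope for recurring revenue" that could justify investing in more sophisticated iPad productivity apps.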
A problem with the platformization discussion is that it's all these little things, like how the Phone app can now be taken over by WhatsApp or Citrix or whoever, so it looks like the native Phone app but it also has additional features from those applications. Well, on that point, the thing I'll call out is iMessage: they haven't really touched it since its launch. It's just been a straight-up replacement for SMS that was free to the Apple community. Yesterday, I think you saw investment in much, much richer data types in iMessage: you can send videos and they'll play, you can do handwriting, you can take over the entire screen, you can send messages with invisible ink so that there can be a surprise. By opening a photo, and then you have to swipe it to actually see it? Yeah, exactly. If you think about where this takes us, it takes us to WeChat, right? WeChat is the messaging interface to lots and lots of useful applications, and to write useful applications you need rich data types. Some of them will be text, and you can just have a conversation; and with Siri in the mix you can either type or speak, and the answers will come back. But for some of the interactions, you want rich data types right there: instead of just playing a video, how about a Buy button or a Book It Now button? So I think we're on the first steps toward that, where richer and richer data types are supported in iMessage, and it's only a matter of time before you get a full-on application in there.
20:59interesting historical content a kind of
21:02one of my jobs here sort of court
21:04historian so you might think that
21:06artificial intelligence in Siri just
21:08sort of burst onto the scene recently
21:10and that like people haven't been
21:12thinking about this for a long time so
21:14on the contrary Apple in particular has
21:16been thinking about this verb a long
21:17long time and so to see how Apple was
21:20thinking about this in the late 80s go
21:22youtube a video called knowledge
21:24navigator which came out in 1987 and
21:27you'll see apples take on what is the
21:30future of knowledge work and voice
21:33assisted intelligence and you'll see a
21:36very very compelling demo of things that
21:38we still can't even do today and they've
21:40been thinking about this since 1987 so
21:42just go look that up and I think you'll
21:44enjoy seeing the historical flavor of
21:46this you got a sense of Apple sort of
You got a sense of Apple saying, who are these guys who say we can't do AI? But you also got a sense that they didn't really even want to say "AI." It's a little bit like when they do the new iPhone and rattle off some buzzwords about how the chip in the camera works: they do it because it kind of sounds cool, but that's not really what they're selling; they're just saying it's a better camera now. In the same sense, every now and then they mentioned some AI buzzwords, but they were saying them as marketing. They weren't saying what Google was saying, which is: we have built a fundamentally new and amazingly powerful AI system that uses these techniques, and it can do these new things that were never possible before. No, they weren't doing that at all. They didn't go up and talk about how quickly they can recognize photographs or how many photographs they've got. They didn't even really do a very compelling image-recognition demo. I mean, they sort of said they can do it, but they didn't do the convoluted thing where you say, you can find your son wearing the green boots in the car, standing next to another child who's got a teddy bear, the kind of stuff that Google likes to do. It's a little bit like the common line that Apple thinks about the cloud as dumb storage and Google thinks about the phone as dumb glass: you still get this sense from Apple that the AI is just something that enables some new user experience they want to build, rather than: oh my god, now it's possible to recognize images, okay, what can we do with that? They sort of start from the feature they want and build the AI to support it. Of course they were going to build some of this stuff; they weren't going to have no machine learning. But it's still not entirely clear how much that ripples through the entire organization and through the entire product, whereas Google is clearly all in on: this changes everything about how we do everything inside Google. Yeah, Apple is very much more toe-in-the-water: let's see if we can deliver a better user experience, or better Continuity features, or better photo features, through AI, as opposed to: AI is absolutely the next most important platform and everybody should get on board. They're not there yet.
Very much so. It felt like Apple's MVP approach to products. With the Apple Watch, it was essentially notifications you can interact with; for this, it was: you're going to get some insights, like Photos will just automatically sort out photos of you and your friends, and beyond that, not over-promising, not being too aggressive, just saying: see, we've baked in some stuff that can actually provide meaningful improvements to user experience. And then, over the next two, three, four years, we'll see the real version.

Apple have announced all this AI stuff and said, see, we can do something, but they've also said that they want to do it with their hands tied behind their back, and they've given some suggestions as to why that might be okay. But one kind of has to presume skepticism, relative to the biggest machine learning company on earth, Google, which isn't doing this with one hand tied behind its back. So it's not quite clear how well that's going to work in future.
Thanks again to Frank and Benedict for joining to talk about these big ideas coming out of not only Apple but also Microsoft, Facebook, and Google. Clearly all of Silicon Valley is focused on these issues: AI, and reaching people where they are on platforms like messaging and maps. And now that the spectrum is complete, with Facebook across all platforms, Google with Android, and now Apple on iOS and its other software platforms, it's clear that these trends are only going to continue going forward. And with that, thanks for listening today.