You've criticized the art project that is Sophia the robot. What that project essentially does is use our natural inclination to anthropomorphize things that look human. Do you think that could be used by AI systems, like in the movie Her? So, do you think a body is needed to create a feeling of intelligence?

Well, if Sophia were just an art piece, I would have no problem with it, but it's presented as something else.

Let me read that comment real quick: "If the creators of Sophia could change something about their marketing or behavior in general, what would it be?"

About just about everything.

Here's a tough question. I agree with you: the general public feels that Sophia can do way more than she actually can.

That's right.

And the people who created Sophia are not honestly, publicly communicating that, trying to teach the public.

Right.

But here's the tough question: don't you think the same thing happens when scientists in industry and research take advantage of that same misunderstanding in the public, when they create AI companies or publish stuff?

Some companies, yes. But in general, there's no desire to delude, no desire to over-claim what something has done. You publish a paper on AI that has some result on ImageNet; it's pretty clear. I mean, it's not even interesting anymore. I don't think there is that problem; the reviewers are generally not very forgiving of unsupported claims of this type. But there are certainly quite a few startups that have had a huge amount of hype around this, and I find that extremely damaging, and I've been calling it out when I've seen it.

But to go back to your original question, about the necessity of embodiment: I don't think embodiment is necessary. I think grounding is necessary. I don't think we're going to get machines that really understand language without
some level of grounding in the real world. And it's not clear to me that language is a high-enough-bandwidth medium to communicate how the real world works.

Can you start to unpack what grounding means?

So grounding means this. There is a classic problem of common-sense reasoning, the Winograd schema. I tell you, "The trophy doesn't fit in the suitcase because it is too big," or "The trophy doesn't fit in the suitcase because it is too small." The "it" in the first case refers to the trophy, and in the second case to the suitcase. And the reason you can figure this out is that you know what a trophy and a suitcase are: you know one is supposed to fit in the other, you know the notion of size, and that a big object doesn't fit in a small object, and things like that. So you have this knowledge of how the world works, of geometry and so on. I don't believe you can learn everything about the world by just being told in language how the world works. I think you need some low-level perception of the world, be it visual, touch, whatever, but some higher-bandwidth perception of the world.

So by reading all the world's text, you still may not have enough information?

That's right. There are a lot of things that just will never appear in text, and that you can't really infer. So I think common sense will emerge from, certainly, a lot of language interaction, but also from watching videos, or perhaps even from interacting in virtual environments, and possibly from robots interacting in the real world. But I don't believe that last one is absolutely necessary. I think there is a need for some grounding, but the final product doesn't necessarily need to be embodied. It just needs to have an awareness, a grounding. But it needs to know how the world works to not be frustrating to talk to. And
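The Winograd pair above can be made concrete with a toy sketch (my illustration, not from the conversation). The "world knowledge" that humans use implicitly is hand-coded here: in "X fits in Y", X is the contents and Y the container, and "too big" can only explain a failure to fit if it applies to the contents, while "too small" can only explain it if it applies to the container.

```python
# Toy resolver for: "The trophy doesn't fit in the suitcase because it is too <adj>."
# Hand-coded world knowledge: in "X fits in Y", X is the contents, Y the container.
ROLES = {"contents": "trophy", "container": "suitcase"}

def resolve_it(adjective):
    """Resolve the pronoun 'it' using geometric common sense:
    a thing fails to fit because the contents are too big,
    or because the container is too small."""
    if adjective == "big":
        return ROLES["contents"]
    elif adjective == "small":
        return ROLES["container"]
    raise ValueError("adjective not covered by this toy rule")

print(resolve_it("big"))    # -> trophy
print(resolve_it("small"))  # -> suitcase
```

The point of the schema is precisely that this disambiguation rests on knowledge of containment and size that is almost never spelled out in text, which is why it is used as a probe for grounded common sense rather than surface statistics.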
you talked about emotions being important. That's a whole other topic.

Well, I talked about this: the basal ganglia as the thing that calculates your level of contentment or discontentment. And then there is this other module that tries to predict whether you're going to be content or not. That's the source of some emotions. Fear, for example, is an anticipation of bad things that can happen to you: you have this inkling that there is some chance that something really bad is going to happen to you, and that creates fear. When you know for sure that something bad is going to happen to you, you kind of give up, right? It's not fear anymore. It's the uncertainty that creates fear. So the punchline is: we're not going to have autonomous intelligence without emotions.

Whatever the heck emotions are. So you mentioned very practical things like fear, but there's a lot of other mess around it.

But they are kind of the results of drives.

Yeah, there's deeper biological stuff going on. I've talked to a few folks on this; there's fascinating stuff that ultimately connects to our joy, to our brain.
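The claim that certainty of a bad outcome kills fear while uncertainty maximizes it can be caricatured in a one-line toy model (my illustration, not a model from the conversation): treat "fear" as peaking when the predicted probability p of a bad outcome is maximally uncertain, and vanishing when the outcome is certain either way.

```python
def fear(p):
    """Toy 'fear' signal for a predicted bad-outcome probability p in [0, 1].
    Zero at p = 0 (safe) and p = 1 (certain doom, so you give up);
    maximal at p = 0.5, where uncertainty is highest."""
    return 4 * p * (1 - p)

print(fear(0.0))  # -> 0.0  (no threat, no fear)
print(fear(0.5))  # -> 1.0  (maximal uncertainty, maximal fear)
print(fear(1.0))  # -> 0.0  (certainty: resignation, not fear)
```

This is just an inverted parabola chosen to match the qualitative shape described, not a claim about how the basal ganglia or any real anticipatory module computes anything.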