topic_shift
bool
2 classes
utterance
stringlengths
1
7.9k
session_id
stringlengths
7
14
false
Yep.
QMSum_134
false
But I mean , I guess I guess the thing is , uh , this is another , smaller , case of reasoning in the case of an uncertainty , which makes me think Bayes - net should be the way to solve these things. So if you had If for every construction ,
QMSum_134
false
Oh !
QMSum_134
false
right ? you could say , " Well , there Here 's the Where - Is construction. " And for the Where - Is construction , we know we need to l look at this node , that merges these three things together
QMSum_134
false
Mm - hmm.
QMSum_134
false
as for th to decide the response. And since we have a finite number of constructions that we can deal with , we could have a finite number of nodes.
QMSum_134
false
OK. Mm - hmm.
QMSum_134
false
Say , if we had to y deal with arbitrary language , it wouldn't make any sense to do that , because there 'd be no way to generate the nodes for every possible sentence.
QMSum_134
false
Mm - hmm.
QMSum_134
false
But since we can only deal with a finite amount of stuff
QMSum_134
false
So , basically , the idea is to f to feed the output of that belief - net into another belief - net.
QMSum_134
false
Yeah , so basically take these three things and then put them into another belief - net.
QMSum_134
false
But , why why why only those three ? Why not the whol
QMSum_134
false
Well , I mean , d For the Where - Is question. So we 'd have a node for the Where - Is question.
QMSum_134
false
Yeah. But we believe that all the decision nodes are can be relevant for the Where - Is , and the Where How - do - I - get - to or the Tell - me - something - about.
QMSum_134
false
You can come in if you want.
QMSum_134
false
Yes , it is allowed.
QMSum_134
false
As long as y you 're not wearing your h your h headphones. Well , I do I See , I don't know if this is a good idea or not. I 'm just throwing it out. But uh , it seems like we could have I mea or uh we could put all of the all of the r information that could also be relevant into the Where - Is node answer
QMSum_134
false
Mm - hmm. Yep.
QMSum_134
false
node thing stuff. And uh
QMSum_134
false
OK.
QMSum_134
false
I mean Let 's not forget we 're gonna get some very strong input from these sub dis from these discourse things , right ? So. " Tell me the location of X. " Nuh ? Or " Where is X located at ? "
QMSum_134
false
We u
QMSum_134
false
Nuh ?
QMSum_134
false
Yeah , I know , but the Bayes - net would be able to The weights on the on the nodes in the Bayes - net would be able to do all that ,
QMSum_134
true
Mm - hmm.
QMSum_134
false
wouldn't it ? Here 's a k Oh ! Oh , I 'll wait until you 're plugged in. Oh , don't sit there. Sit here. You know how you don't like that one. It 's OK. That 's the weird one. That 's the one that 's painful. That hurts. It hurts so bad. I 'm h I 'm happy that they 're recording that. That headphone. The headphone that you have to put on backwards , with the little little thing and the little little foam block on it ? It 's a painful , painful microphone.
QMSum_134
false
I think it 's th called " the Crown ".
QMSum_134
false
The crown ?
QMSum_134
false
What ?
QMSum_134
false
Yeah , versus " the Sony ".
QMSum_134
false
The Crown ? Is that the actual name ? OK.
QMSum_134
false
Mm - hmm. The manufacturer.
QMSum_134
false
I don't see a manufacturer on it.
QMSum_134
false
You w
QMSum_134
false
Oh , wait , here it is. h This thingy. Yeah , it 's " The Crown ". The crown of pain !
QMSum_134
false
Yes.
QMSum_134
false
You 're on - line ?
QMSum_134
false
Are you are your mike o Is your mike on ?
QMSum_134
false
Indeed.
QMSum_134
false
OK. So you 've been working with these guys ? You know what 's going on ?
QMSum_134
false
Yes , I have. And , I do. Yeah , alright. s So where are we ?
QMSum_134
false
Excellent !
QMSum_134
false
We 're discussing this.
QMSum_134
false
I don't think it can handle French , but anyway.
QMSum_134
false
So. Assume we have something coming in. A person says , " Where is X ? " , and we get a certain We have a Situation vector and a User vector and everything is fine ? An - an and and our and our
QMSum_134
false
Did you just sti Did you just stick the m the the the microphone actually in the tea ?
QMSum_134
false
No.
QMSum_134
false
And , um ,
QMSum_134
false
I 'm not drinking tea. What are you talking about ?
QMSum_134
false
Oh , yeah. Sorry.
QMSum_134
false
let 's just assume our Bayes - net just has three decision nodes for the time being. These three , he wants to know something about it , he wants to know where it is , he wants to go there.
QMSum_134
false
In terms of , these would be wha how we would answer the question Where - Is , right ? We u This is i That 's what you s it seemed like , explained it to me earlier
QMSum_134
false
Yeah , but , mmm.
QMSum_134
false
w We we 're we wanna know how to answer the question " Where is X ? "
QMSum_134
false
Yeah. No , I can I can do the Timing node in here , too , and say " OK. "
QMSum_134
false
Well , yeah , but in the s uh , let 's just deal with the s the simple case of we 're not worrying about timing or anything. We just want to know how we should answer " Where is X ? "
QMSum_134
false
OK. And , um , OK , and , Go - there has two values , right ? , Go - there and not - Go - there. Let 's assume those are the posterior probabilities of that.
QMSum_134
false
Mm - hmm.
QMSum_134
false
Info - on has True or False and Location. So , he wants to know something about it , and he wants to know something he wants to know Where - it - is ,
QMSum_134
false
Excuse me.
QMSum_134
false
has these values. And , um ,
QMSum_134
false
Oh , I see why we can't do that.
QMSum_134
false
And , um , in this case we would probably all agree that he wants to go there. Our belief - net thinks he wants to go there ,
QMSum_134
false
Yeah.
QMSum_134
false
right ?
QMSum_134
false
Mm - hmm.
QMSum_134
false
In the , uh , whatever , if we have something like this here , and this like that and maybe here also some
QMSum_134
false
You should probably make them out of Yeah.
QMSum_134
false
something like that ,
QMSum_134
false
Well , it
QMSum_134
false
then we would guess , " Aha ! He , our belief - net , has s stronger beliefs that he wants to know where it is , than actually wants to go there. " Right ?
QMSum_134
false
That it Doesn't this assume , though , that they 're evenly weighted ?
QMSum_134
false
True.
QMSum_134
false
Like I guess they are evenly weighted.
QMSum_134
false
The different decision nodes , you mean ?
QMSum_134
false
Yeah , the Go - there , the Info - on , and the Location ?
QMSum_134
false
Well , d yeah , this is making the assumption. Yes.
QMSum_134
false
Like
QMSum_134
false
What do you mean by " differently weighted " ? They don't feed into anything really anymore.
QMSum_134
false
But I mean , why do we
QMSum_134
false
Or I jus
QMSum_134
false
If we trusted the Go - there node more th much more than we trusted the other ones , then we would conclude , even in this situation , that he wanted to go there.
QMSum_134
false
Le
QMSum_134
false
So , in that sense , we weight them equally right now.
QMSum_134
false
OK. Makes sense. Yeah. But
QMSum_134
false
So the But I guess the k the question that I was as er wondering or maybe Robert was proposing to me is How do we d make the decision on as to which one to listen to ?
QMSum_134
false
Yeah , so , the final d decision is the combination of these three. So again , it 's it 's some kind of , uh
QMSum_134
false
Bayes - net.
QMSum_134
false
Yeah , sure.
QMSum_134
false
OK so , then , the question i So then my question is t to you then , would be So is the only r reason we can make all these smaller Bayes - nets , because we know we can only deal with a finite set of constructions ? Cuz oth If we 're just taking arbitrary language in , we couldn't have a node for every possible question , you know ?
QMSum_134
false
A decision node for every possible question , you mean ?
QMSum_134
false
Well , I like , in the case of Yeah. In the ca Any piece of language , we wouldn't be able to answer it with this system , b if we just h Cuz we wouldn't have the correct node. Basically , w what you 're s proposing is a n Where - Is node , right ?
QMSum_134
false
Yeah.
QMSum_134
false
And and if we And if someone says , you know , uh , something in Mandarin to the system , we 'd - wouldn't know which node to look at to answer that question ,
QMSum_134
false
So is Yeah. Yeah.
QMSum_134
false
right ?
QMSum_134
false
Mmm ?
QMSum_134
false
So , but but if we have a finite What ?
QMSum_134
false
I don't see your point. What what what I am thinking , or what we 're about to propose here is we 're always gonna get the whole list of values and their posterior probabilities. And now we need an expert system or belief - net or something that interprets that , that looks at all the values and says , " The winner is Timing. Now , go there. " " Uh , go there , Timing , now. " Or , " The winner is Info - on , Function - Off. " So , he wants to know something about it , and what it does. Nuh ? Uh , regardless of of of the input. Wh - Regardle
QMSum_134