topic_shift: bool (2 classes)
utterance: string (lengths 1 to 7.9k)
session_id: string (lengths 7 to 14)
false
Hmm. OK. Y you 're you were saying before ?
QMSum_86
false
Uh. Yeah. So , this noise , um Yeah. The MSG Um. Mmm. There is something perhaps , I could spend some days to look at this thing , cuz it seems that when we train networks on let 's say , on TIMIT with MSG features , they they look as good as networks trained on PLP. But , um , when they are used on on the SpeechDat - Car data , it 's not the case oh , well. The MSG features are much worse , and so maybe they 're , um , less more sensitive to different recording conditions , or Shou
QMSum_86
false
Shouldn't be. They should be less so.
QMSum_86
false
Yeah. But
QMSum_86
false
R right ?
QMSum_86
false
Mmm.
QMSum_86
false
Wh - ? But let me ask you this. What what 's the , um ? Do you kno recall if the insertions were were higher with MSG ?
QMSum_86
false
I don't know. I cannot tell. But It 's it the the error rate is higher. So , I don
QMSum_86
false
Yeah. But you should always look at insertions , deletions , and substitutions.
QMSum_86
false
Yeah. Mm - hmm.
QMSum_86
false
So
QMSum_86
false
Mm - hmm.
QMSum_86
false
so , uh MSG is very , very dif Eh , PLP is very much like mel cepstrum. MSG is very different from both of them.
QMSum_86
false
Mm - hmm.
QMSum_86
false
So , if it 's very different , then this is the sort of thing I mean I 'm really glad Andreas brought this point up. I sort of had forgotten to discuss it. Um. You always have to look at how this uh , these adjustments , uh , affect things. And even though we 're not allowed to do that , again we maybe could reflect that back to our use of the features.
QMSum_86
false
Mm - hmm.
QMSum_86
false
So if it if in fact , uh The problem might be that the range of the MSG features is quite different than the range of the PLP or mel cepstrum.
QMSum_86
false
Mm - hmm. Mm - hmm.
QMSum_86
false
And you might wanna change that.
QMSum_86
false
But Yeah. But , it 's d it 's after Well , it 's tandem features , so Mmm.
QMSum_86
false
Yeah.
QMSum_86
false
Yeah. We we have estimation of post posteriors with PLP and with MSG as input ,
QMSum_86
false
Yeah.
QMSum_86
false
so I don Well. I don't know.
QMSum_86
false
That means they 're between zero and one.
QMSum_86
false
Mm - hmm.
QMSum_86
false
But i it it it it doesn't necessarily You know , they could be , um Do - doesn't tell you what the variance of the things is.
QMSum_86
false
Mmm. Mm - hmm.
QMSum_86
false
Right ? Cuz if you 're taking the log of these things , it could be , uh Knowing what the sum of the probabilities are , doesn't tell you what the sum of the logs are.
QMSum_86
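A minimal numeric illustration of the point made above: two posterior vectors can both sum to one while their logs have very different sums and variances, so the softmax constraint says nothing about the scale of the log-domain values. The numbers below are made up purely for illustration.

```python
import math

# Two hypothetical posterior vectors over 4 classes; both sum to 1.0.
p_flat = [0.25, 0.25, 0.25, 0.25]   # high-entropy posteriors
p_peaky = [0.97, 0.01, 0.01, 0.01]  # low-entropy posteriors

for name, p in [("flat", p_flat), ("peaky", p_peaky)]:
    logs = [math.log(x) for x in p]
    mean = sum(logs) / len(logs)
    var = sum((l - mean) ** 2 for l in logs) / len(logs)
    # The probability sums match, but the log sums and variances do not.
    print(f"{name}: sum(p)={sum(p):.2f}  sum(log p)={sum(logs):.2f}  var(log p)={var:.2f}")
```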
false
Mm - hmm. Yeah.
QMSum_86
false
So.
QMSum_86
false
Yeah. So we should look at the likelihood , or or what ? Or well , at the log , perhaps , and
QMSum_86
false
Yeah. Yeah.
QMSum_86
false
Mm - hmm.
QMSum_86
false
Or what you know , what you 're uh the thing you 're actually looking at.
QMSum_86
false
Mm - hmm.
QMSum_86
false
So your your the values that are are actually being fed into HTK.
QMSum_86
false
Mm - hmm. But
QMSum_86
false
What do they look like ?
QMSum_86
false
No And so th the , uh for the tandem system , the values that come out of the net don't go through the sigmoid. Right ? They 're sort of the pre - nonlinearity values ?
QMSum_86
false
Yes.
QMSum_86
false
Right. So they 're kinda like log probabilities is what I was saying.
QMSum_86
false
And those OK. And tho that 's what goes into HTK ?
QMSum_86
false
Uh , almost. But then you actually do a KLT on them.
QMSum_86
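A minimal sketch of the step just described, reading the KLT as a projection onto the eigenvectors of the feature covariance matrix computed over a batch of frames. The array sizes and random inputs are placeholders, not the actual tandem configuration.

```python
import numpy as np

# Hypothetical pre-nonlinearity network outputs: 1000 frames x 56 units.
rng = np.random.default_rng(0)
linear_outputs = rng.normal(size=(1000, 56))

# KLT: project onto the eigenvectors of the frame-level covariance matrix.
centered = linear_outputs - linear_outputs.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]            # sort components by variance
tandem_features = centered @ eigvecs[:, order]

# These decorrelated values are what would then be handed to the HTK back end.
print(tandem_features.shape)  # (1000, 56)
```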
false
OK.
QMSum_86
false
Um. They aren't normalized after that , are they ?
QMSum_86
false
Mmm. No , they are not no.
QMSum_86
false
No. OK. So , um. Right. So the question is Yeah. Whatever they are at that point , um , are they something for which taking a square root or cube root or fourth root or something like that is is gonna be a good or a bad thing ? So.
QMSum_86
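A sketch of the kind of adjustment being floated here: apply a square, cube, or fourth root to hypothetical post-KLT values and see how each compresses their spread. The sign handling and the scale of the dummy features are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical post-KLT tandem values (can be negative, so compress by magnitude).
feats = rng.normal(scale=5.0, size=(1000, 56))

for root in (2, 3, 4):
    compressed = np.sign(feats) * np.abs(feats) ** (1.0 / root)
    # Root compression shrinks the dynamic range; the open question in the meeting
    # is whether that helps or hurts once the values reach HTK.
    print(f"root={root}: std before={feats.std():.2f}  after={compressed.std():.2f}")
```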
false
Mm - hmm.
QMSum_86
false
Uh , and that 's something that nothing nothing else after that is gonna Uh , things are gonna scale it Uh , you know , subtract things from it , scale it from it , but nothing will have that same effect. Um. So. Um. Anyway , eh
QMSum_86
false
Yeah. Cuz if if the log probs that are coming out of the MSG are really big , the standard insertion penalty is gonna have very little effect
QMSum_86
false
Well , the Right.
QMSum_86
false
compared to , you know , a smaller set of log probs.
QMSum_86
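A toy illustration of this point, assuming an HTK-style hypothesis score of summed acoustic log probabilities plus a fixed per-word insertion penalty; the numbers are invented. When the acoustic scores are numerically large, the same penalty barely moves the total.

```python
# Hypothetical per-word acoustic log-prob totals for one utterance.
acoustic_small_scale = [-12.0, -15.0, -11.0]          # e.g. modest-scale scores
acoustic_large_scale = [-1200.0, -1500.0, -1100.0]    # e.g. much larger-scale scores
insertion_penalty = -10.0  # fixed penalty charged per word in the hypothesis

def hypothesis_score(word_scores, penalty):
    # Total score = acoustic evidence + one penalty per hypothesized word.
    return sum(word_scores) + penalty * len(word_scores)

for name, scores in [("small scale", acoustic_small_scale),
                     ("large scale", acoustic_large_scale)]:
    base = sum(scores)
    penalized = hypothesis_score(scores, insertion_penalty)
    # The relative effect of the penalty shrinks as the acoustic scale grows.
    print(f"{name}: penalty changes the score by {100 * (penalized - base) / base:.1f}%")
```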
false
Yeah. No. Again you don't really look at that. It 's something that , and then it 's going through this transformation that 's probably pretty close to It 's , eh , whatever the KLT is doing. But it 's probably pretty close to what a a a discrete cosine transformation is doing.
QMSum_86
false
Yeah.
QMSum_86
false
But still it 's it 's not gonna probably radically change the scale of things. I would think. And , uh Yeah. It may be entirely off and and it may be at the very least it may be quite different for MSG than it is for mel cepstrum or PLP. So that would be So the first thing I 'd look at without adjusting anything would just be to go back to the experiment and look at the , uh , substitutions , insertions , and deletions. And if the if the , uh i if there 's a fairly large effect of the difference , say , uh , uh , the r ratio between insertions and deletions for the two cases then that would be , uh , an indicator that it might might be in that direction.
QMSum_86
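The diagnostic suggested here is simply to split the word error rate into its components. A minimal sketch of that bookkeeping via Levenshtein alignment (illustrative only, not the HTK scoring tool):

```python
def error_counts(ref, hyp):
    """Count substitutions, insertions, and deletions aligning hyp to ref."""
    # Standard dynamic-programming edit-distance table.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    # Backtrack to attribute each edit to S, I, or D.
    i, j, subs, ins, dels = len(ref), len(hyp), 0, 0, 0
    while i > 0 or j > 0:
        if i > 0 and j > 0 and d[i][j] == d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1]):
            subs += ref[i - 1] != hyp[j - 1]
            i, j = i - 1, j - 1
        elif i > 0 and d[i][j] == d[i - 1][j] + 1:
            dels += 1
            i -= 1
        else:
            ins += 1
            j -= 1
    return subs, ins, dels

# Hypothetical digit-string example: 0 substitutions, 2 insertions, 0 deletions.
print(error_counts("one two three".split(), "one oh two three four".split()))
```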
false
Mm - hmm. Mm - hmm. Yeah. But ,
QMSum_86
false
Anything else ?
QMSum_86
false
my my point was more that it it works sometimes and but sometimes it doesn't work.
QMSum_86
false
Yeah.
QMSum_86
false
So.
QMSum_86
false
Well.
QMSum_86
false
And it works on TI - digits and on SpeechDat - Car it doesn't work , and
QMSum_86
false
Yeah.
QMSum_86
false
Mm - hmm. Yeah. Well.
QMSum_86
false
But , you know , some problems are harder than others ,
QMSum_86
false
Mm - hmm. Yeah.
QMSum_86
false
and And , uh , sometimes , you know , there 's enough evidence for something to work and then it 's harder , it breaks. You know ,
QMSum_86
false
Mm - hmm.
QMSum_86
false
so it 's But it but , um , i it it could be that when you say it works maybe we could be doing much better , even in TI - digits. Right ?
QMSum_86
false
Yeah. Yeah , sure.
QMSum_86
false
So.
QMSum_86
false
Uh.
QMSum_86
false
Hmm ? Yeah.
QMSum_86
false
Yeah. Well , there is also the spectral subtraction , which , um I think maybe we should , uh , try to integrate it in in our system.
QMSum_86
false
Yeah.
QMSum_86
false
Mmm. Mm - hmm.
QMSum_86
false
Right.
QMSum_86
false
But ,
QMSum_86
false
I think that would involve to to mmm use a big a al already a big bunch of the system of Ericsson. Because he has spectral subtraction , then it 's followed by , um , other kind of processing that 's are dependent on the uh , if it 's speech or noi or silence.
QMSum_86
false
Mm - hmm.
QMSum_86
false
And there is this kind of spectral flattening after if it 's silence , and and s I I think it 's important , um , to reduce this musical noise and this this increase of variance during silence portions. So. Well. This was in this would involve to take almost everything from from the this proposal and and then just add some kind of on - line normalization in in the neural network. Mmm.
QMSum_86
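For reference, a minimal sketch of plain magnitude-domain spectral subtraction. The speech/silence-dependent processing and spectral flattening from the proposal being discussed are not included, and the parameters and frame counts below are assumptions.

```python
import numpy as np

def spectral_subtraction(frames, noise_estimate, over_subtraction=1.0, floor=0.01):
    """Subtract a noise magnitude estimate from each frame's magnitude spectrum.

    frames:         (n_frames, n_bins) magnitude spectra
    noise_estimate: (n_bins,) average noise magnitude, e.g. from leading silence
    """
    cleaned = frames - over_subtraction * noise_estimate
    # Flooring avoids negative magnitudes; a crude floor like this is one
    # source of the "musical noise" mentioned in the meeting.
    return np.maximum(cleaned, floor * frames)

# Toy usage with random values standing in for real FFT magnitudes.
rng = np.random.default_rng(2)
noisy = np.abs(rng.normal(size=(200, 129)))
noise = noisy[:20].mean(axis=0)          # assume the first 20 frames are silence
enhanced = spectral_subtraction(noisy, noise)
print(enhanced.shape)
```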
false
OK. Well , this 'll be , I think , something for discussion with Hynek next week.
QMSum_86
false
Yeah. Mm - hmm.
QMSum_86
true
Yeah. OK. Right. So. How are , uh , uh how are things going with what you 're doing ?
QMSum_86
false
Oh. Well , um , I took a lot of time just getting my taxes out of the way multi - national taxes. So , I 'm I 'm starting to write code now for my work but I don't have any results yet. Um , i it would be good for me to talk to Hynek , I think , when he 's here.
QMSum_86
false
Yeah.
QMSum_86
false
Do you know what his schedule will be like ?
QMSum_86
false
Uh , he 'll be around for three days.
QMSum_86
false
OK. So , y
QMSum_86
false
Uh , we 'll have a lot of time.
QMSum_86
false
OK.
QMSum_86
false
So , uh Um. I 'll , uh You know , he 's he 'll he 'll be talking with everybody in this room So.
QMSum_86
false
But you said you won't you won't be here next Thursday ?
QMSum_86
false
Not Thursday and Friday. Yeah. Cuz I will be at faculty retreat.
QMSum_86
false
Hmm.
QMSum_86
false
So. I 'll try to connect with him and people as as I can on on Wednesday. But Um. Oh , how 'd taxes go ? Taxes go OK ?
QMSum_86
false
Mmm. Yeah.
QMSum_86
false
Yeah. Oh , good. Yeah. Yeah. That 's just that 's that 's one of the big advantages of not making much money is the taxes are easier. Yeah.
QMSum_86
false
Unless you 're getting money in two countries.
QMSum_86