Make an optimization tensor.switch(scalar,...) -> ifelse(scalar,...)
We need to be a little bit intelligent about this. The ifelse is slower than the switch. Also, the switch and the computation in the if and else parts could be merged.
So do we make it an optional optimization? Make it run after the elemwise fusion? Wait for James's proposal to make Theano Apply nodes support multiple parents, and compare the execution time of each path?
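A minimal sketch of the proposed rewrite, in plain Python rather than the actual Theano optimizer API (the `Node`, `Var`, and `switch_to_ifelse` names here are illustrative stand-ins, not Theano code): replace `switch(cond, a, b)` with `ifelse(cond, a, b)` only when the condition is a scalar.

```python
# Toy graph node: op name plus (condition, then_branch, else_branch) inputs.
class Node:
    def __init__(self, op, inputs):
        self.op = op
        self.inputs = inputs

# Toy variable carrying only the number of dimensions.
class Var:
    def __init__(self, ndim):
        self.ndim = ndim

def is_scalar(var):
    # Stand-in check: in Theano this would test var.ndim == 0.
    return getattr(var, "ndim", None) == 0

def switch_to_ifelse(node):
    """Rewrite switch(scalar, a, b) -> ifelse(scalar, a, b); leave other nodes alone."""
    cond, a, b = node.inputs
    if node.op == "switch" and is_scalar(cond):
        return Node("ifelse", (cond, a, b))
    return node

# Scalar condition: the rewrite applies.
cond, x, y = Var(ndim=0), Var(ndim=1), Var(ndim=1)
rewritten = switch_to_ifelse(Node("switch", (cond, x, y)))
print(rewritten.op)  # -> ifelse
```

An elementwise (non-scalar) condition is left as a `switch`, since `ifelse` only makes sense for a single boolean decision.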
Why change the component to DeepLearningTutorial?
If we implement ifelse.c_code(), it will reduce the speed difference between ifelse and switch, so maybe it would make it easier to always enable this opt. ifelse returns a view of one of its inputs, but I don't think that switch does better in this case. In fact, switch probably doesn't return a view, but I'm not sure.
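For reference, the core of the speed trade-off is that switch-style evaluation computes both branches before selecting, while ifelse-style evaluation is lazy and computes only the branch that is taken. A plain-Python illustration (not Theano; `eager_switch`, `lazy_ifelse`, and `expensive` are made-up names for this sketch):

```python
# Record which branch computations actually run.
calls = []

def expensive(name):
    calls.append(name)  # pretend this is a costly tensor computation
    return name

def eager_switch(cond, then_fn, else_fn):
    # Like tensor.switch: both branches are evaluated, then one is selected.
    a, b = then_fn(), else_fn()
    return a if cond else b

def lazy_ifelse(cond, then_fn, else_fn):
    # Like ifelse: only the selected branch is evaluated.
    return then_fn() if cond else else_fn()

eager_switch(True, lambda: expensive("then"), lambda: expensive("else"))
lazy_ifelse(True, lambda: expensive("then2"), lambda: expensive("else2"))
print(calls)  # -> ['then', 'else', 'then2']
```

So even if ifelse has more per-call overhead, it can win whenever one branch is expensive, which is why the scalar-condition rewrite is attractive.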