Hey Moonbat!
-
@Moonbat said in Hey Moonbat!:
Still working on machine learning problems
Give me all the details!
My impression is that pure statistical machine learning is running into an intellectual and practical dead end.
Just today I read a fascinating article about why deep learning doesn't suffer more from overfitting problems, even though those deep neural network architectures have gazillions of parameters (see here).
-
Deaths and marriages tend to bring people back together.
Hope you're well, MB.
-
@Moonbat said in Hey Moonbat!:
I'm still in Manchester, and still complaining about the rain to anyone who will listen.
God's own country! I didn't realise you'd moved out of the big smoke and into whatever that is in the Mancunian air (probably best not to think about that too much).
Glad to hear you're doing well (Phibes is D'Oh by another name...)
-
@Doctor-Phibes said in Hey Moonbat!:
Phibes is D'Oh by another name...
I think Phibes is considerably worse than D'Oh, if that is even possible.
-
@Klaus said in Hey Moonbat!:
@Doctor-Phibes said in Hey Moonbat!:
Phibes is D'Oh by another name...
I think Phibes is considerably worse than D'Oh, if that is even possible.
Having a doctorate is bound to make one more aggravating.
-
@Klaus said in Hey Moonbat!:
@Moonbat said in Hey Moonbat!:
Still working on machine learning problems
Give me all the details!
My impression is that pure statistical machine learning is running into an intellectual and practical dead end.
Just today I read a fascinating article about why deep learning doesn't suffer more from overfitting problems, even though those deep neural network architectures have gazillions of parameters (see here).
I actually work more with the kernel machines that article talks about, or at least their Bayesian variants, Gaussian Processes. Though inevitably we also do some deep learning. Most of my time has been spent on Bayesian optimisation in various settings, hence my familiarity with GPs, as they tend to be the model of choice if you want high-quality uncertainties and you have relatively little data (there's a minimal sketch of the sort of loop I mean at the end of this post).
Deep learning seems to be getting more expensive, which is perhaps the practical dead end you speak of, but I'm not sure I would mark it dead yet. I think the surprising thing for me is that the most powerful models have pretty simple architectures - e.g. the transformers and quantised autoencoders that drive things like GPT-3 or DALL-E.
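Since you asked for details, here's a minimal, self-contained sketch of the kind of Bayesian optimisation loop I mean, using scikit-learn's GP regressor and an expected-improvement acquisition. The toy objective, search grid, and evaluation budget are just illustrative stand-ins, not anything from real work:

```python
# Minimal Bayesian optimisation sketch: GP surrogate + expected improvement.
# The objective and search grid are toy stand-ins, not a real problem.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Pretend this is an expensive black box we can only sample a few times.
    return -np.sin(3 * x) - x**2 + 0.7 * x

def expected_improvement(mu, sigma, best_y, xi=0.01):
    # EI for maximisation; clamp sigma so a certain GP gives zero improvement.
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best_y - xi) / sigma
    return (mu - best_y - xi) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X_grid = np.linspace(-2.0, 2.0, 400).reshape(-1, 1)   # candidate points
X = rng.uniform(-2.0, 2.0, size=(3, 1))               # a handful of initial evaluations
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(10):                                    # small budget: the whole point of BO
    gp.fit(X, y)
    mu, sigma = gp.predict(X_grid, return_std=True)    # posterior mean and uncertainty
    ei = expected_improvement(mu, sigma, y.max())
    x_next = X_grid[np.argmax(ei)].reshape(1, 1)       # query where EI is largest
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print(f"best x ≈ {X[np.argmax(y)].item():.3f}, best y ≈ {y.max():.3f}")
```

The loop itself really is that small; in practice most of the effort goes into the kernel choice, handling noise, and picking the acquisition function for the setting at hand.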
-
@Ivorythumper said in Hey Moonbat!:
Deaths and marriages tend to bring people back together.
Hope you're well, MB.
I am well, thank you Ivory; I hope you and MS are too. I think sometimes of our past philosophical battles and of all those armies of text we sent out into the ether. I wonder now how we had the time or energy. Though in truth, if my hands had not decided typing was no longer an acceptable activity, I would probably still be at it. In any case, it was always interesting talking with you.
-
@89th said in Hey Moonbat!:
Hey MB! Hope all is well man. I'm older, less flexible, have kids, and am even less wise than I was 15 years ago. If that's possible. I live in Minnesota now and, shocker, am on a week-long vacation at a lake cabin.
Hey 89th, nice to hear from you. I'm with you on the older and less flexible part; however, I am rather more dubious of your claims to be less wise. Hope you and the kids are doing well.
-
@Doctor-Phibes said in Hey Moonbat!:
@Moonbat said in Hey Moonbat!:
I'm still in Manchester, and still complaining about the rain to anyone who will listen.
God's own country! I didn't realise you'd moved out of the big smoke and into whatever that is in the Mancunian air (probably best not to think about that too much).
Glad to hear you're doing well (Phibes is D'Oh by another name...)
Hey Doh, thanks for saying hi, hope life is treating you well.
-
@George-K said in Hey Moonbat!:
Piling on to what everyone else has said. Don't be a stranger.
We're still (mostly) fun here!