Submitted by carlthome t3_10mhbqv in MachineLearning
carlthome t1_j0tdgxz wrote
As someone who's actually enjoyed Twitter for the presence of paper authors in music ML/MIR and the minimal social media drama, I'm happy to see that healthy part of the ML community steadily migrating to Mastodon.
Even though the UX is less polished, I think it's worth saving those cross-uni/corp discussions somehow, so I hope enough people will give the move an honest and patient try.
carlthome t1_iv0gzvw wrote
Interesting that you mention layer normalisation rather than batch normalisation. I thought the latter was "the thing" and that layernorm, groupnorm, instancenorm, etc. were follow-ups.
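(For anyone skimming, a minimal PyTorch sketch of the practical difference, i.e. which axes the statistics are computed over; this is my own toy example, not something from the paper being discussed.)

```python
import torch
import torch.nn as nn

# Toy activations: a batch of 8 examples with 16 features each.
x = torch.randn(8, 16)

# BatchNorm normalises each feature with mean/variance pooled over the batch,
# so one example's output depends on the other examples in the batch.
bn = nn.BatchNorm1d(16)

# LayerNorm normalises across the features of each example independently,
# so it behaves the same for any batch size and at inference time.
ln = nn.LayerNorm(16)

print(bn(x).mean(dim=0))  # per-feature means ~0 (statistics over the batch axis)
print(ln(x).mean(dim=1))  # per-example means ~0 (statistics over the feature axis)
```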
carlthome t1_iuovg39 wrote
Reply to comment by caedin8 in [D] Machine learning prototyping on Apple silicon? by laprika0
Desktop 3070 or laptop?
carlthome t1_irhwo6z wrote
Reply to comment by nomadiclizard in [D] AlphaTensor Explained (Video Walkthrough) by ykilcher
The FFT is already O(n log n), though, so what could be improved? Linear time?
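(For concreteness, a toy radix-2 Cooley-Tukey sketch in plain Python, assuming a power-of-two length, that shows where the n log n comes from: log2(n) levels of recursion, each with an O(n) combine step.)

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT for a power-of-two length sequence."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])  # halve the problem: log2(n) recursion levels in total
    odd = fft(x[1::2])
    twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    # O(n) combine work per level, hence O(n log n) overall.
    return [even[k] + twiddled[k] for k in range(n // 2)] + \
           [even[k] - twiddled[k] for k in range(n // 2)]

print(fft([1, 2, 3, 4]))  # [10, -2+2j, -2, -2-2j], matching numpy.fft.fft up to rounding
```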
carlthome t1_iqx9kbo wrote
This is a bit like why computers happen to use binary and not ternary. Everything has been tried before.
There's a long and rich history of artificial neural networks, but everybody seems to gravitate toward fewer moving parts in their already uncertain and difficult work.
I guess eventually the paved road of today's MLPs with GPUs became so convenient to use that very few have the time or means to try something radically different without good reason.
This is a fun read by the way: https://stats.stackexchange.com/questions/468606/why-arent-neural-networks-used-with-rbf-activation-functions-or-other-non-mono
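(If RBF activations are unfamiliar, here's a rough toy sketch of a Gaussian RBF layer next to the usual Linear+ReLU "paved road"; the class name and parameterisation are my own illustration, not from the linked answer.)

```python
import torch
import torch.nn as nn

class RBFLayer(nn.Module):
    """Toy Gaussian RBF layer: units respond most strongly near learned
    centres, instead of increasing monotonically the way ReLU units do."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.centres = nn.Parameter(torch.randn(out_features, in_features))
        self.log_gamma = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # Squared distance from each input to each centre, then a Gaussian bump.
        dist2 = torch.cdist(x, self.centres) ** 2
        return torch.exp(-self.log_gamma.exp() * dist2)

x = torch.randn(8, 10)
print(RBFLayer(10, 4)(x).shape)                             # torch.Size([8, 4])
print(nn.Sequential(nn.Linear(10, 4), nn.ReLU())(x).shape)  # the conventional alternative
```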
carlthome t1_iquzf0i wrote
Do you work at an MLOps startup?
carlthome t1_j7z22wa wrote
Reply to comment by JurgenSchmidthuber in [D] Critique of statistics research from machine learning perspectives (and vice versa)? by fromnighttilldawn
Because they didn't say conference paper, you mean?