AsIAm
AsIAm t1_j88u9rb wrote
Reply to comment by __lawless in [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
Me too, but it was a Google project, and you know what Google does to its projects…
I think relying on TF would have been a mistake. This deep-integration approach will be more fruitful in the long run. Also, if anybody wants to do ML in Swift on Apple platforms, there is the awesome MPS Graph.
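For a taste of what MPS Graph looks like from Swift, here is a minimal sketch (assuming a Metal-capable Apple device and the MetalPerformanceShadersGraph framework; exact initializers may differ between OS versions):

```swift
import Foundation
import Metal
import MetalPerformanceShadersGraph

// Build a tiny compute graph that adds two vectors.
let graph = MPSGraph()
let x = graph.placeholder(shape: [3], dataType: .float32, name: "x")
let y = graph.placeholder(shape: [3], dataType: .float32, name: "y")
let sum = graph.addition(x, y, name: "sum")

// Bind concrete data to the placeholders.
let device = MPSGraphDevice(mtlDevice: MTLCreateSystemDefaultDevice()!)
var xVals: [Float] = [1, 2, 3]
var yVals: [Float] = [10, 20, 30]
let xData = MPSGraphTensorData(device: device,
                               data: Data(bytes: &xVals, count: xVals.count * MemoryLayout<Float>.stride),
                               shape: [3], dataType: .float32)
let yData = MPSGraphTensorData(device: device,
                               data: Data(bytes: &yVals, count: yVals.count * MemoryLayout<Float>.stride),
                               shape: [3], dataType: .float32)

// Run the graph and read the result back into a Swift array.
let results = graph.run(feeds: [x: xData, y: yData], targetTensors: [sum], targetOperations: nil)
var out = [Float](repeating: 0, count: 3)
results[sum]?.mpsndarray().readBytes(&out, strideBytes: nil)
print(out)  // expected: [11.0, 22.0, 33.0]
```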
AsIAm t1_j88rlv3 wrote
Reply to comment by __lawless in [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
While S4TF died, the idea (autodiff in the lang) still lives and is slowly and quietly being worked on: https://github.com/apple/swift/issues?q=is%3Aissue+is%3Aopen+autodiff
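For anyone who hasn't seen it, this is roughly what the language-integrated autodiff looks like today, as a minimal sketch assuming a Swift toolchain that ships the experimental `_Differentiation` module:

```swift
import _Differentiation

// A plain Swift function marked as differentiable.
@differentiable(reverse)
func f(_ x: Double) -> Double {
    x * x + 3 * x
}

// Reverse-mode gradient at a point: d/dx (x*x + 3x) = 2x + 3, so 7 at x = 2.
let df = gradient(at: 2.0, of: f)
print(df)  // 7.0

// Value and gradient in one pass.
let (value, grad) = valueWithGradient(at: 2.0, of: f)
print(value, grad)  // 10.0 7.0
```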
AsIAm t1_j88rd2i wrote
Reply to comment by Calm_Motor4162 in [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
While TF.js is performant and a godsend, it's ugly because JS lacks operator overloading and a native tensor type, so you have to write `tf.add(tf.tensor1d([1, 2, 3]), tf.tensor1d([10, 20, 30]))`.
AsIAm t1_j7k4r63 wrote
Reply to comment by ok531441 in [D] Python vs Swift vs Julia, what should I learn? (Any benchmarks?) by lukinhasb
Autodiff in Swift is still in active development: https://github.com/apple/swift/pulls?q=is%3Apr+%5BAutoDiff%5D
What got killed is Swift for TensorFlow. (As it was a Google project, it wasn't a big surprise.)
AsIAm t1_izx39lx wrote
Reply to comment by tysam_and_co in [D] G. Hinton proposes FF – an alternative to Backprop by mrx-ai
His take on hardware for neural nets is pretty forward(-forward) thinking. Neural nets started out analog (Rosenblatt's Perceptron) and only later did we start simulating them in software on digital computers. Some recent research (1,2) suggests that physical implementations of learnable neural nets are possible and far more efficient in analog circuits. This means we could run extremely large nets on a tiny chip, which could live in your toaster, or your skull.
AsIAm t1_iz4g8q4 wrote
Reply to comment by lfotofilter in [R] The Forward-Forward Algorithm: Some Preliminary Investigations [Geoffrey Hinton] by shitboots
He knows the true probability distribution of MNIST.
AsIAm t1_jc168cw wrote
Reply to comment by Taenk in [P] Discord Chatbot for LLaMA 4-bit quantized that runs 13b in <9 GiB VRAM by Amazing_Painter_7692
It is. But that doesn't mean 1-bit neural nets are impossible. Even Turing himself toyed with such networks – https://www.npl.co.uk/getattachment/about-us/History/Famous-faces/Alan-Turing/80916595-Intelligent-Machinery.pdf?lang=en-GB
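To make the idea concrete, here is a toy sketch of a "1-bit" (binarized) neuron where weights and activations live in {-1, +1}. This is only an illustration of binarized nets in general, not Turing's actual construction, which wired together two-input NAND units:

```swift
// Toy binarized neuron: inputs, weights, and the output are all in {-1, +1}.
func binaryNeuron(_ inputs: [Int], _ weights: [Int]) -> Int {
    precondition(inputs.count == weights.count)
    // The multiply-accumulate reduces to counting sign agreements,
    // i.e. XNOR + popcount in hardware.
    let preActivation = zip(inputs, weights).map(*).reduce(0, +)
    return preActivation >= 0 ? 1 : -1
}

// A "layer" is just one such neuron per output unit.
let inputs  = [1, -1, 1, 1]
let weights = [[1, 1, -1, 1], [-1, 1, 1, -1]]
let outputs = weights.map { binaryNeuron(inputs, $0) }
print(outputs)  // [1, -1]
```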