Tiny_Arugula_5648 t1_j587jfq wrote
Reply to ruby-crowned kinglet, me, digital, 2023 by sylvrn
Wow, this is fantastic work!
Tiny_Arugula_5648 t1_j2d910s wrote
Reply to comment by EternalNY1 in There's now an open source alternative to ChatGPT, but good luck running it by ravik_reddit_007
It costs much less and trains in a fraction of the time when you use a TPU instead of a GPU on Google Cloud... that's how Google trained the BERT & T5 models.
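Not from the original comment, but for anyone curious, here's a minimal sketch of attaching TensorFlow to a Cloud TPU; the TPU name is a placeholder for your own node:

```python
# Sketch: connect TensorFlow to a Cloud TPU and build a model under
# TPUStrategy. "my-tpu-node" is a hypothetical placeholder; in Colab,
# TPUClusterResolver() with no arguments picks up the runtime's TPU.
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu-node")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# Variables created under this scope are replicated across the TPU cores.
strategy = tf.distribute.TPUStrategy(resolver)
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(2),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )
```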
Tiny_Arugula_5648 t1_izix4qp wrote
Reply to [OC] How to spot misleading charts? I would like to hear your opinion on the subject, also any tips design-wise? by dark_o3
This sub is overloaded with bad data viz, and there are many other problems that aren't as obvious as these... it's really easy for untrained people to make bad graphs that look good.
The other big issue is a lack of data skepticism... even if you know best practices, a viz built on bad data is still a bad data viz.
Unsurprisingly, the posters always get pissed when you explain where they're making their mistakes... more interested in getting an upvote than learning the art.
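To make one classic mistake concrete (my sketch, not from the post): a truncated y-axis makes a tiny difference look dramatic.

```python
# Sketch: the same data with a truncated vs. a zero-based y-axis.
# The left panel exaggerates a ~2% difference; the right shows it honestly.
import matplotlib.pyplot as plt

labels = ["A", "B"]
values = [98, 100]  # hypothetical data, about 2% apart

fig, (ax_bad, ax_good) = plt.subplots(1, 2, figsize=(8, 3))

ax_bad.bar(labels, values)
ax_bad.set_ylim(97, 101)  # truncated axis: B appears to tower over A
ax_bad.set_title("Misleading (truncated axis)")

ax_good.bar(labels, values)
ax_good.set_ylim(0, 110)  # zero-based axis: difference shown in proportion
ax_good.set_title("Honest (zero-based axis)")

plt.tight_layout()
plt.show()
```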
Tiny_Arugula_5648 t1_isx67jr wrote
Reply to comment by Varterove_muke in [D] GPU comparison for ML by denisn03
Doubtful you got "bricked" or that Google caught you switching accounts... more likely, TPUs are in high demand and expensive, Colab is a best-effort service that hands out unused capacity, and there just weren't any TPUs available...
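Quick way to check whether Colab actually gave you a TPU (a sketch; on the classic Colab TPU runtime the address is advertised via the COLAB_TPU_ADDR environment variable):

```python
# Sketch: check whether the current Colab runtime has a TPU attached.
# The classic TPU runtime exposes the TPU's gRPC address in COLAB_TPU_ADDR;
# if it's absent, you didn't get a TPU this session.
import os

tpu_addr = os.environ.get("COLAB_TPU_ADDR")
if tpu_addr:
    print(f"TPU available at grpc://{tpu_addr}")
else:
    print("No TPU attached to this runtime (try again later)")
```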
Tiny_Arugula_5648 t1_isj2xgz wrote
Reply to [D] GPU comparison for ML by denisn03
Well it does depend on what type of models you want to build and how much data you'll be using... but the general rule of thumb is always go with the most powerful GPU and the largest amount of RAM you can afford. Too little processing power means you'll wait around much longer for everything (training, predicting), and with too little RAM many of the larger models out there, like BERT, might not run at all. (Quick sanity check sketched below.)
Or just get a couple of Colab accounts... I get plenty of V100 and even A100 time by switching between different accounts.
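On the RAM point, here's a rough sketch (mine, not from the thread) of how to estimate whether a model's weights will even fit before you buy or rent hardware:

```python
# Sketch: estimate a model's weight memory as a lower bound on VRAM needs.
# Weights alone understate real usage: activations, gradients, and optimizer
# states can add several times this during training.
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
n_params = sum(p.numel() for p in model.parameters())

bytes_fp32 = n_params * 4  # 4 bytes per float32 weight
print(f"{n_params / 1e6:.0f}M parameters ~= {bytes_fp32 / 1e9:.2f} GB in fp32")
```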
Tiny_Arugula_5648 t1_irzd9n8 wrote
Reply to comment by IglooAustralia88 in [D] Are there any open-source text summarization model? by CeFurkan
Perhaps you meant BERT... it's definitely one of the most powerful NLU models out there... pretty amazing at summarization.
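If you want the open-source route, here's a minimal sketch with Hugging Face's summarization pipeline (note the default checkpoint is a distilled BART model; BERT itself is encoder-only, so it needs a decoder on top to actually generate summaries):

```python
# Sketch: open-source abstractive summarization via Hugging Face transformers.
# pipeline("summarization") downloads a distilled BART checkpoint by default.
from transformers import pipeline

summarizer = pipeline("summarization")
article = "Long document text goes here ..."  # placeholder input
summary = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```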
Tiny_Arugula_5648 t1_irsb7bd wrote
Reply to [N] Using machine learning to find an optimal mixture of metals to create a desired alloy by cyphersanthosh
I've read in the past that the big challenge isn't identifying potential new alloys... we've had simulations that can do that for a long time. It's actually being able to create the material, since the process isn't as simple as just mixing two metals together and melting them. Am I missing something?
Tiny_Arugula_5648 t1_je9oyfo wrote
Reply to [D] The best way to train an LLM on company data by jaxolingo
Not sure why no one is calling this out, but there's no indication an LLM is going to be useful here. You have tabular data... unless it's unstructured text held in there, an LLM isn't going to help. Pick the right model for the job.
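To make "the right model for the job" concrete (my sketch, with synthetic data standing in for the company tables): for tabular data, a gradient-boosted tree baseline is usually where you'd start.

```python
# Sketch: a sensible baseline for tabular data -- gradient-boosted trees --
# instead of reaching for an LLM. make_classification is a hypothetical
# stand-in for an actual company dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier()
clf.fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.3f}")
```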