Blasket_Basket

Blasket_Basket t1_je86jku wrote

By definition, they aren't going to be something you can predict. They won't magically pop into existence as soon as the technology is created, because a ton of smaller additional work needs to happen first to create the secondary inventions, tools, and/or services that these jobs depend on.

Think about a job like 'SEO Analyst'. This is undoubtedly a job that only exists because of the invention of other technologies that caused job loss and displacement: electricity, the computer, search engines, social media.

The new jobs won't be created all at once, and there will be a bunch of smaller inventions that will come into play as a result of these big new inventions.

1

Blasket_Basket t1_jdkxg9p wrote

Meh. Sorry for the individuals that are losing their jobs, but this has happened again and again throughout history as new technologies become mainstream. Today's roles aren't special or uniquely safe compared to roles in the past that disappeared due to the invention of new technologies.

I feel for the individuals involved, but this has always been what technological progress looks like.

−3

Blasket_Basket t1_j4210ek wrote

They're slowly phasing it out. They've killed both the iOS and Android Cortana apps, and I'm guessing it'll be gone from the next iteration of Windows. Suffice it to say, it's clearly not part of their future road map, and not the driving reason they're investing in ChatGPT. They've made it clear that their purpose here is to enhance Bing and challenge Google's dominance of the search market. Cortana has nothing to do with it.

1

Blasket_Basket t1_j3h24nj wrote

Move experience above education, since you have significant work experience. Similarly, move the team-lead role to the top of that section, above the research assistant roles. Recruiters want to know you have work experience first and foremost. You come across as significantly less competent/senior to recruiters if the first thing they hear about is the stuff you're doing as a grad assistant.

2

Blasket_Basket t1_izv8icg wrote

I see a lot of people mentioning that you need a GPU for DL, but it appears no one has yet clarified that you only need one for training.

If you're looking for the standard use case of training a model, saving it, and then productionizing that model by exposing an API for model inference only, then you only need a GPU for the training phase. For inference, you do not need a GPU. AWS rents specialized EC2 instances with fast CPUs optimized specifically for model inference.
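To sketch what that split looks like in practice (using PyTorch here purely as an illustration; the model and tensor shapes are made up for the example):

```python
import torch
import torch.nn as nn

# Training phase: use the GPU if one is available.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = nn.Linear(10, 1).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 10, device=device)
y = torch.randn(32, 1, device=device)
for _ in range(5):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()

# Inference phase: move the trained model to the CPU and serve from there.
model_cpu = model.to("cpu").eval()
with torch.no_grad():
    prediction = model_cpu(torch.randn(1, 10))
```

The saved, CPU-resident model is what you'd load behind your inference API; the GPU box can be shut down once training finishes.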

Another major difference is that business requirements may preclude the use of Deep Learning in the solution. For instance, business areas like credit risk are regulated and require a level of model explainability that neural networks can't provide.

Others have already made great comments regarding tabular vs unstructured data, no other comments to add there.

One final area is the sheer volume of data needed for a DL solution vs a "shallow" ML solution. You need orders of magnitude more data to successfully train most DL models than you do to get good performance with most other ML algorithms.

3

Blasket_Basket t1_ixrzd51 wrote

Lol, this isn't reality. They can't "clone your personality" based on a questionnaire. This is just a bullshit marketing term from a company that has access to GPT-3 and a couple APIs and has no idea what they're doing.

6

Blasket_Basket t1_isp0d5p wrote

Reply to comment by Moppmopp in rx6900xt for ML? [D] by Moppmopp

Yep, pretty much. AMD cards are pretty close to useless when it comes to Deep Learning. Shallow algorithms (anything that is ML but not DL) typically run on the CPU, not the GPU.

For DL, you need Nvidia cards.
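A quick illustration of the shallow-ML side of this: libraries like scikit-learn train and predict entirely on the CPU, so your GPU vendor doesn't matter at all for them (the dataset here is synthetic, just for the example):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Shallow ML: everything below runs on the CPU -- no GPU involved at any point.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
accuracy = clf.score(X, y)
```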

1