Ferocious_Armadillo

Ferocious_Armadillo t1_jaa3wt2 wrote

I heard a great explanation of this recently on the news. It’s a number that takes a bunch of other things into account, like payment history for bills (including mortgage, loans, etc.), credit history (mainly think “how long have I had and used a credit card?”), and other factors, and boils them down into one number from 300–850. This number tells people “how likely am I to pay back credit? / how good of a borrower am I?” (from a bank’s perspective). The higher the number, the “better” of a borrower you are.

This is important and you should care because those with higher credit scores:

  • can get lower interest rates when they go to get a mortgage
  • actually qualify for higher loan amounts/can borrow more money, and a bunch of other stuff.
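To make the idea concrete, here’s a tiny sketch of how a score might map to rate tiers. The thresholds and tier names below are invented for illustration, not real lender numbers:

```python
# Hypothetical sketch: these score cutoffs and tier labels are made up
# for illustration; real lenders use their own (and more inputs than
# just the score).
def rate_tier(score: int) -> str:
    """Map a credit score to an illustrative interest-rate tier."""
    if not 300 <= score <= 850:
        raise ValueError("credit scores run from 300 to 850")
    if score >= 740:
        return "best rates"
    if score >= 670:
        return "good rates"
    if score >= 580:
        return "higher rates"
    return "may not qualify"

print(rate_tier(780))  # best rates
print(rate_tier(600))  # higher rates
```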
1

Ferocious_Armadillo t1_j9rlhzx wrote

I might be off base here, but my first thought was there might be something there with integrating the full area of peaks and sorting out which peaks come from which elements in the spectral analysis of a heterogeneous mixture (possibly through a Fourier transform or convolution?). This is ringing bells for me as feeling similar to signal processing…
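The peak-integration part of that idea can be sketched quickly. This is a toy example, not anyone’s actual method: a fake “spectrum” built from two Gaussian peaks (made-up centers and widths), with each peak’s full area integrated over a window around its center.

```python
import numpy as np

# Toy sketch: a "spectrum" of a two-component mixture as the sum of two
# Gaussian peaks. Peak centers, widths, and amplitudes here are invented.
x = np.linspace(0, 10, 2001)
dx = x[1] - x[0]

def gaussian(x, center, width, amplitude):
    return amplitude * np.exp(-((x - center) ** 2) / (2 * width ** 2))

spectrum = gaussian(x, 3.0, 0.2, 1.0) + gaussian(x, 7.0, 0.3, 0.5)

def peak_area(x, y, center, half_window):
    """Integrate the area under the curve in a window around one peak."""
    mask = np.abs(x - center) <= half_window
    return float(np.sum(y[mask]) * dx)  # simple Riemann sum

area_a = peak_area(x, spectrum, 3.0, 1.0)   # ≈ 1.0 * 0.2 * sqrt(2π) ≈ 0.501
area_b = peak_area(x, spectrum, 7.0, 1.5)   # ≈ 0.5 * 0.3 * sqrt(2π) ≈ 0.376
```

The relative areas then tell you the relative amounts of each component, which is the basic idea behind quantifying species in a mixture from a spectrum.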

1

Ferocious_Armadillo t1_j9dgsw0 wrote

The “line down the middle” is where the “predicted” shark is the true shark (the type of shark was guessed correctly). The color of the square represents the certainty of the guess: how confident is the software that it made the “right” guess? In these cases, it’s 100% confident since it is the right one!

Pretty cool if you ask me!

You can match them up to see for yourself.
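That “line down the middle” is the diagonal of a confusion matrix, which you can build yourself in a few lines. The shark species and predictions below are invented just to show the structure:

```python
import numpy as np

# Tiny sketch of a confusion matrix. Species names and predictions are
# made up for illustration; rows are the true class, columns the guess.
species = ["hammerhead", "tiger", "great white"]
true_labels = ["hammerhead", "tiger", "tiger", "great white", "hammerhead"]
predicted   = ["hammerhead", "tiger", "great white", "great white", "hammerhead"]

idx = {name: i for i, name in enumerate(species)}
matrix = np.zeros((len(species), len(species)), dtype=int)
for t, p in zip(true_labels, predicted):
    matrix[idx[t], idx[p]] += 1

# The diagonal is the "line down the middle": predicted == true.
correct = int(np.trace(matrix))
accuracy = correct / len(true_labels)
```

Everything off the diagonal is a misclassification, so the heavier the diagonal, the better the classifier did.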

2

Ferocious_Armadillo t1_j98sbqm wrote

I think I’m gonna have to respectfully disagree on a lot of this. You’re right that it largely comes down to the training data used. The thing that jumps out at me, though, in the examples you give and in your point (1): while you want to train on a large amount of data, especially for networks as large as those you suggest, you also want to avoid overfitting your model to that data in the pursuit of accuracy, reliability, or whatever metric you choose to measure how “good” your model is against some ground truth.

And while on the surface NNs can definitely seem like “black boxes” whose structure or workings we can’t accurately describe, that’s largely untrue. In fact, I would claim it’s precisely because we can design and model NN structure (in terms of number of layers, connectedness between them, inputs, weights, biases, activation functions, etc.) and choose the structure that lends itself best to a given purpose that the field has come as far as it has, and produced the NNs in the examples you provide in the first place.
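To back up the point that the structure is fully specifiable, here’s a minimal sketch of a two-layer network written out from scratch, where every design choice named above (layer sizes, weights, biases, activations) is an explicit line of code. The architecture here is arbitrary, just for illustration:

```python
import numpy as np

# Minimal sketch: every structural choice (sizes, weights, biases,
# activations) is explicit. Architecture is arbitrary, for illustration:
# 4 inputs -> 8 hidden units (ReLU) -> 3 outputs (linear).
rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

layers = [(4, 8, relu), (8, 3, lambda z: z)]
params = [(rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out))
          for n_in, n_out, _ in layers]

def forward(x):
    for (w, b), (_, _, activation) in zip(params, layers):
        x = activation(x @ w + b)  # affine transform, then nonlinearity
    return x

out = forward(np.ones(4))  # a length-3 output vector
```

Nothing in there is opaque; interpretability debates are really about what the *learned weights* end up encoding, not about whether the structure itself is knowable.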

Sorry about the rant… I didn’t realize I’d get so passionate about NNs.

1

Ferocious_Armadillo t1_j6lf39v wrote

Hemoglobin carries oxygen to every cell in your body. Low hemoglobin means less oxygen is getting to your cells. Different cells can only go so long without the optimal level of oxygen they need, and so those parts of your body get damaged. The amount of damage done to different parts of the body determines the effect on your overall health. Since certain parts of your body have really important functions and a high demand for oxygen (your brain, your heart), this could have especially bad long-term effects.

1

Ferocious_Armadillo t1_j2734lr wrote

Yes. And back to your original questions…

Yes, water molecules get protonated to form H3O+. These molecules get absorbed into the blood through capillaries in the small intestine as part of digestion.

Then, in the blood, the amount of H3O+ gets really tightly regulated (interconverting between H3O+, H2O, and OH-) by a bunch of specific molecules called buffers (a sort of category of molecules) that regulate the pH of your blood.

If the pH of your blood falls too far out of this range (from having too many or too few of these H3O+ molecules, a breakdown of the above process or the other molecules involved, too few or too many buffers, etc.), that can be very deadly, very quickly.
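The buffer regulation above is usually described with the Henderson–Hasselbalch equation. Here it is applied to the bicarbonate buffer (the main one in blood), using typical textbook arterial values: pKa ≈ 6.1, [HCO3-] ≈ 24 mmol/L, dissolved CO2 ≈ 1.2 mmol/L.

```python
import math

# Henderson–Hasselbalch: pH = pKa + log10([base] / [acid]).
# Values below are standard textbook numbers for arterial blood.
def henderson_hasselbalch(pka, base_conc, acid_conc):
    return pka + math.log10(base_conc / acid_conc)

ph = henderson_hasselbalch(6.1, 24.0, 1.2)
# ≈ 7.40, right in the normal arterial range (~7.35-7.45)
```

The 20:1 bicarbonate-to-CO2 ratio is what pins blood pH near 7.4; shift that ratio much in either direction and you get the dangerous acidosis or alkalosis the comment is describing.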

6