Why should a wave function be normalized?

Before I started studying quantum mechanics, I thought I knew what normalization was. Here is a definition that matches what I meant by normalization:

normalization - Multiplication (of a series, function, or data element) by a factor that makes the norm, or an associated quantity such as an integral, equal to a desired value (usually 1).

Most often, I've seen things normalized to 1, or to 100%, or something similar. Isn't putting things in percentages a kind of normalization? If I take a quiz and score 24/25, I "normalize" this by saying that I got 96%. That's what I understood normalization to mean.
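To spell out that arithmetic: the normalizing factor is just the reciprocal of the maximum possible score,

$$24 \times \frac{1}{25} = 0.96 = 96\%,$$

i.e. the score is rescaled so that a perfect result corresponds to 1 (or 100%).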

Why am I confused now?

Since I started studying quantum mechanics, I've been confused by the term normalization. Let me quote this passage from Griffiths to illustrate how he uses the term:

We now return to the statistical interpretation of the wave function, which says that $|\Psi(x,t)|^2$ is the probability density for finding the particle at point x, at time t. It follows that the integral of $|\Psi|^2$ must be 1 (the particle must be somewhere):

$$\int_{-\infty}^{+\infty} |\Psi(x,t)|^2\, dx = 1$$
Without this, the statistical interpretation would be nonsense.

However, this requirement should bother you: after all, the wave function is supposed to be determined by the Schrödinger equation - we cannot impose an external condition on Ψ without checking that the two are consistent. Well, a look at [the time-dependent Schrödinger equation] shows that if Ψ(x, t) is a solution, so is AΨ(x, t), where A is an arbitrary (complex) constant. So we have to pick this undetermined multiplicative factor so as to ensure that $\int_{-\infty}^{+\infty} |\Psi(x,t)|^2\, dx = 1$ is satisfied.
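As a concrete illustration (my own worked example, not part of the Griffiths passage): suppose at some instant the solution has a Gaussian profile, $\Psi(x) = A\,e^{-x^2/(2a^2)}$ with $a > 0$. Then

$$\int_{-\infty}^{+\infty} |\Psi(x)|^2\, dx = |A|^2 \int_{-\infty}^{+\infty} e^{-x^2/a^2}\, dx = |A|^2\, a\sqrt{\pi},$$

so requiring the integral to equal 1 fixes $|A| = (\pi a^2)^{-1/4}$ (up to an irrelevant phase). The Schrödinger equation itself never determines this constant; it is the probabilistic interpretation that does, which is exactly the point of the passage above.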