It all boils down to linear algebra
When I was in college, my view of applied math was something like the following.
Applied math is mostly mathematical physics. Mathematical physics is mostly differential equations. Numerical solution of differential equations boils down to linear algebra. Therefore the heart of applied math is linear algebra.
I still think there's a lot of truth in the summary above. Linear algebra is very important, and a great deal of applied math does ultimately depend on efficient solutions of large linear systems. The weakest link in the argument may be the first one: there's a lot more to applied math than mathematical physics. Mathematical physics hasn't declined, but other areas have grown. Still, areas of applied math outside of mathematical physics and outside of differential equations often depend critically on linear algebra.
I'd certainly recommend that someone interested in applied math become familiar with numerical linear algebra. If you're going to be an expert in differential equations, or optimization, or many other fields, you need to be at least familiar with numerical linear algebra if you're going to compute anything. As Stephen Boyd points out in his convex optimization class, many of the breakthroughs in optimization over the last 20 years have at their core breakthroughs in numerical linear algebra. Improved algorithms have sped up the solution of very large systems more than Moore's law has.
It may seem questionable that linear algebra is at the heart of applied math because it's linear. What about nonlinear applications, such as nonlinear PDEs? Nonlinear differential equations lead to nonlinear algebraic equations when discretized. But these nonlinear systems are solved via iterations of linear systems, typically some variant of Newton's method, so we're back to linear algebra.
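To make that last point concrete, here is a minimal sketch of Newton's method for a small nonlinear system. The particular system, starting point, and tolerances are just illustrative; the point is that each Newton step reduces the nonlinear problem to one linear solve.

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Solve F(x) = 0 by Newton's method.

    Each iteration replaces the nonlinear problem with a linear one:
    solve J(x) * delta = -F(x), then update x += delta.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        delta = np.linalg.solve(J(x), -Fx)  # the linear algebra at the core
        x += delta
    return x

# Illustrative nonlinear system: x^2 + y^2 = 4 and x*y = 1
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0]*v[1] - 1.0])
J = lambda v: np.array([[2*v[0], 2*v[1]],
                        [  v[1],   v[0]]])

print(newton(F, J, np.array([2.0, 0.5])))
```

For a toy 2-by-2 system the linear solve is trivial, but when the nonlinear system comes from a discretized PDE, that same step is a very large sparse linear solve, and its efficiency dominates the whole computation.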