To show that the given short exact sequence of finite-dimensional vector spaces splits, it suffices to show that \( \operatorname{Im}(f) \) is a direct summand of \( V_{2} \), where \( f \colon V_{1} \longrightarrow V_{2} \) is the given \( \mathbb{K} \)-linear monomorphism.
Since \( f \) is \( \mathbb{K} \)-linear, \( \operatorname{Im}(f) \) is a vector subspace of \( V_{2} \). Since \( V_{2} \) is finite-dimensional, say of dimension \( n \), \( \operatorname{Im}(f) \) is also finite-dimensional, say of dimension \( k \le n \). Choose a \( \mathbb{K} \)-basis \( \{ e_{i} \}_{i=1}^{k} \) of \( \operatorname{Im}(f) \) and extend it to a \( \mathbb{K} \)-basis \( \{ e_{i} \}_{i=1}^{n} \) of \( V_{2} \). Set \( W = \operatorname{span}_{\mathbb{K}}\{ e_{k+1}, \dots, e_{n} \} \). Then \( V_{2} = \operatorname{Im}(f) \oplus W \): every element \( v \in V_{2} \) is expressed uniquely as \[ v = \sum_{i=1}^{k} \lambda_{i} e_{i} + \sum_{i=k+1}^{n} \lambda_{i} e_{i}, \] where the first summand belongs to \( \operatorname{Im}(f) \) and the second to \( W \). Hence \( \operatorname{Im}(f) \) is a direct summand of \( V_{2} \), as required.
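The direct-sum decomposition can also be packaged as an explicit retraction, which is another standard formulation of the splitting condition (a sketch; the map \( r \) and the notation below are introduced here for illustration and are not part of the original statement). Since \( f \) is injective, it restricts to an isomorphism of \( V_{1} \) onto \( \operatorname{Im}(f) \), so in terms of the extended basis we may define \[ r \colon V_{2} \longrightarrow V_{1}, \qquad r\Bigl( \sum_{i=1}^{n} \lambda_{i} e_{i} \Bigr) = f^{-1}\Bigl( \sum_{i=1}^{k} \lambda_{i} e_{i} \Bigr), \] i.e.\ project onto the summand \( \operatorname{Im}(f) \) and pull back along \( f \). Then \( r \circ f = \mathrm{id}_{V_{1}} \), which is precisely the condition for the short exact sequence to split.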