$\square$ Summary: Linear transformation with equal-sized domain and codomain, so it has the potential to be invertible, but in this case is not. Neither injective nor surjective. Diagonalizable, though.
$\square$ A linear transformation (Definition LT).\begin{equation*}\ltdefn{T}{\complex{5}}{\complex{5}},\quad \lt{T}{\colvector{x_1\\x_2\\x_3\\x_4\\x_5}}= \colvector{-2 x_1 + 3 x_2 + 3 x_3 - 6 x_4 + 3 x_5\\ -16 x_1 + 9 x_2 + 12 x_3 - 28 x_4 + 28 x_5\\ -19 x_1 + 7 x_2 + 14 x_3 - 32 x_4 + 37 x_5\\ -21 x_1 + 9 x_2 + 15 x_3 - 35 x_4 + 39 x_5\\ -9 x_1 + 5 x_2 + 7 x_3 - 16 x_4 + 16 x_5} \end{equation*}
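As a computational companion (a minimal sketch in Python using the SymPy library; any computer algebra system would do), the transformation can be evaluated as a matrix-vector product, reading the coefficient matrix directly off the rule above. This is the same matrix that appears later as the matrix representation of $T$.
\begin{verbatim}
from sympy import Matrix

# Coefficient matrix read directly from the rule defining T
A = Matrix([
    [ -2,  3,  3,  -6,  3],
    [-16,  9, 12, -28, 28],
    [-19,  7, 14, -32, 37],
    [-21,  9, 15, -35, 39],
    [ -9,  5,  7, -16, 16],
])

def T(x):
    # Evaluate T on a vector of C^5 as the product A*x
    return A * Matrix(x)

print(T([1, 3, -1, 2, 4]).T)   # expected: [4, 55, 72, 77, 31]
\end{verbatim}
The snippets in the items below continue this session, reusing \texttt{A} and \texttt{T}.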
$\square$ A basis for the kernel of the linear transformation (Definition KLT).\begin{align*}\set{\colvector{3\\4\\1\\3\\3}} \end{align*}
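A quick check of this basis, continuing the SymPy session above: the kernel of $T$ is the null space of the matrix \texttt{A}.
\begin{verbatim}
print(A.nullspace())          # one vector, a scalar multiple of (3,4,1,3,3)
print(T([3, 4, 1, 3, 3]).T)   # expected: the zero vector
\end{verbatim}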
$\square$ Is the linear transformation injective (Definition ILT)? No.
Since the kernel is nontrivial, Theorem KILT tells us that the linear transformation is not injective. Note that a dimension count alone cannot detect this here: the domain and codomain have the same dimension, so nothing prevents the rank from being 5, and the nontrivial kernel must be discovered by computation (indeed, the rank is 4 and the nullity is 1). In particular, verify that
\begin{align*}
\lt{T}{\colvector{1\\3\\-1\\2\\4}}&=\colvector{4\\55\\72\\77\\31}&
\lt{T}{\colvector{4\\7\\0\\5\\7}}&=\colvector{4\\55\\72\\77\\31}
\end{align*}
This demonstration that $T$ is not injective is constructed with the observation that
\begin{align*}
\colvector{4\\7\\0\\5\\7}&=\colvector{1\\3\\-1\\2\\4}+\colvector{3\\4\\1\\3\\3}
\intertext{and}
\vect{z}&=\colvector{3\\4\\1\\3\\3}\in\krn{T}
\end{align*}
so the vector $\vect{z}$ effectively “does nothing” in the evaluation of $T$.
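Continuing the SymPy session, the whole non-injectivity demonstration can be verified in a few lines:
\begin{verbatim}
u = Matrix([1, 3, -1, 2, 4])
v = Matrix([4, 7, 0, 5, 7])
z = Matrix([3, 4, 1, 3, 3])
assert v == u + z              # the two inputs differ by a kernel vector
assert A*u == A*v              # ... so they share the same output
assert A*z == Matrix([0]*5)    # z is indeed in the kernel
\end{verbatim}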
$\square$ A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT).\begin{align*}\set{\colvector{-2\\-16\\-19\\-21\\-9},\,\colvector{3\\9\\7\\9\\5},\, \colvector{3\\12\\14\\15\\7},\,\colvector{-6\\-28\\-32\\-35\\-16},\, \colvector{3\\28\\37\\39\\16}} \end{align*}
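In the SymPy session, Theorem SSRLT amounts to the observation that the images of the standard unit vectors are exactly the columns of \texttt{A}:
\begin{verbatim}
from sympy import eye
for j in range(5):
    assert T(eye(5)[:, j]) == A[:, j]   # image of e_j is column j of A
\end{verbatim}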
$\square$ A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing.\begin{align*}\set{ \colvector{1\\0\\0\\0\\1},\,\colvector{0\\1\\0\\0\\-1},\, \colvector{0\\0\\1\\0\\-1},\,\colvector{0\\0\\0\\1\\2} } \end{align*}
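The row-reduction recipe just described, continued in the SymPy session: the spanning vectors become the rows of the transpose of \texttt{A}, which is then row-reduced.
\begin{verbatim}
R, pivots = A.T.rref()   # row-reduce the transpose of A
print(R)                 # the four nonzero rows are the basis listed above
\end{verbatim}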
$\square$ Is the linear transformation surjective (Definition SLT)? No.
The dimension of the range is 4, and the codomain ($\complex{5}$) has dimension 5. So $\rng{T}\neq\complex{5}$ and by Theorem RSLT the transformation is not surjective.
To be more precise, verify that $\colvector{-1\\2\\3\\-1\\4}\not\in\rng{T}$ by setting the output equal to this vector and checking that the resulting system of linear equations has no solution, i.e., is inconsistent. So the preimage, $\preimage{T}{\colvector{-1\\2\\3\\-1\\4}}$, is empty. This alone is sufficient to see that the linear transformation is not onto.
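Continuing the SymPy session, the inconsistency can be confirmed with a rank comparison: adjoining the target vector as an extra column raises the rank, so the corresponding linear system has no solution.
\begin{verbatim}
b = Matrix([-1, 2, 3, -1, 4])
print(A.rank(), Matrix.hstack(A, b).rank())   # 4 versus 5: inconsistent
\end{verbatim}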
$\square$ Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices.\begin{align*}\text{Rank: }4&&\text{Nullity: }1&&\text{Domain dimension: }5\end{align*}
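Theorem RPNDD can be confirmed numerically in the SymPy session:
\begin{verbatim}
rank    = A.rank()                # 4
nullity = len(A.nullspace())      # 1
assert rank + nullity == A.cols   # 5, the dimension of the domain
\end{verbatim}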
$\square$ Is the linear transformation invertible (Definition IVLT; examine parallels with the existence of matrix inverses)? No.
Neither injective nor surjective. Notice that since the domain and codomain have the same dimension, Theorem RPNDD implies that the transformation is either both onto and one-to-one (making it invertible) or both not onto and not one-to-one, as is the case here.
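In the SymPy session this is visible in the singularity of the matrix representation, since a square matrix is invertible exactly when its determinant is nonzero:
\begin{verbatim}
print(A.det())   # 0, so A, and hence T, is not invertible
\end{verbatim}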
$\square$ Matrix representation of the linear transformation, as described in Theorem MLTCV. (See also Example MOLT.) If $A$ is the matrix below, then $\lt{T}{\vect{x}} = A\vect{x}$. This computation may also be viewed as an application of Definition MR and Theorem FTMR from Section MR, where the bases are chosen to be the standard bases of $\complex{5}$ (Definition SUV).\begin{equation*}A=\begin{bmatrix} -2&3&3&-6&3\\ -16&9&12&-28&28\\ -19&7&14&-32&37\\ -21&9&15&-35&39\\ -9&5&7&-16&16 \end{bmatrix}\end{equation*}
$\square$ Eigenvalues, and bases for eigenspaces (Definition EELT, Theorem EER). Evaluate the linear transformation with each eigenvector as an interesting check.\begin{align*}\eigensystem{T}{-1}{\colvector{0\\2\\3\\3\\1}}\\ \eigensystem{T}{0}{\colvector{3\\4\\1\\3\\3}}\\ \eigensystem{T}{1}{\colvector{5\\3\\0\\0\\2},\,\colvector{-3\\1\\0\\2\\0},\,\colvector{1\\-1\\2\\0\\0}} \end{align*}
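Each eigenpair can be checked directly in the SymPy session, exactly as the prose suggests:
\begin{verbatim}
eigenpairs = {
    -1: [Matrix([0, 2, 3, 3, 1])],
     0: [Matrix([3, 4, 1, 3, 3])],
     1: [Matrix([5, 3, 0, 0, 2]),
         Matrix([-3, 1, 0, 2, 0]),
         Matrix([1, -1, 2, 0, 0])],
}
for lam, vectors in eigenpairs.items():
    for v in vectors:
        assert A * v == lam * v   # A v = lambda v for each eigenvector
\end{verbatim}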
$\square$ A diagonal matrix representation relative to a basis of eigenvectors. \begin{align*} B&=\set{\colvector{0\\2\\3\\3\\1},\,\colvector{3\\4\\1\\3\\3},\, \colvector{5\\3\\0\\0\\2},\,\colvector{-3\\1\\0\\2\\0},\, \colvector{1\\-1\\2\\0\\0}} &&\text{(Domain, codomain basis)}\\ \matrixrep{T}{B}{B}&=\begin{bmatrix} -1&0&0&0&0\\ 0&0&0&0&0\\ 0&0&1&0&0\\ 0&0&0&1&0\\ 0&0&0&0&1 \end{bmatrix} \end{align*}
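To confirm the diagonal representation in the SymPy session, place the basis $B$ of eigenvectors into the columns of a change-of-basis matrix $P$; then $P^{-1}AP$ is the diagonal matrix displayed above.
\begin{verbatim}
P = Matrix.hstack(Matrix([0, 2, 3, 3, 1]), Matrix([3, 4, 1, 3, 3]),
                  Matrix([5, 3, 0, 0, 2]), Matrix([-3, 1, 0, 2, 0]),
                  Matrix([1, -1, 2, 0, 0]))
print(P.inv() * A * P)   # expected: diag(-1, 0, 1, 1, 1)
\end{verbatim}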