## Practice Midterm SOLUTION: #6 (part one)

abbyto1607

Conditions for being a basis:

* must be a linearly independent set of vectors
* must span all of the vector space.

Apply this to the given set of vectors $\{x+1,\ x-2,\ x^{2}+x\}$.

I didn't know how to write this as a matrix and row reduce to prove linear independence, so I used a definition from the book instead. That is, a set of vectors is linearly independent if $c_{1}v_{1}+c_{2}v_{2}+\dots+c_{n}v_{n}=0$ only when $c_{1}=c_{2}=\dots=c_{n}=0$.

Set $c_{1}(x+1)+c_{2}(x-2)+c_{3}(x^{2}+x)=0$. Collecting terms, we get $(c_{3})x^{2}+(c_{1}+c_{2}+c_{3})x+(c_{1}-2c_{2})=0$. Now we need to find what $c_{1},c_{2},c_{3}$ are. A polynomial is zero only if every coefficient is zero, so $c_{3}=0$, $c_{1}+c_{2}+c_{3}=0$, and $c_{1}-2c_{2}=0$. Substituting $c_{3}=0$ into the second equation gives $c_{1}+c_{2}=0$; subtracting the third equation from it gives $3c_{2}=0$, so $c_{2}=0$ and hence $c_{1}=0$. Therefore $c_{1}(x+1)+c_{2}(x-2)+c_{3}(x^{2}+x)=0$ only when $c_{1}=c_{2}=c_{3}=0$, which proves that $\{x+1,\ x-2,\ x^{2}+x\}$ is a linearly independent set of vectors.

Now we need to prove the second condition, that the set spans all of the vector space. The vector space in question is all polynomials of degree less than or equal to 2. Any such polynomial can be written as some $\alpha x^{2}+\beta x+\gamma$. To span the vector space, any such $\alpha x^{2}+\beta x+\gamma$ must be expressible as a linear combination of the vectors in the set. In mathematical terms, $\alpha x^{2}+\beta x+\gamma=a(x+1)+b(x-2)+c(x^{2}+x)$ for some real $a,b,c$. Matching coefficients, $(c)x^{2}+(a+b+c)x+(a-2b)=(\alpha)x^{2}+(\beta)x+(\gamma)$, which gives $c=\alpha$, $b=\frac{1}{3}(\beta-\alpha-\gamma)$, and $a=\frac{2}{3}\beta-\frac{2}{3}\alpha+\frac{1}{3}\gamma$. Since $a,b,c$ exist for every choice of $\alpha,\beta,\gamma$, the given set spans all of the vector space. Thus, $\{x+1,\ x-2,\ x^{2}+x\}$ is a basis for the polynomials of degree less than or equal to 2.
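(Not part of the original post, but a quick numerical sketch of both checks, assuming Python with NumPy is available. The coordinate matrix below is built in the standard basis $\{1, x, x^{2}\}$; the example values for $\alpha,\beta,\gamma$ are arbitrary.)

```python
import numpy as np

# Coordinate matrix of {x+1, x-2, x^2+x} in the standard basis {1, x, x^2}:
# each column is the coordinate vector of one polynomial.
A = np.array([
    [1.0, -2.0, 0.0],  # constant terms
    [1.0,  1.0, 1.0],  # x coefficients
    [0.0,  0.0, 1.0],  # x^2 coefficients
])

# Linear independence: A c = 0 has only the trivial solution
# exactly when A has full rank.
assert np.linalg.matrix_rank(A) == 3

# Span: for an arbitrary target gamma + beta*x + alpha*x^2,
# solve A [a, b, c]^T = [gamma, beta, alpha]^T.
alpha, beta, gamma = 2.0, -5.0, 7.0  # arbitrary example polynomial
a, b, c = np.linalg.solve(A, np.array([gamma, beta, alpha]))

# Check that the linear combination reproduces the target coefficients.
combo = a * np.array([1, 1, 0]) + b * np.array([-2, 1, 0]) + c * np.array([0, 1, 1])
assert np.allclose(combo, [gamma, beta, alpha])
print(a, b, c)
```

Because `np.linalg.solve` succeeds for any right-hand side (the matrix is invertible), every polynomial of degree at most 2 is reachable, which is the span condition.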

CarlHarmon: May 5, 2015, 8:59 p.m.

This is all correct, but a much easier way to do it is to write the coordinate vectors as the columns of a matrix and simply show that it row reduces to the identity matrix.

aleksanderjanczewski: May 5, 2015, 9:51 p.m.

This is not correct; your reasoning is incomplete. It is not enough to define a, b, c in terms of alpha, beta, and gamma. You also have to show that a linear combination equal to the zero vector can be obtained only if all the coefficients are zero. Follow up if you want me to solve it for you.

emmyralds: May 5, 2015, 10:11 p.m.

^^ I think she did that in the first half of her solution. The part with alpha, beta, and gamma was used to prove that the set spans the space of polynomials of degree at most 2. Thus, I believe her solution is valid.

abbyto1607: May 5, 2015, 10:27 p.m.

True, CarlHarmon, but I didn't want to play around with choosing a basis and setting up a matrix, so I used this definition instead. Also, emmyralds is right: in the first half of my solution, I showed that there exists only the trivial solution, where $c_{1}=c_{2}=\dots=c_{n}=0$.

aleksanderjanczewski: May 5, 2015, 11:04 p.m.

My bad then, sorry about that. I didn't notice it.

LizvetteV: May 6, 2015, 12:56 a.m.

How would you set up $\{x+1, x-2, x^{2}+x\}$ as a matrix in order to row reduce, if you are trying to use that method instead of the definition from the book?

kmeneses95ND: May 6, 2015, 9:41 a.m.

In matrix form (coordinates in the standard basis $\{1, x, x^{2}\}$, one column per polynomial) this set is:

    1 -2  0
    1  1  1
    0  0  1

Then, using this, you can row reduce and see if the matrix reduces to the identity matrix. If so, the vectors are linearly independent; if not, they are not.

kmeneses95ND: May 6, 2015, 9:44 a.m.

Sorry about the formatting of my last comment; I am not sure how to do matrix notation on here. Just say that (1 1 0) is the first column of the matrix, (-2 1 0) the second column, and (0 1 1) the third.
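(Not from the original thread, but the row reduction described above can be checked with a short SymPy sketch, assuming SymPy is available; its `Matrix.rref` returns the reduced row echelon form together with the pivot column indices.)

```python
from sympy import Matrix

# Columns are the coordinate vectors described above:
# (1, 1, 0) for x+1, (-2, 1, 0) for x-2, (0, 1, 1) for x^2 + x.
A = Matrix([
    [1, -2, 0],
    [1,  1, 1],
    [0,  0, 1],
])

rref_form, pivots = A.rref()
print(rref_form)  # reduces to the 3x3 identity, so the columns are independent
print(pivots)     # every column is a pivot column
```

Since the reduced form is the identity, the three polynomials are linearly independent, matching the hand computation in the solution above.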

LizvetteV: May 6, 2015, 2:49 p.m.

Got it, thank you!