How to find the basis of a given system of vectors: worked examples

Find a basis of the system of vectors, indicate the vectors not included in the basis, and expand them in terms of the basis:

A1 = {5, 2, −3, 1}, A2 = {4, 1, −2, 3}, A3 = {1, 1, −1, −2}, A4 = {3, 4, −1, 2}, A5 = {13, 8, −7, 4}.

Solution. Consider the homogeneous system of linear equations

A1x1 + A2x2 + A3x3 + A4x4 + A5x5 = 0

or, written out coordinate by coordinate,

5x1 + 4x2 + x3 + 3x4 + 13x5 = 0
2x1 + x2 + x3 + 4x4 + 8x5 = 0
−3x1 − 2x2 − x3 − x4 − 7x5 = 0
x1 + 3x2 − 2x3 + 2x4 + 4x5 = 0.

We solve this system by the Gaussian method, without swapping rows or columns, and, moreover, choosing the pivot element not in the upper left corner but anywhere in a row. The task is to isolate a diagonal part of the transformed system of vectors.

[The intermediate matrices of the elimination are not reproduced here; the pivots are chosen in the columns of A1, A3 and A4.]

The resolved system of vectors, which is equivalent to the original one, has the form

A1′x1 + A2′x2 + A3′x3 + A4′x4 + A5′x5 = 0,

where A1′ = (1, 0, 0, 0), A2′ = (1, −1, 0, 0), A3′ = (0, 1, 0, 0), A4′ = (0, 0, 1, 0), A5′ = (2, 0, 1, 0). (1)

The vectors A1′, A3′, A4′ form a diagonal system. Hence the vectors A1, A3, A4 form a basis of the system of vectors A1, A2, A3, A4, A5.

We now expand the vectors A2 and A5 in the basis A1, A3, A4. To do this, we first expand the corresponding vectors A2′ and A5′ in the diagonal system A1′, A3′, A4′, bearing in mind that the coefficients of the expansion of a vector in the diagonal system are simply its coordinates xi.

From (1) we have:

A2′ = A1′·1 + A3′·(−1) + A4′·0, that is, A2′ = A1′ − A3′;

A5′ = A1′·2 + A3′·0 + A4′·1, that is, A5′ = 2A1′ + A4′.

The vectors A2 and A5 expand in the basis A1, A3, A4 with the same coefficients as the vectors A2′ and A5′ in the diagonal system A1′, A3′, A4′ (namely, the coordinates xi). Hence,

A2 = A1 − A3, A5 = 2A1 + A4.
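As a quick numerical cross-check of this example, one can verify the rank and the two expansions directly. A minimal sketch using numpy (an assumption; any CAS or hand elimination works equally well):

```python
import numpy as np

# The five vectors of the example.
A1 = np.array([5, 2, -3, 1]);  A2 = np.array([4, 1, -2, 3])
A3 = np.array([1, 1, -1, -2]); A4 = np.array([3, 4, -1, 2])
A5 = np.array([13, 8, -7, 4])

B = np.column_stack([A1, A3, A4])       # candidate basis as columns
print(np.linalg.matrix_rank(B))         # 3: A1, A3, A4 are independent

# Expansion coefficients of A2 and A5 in the basis A1, A3, A4:
# solve B @ c = v (lstsq, since B is 4x3; both systems are consistent).
for v in (A2, A5):
    c = np.linalg.lstsq(B, v, rcond=None)[0]
    print(np.round(c, 10))              # [1, -1, 0] and [2, 0, 1]
```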

Tasks. 1. Find a basis of the system of vectors, indicate the vectors not included in the basis, and expand them in terms of the basis:

1. a1 = {1, 2, 1}, a2 = {2, 1, 3}, a3 = {1, 5, 0}, a4 = {2, −2, 4}.

2. a1 = {1, 1, 2}, a2 = {0, 1, 2}, a3 = {2, 1, −4}, a4 = {1, 1, 0}.

3. a1 = {1, −2, 3}, a2 = {0, 1, −1}, a3 = {1, 3, 0}, a4 = {0, −7, 3}, a5 = {1, 1, 1}.

4. a1 = {1, 2, −2}, a2 = {0, −1, 4}, a3 = {2, −3, 3}.

2. Find all bases of the system of vectors:

1. a1 = {1, 1, 2}, a2 = {3, 1, 2}, a3 = {1, 2, 1}, a4 = {2, 1, 2}.

2. a1 = {1, 1, 1}, a2 = {−3, −5, 5}, a3 = {3, 4, −1}, a4 = {1, −1, 4}.

In geometry, a vector is understood as a directed segment, and vectors obtained from one another by parallel translation are considered equal. All equal vectors are treated as one and the same vector. The initial point of a vector can be placed at any point of the plane or of space.

If the coordinates of the endpoints of a vector in space are given, A(x1, y1, z1) and B(x2, y2, z2), then

AB = (x2 − x1, y2 − y1, z2 − z1). (1)

A similar formula holds in the plane. Thus a vector can be written as a string of coordinates. Operations on vectors, namely addition and multiplication by a number, are then performed on strings component by component. This makes it possible to broaden the concept of a vector, understanding by a vector any string of numbers. For example, a solution of a system of linear equations, as well as any set of values of the system's variables, can be regarded as a vector.

On strings of the same length, the addition operation is performed according to the rule

(a1, a2, …, an) + (b1, b2, …, bn) = (a1 + b1, a2 + b2, …, an + bn). (2)

Multiplication of a string by a number is performed according to the rule

λ(a1, a2, …, an) = (λa1, λa2, …, λan). (3)
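Rules (2) and (3) translate directly into code. A minimal Python sketch (the function names add and scale are ours, introduced for illustration):

```python
def add(a, b):
    # Rule (2): componentwise addition of strings of equal length.
    return tuple(x + y for x, y in zip(a, b))

def scale(lam, a):
    # Rule (3): multiply every component by the number lam.
    return tuple(lam * x for x in a)

print(add((1, 2, 3), (4, 5, 6)))   # (5, 7, 9)
print(scale(2, (1, -1, 0)))        # (2, -2, 0)
```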

The set of row vectors of a given length n, with the indicated operations of vector addition and multiplication by a number, forms an algebraic structure called an n-dimensional linear space.

A linear combination of vectors a1, …, am is a vector of the form λ1a1 + λ2a2 + … + λmam, where λ1, …, λm are arbitrary coefficients.

A system of vectors is called linearly dependent if there exists a linear combination of it, with at least one non-zero coefficient, that equals the zero vector.

A system of vectors is called linearly independent if in every linear combination of it that equals the zero vector all the coefficients are zero.

Thus, the question of the linear dependence of a system of vectors a1, …, am reduces to solving the equation

x1a1 + x2a2 + … + xmam = 0. (4)

If this equation has nonzero solutions, then the system of vectors is linearly dependent. If the zero solution is unique, then the system of vectors is linearly independent.

To solve system (4), for clarity, the vectors can be written not as rows but as columns.

Then, after performing the transformations on the left-hand side, we arrive at a system of linear equations equivalent to equation (4). The coefficient matrix of this system is formed by the coordinates of the original vectors arranged in columns. A column of constant terms is not needed here, since the system is homogeneous.
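In practice the dependence test reduces to a rank computation: system (4) has a non-zero solution exactly when the rank of that coefficient matrix is less than the number of vectors. A small numpy sketch (the three vectors below are a1, a2, a3 from Example 1.5.2 further down; they satisfy a3 = 2·a2 − a1, so dependence is reported):

```python
import numpy as np

def is_dependent(vectors):
    # Columns of M are the coordinates of the vectors, as in equation (4);
    # a non-zero solution exists iff rank(M) < number of vectors.
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) < len(vectors)

print(is_dependent([np.array([1, 2, 2, 4]),
                    np.array([2, 3, 5, 1]),
                    np.array([3, 4, 8, -2])]))   # True
```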

A basis of a system of vectors (finite or infinite; in particular, of the entire linear space) is a non-empty linearly independent subsystem through which every vector of the system can be expressed.

Example 1.5.2. Find a basis of the system of vectors a1 = (1, 2, 2, 4), a2 = (2, 3, 5, 1), a3 = (3, 4, 8, −2), a4 = (2, 5, 0, 3) and express the remaining vectors through the basis.

Solution. We build a matrix in which the coordinates of these vectors are arranged in columns. This is the matrix of the system x1a1 + x2a2 + x3a3 + x4a4 = 0. We bring the matrix to stepped form:

[The matrices of the successive elimination steps are not reproduced here; the leading elements fall in columns 1, 2 and 4.]

A basis of this system of vectors is formed by the vectors a1, a2, a4, which correspond to the leading elements of the rows. To express the vector a3, we solve the equation x1a1 + x2a2 + x4a4 = a3. It reduces to a system of linear equations whose matrix is obtained from the original one by moving the column corresponding to a3 to the place of the column of constant terms. Therefore, when reducing to stepped form, the same transformations are applied to the matrix as above. This means that we can reuse the stepped matrix obtained above, making the necessary column permutations in it: the pivot columns are placed to the left of the vertical bar, and the column corresponding to the vector a3 to the right of the bar.

We successively find:

x4 = 0;

x2 = 2;

x1 + 4 = 3, x1 = −1.

Hence a3 = −a1 + 2a2 + 0·a4.
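The back-substitution above can be cross-checked numerically. A sketch with numpy (assumed available):

```python
import numpy as np

a1 = np.array([1, 2, 2, 4]); a2 = np.array([2, 3, 5, 1])
a3 = np.array([3, 4, 8, -2]); a4 = np.array([2, 5, 0, 3])

# Solve x1*a1 + x2*a2 + x4*a4 = a3; the 4x3 system is consistent,
# so lstsq returns the exact coefficients with zero residual.
B = np.column_stack([a1, a2, a4])
x = np.linalg.lstsq(B, a3, rcond=None)[0]
print(np.round(x, 10))   # [-1.  2.  0.]  ->  a3 = -a1 + 2*a2
```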

Comment. If several vectors need to be expressed through the basis, then for each of them the corresponding system of linear equations is constructed. These systems differ only in their columns of constant terms, and each system is solved independently of the others.

EXERCISE 1.4. Find a basis of the system of vectors and express the remaining vectors in terms of the basis:

a) a1 = (1, 3, 2, 0), a2 = (3, 4, 2, 1), a3 = (1, −2, −2, 1), a4 = (3, 5, 1, 2);

b) a1 = (2, 1, 2, 3), a2 = (1, 2, 2, 3), a3 = (3, −1, 2, 2), a4 = (4, −2, 2, 2);

c) a1 = (1, 2, 3), a2 = (2, 4, 3), a3 = (3, 6, 6), a4 = (4, −2, 1), a5 = (2, −6, −2).

In a given system of vectors, a basis can usually be distinguished in different ways, but all bases contain the same number of vectors. The number of vectors in a basis of a linear space is called the dimension of the space. The n-dimensional linear space has dimension n, since this space has the standard basis e1 = (1, 0, …, 0), e2 = (0, 1, …, 0), …, en = (0, 0, …, 1). Through this basis, any vector a = (a1, a2, …, an) is expressed as follows:

a = (a1, 0, …, 0) + (0, a2, …, 0) + … + (0, 0, …, an) =

= a1(1, 0, …, 0) + a2(0, 1, …, 0) + … + an(0, 0, …, 1) = a1e1 + a2e2 + … + anen.

Thus, the components of the row vector a = (a1, a2, …, an) are its coefficients in the expansion in terms of the standard basis.

Straight lines on a plane

The problem of analytical geometry is the application of the coordinate method to geometric problems: a geometric problem is translated into algebraic form and solved by means of algebra.


When we analyzed the concept of an n-dimensional vector and introduced operations on vectors, we found out that the set of all n-dimensional vectors generates a linear space. In this article we discuss the most important related concepts: the dimension and the basis of a vector space. We also consider the theorem on the expansion of an arbitrary vector in a basis and the connection between different bases of an n-dimensional space, and analyze in detail the solutions of typical examples.


Concept of vector space dimension and basis.

The concepts of dimension and basis of a vector space are directly related to the concept of a linearly independent system of vectors, so, if necessary, we recommend referring to the article on the linear dependence of a system of vectors and the properties of linear dependence and independence.

Definition.

The dimension of a vector space is the number equal to the maximum number of linearly independent vectors in this space.

Definition.

A basis of a vector space is an ordered set of linearly independent vectors of this space whose number equals the dimension of the space.

We present some arguments based on these definitions.

Consider the space of n-dimensional vectors.

Let us show that the dimension of this space is equal to n.

Take the system of n unit vectors e1 = (1, 0, …, 0), e2 = (0, 1, …, 0), …, en = (0, 0, …, 1).

Take these vectors as the rows of a matrix A. In this case A is the n-by-n identity matrix, and its rank is n. Therefore the system of vectors is linearly independent, and no vector can be added to this system without violating its linear independence. Since the number of vectors in the system equals n, the dimension of the space of n-dimensional vectors is n, and the unit vectors form a basis of this space.

From the last statement and the definition of a basis, we conclude that any system of n-dimensional vectors containing fewer than n vectors is not a basis.

Now let us swap the first and second vectors of the system e1, e2, …, en. It is easy to show that the resulting system of vectors is also a basis of the n-dimensional vector space. Compose a matrix by taking the vectors of this system as its rows. This matrix can be obtained from the identity matrix by swapping the first and second rows, hence its rank is n. Thus the resulting system of n vectors is linearly independent and is a basis of the n-dimensional vector space.

If we swap other vectors of the system, we obtain yet another basis.

If we take any linearly independent system of n vectors that are not unit vectors, then it is also a basis of the n-dimensional vector space.

Thus, a vector space of dimension n has as many bases as there are linearly independent systems of n n-dimensional vectors.

If we talk about a two-dimensional vector space (that is, about a plane), then its basis is any two non-collinear vectors. The basis of a three-dimensional space is any three non-coplanar vectors.

Let's look at a few examples.

Example.

Are the vectors a, b, c a basis of the three-dimensional vector space?

Solution.

Let us examine this system of vectors for linear dependence. To do this, we compose a matrix whose rows are the coordinates of the vectors and find its rank:


The rank equals 3. Thus the vectors a, b and c are linearly independent, and their number equals the dimension of the vector space; therefore they form a basis of this space.

Answer:

Yes, they are.
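The concrete vectors of this example are not shown above, so the following sketch performs the same check on assumed vectors (our choice): three 3-dimensional vectors form a basis exactly when the matrix of their coordinates has rank 3, equivalently a non-zero determinant.

```python
import numpy as np

# Assumed example vectors (the article's originals are not shown above).
a = np.array([3, -2, 1]); b = np.array([0, 1, 2]); c = np.array([1, 1, 0])

M = np.vstack([a, b, c])           # rows are the coordinates of the vectors
print(np.linalg.matrix_rank(M))    # 3 -> linearly independent -> a basis
print(np.linalg.det(M) != 0)       # equivalent test: non-zero determinant
```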

Example.

Can the given system of vectors be a basis of the vector space?

Solution.

This system of vectors is linearly dependent, since the maximum number of linearly independent three-dimensional vectors is three. Therefore this system of vectors cannot be a basis of the three-dimensional vector space (although a subsystem of the original system is a basis).

Answer:

No, it cannot.

Example.

Verify that the vectors a, b, c, d can serve as a basis of a four-dimensional vector space.

Solution.

Let us compose a matrix, taking the original vectors as its rows:

Let us find its rank:

The rank equals 4. Thus the system of vectors a, b, c, d is linearly independent, and the number of vectors equals the dimension of the vector space; therefore a, b, c, d form a basis of it.

Answer:

The original vectors indeed form a basis of the four-dimensional space.

Example.

Do the given vectors form a basis of a four-dimensional vector space?

Solution.

Even if the original system of vectors is linearly independent, the number of vectors in it is not enough to be a basis of a four-dimensional space (a basis of such a space consists of 4 vectors).

Answer:

No, it doesn't.

Decomposition of a vector in terms of a vector space basis.

Let the vectors e1, e2, …, en form a basis of an n-dimensional vector space. If we add to them some n-dimensional vector x, then the resulting system of vectors is linearly dependent. From the properties of linear dependence we know that at least one vector of a linearly dependent system is linearly expressed through the others; in other words, at least one of the vectors of a linearly dependent system is decomposed in terms of the remaining ones.

Thus we come to a very important theorem.

Theorem.

Any vector of an n-dimensional vector space decomposes uniquely in terms of a basis.

Proof.

Let e1, e2, …, en be a basis of the n-dimensional vector space. Add an n-dimensional vector x to these vectors. Then the resulting system of vectors is linearly dependent, and the vector x can be linearly expressed in terms of the vectors e1, e2, …, en: x = x1e1 + x2e2 + … + xnen, where x1, …, xn are some numbers. So we have obtained an expansion of the vector x in terms of the basis. It remains to prove that this expansion is unique.

Assume that there is another expansion x = y1e1 + y2e2 + … + ynen, where y1, …, yn are some numbers. Subtracting from the left- and right-hand sides of the last equality, respectively, the left- and right-hand sides of the first one, we obtain

(y1 − x1)e1 + (y2 − x2)e2 + … + (yn − xn)en = 0.

Since the system of basis vectors is linearly independent, by the definition of linear independence the resulting equality is possible only when all the coefficients are zero. Therefore y1 = x1, …, yn = xn, which proves the uniqueness of the expansion of a vector in terms of a basis.

Definition.

The coefficients x1, x2, …, xn are called the coordinates of the vector x in the basis e1, e2, …, en.

After the theorem on the expansion of a vector in a basis, the meaning of the expression "we are given an n-dimensional vector x = (x1, x2, …, xn)" becomes clear: it means that we are considering a vector x of an n-dimensional vector space whose coordinates are given in some basis. At the same time, we understand that the same vector x will have coordinates different from x1, x2, …, xn in another basis of the same n-dimensional vector space.

Consider the following problem.

Let, in some basis of an n-dimensional vector space, a system of n linearly independent vectors c1, c2, …, cn be given, together with a vector x. Then the vectors c1, c2, …, cn also form a basis of this vector space.

Suppose we need to find the coordinates of the vector x in the basis c1, c2, …, cn. Denote these coordinates by x'1, x'2, …, x'n.

In the basis c1, c2, …, cn the vector x has the form x = x'1c1 + x'2c2 + … + x'ncn. Writing this equality in coordinate form, we obtain a system of n linear algebraic equations with the n unknowns x'1, x'2, …, x'n.

The coefficient matrix of this system, denote it by A, has as its columns the coordinate columns of the vectors of the linearly independent system c1, c2, …, cn, so the rank of this matrix is n and its determinant is non-zero. This means that the system of equations has a unique solution, which can be found by any method, for example by Cramer's rule or by the Gaussian method.

The coordinates x'1, x'2, …, x'n found in this way are the desired coordinates of the vector x in the basis c1, c2, …, cn.
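Numerically, this amounts to solving A·x' = x. A minimal sketch with assumed three-dimensional data (the matrix and vector below are our illustration, not values from the text):

```python
import numpy as np

# Columns of A: coordinates of the new basis vectors c1, c2, c3
# in the old basis (assumed illustrative numbers).
A = np.column_stack([[1, 0, 1], [2, 1, 0], [0, 1, 1]])
x = np.array([3, 2, 4])                 # coordinates of x in the old basis

assert np.linalg.matrix_rank(A) == 3    # c1, c2, c3 do form a basis
x_new = np.linalg.solve(A, x)           # coordinates of x in the basis c1, c2, c3
print(x_new)
```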

Let's analyze the theory with examples.

Example.

In some basis of the three-dimensional vector space we are given the vectors a1, a2, a3 and a vector x.

Verify that the system a1, a2, a3 is also a basis of this space and find the coordinates of the vector x in this basis.

Solution.

For a system of vectors to be a basis of a three-dimensional vector space, it must be linearly independent. Let us check this by determining the rank of the matrix A whose rows are the given vectors. We find the rank by the Gauss method:

[the elimination steps are not reproduced here]

Therefore Rank(A) = 3, which proves the linear independence of the system of vectors.

So the vectors a1, a2, a3 form a basis. Let the vector x have coordinates x'1, x'2, x'3 in this basis. Then, as we showed above, the connection between the coordinates of this vector in the two bases is given by the corresponding system of equations.

Substituting into it the values known from the problem statement, we obtain a system that we solve by Cramer's rule:

Thus, in the basis a1, a2, a3 the vector x has the coordinates found above.

Answer:

Example.

In some basis of a four-dimensional vector space, a linearly independent system of vectors is given, and an expansion of the vector x is known. Find the coordinates of the vector x in the indicated basis.

Solution.

Since the system of vectors is linearly independent by assumption, it is a basis of the four-dimensional space, and the known equality gives the coordinates of the vector x in this basis. Denote the coordinates of the vector x in the other basis by x'1, x'2, x'3, x'4.

The system of equations defining the connection between the coordinates of the vector x in the two bases has the form shown in the previous section.

Substituting the known values into it, we find the desired coordinates:

Answer:

.

Relationship between bases.

Let two linearly independent systems of vectors, c1, c2, …, cn and e1, e2, …, en, be given in some basis of an n-dimensional vector space; each of them is then itself a basis of this space.

If x'1, …, x'n are the coordinates of a vector in one of the bases, then the connection between its coordinates in the two bases is given by a system of linear equations (we spoke about this in the previous section), which can also be written in matrix form.

A similar matrix equality can be written for each vector of the system. These matrix equalities can be combined into one, which essentially defines the relationship between the vectors of the two different bases. Similarly, we can express all the vectors of one basis through the other basis.

Definition.

The matrix that relates the two bases in this way is called the transition matrix from the first basis to the second; the bases are connected by the corresponding matrix equality. Multiplying both sides of this equality by the appropriate inverse matrix, we can express either basis through the other.

Let us find the transition matrix; we will not dwell here on finding the inverse matrix and on multiplying matrices (see the corresponding articles if necessary):

It remains to find out the relationship between the coordinates of the vector x in the given bases.

Let the vector x have coordinates x'1, …, x'n in the first basis and coordinates x''1, …, x''n in the second. Writing out the corresponding expansions of x and noting that their left-hand sides coincide, we can equate the right-hand sides. Multiplying both sides of the resulting equality by the appropriate inverse matrix, we obtain the relation between the two coordinate columns; the inverse relation is obtained analogously (find the inverse matrix on your own).

The last two equalities give us the desired relationship between the coordinates of the vector x in the two bases.

Answer:

The transition matrix from the one basis to the other has the form found above; the coordinates of the vector x in the two bases are related by the relations written above, or by the inverse relations.
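The concrete matrices of this example are not shown above, so here is a hedged sketch of the same computation on assumed data: if the columns of C and E hold the coordinates of the bases c1, …, cn and e1, …, en in a common original basis, then, under the convention used in this sketch, the matrix T below converts coordinates in the c-basis into coordinates in the e-basis.

```python
import numpy as np

# Assumed data: columns of C and E are the coordinates of the bases
# c1, c2, c3 and e1, e2, e3 in a common original basis.
C = np.column_stack([[1, 0, 1], [2, 1, 0], [0, 1, 1]])
E = np.column_stack([[1, 1, 0], [0, 1, 1], [1, 0, 2]])

# If u are the coordinates of x in the c-basis, then C @ u = E @ v,
# where v are its coordinates in the e-basis, so v = inv(E) @ C @ u.
T = np.linalg.solve(E, C)       # transition matrix under this convention
u = np.array([1.0, -1.0, 2.0])  # coordinates of x in the c-basis
v = T @ u
print(np.allclose(C @ u, E @ v))   # True: the same vector x in both bases
```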

We have considered the concepts of dimension and basis of a vector space, learned how to decompose a vector in a basis, and established the connection between different bases of an n-dimensional vector space through the transition matrix.

Lectures on Algebra and Geometry. Semester 1.

Lecture 9. The basis of a vector space.

Summary: system of vectors, linear combination of a system of vectors, coefficients of a linear combination of a system of vectors, basis on a line, on a plane and in space, dimensions of the vector spaces on a line, on a plane and in space, decomposition of a vector in a basis, coordinates of a vector with respect to a basis, the theorem on the equality of two vectors, linear operations with vectors in coordinate notation, orthonormal triple of vectors, right and left triples of vectors, orthonormal basis, the fundamental theorem of vector algebra.

Chapter 9

item 1. Basis on the line, on the plane and in space.

Definition. Any finite set of vectors is called a system of vectors.

Definition. An expression of the form λ1a1 + λ2a2 + … + λnan is called a linear combination of the system of vectors a1, a2, …, an, and the numbers λ1, λ2, …, λn are called the coefficients of this linear combination.

Let L, P and S be a line, a plane and the space of points, respectively, and denote by V(L), V(P) and V(S) the corresponding vector spaces of vectors regarded as directed segments on the line L, on the plane P and in the space S.

Definition. A basis of the vector space V(L) is any non-zero vector of this space, i.e. any non-zero vector collinear to the line L.

Notation: {e1}, e1 ≠ 0, is a basis of V(L).

Definition. A basis of the vector space V(P) is any ordered pair of non-collinear vectors of this space.

Notation: {e1, e2}, where e1 and e2 are non-collinear, is a basis of V(P).

Definition. A basis of the vector space V(S) is any ordered triple of non-coplanar vectors (that is, vectors not lying in one plane) of this space.

Notation: {e1, e2, e3} is a basis of V(S).

Comment. A basis of a vector space cannot contain the zero vector: in the space V(L) this holds by definition; in the space V(P) two vectors are collinear if at least one of them is zero; in the space V(S) three vectors are coplanar, i.e. lie in one plane, if at least one of the three is zero.

item 2. Decomposition of a vector in terms of a basis.

Definition. Let a be an arbitrary vector and let a1, a2, …, an be an arbitrary system of vectors. If the equality

a = λ1a1 + λ2a2 + … + λnan (1)

holds, then one says that the vector a is represented as a linear combination of the given system of vectors. If the given system of vectors is a basis of the vector space, then equality (1) is called the decomposition of the vector a in this basis, and the coefficients of the linear combination are called in this case the coordinates of the vector a relative to the basis.

Theorem. (On the expansion of a vector in terms of a basis.)

Any vector of a vector space can be decomposed in its basis and, moreover, in a unique way.

Proof. 1) Let L be an arbitrary line (or axis) and let {e1} be a basis of V(L). Take an arbitrary vector a of V(L). Since both vectors a and e1 are collinear to the same line L, the vector a is collinear to e1. Let us use the theorem on the collinearity of two vectors: since e1 ≠ 0, there exists a number α1 such that a = α1e1, and thus we have obtained a decomposition of the vector a in the basis {e1} of the vector space V(L).

We now prove the uniqueness of such a decomposition. Assume the opposite: let there be two decompositions of the vector a in the basis {e1} of the vector space V(L),

a = α1e1 and a = β1e1, where α1 ≠ β1. Then, using the distributive law, we get:

0 = a − a = α1e1 − β1e1 = (α1 − β1)e1.

Since α1 − β1 ≠ 0, it follows from the last equality that e1 = 0, which is impossible; hence α1 = β1, q.e.d.

2) Now let P be an arbitrary plane and let {e1, e2} be a basis of V(P). Let a be an arbitrary vector of this plane. Lay off all three vectors from one point of this plane and construct 4 straight lines: the line on which the vector e1 lies, the line on which the vector e2 lies, and, through the end of the vector a, a line parallel to the vector e1 and a line parallel to the vector e2. These 4 lines cut out a parallelogram (see Fig. 3 below). By the parallelogram rule, a = a1 + a2, where a1 is collinear to e1 and a2 is collinear to e2, so that {e1} is a basis of the vectors of the first line and {e2} is a basis of the vectors of the second.

Now, by what was already proved in the first part of this proof, there exist numbers α1 and α2 such that

a1 = α1e1 and a2 = α2e2. From here we get:

a = α1e1 + α2e2, and the possibility of expansion in terms of the basis is proved.

Let us now prove the uniqueness of the expansion in terms of the basis. Assume the opposite: let there be two decompositions of the vector a in the basis {e1, e2} of the vector space V(P),

a = α1e1 + α2e2 and a = β1e1 + β2e2. We get the equality

(α1 − β1)e1 = (β2 − α2)e2.

If α2 = β2, then (α1 − β1)e1 = 0, and since e1 ≠ 0, we obtain α1 = β1, so the expansion coefficients coincide: α1 = β1, α2 = β2. Let now α2 ≠ β2. Then e2 = λe1, where λ = (α1 − β1)/(β2 − α2). By the theorem on the collinearity of two vectors, this implies that e1 and e2 are collinear. We have obtained a contradiction with the hypothesis of the theorem. Hence α1 = β1 and α2 = β2, q.e.d.

3) Let {e1, e2, e3} be a basis of V(S) and let a be an arbitrary vector. Let us carry out the following constructions.

Lay off all three basis vectors e1, e2, e3 and the vector a from one point and construct 6 planes: the plane in which the basis vectors e1 and e2 lie, the plane of e2 and e3, and the plane of e1 and e3; further, through the end of the vector a, draw three planes parallel to the three planes just constructed. These 6 planes cut out a parallelepiped.

By the vector addition rule, we get the equality:

a = a1 + a2 + a3. (1)

By construction, a1 is collinear to e1. Hence, by the theorem on the collinearity of two vectors, there exists a number α1 such that a1 = α1e1. Likewise, a2 = α2e2 and a3 = α3e3, where α2 and α3 are numbers. Now, substituting these equalities into (1), we get:

a = α1e1 + α2e2 + α3e3,

and the possibility of expansion in terms of the basis is proved.

Let us prove the uniqueness of such a decomposition. Assume the opposite: let there be two decompositions of the vector a in the basis {e1, e2, e3},

a = α1e1 + α2e2 + α3e3 and a = β1e1 + β2e2 + β3e3. Then

(α1 − β1)e1 + (α2 − β2)e2 + (α3 − β3)e3 = 0. (3)

Note that, by assumption, the vectors e1, e2, e3 are non-coplanar, hence they are pairwise non-collinear.

Two cases are possible: α3 ≠ β3 or α3 = β3.

a) Let α3 ≠ β3; then from equality (3) it follows that

e3 = ((β1 − α1)/(α3 − β3))e1 + ((β2 − α2)/(α3 − β3))e2. (4)

Equality (4) means that the vector e3 is expanded in terms of the basis {e1, e2}, i.e. the vector e3 lies in the plane of the vectors e1 and e2; hence the vectors e1, e2, e3 are coplanar, which contradicts the hypothesis.

b) There remains the case α3 = β3. Then from equality (3) we obtain

(α1 − β1)e1 + (α2 − β2)e2 = 0. (5)

Since {e1, e2} is a basis of the space of vectors lying in the plane, and we have already proved the uniqueness of the expansion of the vectors of a plane in a basis, it follows from equality (5) that α1 = β1 and α2 = β2, q.e.d.

The theorem has been proven.

Corollary.

1) There is a one-to-one correspondence between the set of vectors of the vector space V(L) and the set of real numbers R.

2) There is a one-to-one correspondence between the set of vectors of the vector space V(P) and the Cartesian square R × R.

3) There is a one-to-one correspondence between the set of vectors of the vector space V(S) and the Cartesian cube R × R × R of the set of real numbers R.

Proof. Let us prove the third assertion; the first two are proved similarly.

Choose and fix in the space V(S) some basis {e1, e2, e3} and set up a mapping f: V(S) → R × R × R according to the following rule:

f(a) = (α1, α2, α3), (6)

i.e. each vector is associated with the ordered triple of its coordinates.

Since with a fixed basis each vector has a unique triple of coordinates, the correspondence given by rule (6) is indeed a mapping.

It follows from the proof of the theorem that different vectors have different coordinates with respect to the same basis, i.e. mapping (6) is an injection.

Let (α1, α2, α3) be an arbitrary ordered triple of real numbers.

Consider the vector a = α1e1 + α2e2 + α3e3. By construction, this vector has the coordinates (α1, α2, α3). Therefore, mapping (6) is a surjection.

A mapping that is both injective and surjective is bijective, i.e. one-to-one, q.e.d.

The corollary is proved.

Theorem. (On the equality of two vectors.)

Two vectors are equal if and only if their coordinates with respect to the same basis are equal.

The proof follows immediately from the previous corollary.

item 3. Dimension of a vector space.

Definition. The number of vectors in the basis of a vector space is called its dimension.

Notation: dim V is the dimension of the vector space V.

Thus, in accordance with this and previous definitions, we have:

1) dim V(L) = 1, where V(L) is the vector space of vectors of the line L.

{e1} is a basis of V(L); a = α1e1 is the decomposition of a vector a in this basis, and α1 is the coordinate of the vector a relative to the basis {e1}.

2) dim V(P) = 2, where V(P) is the vector space of vectors of the plane P.

{e1, e2} is a basis of V(P); a = α1e1 + α2e2 is the decomposition of a vector a in this basis, and (α1, α2) are the coordinates of the vector a relative to the basis {e1, e2}.

3) dim V(S) = 3, where V(S) is the vector space of vectors in the space of points S.

{e1, e2, e3} is a basis of V(S); a = α1e1 + α2e2 + α3e3 is the decomposition of a vector a in this basis, and (α1, α2, α3) are the coordinates of the vector a relative to the basis {e1, e2, e3}.

Comment. Since every vector of the line L also lies in the plane P, and every vector of the plane P also lies in the space S, one can choose a basis {e1, e2, e3} of the space V(S) so that {e1} is a basis of V(L) and {e1, e2} is a basis of V(P). Then every vector of the line has the form α1e1 + 0·e2 + 0·e3, and every vector of the plane has the form α1e1 + α2e2 + 0·e3.

Thus, any vector of the line L, the plane P and the space S can be expanded in terms of the basis {e1, e2, e3}:

a = α1e1 + α2e2 + α3e3 (for vectors of L and P the corresponding coordinates are zero).
Notation. By virtue of the theorem on the equality of two vectors, we can identify any vector with the ordered triple of its coordinates and write a = (α1, α2, α3). This is only possible if the basis {e1, e2, e3} is fixed, so that there is no danger of confusion.

Definition. The record of a vector in the form of an ordered triple of real numbers is called the coordinate form of the vector: a = (α1, α2, α3).

item 4. Linear operations with vectors in coordinate notation.

Let {e1, e2, e3} be a basis of the space V(S) and let a and b be two of its arbitrary vectors. Let a = (α1, α2, α3) and b = (β1, β2, β3) be the notation of these vectors in coordinate form. Let, further, λ be an arbitrary real number. In this notation the following theorem holds.

Theorem. (On linear operations with vectors in coordinate form.)

1) a + b = (α1 + β1, α2 + β2, α3 + β3);

2) λa = (λα1, λα2, λα3).

In other words, to add two vectors one adds their corresponding coordinates, and to multiply a vector by a number one multiplies each coordinate of the vector by that number.

Proof. Since, by the hypothesis of the theorem, a = α1e1 + α2e2 + α3e3 and b = β1e1 + β2e2 + β3e3, then, using the axioms of a vector space that govern the operations of vector addition and multiplication of a vector by a number, we obtain:

a + b = (α1 + β1)e1 + (α2 + β2)e2 + (α3 + β3)e3.

This implies assertion 1). The second equality is proved similarly.

The theorem is proved.

item 5. Orthogonal vectors. Orthonormal basis.

Definition. Two vectors are called orthogonal if the angle between them is a right angle, i.e. equals 90°.

Notation: a ⊥ b means that the vectors a and b are orthogonal.

Definition. A triple of vectors {e1, e2, e3} is called orthogonal if these vectors are pairwise orthogonal, i.e. e1 ⊥ e2, e1 ⊥ e3, e2 ⊥ e3.

Definition. A triple of vectors {e1, e2, e3} is called orthonormal if it is orthogonal and the length of every vector equals one: |e1| = |e2| = |e3| = 1.

Comment. It follows from the definition that an orthogonal, and hence an orthonormal, triple of vectors is non-coplanar.

Definition. An ordered non-coplanar triple of vectors {e1, e2, e3}, laid off from one point, is called right (right-oriented) if, when observed from the end of the third vector e3 toward the plane containing the first two vectors e1 and e2, the shortest rotation from the first vector to the second goes counterclockwise. Otherwise the triple of vectors is called left (left-oriented).

Fig. 6 shows a right triple of vectors; Fig. 7 shows a left triple.

Definition. A basis {e1, e2, e3} of the vector space V(S) is called orthonormal if {e1, e2, e3} is an orthonormal triple of vectors.

Notation: in what follows we will use a right orthonormal basis (see the following figure).
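Orthonormality of a triple is easy to check numerically: all pairwise dot products must be 0 and every vector must have length 1, i.e. the Gram matrix must be the identity. A small sketch with an assumed example triple (our numbers):

```python
import numpy as np

def is_orthonormal(vectors, tol=1e-12):
    # A triple is orthonormal iff its Gram matrix is the identity:
    # pairwise dot products 0, each length 1.
    V = np.vstack(vectors)
    return np.allclose(V @ V.T, np.eye(len(vectors)), atol=tol)

s = 1 / np.sqrt(2)
print(is_orthonormal([np.array([s, s, 0.0]),
                      np.array([-s, s, 0.0]),
                      np.array([0.0, 0.0, 1.0])]))   # True
```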

An expression of the form λ1A1 + λ2A2 + … + λnAn is called a linear combination of the vectors A1, A2, …, An with coefficients λ1, λ2, …, λn.

Definition of the linear dependence of a system of vectors

A system of vectors A1, A2, …, An is called linearly dependent if there exists a non-zero set of numbers λ1, λ2, …, λn for which the linear combination λ1A1 + λ2A2 + … + λnAn equals the zero vector, that is, the system of equations A1x1 + A2x2 + … + Anxn = Θ has a non-zero solution.
A set of numbers λ1, λ2, …, λn is non-zero if at least one of the numbers λ1, λ2, …, λn differs from zero.

Definition of the linear independence of a system of vectors

A system of vectors A1, A2, …, An is called linearly independent if the linear combination λ1A1 + λ2A2 + … + λnAn equals the zero vector only for the zero set of numbers λ1, λ2, …, λn, that is, the system of equations A1x1 + A2x2 + … + Anxn = Θ has only the zero solution.

Example 29.1

Check whether the system of vectors A1, A2, A3 is linearly dependent.

Solution:

1. We compose the system of equations A1x1 + A2x2 + A3x3 = Θ:

2. We solve it by the Gauss method. The Jordan transformations of the system are given in Table 29.1 (not reproduced here). In the calculation, the right-hand sides of the system are not written down, since they equal zero and do not change under Jordan transformations.

3. From the last three rows of the table we write down the resolved system equivalent to the original one:

4. We obtain the general solution of the system:

5. Setting the free variable x3 = 1 at our own discretion, we obtain the particular non-zero solution X = (−3, 2, 1).

Answer: Thus, for the non-zero set of numbers (−3, 2, 1), the linear combination of the vectors equals the zero vector: −3A1 + 2A2 + 1·A3 = Θ. Hence the system of vectors is linearly dependent.
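The coordinates of A1, A2, A3 are not reproduced above, but the mechanics are easy to replay: a non-trivial null-space vector of the matrix with columns A1, A2, A3 gives exactly such a set of coefficients. A sketch with assumed vectors chosen so that A3 = 3·A1 − 2·A2, matching the stated answer (−3, 2, 1):

```python
from sympy import Matrix

# Assumed vectors with A3 = 3*A1 - 2*A2 (the article's originals are not
# reproduced above); chosen to match the stated answer (-3, 2, 1).
A1, A2 = Matrix([1, 2, 0]), Matrix([0, 1, 1])
A3 = 3 * A1 - 2 * A2

M = Matrix.hstack(A1, A2, A3)   # columns A1, A2, A3
print(M.nullspace())            # [Matrix([[-3], [2], [1]])]
```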

Properties of vector systems

Property (1)
If a system of vectors is linearly dependent, then at least one of the vectors is expressed in terms of the rest; conversely, if at least one of the vectors of the system is expressed in terms of the rest, then the system of vectors is linearly dependent.

Property (2)
If any subsystem of vectors is linearly dependent, then the whole system is linearly dependent.

Property (3)
If a system of vectors is linearly independent, then any of its subsystems is linearly independent.

Property (4)
Any system of vectors containing a zero vector is linearly dependent.

Property (5)
A system of m-dimensional vectors is always linearly dependent if the number of vectors n is greater than their dimension (n > m).

Basis of a system of vectors

A basis of the system of vectors A1, A2, …, An is a subsystem B1, B2, …, Br (each of the vectors B1, B2, …, Br is one of the vectors A1, A2, …, An) that satisfies the following conditions:
1. B1, B2, …, Br is a linearly independent system of vectors;
2. any vector Aj of the system A1, A2, …, An is linearly expressed in terms of the vectors B1, B2, …, Br.

Here r is the number of vectors included in the basis.

Theorem 29.1. (On the unit basis of a system of vectors.)

If a system of m-dimensional vectors contains m distinct unit vectors E1, E2, …, Em, then these vectors form a basis of the system.

Algorithm for finding the basis of a system of vectors

In order to find a basis of the system of vectors A1, A2, …, An it is necessary:

  • to compose the homogeneous system of equations A1x1 + A2x2 + … + Anxn = Θ corresponding to the system of vectors;
  • to bring this system to a resolved form by Jordan transformations; the vectors corresponding to the resolved (pivot) columns then form a basis of the system, and the remaining vectors are expressed through them, as in the examples above.
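A compact sketch of this algorithm (sympy's rref plays the role of the Jordan transformations; the pivot columns mark the basis vectors):

```python
from sympy import Matrix

def basis_of_system(vectors):
    # Columns of M are the given vectors; rref performs the Jordan
    # transformations, and the pivot columns mark the basis vectors.
    M = Matrix.hstack(*(Matrix(v) for v in vectors))
    R, pivots = M.rref()
    basis = [vectors[j] for j in pivots]
    # Each non-pivot column of R holds the expansion coefficients of the
    # corresponding vector in terms of the pivot (basis) vectors.
    expansions = {j: list(R[:len(pivots), j]) for j in range(len(vectors))
                  if j not in pivots}
    return basis, expansions

# The system from the beginning of the page:
vecs = [[5, 2, -3, 1], [4, 1, -2, 3], [1, 1, -1, -2], [3, 4, -1, 2], [13, 8, -7, 4]]
print(basis_of_system(vecs))
# rref picks the leftmost possible pivots, so it returns the equally valid
# basis A1, A2, A4 (A3 = A1 - A2, A5 = 2*A1 + A4), not the A1, A3, A4
# chosen in the worked example above.
```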