Solving Vector Equations

Introduction

A vector equation is an equation involving both known and unknown vectors. Just as in ordinary algebra we isolate an unknown variable using inverse operations such as subtraction or division, in a vector equation we want to isolate the unknown vector. The difficulty is that the vector operations involved, the dot product and the cross product, have no inverse operations: you cannot "divide" by a vector the way you divide by a number.

For example, in equations like

\[ \vec{r} \times \vec{a} = \vec{a} \times \vec{b} \quad \text{or} \quad \vec{r} \cdot \vec{a} = k, \]

the unknown vector \( \vec{r} \) is stuck inside the vector operation, and we can’t just move things around to solve for it like we would in a simple equation like \( 3x = 9 \). There’s no way to “undo” a dot or cross product directly.

So, to solve vector equations, we rely on indirect techniques. We may rewrite the equation using known vector identities, reduce it to a simpler condition such as one involving scalar multiples, or infer the form of the solution from what we know about how vectors behave.

In this chapter, we will learn several methods to isolate the unknown vector and solve such equations step by step.

Determining a Vector from Multiple Dot Product Conditions

We now investigate the question: can a vector \( \vec{r} \) be uniquely determined from a single dot product condition \( \vec{r} \cdot \vec{a} = \lambda \), where \( \vec{a} \) is a known vector and \( \lambda \) is a known scalar?

The answer is no. The equation

\[ \vec{r} \cdot \vec{a} = \lambda \]

represents a plane perpendicular to \( \vec{a} \). Every vector \( \vec{r} \) whose tip lies on this plane will satisfy the equation. Since a plane contains infinitely many vectors, there are infinitely many possible \( \vec{r} \) satisfying this condition. Therefore, one equation involving the dot product is insufficient to uniquely determine \( \vec{r} \).
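
To see the non-uniqueness concretely, here is a minimal NumPy sketch; the vector \( \vec{a} \) and the scalar below are arbitrary illustrative choices, not values fixed by the text:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])    # known vector a (arbitrary illustrative choice)
lam = 6.0                        # known scalar lambda

# Two different vectors whose tips lie on the plane r . a = lam:
r1 = np.array([6.0, 0.0, 0.0])
r2 = np.array([0.0, 3.0, 0.0])

print(np.dot(r1, a), np.dot(r2, a))   # both print 6.0, yet r1 != r2
```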

To determine \( \vec{r} \) uniquely, we require more constraints. Suppose we are also given:

\[ \vec{r} \cdot \vec{b} = \mu, \quad \vec{r} \cdot \vec{c} = \nu, \]

with \( \vec{a}, \vec{b}, \vec{c} \) all known and non-coplanar, i.e., linearly independent. Then we are given three scalar equations:

\[ \vec{r} \cdot \vec{a} = \lambda, \quad \vec{r} \cdot \vec{b} = \mu, \quad \vec{r} \cdot \vec{c} = \nu. \]

Geometrically, these are three planes in \(\mathbb{R}^3\), and when their normals \( \vec{a}, \vec{b}, \vec{c} \) are linearly independent, the intersection of the three planes is a single point. Hence, \( \vec{r} \) is uniquely determined.

How shall we find \( \vec{r} \) algebraically?

The most straightforward approach is to write

\[ \vec{r} = x \hat{i} + y \hat{j} + z \hat{k} \]

and substitute this into the three dot product equations using the components of \( \vec{a}, \vec{b}, \vec{c} \). This yields a system of three linear equations in three variables \( x, y, z \), which we can solve using methods like Cramer's Rule, Gaussian elimination, or matrix inversion.
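
A minimal sketch of this component approach in NumPy; the vectors \( \vec{a}, \vec{b}, \vec{c} \) and the scalars below are arbitrary illustrative choices:

```python
import numpy as np

# Known non-coplanar vectors (rows of the coefficient matrix) and known scalars
a = np.array([1.0, 0.0, 1.0])
b = np.array([0.0, 1.0, 1.0])
c = np.array([1.0, 1.0, 0.0])
lam, mu, nu = 2.0, 3.0, 4.0

# r . a = lam, r . b = mu, r . c = nu  is the linear system  M @ r = rhs
M = np.vstack([a, b, c])
rhs = np.array([lam, mu, nu])

r = np.linalg.solve(M, rhs)   # solvable because [a b c] = det(M) != 0
print(r, M @ r)               # M @ r reproduces (lam, mu, nu)
```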

However, there is a more elegant method using vector identities.

Since \( \vec{a}, \vec{b}, \vec{c} \) are linearly independent, their scalar triple product \( [\vec{a}\ \vec{b}\ \vec{c}] \neq 0 \), and the vectors

\[ \vec{b} \times \vec{c},\quad \vec{c} \times \vec{a},\quad \vec{a} \times \vec{b} \]

are also linearly independent; this follows from the scalar triple product identity \( [\vec{b} \times \vec{c}\ \ \vec{c} \times \vec{a}\ \ \vec{a} \times \vec{b}] = [\vec{a}\ \vec{b}\ \vec{c}]^2 \neq 0 \). Hence, any vector \( \vec{r} \in \mathbb{R}^3 \) can be written as

\[ \vec{r} = x(\vec{b} \times \vec{c}) + y(\vec{c} \times \vec{a}) + z(\vec{a} \times \vec{b}) \tag{1} \]

for some scalars \( x, y, z \). This is the fundamental theorem of three-dimensional vectors: any vector can be expressed as a linear combination of three given non-coplanar vectors.
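
The identity \( [\vec{b} \times \vec{c}\ \ \vec{c} \times \vec{a}\ \ \vec{a} \times \vec{b}] = [\vec{a}\ \vec{b}\ \vec{c}]^2 \) is easy to check numerically; a short NumPy sketch with arbitrary sample vectors:

```python
import numpy as np

a = np.array([1.0, 0.0, 1.0])
b = np.array([0.0, 1.0, 1.0])
c = np.array([1.0, 1.0, 0.0])

def box(u, v, w):
    """Scalar triple product [u v w] = u . (v x w)."""
    return np.dot(u, np.cross(v, w))

lhs = box(np.cross(b, c), np.cross(c, a), np.cross(a, b))
print(lhs, box(a, b, c) ** 2)   # both print 4.0: [bxc cxa axb] = [a b c]^2
```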

We now determine \( x, y, z \) using the given dot product equations.

Take the dot product of both sides of (1) with \( \vec{a} \):

\[ \vec{r} \cdot \vec{a} = x (\vec{b} \times \vec{c}) \cdot \vec{a} + y (\vec{c} \times \vec{a}) \cdot \vec{a} + z (\vec{a} \times \vec{b}) \cdot \vec{a} \]

Recall the identity \( (\vec{u} \times \vec{v}) \cdot \vec{w} = [\vec{u}\ \vec{v}\ \vec{w}] \), the scalar triple product, which is invariant under cyclic permutation of its arguments. Also note that \( (\vec{u} \times \vec{v}) \cdot \vec{u} = (\vec{u} \times \vec{v}) \cdot \vec{v} = 0 \), since the cross product is perpendicular to both of its factors.

Thus,

\[ \vec{r} \cdot \vec{a} = x [\vec{a}\ \vec{b}\ \vec{c}] + 0 + 0 = x[\vec{a}\ \vec{b}\ \vec{c}] \implies x = \frac{\lambda}{[\vec{a}\ \vec{b}\ \vec{c}]} \]

Similarly, dotting (1) with \( \vec{b} \) gives:

\[ \vec{r} \cdot \vec{b} = x (\vec{b} \times \vec{c}) \cdot \vec{b} + y (\vec{c} \times \vec{a}) \cdot \vec{b} + z (\vec{a} \times \vec{b}) \cdot \vec{b} = 0 + y[\vec{b}\ \vec{c}\ \vec{a}] + 0 \implies y = \frac{\mu}{[\vec{b}\ \vec{c}\ \vec{a}]} \]

And dotting with \( \vec{c} \):

\[ \vec{r} \cdot \vec{c} = x (\vec{b} \times \vec{c}) \cdot \vec{c} + y (\vec{c} \times \vec{a}) \cdot \vec{c} + z (\vec{a} \times \vec{b}) \cdot \vec{c} = 0 + 0 + z[\vec{c}\ \vec{a}\ \vec{b}] \implies z = \frac{\nu}{[\vec{c}\ \vec{a}\ \vec{b}]} \]

Therefore, the unique vector \( \vec{r} \) satisfying

\[ \vec{r} \cdot \vec{a} = \lambda,\quad \vec{r} \cdot \vec{b} = \mu,\quad \vec{r} \cdot \vec{c} = \nu \]

is given by

\[ \vec{r} = \frac{\lambda}{[\vec{a}\ \vec{b}\ \vec{c}]} (\vec{b} \times \vec{c}) + \frac{\mu}{[\vec{b}\ \vec{c}\ \vec{a}]} (\vec{c} \times \vec{a}) + \frac{\nu}{[\vec{c}\ \vec{a}\ \vec{b}]} (\vec{a} \times \vec{b}) \]
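
Since the scalar triple product is cyclic, \( [\vec{a}\ \vec{b}\ \vec{c}] = [\vec{b}\ \vec{c}\ \vec{a}] = [\vec{c}\ \vec{a}\ \vec{b}] \), so all three denominators are equal. The formula can be verified numerically; a minimal NumPy sketch with arbitrary sample vectors (not values fixed by the text):

```python
import numpy as np

a = np.array([1.0, 0.0, 1.0])
b = np.array([0.0, 1.0, 1.0])
c = np.array([1.0, 1.0, 0.0])
lam, mu, nu = 2.0, 3.0, 4.0

box = np.dot(a, np.cross(b, c))   # [a b c]; cyclic, so it serves all three terms

r = (lam * np.cross(b, c) + mu * np.cross(c, a) + nu * np.cross(a, b)) / box

print(np.dot(r, a), np.dot(r, b), np.dot(r, c))   # recovers lam, mu, nu
```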

Resolving Along an Appropriate Basis

To isolate the unknown vector \( \vec{r} \) from vector equations, the key idea is to resolve \( \vec{r} \) as a linear combination of suitable basis vectors. The choice of basis depends on the vectors present in the given equations and their mutual geometric relationships. Here are the standard options:

  1. Standard Cartesian Basis

    If the given vectors are provided in component form, or if no specific structure is imposed, resolve

    \[ \vec{r} = x\hat{i} + y\hat{j} + z\hat{k} \]

    and substitute into the equations to form a system of scalar equations in \( x, y, z \).

  2. Given Non-Coplanar Vectors as Basis

    If three vectors \( \vec{a}, \vec{b}, \vec{c} \) are known and linearly independent (i.e., non-coplanar), then any vector \( \vec{r} \) in \(\mathbb{R}^3\) can be expressed as

    \[ \vec{r} = x\vec{a} + y\vec{b} + z\vec{c} \]

  3. Cross Product Basis

    If \( \vec{a}, \vec{b}, \vec{c} \) are non-coplanar, then the vectors \( \vec{b} \times \vec{c},\ \vec{c} \times \vec{a},\ \vec{a} \times \vec{b} \) are also linearly independent. Then express

    \[ \vec{r} = x(\vec{b} \times \vec{c}) + y(\vec{c} \times \vec{a}) + z(\vec{a} \times \vec{b}) \]

  4. Mixed Basis with Two Vectors and Their Cross Product

    If only two non-collinear vectors \( \vec{a} \) and \( \vec{b} \) are given, then the set \( \{\vec{a},\ \vec{b},\ \vec{a} \times \vec{b}\} \) is linearly independent. Then write

    \[ \vec{r} = x\vec{a} + y\vec{b} + z(\vec{a} \times \vec{b}) \]

In every case, the goal is to reduce the vector equation to scalar equations by substituting the resolved form of \( \vec{r} \) and equating coefficients or applying scalar product identities. The method chosen depends on the form and structure of the given vector conditions.
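
As a concrete illustration of option 4, the coefficients can be found by stacking the basis vectors as columns of a matrix and solving a small linear system; a NumPy sketch with arbitrary sample vectors:

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([1.0, 1.0, 0.0])          # not collinear with a
basis = np.column_stack([a, b, np.cross(a, b)])

r = np.array([3.0, 2.0, 5.0])          # an arbitrary vector to resolve

coeffs = np.linalg.solve(basis, r)     # x, y, z with r = x a + y b + z (a x b)
print(coeffs, basis @ coeffs)          # basis @ coeffs reconstructs r
```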

Can We Determine \( \vec{r} \) from an Equation Like \( \vec{r} \times \vec{a} = \vec{b} \)?

Let us consider the vector equation

\[ \vec{r} \times \vec{a} = \vec{b}, \]

where \( \vec{a} \) and \( \vec{b} \) are known vectors, and \( \vec{r} \) is the unknown vector to be determined.

To begin with, we must check whether such an equation even has a solution. Recall that the cross product \( \vec{r} \times \vec{a} \) is always perpendicular to both \( \vec{r} \) and \( \vec{a} \). Hence, for the equation to have a solution, the right-hand side \( \vec{b} \) must be perpendicular to \( \vec{a} \). In other words, the necessary condition is

\[ \vec{a} \cdot \vec{b} = 0. \]

If \( \vec{a} \cdot \vec{b} \ne 0 \), then the equation has no solution, since \( \vec{b} \) would not lie in the plane orthogonal to \( \vec{a} \), and hence cannot be the cross product \( \vec{r} \times \vec{a} \) for any vector \( \vec{r} \).

Assume now that \( \vec{a} \cdot \vec{b} = 0 \). Then the equation

\[ \vec{r} \times \vec{a} = \vec{b} \]

has solutions. Let \( \vec{r}_0 \) be any one particular solution, i.e.,

\[ \vec{r}_0 \times \vec{a} = \vec{b}. \]

Now consider any vector of the form

\[ \vec{r} = \vec{r}_0 + \lambda \vec{a}, \quad \lambda \in \mathbb{R}. \]

Then

\[ \vec{r} \times \vec{a} = (\vec{r}_0 + \lambda \vec{a}) \times \vec{a} = \vec{r}_0 \times \vec{a} + \lambda (\vec{a} \times \vec{a}) = \vec{b} + \lambda \vec{0} = \vec{b}. \]

Thus, any vector of the form \( \vec{r} = \vec{r}_0 + \lambda \vec{a} \) also satisfies the equation. This means the set of all solutions is

\[ \vec{r} = \vec{r}_0 + \lambda \vec{a}, \quad \lambda \in \mathbb{R}, \]

which represents a line in space parallel to \( \vec{a} \), passing through the point with position vector \( \vec{r}_0 \).

Therefore, the equation

\[ \vec{r} \times \vec{a} = \vec{b} \]

has no solution if \( \vec{a} \cdot \vec{b} \ne 0 \), and infinitely many solutions—all differing by a multiple of \( \vec{a} \)—if \( \vec{a} \cdot \vec{b} = 0 \).
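
A sketch of this analysis in NumPy. The particular solution \( \vec{r}_0 = (\vec{a} \times \vec{b}) / |\vec{a}|^2 \) used below is one convenient choice (an assumption of this sketch, not a formula from the text); it works because \( (\vec{a} \times \vec{b}) \times \vec{a} = |\vec{a}|^2 \vec{b} \) when \( \vec{a} \cdot \vec{b} = 0 \):

```python
import numpy as np

def solve_cross(a, b):
    """Return one particular solution of r x a = b, or None if a . b != 0."""
    if not np.isclose(np.dot(a, b), 0.0):
        return None                         # no solution: b must be _|_ a
    return np.cross(a, b) / np.dot(a, a)    # r0; general solution: r0 + t a

a = np.array([1.0, 1.0, 1.0])
b = np.array([1.0, -2.0, 1.0])              # satisfies a . b = 0
r0 = solve_cross(a, b)
for t in (0.0, 2.5):                        # every member of the family works
    print(np.cross(r0 + t * a, a))          # each prints [ 1. -2.  1.] = b
```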

To determine \( \vec{r} \) uniquely, additional information is necessary, such as an extra condition of the form

\[ \vec{r} \cdot \vec{c} = \gamma, \]

where \( \vec{c} \) and \( \gamma \) are known.

Vector Triple Product to Isolate the Unknown Vector

Now suppose we are given the system

\[ \vec{r} \times \vec{a} = \vec{b}, \quad \vec{r} \cdot \vec{a} = \lambda, \]

where \( \vec{a} \), \( \vec{b} \), and \( \lambda \) are known, and \( \vec{r} \) is the unknown vector.

Unlike the single equation \( \vec{r} \times \vec{a} = \vec{b} \), which admits infinitely many solutions (all differing by a multiple of \( \vec{a} \)), the addition of the scalar constraint \( \vec{r} \cdot \vec{a} = \lambda \) restricts the solution space to a single vector. These two conditions together are sufficient to determine \( \vec{r} \) uniquely.

To solve, we employ the identity of the vector triple product:

\[ \vec{a} \times (\vec{b} \times \vec{c}) = (\vec{a} \cdot \vec{c})\vec{b} - (\vec{a} \cdot \vec{b})\vec{c}. \]

Take the cross product of both sides of the first equation with \( \vec{a} \) on the left:

\[ \vec{a} \times (\vec{r} \times \vec{a}) = \vec{a} \times \vec{b}. \]

Applying the vector triple product identity:

\[ (\vec{a} \cdot \vec{a})\vec{r} - (\vec{a} \cdot \vec{r})\vec{a} = \vec{a} \times \vec{b}. \]

Now substitute the known scalar \( \lambda = \vec{r} \cdot \vec{a} \):

\[ |\vec{a}|^2 \vec{r} - \lambda \vec{a} = \vec{a} \times \vec{b} \implies |\vec{a}|^2 \vec{r} = \vec{a} \times \vec{b} + \lambda \vec{a} \implies \vec{r} = \frac{\vec{a} \times \vec{b} + \lambda \vec{a}}{|\vec{a}|^2}. \]

Hence, the unique solution to the system

\[ \vec{r} \times \vec{a} = \vec{b}, \quad \vec{r} \cdot \vec{a} = \lambda \]

is given by

\[ \vec{r} = \frac{\vec{a} \times \vec{b} + \lambda \vec{a}}{|\vec{a}|^2}. \]

This solution expresses \( \vec{r} \) entirely in terms of the known quantities and uses a single application of the vector triple product identity to isolate \( \vec{r} \) from the cross product.
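
A direct translation of this formula into NumPy (a minimal sketch; the sample \( \vec{a}, \vec{b}, \lambda \) are arbitrary choices satisfying \( \vec{a} \cdot \vec{b} = 0 \)):

```python
import numpy as np

def solve_cross_dot(a, b, lam):
    """Unique r with r x a = b and r . a = lam (requires a . b = 0)."""
    assert np.isclose(np.dot(a, b), 0.0), "no solution unless a . b = 0"
    return (np.cross(a, b) + lam * a) / np.dot(a, a)

a = np.array([1.0, 1.0, 1.0])
b = np.array([1.0, -2.0, 1.0])
r = solve_cross_dot(a, b, lam=3.0)
print(np.cross(r, a), np.dot(r, a))   # recovers b and lam
```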

Example

Let

\[ \vec{a} = \mathbf{i} + \mathbf{j} + \mathbf{k}, \quad \vec{b} = 2\mathbf{i} - \mathbf{k}. \]

Let \( \vec{r} \) be a vector such that

\[ \vec{r} \times \vec{a} = \vec{a} \times \vec{b}, \quad \text{and} \quad \vec{r} \cdot \vec{a} = 3. \]

Find \( \vec{r} \).

Solution:

We are given

\[ \vec{r} \times \vec{a} = \vec{a} \times \vec{b}. \]

Take the cross product of both sides with \( \vec{a} \) on the left:

\[ \vec{a} \times (\vec{r} \times \vec{a}) = \vec{a} \times (\vec{a} \times \vec{b}). \]

Use the vector triple product identity:

\[ \vec{a} \times (\vec{r} \times \vec{a}) = (\vec{a} \cdot \vec{a}) \vec{r} - (\vec{a} \cdot \vec{r}) \vec{a}, \]
\[ \vec{a} \times (\vec{a} \times \vec{b}) = (\vec{a} \cdot \vec{b}) \vec{a} - (\vec{a} \cdot \vec{a}) \vec{b}. \]

So,

\[ |\vec{a}|^2 \vec{r} - (\vec{a} \cdot \vec{r}) \vec{a} = (\vec{a} \cdot \vec{b}) \vec{a} - |\vec{a}|^2 \vec{b}. \]

Substitute values:

\[ \vec{a} = \mathbf{i} + \mathbf{j} + \mathbf{k}, \quad \Rightarrow \quad |\vec{a}|^2 = 1^2 + 1^2 + 1^2 = 3, \]
\[ \vec{a} \cdot \vec{r} = 3 \quad \text{(given)}, \quad \vec{a} \cdot \vec{b} = (1)(2) + (1)(0) + (1)(-1) = 2 + 0 - 1 = 1. \]

Thus, the equation becomes

\[ 3\vec{r} - 3\vec{a} = (1)\vec{a} - 3\vec{b}. \]

Move \( 3\vec{a} \) to the right-hand side:

\[ 3\vec{r} = 3\vec{a} + \vec{a} - 3\vec{b} \implies 3\vec{r} = 4\vec{a} - 3\vec{b}. \]

Divide both sides by 3:

\[ \vec{r} = \frac{4}{3} \vec{a} - \vec{b}. \]

Substitute back \( \vec{a} = \mathbf{i} + \mathbf{j} + \mathbf{k} \), \( \vec{b} = 2\mathbf{i} - \mathbf{k} \):

\[ \vec{r} = \frac{4}{3} (\mathbf{i} + \mathbf{j} + \mathbf{k}) - (2\mathbf{i} - \mathbf{k}) = \left( \frac{4}{3} - 2 \right)\mathbf{i} + \frac{4}{3} \mathbf{j} + \left( \frac{4}{3} + 1 \right)\mathbf{k} = \left( -\frac{2}{3} \right)\mathbf{i} + \frac{4}{3} \mathbf{j} + \frac{7}{3} \mathbf{k}. \]

Hence,

\[ \boxed{ \vec{r} = -\frac{2}{3} \mathbf{i} + \frac{4}{3} \mathbf{j} + \frac{7}{3} \mathbf{k} }. \]
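
A quick numerical check of this result (a NumPy sketch using the data of the example):

```python
import numpy as np

a = np.array([1.0, 1.0, 1.0])
b = np.array([2.0, 0.0, -1.0])
r = np.array([-2/3, 4/3, 7/3])

print(np.allclose(np.cross(r, a), np.cross(a, b)))   # True: r x a = a x b
print(np.isclose(np.dot(r, a), 3.0))                 # True: r . a = 3
```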

Alternate Solution:

We are given the vector equation

\[ \vec{r} \times \vec{a} = \vec{a} \times \vec{b}. \]

Using the identity \( \vec{a} \times \vec{b} = -\vec{b} \times \vec{a} \), we rewrite the equation as

\[ \vec{r} \times \vec{a} + \vec{b} \times \vec{a} = \vec{0} \implies (\vec{r} + \vec{b}) \times \vec{a} = \vec{0}. \]

This implies that \( \vec{r} + \vec{b} \) is parallel to \( \vec{a} \), i.e.,

\[ \vec{r} + \vec{b} = \lambda \vec{a} \quad \text{for some } \lambda \in \mathbb{R} \implies \vec{r} = -\vec{b} + \lambda \vec{a}. \]

This represents a straight line passing through the point with position vector \( -\vec{b} \) and parallel to the direction \( \vec{a} \).

Now use the additional condition \( \vec{r} \cdot \vec{a} = 3 \). Substituting \( \vec{r} = -\vec{b} + \lambda \vec{a} \),

\[ (-\vec{b} + \lambda \vec{a}) \cdot \vec{a} = 3 \implies -\vec{b} \cdot \vec{a} + \lambda |\vec{a}|^2 = 3 \implies \lambda = \frac{3 + \vec{b} \cdot \vec{a}}{|\vec{a}|^2}. \]

We already computed:

\[ \vec{a} = \mathbf{i} + \mathbf{j} + \mathbf{k}, \quad \vec{b} = 2\mathbf{i} - \mathbf{k}, \]
\[ \vec{a} \cdot \vec{b} = 2 + 0 - 1 = 1, \quad |\vec{a}|^2 = 3. \]

So,

\[ \lambda = \frac{3 + 1}{3} = \frac{4}{3}. \]

Now substitute back:

\[ \vec{r} = -\vec{b} + \frac{4}{3} \vec{a} = -(2\mathbf{i} - \mathbf{k}) + \frac{4}{3} (\mathbf{i} + \mathbf{j} + \mathbf{k}) = -2\mathbf{i} + \mathbf{k} + \frac{4}{3} \mathbf{i} + \frac{4}{3} \mathbf{j} + \frac{4}{3} \mathbf{k}. \]

Group terms:

\[ \vec{r} = \left( -2 + \frac{4}{3} \right) \mathbf{i} + \frac{4}{3} \mathbf{j} + \left( 1 + \frac{4}{3} \right) \mathbf{k} = -\frac{2}{3} \mathbf{i} + \frac{4}{3} \mathbf{j} + \frac{7}{3} \mathbf{k}. \]

Hence,

\[ \boxed{ \vec{r} = -\frac{2}{3} \mathbf{i} + \frac{4}{3} \mathbf{j} + \frac{7}{3} \mathbf{k} }. \]

This confirms the result obtained by the first method.

Another Alternative:

Another way to solve the equation is by assuming the general form

\[ \vec{r} = x\mathbf{i} + y\mathbf{j} + z\mathbf{k}, \]

and substituting this into the given equations. The first condition is

\[ \vec{r} \times \vec{a} = \vec{a} \times \vec{b}, \]

with

\[ \vec{a} = \mathbf{i} + \mathbf{j} + \mathbf{k}, \quad \vec{b} = 2\mathbf{i} - \mathbf{k}. \]

Compute both sides explicitly using the component-wise cross product, and then equate the resulting vectors to obtain three scalar equations in \( x, y, z \). These three equations are not independent (the linear map \( \vec{r} \mapsto \vec{r} \times \vec{a} \) has rank 2), so the dot product condition

\[ \vec{r} \cdot \vec{a} = 3 \]

is needed as a fourth equation. Solving the resulting consistent system yields the values of \( x, y, z \).

You are encouraged to carry out this method yourself; a numerical sketch is given below.
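
A minimal NumPy sketch of the component method. Writing \( \vec{r} = (x, y, z) \) and \( \vec{a} = (1, 1, 1) \), the cross product \( \vec{r} \times \vec{a} = (y - z,\ z - x,\ x - y) \) is linear in \( (x, y, z) \); stacking the dot-product condition as a fourth row gives a consistent overdetermined system, solved here by least squares:

```python
import numpy as np

# Rows 1-3: the linear map r -> r x a with a = (1, 1, 1);
# row 4: the dot-product condition r . a = 3.
A = np.array([[ 0.0,  1.0, -1.0],
              [-1.0,  0.0,  1.0],
              [ 1.0, -1.0,  0.0],
              [ 1.0,  1.0,  1.0]])
rhs = np.array([-1.0, 3.0, -2.0, 3.0])    # a x b = (-1, 3, -2), then lambda = 3

r, *_ = np.linalg.lstsq(A, rhs, rcond=None)   # exact, since the system is consistent
print(r)    # [-0.6667  1.3333  2.3333], i.e. (-2/3, 4/3, 7/3)
```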