If you were to take a square and draw its two diagonal lines, they would cross in the center and form four right triangles; the two diagonals cross at 90 degrees. You might intuitively guess that two diagonals of a cube, each running from one corner of the cube to its opposite corner and crossing in the center, would also cross at right angles. You would be mistaken. Determining the angle at which two diagonals of a cube cross one another is slightly more complicated than it might appear at first glance, but it makes great practice for understanding the principles of geometry and trigonometry.

Define the length of an edge as one unit. Since all edges of a cube are congruent, every edge then has the same length of one unit.

Use the Pythagorean theorem to determine the length of a diagonal running from one corner to the opposite corner on the same face. Call this a “short diagonal” for the sake of clarity. Each leg of the right triangle formed is one unit long, so the diagonal must be √2 units.
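This step can be checked numerically with a short sketch (variable names here are illustrative, not part of the original text):

```python
import math

# Face ("short") diagonal of a unit cube via the Pythagorean theorem:
# both legs of the right triangle are cube edges of length 1.
edge = 1.0
short_diagonal = math.sqrt(edge**2 + edge**2)
print(short_diagonal)  # √2 ≈ 1.4142
```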

Use the Pythagorean theorem to determine the length of a diagonal running from one corner of the cube to the opposite corner on the opposite face. Call this a “long diagonal.” You have a right triangle with one side equal to 1 unit and one side equal to a “short diagonal,” √2 units. The square of the hypotenuse is equal to the sum of the squares of the sides, so the hypotenuse must be √(1² + (√2)²) = √3. Each diagonal running from one corner of the cube to the opposite corner is √3 units long.
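The same check extends to the long diagonal, reusing the face diagonal from the previous step (again a sketch with illustrative names):

```python
import math

# Long (space) diagonal of a unit cube: the right triangle has one leg
# equal to an edge (1 unit) and one leg equal to a face diagonal (√2 units).
edge = 1.0
short_diagonal = math.sqrt(2)
long_diagonal = math.sqrt(edge**2 + short_diagonal**2)
print(long_diagonal)  # √3 ≈ 1.7321
```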

Draw a rectangle to represent two long diagonals crossing in the center of the cube. You want to find the angle of their intersection. This rectangle is 1 unit tall and √2 units wide, because its short sides are cube edges and its long sides are short (face) diagonals. The long diagonals bisect one another in the center of this rectangle and form two different types of triangles. One type has one side equal to one unit and the other two sides equal to √3/2 (one half the length of a long diagonal). The other also has two sides equal to √3/2, but its third side is equal to √2. You only need to analyze one of the triangles, so take the first type and solve for the unknown angle.
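As a quick numeric sanity check (not part of the original derivation), the diagonal of a 1 × √2 rectangle should come out to √3, and half of it gives the √3/2 triangle sides:

```python
import math

# The rectangle's diagonals are the cube's long diagonals, so each
# should measure √3; half a diagonal is one of the triangle's equal sides.
rect_diagonal = math.hypot(1.0, math.sqrt(2))
half_diagonal = rect_diagonal / 2
print(rect_diagonal, half_diagonal)  # ≈ 1.7321, ≈ 0.8660
```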

Use the law of cosines, c^2 = a^2 + b^2 – 2ab cos C, to solve for the unknown angle of this triangle, where c = 1 is the side opposite the angle C, and both a and b are equal to √3/2. Plugging these values into the equation, you will find that the cosine of your unknown angle is 1/3. Taking the inverse cosine of 1/3 gives an angle of approximately 70.5 degrees. (The other pair of vertical angles at the intersection measures 180 – 70.5 = 109.5 degrees, which is the angle in the second type of triangle.)
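The final calculation can be sketched in a few lines, along with an optional cross-check using vectors along two long diagonals (the specific coordinates chosen below are an assumption for illustration):

```python
import math

# Law of cosines: c² = a² + b² − 2ab·cos(C), solved for cos(C).
a = b = math.sqrt(3) / 2  # half-length of a long diagonal
c = 1.0                   # cube edge, opposite the unknown angle
cos_C = (a**2 + b**2 - c**2) / (2 * a * b)
angle_deg = math.degrees(math.acos(cos_C))
print(cos_C, round(angle_deg, 1))  # 1/3, 70.5

# Cross-check with direction vectors of two long diagonals:
# (0,0,0)→(1,1,1) and (1,0,0)→(0,1,1).
u = (1, 1, 1)
v = (-1, 1, 1)
dot = sum(ui * vi for ui, vi in zip(u, v))
cos_check = dot / (math.sqrt(3) * math.sqrt(3))
print(cos_check)  # 1/3 again
```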