Chebyshev distance is a metric defined on a vector space where the distance between two vectors is the greatest of their differences along any coordinate dimension. For example, in two dimensions, the distance from a point (x, y) to each of its 8 adjacent points is 1 (see the images below: the left one shows Chebyshev distance, the right one Manhattan distance).
[Images: Chebyshev distance (left) | Manhattan distance (right)]
The relationship between Chebyshev distance (also known as L∞) and Manhattan distance (also known as L1) is subtle. In one dimension they are equivalent. In two dimensions, the set of points whose Chebyshev distance to a point (x, y) equals r forms a "circle" in the shape of an axis-aligned square with side length 2r. Interestingly, the set of points whose Manhattan distance to (x, y) equals r also forms a square, rotated 45 degrees, with side length sqrt(2)*r. So we can rotate and scale the axes to transform Chebyshev distance into Manhattan distance.
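Concretely, the usual transform is (x, y) -> (x + y, x - y): since max(|dx|, |dy|) = (|dx + dy| + |dx - dy|) / 2, the Chebyshev distance between two points is exactly half the Manhattan distance between their rotated images. A minimal C++ sketch checking this on one pair of sample points (the coordinates are arbitrary, just for illustration):

#include <bits/stdc++.h>

// Chebyshev distance between (x1, y1) and (x2, y2).
long long chebyshev(long long x1, long long y1, long long x2, long long y2) {
    return std::max(std::llabs(x1 - x2), std::llabs(y1 - y2));
}

// Manhattan distance between (x1, y1) and (x2, y2).
long long manhattan(long long x1, long long y1, long long x2, long long y2) {
    return std::llabs(x1 - x2) + std::llabs(y1 - y2);
}

int main() {
    long long x1 = 3, y1 = 7, x2 = -2, y2 = 4;  // arbitrary sample points
    // In the rotated coordinates (x + y, x - y), the Manhattan distance
    // is exactly twice the Chebyshev distance in the original plane.
    long long rotated = manhattan(x1 + y1, x1 - y1, x2 + y2, x2 - y2);
    assert(rotated == 2 * chebyshev(x1, y1, x2, y2));
    std::cout << "Chebyshev = " << chebyshev(x1, y1, x2, y2)
              << ", rotated Manhattan = " << rotated << '\n';
}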
However, this equivalence does not generalize to higher dimensions: in three dimensions, for example, the Chebyshev unit "sphere" is a cube while the Manhattan unit "sphere" is an octahedron, so no rotation and scaling of the axes can map one onto the other.
So what is the advantage of Manhattan distance? Independence. More precisely, the Manhattan distance decomposes into a sum of per-dimension terms, so each coordinate can be handled separately.
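To see what this independence buys us, here is a small sketch (my own illustration, not from the original post): when the meeting cell can be anywhere on the grid, the total Manhattan cost splits into a sum over x plus a sum over y, and each one-dimensional sum of absolute differences is minimized at the median of that coordinate.

#include <bits/stdc++.h>
using namespace std;

// Minimum total Manhattan distance when the meeting cell can be anywhere:
// the cost splits into independent x and y parts, and each part is
// minimized at the median of that coordinate.
long long minTotalManhattan(vector<long long> xs, vector<long long> ys) {
    auto cost1D = [](vector<long long> a) {
        sort(a.begin(), a.end());
        long long median = a[a.size() / 2], s = 0;
        for (long long v : a) s += llabs(v - median);
        return s;
    };
    return cost1D(xs) + cost1D(ys);  // the two dimensions never interact
}

int main() {
    // Hypothetical sample: people at (0, 0), (4, 2) and (2, 6).
    cout << minTotalManhattan({0, 4, 2}, {0, 2, 6}) << '\n';  // prints 10
}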
Now you can finish the problem Meeting Point. Of course, if you want to pass the system tests, you will need to implement it with an efficient trick rather than brute force.
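Putting the two ideas together, here is a rough sketch for a Meeting Point style problem, assuming (as I recall the statement) that N people move like chess kings and must meet at one of the N given houses; the input format and the helper sumAbsDiff are my own choices, not taken from the original problem:

#include <bits/stdc++.h>
using namespace std;

// Sum of |v - a[i]| over all i, where a is sorted and pre holds its prefix sums.
long long sumAbsDiff(const vector<long long>& a, const vector<long long>& pre, long long v) {
    long long n = a.size();
    long long k = lower_bound(a.begin(), a.end(), v) - a.begin();   // a[0..k-1] < v
    long long left  = k * v - pre[k];                   // v lies above these elements
    long long right = (pre[n] - pre[k]) - (n - k) * v;  // v lies at or below these
    return left + right;
}

int main() {
    int n;
    cin >> n;
    vector<long long> x(n), y(n), u(n), v(n);
    for (int i = 0; i < n; ++i) {
        cin >> x[i] >> y[i];
        u[i] = x[i] + y[i];  // rotated coordinates: Chebyshev in (x, y)
        v[i] = x[i] - y[i];  // becomes Manhattan / 2 in (u, v)
    }
    vector<long long> su = u, sv = v;
    sort(su.begin(), su.end());
    sort(sv.begin(), sv.end());
    vector<long long> pu(n + 1, 0), pv(n + 1, 0);
    for (int i = 0; i < n; ++i) {
        pu[i + 1] = pu[i] + su[i];
        pv[i + 1] = pv[i] + sv[i];
    }
    long long best = LLONG_MAX;
    for (int i = 0; i < n; ++i) {
        // Total king-move (Chebyshev) distance to house i is half of the
        // total Manhattan distance to it in the rotated (u, v) coordinates.
        long long total = sumAbsDiff(su, pu, u[i]) + sumAbsDiff(sv, pv, v[i]);
        best = min(best, total / 2);
    }
    cout << best << '\n';
}

Sorting plus prefix sums lets each candidate house be evaluated in O(log n), which is presumably the kind of trick the system tests expect instead of the O(n^2) brute force.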
Today is April 22nd, 2018, exactly 6 years after the day this post was published!
Somehow I ran into this post while solving a very similar problem from exactly the same site: https://www.hackerrank.com/contests/hourrank-27/challenges/moving-the-kings/problem
Such a coincidence, haha!