## Maximal $rdepth$

In this chapter we highlight some bounds on the maximal regression depth of any fit, given a dataset $Z_n = \{(x_i, y_i), i = 1,...,n \} \subset \rm I\!R^2$.

#### Lower bound

**Lower bound.** Given any dataset $Z_n = \{(x_i, y_i), i = 1,...,n \} \subset \rm I\!R^2$,
$$\max_{l} rdepth(l, Z_n) \ge \left\lceil\frac{n}{3}\right\rceil.$$

We prove this lower bound by showing that for any dataset we can always construct a fit $\kappa$ whose regression depth is at least $\left\lceil\frac{n}{3}\right\rceil$.

**Construction of $\kappa$**
1. Sort the observations of $Z_n$ according to their x-coordinate, such that $x_1 < x_2 < ... < x_n$.
2. Split $Z_n$ into three partitions $L, M, R$ such that (with $m \in \rm I\!N$):
• If $n = 3m$ : $\mid L \mid = \mid M \mid = \mid R \mid = \frac{n}{3}$
• If $n = 3m+1$ : $\mid L \mid = \mid R \mid = \left\lfloor\frac{n}{3}\right\rfloor$    and   $\mid M \mid = \left\lfloor\frac{n}{3}\right\rfloor + 1$
• If $n = 3m+2$ : $\mid L \mid = \mid R \mid = \left\lfloor\frac{n}{3}\right\rfloor+1$    and   $\mid M \mid = \left\lfloor\frac{n}{3}\right\rfloor$
where $L$ consists of the $\mid L \mid$ leftmost observations, $M$ of the next $\mid M \mid$ observations, and $R$ of the remaining ones.
3. Find $\kappa$ such that $\kappa$ simultaneously bisects $P_1$ and $P_2$, with $P_1 = L \cup M$ and $P_2 = M \cup R$. Such a $\kappa$ must exist by the ham-sandwich theorem.
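Steps 1 and 2 can be sketched as follows; `split_LMR` is a hypothetical helper of ours, and finding the bisecting fit of step 3 (the ham-sandwich cut) is deliberately left out.

```python
def split_LMR(pts):
    """Steps 1-2: sort the observations by x-coordinate and split them into
    L, M, R with the sizes given above (floor(n/3) is n // 3)."""
    pts = sorted(pts)              # step 1: sort by x-coordinate
    n = len(pts)
    m, rem = divmod(n, 3)
    if rem == 0:
        sizes = (m, m, m)
    elif rem == 1:
        sizes = (m, m + 1, m)      # the middle part takes the extra point
    else:
        sizes = (m + 1, m, m + 1)  # the outer parts take the extra points
    L = pts[:sizes[0]]
    M = pts[sizes[0]:sizes[0] + sizes[1]]
    R = pts[sizes[0] + sizes[1]:]
    return L, M, R
```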
**Regression depth of $\kappa$**
Given a fit $\kappa$ obtained by the construction above, we define $P^+_2$ ($P^0_2$, $P^-_2$) as the observations of $P_2$ whose residuals with respect to $\kappa$ are positive (null, negative).

By construction of $\kappa$ we know that $\mid P^+_2 \mid \le \left\lfloor n/3 \right\rfloor$, $\mid P^-_2 \mid \le \left\lfloor n/3 \right\rfloor$ and $\mid P^+_2 \mid + \mid P^0_2 \mid + \mid P^-_2 \mid = \begin{cases} 2n/3 & \text{if } n = 3m \text{ } (m \in \rm I\!N)\\ 2 \left\lfloor n/3 \right\rfloor + 1 & \text{otherwise.} \end{cases}$ If we choose a tilting point $v$ for the fit $\kappa$ to the left of $P_2$, then by tilting the line around $v$ until it is vertical, $\kappa$ has to pass either all points of $P^+_2 \cup P^0_2$ or all points of $P^-_2 \cup P^0_2$.

This leads to a regression depth of at least $\min(\mid P^+_2 \cup P^0_2 \mid , \mid P^-_2 \cup P^0_2 \mid) \ge \left\lceil n/3 \right\rceil$.
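To make the bound tangible, here is a minimal brute-force sketch in Python; `rdepth` and `max_rdepth` are our own illustrative helpers, not a standard library API. `rdepth` scans candidate tilting points between consecutive x-coordinates, and `max_rdepth` searches for a deepest fit among lines through pairs of observations (since zero residuals count on both sides, restricting to such lines loses nothing).

```python
import math
import random
from fractions import Fraction
from itertools import combinations

def rdepth(a, b, pts):
    """Regression depth of the fit y = a*x + b: the minimum, over candidate
    tilting points v, of the number of observations the line must pass when
    rotated around v to the vertical. Zero residuals count on both sides."""
    xs = sorted({x for x, _ in pts})
    # one candidate per distinct configuration: left of the data, between
    # consecutive x-coordinates, and right of the data
    cands = [xs[0] - 1] + [(u + w) / 2 for u, w in zip(xs, xs[1:])] + [xs[-1] + 1]
    res = [(x, y - (a * x + b)) for x, y in pts]
    best = len(pts)
    for v in cands:
        lp = sum(1 for x, r in res if x <= v and r >= 0)  # left, r >= 0
        lm = sum(1 for x, r in res if x <= v and r <= 0)  # left, r <= 0
        rp = sum(1 for x, r in res if x > v and r >= 0)   # right, r >= 0
        rm = sum(1 for x, r in res if x > v and r <= 0)   # right, r <= 0
        best = min(best, lp + rm, lm + rp)
    return best

def max_rdepth(pts):
    """Maximal depth over all non-vertical lines through two observations;
    since zero residuals count on both sides, a deepest fit can always be
    chosen among such lines."""
    best = 0
    for (x1, y1), (x2, y2) in combinations(pts, 2):
        if x1 != x2:
            a = (y2 - y1) / (x2 - x1)
            best = max(best, rdepth(a, y1 - a * x1, pts))
    return best

# a small random dataset with distinct x-coordinates; exact rational
# arithmetic keeps the zero-residual comparisons reliable
random.seed(0)
demo = [(Fraction(i), Fraction(random.randint(-50, 50))) for i in range(10)]
assert max_rdepth(demo) >= math.ceil(len(demo) / 3)  # lower bound: ceil(10/3) = 4
```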

The following picture illustrates the previous proof and the construction of the fit $\kappa$.

• The space is split into the three sets $L$, $M$, $R$.
• The points in green are points contained in $P^-_2$
• The points in yellow are points contained in $P^+_2$
• The point in orange is contained in $P^0_2$
• The point in red is the tilting point $v$ on the left of $P_2$.

In the example, $n = 10$ and we see that $\min(\mid P^+_2 \cup P^0_2 \mid , \mid P^-_2 \cup P^0_2 \mid) \ge \left\lceil 10/3 \right\rceil = 4$.

#### Upper bound

In this section we present two upper bounds on the regression depth of any fit, given a dataset $Z_n = \{(x_i, y_i), i = 1,...,n \} \subset \rm I\!R^2$. The first is a general, naïve bound; the second is more specific and more interesting.

**Naïve upper bound for regression depth.** By the definition of regression depth given in chapter two, the maximum number of observations that can be removed from $Z_n$ for a candidate fit $l$ to become a nonfit is $n$, so $rdepth(l, Z_n) \le n$ for every fit $l$. This extreme occurs when $l$ passes through all points of $Z_n$.
**Upper bound for regression depth.** Given a dataset $Z_n = \{(x_i, y_i), i = 1,...,n \} \subset \rm I\!R^2$ whose observations are in general position,
$$\max_{l} rdepth(l, Z_n) \le \left\lfloor \frac{n+2}{2} \right\rfloor.$$
Proof: We define $Z_{n-k}$ as the dataset $Z_n$ from which we remove the $k$ observations that lie on the fit $l$ with maximal regression depth. Since the observations of $Z_n$ are in general position, at most two of them can lie on $l$; we take the worst case $k = 2$.

Given $L_{n-2}$, the set of lines obtained by applying the dual transformation $D$ to the $n-2$ observations of $Z_{n-2}$, and the point $l$ which is the image of the fit with maximal regression depth in the dual space, we know that for any direction $u$ with $\mid\mid u \mid\mid = 1$ (not parallel to any line of $L_{n-2}$): $$\#\{\text{intersections between the halflines } [l , l \pm u\rangle \text{ and the lines of } L_{n-2} \} = n-2,$$ which implies that $0 \le rdepth(l, Z_{n-2}) \le \left\lfloor \frac{n-2}{2} \right\rfloor$. Adding back the two observations that lay on $l$ gives $$2 \le rdepth(l, Z_n) \le \left\lfloor \frac{n-2}{2} \right\rfloor + 2 = \left\lfloor \frac{n+2}{2} \right\rfloor.$$
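As a sanity check of this bound, the following self-contained sketch (our own brute-force helpers, not a library API) verifies it on a small random dataset that is almost surely in general position.

```python
import math
import random
from fractions import Fraction
from itertools import combinations

def rdepth(a, b, pts):
    # depth of y = a*x + b: minimum over candidate tilting points v of the
    # observations the line must pass when rotated around v to the vertical
    # (zero residuals count on both sides)
    xs = sorted({x for x, _ in pts})
    cands = [xs[0] - 1] + [(u + w) / 2 for u, w in zip(xs, xs[1:])] + [xs[-1] + 1]
    res = [(x, y - (a * x + b)) for x, y in pts]
    return min(
        min(sum(1 for x, r in res if x <= v and r >= 0) +
            sum(1 for x, r in res if x > v and r <= 0),
            sum(1 for x, r in res if x <= v and r <= 0) +
            sum(1 for x, r in res if x > v and r >= 0))
        for v in cands)

def max_rdepth(pts):
    # a deepest fit can always be chosen through two observations
    best = 0
    for (x1, y1), (x2, y2) in combinations(pts, 2):
        if x1 != x2:
            a = (y2 - y1) / (x2 - x1)
            best = max(best, rdepth(a, y1 - a * x1, pts))
    return best

# random y-values make three collinear points (almost surely) impossible,
# so the observations are in general position
random.seed(1)
n = 9
Z = [(Fraction(i), Fraction(random.random())) for i in range(n)]
assert math.ceil(n / 3) <= max_rdepth(Z) <= (n + 2) // 2  # between 3 and 5
```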

#### When the lower bound is met

If all observations of $Z_n$ have distinct $x_i$ and lie on a convex (or concave) curve, then $$\max_{l} rdepth(l, Z_n) = \left\lceil \frac{n+2}{3} \right\rceil.$$ Proof: Given a dataset $Z_n$ and a fit $l$ with maximal regression depth intersecting two observations of $Z_n$, the fit $l$ splits the set of the remaining $Z_{n-2}$ observations into three sets with alternating residual signs. The maximal regression depth is then equal to $2$ plus the number of observations in the smallest set. By the lower bound established earlier in this chapter, the smallest of the three sets must contain at least $\left\lfloor \frac{n-2}{3} \right\rfloor$ observations, and $$\left\lfloor \frac{n-2}{3} \right\rfloor + 2 = \left\lceil \frac{n+2}{3} \right\rceil.$$
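The following self-contained sketch (again our own brute-force helpers, not a library API) checks the equality on points of the parabola $y = x^2$, a convex curve with distinct $x_i$:

```python
import math
from fractions import Fraction
from itertools import combinations

def rdepth(a, b, pts):
    # depth of y = a*x + b: minimum over candidate tilting points v of the
    # observations the line must pass when rotated around v to the vertical
    xs = sorted({x for x, _ in pts})
    cands = [xs[0] - 1] + [(u + w) / 2 for u, w in zip(xs, xs[1:])] + [xs[-1] + 1]
    res = [(x, y - (a * x + b)) for x, y in pts]
    return min(
        min(sum(1 for x, r in res if x <= v and r >= 0) +
            sum(1 for x, r in res if x > v and r <= 0),
            sum(1 for x, r in res if x <= v and r <= 0) +
            sum(1 for x, r in res if x > v and r >= 0))
        for v in cands)

def max_rdepth(pts):
    # a deepest fit can always be chosen through two observations
    best = 0
    for (x1, y1), (x2, y2) in combinations(pts, 2):
        if x1 != x2:
            a = (y2 - y1) / (x2 - x1)
            best = max(best, rdepth(a, y1 - a * x1, pts))
    return best

# nine points on the convex curve y = x^2, all with distinct x-coordinates
n = 9
Z = [(Fraction(x), Fraction(x) ** 2) for x in range(n)]
assert max_rdepth(Z) == math.ceil((n + 2) / 3)  # = 4
```

For instance, the chord through $(3, 9)$ and $(6, 36)$ splits the remaining points into groups of sizes $3, 2, 2$ with alternating signs, giving depth $2 + 2 = 4$.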

The following picture illustrates the previous proof: $l$ is the fit with maximal regression depth, and the three regions have alternating residual signs. The smallest set being the middle one, with only one element, the maximal regression depth equals $2 + 1 = 3 = \left\lceil \frac{7+2}{3} \right\rceil$ (here $n = 7$).