Characterizations of the generalized inverse Gaussian, asymmetric Laplace, and shifted (truncated) exponential laws via independence properties
Abstract.
We prove three new characterizations of the generalized inverse Gaussian (GIG), asymmetric Laplace (AL), shifted exponential (sExp), and shifted truncated exponential (stExp) distributions in terms of non-trivial independence preserving transformations, which were conjectured by Croydon and Sasada in [4]. We do this under the assumptions of absolute continuity and mild regularity conditions on the densities.
Croydon and Sasada [5] use these independence preserving transformations to analyze statistical mechanical models which display KPZ behavior. Our characterizations show the integrability of these models only holds for these four specific distributions in the absolutely continuous setting.
Key words and phrases: independence preserving, Burke property, stationarity, generalized inverse Gaussian, asymmetric Laplace, shifted exponential, characterizing distributions
2020 Mathematics Subject Classification:
1. Introduction
1.1. Background
Several important distributions preserve independence under certain transformations. Perhaps the most classic example is the normal distribution. Kac [6] and Bernstein [1] both showed that given two non-degenerate real independent random variables X and Y, the two random variables (X+Y)/√2 and (X−Y)/√2 are independent if and only if X and Y are normally distributed with the same variance. This in turn implies that (X+Y)/√2 and (X−Y)/√2 are both normal with the same variance as well. Note that this transformation is an involution, a property which all transformations in this paper will possess.
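As a quick numerical illustration of the Kac–Bernstein dichotomy, the following simulation sketch (using only the Python standard library; not part of any proof, and of course zero correlation is only a necessary condition for independence) shows that the normalized sum and difference are uncorrelated exactly when the two normal variances agree:

```python
import random
import math

def corr(xs, ys):
    """Sample Pearson correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

random.seed(0)
n = 200_000

# Equal variances: (X+Y, X-Y) should be (nearly) uncorrelated.
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [random.gauss(0.0, 1.0) for _ in range(n)]
c_equal = corr([x + y for x, y in zip(xs, ys)],
               [x - y for x, y in zip(xs, ys)])

# Unequal variances: Cov(X+Y, X-Y) = Var(X) - Var(Y) != 0,
# so the pair cannot be independent.
ys2 = [random.gauss(0.0, 2.0) for _ in range(n)]
c_unequal = corr([x + y for x, y in zip(xs, ys2)],
                 [x - y for x, y in zip(xs, ys2)])

print(round(c_equal, 3), round(c_unequal, 3))
```

The population correlation in the unequal case is (Var(X) − Var(Y)) / (Var(X) + Var(Y)) = −3/5, which the simulation reproduces closely.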
Similar independence properties have also been proven for the gamma distribution in Lukacs [8], the geometric and exponential distributions in Crawford [2], the beta distribution in Seshadri and Wesołowski [11], and the product of generalized inverse Gaussian and gamma distributions in Matsumoto and Yor [9] and Seshadri and Wesołowski [10].
The characterizations of Lukacs [8] and Seshadri and Wesołowski [11] were used in Chaumont and Noack [3] to characterize stationarity in 1+1 dimensional lattice directed polymer models, which in turn characterizes the Burke property in the same setting. While many statistical mechanical models are conjectured to belong to the KPZ universality class, computations quickly become intractable without the presence of some sort of integrability structure such as stationarity. In fact, at the moment, only exactly solvable models have allowed for KPZ-type computations, highlighting how useful independence preserving transformations can be. The characterization of independence preserving distributions can therefore aid in the search for new exactly solvable models displaying KPZ behavior.
In this paper, we prove characterizations via independence preserving transformations for the generalized inverse Gaussian (GIG), the asymmetric Laplace (AL), the shifted exponential (sExp), and the shifted truncated exponential (stExp) distributions, all of which were conjectured in Croydon and Sasada [4].
1.2. Main Results
We now recall the definitions of the distributions followed by the theorems that characterize them.
Generalized inverse Gaussian (GIG) distribution: For λ ∈ ℝ and a, b > 0, the generalized inverse Gaussian distribution with parameters (λ, a, b), which we denote GIG(λ, a, b), has density
where the normalizing constant is expressed in terms of K_λ, the modified Bessel function of the second kind with parameter λ.
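For concreteness, one standard parameterization of the GIG(λ, a, b) density (conventions vary across the literature, and the paper's normalization may differ by constants) is

```latex
f_{\mathrm{GIG}(\lambda,a,b)}(x)
  = \frac{(a/b)^{\lambda/2}}{2K_{\lambda}(\sqrt{ab})}\,
    x^{\lambda-1}\exp\!\Big(-\tfrac{1}{2}\big(ax + b x^{-1}\big)\Big),
  \qquad x>0,
```

where the normalizing constant follows from the integral identity \(\int_0^\infty x^{\lambda-1}e^{-(ax+b/x)/2}\,dx = 2(b/a)^{\lambda/2}K_{\lambda}(\sqrt{ab})\).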
Theorem 1.1.
Let with and be the involution given by
(1.1) |
Let X and Y be (0,∞)-valued independent random variables with twice-differentiable densities that are strictly positive throughout (0,∞). Then, the random variables U and V are independent if and only if there exist λ ∈ ℝ and a, b > 0 such that
in which case, U GIG and V GIG. Hence, if moreover (U,V) has the same distribution as (X,Y), then X GIG and Y GIG for some and .
Remark: A special case of this theorem, for particular parameter values, has already been proven by Letac and Wesołowski [7, Theorem 4.1], where no density assumptions were needed.
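For comparison, the Matsumoto–Yor independence property proved in Letac and Wesołowski [7] can be stated as follows, in one common normalization (the parameter conventions here are illustrative and may differ from the paper's):

```latex
\text{Let } p,a,b>0,\qquad
X \sim \mathrm{GIG}(-p,a,b),\qquad
Y \sim \mathrm{Gamma}\!\big(p,\text{rate } \tfrac{a}{2}\big),\qquad
X \perp Y.
\text{Then } U = (X+Y)^{-1} \text{ and } V = X^{-1}-(X+Y)^{-1}
\text{ are independent, with}
U \sim \mathrm{GIG}(-p,b,a), \qquad
V \sim \mathrm{Gamma}\!\big(p,\text{rate } \tfrac{b}{2}\big).
```

Note that (x, y) ↦ ((x+y)⁻¹, x⁻¹ − (x+y)⁻¹) is itself an involution, in line with the transformations considered in this paper.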
Now recall the definition of the asymmetric Laplace (AL) distribution.
Asymmetric Laplace (AL) distribution: For p, q > 0, the asymmetric Laplace distribution with parameters (p, q), which we denote AL(p, q), has density
where .
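In one common two-rate parameterization (the paper's AL(p, q) convention may differ), the asymmetric Laplace density with rates p, q > 0 is

```latex
f_{\mathrm{AL}(p,q)}(x) =
  \frac{pq}{p+q}
  \begin{cases}
    e^{-px}, & x \ge 0,\\[2pt]
    e^{qx},  & x < 0,
  \end{cases}
```

which integrates to 1 since \(\frac{pq}{p+q}\big(\frac{1}{p}+\frac{1}{q}\big) = 1\).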
Theorem 1.2.
Let be the involution given by
(1.2) |
Let X and Y be ℝ-valued independent random variables with densities that never vanish and are twice-differentiable on ℝ. Then the random variables U and V are independent if and only if there exist such that
in which case, U AL(r,q) and V AL(q+r,p). Hence, if moreover (U,V) has the same distribution as (X,Y), then X AL(p,q) and Y AL(p+q,p).
Notice that Theorem 1.2 has striking similarities with Wesołowski [12], who showed that given two non-degenerate -valued independent random variables , the random variables under the transformation
are independent if and only if where denotes the beta distribution with shape parameters . Theorem 1.2 has a similar condition where denotes the asymmetric Laplace (AL) distribution with parameters .
Finally, we recall the definitions of the shifted exponential (sExp) and shifted truncated exponential (stExp) distributions. The definitions are then followed by Theorem 1.3 which characterizes them.
Shifted exponential distribution: For , the shifted exponential distribution with parameters , which we denote sExp, has density
where .
Shifted truncated exponential distribution: For with , the shifted truncated exponential distribution with parameters , which we denote stExp, has density
where .
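In generic notation (the parameter letters here are illustrative, not the paper's), a shifted exponential with rate c > 0 and shift s ∈ ℝ, and its truncation to an interval of length ℓ > 0, have densities

```latex
f_{\mathrm{sExp}}(x)  = c\,e^{-c(x-s)}, \qquad x > s,
\qquad\qquad
f_{\mathrm{stExp}}(x) = \frac{c\,e^{-c(x-s)}}{1-e^{-c\ell}}, \qquad s < x < s+\ell,
```

where the stExp normalizing constant comes from \(\int_s^{s+\ell} c\,e^{-c(x-s)}\,dx = 1-e^{-c\ell}\).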
Theorem 1.3.
Let and define
(1.3) |
Then is an involution, while is a bijection. Let X and Y be -valued independent random variables with densities satisfying when with otherwise, and when with otherwise. Moreover, assume the densities are twice-differentiable inside their respective supports. It is then the case that (U,V) are independent if and only if there exists such that
in which case, U stExp and V sExp. Hence, if moreover (U,V) has the same distribution as (X,Y), then X stExp and Y sExp for some . Note that is an involution.
1.3. Structure of the paper
Each of our main results corresponds to a conjecture in Croydon and Sasada [4]. Theorem 1.1 solves Conjecture 8.6, Theorem 1.2 solves Conjecture 8.15, and Theorem 1.3 solves Conjecture 8.10.
We first prove Theorems 1.2 and 1.3 and then prove Theorem 1.1, saving it for last because it is the most involved of the three. In Section 2, we give the proof of Theorem 1.2. We then give the proof of Theorem 1.3 in Section 3. Finally, we prove Theorem 1.1 in Section 4. All three theorems are proven using the methods of Chaumont and Noack [3].
1.4. Acknowledgements
The authors would like to thank Timo Seppäläinen and Philippe Sosoe for their valuable insights.
2. Proof of Theorem 1.2
We begin by partitioning the (x, y) plane into sections so that the minimum functions in (1.2) may be simplified. Define connected open subsets of the plane by
See Figure 1 below.
Recall that the transformation in this theorem is
which maps ➀ ↔ ➃, ➁ ↔ ➂, and ➄ → ➄. Therefore, defining the set O = ➀ ∪ ➁ ∪ ➂ ∪ ➃ ∪ ➄, we see that the transformation is a smooth involution on the open set O. Notice that the complement of O has Lebesgue measure 0 and therefore has no effect on the distributions of the pairs of random variables (X, Y) and (U, V). We will therefore only focus on the joint densities with inputs in O, which in turn implies the outputs also lie in O.
Our approach begins with the joint densities of and , which are related in the following manner
(2.1) |
where denotes the probability density function of , and is the Jacobian determinant of the involution .
One can check (as we do later) that the Jacobian determinant is nonzero throughout ➀, ➁, ➂, ➃, ➄ and therefore throughout O.
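The relation (2.1) is an instance of the standard change-of-variables formula; in generic notation, if (U, V) = F(X, Y) with F a smooth involution on O, then

```latex
f_{(U,V)}(u,v) \;=\; f_{(X,Y)}\big(F(u,v)\big)\,\bigl|J(u,v)\bigr|,
\qquad (u,v)\in O,
```

where J(u, v) is the Jacobian determinant of F at (u, v); no separate inverse map appears because F⁻¹ = F.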
Proof.
The if part follows from an easy computation using (2.1). We now prove the only if part. Since X and Y are independent and U and V are independent, (2.1) simplifies to
(2.2) |
Since neither nor vanish on , (2.2) implies that neither nor vanish on . Since is a smooth involution on , and and are both twice-differentiable on , the same follows for and .
In section ➀, where and , the transformation simplifies to
which has Jacobian determinant . Plugging the transformation into (2.2), we get
(2.3) |
Taking the logarithm of both sides, and setting gives
(2.4) |
Differentiating (2.4) with respect to yields
(2.5) |
Now differentiate (2.5) with respect to to obtain
(2.6) |
The second equality holds for all because in section ➀, . Thus must be linear on and there must exist some real constants such that
(2.7) |
This implies
(2.8) |
Substituting the derivative of (2.7) into (2.5) gives for all , which in turn implies the existence of some real constant such that
(2.9) |
We have thus shown that the p.d.f.’s of and have exponential forms in the domains and respectively.
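The one-variable mechanism behind (2.4)–(2.9), which is reused in the other sections, is the classical additive functional-equation argument; a generic instance under the standing differentiability assumptions reads

```latex
a(x) + b(y) = c(x+y)\ \ \forall\, x,y
\;\Longrightarrow\;
a'(x) = c'(x+y)
\;\Longrightarrow\;
0 = \partial_y\, c'(x+y) = c''(x+y),
```

so c'' ≡ 0, hence c' is constant and a' = b' = c' are constant as well; all three functions are affine. Applied to logarithms of densities, affine log-densities are exactly the exponential forms obtained in (2.7)–(2.9).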
In section ➁, where and , the transformation simplifies to
whose Jacobian determinant is . Plugging this transformation into (2.2), we get
(2.10) |
Taking the logarithm of both sides, setting , and differentiating with respect to gives
(2.11) |
Now differentiate (2.11) with respect to to obtain
(2.12) |
The second equality holds for all , because in section ➁, . Thus must be linear throughout and there must exist real constants such that
(2.13) |
This implies that
(2.14) |
Differentiating (2.13) and substituting into (2.11) gives for all which implies the existence of a real constant such that
(2.15) |
We have thus shown that the p.d.f.’s of and both have exponential forms in the domain .
In section ➃, where , the transformation simplifies to
which has Jacobian determinant . Plugging this transformation into (2.2), we get
(2.16) |
Taking the logarithm of both sides, setting , and differentiating with respect to gives
(2.17) |
Now differentiate (2.17) with respect to to obtain
(2.18) |
The second equality holds for all because in ➃, . Thus must be linear throughout , meaning there exist real constants such that
(2.19) |
This implies that
(2.20) |
Substituting the derivative of (2.19) into (2.17) gives for all , which in turn implies the existence of a real constant such that
(2.21) |
We have thus shown that the p.d.f.’s of and have exponential forms in the domains and respectively.
So far, we have proven that the p.d.f.’s of and both have exponential forms in the domain and the p.d.f’s of and both have exponential forms in the domain . Now, if we substitute (2.14), (2.8), (2.9) into (2.3) and (2.20), (2.15), (2.21) into (2.16), we will see that and also have exponential forms in the domain .
Hence, we have
(2.22) |
(2.23) |
Now that we have shown all the densities have exponential forms everywhere, we proceed to show that for each of the densities the respective constant coefficients in the domains and are equal. Substituting (2.14), (2.8), (2.15), and (2.22) into (2.10) gives
which implies that . Therefore, comparing (2.9) with (2.15) allows us to conclude that the constant coefficients in the domains and are equal for .
In section ➂, where , the transformation simplifies to
which has Jacobian determinant . Plugging this transformation into (2.2), we get
(2.24) |
Substituting (2.14), (2.23), (2.15), and (2.21) into (2.24) gives
which implies that . Therefore, comparing (2.20) with (2.14) allows us to conclude that the constant coefficients in the domains and are equal for .
In section ➄, where and , the transformation simplifies to
which has Jacobian determinant . Plugging this transformation into (2.2), we get
(2.25) |
Substituting (2.20), (2.8), (2.9), and (2.21) into (2.25) gives
(2.26) |
Simplifying (2.26) yields
Combining this with the fact that and implies that and . Therefore, comparing (2.22) with (2.21) and (2.23) with (2.8) allows us to conclude that the constant coefficients in the domains and are equal for and .
Finally, all that remains is to prove that the constant coefficients in their p.d.f.’s have the conjectured form. Using the fact that the p.d.f. of a random variable must integrate to 1, we obtain the densities of . Specifically, we combine and for , and for , and for , and and for to obtain
where are positive constants (due to the fact that the densities are integrable). Replacing by gives us the p.d.f.’s that the conjecture predicts. We have thus proven the only if part of the conjecture. ∎
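A remark on the final normalization step: if a density equals κ e^{qx} on (−∞, 0) and κ e^{−px} on (0, ∞) with a single multiplicative constant κ (generic letters, for illustration), then integrability and total mass one pin down both the signs and the constant:

```latex
\int_{-\infty}^{0}\kappa\,e^{qx}\,dx + \int_{0}^{\infty}\kappa\,e^{-px}\,dx
 = \kappa\Big(\frac{1}{q} + \frac{1}{p}\Big) = 1
\;\Longrightarrow\;
p,q>0,\qquad \kappa = \frac{pq}{p+q},
```

which is precisely the asymmetric Laplace normalization.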
3. Proof of Theorem 1.3
As in the proof of Theorem 1.2, we begin by partitioning the (x, y) plane into sections so that the minimum functions in (1.3) may be simplified. Define connected open subsets of the plane by
See Figure 2 below.
Recall that the transformation in this theorem is
which maps ➀ → ➀ and ➁ → ➁. Therefore, defining the set O = ➀ ∪ ➁, we see that the transformation is a smooth involution on the open set O. Notice that the complement of O has Lebesgue measure 0 and therefore has no effect on the distributions of the pairs of random variables (X, Y) and (U, V). We will therefore only focus on the joint densities with inputs in O, which in turn implies the outputs also lie in O.
Our approach again begins with the joint densities of and , which are related in the following manner
(3.1) |
where denotes the probability density function of , and is the Jacobian determinant of the involution . One can check (as we do later) that the Jacobian determinant is nonzero throughout ➀, ➁, and therefore throughout O.
Proof.
The if part is easy to check by routine calculation using (3.1). We now prove the only if part. Since X and Y are independent and U and V are independent, (3.1) simplifies to
(3.2) |
Since does not vanish on and does not vanish on and is a bijection and an involution on , (3.2) implies that does not vanish on and does not vanish on . Since is a smooth involution on , and and are both twice-differentiable inside their respective supports, we know that and are twice differentiable on and .
In section ➀, where , the transformation simplifies to
whose Jacobian determinant is . Plugging this transformation into (3.2) we get
(3.3) |
Restricting the domains of and , taking logarithms of both sides of (3.3), and setting gives
(3.4) |
for all such that . Differentiating (3.4) with respect to yields
(3.5) |
Now differentiate (3.5) with respect to to obtain
(3.6) |
The second equality holds for all because in section ➀, . Thus must be linear on and there must exist some real constants such that:
(3.7) |
This implies that
(3.8) |
Substituting the derivative of (3.7) into (3.5) gives for all , which in turn implies the existence of some real constant such that
which implies
(3.9) |
We have thus shown that the p.d.f.’s of and have exponential forms in and .
In section ➁, where , the transformation simplifies to
which has Jacobian determinant . Plugging this into (3.2) we get
(3.10) |
Restricting the domains of and , substituting (3.8) and (3.9) into (3.10), taking the logarithm of both sides, and setting gives
(3.11) |
for all such that . Differentiating (3.11) with respect to and yields
The equality holds for all because in section ➁, . This implies that for some real constants
(3.12) |
(3.13) |
We have thus shown that the p.d.f.’s of and have exponential forms in and .
Recall that we showed on . Since a proper density function must integrate to 1, we must have . Setting , we see that and hence all four probability density functions have the desired rate parameter.
We finish the proof by showing that and have supports corresponding to stExp and sExp distributions. Recall that we have already shown for all and for all . All that remains is to show for all and for all .
By assumption, when , otherwise, when and otherwise.
By equation (3.3), we have throughout section ➀, where . For all , , so
(3.14) |
whenever and . Taking any gives since . Thus, (3.14) implies for all since .
By equation (3.10), we have throughout section ➁, where . For all , , so
(3.15) |
whenever and . Taking any gives . Thus, (3.15) implies for all since .
Combining all the above gives us the p.d.f.’s that the conjecture predicts. We have thus proven the only if part of the conjecture. ∎
4. Proof of Theorem 1.1
Recall the transformation
Before we begin the proof, we make the useful observation that . We also compute the following partial derivatives:
(4.1) | ||||||
Our approach again begins by analyzing the relationship between the joint densities of and , which are related in the following manner
(4.2) |
where is the Jacobian determinant of the transformation . Using (4.1) one can compute .
Proof of Theorem 1.1.
The if part is easy to check by routine calculation using (4.2). We now prove the only if part. Since X and Y are independent and U and V are independent, (4.2) simplifies to
(4.3) |
Since neither nor vanish on , (4.3) implies that and do not vanish on as well. Since is a smooth involution, and and are both twice differentiable, so are and . Now taking logarithms and setting gives us
(4.4) |
Now, taking mixed partials of both sides we get
Substituting in the values of the partials from (4.1), using the fact that , multiplying both sides by the common denominator , and rearranging gives us
(4.5) | ||||
Since is an involution, equation (4.5) holds for all . By first fixing any , we see the right-hand side (and therefore the left-hand side) of (4.5) must be a second degree polynomial in for all . Since second degree polynomials are smooth, it follows that is smooth on . Similarly, for each fixed , the left-hand side (and therefore the right-hand side) must be a second degree polynomial in for all . It now follows that is smooth on . The fact that and are both smooth now implies that, when writing the right-hand side of (4.5) out as a second degree polynomial in , each of the coefficients has a smooth dependence on . Each of these coefficients must therefore in turn be an at most second degree polynomial in with no -dependence. We may therefore set both sides of (4.5) equal to
for where are fixed constants.
Taking the limits of equation (4.5) and the polynomial as and , we get respectively
These two differential equations hold for all and can be explicitly solved for resulting in
(4.6) | ||||
where are fixed constants. Notice that the forms of and are very close to that of the logarithm of the p.d.f. of the GIG distribution. To finish the proof, we just have to show that and that the remaining parameters are related exactly as conjectured.
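The closeness to the GIG log-density can be made explicit: a logarithmic derivative that is a Laurent polynomial with terms of orders 0, −1, and −2 (generic symbols here, for illustration) integrates directly to the GIG shape,

```latex
(\log f)'(x) = \alpha + \frac{\beta}{x} + \frac{\gamma}{x^{2}},\quad x>0
\;\Longrightarrow\;
\log f(x) = \alpha x + \beta\log x - \frac{\gamma}{x} + C
\;\Longrightarrow\;
f(x) = C'\, x^{\beta} e^{\alpha x - \gamma/x},
```

which matches \(x^{\lambda-1}e^{-(ax+b/x)/2}\) with β = λ − 1, α = −a/2, γ = b/2; integrability over (0, ∞) for the full GIG family with a, b > 0 then corresponds to α < 0 and γ > 0.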
Substituting the derivatives of and into equation (4.5) gives us
Combining like terms and simplifying, we have
(4.7) | ||||
Both sides of (4.7) are polynomials in and , so we equate like terms. Since the left-hand side has no term and the right-hand side has no term, it follows that . Continuing to equate the and terms we get
Now set , , , , and plug these into (4.6) to get
The fact that and are densities, and are therefore integrable, implies the constants have the right signs and that
Finally, since and is an involution, we have . Now the if part of the theorem implies that and have GIG distributions with the desired parameters.
∎
References
- [1] S. Bernstein, On a property characteristic for the normal law, Trudy Leningrad Polytechnic Institute, 3, (1941), 21-22.
- [2] G. B. Crawford, Characterization of geometric and exponential distributions, Ann. Math. Statist., 37 (1966), 1790-1795.
- [3] H. Chaumont and C. Noack, Characterizing stationary 1+1 dimensional lattice polymer models, Electronic Journal of Probability, 23 (2018), 1-19.
- [4] D. A. Croydon and M. Sasada, Detailed balance and invariant measures for systems of locally-defined dynamics, arXiv:2007.06203, (2020).
- [5] D. A. Croydon and M. Sasada, On the stationary solutions of random polymer models and their zero-temperature limits, arXiv:2104.0345, (2021).
- [6] M. Kac, On a characterization of the normal distribution, American Journal of Mathematics, 61 (1939), 726-728.
- [7] G. Letac and J. Wesołowski, An independence property for the product of GIG and gamma laws, The Annals of Probability, 28 (2000), 1371-1383.
- [8] E. Lukacs, A characterization of the gamma distribution, The Annals of Mathematical Statistics, 26 (1955), 310-324.
- [9] H. Matsumoto and M. Yor, An analogue of Pitman’s theorem for exponential Wiener functionals. II. The role of the generalized inverse Gaussian laws, Nagoya Math. J., 162 (2001), 65-86.
- [10] V. Seshadri and J. Wesołowski, Mutual characterizations of the gamma and the generalized inverse Gaussian laws by constancy of regression, Sankhyā. The Indian Journal of Statistics. Series A, 63 (2001), 107-112.
- [11] V. Seshadri and J. Wesołowski, Constancy of regressions for Beta distributions, Sankhyā. The Indian Journal of Statistics, 65 (2003), 284-291.
- [12] J. Wesołowski, On a functional equation related to an independence property for beta distributions, Aequationes Mathematicae, 66 (2003), 156-163.