
negative lambda and others #3

Open
funfwo opened this issue Mar 1, 2022 · 1 comment

Comments

funfwo commented Mar 1, 2022

This is a great package, and I have some questions.
Recently I came across the Kramers–Moyal (KM) perspective for estimating the Lévy triplet, through the 2019 book by Tabar. Statisticians, especially French scholars, have done a lot of work on it. However, real time series are traces of dynamic processes governed by unknown laws, and methods that focus only on the data, without the support of some underlying natural law, may not be the right road. The KM perspective shows me a direction.
I tried your package last night on some real time series, but many of them yield a negative lambda. Besides, I want to use the estimated lambda and xi to generate a surrogate process to compare with the original time series (I also wanted to get the drift and diffusion from the moments, but the moments are not scalars, so I calculated them from the original series); however, the generated X has many NaNs. Could you give some guidance?
Also, the KM perspective is a nonparametric method, which may have some similarities with the statistical methods of, e.g., Fabienne Comte or Jean Jacod. Could you give some details on them?
Finally, a neural network is a nonparametric method that approximates an unknown distribution with stacked layers. Do you have any plans to try one?

Many thanks

LRydin (Owner) commented Mar 3, 2022

Hey there @funfwo, thanks for the questions. I'll start by replying to the first question, i.e., the more practical one regarding negative lambdas and NaNs. I'll get to the other questions in time, but they are harder to answer.

So, the first thing you can try, to possibly get more reasonable values for lambda, is to set correction to False, that is,

import jumpdiff as jd

# with X some time series
edges, moments = jd.moments(X, correction=False)

The correction term is an improvement on the estimation of the Kramers–Moyal coefficients, described in the package's documentation. It is enabled by default, but unfortunately it does not always lead to better results. I think turning it off should help you get a more realistic lambda.
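On the surrogate question: a jump-diffusion process can be simulated with a plain Euler–Maruyama scheme plus a compound Poisson jump term. The sketch below is pure numpy and is not jumpdiff's own generator (check the package documentation for its exact interface); the Ornstein–Uhlenbeck drift and constant diffusion are illustrative assumptions, and conventions for how the diffusion enters may differ from jumpdiff's.

```python
import numpy as np

def jump_diffusion(n_steps, dt, drift, diffusion, lamb, xi, x0=0.0, seed=42):
    """Euler-Maruyama integration of dX = a(X) dt + b(X) dW + dJ,
    where J is a compound Poisson process with rate lamb and
    Gaussian jump amplitudes of variance xi."""
    rng = np.random.default_rng(seed)
    X = np.empty(n_steps)
    X[0] = x0
    for i in range(1, n_steps):
        x = X[i - 1]
        dW = rng.normal(0.0, np.sqrt(dt))
        # number of jumps in this step (usually 0 or 1 for small dt)
        n_jumps = rng.poisson(lamb * dt)
        dJ = rng.normal(0.0, np.sqrt(xi), size=n_jumps).sum()
        X[i] = x + drift(x) * dt + diffusion(x) * dW + dJ
    return X

# illustrative: Ornstein-Uhlenbeck drift, constant diffusion, plus jumps
X = jump_diffusion(10_000, 0.001, drift=lambda x: -x,
                   diffusion=lambda x: 1.0, lamb=5.0, xi=0.1)
```

If the generated surrogate contains NaNs, a common culprit is a drift or diffusion function that diverges for the estimated parameter values, so it is worth checking those before anything else.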

Now, dealing with NaNs is not something jumpdiff can do, unfortunately. I would suggest you either brute-force remove these, for example with X = X[~np.isnan(X)], or use forward or backward fill, for example using pandas with X = pd.DataFrame(X).bfill().values.flatten().
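The two options side by side, on a small illustrative array:

```python
import numpy as np
import pandas as pd

X = np.array([0.1, np.nan, 0.3, 0.2, np.nan, 0.5])

# Option 1: drop NaNs outright (shortens the series and
# distorts the time spacing between samples)
X_drop = X[~np.isnan(X)]

# Option 2: backward-fill via pandas (keeps the series length;
# each NaN is replaced by the next valid observation)
X_bfill = pd.DataFrame(X).bfill().values.flatten()
```

Note that dropping NaNs changes the effective sampling interval, which matters for Kramers–Moyal estimation, so filling is usually the safer choice for evenly sampled series.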

I'll get around to your other questions soon enough. From my point of view, the work of Fabienne Comte and colleagues is always very interesting, but difficult to implement numerically.
