Conversation

@wind57 (Contributor) commented Oct 21, 2024

No description provided.

wind57 and others added 30 commits December 4, 2021 07:59
@wind57 (Contributor, Author) commented Oct 21, 2024

The first thing was to reproduce this locally, which I did.

The 502 Bad Gateway was really coming from the ingress: it could not connect to the underlying Spring app that we were running for the test. The test we have in place goes like this (a minimal sketch of the flow follows the list):

  • read a property from a config map
  • poll that config map
  • change the config map; because we poll, we catch that event and issue a SHUTDOWN of the app, which is really a ConfigurableApplicationContext::close
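A minimal sketch of that flow, assuming fabric8's KubernetesClient and illustrative namespace/name parameters (this is not the actual test code in the repo):

```java
import io.fabric8.kubernetes.api.model.ConfigMap;
import io.fabric8.kubernetes.client.KubernetesClient;
import org.springframework.context.ConfigurableApplicationContext;

import java.util.Map;

final class ConfigMapPoller {

    // Polls the config map and, once its data changes, issues the SHUTDOWN,
    // which is really just closing the application context. Assumes the
    // config map exists and has data.
    static void pollUntilChanged(KubernetesClient client, ConfigurableApplicationContext context,
            String namespace, String name) throws InterruptedException {
        Map<String, String> initial = client.configMaps().inNamespace(namespace).withName(name).get().getData();
        while (true) {
            Thread.sleep(1_000);
            ConfigMap current = client.configMaps().inNamespace(namespace).withName(name).get();
            if (!initial.equals(current.getData())) {
                // without also closing the KubernetesClient, okhttp3 threads keep the JVM alive
                context.close();
                return;
            }
        }
    }
}
```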

I simplified the test for myself a bit, so that I would not run in k3s but connect to my locally running cluster. This way I could debug the application far more easily. It did not help much :) but I noticed something interesting: when I issue a ConfigurableApplicationContext::close, the netty engine goes down, but the Spring application/process does not. This was a hint that there were hanging threads... So I took a jstack of the process after issuing a ConfigurableApplicationContext::close and noticed some okhttp3-related threads that were not shut down (non-daemon threads), which was a clear indication that fabric8 was not really shut down.
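To make the diagnosis concrete: jstack <pid> against the still-running process shows the offending threads from the outside; a minimal in-JVM equivalent that lists the live non-daemon threads would look roughly like this:

```java
// Prints every live non-daemon thread; after the context is closed, any
// okhttp3-named threads showing up here are what keeps the process alive.
public final class NonDaemonThreads {

    public static void main(String[] args) {
        Thread.getAllStackTraces().keySet().stream()
                .filter(thread -> thread.isAlive() && !thread.isDaemon())
                .forEach(thread -> System.out.println(thread.getName() + " : " + thread.getState()));
    }
}
```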

From here, it was easy: implement KubernetesClient::close. Most probably things have changed inside the fabric8 library in a way that now gets us into trouble, but the blame is on us anyway, as we were never closing the client...
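A minimal sketch of the fix, with illustrative class and method names rather than the exact ones in this PR:

```java
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;
import jakarta.annotation.PreDestroy;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class Fabric8ClientConfiguration {

    private KubernetesClient client;

    @Bean
    KubernetesClient kubernetesClient() {
        this.client = new KubernetesClientBuilder().build();
        return this.client;
    }

    // Closing the client shuts down fabric8's internal (okhttp3) thread pools,
    // so the JVM can actually exit after ConfigurableApplicationContext::close.
    @PreDestroy
    void close() {
        if (this.client != null) {
            this.client.close();
        }
    }
}
```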

@wind57 wind57 marked this pull request as ready for review October 21, 2024 15:43
@wind57 (Contributor, Author) commented Oct 21, 2024

@ryanjbaxter this fixes the main build failure

return new Fabric8PodUtils(client);
}

@PreDestroy
Contributor:

Can we just add @Bean(destroyMethod = "close") to the KubernetesClient bean above?
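Roughly, that suggestion would look like this (a sketch; the Config parameter and builder call are assumptions about the surrounding configuration class):

```java
import io.fabric8.kubernetes.client.Config;
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class ClientConfig {

    // Spring invokes close() on the bean when the context shuts down,
    // making a separate @PreDestroy hook unnecessary.
    @Bean(destroyMethod = "close")
    KubernetesClient kubernetesClient(Config config) {
        return new KubernetesClientBuilder().withConfig(config).build();
    }
}
```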

Contributor:

I'm guessing that didn't work?

wind57 (Contributor, Author):

Hey, sorry, it worked partially. It did not work when we use the config data API and register this bean via ConfigurableBootstrapContext. Your idea is much nicer and more correct than mine; it's just that I don't know how to register the same destroyMethod in this case, but I'm reading the code a bit. Let me see if I find a way to do it.
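As far as I can tell, the wrinkle is that during the config data phase the client lives in the bootstrap context rather than the bean factory, so destroyMethod never applies to that instance. A rough sketch of that registration path, with illustrative names; the close listener is just one possible idea, not what this PR does:

```java
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;
import org.springframework.boot.BootstrapRegistry.InstanceSupplier;
import org.springframework.boot.ConfigurableBootstrapContext;

final class Fabric8ClientRegistration {

    static void register(ConfigurableBootstrapContext bootstrapContext) {
        // The client is registered here, not as a regular bean, so
        // @Bean(destroyMethod = "close") never sees this instance.
        bootstrapContext.registerIfAbsent(KubernetesClient.class,
                InstanceSupplier.of(new KubernetesClientBuilder().build()));
        // One possible idea: close the client when the bootstrap context closes.
        bootstrapContext.addCloseListener(event -> event.getBootstrapContext()
                .get(KubernetesClient.class).close());
    }
}
```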

wind57 (Contributor, Author):

Eh no, it did not work :( I don't know why, though... maybe we can push this fix and, if I figure out your suggestion, amend it later.

wind57 (Contributor, Author):

If I read the documentation of destroyMethod on a @Bean, we should not even have to add it explicitly and it should just work, but my tests, and the pipeline failures, show that this is not really the case.
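For reference, the documentation in question: destroyMethod defaults to an inferred method, meaning Spring looks for a public no-arg close() or shutdown() method on the bean type. KubernetesClient exposes close(), so in theory this plain declaration should already behave like destroyMethod = "close" (illustrative sketch):

```java
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class InferredDestroyConfig {

    // With no destroyMethod attribute, Spring infers one: KubernetesClient
    // has a public close(), so this should be equivalent to
    // @Bean(destroyMethod = "close"). In practice (see the pipeline
    // failures) it was not, for reasons still unclear.
    @Bean
    KubernetesClient kubernetesClient() {
        return new KubernetesClientBuilder().build();
    }
}
```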

@wind57 wind57 marked this pull request as draft October 21, 2024 19:05
@wind57 wind57 marked this pull request as ready for review October 21, 2024 20:01
@wind57 wind57 requested a review from ryanjbaxter October 22, 2024 08:32
@wind57 (Contributor, Author) commented Oct 22, 2024

I took another pass for a few hours this morning to see if I could understand why your suggestion does not work, and I did not succeed. I am still curious what is going on, but as long as we have a solution in place, I think we are good. wdyt?

@wind57 (Contributor, Author) commented Oct 22, 2024

Actually, this should go into 3.1.x; I see no reason to stay on main only. I'll close this one and open a new PR against 3.1.x.
