Timeout during large import operation #10002
Unanswered
craigmoscardini asked this question in Q&A
Replies: 3 comments · 1 reply
-
Both the nginx proxy request timeout and the uWSGI or gunicorn app-server request timeout can be configured to allow these requests to complete, or at least to let you submit in larger chunks. Or, if this is a one-time import, figure out whether reconfiguring the app server is going to be more time-effective than just completing the import as-is.
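For reference, on an install with nginx in front of the app server, the proxy timeouts in question would look something like the fragment below. This is a sketch, not taken from the thread: the site-file path, the 900-second value, and the upstream address are illustrative assumptions and depend on how the app server is bound.

```nginx
# /etc/nginx/sites-available/netbox -- illustrative path and values
location / {
    proxy_pass http://127.0.0.1:8001;
    # how long nginx waits for a response from the app server
    proxy_read_timeout 900s;
    proxy_send_timeout 900s;
    # large CSV submissions can also trip the request-body size limit
    client_max_body_size 25m;
}
```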
—
Mark Tinberg ***@***.***>
Division of Information Technology-Network Services
University of Wisconsin-Madison
-
Thanks Mark. I don't suppose you know where the appropriate config is on a default install? I had modified what I thought was the relevant value in the nginx config file, but that didn't help.
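On a default NetBox install done per the official installation guide, the app-server side of the timeout lives in gunicorn's config rather than in nginx's, so raising only the nginx value won't help if gunicorn kills the worker first. A minimal sketch of that file; the 900-second value and worker count are illustrative assumptions, not the shipped defaults:

```python
# /opt/netbox/gunicorn.py -- the gunicorn config used by a standard
# NetBox install; the values below are illustrative, not the defaults
bind = '127.0.0.1:8001'   # must match the address nginx proxies to
workers = 5
# seconds before gunicorn kills a worker handling a slow request;
# this has to be raised along with the nginx proxy timeout
timeout = 900
```

After editing, the netbox service has to be restarted for the change to take effect.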
-
I don't know nginx; I'm using the lae.netbox Ansible playbook with uWSGI and an Apache frontend.
So in Apache I have:
Alias /static /opt/netbox/current/netbox/static
# ProxyPass proxies requests to local gunicorn wsgi service
ProxyPass "/" "http://{{ netbox_socket }}/"
# ProxyPassReverse rewrites Location, Content-Location and URI headers in responses
ProxyPassReverse "/" "http://{{ netbox_socket }}/"
# Pass along Apache authenticated user to wsgi service
# Hint to Django app that redirects should use same scheme (https) as original request
RequestHeader set X-Forwarded-Proto "%{REQUEST_SCHEME}s"
# Incident XXXXXXXX GET /api/docs/?format=openapi taking too long for Ansible
ProxyTimeout 900
In the NetBox config YAML source I have:
RQ_DEFAULT_TIMEOUT: 900
I don't have a timeout in the uwsgi.ini.
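If a uWSGI request timeout ever does become necessary, the relevant uwsgi.ini setting is harakiri, which recycles any worker whose request exceeds the limit. A fragment with illustrative values:

```ini
; uwsgi.ini fragment -- illustrative values, not from the playbook
[uwsgi]
; kill and recycle any worker whose request runs longer than this (seconds)
harakiri = 900
; raise the transport timeouts too, so the frontend isn't cut off first
http-timeout = 900
socket-timeout = 900
```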
—
Mark Tinberg ***@***.***>
Division of Information Technology-Network Services
University of Wisconsin-Madison
-
I am trying to import data into a new installation. I was originally running version 3.2.7 and have since upgraded to 3.2.8. The problem is that large imports, whether from CSV files or pasted formatted data, take too long and time out. The same happens if I delete a large number of items (IPs, devices, etc.). Depending on the data type, the most I can import at any one time is between 1,000 and 2,000 lines. The largest file I have is interfaces, with nearly 80,000 lines. The error appears to come from nginx. Is there anything that can be changed to allow the job to run to completion?
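If raising the timeouts isn't an option, the other workaround suggested above is submitting in chunks. A minimal sketch of splitting a large CSV into import-sized pieces, repeating the header row in each part; the function and file names are hypothetical, and the 1,000-row default matches the chunk size reported to already work:

```python
import csv
from pathlib import Path


def split_csv(src: str, rows_per_chunk: int = 1000) -> list[Path]:
    """Split a large CSV into smaller files, repeating the header in each."""
    src_path = Path(src)
    with src_path.open(newline="") as f:
        reader = csv.reader(f)
        header = next(reader)      # first line is the column header
        rows = list(reader)

    chunks = []
    for i in range(0, len(rows), rows_per_chunk):
        # e.g. interfaces.csv -> interfaces_part1.csv, interfaces_part2.csv, ...
        out = src_path.with_name(
            f"{src_path.stem}_part{i // rows_per_chunk + 1}.csv"
        )
        with out.open("w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(header)
            writer.writerows(rows[i:i + rows_per_chunk])
        chunks.append(out)
    return chunks
```

Each resulting part can then be imported through the web UI one at a time without hitting the proxy timeout.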