@anverhousseini did you find anything about this? I'm looking at this as well.
@prbsparx I'm running several nginx proxies successfully. Do you need any help?
I'm also interested in setting up nginx proxies. Can you share your experience and config?
I would also be interested in your configuration documentation, if you're willing to share.
This is our config; we manage the SSL cert with Let's Encrypt:
upstream jamf_backend {
    zone tomcat 64k;
    # ip_hash pins each client to the same Tomcat node (session stickiness)
    ip_hash;
    server 192.168.1.10:8080 max_fails=3 fail_timeout=30s;
    server 192.168.1.11:8080 max_fails=3 fail_timeout=30s;
}

server {
    listen 80;
    server_name jamf.example.com;

    # Redirect all plain-HTTP traffic to HTTPS
    if ($host = jamf.example.com) {
        return 301 https://$host$request_uri;
    }

    location / {
        return 301 https://$server_name$request_uri;
    }
}

server {
    listen 443 ssl http2 default_server;
    server_name jamf.example.com;

    ssl_certificate     /path/to/cert.pem;
    ssl_certificate_key /path/to/key.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/nginx/ssl/dhparam.pem;

    client_max_body_size 1024M;
    send_timeout 300;

    location / {
        include /etc/nginx/proxy_params;
        proxy_connect_timeout 300;
        proxy_send_timeout 300;
        proxy_read_timeout 90m;
        send_timeout 300;
        proxy_pass http://jamf_backend/;
        proxy_redirect off;

        error_page 500 502 503 504 /custom_50x.html;
        location = /custom_50x.html {
            root /usr/share/nginx/html;
            internal;
        }
    }
}
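A note on the includes: /etc/letsencrypt/options-ssl-nginx.conf is installed by Certbot, and the dhparam file can be generated with openssl dhparam -out /etc/nginx/ssl/dhparam.pem 2048 if you don't already have one. The config also includes /etc/nginx/proxy_params, which isn't shown above; on Debian/Ubuntu that file ships with the nginx package and sets the forwarded headers the backend needs. If your distribution doesn't provide it, a minimal equivalent would look roughly like this (contents assumed, verify against your install):

# /etc/nginx/proxy_params
proxy_set_header Host $http_host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;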
@anverhousseini I'm running into an issue specifically with flushing logs, where I get 504 timeouts from the NGINX load balancer.
You mentioned these settings:
proxy_connect_timeout 300;
proxy_send_timeout 300;
proxy_read_timeout 90m;
send_timeout 300;
How did you determine what value to set for each? Was there anything that helped you gauge it better?