I have a REST endpoint deployed in a Docker container using the following
Dockerfile snippet:
ENTRYPOINT service nginx start | tensorflow_model_server --rest_api_port=8501 \
    --model_name=<modelname> \
    --model_base_path=/<Saved model path>
The REST endpoint serves a TensorFlow model. I am using NGINX to accept the requests and route them to the REST endpoint.
nginx.conf snippet:
listen 8080 deferred;
client_max_body_size 500M;

location /invocations {
    proxy_pass http://localhost:8501/v1/models/<modelname>:predict;
}

location /ping {
    return 200 "OK";
}
Docker Desktop version: 2.1.0.5; OS: macOS High Sierra 10.13.4.
Invoking this service had been working fine all along, but starting yesterday it throws the Broken Pipe error below. Can someone please help?
Using TensorFlow backend.
Traceback (most recent call last):
  File "/Users/ls_spiderman/.pyenv/versions/3.6.8/lib/python3.6/site-packages/urllib3/connectionpool.py", line 672, in urlopen
    chunked=chunked,
  File "/Users/ls_spiderman/.pyenv/versions/3.6.8/lib/python3.6/site-packages/urllib3/connectionpool.py", line 387, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/Users/ls_spiderman/.pyenv/versions/3.6.8/lib/python3.6/http/client.py", line 1239, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/Users/ls_spiderman/.pyenv/versions/3.6.8/lib/python3.6/http/client.py", line 1285, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/Users/ls_spiderman/.pyenv/versions/3.6.8/lib/python3.6/http/client.py", line 1234, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/Users/ls_spiderman/.pyenv/versions/3.6.8/lib/python3.6/http/client.py", line 1065, in _send_output
    self.send(chunk)
  File "/Users/ls_spiderman/.pyenv/versions/3.6.8/lib/python3.6/http/client.py", line 986, in send
    self.sock.sendall(data)
BrokenPipeError: [Errno 32] Broken pipe
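For context, the traceback goes through urllib3/http.client, i.e. a plain JSON POST from Python. The actual client code isn't shown above, so the sketch below is an assumption: the base URL, the `/invocations` path, and the `{"instances": [...]}` payload shape follow the nginx.conf snippet and TensorFlow Serving's predict REST API, but may differ from my real client.

```python
import json
from urllib.parse import urljoin

# Assumed: nginx listens on 8080 and proxies /invocations to
# TensorFlow Serving's REST API (see nginx.conf snippet above).
BASE_URL = "http://localhost:8080"

def build_predict_request(instances):
    """Prepare (but do not send) the JSON POST that /invocations expects.

    TensorFlow Serving's predict endpoint takes a body of the form
    {"instances": [...]}; the exact instance shape depends on the model.
    """
    url = urljoin(BASE_URL, "/invocations")
    body = json.dumps({"instances": instances})
    return url, body

# Sending the request is where BrokenPipeError surfaces: sendall()
# raises Errno 32 when the server side closes the socket mid-write.
# Uncomment to actually send (requires the container to be running):
#
# import requests
# url, body = build_predict_request([[1.0, 2.0, 3.0]])
# resp = requests.post(url, data=body,
#                      headers={"Content-Type": "application/json"},
#                      timeout=60)
```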