TensorFlow-Serving-Lite written in Go
Prepare the model by running training/create.py.
The model can then be served either by tensorflow-serving:
server@host $ cd server/tensorflow-serving
server@host $ ./run.sh
or by tensorflow-serving-lite:
server@host $ cd server/tensorflow-serving-lite
server@host $ go build tensorflow-serving-lite.go && ./tensorflow-serving-lite
TensorFlow-Serving-Lite currently has hard-coded endpoints.