diff --git a/README.md b/README.md
index e04fcbc..f38710c 100644
--- a/README.md
+++ b/README.md
@@ -1,72 +1,77 @@
# Tensorflow Implementation of Yahoo's Open NSFW Model
This repository contains an implementation of [Yahoo's Open NSFW Classifier](https://github.com/yahoo/open_nsfw) rewritten in tensorflow.
The original caffe weights have been extracted using [Caffe to TensorFlow](https://github.com/ethereon/caffe-tensorflow). You can find them at `data/open_nsfw-weights.npy`.
## Prerequisites
All code should be compatible with `Python 3.6` and `Tensorflow 1.x` (tested with 1.12). The model implementation can be found in `model.py`.
### Usage
```
> python classify_nsfw.py -m data/open_nsfw-weights.npy test.jpg
Results for 'test.jpg'
SFW score: 0.9355766177177429
NSFW score: 0.06442338228225708
```
__Note:__ Currently only jpeg images are supported.
`classify_nsfw.py` accepts some optional parameters you may want to play around with:
```
usage: classify_nsfw.py [-h] -m MODEL_WEIGHTS [-l {yahoo,tensorflow}]
[-t {tensor,base64_jpeg}]
input_jpeg_file
positional arguments:
input_file Path to the input image. Only jpeg images are
supported.
optional arguments:
-h, --help show this help message and exit
-m MODEL_WEIGHTS, --model_weights MODEL_WEIGHTS
Path to trained model weights file
-l {yahoo,tensorflow}, --image_loader {yahoo,tensorflow}
image loading mechanism
- -t {tensor,base64_jpeg}, --input_type {tensor,base64_jpeg}
+ -i {tensor,base64_jpeg}, --input_type {tensor,base64_jpeg}
input type
```
__-l/--image-loader__
The classification tool supports two different image loading mechanisms.
* `yahoo` (default) replicates yahoo's original image loading and preprocessing. Use this option if you want the same results as with the original implementation
* `tensorflow` is an image loader which uses tensorflow exclusively (no dependencies on `PIL`, `skimage`, etc.). Tries to replicate the image loading mechanism used by the original caffe implementation, differs a bit though due to different jpeg and resizing implementations. See [this issue](https://github.com/mdietrichstein/tensorflow-open_nsfw/issues/2#issuecomment-346125345) for details.
__Note:__ Classification results may vary depending on the selected image loader!
-__-t/--input_type__
+__-i/--input_type__
Determines if the model internally uses a float tensor (`tensor` - `[None, 224, 224, 3]` - default) or a base64 encoded string tensor (`base64_jpeg` - `[None, ]`) as input. If `base64_jpeg` is used, then the `tensorflow` image loader will be used, regardless of the _-l/--image-loader_ argument.
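For illustration, a minimal sketch (not part of this diff) of how the two input types are fed, mirroring `classify_nsfw.py` below; the weights path and `test.jpg` are placeholders.
```
# Minimal sketch contrasting the two input types (weights path and image are placeholders)
import base64
import numpy as np
import tensorflow as tf
from model import OpenNsfwModel, InputType

# 'tensor': feed a preprocessed float batch of shape [None, 224, 224, 3]
model = OpenNsfwModel()
with tf.Session() as sess:
    model.build(weights_path="data/open_nsfw-weights.npy",
                input_type=InputType.TENSOR)
    sess.run(tf.global_variables_initializer())
    image = np.zeros((1, 224, 224, 3), dtype=np.float32)  # stand-in for a loaded image
    print(sess.run(model.predictions, feed_dict={model.input: image})[0])  # [sfw, nsfw]

# 'base64_jpeg': feed the raw jpeg bytes, base64 encoded, shape [None, ]
tf.reset_default_graph()
model = OpenNsfwModel()
with tf.Session() as sess:
    model.build(weights_path="data/open_nsfw-weights.npy",
                input_type=InputType.BASE64_JPEG)
    sess.run(tf.global_variables_initializer())
    image = np.array([base64.urlsafe_b64encode(open("test.jpg", "rb").read())])
    print(sess.run(model.predictions, feed_dict={model.input: image})[0])  # [sfw, nsfw]
```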
### Tools
The `tools` folder contains some utility scripts to test the model.
-__export_graph.py__
+__create_predict_request.py__
+
+Takes an input image and generates a JSON file suitable for prediction requests to an Open NSFW Model deployed with [Google Cloud ML Engine](https://cloud.google.com/ml-engine/docs/concepts/prediction-overview) (`gcloud ml-engine predict`) or [tensorflow-serving](https://www.tensorflow.org/serving/).
-Exports the tensorflow graph and checkpoint. Freezes and optimizes the graph per default for improved inference and deployment usage (e.g. Android, iOS, etc.). Import the graph with `tf.import_graph_def`.
__export_savedmodel.py__
Exports the model using the tensorflow serving export API (`SavedModel`). The export can be used to deploy the model on [Google Cloud ML Engine](https://cloud.google.com/ml-engine/docs/concepts/prediction-overview), [Tensorflow Serving](https://www.tensorflow.org/serving/) or on mobile (haven't tried that one yet).
-__create_predict_request.py__
+__export_tflite.py__
+
+Exports the model in [TFLite format](https://www.tensorflow.org/lite/). Use this one if you want to run inference on mobile or IoT devices. Please note that the `base64_jpeg` input type does not work with TFLite, since the standard runtime is missing a number of required tensorflow operations.
+
+__export_graph.py__
-Takes an input image and spits out an json file suitable for prediction requests to a Open NSFW Model deployed with [Google Cloud ML Engine](https://cloud.google.com/ml-engine/docs/concepts/prediction-overview) (`gcloud ml-engine predict`) or [tensorflow-serving](https://www.tensorflow.org/serving/).
+Exports the tensorflow graph and checkpoint. Optionally freezes (`-f`) and optimizes (`-o`) the graph for improved inference and deployment usage (e.g. Android, iOS, etc.). Import the graph with `tf.import_graph_def`.
\ No newline at end of file
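A hedged sketch (not part of this diff) of the `tf.import_graph_def` step mentioned in the README above, assuming a graph exported by `export_graph.py` with the default `tensor` input type and frozen via `-f`; the file name follows the script's `frozen_` prefix and the image batch is a placeholder.
```
# Sketch: import a frozen graph produced by tools/export_graph.py and run it.
# Node names 'input' and 'predictions' follow export_graph.py.
import numpy as np
import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.Open("frozen_open_nsfw.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")
    input_tensor = graph.get_tensor_by_name("input:0")
    predictions = graph.get_tensor_by_name("predictions:0")

with tf.Session(graph=graph) as sess:
    image = np.zeros((1, 224, 224, 3), dtype=np.float32)  # stand-in for a preprocessed image
    print(sess.run(predictions, feed_dict={input_tensor: image}))
```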
diff --git a/classify_nsfw.py b/classify_nsfw.py
index d609775..e5c4c7b 100644
--- a/classify_nsfw.py
+++ b/classify_nsfw.py
@@ -1,69 +1,69 @@
#!/usr/bin/env python
import sys
import argparse
import tensorflow as tf
from model import OpenNsfwModel, InputType
from image_utils import create_tensorflow_image_loader
from image_utils import create_yahoo_image_loader
import numpy as np
IMAGE_LOADER_TENSORFLOW = "tensorflow"
IMAGE_LOADER_YAHOO = "yahoo"
def main(argv):
parser = argparse.ArgumentParser()
parser.add_argument("input_file", help="Path to the input image.\
Only jpeg images are supported.")
parser.add_argument("-m", "--model_weights", required=True,
help="Path to trained model weights file")
parser.add_argument("-l", "--image_loader",
default=IMAGE_LOADER_YAHOO,
help="image loading mechanism",
choices=[IMAGE_LOADER_YAHOO, IMAGE_LOADER_TENSORFLOW])
- parser.add_argument("-t", "--input_type",
+ parser.add_argument("-i", "--input_type",
default=InputType.TENSOR.name.lower(),
help="input type",
choices=[InputType.TENSOR.name.lower(),
InputType.BASE64_JPEG.name.lower()])
args = parser.parse_args()
model = OpenNsfwModel()
with tf.Session() as sess:
input_type = InputType[args.input_type.upper()]
model.build(weights_path=args.model_weights, input_type=input_type)
fn_load_image = None
if input_type == InputType.TENSOR:
if args.image_loader == IMAGE_LOADER_TENSORFLOW:
fn_load_image = create_tensorflow_image_loader(tf.Session(graph=tf.Graph()))
else:
fn_load_image = create_yahoo_image_loader()
elif input_type == InputType.BASE64_JPEG:
import base64
fn_load_image = lambda filename: np.array([base64.urlsafe_b64encode(open(filename, "rb").read())])
sess.run(tf.global_variables_initializer())
image = fn_load_image(args.input_file)
predictions = \
sess.run(model.predictions,
feed_dict={model.input: image})
print("Results for '{}'".format(args.input_file))
print("\tSFW score:\t{}\n\tNSFW score:\t{}".format(*predictions[0]))
if __name__ == "__main__":
main(sys.argv)
diff --git a/tools/create_predict_request.py b/tools/create_predict_request.py
index 5a5a352..f1b4f0b 100644
--- a/tools/create_predict_request.py
+++ b/tools/create_predict_request.py
@@ -1,25 +1,76 @@
import base64
import json
import argparse
+import numpy as np
+import tensorflow as tf
from tensorflow.python.saved_model.signature_constants import PREDICT_INPUTS
-"""base64 encodes the given input jpeg and outputs json data suitable for
-'gcloud ml-engine predict' requests to a model generated with 'export-model.py'
+import os
+import sys
+
+sys.path.append((os.path.normpath(
+ os.path.join(os.path.dirname(os.path.realpath(__file__)),
+ '..'))))
+
+from image_utils import create_tensorflow_image_loader
+from image_utils import create_yahoo_image_loader
+from model import InputType
+
+IMAGE_LOADER_TENSORFLOW = "tensorflow"
+IMAGE_LOADER_YAHOO = "yahoo"
+
+# Thanks to https://stackoverflow.com/a/47626762
+class NumpyEncoder(json.JSONEncoder):
+ def default(self, obj):
+ if isinstance(obj, np.ndarray):
+ return obj.tolist()
+ return json.JSONEncoder.default(self, obj)
+
+"""Generates a json prediction request suitable for consumption by a model
+generated with 'export-model.py' and deployed on either ml-engine or tensorflow-serving
"""
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument("input_file", help="Path to the input image file")
+
+ parser.add_argument("-i", "--input_type", required=True,
+ default=InputType.TENSOR.name.lower(),
+ help="Input type",
+ choices=[InputType.TENSOR.name.lower(),
+ InputType.BASE64_JPEG.name.lower()])
+
+ parser.add_argument("-l", "--image_loader", required=False,
+ default=IMAGE_LOADER_YAHOO,
+ help="Image loading mechanism. Only relevant when using input_type 'tensor'",
+ choices=[IMAGE_LOADER_YAHOO, IMAGE_LOADER_TENSORFLOW])
+
parser.add_argument("-t", "--target", required=True,
choices=['ml-engine', 'tf-serving'],
- help="Create json for ml-engine or tensorflow-serving")
+ help="Create json request for ml-engine or tensorflow-serving")
args = parser.parse_args()
target = args.target
- image_b64 = base64.urlsafe_b64encode(open(args.input_file, "rb").read())
+ input_type = InputType[args.input_type.upper()]
+
+ image_data = None
+
+ if input_type == InputType.TENSOR:
+ fn_load_image = None
+
+ if args.image_loader == IMAGE_LOADER_TENSORFLOW:
+ with tf.Session() as sess:
+ fn_load_image = create_tensorflow_image_loader(sess)
+ sess.run(tf.global_variables_initializer())
+ image_data = fn_load_image(args.input_file)[0]
+ else:
+ image_data = create_yahoo_image_loader(tf.Session(graph=tf.Graph()))(args.input_file)[0]
+ elif input_type == InputType.BASE64_JPEG:
+ import base64
+ image_data = base64.urlsafe_b64encode(open(args.input_file, "rb").read()).decode("ascii")
if target == "ml-engine":
- print(json.dumps({PREDICT_INPUTS: image_b64.decode("ascii")}))
+ print(json.dumps({PREDICT_INPUTS: image_data}, cls=NumpyEncoder))
elif target == "tf-serving":
- print(json.dumps({"instances": [image_b64.decode("ascii")]}))
+ print(json.dumps({"instances": [image_data]}, cls=NumpyEncoder))
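A hedged sketch (not a file in this diff) of how the generated `tf-serving` request could be sent to a running tensorflow-serving instance via its REST predict endpoint; the host, port 8501 and model name `open_nsfw` are assumptions for illustration.
```
# Sketch: POST the output of create_predict_request.py -t tf-serving to tensorflow-serving.
import json
import urllib.request

with open("request.json") as f:  # e.g. create_predict_request.py ... -t tf-serving > request.json
    payload = f.read().encode("utf-8")

req = urllib.request.Request(
    "http://localhost:8501/v1/models/open_nsfw:predict",  # assumed host/port/model name
    data=payload,
    headers={"Content-Type": "application/json"})

with urllib.request.urlopen(req) as response:
    print(json.loads(response.read()))  # expected shape: {"predictions": [[sfw, nsfw]]}
```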
diff --git a/tools/export_graph.py b/tools/export_graph.py
index 8381016..5b5f7cc 100644
--- a/tools/export_graph.py
+++ b/tools/export_graph.py
@@ -1,116 +1,123 @@
import os
import sys
import argparse
import tensorflow as tf
from tensorflow.python.tools import freeze_graph
from tensorflow.python.tools import optimize_for_inference_lib
sys.path.append((os.path.normpath(
os.path.join(os.path.dirname(os.path.realpath(__file__)),
'..'))))
from model import OpenNsfwModel, InputType
"""Exports the graph so it can be imported via import_graph_def
The exported model takes an base64 encoded string tensor as input
"""
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument("target", help="output directory")
parser.add_argument("-m", "--model_weights", required=True,
help="Path to trained model weights file")
+ parser.add_argument("-i", "--input_type", required=True,
+ default=InputType.TENSOR.name.lower(),
+ help="Input type",
+ choices=[InputType.TENSOR.name.lower(),
+ InputType.BASE64_JPEG.name.lower()])
+
parser.add_argument("-o", "--optimize", action='store_true',
default=False,
help="Optimize graph for inference")
parser.add_argument("-f", "--freeze", action='store_true',
required=False, default=False,
help="Freeze graph: convert variables to ops")
parser.add_argument("-t", "--text", action='store_true',
required=False, default=False,
help="Write graph as binary (.pb) or text (pbtext)")
args = parser.parse_args()
model = OpenNsfwModel()
export_base_path = args.target
do_freeze = args.freeze
do_optimize = args.optimize
- as_binary = args.binary
+ as_binary = not args.text
+ input_type = InputType[args.input_type.upper()]
input_node_name = 'input'
output_node_name = 'predictions'
base_name = 'open_nsfw'
checkpoint_path = os.path.join(export_base_path, base_name + '.ckpt')
if as_binary:
graph_name = base_name + '.pb'
else:
graph_name = base_name + '.pbtxt'
graph_path = os.path.join(export_base_path, graph_name)
frozen_graph_path = os.path.join(export_base_path,
'frozen_' + graph_name)
optimized_graph_path = os.path.join(export_base_path,
'optimized_' + graph_name)
with tf.Session() as sess:
model.build(weights_path=args.model_weights,
- input_type=InputType.BASE64_JPEG)
+ input_type=input_type)
sess.run(tf.global_variables_initializer())
saver = tf.train.Saver()
saver.save(sess, save_path=checkpoint_path)
print('Checkpoint exported to {}'.format(checkpoint_path))
tf.train.write_graph(sess.graph_def, export_base_path, graph_name,
as_text=not as_binary)
print('Graph exported to {}'.format(graph_path))
if do_freeze:
print('Freezing graph...')
freeze_graph.freeze_graph(
input_graph=graph_path, input_saver='',
input_binary=as_binary, input_checkpoint=checkpoint_path,
output_node_names=output_node_name,
restore_op_name='save/restore_all',
filename_tensor_name='save/Const:0',
output_graph=frozen_graph_path, clear_devices=True,
initializer_nodes='')
print('Frozen graph exported to {}'.format(frozen_graph_path))
graph_path = frozen_graph_path
if do_optimize:
print('Optimizing graph...')
input_graph_def = tf.GraphDef()
with tf.gfile.Open(graph_path, 'rb') as f:
data = f.read()
input_graph_def.ParseFromString(data)
output_graph_def =\
optimize_for_inference_lib.optimize_for_inference(
input_graph_def,
[input_node_name],
[output_node_name],
tf.float32.as_datatype_enum)
f = tf.gfile.FastGFile(optimized_graph_path, 'wb')
f.write(output_graph_def.SerializeToString())
print('Optimized graph exported to {}'
.format(optimized_graph_path))
diff --git a/tools/export_savedmodel.py b/tools/export_savedmodel.py
index 28021f1..2579bb1 100644
--- a/tools/export_savedmodel.py
+++ b/tools/export_savedmodel.py
@@ -1,67 +1,72 @@
import os
import sys
import argparse
import tensorflow as tf
from tensorflow.python.saved_model import builder as saved_model_builder
from tensorflow.python.saved_model.signature_def_utils\
import predict_signature_def
from tensorflow.python.saved_model.tag_constants import SERVING
from tensorflow.python.saved_model.signature_constants\
import DEFAULT_SERVING_SIGNATURE_DEF_KEY
from tensorflow.python.saved_model.signature_constants import PREDICT_INPUTS
from tensorflow.python.saved_model.signature_constants import PREDICT_OUTPUTS
sys.path.append((os.path.normpath(
os.path.join(os.path.dirname(os.path.realpath(__file__)),
'..'))))
from model import OpenNsfwModel, InputType
"""Builds a SavedModel which can be used for deployment with
gcloud ml-engine, tensorflow-serving, ...
-
-The exported model takes an base64 encoded string tensor as input
"""
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument("target", help="output directory")
+ parser.add_argument("-i", "--input_type", required=True,
+ default=InputType.TENSOR.name.lower(),
+ help="Input type",
+ choices=[InputType.TENSOR.name.lower(),
+ InputType.BASE64_JPEG.name.lower()])
+
parser.add_argument("-v", "--export_version",
help="export model version",
default="1")
parser.add_argument("-m", "--model_weights", required=True,
help="Path to trained model weights file")
args = parser.parse_args()
model = OpenNsfwModel()
export_base_path = args.target
export_version = args.export_version
+ input_type = InputType[args.input_type.upper()]
export_path = os.path.join(export_base_path, export_version)
with tf.Session() as sess:
model.build(weights_path=args.model_weights,
- input_type=InputType.BASE64_JPEG)
+ input_type=input_type)
sess.run(tf.global_variables_initializer())
builder = saved_model_builder.SavedModelBuilder(export_path)
builder.add_meta_graph_and_variables(
sess, [SERVING],
signature_def_map={
DEFAULT_SERVING_SIGNATURE_DEF_KEY: predict_signature_def(
inputs={PREDICT_INPUTS: model.input},
outputs={PREDICT_OUTPUTS: model.predictions}
)
}
)
builder.save()
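A hedged sketch (not a file in this diff) of loading such an export locally with the TF 1.x SavedModel loader and running one prediction; the export directory is a placeholder and the model is assumed to have been exported with `--input_type tensor`.
```
# Sketch: load an export produced by tools/export_savedmodel.py and run it.
import numpy as np
import tensorflow as tf
from tensorflow.python.saved_model.tag_constants import SERVING
from tensorflow.python.saved_model.signature_constants import (
    DEFAULT_SERVING_SIGNATURE_DEF_KEY, PREDICT_INPUTS, PREDICT_OUTPUTS)

export_dir = "exported_model/1"  # placeholder: <target>/<export_version>

with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(sess, [SERVING], export_dir)
    signature = meta_graph.signature_def[DEFAULT_SERVING_SIGNATURE_DEF_KEY]
    input_name = signature.inputs[PREDICT_INPUTS].name
    output_name = signature.outputs[PREDICT_OUTPUTS].name

    image = np.zeros((1, 224, 224, 3), dtype=np.float32)  # stand-in for a preprocessed image
    print(sess.run(output_name, feed_dict={input_name: image}))
```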
diff --git a/tools/export_tflite.py b/tools/export_tflite.py
index 7a1d4a1..d6a90f8 100644
--- a/tools/export_tflite.py
+++ b/tools/export_tflite.py
@@ -1,44 +1,49 @@
import os
import sys
import argparse
import tensorflow as tf
sys.path.append((os.path.normpath(
os.path.join(os.path.dirname(os.path.realpath(__file__)),
'..'))))
from model import OpenNsfwModel, InputType
"""Exports a tflite version of tensorflow-open_nsfw
-The exported model takes an base64 encoded string tensor as input.
-
Note: The standard TFLite runtime does not support all required ops.
You will have to implement the missing ones by yourself.
"""
if __name__ == "__main__":
parser = argparse.ArgumentParser()
parser.add_argument("target", help="output filename, e.g. 'open_nsfw.tflite'")
+ parser.add_argument("-i", "--input_type", required=True,
+ default=InputType.TENSOR.name.lower(),
+ help="Input type. Warning: base64_jpeg does not work with the standard TFLite runtime since a lot of operations are not supported",
+ choices=[InputType.TENSOR.name.lower(),
+ InputType.BASE64_JPEG.name.lower()])
+
parser.add_argument("-m", "--model_weights", required=True,
help="Path to trained model weights file")
args = parser.parse_args()
model = OpenNsfwModel()
export_path = args.target
+ input_type = InputType[args.input_type.upper()]
with tf.Session() as sess:
model.build(weights_path=args.model_weights,
- input_type=InputType.BASE64_JPEG)
+ input_type=input_type)
sess.run(tf.global_variables_initializer())
converter = tf.contrib.lite.TFLiteConverter.from_session(sess, [model.input], [model.predictions])
tflite_model = converter.convert()
with open(export_path, "wb") as f:
f.write(tflite_model)
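A hedged sketch (not a file in this diff) of running the exported model with the TFLite interpreter bundled with TF 1.12 (`tf.contrib.lite.Interpreter`); the file name and zero-filled batch are placeholders, and `--input_type tensor` is assumed.
```
# Sketch: run an open_nsfw.tflite file produced by tools/export_tflite.py.
import numpy as np
import tensorflow as tf

interpreter = tf.contrib.lite.Interpreter(model_path="open_nsfw.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

image = np.zeros((1, 224, 224, 3), dtype=np.float32)  # stand-in for a preprocessed image
interpreter.set_tensor(input_details[0]["index"], image)
interpreter.invoke()

print(interpreter.get_tensor(output_details[0]["index"]))  # [[sfw, nsfw]]
```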
