gabrieltool.statemachine.callable_zoo.processor_zoo package

Submodules

gabrieltool.statemachine.callable_zoo.processor_zoo.base module

Basic callable classes for Processor.

class DummyCallable(dummy_input='dummy_input_value')[source]

Bases: gabrieltool.statemachine.callable_zoo.base.CallableBase

A Dummy Callable class for testing and examples.

classmethod from_json(json_obj)

Create a CallableBase class instance from a JSON object.

Subclasses should override this method depending on the input types of their constructor arguments.
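
For example, a hypothetical subclass whose constructor takes a single numeric argument might override it as sketched below (the ThresholdCallable name and its threshold argument are illustrative, not part of the package):

from gabrieltool.statemachine.callable_zoo.base import CallableBase


class ThresholdCallable(CallableBase):
    """Illustrative subclass; not part of the package."""

    def __init__(self, threshold=0.5):
        super().__init__()
        self.threshold = threshold

    @classmethod
    def from_json(cls, json_obj):
        # Cast serialized arguments back to the types the constructor expects.
        return cls(threshold=float(json_obj['threshold']))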

class FasterRCNNOpenCVCallable(proto_path, model_path, labels=None, conf_threshold=0.8)[source]

Bases: gabrieltool.statemachine.callable_zoo.base.CallableBase

A callable class that executes a FasterRCNN object detection model using OpenCV.
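
A minimal usage sketch, assuming the Caffe prototxt and weight files are available locally and that the callable is invoked on a single OpenCV (BGR) image as other processor callables are; the file paths, labels, and the keyword-argument call below are placeholders and assumptions:

import cv2
from gabrieltool.statemachine.callable_zoo.processor_zoo.base import FasterRCNNOpenCVCallable

# Placeholder paths to the Caffe prototxt and trained weights.
detector = FasterRCNNOpenCVCallable(
    proto_path='model/faster_rcnn_test.pt',
    model_path='model/model.caffemodel',
    labels=['background', 'person', 'car'],
    conf_threshold=0.8,
)

image = cv2.imread('frame.jpg')     # OpenCV loads images in BGR order
results = detector(image=image)     # assumed call convention for processor callables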

classmethod from_json(json_obj)[source]

Create an object from a JSON object.

Parameters: json_obj (json) – JSON object containing all the serialized constructor arguments.
Raises: ValueError – when the constructor arguments’ types don’t match.
Returns: The deserialized FasterRCNNOpenCVCallable object.
Return type: FasterRCNNOpenCVCallable
visualize_detections(img, results)[source]

Visualize object detection outputs.

This is a helper function for debugging processor callables. The results should follow Gabrieltool’s convention: a dictionary mapping class index to a list of [x1, y1, x2, y2, confidence, cls_idx] entries.

Parameters:
  • img (OpenCV Image) – Image that the detections were computed on.
  • results (Dictionary) – A dictionary of class_idx -> [[x1, y1, x2, y2, confidence, cls_idx], ...].
Returns:

Image with detected objects annotated

Return type:

OpenCV Image
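
For instance, continuing the sketch above with a hypothetical results dictionary that follows this convention:

# Hypothetical detections: class index -> [[x1, y1, x2, y2, confidence, cls_idx], ...]
results = {
    1: [[34, 50, 180, 220, 0.92, 1]],
    2: [[200, 80, 320, 160, 0.85, 2]],
}
debug_img = detector.visualize_detections(image, results)
cv2.imshow('detections', debug_img)
cv2.waitKey(0)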

gabrieltool.statemachine.callable_zoo.processor_zoo.containerized module

Callable classes for Containerized Processors.

Currently we don’t offer functionality to clean up the containers after the program finishes. Use the following command to clean up the containers started by this module.

$ docker stop -t 0 $(docker ps -a -q --filter="name=GABRIELTOOL")
class FasterRCNNContainerCallable(container_image_url, conf_threshold=0.5)[source]

Bases: gabrieltool.statemachine.callable_zoo.base.CallableBase

A callable class to execute a containerized FasterRCNN model in Caffe.

Use this class if your object detector is generated by TPOD v1 and the container image is hosted in cmusatyalab’s GitLab container registry.
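
A minimal construction sketch; the container image URL below is a placeholder for a TPOD v1 detector image:

from gabrieltool.statemachine.callable_zoo.processor_zoo.containerized import (
    FasterRCNNContainerCallable,
)

# Placeholder registry path for a TPOD v1 container image.
detector = FasterRCNNContainerCallable(
    container_image_url='registry.cmusatyalab.org/example/my-detector:latest',
    conf_threshold=0.5,
)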

CONTAINER_NAME = 'GABRIELTOOL-FasterRCNNContainerCallable-115'
clean()[source]
container_server_url
classmethod from_json(json_obj)[source]

Create a FasterRCNNContainerCallable instance from a JSON object of serialized constructor arguments.

class SingletonContainerManager(container_name)[source]

Bases: object

Helper class to start, get, and remove a container identified by a name.

clean()[source]

Remove the container if it exists.

container
container_name
start_container(image_url, command, **kwargs)[source]

Start a container with the given image and command.

Parameters:
  • image_url (string) – Container Image URL.
  • command (string) – Container command.
  • kwargs (dictionary) – Extra arguments to pass to Docker client.
Returns:

The started container

Return type:

Container
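
A usage sketch, assuming the extra keyword arguments (here, ports) are forwarded to the Docker client when the container is started; the container name, image URL, and command are placeholders:

from gabrieltool.statemachine.callable_zoo.processor_zoo.containerized import (
    SingletonContainerManager,
)

manager = SingletonContainerManager(container_name='GABRIELTOOL-example')
# Placeholder image URL and command.
container = manager.start_container(
    image_url='registry.cmusatyalab.org/example/my-detector:latest',
    command='python server.py',
    ports={'8000/tcp': 8000},
)
print(container.name)

# Remove the container when it is no longer needed.
manager.clean()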

class TFServingContainerCallable(model_name, serving_dir, conf_threshold=0.5)[source]

Bases: gabrieltool.statemachine.callable_zoo.base.CallableBase

A callable class to execute frozen TensorFlow models using TF serving container images.

Use this class if your object detector is generated by OpenTPOD and you have downloaded the model. The TF serving container is started lazily when an FSM runner starts.
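
A construction sketch, assuming serving_dir points at a model exported from OpenTPOD; the model name and path below are placeholders:

from gabrieltool.statemachine.callable_zoo.processor_zoo.containerized import (
    TFServingContainerCallable,
)

# serving_dir should contain the exported model downloaded from OpenTPOD.
detector = TFServingContainerCallable(
    model_name='my_detector',
    serving_dir='/path/to/exported_model',
    conf_threshold=0.5,
)
# The TF serving container itself is launched lazily by the FSM runner
# (via prepare()), not at construction time.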

CONTAINER_NAME = 'GABRIELTOOL-TFServingContainerCallable-115'
SERVED_DIRS = {}
TFSERVING_GRPC_PORT = 8500
clean()[source]
container_external_port

Port of the TF Serving container.

classmethod from_json(json_obj)[source]

Create a TFServingContainerCallable instance from a JSON object of serialized constructor arguments.

prepare()[source]

Launch the TF serving container. Do not call this method directly unless debugging.

This function is called when an FSM runner starts. This enables gabrieltool to start only one TF serving container to serve many models.

gabrieltool.statemachine.callable_zoo.processor_zoo.tfutils module

Utilities for using TensorFlow models.

class TFServingPredictor(host, port)[source]

Bases: object

An agent that makes requests to a TF serving server to get object detection results.

This agent communicates with the TF serving server (often a container at localhost) through gRPC.

__init__(host, port)[source]

Constructor.

Parameters:
  • host (string) – TF serving server hostname or IP address.
  • port (int) – TF serving server port number.
infer_one(model_name, rgb_image, conf_threshold=0.5)[source]

Infer one image by sending a request to the TF serving server.

Parameters:
  • model_name (string) – Name of the model.
  • rgb_image (numpy array) – Image in RGB format.
  • conf_threshold (float, optional) – Cut-off threshold for detection. Defaults to 0.5.
Returns:

Keys are class ids; values are lists of [x1, y1, x2, y2, confidence, label_idx], e.g. {'cat': [[0, 0, 100, 100, 0.6, 'cat']], 1: [[0, 0, 100, 100, 0.7, 1]]}

Return type:

Dictionary
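
A usage sketch against a local TF serving container listening on the default gRPC port (8500); the model name and image path are placeholders:

import cv2
from gabrieltool.statemachine.callable_zoo.processor_zoo.tfutils import TFServingPredictor

predictor = TFServingPredictor(host='localhost', port=8500)

bgr_image = cv2.imread('frame.jpg')
rgb_image = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB)  # infer_one expects RGB

detections = predictor.infer_one('my_detector', rgb_image, conf_threshold=0.5)
for class_id, boxes in detections.items():
    for x1, y1, x2, y2, confidence, label_idx in boxes:
        print(class_id, confidence, (x1, y1, x2, y2))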

Module contents

A collection of Callable classes to be used by Processors (in FSM states).