Developers reference guide

Overview

This overview shows the parts of a complete system setup integrating Breeze Runtime with a sorting machine. The customer software can run on PC 1 or a separate PC 2.

Fig. 1. The system setup example

Fig. 2. Communication overview

Commands

Commands and the server's responses are sent over TCP/IP on port 2000. The message format is unindented JSON terminated with CR + LF (ASCII 13 + ASCII 10).

You can use curl in telnet mode to communicate with Breeze Runtime and send commands. For example: curl -v telnet://<ip>:<port>
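
If you are writing your own client, the minimal sketch below shows one way to send a command and read the reply from Python. Only the framing comes from this guide (compact JSON terminated by CR + LF on command port 2000); the host address, the Id value, and the helper name send_command are illustrative.

Python
import json
import socket


def send_command(sock, command):
    """Send one command as compact JSON terminated by CR+LF and read one reply line."""
    payload = json.dumps(command, separators=(",", ":")) + "\r\n"
    sock.sendall(payload.encode("utf-8"))
    buffer = b""
    while not buffer.endswith(b"\r\n"):
        chunk = sock.recv(4096)
        if not chunk:
            raise ConnectionError("Connection closed by Breeze Runtime")
        buffer += chunk
    return json.loads(buffer.decode("utf-8"))


if __name__ == "__main__":
    # 2000 is the command port described above; the host is an example.
    with socket.create_connection(("127.0.0.1", 2000)) as sock:
        reply = send_command(sock, {"Command": "GetStatus", "Id": "1"})
        print(reply["Success"], reply["Message"])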

Camera commands

Initialize camera

Using Breeze default settings

Message:

JSON
{
  "Command" : "InitializeCamera", 
  "Id"      : "IdString"
} 

Optional: camera-specific settings

Message:

JSON
{
  "Command"       : "InitializeCamera", 
  "Id"            : "IdString", 
  "DeviceName"    : "SimulatorCamera",
  "RequestedPort" : 3000, 
  "TimeOut"       : 5 // (seconds) 
  ... 
  ...  
  ... 
} 
  • The optional parameter RequestedPort can be used to set the data stream port, which defaults to 3000.

  • The optional parameter TimeOut is given in seconds and is used, for example, when waiting for the camera to become stable.

  • The command can have any number of camera-specific arguments (see table below).

Reply:

JSON
{
  "Id" : "IdString", 
  "Success" : true
} 
Available camera manufacturers
HySpex

HySpex SDK

  • SWIR, VNIR, etc.

JSON
{
  "DeviceName"      : "HySpexCamera",
  "CameraType"      : "VNIR_1800",
  "FramesToAverage" : 1, // optional; default = 1
  "SetFilePath"     : "<path>\<SetFileDirectory>"
}

For more information, see HySpex hardware installation.


Specim

LUMO SDK

  • SWIR, FX-10, FX-17, MWIR, etc.

JSON
{
  "DeviceName"  :   "SpecimCamera",
  "CameraType"  :   "FX17 with NI", // Camera name from Lumo or "FileReader",
  "ScpFilePath" :   "<path>\<file>.scp"
}

For more information, see the Specim hardware installation guide.


INNO-SPEC

INNO-SPEC SDK (deprecated)

JSON
{
  "DeviceName"            : "InnoSpecCamera",
  "CameraType"            : "RedEye",
  "MissingPixelsFilePath" : "<path>\<file>.xxx" // (optional)
}

INNO-SPEC Photon Focus-based camera

JSON
{
    "DeviceName": "InnoSpecPhotonFocusCamera",
    "CameraType": "GigE Camera - XXX.YYY.ZZZ.LLL",
    "BinningSpatial": 1, // Only applicable when no config file is used - Default = 1
    "BinningSpectral": 1, // Only applicable when no config file is used - Default = 1
    "ConfigFilePath": "C:\\Users\\user\\Documents\\InnoSpec\\HSI-17-121421.toml", // Optional
    "CorrectionMode": "OffsetGainHotpixel", // Optional; default = "OffsetGainHotpixel"
    "ModeId": "1x1", // Only applicable when using a config file
    "SpatialRoi": "", // Optional
    "SpectralRoi": "1", // Optional
    "TriggerActivation": "RisingEdge",
    "TriggerDelay": 0, // Default = 0
    "TriggerMode": false, // Default = false
    "TriggerSource": "Line0"
}

Where XXX.YYY.ZZZ.LLL is the IP address of the camera.

See INNO-SPEC for more information on the different properties.


Basler

Pylon SDK

JSON
{
  "DeviceName"        : "BaslerCamera",
  "CameraType"        : "Basler ac1920-155um 12345", // Where `ac1920-155um` model name and `12345` is the camera serial number
  "BinningHorizontal" : 1, //optional; default = 1,
  "BinningVertical"   : 1 // optional; default = 1
}

Detection Technology

Detection Technology - X-ray sensor with Ethernet connection

JSON
{
  "DeviceName"          : "DeeTeeCamera",
  "CameraType"          : "X-Scan",
  "HostIp"              : "127.0.0.1",
  "SpatialBinning"      : 0, //optional; default : 0
  "BufferSize"          : 100, //optional; default : 100
  "TimeoutExceptionMs"  : 5000, //optional; default : 5000
  "TimeOut"             : 20 //optional; default : 20
}
Calibration data
JSON
{
  "Ob"                  : 52000,
  "DynCalEnable"        : false,
  "DynCalLines"         : 100,
  "DynCalThreshold"     : 0.01,
  "DynCalNbBlockSaved"  : 10
}
Geometric calibration
JSON
{
  "GeoCorrEnable"   : true,
  "SD"              : 1160,
  "DLE"             : 36.411,
  "DHE"             : 40.600,
  "CenterPixel"     : 458.5
}

For additional start up settings and in-depth explanations see Detection Technology camera


Prediktera

FileReader test camera.

JSON
{
  "DeviceName" : "SimulatorCamera"
}

If only DeviceName is specified, a simulated camera producing 9 frames with an image width of 10 pixels is used. The frames contain one object and 'wrap around': the 10th frame is the same as the first one, and so on. The CameraType will be "SimulatorCamera" (see below). By specifying the data files to use, any raw data stream can be played back, e.g.:

JSON
{
  "RawDataFilePath"         : "Data\Nuts\measurement.raw",
  "DarkReferenceFilePath"   : "Data\Nuts\darkref_measurement.raw",
  "WhiteReferenceFilePath"  : "Data\Nuts\whiteref_measurement.raw",
  "CameraType"              : "NutsSimulatorCamera"
}
  • The file paths are relative to the directory where BreezeRuntime.exe is located.

  • The parameter CameraType can be any text, and the CameraType field in the reply from the GetStatus command will be set to this text.

  • See Running Breeze Runtime with Breeze Client for instructions on how to download and install the nuts test data, which should be used in conjunction with the workflow with ID “NutsWorkflow”.


Generic camera

A camera implementing the JSON-RPC 2.0 standard, such as the Specim FX-50.

JSON
{
  "DeviceName"            : "GenericeCamera",
  "CameraType"            : "Cortex",
  "HostIp"                : "127.0.0.1",
  "TimeOut"               : 5, //optional; default = 5
  "TimeoutExceptionMs"    : 5000, // optional; default = 5000
  "InitializeTimeOut"     : 5, // optional; default = TimeOut
  "StableTimeOut"         : 5, // optional; default = TimeOut
  "PreprocessingTimeOut"  : 5, // optional; default = TimeOut
  "BinningSpatial"        : 1, // optional; default = 1
  "BinningSpectral"       : 1, // optional; default = 1
  "BufferSize"            : 100 // optional; default = 100
}

For default values on initialization and more details, see Generic camera.


Unispectral

Unispectral Monarch Camera - USB-C

JSON
{
  "CameraType"          : "Monarch II",
  "DarkReferenceValue"  : 64,
  "DeviceName"          : "UnispectralCamera",
  "Gain"                : 1, // optional; default = 1
  "BinningSpatial"      : 1, // optional; default = 1
  "BinningSpectral"     : 1, // optional; default = 1
  "SpectralRoi"         : "0;0;0;0;1;1;1;1;1;0" // optional; default = "1;1;1;1;1;1;1;1;1;1"
}

SpectralRoi = "1" is equivalent to "1;1;1;1;1;1;1;1;1;1" - meaning all bands are included

For more information see: Unispectral


Resonon

Resonon Basler-based cameras (Pika L, XC2, and UV)

USB or Ethernet connected cameras.

May require drivers to be installed on the computer. See Basler.

JSON
{
  "DeviceName" : "ResononBaslerCamera",
  "CameraType" : "BaslerCamera serial-number",
  "BinningHorizontal" : 1, // optional; default = loaded from Resonon camera specific settings
  "BinningVertical" : 1 // optional; default = loaded from Resonon camera specific settings
}

serial-number is the serial number (device ID) specified on the camera.

Resonon Allied Vision-based cameras (Pika IR, IR+, IR-L, and IR-L+)

JSON
{
  "DeviceName" : "ResononAlliedVisionCamera",
  "CameraType" : "GigE device-id",
  "BinningHorizontal" : 1, // optional; default = loaded from Resonon camera specific settings
  "BinningVertical" : 1 // optional; default = loaded from Resonon camera specific settings
}

device-id is the device ID specified on the camera.


Disconnect camera

Message:

JSON
{
  "Command"   :   "DisconnectCamera",
  "Id"        :   "IdString"
}

Reply:

JSON
{
  "Id"          :    "IdString",
  "Success"     :    true,
  "Message"     :    "Success"
}

Get camera property

Message:

JSON
{
  "Command"     :    "GetCameraProperty",
  "Id"          :    "IdString"
  "Property"    :    "FrameRate"
}

Reply:

JSON
{
  "Id"          :    "IdString",
  "Success"     :    true,
  "Message"     :    "100"
}

Below is a list of properties that all logical cameras have. Some physical cameras cannot be queried for certain properties; in that case the logical camera returns a default value. Some logical cameras also have properties that are not listed here but can still be retrieved. For example, the camera “SimulatorCamera” has the property UniqueProperty (Float, 32 bits) to demonstrate this feature. It also has the property “State”, which can be used for simulating dark and white references (see the commands TakeDarkReference and TakeWhiteReference).

Property          Original Data Type   Comment
IntegrationTime   Float, 32 bits       µs
FrameRate         Float, 32 bits       Hz
IsCapturing       Boolean              true or false
ImageWidth        Integer, 32 bits     Number of pixels
ImageHeight       Integer, 32 bits     Number of pixels
Wavelengths       String               E.g. "300;300.3;300.6;300.9;301.2..."
MaxSignal         Integer, 32 bits     Maximum signal value
Temperature       Float, 32 bits       Sensor temperature (K)
DataSize          Integer              1 = Byte, 2 = Short (Integer, 16 bits), 4 = Float (Float, 32 bits), 8 = Double (Float, 64 bits)
Interleave        Integer              1 = BIL (Band Interleaved by Line), 2 = BIP (Band Interleaved by Pixel)
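
As a small illustration, the sketch below reads the Wavelengths property and splits the semicolon-separated string into numbers. It assumes an open command socket sock and the hypothetical send_command helper sketched in the Commands section; the Id value is arbitrary.

Python
# Assumes an open command socket `sock` and the send_command helper sketched in the Commands section.
reply = send_command(sock, {
    "Command": "GetCameraProperty",
    "Id": "wl-1",
    "Property": "Wavelengths",
})
if reply["Success"]:
    # The Message field holds e.g. "300;300.3;300.6;..." as text.
    wavelengths = [float(w) for w in reply["Message"].split(";") if w]
    print(len(wavelengths), "bands, first =", wavelengths[0])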

Set camera property

Message:

JSON
{
  "Command" : "SetCameraProperty", 
  "Id" : "IdString",
  "FrameRate": "200"
} 

Reply:

JSON
{
  "Id" : "IdString", 
  "Success" : true, 
  "Message" : "200"
}

Below is a list of properties that all logical cameras have. For some physical cameras, certain properties cannot be set, and the set operation has no effect. Some logical cameras also have properties that are not listed here but can still be set. For example, the camera “SimulatorCamera” has the property UniqueProperty (Float, 32 bits) to demonstrate this feature. It also has the property "State", which can be used when simulating dark and white references (see the commands TakeDarkReference and TakeWhiteReference).

Property          Original Data Type   Comment
IntegrationTime   Float, 32 bits       µs
FrameRate         Float, 32 bits       Hz

Raw-data capture commands

Start capture

Message:

JSON
{
    "Command"         : "StartCapture",
    "Id"              : "IdString",
    "NumberOfFrames"  : 10, 
    "Folder"          : "Path to folder" //Optional 
} 

The parameter NumberOfFrames is optional; if omitted, the camera will capture until StopCapture is sent.

The parameter Folder is optional; if provided, the captured frames are stored in a measurement.raw file in that folder.

Reply:

JSON
{
    "Id"            :    "IdString",
    "Success"       :    true,
    "Message"       :    ""
}

Stop capture

Message:

JSON
{
  "Command" : "StopCapture", 
  "Id" : "IdString"
} 

Reply:

JSON
{
  "Id" : "IdString", 
  "Success" : true, 
  "Message" : ""
} 

Prediction commands

Get workflows

Message:

JSON
{
    "Command"               :    "GetWorkflows",
    "Id"                    :    "IdString"
    "IncludeTestWorkflows"  :    true
}

The parameter IncludeTestWorkflows specifies whether the test workflows bundled with Breeze Runtime (see below) should be included; if the parameter is false or omitted, only workflows created by the user with Breeze are included.

Reply:

JSON
{
    "Id"        :    "IdString",
    "Success"    :    true,
    "Message"    :    "
    [
      {
        "Name": "Test Workflow",
        "Id": "TestWorkflow",
        "Description": "Powder Quantification 10 pixels x 9 lines Test Sample",
        "CreatedTime": "20180325160219",
        "CreatedBy": "Administrator",
        "PredictionMode": "Normal",
        "ObjectFormat": {
          "Id": "12345",
          "Name": "spectral_sample",
          "Descriptors": [
            {
              "Type": "Category",
              "Name": "Type",
              "Index": 0,
              "Id": "ebf126aa",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "V",
                  "Color": "#3ad23a",
                  "Value": 1
                },
                {
                  "Name": "P",
                  "Color": "#4664be",
                  "Value": 2
                },
                {
                  "Name": "B",
                  "Color": "#f6f76d",
                  "Value": 3
                }
              ]
            },
            {
              "Type": "Property",
              "Name": "B",
              "Index": 1,
              "Id": "28987009"
            },
            {
              "Type": "Property",
              "Name": "V",
              "Index": 2,
              "Id": "78f02a2b"
            },
            {
              "Type": "Property",
              "Name": "P",
              "Index": 3,
              "Id": "1c6d9580"
            }
          ]
        },
        "StreamFormat": {
          "TimeFormat": "Utc100NanoSeconds",
          "Lines": [
            {
              "Type": "Category",
              "Name": "SampleCategory",
              "Index": 0,
              "Id": "aa533a79",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "Sample",
                  "Color": "#3ad23a",
                  "Value": 1
                }
              ]
            },
            {
              "Type": "Category",
              "Name": "Type",
              "Index": 1,
              "Id": "ebf126aa",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "V",
                  "Color": "#3ad23a",
                  "Value": 1
                },
                {
                  "Name": "P",
                  "Color": "#4664be",
                  "Value": 2
                },
                {
                  "Name": "B",
                  "Color": "#f6f76d",
                  "Value": 3
                }
              ]
            },
            {
              "Type": "Property",
              "Name": "B",
              "Index": 2,
              "Id": "28987009"
            },
            {
              "Type": "Property",
              "Name": "V",
              "Index": 3,
              "Id": "78f02a2b"
            },
            {
              "Type": "Property",
              "Name": "P",
              "Index": 4,
              "Id": "1c6d9580"
            }
          ]
        }
      },
      {
        "Name": "Nuts Workflow",
        "Id": "NutsWorkflow",
        "Description": "This is a workflow with nuts and shells",
        "CreatedTime": "20180404145346",
        "CreatedBy": "Administrator",
        "PredictionMode": "Normal",
        "ObjectFormat": {
          "Id": "12345",
          "Name": "Sample - Nuts_Classification",
          "Descriptors": [
            {
              "Type": "Category",
              "Name": "Nut or shell",
              "Index": 0,
              "Id": "80c6940d",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "Nut",
                  "Color": "#3ad23a",
                  "Value": 1
                },
                {
                  "Name": "Shell",
                  "Color": "#4664be",
                  "Value": 2
                }
              ]
            }
          ]
        },
        "StreamFormat": {
          "TimeFormat": "Utc100NanoSeconds",
          "Lines": [
            {
              "Type": "Category",
              "Name": "SampleCategory",
              "Index": 0,
              "Id": "5dae874a",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "Sample",
                  "Color": "#3ad23a",
                  "Value": 1
                }
              ]
            },
            {
              "Type": "Category",
              "Name": "Nut or shell",
              "Index": 1,
              "Id": "80c6940d",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "Nut",
                  "Color": "#3ad23a",
                  "Value": 1
                },
                {
                  "Name": "Shell",
                  "Color": "#4664be",
                  "Value": 2
                }
              ]
            }
          ]
        }
      },
      {
        "Name": "Object To Pixels Pipeline Workflow",
        "Id": "ObjectToPixelsPipelineWorkflow",
        "Description": "Object To Pixels Pipeline Mode Workflow",
        "CreatedTime": "20180831131648",
        "CreatedBy": "Administrator",
        "PredictionMode": "ObjectToPixelsPipeline",
        "StreamFormat": {
          "TimeFormat": "Utc100NanoSeconds",
          "Lines": [
            {
              "Type": "Category",
              "Name": "Type",
              "Index": 0,
              "Id": "8a614f86",
              "Classes": [
                {
                  "Name": "Background",
                  "Color": "#4664be",
                  "Value": 0
                },
                {
                  "Name": "Sample",
                  "Color": "#3ad23a",
                  "Value": 1
                },
                {
                  "Name": "Unknown",
                  "Color": "#ff0000",
                  "Value": 2
                },
                {
                  "Name": "A",
                  "Color": "#3ad23a",
                  "Value": 3
                },
                {
                  "Name": "B",
                  "Color": "#4664be",
                  "Value": 4
                },
                {
                  "Name": "C",
                  "Color": "#f6f76d",
                  "Value": 5
                },
                {
                  "Name": "D",
                  "Color": "#73e1fa",
                  "Value": 6
                },
                {
                  "Name": "Unknown Sample",
                  "Color": "#ff0000",
                  "Value": 7
                }
              ]
            }
          ]
        }
      }
    ]
  "}

Load workflow

Loads a workflow to use for predictions.

Message:

JSON
{
 "Command"        :   "LoadWorkflow",
 "Id"            :    "IdString",
 "WorkflowId"    :    "TestWorkflow",
 "Xml"            :    null,  // can be omitted
 "FieldOfView"    :    10.5
}
  • To load a server-side workflow, the parameter WorkflowId is set to one of the IDs in the reply from GetWorkflows.

  • To load a client-side workflow, the parameter Xml is set to the contents of a client-side workflow XML file (see the sketch after this list).

  • Either WorkflowId or Xml must be specified, but not both.

  • FieldOfView is optional; if omitted, the value for the current camera is taken from the settings in Breeze on the local computer.
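
A minimal sketch of both variants, assuming an open command socket sock and the hypothetical send_command helper from the Commands section; the workflow ID and file name are placeholders.

Python
# Assumes an open command socket `sock` and the send_command helper sketched in the
# Commands section. The workflow ID and file name below are placeholders.

# Server-side workflow: reference an ID returned by GetWorkflows.
send_command(sock, {"Command": "LoadWorkflow", "Id": "load-1", "WorkflowId": "NutsWorkflow"})

# Client-side workflow: pass the contents of an exported workflow XML file instead.
with open("my_workflow.xml", "r", encoding="utf-8") as f:
    send_command(sock, {"Command": "LoadWorkflow", "Id": "load-2", "Xml": f.read()})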

Reply:

JSON
{
    "Id"        :    "IdString",
    "Success"    :    true,
    "Message"    :    
    "{
      "Name": "Test Workflow",
      "Id": "TestWorkflow",
      "Description": "This is a test workflow",
      "CreatedTime": "20180325160219",
      "CreatedBy": "Administrator",
      "PredictionMode": "Normal",
      "ObjectFormat": {
      "Id": "12345",
      "Name": "spectral_sample",
      "Descriptors": [
      {
        "Type": "Category",
        "Name": "Type",
        "Index": 0,
        "Id": "ebf126aa",
        "Classes": [
          {
            "Name": "-",
            "Color": "#ff0000",
            "Value": 0
          },
          {
            "Name": "V",
            "Color": "#3ad23a",
            "Value": 1
          },
          {
            "Name": "P",
            "Color": "#4664be",
            "Value": 2
          },
          {
            "Name": "B",
            "Color": "#f6f76d",
            "Value": 3
          }
        ]
      },
      {
        "Type": "Property",
        "Name": "B",
        "Index": 1,
        "Id": "28987009"
      },
      {
        "Type": "Property",
        "Name": "V",
        "Index": 2,
        "Id": "78f02a2b"
      },
      {
        "Type": "Property",
        "Name": "P",
        "Index": 3,
        "Id": "1c6d9580"
      }
      ]
      },
      "StreamFormat": {
      "TimeFormat": "Utc100NanoSeconds",
      "LineWidth": 10,
      "Lines": [
      {
        "Type": "Category",
        "Name": "SampleCategory",
        "Index": 0,
        "Id": "aa533a79",
        "Classes": [
          {
            "Name": "-",
            "Color": "#ff0000",
            "Value": 0
          },
          {
            "Name": "Sample",
            "Color": "#3ad23a",
            "Value": 1
          }
        ]
      },
      {
        "Type": "Category",
        "Name": "Type",
        "Index": 1,
        "Id": "ebf126aa",
        "Classes": [
          {
            "Name": "-",
            "Color": "#ff0000",
            "Value": 0
          },
          {
            "Name": "V",
            "Color": "#3ad23a",
            "Value": 1
          },
          {
            "Name": "P",
            "Color": "#4664be",
            "Value": 2
          },
          {
            "Name": "B",
            "Color": "#f6f76d",
            "Value": 3
          }
        ]
      },
      {
        "Type": "Property",
        "Name": "B",
        "Index": 2,
        "Id": "28987009"
      },
      {
        "Type": "Property",
        "Name": "V",
        "Index": 3,
        "Id": "78f02a2b"
      },
      {
        "Type": "Property",
        "Name": "P",
        "Index": 4,
        "Id": "1c6d9580"
      }
      ]
      }
    }"
}

CreatedTime has the format yyyyMMddhhmmss.

Delete workflow

Message:

JSON
{
  "Command" : "DeleteWorkflow", 
  "WorkflowId" : "IdString"
} 

Open shutter

Message:

JSON
{
    "Command" : "OpenShutter", 
    "Id" : "IdString"
} 

Opens the camera shutter.

Reply:

JSON
{
  "Id" : "IdString", 
  "Success" : true, 
  "Message" : ""
} 

Close shutter

Message:

JSON
{
  "Command" : "CloseShutter", 
  "Id" : "IdString"
} 

Closes the camera shutter.

Reply:

JSON
{
  "Id" : "IdString", 
  "Success" : true, 
  "Message" : ""
} 

Take dark reference

Message:

JSON
{
  "Command" : "TakeDarkReference", 
  "Id" : "IdString"
} 

If the camera “SimulatorCamera” is used (see the command Initialize camera), the camera state must be set to DarkReference first: either use the command CloseShutter, or use the command SetCameraProperty to set State to DarkReference. When the shutter is opened, the state is set back to Normal. A short example sequence is sketched at the end of this section.

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 
Dark reference quality control

Automatic dark reference quality control checks

  • The standard deviation between lines is lower than 5%

  • The average signal relative to max signal is lower than 50%

Invalid dark reference reply example:

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : false, 
    "Code" : 1006 
    "Error" : "InvalidDarkReference" 
    "Message" : "Variation over lines is higher than 5%"
} 
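
For the SimulatorCamera, the whole dark-reference sequence can look like the sketch below. It assumes an open command socket sock and the hypothetical send_command helper from the Commands section; the Id values are arbitrary.

Python
# Assumes an open command socket `sock` and the send_command helper sketched in the
# Commands section; Id values are arbitrary.
send_command(sock, {"Command": "CloseShutter", "Id": "dark-1"})   # SimulatorCamera enters the DarkReference state
reply = send_command(sock, {"Command": "TakeDarkReference", "Id": "dark-2"})
if not reply["Success"]:
    print("Dark reference rejected:", reply.get("Error"), reply["Message"])
send_command(sock, {"Command": "OpenShutter", "Id": "dark-3"})    # state returns to Normal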

Take white reference

Message:

JSON
{
    "Command" : "TakeWhiteReference", 
    "Id" : "IdString"
} 

If the camera “SimulatorCamera” is used (see command Initialize Camera), the camera property State must be set to WhiteReference first, using the command SetCameraProperty. After the reference is taken the state can be set to Normal.

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 
White reference quality control

Automatic white reference quality control checks

  • The standard deviation between lines is lower than 5%

  • The average signal relative to max signal is lower than 99%

  • The average signal relative to max signal is higher than 50%

  • The standard deviation between average pixels is lower than 5%

The 10% of pixels at each border are not included in the calculations.

Invalid white reference reply example:

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : false, 
    "Code" : 1007 
    "Error" : "InvalidWhiteReference" 
    "Message": "White reference less than 50% of max signal"
} 

Get workflow setup

Call this command to retrieve the prediction setup of the active workflow, for example one loaded by another client.

Message:

JSON
{
    "Command" : "GetWorkflowSetup", 
    "Id" : "IdString"
} 

Reply:

See command LoadWorkflow.

Start predict

Message:

JSON
{
    "Command" : "StartPredict", 
    "Id" : "IdString", 
    "IncludeObjectShape" : true, 
    "FrameCount" : -1
} 

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Parameters:

  • IncludeObjectShape (optional, defaults to false): whether the start y offset, object center, and border coordinates should be included with the object-identified event.

The identified objects are sent over the event stream (please see section Event Stream). This is an example object:

JSON
{ 
    "Event" : "PredictionObject", 
    "Code" : 4000, 
    "Message" : "%7B%22StartTime%22%3A636803791584234611%2C%22EndTime%22%3A63680379158431 4525%2C%22Descriptors%22%3A%5B1.0%2C45.4867249%2C47.89075%2C6.40655136%5 D%2C%22Shape%22%3A%7B%22Start%22%3A1%2C%22Center%22%3A%5B5%2C3%5D%2C%22B order%22%3A%5B%5B3%2C0%5D%2C%5B8%2C0%5D%2C%5B8%2C1%5D%2C%5B9%2C1%5D%2C%5 B9%2C7%5D%2C%5B3%2C7%5D%2C%5B3%2C3%5D%2C%5B2%2C3%5D%2C%5B2%2C1%5D%2C%5B3 %2C1%5D%2C%5B3%2C0%5D%5D%7D%7D" 
} 

Message field un-escaped and indented:

JSON
{ 
    "StartTime": 636803787957020338, 
    "EndTime": 636803787957110238, 
    "Descriptors":
    [ 
        1.0, 
        45.4867249, 
        47.89075, 
        6.40655136 
    ], 
    "Shape":
    { 
        "Start": 1, 
        "Center": [5, 3], 
        "Border":
        [ 
            [3, 0], 
            [8, 0], 
            [8, 1], 
            [9, 1],  
            [9, 7], 
            [3, 7], 
            [3, 3], 
            [2, 3], 
            [2, 1], 
            [3, 1], 
            [3, 0] 
        ] 
    } 
} 
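
The Message field of a PredictionObject event is percent-encoded (URL-escaped) JSON. A minimal decoding sketch in Python; the event literal below is a shortened, hypothetical example:

Python
import json
from urllib.parse import unquote


def decode_prediction_object(event):
    """Un-escape and parse the Message field of a PredictionObject event."""
    return json.loads(unquote(event["Message"]))


# Shortened, hypothetical event for illustration:
event = {
    "Event": "PredictionObject",
    "Code": 4000,
    "Message": "%7B%22StartTime%22%3A636803791584234611%2C%22Descriptors%22%3A%5B1.0%5D%7D",
}
obj = decode_prediction_object(event)
print(obj["StartTime"], obj["Descriptors"])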

Stop predict

Message:

JSON
{
    "Command" : "StopPredict", 
    "Id" : "IdString"
} 

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Get status

Message:

JSON
{
    "Command" : "GetStatus", 
    "Id" : "IdString"
} 

Reply:

JSON
{
    "Id"    :    "IdString",
    "Success"    :    true,
    "Message"    :
   "{\"State\":    \"Predicting\",
     \"WorkflowId\":    \"abcde\",
     \"CameraType\":    \"MWIR\",
     \"FrameRate\":    200.0,                     // Hz
     \"IntegrationTime\":    1500.0,              // µs
     \"Temperature\":    293.15,                  // K
     \"DarkReferenceValidTime\":    4482.39258,   // s
     \"WhiteReferenceValidTime\":    9483.392,    // s
     \"LicenseExpiryDate\":    \"2018-12-31\",    // yyyy-MM-dd
     \"SystemTime\":    636761502932108549,
     \"SystemTimeFormat\":    \"Utc100NanoSeconds\"}"
}
  • The original data type for SystemTime is an unsigned 64-bit integer.

States:

  • Idle

  • LoadingWorkflow

  • Predicting

  • StoppingPrediction

  • CapturingRawPixelLines

  • CapturingMultipleFrames

The field Temperature is the camera sensor temperature in kelvin (K). Degrees Celsius = Temperature - 273.15; degrees Fahrenheit = Temperature x 1.8 - 459.67.
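
The Message field of the GetStatus reply is again a JSON document carried as a string. A short sketch, assuming an open command socket sock, the hypothetical send_command helper from the Commands section, and the json module imported:

Python
# Assumes an open command socket `sock`, the send_command helper sketched in the
# Commands section, and `import json`.
reply = send_command(sock, {"Command": "GetStatus", "Id": "status-1"})
status = json.loads(reply["Message"])          # the Message string contains a JSON object
celsius = status["Temperature"] - 273.15       # Temperature is reported in kelvin
print(status["State"], status["CameraType"], round(celsius, 1), "°C")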

Get property

Message:

JSON
{
    "Command" : "GetProperty", 
    "Id" : "IdString", 
    "Property" : "PropertyString", 
//Optional: 
    "NodeId" : "NodeIdString", 
    "Name": "FieldName"
} 

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : "result string"
} 

The field NodeId can be used to get property information from a node in the analysis tree. If it is not specified, workflow-global properties are fetched.

Available Properties are:

  • Version

  • State

  • WorkspacePath

  • WorkflowId

  • DarkReferenceValidTime

  • WhiteReferenceValidTime

  • DarkReferenceFile

  • WhiteReferenceFile

  • LicenseExpiryDate

  • SystemTime

  • SystemTimeFormat

  • PredictorThreads

    • Returns the number of threads used in prediction.
      Default: -1 (number of available virtual CPU cores)

  • AvailableCameraProviders

    • Returns a semicolon-separated list of available providers

  • Fields

    • Returns a semicolon-separated list of workflow fields

  • FieldValue

    • Requires "Name"

    • Returns the workflow field value

Example 1

Message:

JSON
{
    "Command" : "GetProperty", 
    "Id" : "1", 
    "Property" : "Version"
} 

Reply:

JSON
{
    "Id" : "1", 
    "Success" : true, 
    "Message" : "2021.1.0.999 Release"
} 

Example 2

Message:

JSON
{
    "Command" : "GetProperty", 
    "Id" : "2", 
    "Property" : "FieldValue", 
    "Name" : "Voltage"
} 

Reply:

JSON
{
    "Id" : "2", 
    "Success" : true, 
    "Message" : "100"
} 

Set property

Message:

JSON
{
    "Command" : "SetProperty", 
    "Id" : "IdString", 
    "Property" : "PropertyString", 
    "Value" : "ValueString"
} 

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : "result string"
} 

Available Properties to set are:

  • PredictorThreads

    • The number of threads used in prediction.
      Default: -1 (number of available virtual CPU cores)

Re-Initialize prediction

If there is a camera problem during prediction, the Initialize command can be called to reconnect the camera and, if successful, restart the prediction. The Tries parameter sets the number of attempts made before the error code CameraNotStable (1009) is returned.

Re-Initialize command sequence order

  • If predicting before Re-Initialize command was called

    • StopPredict

  • For a number of tries, until the camera is initialized

    • DisconnectCamera

    • InitializeCamera

  • If predicting before Re-Initialize command was called

    • LoadWorkflow (Same workflow as loaded before)

    • StartPredict

Message:

JSON
{
    "Command" : "Initialize", 
    "Id" : "IdString", 
    "Tries" : 10, 
    "TimeBetweenTrialSec" : 10
} 

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Start capture on predict

Start capture measurements while predicting. The measurements are stored in the folder:

{Breeze workspace folder}/Data/Runtime/Measurements/{date}

Capture folders are named by the date when the capture starts. The recorded measurements stored in that folder are split according to the MaxFrameCount argument and are named “Measurement_1”, “Measurement_2”, ... “Measurement_N”.

Message:

JSON
{
    "Command" : "StartCaptureOnPredict", 
    "Id" : "IdString", 
    "Name" : "Name of measurements", 
    "MaxFrameCount" : 1000 // Max number of frames per measurement 
    "Object" : true | false  // Save only object or not 
}

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Stop capture on predict

Message:

JSON
{
    "Command" : "StopCaptureOnPredict", 
    "Id" : "IdString"
} 

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Apply changes on measurements

Applies a Breeze Runtime workflow to measurement folders on disk, creating pixel predictions (measurement_prediction.raw), a measurement thumbnail (measurement.jpg), and descriptor results (measurement.xml), all saved in the same folder.

Note: The measurement folder must contain a dark reference (darkref_measurement.raw) and a white reference (whiteref_measurement.raw) if the raw data is to be converted to reflectance or absorbance.

Message:

JSON
{
  "Command"       : "ApplyChanges", 
  "Id"            : "IdString", 
  "XmlFile"       : "{path to Runtime workflow}/workflow.xml",
  "Folders"       : "{path to measurement folder 1};
                     {path to measurement folder 2};
                     {path to measurement folder N};"
}

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Error Handling and Events

Errors can occur in two different ways: either as the result of a command or as an event.

Command Errors

When a command fails, the Success field is false, the Error field holds a string error code, and the Code field holds an integer error code. The Message field holds a further error description:

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : false, 
    "Message" : "Camera is not initialized", 
    "Error" : "GeneralError", 
    "Code": 3000
} 

When an error occurs on the server side during prediction or capture, the error information is sent via the event stream. Log files can be found in the directory “.Evince” in the home directory, e.g. “C:\Users\<user name>\.Evince”.

Command Error Codes

These will be sent in response to a failed command.

Error codes, constants, and descriptions:

  • 1000 GeneralCommandError: General command error. The Message field gives more details.

  • 3001 UnknownError: Unexpected error in a runtime command call. The stack trace gives more details. Example:

JSON
{
    "Id" : "s94kl34",
    "Success": false,
    "Event": "Error",
    "Error": "UnknownError",
    "Code" : 3001,
    "Message": "ArrayIndexOutOfBoundsException",
    "StackTrace": "At RtPredict.cs:281 ...\nAt ..."
}

  • 1002 InvalidLicense: Sent on StartCapture if no valid license is found for Breeze. Sent on StartPredict if no valid license is found for BreezeAPI.

  • 1003 MissingReferences: Sent on StartPredict if references are needed and both references are missing.

  • 1004 MissingDarkReference: Sent on StartPredict if references are needed and the dark reference is missing.

  • 1005 MissingWhiteReference: Sent on StartPredict if references are needed and the white reference is missing.

  • 1006 InvalidDarkReference: Sent on TakeDarkReference if the captured dark reference is not good enough. The Message field includes a detailed explanation.

  • 1007 InvalidWhiteReference: Sent on TakeWhiteReference if the captured white reference is not good enough. The Message field includes a detailed explanation.

  • 1008 MissingDarkReferenceFile: Sent on TakeWhiteReference if white reference intensity is used and there is no dark reference file to read.

  • 1009 CameraNotStable: Sent on Initialize when the prediction cannot be reinitialized after the given number of tries.

Event Stream

Events and errors that do not belong to a command will be sent over TCP/IP on port 2500. The message format is unindented JSON ended with CR + LF (ASCII 13 + ASCII 10). All errors, except those in command responses, will also be sent over this channel.
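
A minimal event-stream reader sketch, assuming the same CR + LF framing as the command port; the host address and function name are illustrative:

Python
import json
import socket


def read_events(host="127.0.0.1", port=2500):
    """Yield parsed events from the Breeze Runtime event stream."""
    with socket.create_connection((host, port)) as sock:
        buffer = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                return
            buffer += chunk
            # Each event is one line of compact JSON terminated by CR+LF.
            while b"\r\n" in buffer:
                line, buffer = buffer.split(b"\r\n", 1)
                if line:
                    yield json.loads(line.decode("utf-8"))


if __name__ == "__main__":
    for event in read_events():
        print(event.get("Event"), event.get("Code"), event.get("Message", ""))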

Regular Event Codes

These events can be sent during startup, prediction, and capture.

Example

Message:

JSON
{
    "Event" : "DarkReferenceOutdated", 
    "Code" : 2001
} 

Event codes, constants, and descriptions:

  • 2000: Reserved for the event server.

  • 2001 DarkReferenceOutdated: The dark reference is outdated.

  • 2002 WhiteReferenceOutdated: The white reference is outdated.

  • 2003 WorkflowListChanged: A workflow in the Runtime folder has been changed, removed, or added.

Error Event Codes

These events can be sent during startup, prediction, and capture. An error event has the field Event = “Error”, and the field Error specifies the name of the error.

Example

Message:

JSON
{
    "Event" : "Error", 
    "Error" : "UnknownError",
    "Message" : "<message>",
    "Code" : 3001, 
    "StackTrace": "At RtPredict.cs:281..."
} 

Error codes, constants, and descriptions:

  • 3000 GeneralError: General error. The Message field gives more details.

  • 3001 UnknownError: Unexpected error during prediction or capture. The Message field includes error details. Example:

JSON
{
     "Event"     : "UnknownError",
     "Code"      : 3001,
     "Message": "ArrayIndexOutOfBoundsException",
     "StackTrace": "At RtPredict.cs:281 ...\nAt ..."
}

  • 3002 CameraErrorCode: Forwards an error code generated by the camera supplier. A new field CameraErrorCode will be present in the event. Example:

JSON
{
     "Event"    : "CameraErrorCode",
     "Code"    : 3002, 
     "Message"    : "Camera not streaming"
}

  • 3003 FrameQueueOverflow: Prediction frame queue overflow. Please use a lower camera frame rate.

Prediction Object

  • 4000 PredictionObject: The Runtime has identified an object. Its properties can be found in the Message field as escaped JSON. Please see Start predict for details.

JSON
{
  "Event"       : "PredictionObject",
  "Code"        : 4000, 
  "Message"     : "<Escaped JSON>"
}

Data Stream

The data stream from the server is sent over TCP/IP. The port defaults to 3000 but can be set via the RequestedPort parameter of the InitializeCamera command.

All integers and floats are little-endian encoded in the stream.

Header

[Stream Type][Frame Number][Timestamp][Metadata size][Data Body Size]

1. Stream Type:

1 = Raw Pixel Line

2 = Prediction Lines

3 = Rgb Pixel Line

4 = StreamStarted/EndOfStream

Data Type: Byte

2. Frame Number:

Frame number from the camera

Data Type: Integer, 64 bits

3. Timestamp:

Time in 100-nanoseconds when the line is read from the camera (Number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight on January 1, 0001, UTC)

Data Type: Integer, 64 bits

4. Metadata Size:

Number of bytes in metadata body

Data Type: Integer, 32 bits

5. Data Body Size:

Number of bytes in the data body

Data Type: Integer, 32 bits

Metadata Body: See below.

Data Body: See below.
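
As a sketch, the header can be read with Python's struct module. The field order, widths, and little-endian byte order follow the description above, while the helper names and host address are illustrative:

Python
import socket
import struct

# Header layout described above: byte, int64, int64, int32, int32 (little-endian).
HEADER = struct.Struct("<BqqII")


def read_exact(sock, count):
    """Read exactly `count` bytes from the socket."""
    data = b""
    while len(data) < count:
        chunk = sock.recv(count - len(data))
        if not chunk:
            raise ConnectionError("Data stream closed")
        data += chunk
    return data


def read_message(sock):
    """Read one header plus its metadata and data bodies."""
    stream_type, frame_number, timestamp, meta_size, body_size = HEADER.unpack(
        read_exact(sock, HEADER.size))
    metadata = read_exact(sock, meta_size)
    body = read_exact(sock, body_size)
    return stream_type, frame_number, timestamp, metadata, body


if __name__ == "__main__":
    # 3000 is the default data stream port (see InitializeCamera / RequestedPort).
    with socket.create_connection(("127.0.0.1", 3000)) as sock:
        stream_type, frame_number, timestamp, metadata, body = read_message(sock)
        print(stream_type, frame_number, timestamp, len(metadata), len(body))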

Metadata body

1. CameraProcessingTime:

Camera Processing Time, in 100-nanoseconds

Data Type: Integer, 32 bits

2. CameraDeltaTimeToPreviousFrame:

Time difference between frames on camera, in 100-nanoseconds

Data Type: Integer, 32 bits

3. BreezeProcessingTime:

Breeze Processing Time, in 100-nanoseconds

Data Type: Integer, 32 bits

4. BreezeDeltaTimeToPreviousFrame:

Time difference between frames in Breeze, in 100-nanoseconds

Data Type: Integer, 32 bits

Data body

Raw Pixel Line

Stream Type = 1

Time in the header represents the time when the raw data was received from the camera.

Data Body: Raw bytes

Prediction Lines

Stream Type = 2

Time in the header represents the time when the frame raw data was received from the camera.

Data Body:

Line 1, Line 2, ... Line N:

Each line is an array; the number of elements equals the LineWidth field in the reply from the LoadWorkflow command. The data type of the elements depends on the line type, see below.

Classification Vector

Range:

0 (No class) - Number of classes (255)

Example 1:

[0,0,0,0,1,1,1,1,0,0,0,2,2,2,2,0,0,0,0,0,3,3,3,0,0,...]

Example 2:

[0,0,1,1,1,0,1,1,0,0,0,1,1,1,1,0,0,0,0,0,1,1,1,0,0,...] (sample vector)

Data Type:

Byte

Quantification Vector

Range:

-Max float - Max float

Example:

[0,0,0,0,4.5,5.2,3.8,6.5,0,0,0,...]

Data Type:

Float, 32 bits

Confidence Values

Range:

5-1 (High - Low)

Example:

[0,0,0,0,3,3,3,3,0,0,0,5,5,5,5,0,0,0,0,0,1,1,1,0,0,...]

Data Type:

Byte

Rgb Pixel Line

Range:

R = 0 - 255, G = 0 - 255, B = 0 - 255

Example:

[50,100,150],[50,45,55],[50,20,35],[85,65,255],[0,0,0],[...]

Data Type:

3 x Byte

Change visualization variable:

SetProperty with "Property" = "VisualizationVariable" and "Value" = "Reflectance"

Value can be "Raw", "Reflectance", "Absorbance", or a descriptor name.

Change visualization blend:

SetProperty with "Property" = "VisualizationBlend" and "Value" = "True" or "False"

Stream Start / End

Stream Type = 4

Time in the header represents the time when the data was sent.

Data Body are these ASCII encoded strings:

  • StreamStarted

  • EndOfStream

Runtime Command Switches

Available switches:

  • /p:<n> (e.g. "/p:16"): Set the number of calculation threads.

  • /w:<Current Workspace Path> (e.g. /w:"C:\temp\My Folder"): Override the current workspace path.

  • /e: Execute BreezeRuntime as an EventServer (see Event Stream).

  • /logLevel:<Level> (e.g. /logLevel:Trace): Override the minimum log level for console logging. Levels: TRACE, DEBUG, INFO (default), WARN, ERROR, FATAL.

  • /s:<sessionId> (e.g. /s:Nuts_Workflow): Log session ID.

Running Breeze Runtime with Breeze Client

1. Start “BreezeRuntime.exe”. Note: located under “C:\Program Files\Prediktera\Breeze\Runtime”. If the optional event server is used (see Event Stream), start “BreezeRuntime.exe /e” instead.

2. Start “BreezeClient.exe” from the Start menu.

3. Press the “Connect” button.

4. Select a workflow exported from Breeze and press “Load”.

5. Start predicting on the loaded workflow by pressing “Start”.

6. View real-time predictions under the “Realtime” tab.

7. View Breeze Runtime status under the “Status” tab.
