
Developers reference guide

Overview

This overview shows the parts of a complete system setup integrating Breeze Runtime with custom software. The custom software can run on PC 1 or on a separate PC 2.


Fig. 1. The system setup example


Fig. 2. Communication overview

Commands

Commands and the server’s responses are sent over TCP/IP on port 2000. The message format is unindented JSON terminated with CR + LF (ASCII 13 + ASCII 10).

You can use curl with the telnet protocol to communicate with Breeze Runtime and send commands.

BASH
curl -v telnet://<ip>:<port> 

EXAMPLE

BASH
echo -en '{"Command":"GetStatus","Id":"IdString"}\r\n' | nc <ip> <port>

If the commands are in a file, one command per line, make sure the line ending is CRLF. Convert the line endings in place using, for example, unix2dos, or convert them when posting to Breeze Runtime, for example with sed:

BASH
cat multiple-commands-file.txt | sed 's/$/\r/' | nc <ip> <port>
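
Any TCP client can be used in the same way. Below is a minimal Python sketch (host, port, and Id string are placeholders matching the examples above) that sends a single command and reads the CRLF-terminated JSON reply. The send_command helper is reused in later sketches in this guide.

PYTHON
import json
import socket

def send_command(sock, command):
    """Send one command as unindented JSON terminated by CR + LF and return the parsed reply."""
    sock.sendall((json.dumps(command) + "\r\n").encode("utf-8"))
    buf = b""
    while not buf.endswith(b"\r\n"):
        chunk = sock.recv(4096)
        if not chunk:
            raise ConnectionError("Connection closed by Breeze Runtime")
        buf += chunk
    return json.loads(buf.decode("utf-8"))

with socket.create_connection(("127.0.0.1", 2000)) as sock:
    reply = send_command(sock, {"Command": "GetStatus", "Id": "IdString"})
    print(reply["Success"], reply.get("Message"))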

Camera commands

Initialize camera

Using Breeze default settings

Message:

JSON
{
    "Command" : "InitializeCamera", 
    "Id"      : "IdString"
} 

Optional. Camera specific settings

Message:

JSON
{
    "Command"       : "InitializeCamera", 
    "Id"            : "IdString", 
    "DeviceName"    : "SimulatorCamera",
    "RequestedPort" : 3000, 
    "TimeOut"       : 5 // (seconds) 
    ... 
} 
  • The optional parameter RequestedPort can be used to set the data stream port, which defaults to 3000.

  • The optional parameter TimeOut is in seconds. It is used, for example, when waiting for the camera to become stable.

  • The command can have any number of camera-specific arguments (see table below).

Reply:

JSON
{
    "Id"      : "IdString", 
    "Success" : true
} 
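
As a sketch, initializing the simulator camera (see Prediktera below) with the send_command helper from the Commands section; the Id string is chosen for illustration:

PYTHON
# Assumes an open command socket `sock` and the send_command helper
# from the Commands section.
reply = send_command(sock, {
    "Command": "InitializeCamera",
    "Id": "init-1",                 # Id string chosen for illustration
    "DeviceName": "SimulatorCamera",
    "RequestedPort": 3000,          # optional; data stream port, default 3000
    "TimeOut": 5,                   # optional; seconds
})
assert reply["Success"]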
Available camera manufacturers
HySpex

HySpex SDK

  • Swir, Vnir, Etc.

JSON
{
    "DeviceName"                : "HySpexCamera",
    "SetFilePath"               : "<path>\<SetFileDirectory>",
    "CameraType"                : "VNIR_1800_SN0940",
    "RequestedPort"             : 3000,             // optional; default = 3000
    "ControlMode"               : "disabled",       // optional; default = "disabled" (disabled, master)
    "LensId"                    : "0",              // optional; default = "0"
    "FramesToAverage"           : 1,                // optional; default = 1
    "MotorType"                 : "translation",    // optional; default = "translation" (translation, rotation)
    "BufferSize"                : 1024,             // optional; default = 1024
    "Trigger"                   : "default",        // optional; default = "default" (default, external, internal)
    "BufferSizePreProcessing"   : 128,              // optional; default = 128
    "ImageOption"               : "RAW_BP",         // optional; default = "RAW_BP" (RAW, BGSUB, RE, RAW_BP, HSNR_RAW, HSNR_RE, HSNR_RAW_BP)
}

For more information see HySpex hardware installation


Specim

LUMO SDK

  • Swir, FX-10, FX-17, Mwir, Etc.

JSON
{
    "DeviceName"  :   "SpecimCamera",
    "CameraType"  :   "FX17 with NI", // Camera name from Lumo or "FileReader",
    "ScpFilePath" :   "<path>\<file>.scp"
}

For more information see Specim hardware installation guide

Specim FX-50

JSON-RPC 2.0 standard.

JSON
{
  "DeviceName"            : "GenericeCamera",
  "CameraType"            : "Cortex",
  "HostIp"                : "127.0.0.1",
  "TimeOut"               : 5, //optional; default = 5
  "TimeoutExceptionMs"    : 5000, // optional; default = 5000
  "InitializeTimeOut"     : 5, // optional; default = TimeOut
  "StableTimeOut"         : 5, // optional; default = TimeOut
  "PreprocessingTimeOut"  : 5, // optional; default = TimeOut
  "BinningSpatial"        : 1, // optional; default = 1
  "BinningSpectral"       : 1, // optional; default = 1
  "BufferSize"            : 100 // optional; default = 100
}

For default values on initialization and more, see Generic camera.


INNO-SPEC

INNO-SPEC SDK DEPRECATED

JSON
{
    "DeviceName"            : "InnoSpecCamera",
    "CameraType"            : "RedEye",
    "MissingPixelsFilePath" : "<path>\<file>.xxx" // (optional)
}

INNO-SPEC Photon Focus-based camera

JSON
{
    "DeviceName"        : "InnoSpecPhotonFocusCamera",
    "CameraType"        : "GigE Camera - XXX.YYY.ZZZ.LLL",
    "BinningSpatial"    : 1,                // Only applicable when no config file used - Default = 1
    "BinningSpectral"   : 1,                // Only applicable when no config file used - Default = 1
    "ConfigFilePath"    : "C:\\Users\\user\\Documents\\InnoSpec\\HSI-17-121421.toml",              // Optional
    "CorrectionMode"    : "OffsetGainHotpixel", // optional, default = "OffsetGainHotpixel"
    "ModeId"            : "1x1",                // Only applicable when using config file
    "SpatialRoi"        : "",               // Optional
    "SpectralRoi"       : "1",              // Optional
    "TriggerActivation" : "RisingEdge",
    "TriggerDelay"      : 0,                // Default = 0
    "TriggerMode"       : false,            // Default = false
    "TriggerSource"     : "Line0"
}

Where XXX.YYY.ZZZ.LLL is the IP address of the camera

See INNO-SPEC for more information on the different properties.


Basler

Pylon SDK

JSON
{
    "DeviceName"        : "BaslerCamera",
    "CameraType"        : "Basler ac1920-155um 12345", // Where `ac1920-155um` model name and `12345` is the camera serial number
    "BinningHorizontal" : 1, //optional; default = 1,
    "BinningVertical"   : 1 // optional; default = 1
}

Detection Technology

Detection Technology - X-ray sensor with Ethernet connection

JSON
{
  "DeviceName"          : "DeeTeeCamera",
  "CameraType"          : "X-Scan",
  "HostIp"              : "127.0.0.1",
  "SpatialBinning"      : 0, //optional; default : 0
  "BufferSize"          : 100, //optional; default : 100
  "TimeoutExceptionMs"  : 5000, //optional; default : 5000
  "TimeOut"             : 20 //optional; default : 20
}
Calibration data
JSON
{
  "Ob"                  : 52000,
  "DynCalEnable"        : false,
  "DynCalLines"         : 100,
  "DynCalThreshold"     : 0.01,
  "DynCalNbBlockSaved"  : 10
}
Geometric calibration
JSON
{
  "GeoCorrEnable"   : true,
  "SD"              : 1160,
  "DLE"             : 36.411,
  "DHE"             : 40.600,
  "CenterPixel"     : 458.5
}

For additional start up settings and in-depth explanations see Detection Technology camera


Prediktera

FileReader test camera.

JSON
{
  "DeviceName" : "SimulatorCamera"
}

If only DeviceName is specified, a simulated camera with 9 frames and an image width of 10 pixels will be used. The frames contain one object. The frames 'wrap around': the 10th frame is the same as the first one, and so on. The CameraType will be "SimulatorCamera" (see below). By specifying the data files to use, any raw data stream can be used, e.g.:

JSON
{
  "RawDataFilePath"         : "Data\Nuts\measurement.raw",
  "DarkReferenceFilePath"   : "Data\Nuts\darkref_measurement.raw",
  "WhiteReferenceFilePath"  : "Data\Nuts\whiteref_measurement.raw",
  "CameraType"              : "NutsSimulatorCamera"
}
  • The file paths are relative to the directory where BreezeRuntime.exe is located.

  • The parameter CameraType can be any text, and the CameraType field in the reply from the GetStatus command will be set to this text.

  • See Running Breeze Runtime with Breeze Client for instructions on how to download and install the nuts test data, which should be used in conjunction with the workflow with ID “NutsWorkflow”.


Unispectral

Unispectral Monarch Camera - USB-C

JSON
{
  "CameraType"          : "Monarch II",
  "DarkReferenceValue"  : 64,
  "DeviceName"          : "UnispectralCamera",
  "Gain"                : 1, // optional; default = 1
  "BinningSpatial"      : 1, // optional; default = 1
  "BinningSpectral"     : 1, // optional; default = 1
  "SpectralRoi"         : "0;0;0;0;1;1;1;1;1;0" // optional; default = "1;1;1;1;1;1;1;1;1;1"
}

SpectralRoi = "1" is equivalent to "1;1;1;1;1;1;1;1;1;1" - meaning all bands are included

For more information see: Unispectral


Resonon

Resonon Basler-based cameras (Pika L, XC2, and UV), see Resonon

USB or Ethernet connected cameras.

May require drivers to be installed on the computer. See Basler

JSON
{
  "DeviceName" : "ResononBaslerCamera",
  "CameraType" : "BaslerCamera serial-number",
  "BinningHorizontal" : 1, // optional; default = loaded from Resonon camera specific settings
  "BinningVertical" : 1 // optional; default = loaded from Resonon camera specific settings
}

serial-number is the device serial number specified on the camera

Resonon Allied Vision-based cameras (Pika IR, IR+, IR-L, and IR-L+)

JSON
{
  "DeviceName" : "ResononAlliedVisionCamera",
  "CameraType" : "GigE device-id",
  "BinningHorizontal" : 1, // optional; default = loaded from Resonon camera specific settings
  "BinningVertical" : 1 // optional; default = loaded from Resonon camera specific settings
}

device-id is the device id specified on the camera


Qtechnology

qtec C-Series

JSON
{
  "DeviceName" : "ServerCamera",
  "CameraType" : "Server",
  "Host"  : "10.100.10.10",
  "PixelFormat" : "short", // optional; default = "short"
  "SpectralRoi" : "1" // optional; default = "1" - all wavelenghts included
  "BinningSpatial" : 1, // optional; default = 1
  "BinningSpectral" : 1 // optional; default = 1
}

For more information see: Qtechnology


Prediktera Data Server
JSON
{
  "DeviceName" : "DataServerCamera",
  "CameraType" : "Server",
  "Port"  : 2200,
  "Width" : 384,
  "Height"  : 31,
  "Wavelength"  : "1164.52;1209.84;1255.04;1300.16;1345.2;1390.17;1435.09;1479.97;1524.81;1569.64;1614.45;1659.25;1704.05;1748.85;1793.66;1838.47;1883.3;1928.12;1972.95;2017.79;2062.62;2107.45;2152.26;2197.05;2241.81;2286.53;2331.2;2375.81;2420.34;2464.78;2509.12"
  "MaxSignal" : 1, // optional; default = 1
  "DataSize"  : "Float" // optional; valid values "Byte", "Short", "Float", "Double"
}

Changes in MaxSignal need to be reflected in DataSize and vice versa

For more information see: Prediktera Data Server


Disconnect camera

Message:

JSON
{
  "Command"   :   "DisconnectCamera",
  "Id"        :   "IdString",
  "CameraId"  :   0 // Optional if one (1) camera, mandatory if more than 1 camera
}

Reply:

JSON
{
  "Id"          :    "IdString",
  "Success"     :    true,
  "Message"     :    "Success"
}

Get camera property

Message:

JSON
{
  "Command"     :    "GetCameraProperty",
  "Id"          :    "IdString",
  "CameraId"    :    0, // Optional if one (1) camera, mandatory if more than 1 camera
  "Property"    :    "FrameRate"
}

Reply:

JSON
{
  "Id"          :    "IdString",
  "Success"     :    true,
  "Message"     :    "100"
}

Below is a list of properties that all logical cameras have. Some physical cameras cannot be queried for certain properties; in that case a default value is returned by the logical camera. Some logical cameras have additional properties not listed here that can still be retrieved. For example, the camera “SimulatorCamera” has the property UniqueProperty (Float, 32 bits) to demonstrate this feature. It also has the property “State”, which can be used for simulating dark and white references (see commands TakeDarkReference and TakeWhiteReference).

Property          Original Data Type   Comment
IntegrationTime   Float, 32 bits       µs
FrameRate         Float, 32 bits       Hz
IsCapturing       Boolean              true or false
ImageWidth        Integer, 32 bits     No. of pixels
ImageHeight       Integer, 32 bits     No. of pixels
Wavelengths       string               E.g. "300;300.3;300.6;300.9;301.2..."
MaxSignal         Integer, 32 bits     Maximum signal value
Temperature       Float, 32 bits       Sensor temperature (K)
DataSize          Integer              1 = Byte, 2 = Short (Integer, 16 bits), 4 = Float (Float, 32 bits), 8 = Double (Float, 64 bits)
Interleave        Integer              1 = BIL (Band Interleaved by Line), 2 = BIP (Band Interleaved by Pixel)
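
As an illustration, the common properties above can be queried in a loop with the send_command helper from the Commands section (a sketch; the Id strings are chosen for illustration):

PYTHON
COMMON_PROPERTIES = [
    "IntegrationTime", "FrameRate", "IsCapturing", "ImageWidth",
    "ImageHeight", "Wavelengths", "MaxSignal", "Temperature",
    "DataSize", "Interleave",
]

for prop in COMMON_PROPERTIES:
    reply = send_command(sock, {
        "Command": "GetCameraProperty",
        "Id": f"get-{prop}",
        "Property": prop,
    })
    print(prop, "=", reply["Message"])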

Set camera property

Message:

JSON
{
  "Command"   : "SetCameraProperty", 
  "Id"        : "IdString",,
  "CameraId"  : 0, // Optional if one (1) camera, mandatory if more than 1 camera
  "Name"      : "FrameRate",
  "Value"     : "200"
} 

Reply:

JSON
{
  "Id" : "IdString", 
  "Success" : true, 
  "Message" : "200"
}

Below is a list of properties that all logical cameras have. Some physical cameras cannot have certain properties set; in that case the set operation has no effect. Some logical cameras have additional properties not listed here that can still be set. For example, the camera “SimulatorCamera” has the property UniqueProperty (Float, 32 bits) to demonstrate this feature. It also has the property "State", which can be used when simulating dark and white references (see commands TakeDarkReference and TakeWhiteReference).

Property          Original Data Type   Comment
IntegrationTime   Float, 32 bits       µs
FrameRate         Float, 32 bits       Hz

Raw-data capture commands

Start capture

Message:

JSON
{
    "Command"         : "StartCapture",
    "Id"              : "IdString",
    "CameraId"        : 0, // Optional if one (1) camera, mandatory if more than 1 camera
    "NumberOfFrames"  : 10, 
    "Folder"          : "Path to folder" //Optional 
} 

The parameter NumberOfFrames is optional; if omitted, the camera will capture until StopCapture is sent.

The parameter Folder is optional and specifies where the captured frames are stored as a measurement.raw file.

Reply:

JSON
{
    "Id"            :    "IdString",
    "Success"       :    true,
    "Message"       :    ""
}

Stop capture

Message:

JSON
{
    "Command"     : "StopCapture", 
    "CameraId"    : 0, // Optional if one (1) camera, mandatory if more than 1 camera
    "Id"          : "IdString"
} 

Reply:

JSON
{
  "Id"      : "IdString", 
  "Success" : true, 
  "Message" : ""
} 
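
A complete capture session with the send_command helper from the Commands section might look like this sketch (the folder path is a placeholder):

PYTHON
import time

send_command(sock, {
    "Command": "StartCapture",
    "Id": "cap-1",
    "NumberOfFrames": 10,          # omit to capture until StopCapture
    "Folder": "C:/Data/Captures",  # optional; placeholder path
})
time.sleep(1.0)  # give the camera time to deliver the frames
send_command(sock, {"Command": "StopCapture", "Id": "cap-2"})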

Prediction commands

Get workflows

Message:

JSON
{
    "Command"               :    "GetWorkflows",
    "Id"                    :    "IdString"
    "IncludeTestWorkflows"  :    true
}

The parameter IncludeTestWorkflows specifies whether the test workflows bundled with Breeze Runtime (see below) should be included; if the parameter is false or omitted, only the workflows created by the user with Breeze are included.

Reply:

JSON
{
    "Id"        :    "IdString",
    "Success"    :    true,
    "Message"    :    "
    [
      {
        "Name": "Test Workflow",
        "Id": "TestWorkflow",
        "Description": "Powder Quantification 10 pixels x 9 lines Test Sample",
        "CreatedTime": "20180325160219",
        "CreatedBy": "Administrator",
    "FieldOfView": 320.0,
    "IncludeChildObjects": true,
    "Settings": {
        "PredictionMode": "Normal",
        "Chunks": 1,
        "BufferSize": 1,
        "LineBinning": 1
    },
        "PredictionMode": "Normal",
        "ObjectFormat": {
          "Id": "12345",
          "Name": "spectral_sample",
          "Descriptors": [
            {
              "Type": "Category",
              "Name": "Type",
              "Index": 0,
              "Id": "ebf126aa",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "V",
                  "Color": "#3ad23a",
                  "Value": 1
                },
                {
                  "Name": "P",
                  "Color": "#4664be",
                  "Value": 2
                },
                {
                  "Name": "B",
                  "Color": "#f6f76d",
                  "Value": 3
                }
              ]
            },
            {
              "Type": "Property",
              "Name": "B",
              "Index": 1,
              "Id": "28987009"
            },
            {
              "Type": "Property",
              "Name": "V",
              "Index": 2,
              "Id": "78f02a2b"
            },
            {
              "Type": "Property",
              "Name": "P",
              "Index": 3,
              "Id": "1c6d9580"
            }
          ]
        },
        "StreamFormat": {
          "TimeFormat": "Utc100NanoSeconds",
          "Lines": [
            {
              "Type": "Category",
              "Name": "SampleCategory",
              "Index": 0,
              "Id": "aa533a79",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "Sample",
                  "Color": "#3ad23a",
                  "Value": 1
                }
              ]
            },
            {
              "Type": "Category",
              "Name": "Type",
              "Index": 1,
              "Id": "ebf126aa",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "V",
                  "Color": "#3ad23a",
                  "Value": 1
                },
                {
                  "Name": "P",
                  "Color": "#4664be",
                  "Value": 2
                },
                {
                  "Name": "B",
                  "Color": "#f6f76d",
                  "Value": 3
                }
              ]
            },
            {
              "Type": "Property",
              "Name": "B",
              "Index": 2,
              "Id": "28987009"
            },
            {
              "Type": "Property",
              "Name": "V",
              "Index": 3,
              "Id": "78f02a2b"
            },
            {
              "Type": "Property",
              "Name": "P",
              "Index": 4,
              "Id": "1c6d9580"
            }
          ]
        }
      },
      {
        "Name": "Nuts Workflow",
        "Id": "NutsWorkflow",
        "Description": "This is a workflow with nuts and shells",
        "CreatedTime": "20180404145346",
        "CreatedBy": "Administrator",
        "PredictionMode": "Normal",
        "ObjectFormat": {
          "Id": "12345",
          "Name": "Sample - Nuts_Classification",
          "Descriptors": [
            {
              "Type": "Category",
              "Name": "Nut or shell",
              "Index": 0,
              "Id": "80c6940d",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "Nut",
                  "Color": "#3ad23a",
                  "Value": 1
                },
                {
                  "Name": "Shell",
                  "Color": "#4664be",
                  "Value": 2
                }
              ]
            }
          ]
        },
        "StreamFormat": {
          "TimeFormat": "Utc100NanoSeconds",
          "Lines": [
            {
              "Type": "Category",
              "Name": "SampleCategory",
              "Index": 0,
              "Id": "5dae874a",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "Sample",
                  "Color": "#3ad23a",
                  "Value": 1
                }
              ]
            },
            {
              "Type": "Category",
              "Name": "Nut or shell",
              "Index": 1,
              "Id": "80c6940d",
              "Classes": [
                {
                  "Name": "-",
                  "Color": "#ff0000",
                  "Value": 0
                },
                {
                  "Name": "Nut",
                  "Color": "#3ad23a",
                  "Value": 1
                },
                {
                  "Name": "Shell",
                  "Color": "#4664be",
                  "Value": 2
                }
              ]
            }
          ]
        }
      },
      {
        "Name": "Object To Pixels Pipeline Workflow",
        "Id": "ObjectToPixelsPipelineWorkflow",
        "Description": "Object To Pixels Pipeline Mode Workflow",
        "CreatedTime": "20180831131648",
        "CreatedBy": "Administrator",
        "PredictionMode": "ObjectToPixelsPipeline",
        "StreamFormat": {
          "TimeFormat": "Utc100NanoSeconds",
          "Lines": [
            {
              "Type": "Category",
              "Name": "Type",
              "Index": 0,
              "Id": "8a614f86",
              "Classes": [
                {
                  "Name": "Background",
                  "Color": "#4664be",
                  "Value": 0
                },
                {
                  "Name": "Sample",
                  "Color": "#3ad23a",
                  "Value": 1
                },
                {
                  "Name": "Unknown",
                  "Color": "#ff0000",
                  "Value": 2
                },
                {
                  "Name": "A",
                  "Color": "#3ad23a",
                  "Value": 3
                },
                {
                  "Name": "B",
                  "Color": "#4664be",
                  "Value": 4
                },
                {
                  "Name": "C",
                  "Color": "#f6f76d",
                  "Value": 5
                },
                {
                  "Name": "D",
                  "Color": "#73e1fa",
                  "Value": 6
                },
                {
                  "Name": "Unknown Sample",
                  "Color": "#ff0000",
                  "Value": 7
                }
              ]
            }
          ]
        }
      }
    ]
  "}

Load workflow

Loads a workflow to use for predictions and returns detailed information about the workflow. A single workflow can be active at a time in the Runtime.

Message:

JSON
{
 "Command"        :   "LoadWorkflow",
 "Id"             :    "IdString",
 "WorkflowId"     :    "TestWorkflow",
 "UseReferences"  :    true,
 "Xml"            :    null,  // can be omitted
 "FieldOfView"    :    10.5
}
  • To load a server-side workflow, the parameter WorkflowId is set to one of the IDs in the reply from GetWorkflows.

  • To load a client-side workflow, the parameter Xml is set to the contents of a client-side workflow XML file.

  • Either WorkflowId or Xml must be specified, but not both.

  • FieldOfView is optional; if omitted, the value for the current camera will be used from settings in Breeze on the local computer.

Reply:

The reply to this command contains detailed information on the workflow in the returned Message JSON object. Specifically, these parts of the Message are useful for understanding the data returned in the Runtime’s Event stream and Data stream:

  • The Settings/PredictionMode value Normal corresponds to the Pixel Prediction Lines document settings for the data stream.

  • The value of ObjectFormat and its Descriptors value contain definitions of the JSON data for each object in the Event Stream.

  • The StreamFormat value is an object defining the data in the binary Data Stream. Since the stream contains binary data, you need this information to parse it. The value of Type for each descriptor in the StreamFormat/Lines object specifies the size of the data: Type:Property means a quantitative value in a 32-bit float data type; Type:Category means a byte containing a class index.

The Message body below shows an example workflow definition, un-escaped and indented:

JSON
{
  "Name": "Test Workflow",
  "Id": "TestWorkflow",
  "Description": "This is a test workflow",
  "CreatedTime": "20180325160219",
  "CreatedBy": "Administrator",
  "FieldOfView": 320.0,
  "IncludeChildObjects": true,
  "Settings": {
      "PredictionMode": "Normal",
      "Chunks": 1,
      "BufferSize": 1,
      "LineBinning": 1
  },
  "ObjectFormat": {
    "Id": "12345",
    "Name": "spectral_sample",
    "Descriptors": [
      {
        "Type": "Category",
        "Name": "Type",
        "Index": 0,
        "Id": "ebf126aa",
        "Classes": [
          {
            "Name": "-",
            "Color": "#ff0000",
            "Value": 0
          },
          {
            "Name": "V",
            "Color": "#3ad23a",
            "Value": 1
          },
          {
            "Name": "P",
            "Color": "#4664be",
            "Value": 2
          },
          {
            "Name": "B",
            "Color": "#f6f76d",
            "Value": 3
          }
        ]
      },
      {
        "Type": "Property",
        "Name": "B",
        "Index": 1,
        "Id": "28987009"
      },
      {
        "Type": "Property",
        "Name": "V",
        "Index": 2,
        "Id": "78f02a2b"
      },
      {
        "Type": "Property",
        "Name": "P",
        "Index": 3,
        "Id": "1c6d9580"
      }
    ]
  },
  "StreamFormat": {
    "TimeFormat": "Utc100NanoSeconds",
    "LineWidth": 10,
    "ColorScale": "jet",
    "ReverseColors": false,
    "Lines": [
      {
        "Type": "Category",
        "Name": "SampleCategory",
        "Index": 0,
        "Id": "aa533a79",
        "Classes": [
          {
            "Name": "-",
            "Color": "#ff0000",
            "Value": 0
          },
          {
            "Name": "Sample",
            "Color": "#3ad23a",
            "Value": 1
          }
        ]
      },
      {
        "Type": "Category",
        "Name": "Type",
        "Index": 1,
        "Method": "ClassificationExpression",
        "Id": "ebf126aa",
        "Min": 0.0,
        "Max": 0.0,
        "Classes": [
          {
            "Name": "-",
            "Color": "#ff0000",
            "Value": 0
          },
          {
            "Name": "V",
            "Color": "#3ad23a",
            "Value": 1
          },
          {
            "Name": "P",
            "Color": "#4664be",
            "Value": 2
          },
          {
            "Name": "B",
            "Color": "#f6f76d",
            "Value": 3
          }
        ],
        "ObjectOnly": false
      },
      {
        "Type": "Property",
        "Name": "B",
        "Index": 2,
        "Method": "QuantificationPls",
        "Id": "28987009",
        "Min": 0.0,
        "Max": 1.0,
        "LineType": "QuantificationVector",
        "ObjectOnly": false
      },
      {
        "Type": "Property",
        "Name": "V",
        "Index": 3,
        "Method": "QuantificationPls",
        "Id": "78f02a2b",
        "Min": 0.0,
        "Max": 1.0,
        "LineType": "QuantificationVector",
        "ObjectOnly": false
      },
      {
        "Type": "Property",
        "Name": "P",
        "Index": 4,
        "Method": "QuantificationPls",
        "Id": "1c6d9580",
        "Min": 0.0,
        "Max": 1.0,
        "LineType": "QuantificationVector",
        "ObjectOnly": false
      }
    ]
  }
}

CreatedTime has the format yyyyMMddhhmmss.

Delete workflow

Message:

JSON
{
  "Command"    : "DeleteWorkflow", 
  "WorkflowId" : "IdString"
} 

Open shutter

Message:

JSON
{
    "Command"   : "OpenShutter", 
    "CameraId"  : 0, // Optional if one (1) camera, mandatory if more than 1 camera
    "Id"        : "IdString"
} 

Open the camera shutter

Reply:

JSON
{
  "Id"      : "IdString", 
  "Success" : true, 
  "Message" : ""
} 

Close shutter

Message:

JSON
{
  "Command"   : "CloseShutter", 
  "CameraId"  : 0, // Optional if one (1) camera, mandatory if more than 1 camera
  "Id"        : "IdString"
} 

Closes the camera shutter.

Reply:

JSON
{
  "Id"      : "IdString", 
  "Success" : true, 
  "Message" : ""
} 

Take dark reference

Message:

JSON
{
  "Command"   : "TakeDarkReference", 
  "CameraId"  : 0, // Optional if one (1) camera, mandatory if more than 1 camera
  "Id"        : "IdString"
} 

If the camera “SimulatorCamera” is used (see command Initialize camera), the camera state must be set to DarkReference first; either use the command CloseShutter, or use the command SetCameraProperty to set State to DarkReference. When the shutter is opened the state is set to Normal.

Reply:

JSON
{
    "Id"      : "IdString", 
    "Success" : true, 
    "Message" : ""
} 
Dark reference quality control

Automatic dark reference quality control checks

  • The standard deviation between lines is lower than 5%

  • The average signal relative to max signal is lower than 50%

Invalid dark reference reply example:

Reply:

JSON
{
    "Id"      : "IdString", 
    "Success" : false, 
    "Code"    : 1006 
    "Error"   : "InvalidDarkReference" 
    "Message" : "Variation over lines is higher than 5%"
} 

Take white reference

Message:

JSON
{
    "Command" : "TakeWhiteReference", 
    "Id"      : "IdString"
} 

If the camera “SimulatorCamera” is used (see command Initialize Camera), the camera property State must be set to WhiteReference first, using the command SetCameraProperty. After the reference is taken the state can be set to Normal.

Reply:

JSON
{
    "Id"      : "IdString", 
    "Success" : true, 
    "Message" : "Type=WhiteReferenceQuality;State=Good;Message=;StderrLines=0.00024;StderrPixels=0.01095;Min=38189;Mean=48029.7;Median=49055;Max=52960;Std=3717.01;StdError=0.07739;SaturatedPixels=25;TotalSaturated=2"
} 

The Message field contains a semicolon-delimited string of white reference quality metrics:

  • Type – identifies the type of quality report (WhiteReferenceQuality).

  • State – overall evaluation of the reference quality (Good, Warning, Error)

  • Message – optional message with additional context (empty in this case).

  • StderrLines – standard error calculated across line intensities.

  • StderrPixels – standard error calculated across pixel intensities.

  • Min – minimum intensity value observed.

  • Mean – average intensity value across all pixels.

  • Median – median intensity value across all pixels.

  • Max – maximum intensity across all pixels.

  • Std – standard deviation of intensity values (spread/variance).

  • StdError – overall standard error of the mean.

  • SaturatedPixels – number of pixels at maximum measurable intensity in a single line/frame.

  • TotalSaturated – total number of frames/lines containing saturation events.

White reference quality control

Automatic white reference quality control checks

  • The standard deviation between lines is lower than 5%

  • The average signal relative to max signal is lower than 99%

  • The average signal relative to max signal is higher than 50%

  • The standard deviation between average pixels is lower than 5%

10% of the border pixels on each side are not included in the calculation

Invalid white reference reply example:

Reply:

JSON
{
    "Id"      : "IdString", 
    "Success" : false, 
    "Code"    : 1007 
    "Error"   : "InvalidWhiteReference" 
    "Message" : "White reference less than 50% of max signal"
} 

Get workflow setup

Call this command to retrieve the prediction setup of the active workflow, for example one loaded by another client.

Message:

JSON
{
    "Command" : "GetWorkflowSetup", 
    "Id"      : "IdString"
} 

Reply:

See command LoadWorkflow.

Start predict

Message:

JSON
{
    "Command"            : "StartPredict", 
    "Id"                 : "IdString", 
    "IncludeObjectShape" : true, 
    "FrameCount"         : -1
} 

Reply:

JSON
{
    "Id"      : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Parameters:

  • IncludeObjectShape (optional, defaults to false): Whether the start y offset, object center, and border coordinates should be included with the object-identified event.

The identified objects are sent over the event stream (please see section Event Stream). This is an example object:

JSON
{ 
    "Event"   : "PredictionObject", 
    "Code"    : 4000, 
    "Message" : "%7B......" 
} 

Message field un-escaped and indented:

JSON
{ 
    "Id": "9a327679-12cd-4442-996b-8268d24a5e50", 
    "CameraId": 0,
    "SegmentationId": "dc5bae5e",
    "StartTime": 638919780939627360,
    "EndTime": 638919780951329271,
    "StartLine": 2001,
    "EndLine": 2500,
    "Children": [],
    "Descriptors":
    [ 
        1.0, 
        45.4867249, 
        47.89075, 
        6.40655136 
    ], 
    "Shape":
    { 
        "Center": [968, 249], 
        "Border":
        [ 
            [0, 0],
            [1936, 0],
            [1936, 499],
            [0, 499] 
        ] 
    } 
} 
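
The Message escaping in the event above appears to be percent-encoding (“%7B” is the escaped “{”). Under that assumption, a Python sketch for decoding a PredictionObject event:

PYTHON
import json
from urllib.parse import unquote

def decode_prediction_object(event):
    """Decode the percent-escaped Message field of a PredictionObject event."""
    return json.loads(unquote(event["Message"]))

# The Descriptors values follow the ObjectFormat of the loaded workflow:
#   obj = decode_prediction_object(event)
#   print(obj["StartLine"], obj["EndLine"], obj["Descriptors"])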

Stop predict

Message:

JSON
{
    "Command" : "StopPredict", 
    "Id" : "IdString"
} 

Reply:

JSON
{
    "Id"      : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Get status

Message:

JSON
{
    "Command" : "GetStatus", 
    "Id"      : "IdString"
} 

Reply:

JSON
{
    "Id"    :    "IdString",
    "Success"    :    true,
    "Message"    :
   "{\"State\":    \"Predicting\",
     \"WorkflowId\":    \"abcde\",
     \"CameraType\":    \"MWIR\",
     \"FrameRate\":    200.0,                     // Hz
     \"IntegrationTime\":    1500.0,              // µs
     \"Temperature\":    293.15,                  // K
     \"DarkReferenceValidTime\":    4482.39258,   // s
     \"WhiteReferenceValidTime\":    9483.392,    // s
     \"LicenseExpiryDate\":    \"2018-12-31\",    // yyyy-MM-dd
     \"SystemTime\":    636761502932108549,
     \"SystemTimeFormat\":    \"Utc100NanoSeconds\"}"
}
  • The original data type for SystemTime is an unsigned 64-bit integer.

States:

  • Idle

  • LoadingWorkflow

  • Predicting

  • StoppingPrediction

  • CapturingRawPixelLines

  • CapturingMultipleFrames

The field Temperature represents the camera sensor temperature in kelvin (K). Degrees Celsius = Temperature - 273.15. Degrees Fahrenheit = Temperature x 1.8 - 459.67.

Get property

Message:

JSON
{
    "Command"  : "GetProperty", 
    "Id"       : "IdString", 
    "Property" : "PropertyString", 
//Optional: 
    "NodeId"   : "NodeIdString", 
    "Name"     : "FieldName"
} 

Reply:

JSON
{
    "Id"      : "IdString", 
    "Success" : true, 
    "Message" : "result string"
} 

The field NodeId can be used to get property information from a node in the analysis tree. If it is not specified, workflow-global properties are fetched.

Available Properties are:

  • Version

  • State

  • WorkspacePath

  • WorkflowId

  • DarkReferenceValidTime

  • WhiteReferenceValidTime

  • DarkReferenceFile

  • WhiteReferenceFile

  • LicenseExpiryDate

  • SystemTime

  • SystemTimeFormat

  • PredictorThreads

    • Returns the number of threads used in prediction.
      Default -1 = number of available virtual CPU cores

  • AvailableCameraProviders

    • Returns a semicolon-separated list of available providers

  • Fields

    • Returns a semicolon-separated list of workflow fields

  • FieldValue

    • Requires "Name"

    • Workflow field value

Example 1

Message:

JSON
{
    "Command" : "GetProperty", 
    "Id" : "1", 
    "Property" : "Version"
} 

Reply:

JSON
{
    "Id" : "1", 
    "Success" : true, 
    "Message" : "2021.1.0.999 Release"
} 

Example 2

Message:

JSON
{
    "Command" : "GetProperty", 
    "Id" : "2", 
    "Property" : "FieldValue", 
    "Name" : "Voltage"
} 

Reply:

JSON
{
    "Id" : "2", 
    "Success" : true, 
    "Message" : "100"
} 

Set property

Message:

JSON
{
    "Command" : "SetProperty", 
    "Id" : "IdString", 
    "Property" : "PropertyString", 
    "Value" : "ValueString"
} 

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : "result string"
} 

Available Properties to set are:

  • PredictorThreads

    • The number of threads used in prediction.
      Default = -1 (number of available virtual CPU cores)

Re-Initialize prediction

If there is a camera problem during prediction, the Initialize command can be called to reconnect the camera and, if successful, start the prediction again. The Tries parameter decides how many attempts are made before the error CameraNotStable (code 1009) is returned.

Re-Initialize command sequence order

  • If predicting before Re-Initialize command was called

    • StopPredict

  • For a number of tries until the camera is initialized

    • DisconnectCamera

    • InitializeCamera

  • If predicting before Re-Initialize command was called

    • LoadWorkflow (Same workflow as loaded before)

    • StartPredict

Message:

JSON
{
    "Command" : "Initialize", 
    "Id" : "IdString", 
    "Tries" : 10, 
    "TimeBetweenTrialSec" : 10
} 

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Start capture on predict

Starts capturing measurements while predicting. The measurements are stored in the folder:

{Breeze workspace folder}/Data/Runtime/Measurements/{date}

Capture folders are named by the date when the capture starts. The recorded measurements stored in that folder are split according to the MaxFrameCount argument and are named “Measurement_1”, “Measurement_2”, … “Measurement_N”.

Message:

JSON
{
    "Command" : "StartCaptureOnPredict", 
    "Id" : "IdString", 
    "Name" : "Name of measurements", 
    "MaxFrameCount" : 1000 // Max number of frames per measurement 
    "Object" : true | false  // If data should be saved only when objects are identified
}

When segmentation is used to identify objects in a project, “Object”: true results in measurements being captured only when an object is detected. If no object is detected, those image lines are not saved to disk in the measurement, to save space.

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Stop capture on predict

Message:

JSON
{
    "Command" : "StopCaptureOnPredict", 
    "Id" : "IdString"
} 

Reply:

JSON
{
    "Id" : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Apply changes on ENVI image files

Applies a specific Breeze Runtime workflow to ENVI image files (extension raw, bil, bip, bsq, or img) to create pixel predictions ([FILE_NAME]_prediction.raw), a measurement thumbnail ([FILE_NAME].jpg), and descriptor results ([FILE_NAME].xml), saved in the same folder.

The measurement folder must contain a dark reference (darkref_[FILE_NAME].raw) and a white reference (whiteref_[FILE_NAME].raw) if the raw data should be converted to reflectance or absorbance.

Message:

JSON
{
  "Command"   : "ApplyChanges",
  "Id"        : "IdString",
  "XmlFile"   : "{path to Runtime workflow}/workflow.xml",
  "Files"     : [
    "{path to ENVI image file 1}",
    "{path to ENVI image file 2}",
    "{path to ENVI image file N}"
  ]
}

Reply:

JSON
{
    "Id"      : "IdString", 
    "Success" : true, 
    "Message" : ""
} 

Error Handling and Events

Here is some additional information regarding error formats and responses.

Error Types

Errors can occur in two ways:

  1. Command Errors
    Sent as part of a command reply when that command fails.

  2. Event Errors
    Sent via the event stream when something goes wrong server-side (e.g. during prediction or capture).

1. Command Errors

When a command fails:

  • Success

    • Type: boolean

    • Value: false

  • Error

    • Type: string

    • Description: A short error code

  • Code

    • Type: integer

    • Description: A numeric error code

  • Message

    • Type: string

    • Description: A human-readable description of what went wrong

Example reply:

JSON
{
  "Id"      : "IdString",
  "Success" : false,
  "Message" : "Camera is not initialized",
  "Error"   : "GeneralError",
  "Code"    : 3000
}


2. Event Stream

When an error occurs server-side (for example, during prediction or capture), it’s sent as an event rather than a command response. To investigate these errors in detail, check the log files at:

BASH
%USERPROFILE%\.Prediktera\Breeze

– for example

BASH
C:\Users\<user name>\.Prediktera\Breeze

These logs contain full stack traces and diagnostic details for any event-stream errors.

Learn more in Breeze log files and troubleshooting.

Command Error Codes

These will be sent in response to a failed command.

Error Code   Constant                   Description
1000         GeneralCommandError        General command error. The Message field gives more details.
3001         UnknownError               Unexpected error in a runtime command call. The stack trace gives more details (see example below).
1002         InvalidLicense             Sent on StartCapture if no valid license is found for Breeze. Sent on StartPredict if no valid license is found for BreezeAPI.
1003         MissingReferences          Sent on StartPredict if references are needed and both references are missing.
1004         MissingDarkReference       Sent on StartPredict if references are needed and the dark reference is missing.
1005         MissingWhiteReference      Sent on StartPredict if references are needed and the white reference is missing.
1006         InvalidDarkReference       Sent on TakeDarkReference if the captured dark reference is not good enough. The Message includes a detailed explanation.
1007         InvalidWhiteReference      Sent on TakeWhiteReference if the captured white reference is not good enough. The Message includes a detailed explanation.
1008         MissingDarkReferenceFile   Sent on TakeWhiteReference if white reference intensity is used and there is no dark reference file to read.
1009         CameraNotStable            Sent on Initialize when the prediction cannot be reinitialized after the given number of tries.

Example (UnknownError, code 3001):

JSON
{
    "Id" : "s94kl34",
    "Success": false,
    "Event": "Error",
    "Error": "UnknownError",
    "Code" : 3001,
    "Message": "ArrayIndexOutOfBoundsException",
    "StackTrace": "At RtPredict.cs:281 ...\nAt ..."
}

Event Stream

Events and errors that do not belong to a command are sent over TCP/IP on port 2500. The message format is unindented JSON terminated with CR + LF (ASCII 13 + ASCII 10). All errors, except those in command responses, are also sent over this channel.
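
Reading the event stream is analogous to the command channel. A minimal Python sketch (the host is a placeholder):

PYTHON
import json
import socket

with socket.create_connection(("127.0.0.1", 2500)) as events:
    buf = b""
    while True:
        chunk = events.recv(4096)
        if not chunk:
            break  # Runtime closed the event stream
        buf += chunk
        # Each event is unindented JSON terminated by CR + LF
        while b"\r\n" in buf:
            line, buf = buf.split(b"\r\n", 1)
            event = json.loads(line.decode("utf-8"))
            print(event.get("Event"), event.get("Code"))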

Regular Event Codes

These events can be sent during startup, prediction, and capture.

Example

Message:

JSON
{
    "Event" : "DarkReferenceOutdated", 
    "Code" : 2001
} 

Event Code   Constant                      Description
2000         (reserved for event server)
2001         DarkReferenceOutdated         Dark reference is outdated
2002         WhiteReferenceOutdated        White reference is outdated
2003         WorkflowListChanged           A workflow in the Runtime folder has been changed, removed, or added

Error Event Codes

These events can be sent during startup, prediction, and capture. An error event has the field Event = “Error”, and the field Error specifies the name of the error.

Example

Message:

JSON
{
    "Event" : "Error", 
    "Error" : "UnknownError",
    "Message" : "<message>",
    "Code" : 3001, 
    "StackTrace": "At RtPredict.cs:281..."
} 

Error Code   Constant             Description
3000         GeneralError         General error. The Message field gives more details.
3001         UnknownError         Unexpected error during prediction or capture. The Message includes error details (see example below).
3002         CameraErrorCode      Forwards an error code generated by the camera supplier. A new field CameraErrorCode will be present in the event (see example below).
3003         FrameQueueOverflow   Prediction frame queue overflow. Please use a lower camera frame rate.

Example (UnknownError, code 3001):

JSON
{
     "Event"      : "UnknownError",
     "Code"       : 3001,
     "Message"    : "ArrayIndexOutOfBoundsException",
     "StackTrace" : "At RtPredict.cs:281 ...\nAt ..."
}

Example (CameraErrorCode, code 3002):

JSON
{
     "Event"    : "CameraErrorCode",
     "Code"     : 3002, 
     "Message"  : "Camera not streaming"
}

Prediction Object

Event Code   Constant           Description
4000         PredictionObject   The Runtime has identified an object. Its properties can be found in the Message field as escaped JSON. See Start predict for details.

JSON
{
  "Event"       : "PredictionObject",
  "Code"        : 4000, 
  "Message"     : "<Escaped JSON>"
}

Data Stream

The data stream from the server is sent over TCP/IP. By default, the port is 3000 but you can change it with the InitializeCamera command. A client connects to this port and reads output data from the Runtime for all pixels, line by line.

When a client connects to the Data stream to obtain data, it must keep up with the data flow from the Runtime. Otherwise, the Runtime’s send queue can overflow, resulting in lost data; a SendQueueOverflow error is then returned to the client in the Event stream and logged as a warning in the Runtime log. This means that a client that isn’t interested in the data must still read it, or else close the connection to the Runtime.

All integers and floats in the stream use little-endian encoding.

Stream Header

Each packet begins with a fixed-length header:

CODE
[StreamType][FrameNumber][Timestamp][MetadataSize][DataBodySize]
  • StreamType (1 byte)

    • 1 = Raw Pixel Line

    • 2 = Prediction Lines

    • 3 = RGB Pixel Line

    • 4 = StreamStarted / EndOfStream

  • FrameNumber (64-bit integer)
    Frame number assigned by the camera.

  • Timestamp (64-bit integer)
    Number of 100-nanosecond intervals since 00:00:00 UTC on January 1, 0001.

  • MetadataSize (32-bit integer)
    Byte count of the metadata body.

  • DataBodySize (32-bit integer)
    Byte count of the data body.
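
As a sketch, the header can be read and unpacked in Python with struct (field order and little-endian encoding as described above; the 25-byte header size follows from the field widths):

PYTHON
import struct

# StreamType (1 byte), FrameNumber (int64), Timestamp (int64),
# MetadataSize (int32), DataBodySize (int32), all little-endian
HEADER_FORMAT = "<BqqII"
HEADER_SIZE = struct.calcsize(HEADER_FORMAT)  # 25 bytes

def read_exact(sock, n):
    """Read exactly n bytes from the data stream socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("Data stream closed")
        buf += chunk
    return buf

def read_packet(sock):
    """Read one packet: header, metadata body, and data body."""
    header = struct.unpack(HEADER_FORMAT, read_exact(sock, HEADER_SIZE))
    stream_type, frame_number, timestamp, metadata_size, body_size = header
    metadata = read_exact(sock, metadata_size)
    body = read_exact(sock, body_size)
    return stream_type, frame_number, timestamp, metadata, body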

Metadata body

Immediately following the header, the metadata body contains four 32-bit integer fields, each measured in 100-nanosecond units:

  • CameraProcessingTime

  • CameraDeltaTimeToPreviousFrame

  • BreezeProcessingTime

  • BreezeDeltaTimeToPreviousFrame

Data body

The contents of the data body depend on StreamType:

1. Raw Pixel Line

  • Description: Raw bytes received directly from the camera.

  • Header Timestamp: Time when the raw data arrived.

2. Prediction Lines

  • Header Timestamp: Time when the raw frame data arrived from the camera.

  • Layout: One line per array, matching the LineWidth from the LoadWorkflow reply.

  • Components:

    • Classification Vector (byte array)

      • Range: 0 (No class) to 255 (class index)

      • Example 1:

        CODE
        [0,0,0,0,1,1,1,1,0,0,0,2,2,2,2,0,0,0,0,0,3,3,3,0,0,…]
      • Example 2:

        CODE
        [0,0,1,1,1,0,1,1,0,0,0,1,1,1,1,0,0,0,0,0,1,1,1,0,0,…]

        (sample vector; data type: Byte)

    • Quantification Vector (32-bit float array)

      • Range: –MinFloat to +MaxFloat

      • Example:

        CODE
        [0,0,0,0,4.5,5.2,3.8,6.5,0,0,0,…]
    • Confidence Values (byte array)

      • Range: 5 (High) to 1 (Low), or 0 if not part of the model

        TEXT
        [0,0,0,0,3,3,3,3,0,0,0,5,5,5,5,0,0,0,0,0,1,1,1,0,0,…]
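
As an illustration, and assuming the arrays appear in the data body in Index order as defined by StreamFormat/Lines (Category lines as one byte per pixel, Property lines as 32-bit floats; confidence values are not covered by this sketch), a Prediction Lines body could be sliced like this:

PYTHON
import struct

def split_prediction_body(body, lines, line_width):
    """Slice a Prediction Lines data body into one array per descriptor.

    lines: the StreamFormat/Lines list from the LoadWorkflow reply.
    line_width: the StreamFormat/LineWidth value.
    """
    arrays, offset = {}, 0
    for line in sorted(lines, key=lambda d: d["Index"]):
        if line["Type"] == "Category":
            # One byte per pixel: a class index (0 = no class)
            arrays[line["Name"]] = list(body[offset:offset + line_width])
            offset += line_width
        else:
            # "Property": one 32-bit little-endian float per pixel
            arrays[line["Name"]] = list(
                struct.unpack_from(f"<{line_width}f", body, offset))
            offset += 4 * line_width
    return arrays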

3. Rgb Pixel Line

  • Description: Color pixel data.

  • Data Body: Triplets of bytes for red, green and blue.

  • Range: 0–255 per channel

  • Example:

    CODE
    [50,100,150], [50,45,55], [85,65,255], …

RGB pixel lines correspond to the real-time visualization in Breeze Recorder and Breeze Client, where the toolbar is used to change the displayed variable. To change the visualization variable or background blend programmatically, use SetProperty as described under Runtime Configuration below (https://prediktera.atlassian.net/wiki/spaces/BR/pages/564200816/Developers+reference+guide#RuntimeConfigurationVariable).

4. Stream Start / End

  • Header Timestamp: Time when the control message was sent.

  • Data Body: ASCII strings, either

    • StreamStarted

    • EndOfStream

Runtime Configuration

You can adjust visualization settings at runtime with the SetProperty command:

TEXT
SetProperty("Property" = "VisualizationVariable", "Value" = "<Raw | Reflectance | Absorbance | Descriptor names>")
SetProperty("Property" = "VisualizationBlend",    "Value" = "True or False")
  • VisualizationVariable: Selects which data channel to display.

  • VisualizationBlend: Enables or disables blending mode.

The CameraId parameter can be added to both visualization settings.
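
With the send_command helper from the Commands section, switching the visualization could look like this sketch:

PYTHON
send_command(sock, {
    "Command": "SetProperty",
    "Id": "vis-1",
    "Property": "VisualizationVariable",
    "Value": "Reflectance",
    "CameraId": 0,  # optional when a single camera is used
})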

Runtime Command Switches

Switch                        Description
/p:<n>                        Set the number of calculation threads. E.g. "/p:16"
/w:<Current Workspace Path>   Override the current workspace path. E.g. /w:"C:\temp\My Folder"
/e                            Execute BreezeRuntime as EventServer (see below)
/logLevel:<Level>             Override the minimum log level for console logging. E.g. /logLevel:Trace. Levels: TRACE, DEBUG, INFO (default), WARN, ERROR, FATAL
/s:<sessionId>                Log session id. E.g. /s:Nuts_Workflow

Running Breeze Runtime with Breeze Client

1. Start “BreezeRuntime.exe”, located under “C:\Program Files\Prediktera\Breeze\Runtime”. If the optional event server is used (see Event Stream), start “BreezeRuntime.exe /e” instead.

2. Start “BreezeClient.exe” from the Start menu

3. Press the “Connect” button.


4. Select workflow exported from Breeze and press “Load”


5. Start predicting on the loaded workflow by pressing “Start”


6. View real-time predictions under the “Realtime” tab


7. View Breeze Runtime status under the “Status” tab

