Stream On-Demand Platform REST API
Operations
StreamAware

StreamAware On-Demand is designed for video service providers to streamline quality assurance and control. The software ensures high-quality video by providing complete visibility across file-based workflows, while also enabling automated quality checks to verify videos meet delivery requirements. It uses the Emmy® award-winning IMAX VisionScience™ technology to provide a single, objective metric to monitor quality across the entire media supply chain and provide clear insights into its performance. The software also includes industry-standard and customizable quality checks to automate video, audio, and metadata compliance. This results in complete visibility of quality across file-based workflows, and enables automation of tasks that typically require the human eye, improving efficiency and reducing the margin for human error so that video consistently meets the highest standards.

Analyses
POST /analyses
PATCH /analyses/{id}
DELETE /analyses/{id}
Submit a new analysis
POST /analyses

To submit one or more analyses for processing, send a POST request to /analyses.

An analysis can be either full-reference or no-reference.

Full-Reference

  • Used when you want to validate the performance of your encoder, or to compare one encoder or set of settings against another

  • Can compare any number of outputs, from a single asset to a full encoding ladder and/or HLS playlist

  • Compares each subject asset against a pristine reference to provide a scored, pixel-by-pixel comparison

    A full-reference analysis requires specifying both reference and subject assets and, during its operation, the On-Demand Analyzer will first make a no-reference pass on the reference asset and then will compare each subject asset against the reference. As such, this endpoint will return an Analysis object for the no-reference analysis of the reference in addition to one for each comparison with a subject asset. Let’s consider the following example as an illustration:


    Reference Asset(s)    Subject Asset(s)
    GOT_S2_EP1.mov        GOT_S2_EP1_libx264_1920x1080_50-0.mov
                          GOT_S2_EP1_libx264_1280x720_50-0.mov
                          GOT_S2_EP1_libx264_960x540_50-0.mov

    Results in 1 no-reference analysis on the reference (testId: 1):

    • GOT_S2_EP1.mov

    Results in 3 full-reference analyses:

    • GOT_S2_EP1.mov —> GOT_S2_EP1_libx264_1920x1080_50-0.mov (testId: 1-1)
    • GOT_S2_EP1.mov —> GOT_S2_EP1_libx264_1280x720_50-0.mov (testId: 1-2)
    • GOT_S2_EP1.mov —> GOT_S2_EP1_libx264_960x540_50-0.mov (testId: 1-3)
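    Expressed as a NewAnalysis request body, the ladder above would look roughly like the following (a sketch only; the title, path, and storageLocation values are hypothetical, not part of the example assets):

    ```json
    {
      "content": {
        "title": "GOT S2 EP1 - Encoding Ladder Comparison"
      },
      "referenceAssets": [
        {
          "name": "GOT_S2_EP1.mov",
          "path": "got/s2_ep1/source",
          "storageLocation": { "type": "PVC", "name": "video-files-pvc" }
        }
      ],
      "subjectAssets": [
        {
          "name": "GOT_S2_EP1_libx264_1920x1080_50-0.mov",
          "path": "got/s2_ep1/outputs",
          "storageLocation": { "type": "PVC", "name": "video-files-pvc" }
        },
        {
          "name": "GOT_S2_EP1_libx264_1280x720_50-0.mov",
          "path": "got/s2_ep1/outputs",
          "storageLocation": { "type": "PVC", "name": "video-files-pvc" }
        },
        {
          "name": "GOT_S2_EP1_libx264_960x540_50-0.mov",
          "path": "got/s2_ep1/outputs",
          "storageLocation": { "type": "PVC", "name": "video-files-pvc" }
        }
      ]
    }
    ```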

No-Reference

  • Used when you want to validate the quality of a source or reference asset (i.e. source validation)

  • Analyzes a single asset in isolation to provide a pixel-by-pixel evaluation capable of detecting and scoring the impact of numerous video anomalies

    A no-reference analysis requires specifying one or more subject assets, each of which is analyzed in isolation. The following example illustrates a common no-reference analysis:

    Subject Asset(s)
    GOT_S2_EP1_libx264_1920x1080_50-0.mov
    GOT_S2_EP1_libx264_1280x720_50-0.mov
    GOT_S2_EP1_libx264_960x540_50-0.mov

    Results in 3 no-reference analyses:

    • GOT_S2_EP1_libx264_1920x1080_50-0.mov (testId: 1)
    • GOT_S2_EP1_libx264_1280x720_50-0.mov (testId: 2)
    • GOT_S2_EP1_libx264_960x540_50-0.mov (testId: 3)

For more details on how to configure common requests, please consult the endpoint examples below and the NewAnalysis object that forms the request.

The StreamAware On-Demand Analyzer supports full-reference analyses where the assets do not share the same frame rate, albeit with some restrictions. Please refer to the Cross-frame rate support section of the technical documentation for details.

The StreamAware On-Demand Analyzer supports a variety of video file formats. Please refer to the Supported video formats section of the technical documentation for details.

Request body

application/json

The NewAnalysis request body is used to submit any combination of reference and subject assets you wish, enabling everything from ad-hoc no-reference analyses to full-reference encoding ladder comparisons. Please consult the description above, the endpoint example and/or the NewAnalysis object for more details.

Responses

201 Created

The newly created analysis, populated with key attribute values.

Note

The response returned from this endpoint indicates only that an analysis has been successfully submitted for processing. It makes no guarantees that the analysis will execute without error, nor does it indicate anything about the content or nature of the results, if available. To discover these details, consult the Insights REST API.

Body
application/json
400 Bad Request

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 Method Not Allowed

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 Unsupported Media Type

Returned when the request specifies a content type that is not supported (e.g. XML). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 Service Unavailable

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.
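When scripting submissions against this endpoint, the status codes above split into transient and permanent failures: 503 is explicitly documented as temporary, while the 4xx codes should not be repeated without modifying the request. A minimal client-side sketch of that distinction (the function name is illustrative, not part of the API):

```shell
# Decide whether a failed POST /analyses is worth retrying.
# Returns success (0) only for 503, which the API documents as a
# temporary overload/maintenance condition.
should_retry() {
  case "$1" in
    503) return 0 ;;  # temporary: retry after a delay
    *)   return 1 ;;  # 400/403/405/415/500: fix the request or investigate
  esac
}
```

In practice this pairs with curl's `--write-out '%{http_code}'` option to capture the status code of each submission.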

Examples

Create (submit) a no-reference analysis for an asset with all licensed quality checks enabled

curl --location --request POST 'https://localhost/api/v1/analyses' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--data-raw '{
  "content": {
    "title": "Simple NR Test with Quality Checks - Big Buck Bunny"
  },
  "subjectAssets": [
    {
      "name": "Big_Buck_Bunny.mp4",
      "path": "royalty_free/big_buck_bunny/source",
      "storageLocation": {
        "type": "PVC",
        "name": "video-files-pvc"
      }
    }
  ],
  "analyzerConfig": {
    "enableBandingDetection": true,
    "qualityCheckConfig": {
      "enabled": true,
      "duration": 5,
      "skipStart": 10.25,
      "skipEnd": 10.25,
      "freezeFrame": {
        "enabled": true,
        "duration": 10
      }
    }
  }
}'

HTTP/1.1 201 Created

Content-Type: application/json

{
  "submittedAnalyses": [
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "qualityCheckConfig": {
          "blackFrame": {
            "duration": 5,
            "enabled": true,
            "skipEnd": 10.25,
            "skipStart": 10.25
          },
          "colorBarFrame": {
            "duration": 5,
            "enabled": true,
            "skipEnd": 10.25,
            "skipStart": 10.25
          },
          "duration": 5,
          "enabled": true,
          "freezeFrame": {
            "duration": 10,
            "enabled": true,
            "skipEnd": 10.25,
            "skipStart": 10.25
          },
          "missingCaptions": {
            "duration": 5,
            "enabled": true,
            "skipEnd": 10.25,
            "skipStart": 10.25
          },
          "silence": {
            "commonParameters": {
              "duration": 5,
              "enabled": true,
              "skipEnd": 10.25,
              "skipStart": 10.25
            }
          },
          "skipEnd": 10.25,
          "skipStart": 10.25,
          "solidColorFrame": {
            "duration": 5,
            "enabled": true,
            "skipEnd": 10.25,
            "skipStart": 10.25
          }
        },
        "viewingEnvironments": []
      },
      "id": "286703bc-ad9a-4f05-87e4-ffe0cce188dc",
      "subjectAsset": {
        "content": {
          "title": "Simple NR Test with Quality Checks - Big Buck Bunny"
        },
        "hdr": false,
        "name": "Big_Buck_Bunny.mp4",
        "path": "royalty_free/big_buck_bunny/source",
        "storageLocation": {
          "name": "video-files-pvc",
          "type": "PVC"
        }
      },
      "submissionTimestamp": "2022-03-25T22:04:34.601Z",
      "testId": "1"
    }
  ]
}

Create (submit) a full-reference analysis with manual temporal alignment using Asset startFrame property

curl --location --request POST 'https://localhost/api/v1/analyses' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--data-raw '{
  "content": {
    "title": "Simple FR analysis, no TA - Big Buck Bunny"
  },
  "referenceAssets": [
    {
      "name": "Big_Buck_Bunny.mp4",
      "path": "royalty_free/big_buck_bunny/source",
      "storageLocation": {
        "type": "S3",
        "name": "video-files",
        "credentials": {
          "useAssumedIAMRole": true
        }
      },
      "startFrame": 1
    }
  ],
  "subjectAssets": [
    {
      "name": "Big_Buck_Bunny_h264_qp_21.ts",
      "path": "royalty_free/big_buck_bunny/outputs",
      "storageLocation": {
        "type": "S3",
        "name": "video-files",
        "credentials": {
          "useAssumedIAMRole": true
        }
      },
      "startFrame": 1
    },
    {
      "name": "Big_Buck_Bunny_h264_qp_31.ts",
      "path": "royalty_free/big_buck_bunny/outputs",
      "storageLocation": {
        "type": "S3",
        "name": "video-files",
        "credentials": {
          "useAssumedIAMRole": true
        }
      },
      "startFrame": 1
    }
  ],
  "analyzerConfig": {
    "enableBandingDetection": true,
    "enableTemporalAlignment": false
  }
}'

HTTP/1.1 201 Created
{
  "submittedAnalyses": [
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "enableTemporalAlignment": false,
        "viewingEnvironments": []
      },
      "id": "9f7c088b-97c2-4576-9cec-da46a3f6a704",
      "subjectAsset": {
        "content": {
          "title": "Simple FR analysis, no TA - Big Buck Bunny"
        },
        "hdr": false,
        "name": "Big_Buck_Bunny.mp4",
        "path": "royalty_free/big_buck_bunny/source",
        "startFrame": 1,
        "storageLocation": {
          "name": "video-files",
          "type": "S3",
          "credentials": {
            "useAssumedIAMRole": true
          }
        }
      },
      "submissionTimestamp": "2022-03-25T22:13:42.096Z",
      "testId": "1"
    },
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "enableComplexityAnalysis": false,
        "enableTemporalAlignment": false,
        "viewingEnvironments": []
      },
      "id": "9f7c088b-97c2-4576-9cec-da46a3f6a704",
      "referenceAsset": {
        "content": {
          "title": "Simple FR analysis, no TA - Big Buck Bunny"
        },
        "hdr": false,
        "name": "Big_Buck_Bunny.mp4",
        "path": "royalty_free/big_buck_bunny/source",
        "startFrame": 1,
        "storageLocation": {
          "name": "video-files",
          "type": "S3",
          "credentials": {
            "useAssumedIAMRole": true
          }
        }
      },
      "subjectAsset": {
        "content": {
          "title": "Simple FR analysis, no TA - Big Buck Bunny"
        },
        "hdr": false,
        "name": "Big_Buck_Bunny_h264_qp_21.ts",
        "path": "royalty_free/big_buck_bunny/outputs",
        "startFrame": 1,
        "storageLocation": {
          "name": "video-files",
          "type": "S3",
          "credentials": {
            "useAssumedIAMRole": true
          }
        }
      },
      "submissionTimestamp": "2022-03-25T22:13:42.096Z",
      "testId": "1-1"
    },
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "enableComplexityAnalysis": false,
        "enableTemporalAlignment": false,
        "viewingEnvironments": []
      },
      "id": "9f7c088b-97c2-4576-9cec-da46a3f6a704",
      "referenceAsset": {
        "content": {
          "title": "Simple FR analysis, no TA - Big Buck Bunny"
        },
        "hdr": false,
        "name": "Big_Buck_Bunny.mp4",
        "path": "royalty_free/big_buck_bunny/source",
        "startFrame": 1,
        "storageLocation": {
          "name": "video-files",
          "type": "S3",
          "credentials": {
            "useAssumedIAMRole": true
          }
        }
      },
      "subjectAsset": {
        "content": {
          "title": "Simple FR analysis, no TA - Big Buck Bunny"
        },
        "hdr": false,
        "name": "Big_Buck_Bunny_h264_qp_31.ts",
        "path": "royalty_free/big_buck_bunny/outputs",
        "startFrame": 1,
        "storageLocation": {
          "name": "video-files",
          "type": "S3",
          "credentials": {
            "useAssumedIAMRole": true
          }
        }
      },
      "submissionTimestamp": "2022-03-25T22:13:42.096Z",
      "testId": "1-2"
    }
  ]
}

Create (submit) a no-reference analysis for a single raw (.yuv) asset

curl --location --request POST 'https://localhost/api/v1/analyses' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--data-raw '{
  "content": {
    "title": "HoneyBee - Raw/YUV"
  },
  "subjectAssets": [
    {
      "name": "HoneyBee_3840x2160_120fps_420_10bit_YUV.yuv",
      "path": "royalty_free/yuv",
      "storageLocation": {
        "type": "PVC",
        "name": "video-files-pvc-2"
      },
      "rawVideoParameters": {
        "resolution": {
          "width": 3840,
          "height": 2160
        },
        "fps": 24,
        "scanType": "P",
        "pixelFormat": "YUV420P"
      }
    }
  ],
  "analyzerConfig": {
    "enableBandingDetection": true
  }
}'

HTTP/1.1 201 Created 
{
  "submittedAnalyses": [
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "viewingEnvironments": []
      },
      "id": "7575ea3b-8d6d-4768-9227-b57814fec75f",
      "subjectAsset": {
        "content": {
          "title": "HoneyBee - Raw/YUV"
        },
        "hdr": false,
        "name": "HoneyBee_3840x2160_120fps_420_10bit_YUV.yuv",
        "path": "royalty_free/yuv",
        "rawVideoParameters": {
          "fieldOrder": "TFF",
          "fps": 24,
          "pixelFormat": "YUV420P",
          "resolution": {
            "height": 2160,
            "width": 3840
          },
          "scanType": "P"
        },
        "storageLocation": {
          "name": "video-files-pvc-2",
          "type": "PVC"
        }
      },
      "submissionTimestamp": "2022-03-26T14:41:47.169Z",
      "testId": "1"
    }
  ]
}

Create (submit) a full-reference analysis for an HLS asset

curl --location --request POST 'https://localhost/api/v1/analyses' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--data-raw '{
  "content": {
    "title": "Soccer Video - HLS - All bandwidths"
  },
  "subjectAssets": [
    {
      "name": "Soccer.m3u8",
      "storageLocation": {
        "name": "http://172.31.64.201:8084",
        "type": "HTTP"
      },
      "path": "Soccer_MWC"
    }
  ],
  "referenceAssets": [
    {
      "name": "Soccer_1min.mp4",
      "storageLocation": {
        "type": "PVC",
        "name": "video-files-pvc"
      },
      "path": "hlsReferences"
    }
  ],
  "analyzerConfig": {
    "enableBandingDetection": true
  }
}'

HTTP/1.1 201 Created 

Content-Type: application/json

{
  "submittedAnalyses": [
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "viewingEnvironments": []
      },
      "id": "4d8eea6b-c530-44aa-83e8-717e0b618113",
      "subjectAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer_1min.mp4",
        "path": "hlsReferences",
        "storageLocation": {
          "name": "video-files-pvc",
          "type": "PVC"
        }
      },
      "submissionTimestamp": "2022-03-26T15:24:08.072Z",
      "testId": "1"
    },
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "enableComplexityAnalysis": false,
        "viewingEnvironments": []
      },
      "id": "4d8eea6b-c530-44aa-83e8-717e0b618113",
      "referenceAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer_1min.mp4",
        "path": "hlsReferences",
        "storageLocation": {
          "name": "video-files-pvc",
          "type": "PVC"
        }
      },
      "subjectAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer.m3u8",
        "path": "Soccer_MWC",
        "storageLocation": {
          "name": "http://172.31.64.201:8084",
          "type": "HTTP"
        },
        "streamIdentifier": {
          "bandwidth": 1713000,
          "type": "HLSVariantIdentifier"
        }
      },
      "submissionTimestamp": "2022-03-26T15:24:08.072Z",
      "testId": "1-1"
    },
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "enableComplexityAnalysis": false,
        "viewingEnvironments": []
      },
      "id": "4d8eea6b-c530-44aa-83e8-717e0b618113",
      "referenceAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer_1min.mp4",
        "path": "hlsReferences",
        "storageLocation": {
          "name": "video-files-pvc",
          "type": "PVC"
        }
      },
      "subjectAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer.m3u8",
        "path": "Soccer_MWC",
        "storageLocation": {
          "name": "http://172.31.64.201:8084",
          "type": "HTTP"
        },
        "streamIdentifier": {
          "bandwidth": 280000,
          "type": "HLSVariantIdentifier"
        }
      },
      "submissionTimestamp": "2022-03-26T15:24:08.072Z",
      "testId": "1-2"
    },
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "enableComplexityAnalysis": false,
        "viewingEnvironments": []
      },
      "id": "4d8eea6b-c530-44aa-83e8-717e0b618113",
      "referenceAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer_1min.mp4",
        "path": "hlsReferences",
        "storageLocation": {
          "name": "video-files-pvc",
          "type": "PVC"
        }
      },
      "subjectAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer.m3u8",
        "path": "Soccer_MWC",
        "storageLocation": {
          "name": "http://172.31.64.201:8084",
          "type": "HTTP"
        },
        "streamIdentifier": {
          "bandwidth": 19987000,
          "type": "HLSVariantIdentifier"
        }
      },
      "submissionTimestamp": "2022-03-26T15:24:08.072Z",
      "testId": "1-3"
    },
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "enableComplexityAnalysis": false,
        "viewingEnvironments": []
      },
      "id": "4d8eea6b-c530-44aa-83e8-717e0b618113",
      "referenceAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer_1min.mp4",
        "path": "hlsReferences",
        "storageLocation": {
          "name": "video-files-pvc",
          "type": "PVC"
        }
      },
      "subjectAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer.m3u8",
        "path": "Soccer_MWC",
        "storageLocation": {
          "name": "http://172.31.64.201:8084",
          "type": "HTTP"
        },
        "streamIdentifier": {
          "bandwidth": 582000,
          "type": "HLSVariantIdentifier"
        }
      },
      "submissionTimestamp": "2022-03-26T15:24:08.072Z",
      "testId": "1-4"
    },
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "enableComplexityAnalysis": false,
        "viewingEnvironments": []
      },
      "id": "4d8eea6b-c530-44aa-83e8-717e0b618113",
      "referenceAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer_1min.mp4",
        "path": "hlsReferences",
        "storageLocation": {
          "name": "video-files-pvc",
          "type": "PVC"
        }
      },
      "subjectAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer.m3u8",
        "path": "Soccer_MWC",
        "storageLocation": {
          "name": "http://172.31.64.201:8084",
          "type": "HTTP"
        },
        "streamIdentifier": {
          "bandwidth": 9561000,
          "type": "HLSVariantIdentifier"
        }
      },
      "submissionTimestamp": "2022-03-26T15:24:08.072Z",
      "testId": "1-5"
    },
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "enableComplexityAnalysis": false,
        "viewingEnvironments": []
      },
      "id": "4d8eea6b-c530-44aa-83e8-717e0b618113",
      "referenceAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer_1min.mp4",
        "path": "hlsReferences",
        "storageLocation": {
          "name": "video-files-pvc",
          "type": "PVC"
        }
      },
      "subjectAsset": {
        "content": {
          "title": "Soccer Video - HLS - All bandwidths"
        },
        "hdr": false,
        "name": "Soccer.m3u8",
        "path": "Soccer_MWC",
        "storageLocation": {
          "name": "http://172.31.64.201:8084",
          "type": "HTTP"
        },
        "streamIdentifier": {
          "bandwidth": 3042000,
          "type": "HLSVariantIdentifier"
        }
      },
      "submissionTimestamp": "2022-03-26T15:24:08.072Z",
      "testId": "1-6"
    }
  ]
}

Create (submit) a no-reference analysis for an IMF asset

curl --location --request POST 'https://localhost/api/v1/analyses' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--data-raw '{
  "content": {
    "title": "Configure analysis with an IMF asset"
  },
  "subjectAssets": [
    {
      "name": "CPL_DTC-Master-SDR-ML5-R1-OV.xml",
      "path": "/videos/imf/DTC-Master-SDR-ML5-R1-OV",
      "storageLocation": {
        "name": "videos",
        "type": "PVC"
      }
    }
  ]
}'

HTTP/1.1 201 Created 
{
  "submittedAnalyses": [
    {
      "id": "756fb026-f4f4-47d8-ae8f-afd239643a55",
      "subjectAsset": {
        "content": {
          "title": "Configure analysis with an IMF asset"
        },
        "hdr": false,
        "name": "CPL_DTC-Master-SDR-ML5-R1-OV.xml",
        "path": "/videos/imf/DTC-Master-SDR-ML5-R1-OV",
        "storageLocation": {
          "name": "videos",
          "type": "PVC"
        }
      },
      "submissionTimestamp": "2022-03-26T15:54:27.731Z",
      "testId": "1"
    }
  ]
}

Create (submit) a no-reference analysis for an Image Sequence asset

curl --location --request POST 'https://localhost/api/v1/analyses' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--data-raw '{
  "content": {
    "title": "Image Sequence"
  },
  "subjectAssets": [
    {
      "name": "frameIndex%d-test.png",
      "path": "compressed-videos/image-sequence/png",
      "storageLocation": {
        "type": "S3",
        "name": "imax-compressed-videos"
      },
      "imageSequenceParameters": {
        "fps": 25
      }
    }
  ],
  "analyzerConfig": {
    "enableBandingDetection": true
  }
}'

HTTP/1.1 201 Created 
{
  "submittedAnalyses": [
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "viewingEnvironments": []
      },
      "id": "c096a553-ef1a-441c-ab09-bf56c28e7704",
      "subjectAsset": {
        "content": {
          "title": "Image Sequence"
        },
        "hdr": false,
        "imageSequenceParameters": {
          "fps": 25
        },
        "name": "frameIndex%d-test.png",
        "path": "compressed-videos/image-sequence/png",
        "storageLocation": {
          "name": "imax-compressed-videos",
          "type": "S3"
        }
      },
      "submissionTimestamp": "2022-03-26T15:54:27.731Z",
      "testId": "1"
    }
  ]
}

Create (submit) a full-reference analysis with score-based quality checks

curl --location --request POST 'https://localhost/api/v1/analyses' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--data-raw '{
  "content": {
    "title": "FR Analysis With Score-Based Quality Checks"
  },
  "subjectAssets": [
    {
      "name": "Big_Buck_Bunny_h264_qp_21.ts",
      "path": "royalty_free/big_buck_bunny/outputs",
      "storageLocation": {
        "type": "S3",
        "name": "video-files",
        "credentials": {
          "useAssumedIAMRole": true
        }
      },
      "qualityCheckConfig": {
        "scoreChecks": [
          {
            "metric": "SVS",
            "threshold": 80,
            "durationSeconds": 5,
            "skipStart": 1.25,
            "skipEnd": 1.25,
            "viewingEnvironmentIndex": 0
          },
          {
            "metric": "SVS",
            "threshold": 60,
            "durationSeconds": 2,
            "skipStart": 1.25,
            "skipEnd": 1.25,
            "viewingEnvironmentIndex": 0
          },
          {
            "metric": "SBS",
            "threshold": 75,
            "durationFrames": 48,
            "skipStart": 1.25,
            "skipEnd": 1.25,
            "viewingEnvironmentIndex": 0
          }
        ]
      }
    }
  ],
  "referenceAssets": [
    {
      "name": "Big_Buck_Bunny.mp4",
      "path": "royalty_free/big_buck_bunny/source",
      "storageLocation": {
        "type": "S3",
        "name": "video-files",
        "credentials": {
          "useAssumedIAMRole": true
        }
      }
    }
  ],
  "analyzerConfig": {
    "viewingEnvironments": [
      {
        "device": {
          "name": "oled65c9pua"
        },
        "viewerType": "EXPERT"
      },
      {
        "device": {
          "name": "xl2420t"
        },
        "viewerType": "TYPICAL"
      }
    ]
  }
}'

HTTP/1.1 201 Created
{
  "submittedAnalyses": [
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "viewingEnvironments": [
          {
            "device": {
              "name": "oled65c9pua"
            },
            "viewerType": "EXPERT"
          },
          {
            "device": {
              "name": "xl2420t"
            },
            "viewerType": "TYPICAL"
          }
        ]
      },
      "id": "944ade76-645a-4826-b500-3267fb1668f1",
      "subjectAsset": {
        "content": {
          "title": "FR Analysis With Score-Based Quality Checks"
        },
        "hdr": false,
        "name": "Big_Buck_Bunny_h264_qp_21.ts",
        "path": "royalty_free/big_buck_bunny/outputs",
        "storageLocation": {
          "type": "S3",
          "name": "video-files",
          "credentials": {
            "useAssumedIAMRole": true
          }
        }
      },
      "submissionTimestamp": "2022-06-30T15:48:40.648Z",
      "testId": "1"
    },
    {
      "analyzerConfig": {
        "additionalConfigurationOptions": {},
        "enableBandingDetection": true,
        "enableComplexityAnalysis": false,
        "viewingEnvironments": [
          {
            "device": {
              "name": "oled65c9pua"
            },
            "viewerType": "EXPERT"
          },
          {
            "device": {
              "name": "xl2420t"
            },
            "viewerType": "TYPICAL"
          }
        ]
      },
      "id": "944ade76-645a-4826-b500-3267fb1668f1",
      "referenceAsset": {
        "content": {
          "title": "FR Analysis With Score-Based Quality Checks"
        },
        "hdr": false,
        "name": "Big_Buck_Bunny.mp4",
        "path": "royalty_free/big_buck_bunny/source",
        "storageLocation": {
          "type": "S3",
          "name": "video-files",
          "credentials": {
            "useAssumedIAMRole": true
          }
        }
      },
      "subjectAsset": {
        "content": {
          "title": "FR Analysis With Score-Based Quality Checks"
        },
        "hdr": false,
        "name": "Big_Buck_Bunny_h264_qp_21.ts",
        "path": "royalty_free/big_buck_bunny/outputs",
        "qualityCheckConfig": {
          "scoreChecks": [
            {
              "durationSeconds": 5,
              "metric": "SVS",
              "reverseThresholdComparison": false,
              "skipEnd": 1.25,
              "skipStart": 1.25,
              "threshold": 80,
              "viewingEnvironmentIndex": 0
            },
            {
              "durationSeconds": 2,
              "metric": "SVS",
              "reverseThresholdComparison": false,
              "skipEnd": 1.25,
              "skipStart": 1.25,
              "threshold": 60,
              "viewingEnvironmentIndex": 0
            },
            {
              "durationFrames": 48,
              "metric": "SBS",
              "reverseThresholdComparison": false,
              "skipEnd": 1.25,
              "skipStart": 1.25,
              "threshold": 75,
              "viewingEnvironmentIndex": 0
            }
          ]
        },
        "storageLocation": {
          "type": "S3",
          "name": "video-files",
          "credentials": {
            "useAssumedIAMRole": true
          }
        }
      },
      "submissionTimestamp": "2022-06-30T15:48:40.648Z",
      "testId": "1-1"
    }
  ]
}

Create (submit) a no-reference analysis for a Dolby Vision asset with a metadata sidecar

curl -X POST "https://localhost/api/v1/analyses"  \
 -H "Content-Type: application/json"  \
 -d '{
    "subjectAssets": [
        {
            "content": {
                "title": "Sparks - Dolby Vision"
            },
            "name": "20161103_1023_SPARKS_4K_P3_PQ_4000nits_DoVi.mxf",
            "sidecars": [
                {
                    "type": "DOLBY_VISION_METADATA",
                    "name": "20161103_SPARKS_DOVI_METADATA_AR_CORRECT.xml"
                }
            ],
            "path": "/mnt/nas/videos",
            "storageLocation": {
                "type": "PVC",
                "name": "videos"
            },
            "dynamicRange": "HDR"
        }
    ],
    "analyzerConfig": {
        "enableBandingDetection": true
    }
}'

Create (submit) a full-reference analysis with audio-based quality checks

curl --location --request POST 'https://localhost/api/v1/analyses' \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--data-raw '{
    "content": {
        "title": "FR Analysis With Audio-Based Quality Checks"
    },
    "referenceAssets": [
        {
            "content": {
                "title": "Big Buck Bunny"
            },
            "assetUri": "s3://my-bucket-name/test/Big_Buck_Bunny_1080p_ref.mp4",
            "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
            "path": "/mnt/nas/videos",
            "storageLocation": {
                "type": "S3",
                "name": "videos",
                "credentials": {
                    "useAssumedIAMRole": true
                }
            },
            "audio": {
                "groups": [
                    {
                        "qualityCheckConfig": {
                            "loudnessChecks": [
                                {
                                    "type": "MAX_TRUE_PEAK_LEVEL",
                                    "enabled": true,
                                    "duration": 1,
                                    "skipStart": 1.25,
                                    "skipEnd": 1.25,
                                    "threshold": -2
                                },
                                {
                                    "type": "MIN_LOUDNESS_RANGE",
                                    "enabled": true,
                                    "threshold": 5                   
                                },
                                {
                                    "type": "MAX_LOUDNESS_RANGE",
                                    "enabled": true,
                                    "threshold": 25                   
                                }
                            ]
                        },
                        "loudnessMeasurements": {
                            "algorithm": "ITU_R_BS_1770_3",
                            "enabled": true
                        }
                    }
                ]
            }
        }
    ],
    "subjectAssets": [
        {
            "content": {
                "title": "Big Buck Bunny"
            },
            "assetUri": "s3://my-bucket-name/test/Big_Buck_Bunny_1080p_test.mp4",
            "name": "Big_Buck_Bunny_1080p@4000kbps.mp4",
            "path": "/mnt/nas/videos",
            "storageLocation": {
                "type": "S3",
                "name": "videos",
                "credentials": {
                    "useAssumedIAMRole": true
                }
            },
            "audio": {
                "groups": [
                    {
                        "qualityCheckConfig": {
                            "loudnessChecks": [
                                {
                                    "type": "MAX_TRUE_PEAK_LEVEL",
                                    "enabled": true,
                                    "duration": 1,
                                    "skipStart": 1.25,
                                    "skipEnd": 1.25,
                                    "threshold": -2
                                },
                                {
                                    "type": "MIN_LOUDNESS_RANGE",
                                    "enabled": true,
                                    "threshold": 5                   
                                },
                                {
                                    "type": "MAX_LOUDNESS_RANGE",
                                    "enabled": true,
                                    "threshold": 25                   
                                }
                            ]
                        },
                        "loudnessMeasurements": {
                            "algorithm": "ITU_R_BS_1770_3",
                            "enabled": true
                        }
                    }
                ]
            }
        }
    ],
    "analyzerConfig": {
        "enableBandingDetection": true
    }
}'

HTTP/1.1 201 Created 

Content-Type: application/json

{
    "submittedAnalyses": [
        {
            "id": "04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e",
            "content": {
                "title": "FR Analysis With Audio-Based Quality Checks"
            },
            "referenceAssets": [
                {
                    "content": {
                        "title": "Big Buck Bunny"
                    },
                    "assetUri": "s3://my-bucket-name/test/Big_Buck_Bunny_1080p_ref.mp4",
                    "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
                    "path": "/mnt/nas/videos",
                    "storageLocation": {
                        "type": "S3",
                        "name": "videos",
                        "credentials": {
                            "useAssumedIAMRole": true
                        }
                    },
                    "audio": {
                        "groups": [
                            {
                                "qualityCheckConfig": {
                                    "loudnessChecks": [
                                        {
                                            "type": "MAX_TRUE_PEAK_LEVEL",
                                            "enabled": true,
                                            "duration": 1,
                                            "skipStart": 1.25,
                                            "skipEnd": 1.25,
                                            "threshold": -2
                                        },
                                        {
                                            "type": "MIN_LOUDNESS_RANGE",
                                            "enabled": true,
                                            "threshold": 5                   
                                        },
                                        {
                                            "type": "MAX_LOUDNESS_RANGE",
                                            "enabled": true,
                                            "threshold": 25                   
                                        }
                                    ]
                                },
                                "loudnessMeasurements": {
                                    "algorithm": "ITU_R_BS_1770_3",
                                    "enabled": true
                                }
                            }
                        ]
                    }
                }
            ],
            "subjectAssets": [
                {
                    "content": {
                        "title": "Big Buck Bunny"
                    },
                    "assetUri": "s3://my-bucket-name/test/Big_Buck_Bunny_1080p_test.mp4",
                    "name": "Big_Buck_Bunny_1080p@4000kbps.mp4",
                    "path": "/mnt/nas/videos",
                    "storageLocation": {
                        "type": "S3",
                        "name": "videos",
                        "credentials": {
                            "useAssumedIAMRole": true
                        }
                    },
                    "audio": {
                        "groups": [
                            {
                                "qualityCheckConfig": {
                                    "loudnessChecks": [
                                        {
                                            "type": "MAX_TRUE_PEAK_LEVEL",
                                            "enabled": true,
                                            "duration": 1,
                                            "skipStart": 1.25,
                                            "skipEnd": 1.25,
                                            "threshold": -2
                                        },
                                        {
                                            "type": "MIN_LOUDNESS_RANGE",
                                            "enabled": true,
                                            "threshold": 5                   
                                        },
                                        {
                                            "type": "MAX_LOUDNESS_RANGE",
                                            "enabled": true,
                                            "threshold": 25                   
                                        }
                                    ]
                                },
                                "loudnessMeasurements": {
                                    "algorithm": "ITU_R_BS_1770_3",
                                    "enabled": true
                                }
                            }
                        ]
                    }
                }
            ],
            "analyzerConfig": {
                "enableBandingDetection": true
            },
            "submissionError": "",
            "submissionTimestamp": "2018-01-01T14:20:22Z",
            "testId": "1-1"
        }
    ]
}
Cancel an analysis
PATCH /analyses/{id}

To update an existing analysis, send a PATCH request to /analyses/{id} where id is the UUID of the analysis to update.

Please see the AnalysisPatchRequest schema to understand the options supported by the analysis update operation.

Path variables

id
string uuid required

The UUID of the analysis to be cancelled.

Min length: 36
Max length: 36
Example:
04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e

Request body

application/json

Responses

200 OK

The analysis was successfully patched/updated

404 404

The server cannot find the requested resource.

400 400

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 403

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 405

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415

Used when the request is asking for a content-type that is not supported (i.e. XML when you only support JSON). The IMAX Stream On-Demand Platform API currently only supports the JSON content-type (i.e. application/json).

500 500

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 503

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

Cancelling an analysis

curl -X PATCH "https://localhost/api/v1/analyses/04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e"  \
 -H "Content-Type: application/json"  \
 -d '{
    "status": "CANCELLED"
}' 

HTTP/1.1 200 OK
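
The same request can be assembled programmatically. The sketch below is a hypothetical client helper (not part of the API) that validates the 36-character UUID path variable locally before the PATCH request is sent:

```python
import json
import uuid

BASE_URL = "https://localhost/api/v1"  # adjust to your deployment

def build_cancel_request(analysis_id):
    """Validate the analysis UUID and build the PATCH URL and JSON body.

    The API requires a path variable of exactly 36 characters; parsing
    with uuid.UUID() rejects malformed values before any request is made.
    """
    if len(analysis_id) != 36:
        raise ValueError("analysis id must be exactly 36 characters")
    uuid.UUID(analysis_id)  # raises ValueError if not a valid UUID
    url = f"{BASE_URL}/analyses/{analysis_id}"
    body = json.dumps({"status": "CANCELLED"})
    return url, body

url, body = build_cancel_request("04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e")
```

The resulting `url` and `body` correspond to the curl invocation above.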
Delete an analysis
DELETE /analyses/{id}

To delete an analysis, send a DELETE request to /analyses/{id} where id is the UUID of the analysis to delete.

Only analyses that have been previously cancelled or completed can be deleted.
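
A client may want to enforce this precondition before issuing the request. The sketch below is illustrative only: "CANCELLED" mirrors the status value used by the cancel operation, while "COMPLETED" is an assumed name for the completed state, which this document does not spell out.

```python
# Hypothetical client-side guard; "COMPLETED" is an assumed status name.
DELETABLE_STATUSES = {"CANCELLED", "COMPLETED"}

def can_delete(status):
    """Return True if an analysis in this status may be deleted."""
    return status.upper() in DELETABLE_STATUSES
```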

Path variables

id
string uuid required

The UUID of the analysis to be deleted.

Min length: 36
Max length: 36
Example:
04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e

Responses

200 OK

The analysis deletion request was successfully processed

404 404

The server cannot find the requested resource.

400 400

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 403

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 405

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415

Used when the request is asking for a content-type that is not supported (i.e. XML when you only support JSON). The IMAX Stream On-Demand Platform API currently only supports the JSON content-type (i.e. application/json).

500 500

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 503

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

Deleting an analysis

curl -X DELETE "https://localhost/api/v1/analyses/04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e" 

HTTP/1.1 200 OK
Frame captures and maps
POST /frames
POST /bandingMaps
POST /qualityMaps
POST /colorDifferenceMaps
Create frame capture
POST /frames

Creates a frame capture for a given asset.

Request body

application/json

Responses

201 201

A PNG image that represents the frame capture.

Body
image/png
string binary
400 400

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 403

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 405

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415

Used when the request is asking for a content-type that is not supported (i.e. XML when you only support JSON). The IMAX Stream On-Demand Platform API currently only supports the JSON content-type (i.e. application/json).

500 500

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 503

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

curl -X POST "https://localhost/api/v1/frames"  \
 -H "Content-Type: application/json"  \
 -d '{
    "type": "FrameRequest",
    "asset": {
        "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
        "path": "/mnt/nas/videos",
        "storageLocation": {
            "name": "/videos",
            "type": "S3"
        }
    },
    "startFrame": {
        "type": "PTS",
        "value": 400
    },
    "additionalFrames": 24
}'
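
The request body above can also be built as a small helper. This sketch mirrors the documented FrameRequest shape; the function name and parameters are hypothetical conveniences, and the `startFrame` locator uses the PTS variant (the FrameIndex variant shown in the banding map example works the same way):

```python
import json

def frame_request(name, path, storage_name, storage_type, pts, additional_frames=0):
    """Assemble a FrameRequest body matching the documented example."""
    return {
        "type": "FrameRequest",
        "asset": {
            "name": name,
            "path": path,
            "storageLocation": {"name": storage_name, "type": storage_type},
        },
        "startFrame": {"type": "PTS", "value": pts},
        "additionalFrames": additional_frames,
    }

payload = frame_request(
    "Big_Buck_Bunny_1080p@5000kbps.mp4", "/mnt/nas/videos", "/videos", "S3", 400, 24
)
print(json.dumps(payload, indent=2))
```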
Create banding map
POST /bandingMaps

Creates the banding map for a frame of a given asset.

Banding Maps measure color banding presence at a pixel level as viewed by an “expert” on an OLED TV using a no-reference approach. The map is generated as part of one of several steps used in computing an IMAX Banding Score (XBS). The banding map is a binary map with white pixels showing banding presence, and does not reflect pixel-level variations in banding impairment visibility.

Request body

application/json

Responses

201 201

A PNG image that represents the frame’s banding map.

Body
image/png
string binary
400 400

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 403

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 405

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415

Used when the request is asking for a content-type that is not supported (i.e. XML when you only support JSON). The IMAX Stream On-Demand Platform API currently only supports the JSON content-type (i.e. application/json).

500 500

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 503

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

curl -X POST "https://localhost/api/v1/bandingMaps"  \
 -H "Content-Type: application/json"  \
 -d '{
    "type": "FrameRequest",
    "asset": {
        "name": "Big_Buck_Bunny_1080p@5000kbps.m3u8",
        "path": "/mnt/nas/videos",
        "storageLocation": {
            "name": "/videos",
            "type": "S3"
        },
        "streamIdentifier": {
            "type": "HLSVariantIdentifier",
            "bandwidth": 4997885,
            "fallbackStreamIndex": 1
        }
    },
    "startFrame": {
        "type": "FrameIndex",
        "value": 1200
    },
    "additionalFrames": 24
}'
Create quality map
POST /qualityMaps

Creates the quality map for a frame of a given asset.

Quality Maps are grayscale presentations of pixel-level perceptual quality that show the spatial distribution of impairments within a frame, illustrating where impairments occur at the pixel level and providing the reasoning behind the quality score. Dark pixels show impairments relative to the reference file; less important areas, such as the region around text, may contain more white pixels. Generally, the darker the image, the lower the score.

Request body

application/json

Responses

201 201

A PNG image that represents the subjectAsset frame’s quality map.

Body
image/png
string binary
400 400

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 403

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 405

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415

Used when the request is asking for a content-type that is not supported (i.e. XML when you only support JSON). The IMAX Stream On-Demand Platform API currently only supports the JSON content-type (i.e. application/json).

500 500

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 503

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

curl -X POST "https://localhost/api/v1/qualityMaps"  \
 -H "Content-Type: application/json"  \
 -d '{
    "type": "FullReferenceFrameRequest",
    "asset": {
        "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
        "path": "/mnt/nas/videos",
        "content": {
            "title": "Big Buck Bunny"
        },
        "storageLocation": {
            "type": "S3",
            "name": "/videos"
        }
    },
    "startFrame": {
        "type": "PTS",
        "value": 1400
    },
    "reference": {
        "name": "Big_Buck_Bunny.mp4",
        "path": "/mnt/nas/videos/sources",
        "storageLocation": {
            "type": "S3",
            "name": "/videos"
        }
    },
    "referenceStartFrame": {
        "type": "PTS",
        "value": 1200
    },
    "additionalFrames": 24
}'
Create color difference map
POST /colorDifferenceMaps

Creates a color volume difference map for a frame of a given asset.

Color volume difference maps are grayscale maps that illustrate pixel-level color and skin tone deviation with respect to the reference file. Brighter pixels correspond to higher deviation.

Request body

application/json
Examples

A JSON body payload for requesting a color difference map between a subject and reference asset.

{
    "type": "FullReferenceFrameRequest",
    "asset": {
        "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
        "path": "/mnt/nas/videos",
        "content": {
            "title": "Big Buck Bunny"
        },
        "storageLocation": {
            "type": "S3",
            "name": "/videos"
        }
    },
    "startFrame": {
        "type": "PTS",
        "value": 1400
    },
    "reference": {
        "name": "Big_Buck_Bunny.mp4",
        "path": "/mnt/nas/videos/sources",
        "storageLocation": {
            "type": "S3",
            "name": "/videos"
        }
    },
    "referenceStartFrame": {
        "type": "PTS",
        "value": 1200
    },
    "additionalFrames": 24
}

Responses

201 201

A PNG image that represents the subjectAsset frame’s color difference map.

Body
image/png
string binary
400 400

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 403

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 405

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415

Used when the request is asking for a content-type that is not supported (i.e. XML when you only support JSON). The IMAX Stream On-Demand Platform API currently only supports the JSON content-type (i.e. application/json).

500 500

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 503

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

curl -X POST "https://localhost/api/v1/colorDifferenceMaps"  \
 -H "Content-Type: application/json"  \
 -d '{
    "type": "FullReferenceFrameRequest",
    "asset": {
        "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
        "path": "/mnt/nas/videos",
        "content": {
            "title": "Big Buck Bunny"
        },
        "storageLocation": {
            "type": "S3",
            "name": "/videos"
        }
    },
    "startFrame": {
        "type": "PTS",
        "value": 1400
    },
    "reference": {
        "name": "Big_Buck_Bunny.mp4",
        "path": "/mnt/nas/videos/sources",
        "storageLocation": {
            "type": "S3",
            "name": "/videos"
        }
    },
    "referenceStartFrame": {
        "type": "PTS",
        "value": 1200
    },
    "additionalFrames": 24
}'
Create frame captures and maps
POST /captures

Creates and caches all possible captures (frames and maps) for the supplied asset(s). Use this endpoint if you expect to retrieve more than one type of capture (e.g. frame, banding map, quality map) for a given frame and want the system to pre-fetch and cache these images, reducing wait times on subsequent requests to any of the frame capture endpoints.

Request body

application/json

Responses

201 Created

A PNG image that represents the frame capture.

Body
image/png
string binary
400 400

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 403

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 405

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415

Used when the request is asking for a content-type that is not supported (i.e. XML when you only support JSON). The IMAX Stream On-Demand Platform API currently only supports the JSON content-type (i.e. application/json).

500 500

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 503

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

Creates all captures for the supplied asset and reference and returns the frame (i.e. content) in the response.

curl -X POST "https://localhost/api/v1/captures"  \
 -H "Content-Type: application/json"  \
 -d '{
    "frameRequest": {
        "type": "FullReferenceFrameRequest",
        "asset": {
            "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
            "path": "/mnt/nas/videos",
            "content": {
                "title": "Big Buck Bunny"
            },
            "storageLocation": {
                "type": "S3",
                "name": "/videos"
            }
        },
        "startFrame": {
            "type": "PTS",
            "value": 1400
        },
        "reference": {
            "name": "Big_Buck_Bunny.mp4",
            "path": "/mnt/nas/videos/sources",
            "storageLocation": {
                "type": "S3",
                "name": "/videos"
            }
        },
        "referenceStartFrame": {
            "type": "PTS",
            "value": 1200
        },
        "additionalFrames": 24
    },
    "requestedCaptureType": "FRAME"
}'

Creates all captures for the supplied asset and returns the banding map in the response.

curl -X POST "https://localhost/api/v1/captures"  \
 -H "Content-Type: application/json"  \
 -d '{
    "frameRequest": {
        "type": "FrameRequest",
        "asset": {
            "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
            "path": "/mnt/nas/videos",
            "content": {
                "title": "Big Buck Bunny"
            },
            "storageLocation": {
                "type": "S3",
                "name": "/videos"
            }
        },
        "startFrame": {
            "type": "PTS",
            "value": 1400
        },
        "additionalFrames": 24
    },
    "requestedCaptureType": "BANDING_MAP"
}'
Get the frame and map cache details
GET /cache

Retrieves the frame and map cache details.

Responses

200 OK
Body
application/json
400 400

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 403

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 405

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415

Used when the request is asking for a content-type that is not supported (i.e. XML when you only support JSON). The IMAX Stream On-Demand Platform API currently only supports the JSON content-type (i.e. application/json).

500 500

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 503

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

curl -X GET "https://localhost/api/v1/cache" 

HTTP/1.1 200 OK  
 
Content-Type: application/json 

{
    "caches": [
        {
            "cacheId": "533db73-0f9a-4805-9651-c5dcd519dc37",
            "numberOfFiles": 15182,
            "sizeOfFiles": 1073741824,
            "humanReadableSizeOfFiles": "1.0 G"
        }
    ]
}
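
The response pairs `sizeOfFiles` (a byte count) with `humanReadableSizeOfFiles`. The sketch below shows how such a value can be derived client-side; the unit breakpoints and one-decimal rounding are assumptions inferred from the sample value 1073741824 → "1.0 G", since the server's exact formatting rules are not specified here.

```python
def human_readable(size_bytes):
    """Format a byte count the way the example response does (e.g. "1.0 G").

    Assumes 1024-based units and one-decimal rounding; this mirrors the
    sample response but is not a documented server guarantee.
    """
    units = ["B", "K", "M", "G", "T"]
    size = float(size_bytes)
    for unit in units:
        if size < 1024 or unit == units[-1]:
            return f"{size:.1f} {unit}"
        size /= 1024
```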
Clear the frame and map cache
DELETE /cache

Clear the cache used to store the frames and maps (banding, quality, color volume difference) that have been previously requested (and cached).

Request parameters

beforeTimestamp
string date-time optional

The UTC date and time before which cached frames and/or maps will be deleted

Example:
2022-09-15T17:32:28Z

Responses

200 OK
404 404

The server cannot find the requested resource.

400 400

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 403

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 405

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415

Used when the request is asking for a content-type that is not supported (i.e. XML when you only support JSON). The IMAX Stream On-Demand Platform API currently only supports the JSON content-type (i.e. application/json).

500 500

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 503

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

Delete all frames/maps up to a given date-time

curl -X DELETE "https://localhost/api/v1/cache?beforeTimestamp=2022-09-15T17:32:28Z" 

HTTP/1.1 200 OK

Delete all frames/maps

curl -X DELETE "https://localhost/api/v1/cache" 

HTTP/1.1 200 OK
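
A client building these requests may want to percent-encode the optional `beforeTimestamp` query parameter. The helper below is a hypothetical sketch (the curl examples above pass the timestamp unencoded, which many servers also accept):

```python
from urllib.parse import urlencode

BASE_URL = "https://localhost/api/v1"  # adjust to your deployment

def cache_delete_url(before_timestamp=None):
    """Build the DELETE /cache URL, optionally limiting the purge to
    entries cached before the given UTC date-time (ISO 8601)."""
    url = f"{BASE_URL}/cache"
    if before_timestamp:
        url += "?" + urlencode({"beforeTimestamp": before_timestamp})
    return url
```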
StreamSmart

StreamSmart On-Demand overlays on your existing encoding workflow and uses the most accurate measure of video quality, IMAX VisionScience™, to retain the same visual quality of experience while delivering a significant reduction in delivery costs. IMAX’s unique VisionScience perceptual quality measurement technology “sees” video the way humans do, taking advantage of opportunities to decrease bits in ways that viewers will not notice and that other methods cannot match. StreamSmart uses this quality-measurement approach to analyze every frame of a video and optimize it for the best picture quality and compression efficiency, guaranteeing an optimal viewer experience while maximizing bitrate savings, typically 15% beyond what top-of-the-line content-aware encoders already deliver.

Optimizations
PATCH /optimizations/{id}
DELETE /optimizations/{id}
Submit a video asset for optimization
POST /optimizations

To create an optimized encoding, send a POST request to the IMAX StreamSmart™ /optimizations endpoint.

This endpoint currently supports creating optimized encodings for the following encoders: FFmpeg and AWS Elemental MediaConvert (EMC).

During optimization, IMAX StreamSmart will produce one or more renditions, each of which represents a video encoded with your chosen encoder and all of your desired encoder settings. Each rendition is optimized so that the encoded video is perceptually indistinguishable from the un-optimized version while using fewer bits.

FFmpeg

IMAX StreamSmart supports a number of FFmpeg encoding strategies including:

  • Single-pass constant rate factor (CRF)
  • Single-pass variable bitrate (VBR)
  • Multi-pass variable bitrate (VBR)

Please see the examples to the side and FFmpegConfig for details.
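
The FFmpeg commands in an FFmpegConfig use placeholders such as {INPUT_LOCATION} and {OUTPUT_LOCATION}, which StreamSmart resolves server-side during optimization. As a purely illustrative sketch of that substitution (the helper and paths below are hypothetical, not part of the API):

```python
# Illustrative only: the real substitution is performed by StreamSmart.
def resolve_command(template, input_location, output_location):
    """Replace the documented placeholders in an FFmpeg command template."""
    return (
        template.replace("{INPUT_LOCATION}", input_location)
        .replace("{OUTPUT_LOCATION}", output_location)
    )

cmd = resolve_command(
    "ffmpeg -i {INPUT_LOCATION} -c:v libx264 -b:v 4500k -an {OUTPUT_LOCATION}",
    "/tmp/in/Big_Buck_Bunny.mp4",
    "/tmp/out/encoded_video.mp4",
)
```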

AWS Elemental MediaConvert (EMC)

IMAX StreamSmart supports using the verbatim JSON config from an EMC invocation.
Please see the examples to the side and EMCConfig for details.

Request body

application/json

Responses

201 Created

The newly created optimization, populated with key attribute values.

Body
application/json
400 400

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 403

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 405

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415

Used when the request is asking for a content-type that is not supported (i.e. XML when you only support JSON). The IMAX Stream On-Demand Platform API currently only supports the JSON content-type (i.e. application/json).

500 500

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 503

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

Optimizing an FFmpeg encoding (VBR)

curl -X POST "https://localhost/api/v1/optimizations"  \
 -H "Content-Type: application/json"  \
 -d '{
    "content": {
        "title": "Big Buck Bunny"
    },
    "input": {
        "assetUri": "s3://videos/examples/Big_Buck_Bunny.mp4"
    },
    "encoderConfig": {
        "type": "FFmpegConfig",
        "encodes": [
            {
                "command": [
                    "ffmpeg -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -b:v 4500k -maxrate 6250k -bufsize 10000k -an {OUTPUT_LOCATION}"
                ],
                "outputLocation": {
                    "assetUri": "s3://videos/examples/output/encoded_video.mp4"
                }
            }
        ]
    }
}'
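
The `{INPUT_LOCATION}` and `{OUTPUT_LOCATION}` tokens in `command` (and, in the multi-pass examples, `{TEMP_FILE_1}`) are placeholders that the platform resolves before running the encoder. The substitution happens server-side; the sketch below is purely illustrative of the token-replacement idea, with the token names taken from these examples.

```python
# Illustrative only: the platform performs this substitution server-side.
def render_command(template: str, tokens: dict) -> str:
    """Replace each {TOKEN} placeholder with its concrete URI or path."""
    for name, value in tokens.items():
        template = template.replace("{" + name + "}", value)
    return template

rendered = render_command(
    "ffmpeg -i {INPUT_LOCATION} -c:v libx264 -preset slow -b:v 4500k -an {OUTPUT_LOCATION}",
    {
        "INPUT_LOCATION": "s3://videos/examples/Big_Buck_Bunny.mp4",
        "OUTPUT_LOCATION": "s3://videos/examples/output/encoded_video.mp4",
    },
)
print(rendered)
```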

Optimizing an FFmpeg encoding (multi-pass VBR)

curl -X POST "https://localhost/api/v1/optimizations"  \
 -H "Content-Type: application/json"  \
 -d '{
    "content": {
        "title": "Big Buck Bunny"
    },
    "input": {
        "assetUri": "s3://videos/examples/Big_Buck_Bunny.mp4"
    },
    "encoderConfig": {
        "type": "FFmpegConfig",
        "encodes": [
            {
                "command": [
                    "ffmpeg -i {INPUT_LOCATION} -passlogfile {TEMP_FILE_1} -profile:v high -preset slow -pass 1 -vcodec libx264 -bf 0 -refs 4 -b:v 4500k -maxrate:v 4500k -bufsize:v 6000k -minrate:v 4500k -x264-params \"rc-lookahead=48:keyint=96:stitchable=1:keyint_min=48\" -copyts -start_at_zero -an -f mp4 /dev/null",
                    "ffmpeg -i {INPUT_LOCATION} -passlogfile {TEMP_FILE_1} -profile:v high -preset slow -pass 2 -vcodec libx264 -bf 0 -refs 4 -b:v 4500k -maxrate:v 4500k -bufsize:v 6000k -minrate:v 4500k -x264-params \"rc-lookahead=48:keyint=96:stitchable=1:keyint_min=48\" -copyts -start_at_zero -an -f mp4 {OUTPUT_LOCATION}"
                ],
                "outputLocation": {
                    "assetUri": "s3://videos/examples/output/encoded_video.mp4"
                }
            }
        ]
    }
}'

Optimizing an FFmpeg encoding (VBR, multiple encodes)

curl -X POST "https://localhost/api/v1/optimizations"  \
 -H "Content-Type: application/json"  \
 -d '{
    "content": {
        "title": "Big Buck Bunny"
    },
    "input": {
        "assetUri": "s3://videos/examples/Big_Buck_Bunny.mp4"
    },
    "encoderConfig": {
        "type": "FFmpegConfig",
        "encodes": [
            {
                "command": [
                    "ffmpeg -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -b:v 4500k -maxrate 6250k -bufsize 10000k -an {OUTPUT_LOCATION}"
                ],
                "outputLocation": {
                    "assetUri": "s3://videos/examples/output/output1.mp4"
                }
            },
            {
                "command": [
                    "ffmpeg -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -b:v 4000k -maxrate 6000k -bufsize 9000k -an {OUTPUT_LOCATION}"
                ],
                "outputLocation": {
                    "assetUri": "s3://videos/examples/output/output2.mp4"
                }
            },
            {
                "command": [
                    "ffmpeg -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -b:v 3500k -maxrate 5500k -bufsize 8500k -an {OUTPUT_LOCATION}"
                ],
                "outputLocation": {
                    "assetUri": "s3://videos/examples/output/output3.mp4"
                }
            }
        ]
    }
}'

Optimizing an FFmpeg encoding (CRF)

curl -X POST "https://localhost/api/v1/optimizations"  \
 -H "Content-Type: application/json"  \
 -d '{
    "content": {
        "title": "Big Buck Bunny"
    },
    "input": {
        "assetUri": "s3://videos/examples/Big_Buck_Bunny.mp4"
    },
    "encoderConfig": {
        "type": "FFmpegConfig",
        "encodes": [
            {
                "command": [
                    "ffmpeg -r 24 -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -crf 23 -maxrate 4500k -bufsize 6000k -an {OUTPUT_LOCATION}"
                ],
                "outputLocation": {
                    "assetUri": "s3://videos/examples/output/encoded_video.mp4"
                }
            }
        ]
    }
}'

Optimizing an FFmpeg encoding (CRF) with additional optimization configuration

curl -X POST "https://localhost/api/v1/optimizations"  \
 -H "Content-Type: application/json"  \
 -d '{
    "content": {
        "title": "Big Buck Bunny"
    },
    "input": {
        "assetUri": "s3://videos/examples/Big_Buck_Bunny.mp4"
    },
    "encoderConfig": {
        "type": "FFmpegConfig",
        "encodes": [
            {
                "command": [
                    "ffmpeg -r 24 -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -crf 23 -maxrate 4500k -bufsize 6000k -an {OUTPUT_LOCATION}"
                ],
                "outputLocation": {
                    "assetUri": "s3://videos/examples/output/encoded_video.mp4"
                },
                "optimizationConfig": {
                    "key1": "value1",
                    "key2": "value2"
                }
            }
        ]
    }
}'

Optimizing an EMC encoding

curl -X POST "https://localhost/api/v1/optimizations"  \
 -H "Content-Type: application/json"  \
 -d '{
    "content": {
        "title": "Big Buck Bunny"
    },
    "encoderConfig": {
        "type": "EMCConfig",
        "config": {
            "JobTemplate": "",
            "Queue": "arn:aws:mediaconvert:us-east-1:315835334412:queues/Default",
            "UserMetadata": {},
            "Role": "arn:aws:iam::315835334412:role/mediaconvert-optimizer",
            "Settings": {
                "OutputGroups": [
                    {
                        "CustomName": "top-profile-encode",
                        "Name": "CMAF",
                        "Outputs": [
                            {
                                "ContainerSettings": {
                                    "Container": "CMFC"
                                },
                                "VideoDescription": {
                                    "Width": 1920,
                                    "ScalingBehavior": "STRETCH_TO_OUTPUT",
                                    "Height": 1080,
                                    "TimecodeInsertion": "DISABLED",
                                    "AntiAlias": "ENABLED",
                                    "Sharpness": 50,
                                    "CodecSettings": {
                                        "Codec": "H_264",
                                        "H264Settings": {
                                            "InterlaceMode": "PROGRESSIVE",
                                            "NumberReferenceFrames": 3,
                                            "Syntax": "DEFAULT",
                                            "Softness": 0,
                                            "GopClosedCadence": 1,
                                            "GopSize": 2,
                                            "Slices": 1,
                                            "GopBReference": "ENABLED",
                                            "HrdBufferSize": 16000000,
                                            "MaxBitrate": 8000000,
                                            "EntropyEncoding": "CABAC",
                                            "RateControlMode": "QVBR",
                                            "QvbrSettings": {
                                                "QvbrQualityLevel": 9
                                            },
                                            "CodecProfile": "HIGH",
                                            "MinIInterval": 0,
                                            "AdaptiveQuantization": "AUTO",
                                            "CodecLevel": "AUTO",
                                            "SceneChangeDetect": "ENABLED",
                                            "QualityTuningLevel": "SINGLE_PASS",
                                            "UnregisteredSeiTimecode": "DISABLED",
                                            "GopSizeUnits": "SECONDS",
                                            "ParControl": "INITIALIZE_FROM_SOURCE",
                                            "NumberBFramesBetweenReferenceFrames": 3,
                                            "RepeatPps": "DISABLED",
                                            "DynamicSubGop": "ADAPTIVE"
                                        }
                                    }
                                },
                                "NameModifier": "_8Mbps"
                            }
                        ],
                        "OutputGroupSettings": {
                            "Type": "CMAF_GROUP_SETTINGS",
                            "CmafGroupSettings": {
                                "TargetDurationCompatibilityMode": "SPEC_COMPLIANT",
                                "WriteHlsManifest": "ENABLED",
                                "WriteDashManifest": "ENABLED",
                                "SegmentLength": 4,
                                "Destination": "s3://s3-bucket/destination/path/",
                                "FragmentLength": 2,
                                "SegmentControl": "SEGMENTED_FILES",
                                "WriteSegmentTimelineInRepresentation": "ENABLED",
                                "ManifestDurationFormat": "FLOATING_POINT",
                                "StreamInfResolution": "INCLUDE"
                            }
                        }
                    }
                ],
                "Inputs": [
                    {
                        "AudioSelectors": {
                            "Audio Selector 1": {
                                "DefaultSelection": "DEFAULT"
                            }
                        },
                        "VideoSelector": {
                            "ColorSpace": "FOLLOW",
                            "Rotate": "DEGREE_0",
                            "AlphaBehavior": "DISCARD"
                        },
                        "FilterEnable": "AUTO",
                        "PsiControl": "USE_PSI",
                        "FilterStrength": 0,
                        "DeblockFilter": "DISABLED",
                        "DenoiseFilter": "DISABLED",
                        "TimecodeSource": "ZEROBASED",
                        "FileInput": "s3://s3-bucket/sources/source.mov"
                    }
                ]
            },
            "AccelerationSettings": {
                "Mode": "DISABLED"
            },
            "StatusUpdateInterval": "SECONDS_15",
            "Priority": 0,
            "HopDestinations": []
        }
    }
}'
Cancel an optimization
PATCH /optimizations/{id}

To update an existing optimization, send a PATCH request to /optimizations/{id} where id is the UUID of the optimization to update.

Please see the OptimizationPatchRequest schema to understand the options supported by the optimizations update operation.

Path variables

id
string uuid required
Min length: 36
Max length: 36
Example:
04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e

Request body

application/json

Examples

Cancelling an optimization

curl -X PATCH "https://localhost/api/v1/optimizations/04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e"  \
 -H "Content-Type: application/json"  \
 -d '{
    "status": "CANCELLED"
}' 

HTTP/1.1 200 OK
Delete an optimization
DELETE /optimizations/{id}

To delete an optimization, send a DELETE request to /optimizations/{id} where id is the UUID of the optimization to delete.

Only optimizations that have been previously cancelled or completed can be deleted.

Path variables

id
string required
Min length: 36
Max length: 36
Example:
04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e

Responses

200 OK

The optimization deletion request was successfully processed

404 Not Found

The server cannot find the requested resource.

400 Bad Request

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 Method Not Allowed

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 Unsupported Media Type

Returned when the request asks for a content type that is not supported (e.g., XML when only JSON is supported). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 Service Unavailable

The server is currently unable to handle the request due to temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

Deleting an optimization

curl -X DELETE "https://localhost/api/v1/optimizations/04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e" 

HTTP/1.1 200 OK
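
Because only cancelled or completed optimizations can be deleted, a client typically checks the optimization's state before issuing the DELETE. The sketch below is a hypothetical client-side guard using only the standard library; the "CANCELLED" value appears in the PATCH example above, while "COMPLETED" is an assumed name for the finished state described in the prose.

```python
import urllib.request

BASE = "https://localhost/api/v1"  # base URL from the examples above

# States in which DELETE /optimizations/{id} is expected to succeed.
# "COMPLETED" is an assumption; only "CANCELLED" appears in the examples.
DELETABLE_STATES = {"CANCELLED", "COMPLETED"}

def build_delete_request(opt_id: str, current_status: str):
    """Return a DELETE request for a deletable optimization, else None."""
    if current_status not in DELETABLE_STATES:
        return None  # cancel the optimization first via PATCH
    return urllib.request.Request(f"{BASE}/optimizations/{opt_id}", method="DELETE")
```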
System

Used to query the status and configure the state of the IMAX Stream On-Demand Platform.

GET /version
GET /status
GET /featureLicense
PUT /featureLicense
PUT /s3AccessSecret
POST /configurations
GET /configurations/{id}
PATCH /configurations/{type}.{id}
Get the version
GET /version

List system version information for the Stream On-Demand Platform REST API.

Responses

200 OK
Body
application/json
400 Bad Request

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 Method Not Allowed

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 Unsupported Media Type

Returned when the request asks for a content type that is not supported (e.g., XML when only JSON is supported). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 Service Unavailable

The server is currently unable to handle the request due to temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

curl -X GET "https://localhost/api/v1/version" 

HTTP/1.1 200 OK  
 
Content-Type: application/json 

{
    "version": {
        "commitBranch": "stream-ondemand/release/3.1.0",
        "commitHash": "facc2ef0a3c8ebc10819dc1218748f8d2cbfafd9",
        "commitTime": "2022-05-02T18:58:44Z",
        "stamped": "true",
        "versionString": "3.1.0-12"
    }
}
Get the system status
GET /status

Fetches the system readiness/status by reporting on the individual readiness of all the services that comprise the system.

Responses

200 OK
Body
application/json
400 Bad Request

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 Method Not Allowed

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 Unsupported Media Type

Returned when the request asks for a content type that is not supported (e.g., XML when only JSON is supported). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 Service Unavailable

The server is currently unable to handle the request due to temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

curl -X GET "https://localhost/api/v1/status" 

HTTP/1.1 200 OK  
 
Content-Type: application/json 

{
  "checks": [
    {
      "deploymentId": "d5b36ce6-d0e2-4dd2-bcf6-8a893b5fa1ef",
      "serviceId": "5307bca3-e556-43c7-9c23-b94dc63c23d9",
      "serviceName": "AnalysesService",
      "status": "READY"
    },
    {
      "deploymentId": "cc34d238-0a51-4f20-b2e7-2e121b94b414",
      "serviceId": "92ec6f09-0c7a-4f5d-b493-32ce30fe2207",
      "serviceName": "AnalysisLifecycleService",
      "status": "READY"
    },
    {
      "deploymentId": "6781c1c5-7862-4930-b176-152d293f087f",
      "serviceId": "dbf2c6bf-5bff-4cbc-b17b-74db30729db5",
      "serviceName": "AnalysisValidatorService",
      "status": "READY"
    },
    {
      "deploymentId": "6b8c7bd0-996f-441a-ad43-5e31a41af6ad",
      "serviceId": "e5224e73-40ce-4251-a325-63bddf0f70ba",
      "serviceName": "AnalyzerOpenApiRestService",
      "status": "READY"
    },
    {
      "deploymentId": "91421026-6c32-4546-99c2-7b89bfad43e2",
      "serviceId": "ef6df8de-1491-4d88-b18e-d0dc5b395758",
      "serviceName": "AnalyzerResourceEstimatorService",
      "status": "READY"
    },
    {
      "deploymentId": "d950eccf-5b5f-429f-800c-8d8e5138b298",
      "serviceId": "b0d82272-ebf7-40b0-adfd-e73f67c8f333",
      "serviceName": "AssetBrowsingOpenApiRestService",
      "status": "READY"
    },
    {
      "deploymentId": "b270a5ec-5d93-4109-a46f-ffef4b2e254e",
      "serviceId": "2ecb90f3-6889-4d68-a5a5-bbe3c05477d8",
      "serviceName": "AssetProbeService",
      "status": "READY"
    },
    {
      "deploymentId": "4da5f68b-93b2-4deb-b13b-663be9b29180",
      "serviceId": "87411208-9278-4170-aa18-21c9fcc22776",
      "serviceName": "BandingMapsService",
      "status": "READY"
    },
    {
      "deploymentId": "4bfc3ea7-d985-4cec-a8e1-9c49dd1851cc",
      "serviceId": "04d0fb8c-f703-4865-9886-3e748915c7ed",
      "serviceName": "CacheService",
      "status": "READY"
    },
    {
      "deploymentId": "8c36eaba-f337-4234-87b1-8dad67f75f46",
      "serviceId": "b73e476a-24e7-4326-b7ce-9d0e655d8c9a",
      "serviceName": "CapturesService",
      "status": "READY"
    },
    {
      "deploymentId": "1e3d1fcc-01c1-4feb-aba7-d22b966dff8f",
      "serviceId": "5c9b03d5-c634-47da-9628-31509686031e",
      "serviceName": "ColorDifferenceMapsService",
      "status": "READY"
    },
    {
      "deploymentId": "26e4558e-9c3c-4ba0-99e1-2e4a869896f5",
      "serviceId": "2f0f6ded-da18-4599-91bb-07c472b20af5",
      "serviceName": "CreateAnalysisEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "d6b6f1af-1789-4309-8c08-3eac11099b1b",
      "serviceId": "8ad5de58-da84-41a2-ab2c-68753aee908d",
      "serviceName": "CreateBandingMapsEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "cbbe9b94-5cb6-4fb8-a4d6-4b8d726f0b38",
      "serviceId": "81692007-e614-4cdd-8c28-875bc65a50ca",
      "serviceName": "CreateCapturesEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "b774a432-5b6c-48dc-8632-fddc2319825f",
      "serviceId": "cdf7403c-4dea-41c0-a43a-9d54268ad8d4",
      "serviceName": "CreateColorDifferenceMapsEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "04194b43-e5c3-419a-8af4-dbf66f94feaa",
      "serviceId": "58db90cc-ee41-4214-81b7-3cfa3c128705",
      "serviceName": "CreateConfigurationEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "2385c368-2e10-4301-9173-3cad70004243",
      "serviceId": "9d8634f4-3456-49aa-9282-f10a5799285d",
      "serviceName": "CreateFramesEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "7757a6b2-e6d4-4804-81bd-d3d6379e9c22",
      "serviceId": "83f96b33-f31d-4416-b159-d5207dd9a7e6",
      "serviceName": "CreateQualityMapsEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "a34f91a5-9a9d-4944-b9da-90a707140613",
      "serviceId": "14a4813f-55e8-4979-bacb-9690448414c8",
      "serviceName": "DeleteAnalysisEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "1188937f-d30e-4a03-8a2a-8a3e01e63ce7",
      "serviceId": "930c7627-2091-497f-bd50-e98d96bf1bbe",
      "serviceName": "DeleteCacheEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "0c8276e8-64fe-43d9-90dc-b7f67e5dabbf",
      "serviceId": "e1bb022e-36c3-46d9-bac1-5c18bc54d5ed",
      "serviceName": "FileCacheService",
      "status": "READY"
    },
    {
      "deploymentId": "48a75f9a-938c-423d-bbc4-82236bca652e",
      "serviceId": "2717ec00-a279-475b-8683-a0845af5fc61",
      "serviceName": "FilebeatConfigurationService",
      "status": "READY"
    },
    {
      "deploymentId": "e7d52f29-21c1-4f87-8a67-1618e2fe5704",
      "serviceId": "5466034b-dac5-4d00-963d-f8632b01f05d",
      "serviceName": "FrameServicesOpenApiRestService",
      "status": "READY"
    },
    {
      "deploymentId": "0ebf9d4f-3b4b-4a1c-ac3d-c5cf89a0b611",
      "serviceId": "1cad1120-5b77-45bd-9253-73d63e811a03",
      "serviceName": "FramesService",
      "status": "READY"
    },
    {
      "deploymentId": "55b9fde8-a858-430b-b3b9-c4ffa12ac5df",
      "serviceId": "dda60962-8518-42ea-b60b-6c82961a5681",
      "serviceName": "GetBucketLocationEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "0d5e9051-a72d-4b15-ab84-9282c4397092",
      "serviceId": "fcff906d-8731-47b4-80c0-02eeb5f9118b",
      "serviceName": "GetCacheDetailsEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "445f26e3-4c66-4c3b-a173-6e1c954f0e9c",
      "serviceId": "f9ae2493-232d-4ba2-8b8a-62a703f10aaa",
      "serviceName": "GetConfigurationEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "3b81bca6-eae0-4b2d-95e6-acbd00365bbe",
      "serviceId": "6edd6831-08a8-4ff1-8591-2f9d95001ec3",
      "serviceName": "GetFeatureLicenseEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "cc555fad-b1f3-4b8c-b7c9-5d01ef291a8a",
      "serviceId": "a9e026b0-ebf3-4809-847e-eb74e16d1dd4",
      "serviceName": "GetStatusEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "fd72af13-3f0e-4e7c-951e-3cc750df1e82",
      "serviceId": "3936d21a-5cba-4b1d-8e11-d72456cc50a0",
      "serviceName": "GetVersionEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "a950d391-0e2f-4cb6-89f5-14db3d7803fd",
      "serviceId": "606235a3-2f81-4d68-beb7-789e6dc03a45",
      "serviceName": "HeadBucketEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "82378f07-0a0e-4cb1-8c92-cb86decb7b8a",
      "serviceId": "56938260-3fad-4141-a692-18ee160e7041",
      "serviceName": "HeadObjectEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "31471811-93ee-44f6-912e-846d6834b8ca",
      "serviceId": "69df8859-4800-4746-9a6f-1d6aab0c131c",
      "serviceName": "HlsService",
      "status": "READY"
    },
    {
      "deploymentId": "5fcee6aa-e2da-4871-b455-01ff893f5d6e",
      "serviceId": "633814b5-d998-4ddc-865b-8c4aaa83229b",
      "serviceName": "HttpReverseProxyService",
      "status": "READY"
    },
    {
      "deploymentId": "a6a72ea8-5bb7-4733-85e1-3ea4d33f7337",
      "serviceId": "02480f35-8e4d-4100-993c-73384a72f5c4",
      "serviceName": "InsightsClientService",
      "status": "READY"
    },
    {
      "deploymentId": "cf387da5-5ffa-4c12-a37d-ad908f8c8e0d",
      "serviceId": "d9137973-5f27-46cb-9c10-f599196d986f",
      "serviceName": "InsightsKafkaService",
      "status": "READY"
    },
    {
      "deploymentId": "e5499065-bd5c-4d2b-aefe-937fbb250f3c",
      "serviceId": "80565b3f-41d0-4978-a29a-f23fd0d062dd",
      "serviceName": "JobTimeoutService",
      "status": "READY"
    },
    {
      "deploymentId": "b6c85faa-3b5b-49c7-aa00-b7b247aa1d2a",
      "serviceId": "f933de93-c7d5-4960-90f9-07bd735b8df7",
      "serviceName": "KubernetesConfigurationService",
      "status": "READY"
    },
    {
      "deploymentId": "4e34d73d-e226-4e4d-9876-e1dc48f2f43d",
      "serviceId": "4671d050-113c-490e-bdd3-74d53edaa3a7",
      "serviceName": "KubernetesFeatureLicenseService",
      "status": "READY"
    },
    {
      "deploymentId": "5e0fcfa0-2acf-4ecf-a221-ae3a79a760d0",
      "serviceId": "89954c88-6a9a-41a5-93d3-7efeeacd0792",
      "serviceName": "KubernetesJobJanitorService",
      "status": "READY"
    },
    {
      "deploymentId": "0e227dea-dd05-4479-bdb1-b2babf8c5deb",
      "serviceId": "0fa1032f-28b1-4d16-ab4e-3c96dab82b23",
      "serviceName": "KubernetesJobManagementService",
      "status": "READY"
    },
    {
      "deploymentId": "8af50b7a-37cb-4f33-9b9f-cdb304543f35",
      "serviceId": "41104967-0699-4536-9e00-2e8d274bef26",
      "serviceName": "KubernetesPodStatusService",
      "status": "READY"
    },
    {
      "deploymentId": "332634fc-6c5d-4de2-a3b5-9574a4c52a34",
      "serviceId": "c8cc9133-4e85-462f-85ff-b2f556abeb9a",
      "serviceName": "KubernetesServiceConfigurationProvider",
      "status": "READY"
    },
    {
      "deploymentId": "2ad26a6d-f6be-4aa2-add1-875aefcb293d",
      "serviceId": "590b8e63-1d7b-4d50-bbe7-1640d04d7181",
      "serviceName": "KubernetesSupportService",
      "status": "READY"
    },
    {
      "deploymentId": "9a0b7776-307f-48a0-a40f-8106561a7fcc",
      "serviceId": "7ecdd934-1984-4ac7-b061-854a9b1624e7",
      "serviceName": "ListBucketsEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "a916a21a-c594-4663-8496-ff70e2c845e8",
      "serviceId": "93604b76-92e4-45cf-875c-e22f983acf07",
      "serviceName": "ListObjectsEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "2e794590-8ccc-4a77-97e2-b0a1127d4c30",
      "serviceId": "b872ea4a-5571-4dbc-9b4b-284dca47145d",
      "serviceName": "PatchAnalysisEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "d367138c-54d5-4d88-bcce-0d3a7a56b5aa",
      "serviceId": "18dd8d46-3cd1-4da8-963e-4257423a583f",
      "serviceName": "PatchConfigurationEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "d0bb455b-ae10-40bc-bd65-3ba219ec7999",
      "serviceId": "a4620745-cb76-4a82-aaa8-259e311099b8",
      "serviceName": "PutFeatureLicenseEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "dc04dfd7-c864-41b7-a914-f5ff99a4bfb6",
      "serviceId": "8f6a2535-9c82-477a-830d-2c97a0c0cf63",
      "serviceName": "PutS3AccessSecretEndpointHandler",
      "status": "READY"
    },
    {
      "deploymentId": "96edac94-88d7-4712-9b1a-f5da04a170a8",
      "serviceId": "e7e30de1-a51f-4ab0-84a2-1c998390b272",
      "serviceName": "QualityMapsService",
      "status": "READY"
    },
    {
      "deploymentId": "c672fc76-3d97-4eee-ab24-7b60bba3ebcb",
      "serviceId": "bc264ea9-bf26-4aa4-8faa-3ba21cf43676",
      "serviceName": "ResourceEstimateHandlerService",
      "status": "READY"
    },
    {
      "deploymentId": "a32a580f-5197-49c0-af16-802888788f7c",
      "serviceId": "d8dd8378-bc72-4abd-819d-56a8b23e7532",
      "serviceName": "S3Service",
      "status": "READY"
    },
    {
      "deploymentId": "1a0e0b1e-5fe9-4158-9da7-566adf6d8c23",
      "serviceId": "d9da3ec5-290a-4100-a361-6e7ca0800a99",
      "serviceName": "StreamSmartArgoService",
      "status": "READY"
    },
    {
      "deploymentId": "a9d4d7c6-0032-4b1c-a898-d700de05c9ff",
      "serviceId": "974975a0-f6c3-44c2-95d5-1ecda37eac0e",
      "serviceName": "StreamSmartOpenApiRestService",
      "status": "READY"
    },
    {
      "deploymentId": "ffcc806f-5b38-4ccd-bb9d-58204b4eabeb",
      "serviceId": "cddf9605-7bad-4d61-996a-afcd6c5ba45f",
      "serviceName": "StreamSmartWorkflowControllerService",
      "status": "READY"
    },
    {
      "deploymentId": "5eba5102-1907-4b27-a9f8-24378377d04b",
      "serviceId": "61da4490-8ca4-4b5e-87c6-b55f0068b9d7",
      "serviceName": "SystemService",
      "status": "READY"
    },
    {
      "deploymentId": "ba0b8092-ee76-4288-93b5-5f80399ed6d1",
      "serviceId": "c9113d97-0b98-4202-a4ea-7f5f31a2c7d9",
      "serviceName": "SystemServicesOpenApiRestService",
      "status": "READY"
    },
    {
      "deploymentId": "1804adc5-fec6-42ff-914a-dc6765383bd1",
      "serviceId": "bfde8742-b15c-4729-8678-7725d5b4e643",
      "serviceName": "VertxEventBusProxyService",
      "status": "READY"
    }
  ],
  "outcome": "READY"
}
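
A client waiting for the platform to come up can treat the response as ready only when the top-level "outcome" is READY. The sketch below is a minimal interpretation of the response shape shown above; the rule that every per-service "status" should also be READY is inferred from the example, not stated by the API.

```python
# Hedged sketch: interpret a GET /status body. Field names ("checks",
# "status", "outcome") come from the example response above.
def system_ready(status_body: dict) -> bool:
    """True when the overall outcome and every individual check report READY."""
    checks_ready = all(c.get("status") == "READY" for c in status_body.get("checks", []))
    return status_body.get("outcome") == "READY" and checks_ready

example = {
    "checks": [
        {"serviceName": "AnalysesService", "status": "READY"},
        {"serviceName": "SystemService", "status": "READY"},
    ],
    "outcome": "READY",
}
print(system_ready(example))  # True
```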
Get the feature license
GET /featureLicense

Retrieves the current feature license.

Responses

200 OK

Indicates that the feature license was successfully retrieved.

Body
text/plain
string
400 Bad Request

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 Method Not Allowed

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 Unsupported Media Type

Returned when the request asks for a content type that is not supported (e.g., XML when only JSON is supported). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 Service Unavailable

The server is currently unable to handle the request due to temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

curl -X GET "https://localhost/api/v1/featureLicense" 

HTTP/1.1 200 OK  
 
Content-Type: text/plain

35a82e6900b6d5468073fbd0204e7b07546ec30f5e78f81af9fe4c95c8c88316
{
  "bandingDetection": true,
  "bandingMaps": true,
  "color": true,
  "colorDifferenceMaps": true,
  "contentAttributes": true,
  "contentComplexity": true,
  "expiry": "2022-12-31",
  "frameCaptures": true,
  "hdrSupport": true,
  "insights_analysis_url": "",
  "insights_cli_overrides": false,
  "insights_frame_scores": true,
  "insights_password": "test-password",
  "insights_qc_config_url": "",
  "insights_scene_definitions_url": "",
  "insights_servers": [],
  "insights_username": "test-user",
  "organization": "IMAX",
  "otherVideoQualityMetrics": true,
  "qualityChecks": {
    "blackFrame": true,
    "colorBarFrame": true,
    "freezeFrame": true,
    "missingCaptions": true,
    "scoreChecks": true,
    "silence": true,
    "solidColorFrame": true
  },
  "qualityMaps": true,
  "site": "Test Site"
}
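
As the example shows, the text/plain license body is a single signature line followed by a JSON document describing the licensed features. The sketch below splits the two parts; treating the first line as an opaque signature is an assumption based on the example payload.

```python
import json

def parse_feature_license(body: str):
    """Split the license into its signature line and its JSON feature payload."""
    signature, _, payload = body.partition("\n")
    return signature.strip(), json.loads(payload)

license_text = (
    "35a82e6900b6d5468073fbd0204e7b07546ec30f5e78f81af9fe4c95c8c88316\n"
    '{"organization": "IMAX", "expiry": "2022-12-31"}'
)
sig, features = parse_feature_license(license_text)
print(features["organization"])  # IMAX
```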
Apply a feature license
PUT /featureLicense

Applies a product feature license.

Request body

text/plain
string

Responses

200 OK

Indicates that the feature license was successfully applied.

400 Bad Request

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 Method Not Allowed

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 Unsupported Media Type

Returned when the request asks for a content type that is not supported (e.g., XML when only JSON is supported). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 Service Unavailable

The server is currently unable to handle the request due to temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

curl -X PUT "https://localhost/api/v1/featureLicense"  \
 -H "Content-Type: text/plain"  \
 -d '35a82e6900b6d5468073fbd0204e7b07546ec30f5e78f81af9fe4c95c8c88316'

HTTP/1.1 200 OK

Content-Type: application/json

{
  "bandingDetection": true,
  "bandingMaps": true,
  "color": true,
  "colorDifferenceMaps": true,
  "contentAttributes": true,
  "contentComplexity": true,
  "expiry": "2022-12-31",
  "frameCaptures": true,
  "hdrSupport": true,
  "insights_analysis_url": "",
  "insights_cli_overrides": false,
  "insights_frame_scores": true,
  "insights_password": "test-password",
  "insights_qc_config_url": "",
  "insights_scene_definitions_url": "",
  "insights_servers": [],
  "insights_username": "test-user",
  "organization": "IMAX",
  "otherVideoQualityMetrics": true,
  "qualityChecks": {
    "blackFrame": true,
    "colorBarFrame": true,
    "freezeFrame": true,
    "missingCaptions": true,
    "scoreChecks": true,
    "silence": true,
    "solidColorFrame": true
  },
  "qualityMaps": true,
  "site": "Test Site"
}
Add Amazon S3 bucket access
PUT /s3AccessSecret

Add an AWS IAM access key granting read permissions to an Amazon S3 bucket for use with the system.

Request body

application/json

Responses

200 OK

Indicates that the secret was added.

400 Bad Request

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 Method Not Allowed

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 Unsupported Media Type

Used when the request asks for a content type that is not supported (e.g. XML when only JSON is supported). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 Service Unavailable

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

Add credentials to access the Amazon S3 bucket named “mybucket”

curl -X PUT "https://localhost/api/v1/s3AccessSecret"  \
 -H "Content-Type: application/json"  \
 -d '{
    "bucketName": "mybucket",
    "accessKey": {
        "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
        "accessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
    }
}'
Create system configuration
POST /configurations

Creates a system or service configuration identified by its unique type and id.

Important

Be careful to use this endpoint only at the direction of deployment/configuration instructions or at the request of your IMAX representative.

You cannot create configurations for internal services (i.e. where type is SERVICE).

Request body

application/json
Object
type
ConfigType required

Captures the type of the configuration.

You cannot create configurations for internal services (i.e. where type is SERVICE).

Example:
NONSENSITIVE
id
string required

The unique id of the system component or service configuration you wish to create.

Min length: 2
Max length: 200
Pattern: ^[A-Za-z][A-Za-z0-9-]{1,199}$
Example:
analyzer-extra-options
config
Object required

The JSON configuration content.

Example:
{
  "data": {
    "extraOption1":"extraOptionValue1",
    "extraOption2":"extraOptionValue2"
  }
}

Responses

201 Created

The system or service configuration was successfully created.

400 Bad Request

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 Method Not Allowed

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 Unsupported Media Type

Used when the request asks for a content type that is not supported (e.g. XML when only JSON is supported). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 Service Unavailable

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

Create extra configuration options for the On-Demand Analyzer.

curl -X POST "http://localhost/api/v1/configurations"  \
 -H "Content-Type: application/json"  \
 -d '{
    "type": "NONSENSITIVE",
    "id": "analyzer-extra-options",
    "config": {
      "data": {
        "extraOption1":"extraOptionValue1",
        "extraOption2":"extraOptionValue2"
      }
    }
}'
Get system or service configuration
GET /configurations/{id}

Fetches the system or service configuration identified by its unique id.

Path variables

id
string required

The unique id of the system component or service whose configuration you wish to retrieve.

Min length: 2
Max length: 200
Pattern: ^[A-Za-z][A-Za-z0-9-]{1,199}$
Example:
analyzer-extra-options

Responses

200 OK

The JSON configuration for the requested system component or service.

Body
application/json
Object
Examples

System configuration for extra options for the On-Demand Analyzer.

{
  "data": {
    "extraOption1": "extraOptionValue1",
    "extraOption2": "extraOptionValue2"
  }
}
400 Bad Request

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 Method Not Allowed

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 Unsupported Media Type

Used when the request asks for a content type that is not supported (e.g. XML when only JSON is supported). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 Service Unavailable

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

Fetch the system configuration for the extra options on the On-Demand Analyzer.

curl -X GET "http://localhost/api/v1/configurations/analyzer-extra-options" 

HTTP/1.1 200 OK  
 
Content-Type: application/json 

{
  "data": {
    "extraOption1": "extraOptionValue1",
    "extraOption2": "extraOptionValue2"
  }
}
Update system or service configuration
PATCH /configurations/{type}.{id}

Applies an update to a system or service configuration identified by its unique type and id.

Important

Be careful to use this endpoint only at the direction of deployment/configuration instructions or at the request of your IMAX representative.

Path variables

type
ConfigType required

Captures the type of the configuration.

Example:
SERVICE
id
string required

The unique id of the system component or service whose configuration you wish to update.

Min length: 2
Max length: 200
Pattern: ^[A-Za-z][A-Za-z0-9-]{1,199}$
Example:
AssetProbeService

Request body

application/json
Object
Examples

System configuration for the AssetProbeService.

{
  "extraOption1": "extraOptionValue1",
  "extraOption2": "extraOptionValue2"
}

Responses

200 OK

The system or service configuration was successfully updated.

404 Not Found

The server cannot find the requested resource.

400 Bad Request

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

403 Forbidden

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

405 Method Not Allowed

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 Unsupported Media Type

Used when the request asks for a content type that is not supported (e.g. XML when only JSON is supported). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 Internal Server Error

The server encountered an unexpected condition which prevented it from fulfilling the request.

503 Service Unavailable

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Examples

Update the configuration for the AssetProbeService.

curl -X PATCH "http://localhost/api/v1/configurations/SERVICE.AssetProbeService"  \
 -H "Content-Type: application/json"  \
 -d '{
   "extraOption1": "extraOptionValue1",
   "extraOption2": "extraOptionValue2"
 }'
Type Definitions
ActiveSegmentConfig

Configuration options for specifying the conditions under which segments of a content type are considered active. By default, all content types will be considered active when at least one audio channel is active throughout that segment's duration.

Object
Example:
{
    "motionVideoSegments": {
        "canBeActive": true,
        "activeAudioChannelsDefinition": "SILENCE"
    },
    "blackFrameSegments": {
        "canBeActive": false
    },
    "colorBarFrameSegments": {
        "canBeActive": true,
        "activeAudioChannelsDefinition": "ALL_CHANNELS_ACTIVE"
    },
    "freezeFrameSegments": {
        "canBeActive": true,
        "activeAudioChannelsDefinition": "ANY_CHANNEL_ACTIVE"
    },
    "solidColorFrameSegments": {
        "canBeActive": true,
        "activeAudioChannelsDefinition": "ANY_CHANNEL_ACTIVE"
    }
}
motionVideoSegments

Active segment definition for motion video segments

Example:
{
    "canBeActive": true,
    "activeAudioChannelsDefinition": "SILENCE"
}
blackFrameSegments

Active segment definition for black frame segments

Example:
{
    "canBeActive": false
}
colorBarFrameSegments

Active segment definition for color bar frame segments

Example:
{
    "canBeActive": true,
    "activeAudioChannelsDefinition": "ALL_CHANNELS_ACTIVE"
}
freezeFrameSegments

Active segment definition for freeze frame segments

Example:
{
    "canBeActive": true,
    "activeAudioChannelsDefinition": "ANY_CHANNEL_ACTIVE"
}
solidColorFrameSegments

Active segment definition for solid color frame segments

Example:
{
    "canBeActive": true,
    "activeAudioChannelsDefinition": "ALL_CHANNELS_ACTIVE"
}
Analysis

Captures an analysis of a video asset from which frame scores are produced using On-Demand Analyzer. A no-reference (NR) analysis is performed on a single video asset only and its results can be used to judge the quality of the asset in isolation. A full-reference (FR) analysis is performed using two video assets: a reference asset against which you will compare a subject asset. Generally, the reference asset is the higher quality video and the subject asset is the resulting video having gone through some kind of transcoding, compression or general transformation. A full-reference analysis will give frame scores on the absolute quality of each asset as well as the comparative quality, allowing one to ascertain the impact of the transformation process on the overall quality.

Analyses are used as the payload in an AnalysisResponse and contain the attributes necessary to lookup the associated frame score results. For a successfully submitted analysis, the id will represent a universally unique id (UUID) that can be used as a key to lookup the frame score results. Additionally the submissionTimestamp will indicate the time at which the analysis was successfully submitted.

For an analysis that fails to be submitted, the id and submissionTimestamp attributes will be missing and the submissionError attribute will contain details indicating the nature of the error. If you are unsure how to interpret the error or the workaround, please contact your IMAX representative.

Object
Example:
{
    "id": "04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e",
    "referenceAsset": {
        "name": "Big_Buck_Bunny.mp4",
        "path": "/mnt/nas/ref/videos",
        "content": {
            "title": "Big Buck Bunny"
        },
        "storageLocation": {
            "type": "S3",
            "name": "videos",
            "credentials": {
                "useAssumedIAMRole": true
            }
        },
        "hdr": true,
        "qualityCheckConfig": {
            "scoreChecks": [
                {
                    "metric": "SVS",
                    "threshold": 80,
                    "viewingEnvironmentIndex": 1,
                    "durationSeconds": 5,
                    "durationFrames": 1,
                    "skipStart": 1.25,
                    "skipEnd": 1.25
                }
            ]
        }
    },
    "subjectAsset": {
        "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
        "path": "/mnt/nas/videos",
        "content": {
            "title": "Big Buck Bunny"
        },
        "storageLocation": {
            "type": "S3",
            "name": "videos",
            "credentials": {
                "useAssumedIAMRole": true
            }
        },
        "hdr": true,
        "qualityCheckConfig": {
            "scoreChecks": [
                {
                    "metric": "SVS",
                    "threshold": 80,
                    "viewingEnvironmentIndex": 1,
                    "durationSeconds": 5,
                    "durationFrames": 1,
                    "skipStart": 1.25,
                    "skipEnd": 1.25
                }
            ]
        }
    },
    "submissionTimestamp": "2018-01-01T14:20:22Z",
    "testId": "1",
    "content": {
        "title": "Big Buck Bunny"
    },
    "analyzerConfig": {
        "enableBandingDetection": true,
        "viewingEnvironments": [
            {
                "device": {
                    "name": "oled65c9pua",
                    "resolution": {
                        "width": 1920,
                        "height": 1080
                    }
                },
                "viewerType": "TYPICAL"
            }
        ]
    }
}
id
string uuid required

The UUID for the analysis.

Min length: 36
Max length: 36
Example:
04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e
description
string

A description of the analysis which can be used for reference, categorization and search/filtering.

Example:
Capturing results of transcoding.
referenceAsset

The reference asset against which you will compare a subject asset. This attribute is ONLY used for full-reference (FR) analyses.

subjectAsset
Asset required

The subject asset which you will use to compare against the reference asset (for full-reference analysis) or the asset against which you will perform a no-reference analysis.

submissionError
string

Any error message resulting from the submission of the analysis. Note that the error represented here is meant to cover ONLY the submission of a new analysis for processing within the Kubernetes cluster. It does NOT cover any errors that may be generated when the analysis is either scheduled or executed. Those error messages will be available through alternative means (e.g. Kubernetes monitoring software such as Prometheus, and/or the REST APIs available for result processing).

Example:
Failed to schedule Kubernetes job
submissionTimestamp
string date-time

The UTC timestamp (using ISO-8601 representation) recording when the analysis was successfully submitted. Analyses that fail to submit correctly will not have a value for this attribute.

Example:
2018-01-01T14:20:22Z
testId

The test ID used to uniquely identify the asset within the analysis

Example:
1-1
content

Metadata about the content being analyzed in this analysis.

Example:
{
    "title": "Big Buck Bunny"
}
analyzerConfig
AnalyzerConfig required

Analyzer configuration options used in this analysis

AnalysisPatchRequest

The request body used when updating an analysis.

Currently, the system supports only the following update operations:

  1. Cancelling an existing analysis

    Note

    Only analyses that are currently in progress (i.e. scheduled, estimating, aligning, analyzing) can be cancelled

Object
Example:
{
    "status": "CANCELLED"
}
status
string

The desired analysis status

Enumeration:
CANCELLED

Cancels a running analysis

Example:
CANCELLED
AnalysisResponse

The response payload of the POST on analyses. This response will contain an Analysis object for each analysis represented in the NewAnalysis request body. A given analysis can either be submitted successfully or not. In both cases, the Analysis will contain attribute values that can be used to either fetch the resulting frame scores when successful, or address the error condition when a failure occurs (please see Analysis for more details).
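Since a single POST can yield a mix of successful and failed submissions, a client typically partitions the returned analyses on the presence of id: entries with an id can be used for result lookups, while the rest carry a submissionError. A minimal sketch (the helper name and sample payload are illustrative; the field names follow the Analysis description):

```python
def partition_analyses(response_body):
    """Split submittedAnalyses into successful submissions (which carry
    an id) and failed ones (which carry a submissionError instead)."""
    succeeded, failed = [], []
    for analysis in response_body.get("submittedAnalyses", []):
        (succeeded if "id" in analysis else failed).append(analysis)
    return succeeded, failed

# Illustrative response containing one success and one failure.
body = {
    "submittedAnalyses": [
        {
            "id": "04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e",
            "submissionTimestamp": "2018-01-01T14:20:22Z",
        },
        {"submissionError": "Failed to schedule Kubernetes job"},
    ]
}
succeeded, failed = partition_analyses(body)
```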

Object
submittedAnalyses
Array of Analysis required

The submitted no-reference or full-reference analyses.

Min items: 1
Unique items: YES
AnalyzerConfig

Specification of configuration options for use by the analyzer at the analysis level. Configuration options for assets can be specified on the Asset object.

Object
Example:
{
    "enableBandingDetection": true,
    "enableColorVolumeDifference": true,
    "enableColorStatsCollection": true,
    "enableVMAF": true,
    "enablePSNR": true,
    "qualityCheckConfig": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25,
        "missingCaptions": {
            "enabled": false,
            "duration": 2.5,
            "skipStart": 1.25,
            "skipEnd": 1.25
        }
    },
    "viewingEnvironments": [
        {
            "device": {
                "name": "oled65c9pua"
            },
            "viewerType": "EXPERT"
        },
        {
            "device": {
                "name": "xl2420t"
            },
            "viewerType": "TYPICAL"   
        }
    ],
    "temporalAlignment": {
        "minSeconds": 5,
        "maxSeconds": 120,
        "maxSecondsHighFPS": 120
    }
}
enableBandingDetection
boolean

Detect the level of banding within the assets

Example:
true
enableCadencePatternDetection
boolean

Detect cadence patterns within the assets

Example:
true
enableComplexityAnalysis
boolean

For full-reference (FR) analyses, run complexity analysis on the reference asset(s). In a no-reference analysis, complexity analysis is run on the subject asset(s) instead.

Example:
true
enableColorVolumeDifference
boolean

Enable Colour Volume Difference calculation

Example:
true
enableColorStatsCollection
boolean

Enable Luminance and Color Gamut stats collection

Example:
true
enableVMAF
boolean

Enable VMAF score calculation for full-reference analyses

Example:
true
enablePSNR
boolean

Enable PSNR score calculation for full-reference analyses

Example:
true
enableTemporalAlignment
boolean

Controls whether the Analyzer will perform automatic temporal alignment. This flag applies only to full-reference analyses, and it is recommended to leave it enabled.

Example:
true
enablePhysicalNoise
boolean

Enable physical noise calculation for the video. Physical Noise measures standard deviation of camera/sensor noise when statistical behaviour of noise is random with Gaussian (or similar) distribution.

Example:
true
enableVisualNoise
boolean

Enable visual noise calculation for the video. Visual Noise measures the standard deviation of noise considering the contrast masking behaviour of the underlying content.

Example:
true
enableTemporalInformation
boolean

Enable temporal information collection for the video

Example:
true
enableSpatialInformation
boolean

Enable spatial information collection for the video

Example:
true
enableColorInformation
boolean

Enable color information collection for the video

Example:
true
qualityCheckConfig

Configuration options for quality checks.

Example:
{
    "enabled": true,
    "duration": 2.5,
    "skipStart": 1.25,
    "skipEnd": 1.25,
    "missingCaptions": {
        "enabled": false,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    }
}
viewingEnvironments
Array of ViewingEnvironment nullable

Specifications of environments under which the content is viewed

Unique items: YES
Example:
[
    {
        "device": {
            "name": "oled65c9pua"
        },
        "viewerType": "EXPERT"
    },
    {
        "device": {
            "name": "xl2420t"
        },
        "viewerType": "TYPICAL"   
    }
]
framesToProcess
integer

Number of frames to process. When specified in the context of a full-reference analysis, the value applies to the reference asset.

Min: 1
Example:
240
temporalAlignment

Configuration options for temporal alignment

Example:
{
    "minSeconds": 5,
    "maxSeconds": 120,
    "maxSecondsHighFPS": 120
}
contentLayoutDetection

Configuration options for content layout detection.

additionalConfigurationOptions
Object nullable

Additional (undocumented) configuration options for use by the Analyzer and at the direction/suggestion of your IMAX representative.

Example:
{
    "bandingDetectionThreshold": 40,
    "macroBlocking": true
}
Asset

Represents a video asset in the system. There are several different supported formats for specifying the asset path. See examples.

Object
Examples:
{
    "assetUri": "s3://videos/example/Big_Buck_Bunny.mp4"
}
{
    "name": "example/Big_Buck_Bunny.mp4",
    "storageLocation": {
        "type": "S3",
        "name": "videos"
    }
}
{
    "content": {
        "title": "Big Buck Bunny"
    },
    "name": "/videos/Big_Buck_Bunny_1080p@5000kbps.mp4",
    "storageLocation": {
        "type": "S3",
        "name": "test-bucket",
        "credentials": {
            "useAssumedIAMRole": true
        }
    }
}
{
    "content": {
        "title": "Sparks - Dolby Vision"
    },
    "name": "20161103_1023_SPARKS_4K_P3_PQ_4000nits_DoVi.mxf",
    "sidecars": [
        {
            "type": "DOLBY_VISION_METADATA",
            "name": "20161103_SPARKS_DOVI_METADATA_AR_CORRECT.xml"
        }
    ],
    "path": "/mnt/nas/videos",
    "storageLocation": {
        "type": "PVC",
        "name": "videos"
    },
    "dynamicRange": "HDR"
}
content

Metadata about the content contained in this asset. This can also be set automatically for all assets by including the content field in the NewAnalysis request.

Example:
{
    "title": "Big Buck Bunny"
}
assetUri
string

A URI describing the asset location, of the form

storageLocationType://storageLocationName/path/name

Either this field or all of name, path, and storageLocation must be provided. Additional storageLocation properties such as credentials may still be specified alongside this field.

Any special characters, like space or hash, must be percent-encoded. For example, an S3 object with key my video#001.mp4 should be given as s3://my-bucket/mypath/my%20video%23001.mp4.

Examples:
http://hls-content-server/video/master.m3u8
s3://my-bucket-name/test/Big_Buck_Bunny_480p.mp4
s3://my-bucket-name/test/My%20Video.mp4
pvc://video-files-pvc/ref/source/Big_Buck_Bunny.ts
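The percent-encoding requirement above can be met with a standard URL-quoting routine rather than by hand. A sketch (the helper name and bucket/key values are illustrative):

```python
from urllib.parse import quote

def make_asset_uri(storage_type, location_name, key):
    # quote() leaves "/" unescaped by default, so path separators
    # survive while spaces, "#" and other special characters are
    # percent-encoded.
    return f"{storage_type}://{location_name}/{quote(key)}"

uri = make_asset_uri("s3", "my-bucket", "mypath/my video#001.mp4")
# → s3://my-bucket/mypath/my%20video%23001.mp4
```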
name
string

To uniquely specify an asset location, either the assetUri field or all of name, path, and storageLocation must be provided.

The filename that represents the video. Although not required, one is encouraged to keep filenames unique, where possible.

If working with an image sequence asset (a collection of multiple image files at the same base path indexed by sequential numbers), a format string must be included in the file name to specify the position and format of the index.

The format string can take one of two forms:

  • %d specifies numbers with no special formatting at the included position. For example, "sintel_%d.dpx" will match an asset consisting of images sintel_1.dpx, sintel_2.dpx, … , sintel_10.dpx, …
  • %0[width]d specifies numbers that are 0-padded consisting of width characters total. For example, "sintel_%03d.dpx" will match an asset consisting of images sintel_001.dpx, sintel_002.dpx, sintel_003.dpx, …

The literal character % can be escaped with the string %%. For example, "big%%20buck%%20bunny%04d.png" will match an asset consisting of images big%20buck%20bunny0001.png, big%20buck%20bunny0002.png, big%20buck%20bunny0003.png, …

Note that only one format string can be specified in the file name. Additionally, if a format string is included, imageSequenceParameters must also be provided.
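Both format-string forms follow C printf-style integer formatting, so their expansion can be previewed with Python's % operator (the helper name and file names are illustrative):

```python
def expand_sequence(name_pattern, indices):
    # %d and %0[width]d follow printf conventions; %% yields a
    # literal percent sign, exactly as described above.
    return [name_pattern % i for i in indices]

print(expand_sequence("sintel_%d.dpx", [1, 2, 10]))
# ['sintel_1.dpx', 'sintel_2.dpx', 'sintel_10.dpx']
print(expand_sequence("sintel_%03d.dpx", [1, 2, 3]))
# ['sintel_001.dpx', 'sintel_002.dpx', 'sintel_003.dpx']
print(expand_sequence("big%%20buck%%20bunny%04d.png", [1]))
# ['big%20buck%20bunny0001.png']
```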

Min length: 1
Max length: 500
Example:
Big_Buck_Bunny_1080p@5000kbps.mp4
sidecars
Array of Sidecar nullable

The sidecar(s) to associate with the asset. If a sidecar does not specify a path, it is assumed to use the path associated with the asset.

Unique items: YES
Example:
[
    {
        "type": "DOLBY_VISION_METADATA",
        "name": "20161103_SPARKS_DOVI_METADATA_AR_CORRECT.xml"
    }
]
path
string

To uniquely specify an asset location, either the assetUri field or all of name, path, and storageLocation must be provided.

The path to the asset’s file (and possibly sidecar) location within the associated storage. The combination of the path and filename of a given asset must be unique.

Max length: 500
Example:
/mnt/nas/videos
storageLocation

To uniquely specify an asset location, you must provide either the assetUri field, or a name (and optionally a path) together with a storageLocation.

The storage location that houses the asset.

Example:
{
    "type": "S3",
    "name": "test-bucket",
    "credentials": {
        "useAssumedIAMRole": "true"
    }
}
hdr
boolean

DEPRECATED: retained only for backwards compatibility; please use dynamicRange instead. This flag will cease to be supported in future versions.

Hint that the video asset should be high dynamic range (HDR). Note that if the asset cannot be HDR due to low bit depth, the analysis will fail.

Example:
true
dynamicRange
string

Used to specify the dynamic range of the asset. The recommended option is the default of auto-detect. Forcing to HDR or SDR is usually not needed and should only be used if automatic detection of the dynamic range has failed.

Enumeration:
AUTO_DETECT

Auto detect the asset dynamic range based on its metadata.

SDR

Treat the asset as having SDR (standard dynamic range), ignoring asset metadata.

HDR

Treat the asset as having HDR (high dynamic range), ignoring asset metadata.

Default:
AUTO_DETECT
streamIdentifier

Used to specify the packet identifier (see VideoPID and VideoPIDHex) for assets with multiple video streams. In the case of HLS, this identifier can be used to represent the HLS variant (see HLSVariantIdentifer).

startFrame
integer

The starting frame to use for this asset, as it pertains to the analysis. For video assets, this value is relative to the start of the asset, regardless of the absolute frame index at which it starts. If your asset starts at frame X, specifying a value of 100 for startFrame will instruct the analyzer to ignore all frames in the asset from X -> X + 99.

However, in the case where your asset is an image sequence that does not start at frame 1, you must do the arithmetic to figure out the correct startFrame that applies. Consider, for example, an image sequence that starts at frame 240. If you want to skip the first 100 frames and start at frame 340, you would specify a startFrame value of 100, not 340.

If unspecified, the analyzer will always use 1. This value is ignored for frame capture requests; use the frameIndex in the body of the frame capture request instead.
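The image-sequence arithmetic above reduces to a one-line helper (the function name is illustrative):

```python
def relative_start_frame(sequence_first_frame, desired_absolute_frame):
    # startFrame is relative to the beginning of the asset, so subtract
    # the absolute index at which the image sequence starts.
    return desired_absolute_frame - sequence_first_frame

# The worked example above: a sequence starting at frame 240, with
# analysis to begin at absolute frame 340.
print(relative_start_frame(240, 340))  # 100
```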

Min: 1
Example:
1
regionOfInterest

Specification for the region of interest of this asset.

Example:
{
    "originX": 20,
    "originY": 0,
    "regionHeight": 300,
    "regionWidth": 400
}
rawVideoParameters

Settings for RAW video formats. Must be provided if this asset has the file extension .yuv, .rgb or .bgr.

Example:
{
    "resolution": {
        "width": 720,
        "height": 576
    },
    "fps": 25,
    "scanType": "P",
    "fieldOrder": "TFF",
    "pixelFormat": "YUV420P"
}
imageSequenceParameters

Settings for image sequence assets. Must be provided if the name contains a format string (%d or %0[width]d) and the file extension is one of:

  • .png
  • .tif
  • .tiff
  • .dpx
  • .jpg
  • .jpeg
  • .j2c
  • .jp2
  • .jpc
  • .j2k
  • .exr
Example:
{
    "fps": 25
}
qualityCheckConfig

Configuration options for asset-level quality checks

Example:
{
  "scoreChecks": [
    {
      "metric": "SVS",
      "threshold": 80,
      "durationSeconds": 5,
      "skipStart": 1.25,
      "skipEnd": 1.25,
      "viewingEnvironmentIndex": 0
    },
    {
      "metric": "SVS",
      "threshold": 60,
      "durationSeconds": 2,
      "skipStart": 1.25,
      "skipEnd": 1.25,
      "viewingEnvironmentIndex": 0
    },
    {
      "metric": "SBS",
      "threshold": 75,
      "durationFrames": 48,
      "skipStart": 1.25,
      "skipEnd": 1.25,
      "viewingEnvironmentIndex": 0
    }
  ]
}
audio

Configuration options for audio groups that exist within this asset.

Example:
{
    "groups": [
        {
            "qualityCheckConfig": {
                "loudnessChecks": [
                    {
                        "type": "MAX_TRUE_PEAK",
                        "enabled": true,
                        "duration": 1,
                        "skipStart": 1.25,
                        "skipEnd": 1.25,
                        "threshold": -2
                    },
                    {
                        "type": "MIN_LOUDNESS_RANGE",
                        "enabled": true,
                        "threshold": 5                   
                    },
                    {
                        "type": "MAX_LOUDNESS_RANGE",
                        "enabled": true,
                        "threshold": 25                   
                    }
                ]
            },
            "loudnessMeasurements": {
                "algorithm": "ITU_R_BS_1770_3",
                "enabled": true
            }
        }
    ]
}
AssetId

A unique identifier for an asset within a completed analysis. For a NR analysis, or to pick the reference asset within a FR analysis, this will simply be the integer ID associated with the asset. For a FR analysis, you will use the “refId-subjectId” format.

string
Min length: 1
Pattern: (^[1-9][0-9]*$)|(^[1-9][0-9]*-[1-9][0-9]*$)
Examples:
1
1-1
Types: Analysis
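An AssetId can be validated client-side against the documented pattern before it is used in a results lookup; a sketch (the helper name is illustrative, the regular expression is copied from the definition above):

```python
import re

# Pattern copied from the AssetId definition above.
ASSET_ID = re.compile(r"(^[1-9][0-9]*$)|(^[1-9][0-9]*-[1-9][0-9]*$)")

def is_valid_asset_id(value):
    """True for a bare integer ID (no-reference, or the reference asset
    within a full-reference analysis) or a refId-subjectId pair."""
    return ASSET_ID.fullmatch(value) is not None

print(is_valid_asset_id("1"))    # True
print(is_valid_asset_id("1-1"))  # True
print(is_valid_asset_id("0-1"))  # False: IDs start at 1
```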
AssetQualityChecks

Quality checks specified on a per-asset basis.

Object
Example:
{
    "scoreChecks": [
        {
            "metric": "SVS",
            "threshold": 80,
            "durationSeconds": 5,
            "skipStart": 1.25,
            "skipEnd": 1.25,
            "viewingEnvironmentIndex": 0
        }
    ]
}
scoreChecks
Array of ScoreCheck nullable

Any number of score-based quality check definitions.

Min items: 1
Unique items: YES
Example:
[
    {
        "metric": "SVS",
        "threshold": 80,
        "viewingEnvironmentIndex": 0,
        "durationSeconds": 5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    }
]
metadataChecks
Array of MetadataCheck nullable

Any number of metadata-based quality check definitions

Min items: 1
Unique items: YES
Example:
[
    {
        "type": "DOLBY_VISION"
    }
]
pseHardingTest
PSEHardingTest nullable

Configure Photosensitive Epilepsy Harding Tests

Example:
{
    "enabled": true,
    "duration": 2.5,
    "skipStart": 1.25,
    "skipEnd": 1.25,
    "extendedFailure": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    },
    "luminanceFlash": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    },
    "redFlash": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    },
    "spatialPattern": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    },
    "standard": "ITU_R_BT_1702_2"
}
mxfComplianceCheck

MXF container compliance checks.

Example:
{
    "enabled": true
}
mp4ComplianceChecks

QuickTime container compliance checks.

Example:
{
    "enabled": true,
    "duration": true,
    "durationThreshold": 0.5,
    "audioDescriptors": true,
    "videoDescriptors": true,
    "timecodeDescriptors": true
}
Types: Asset
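As a sketch, the per-asset check types documented above can be combined into a single AssetQualityChecks object. Field names follow the examples in this section; the threshold values are illustrative:

```python
import json

# Sketch: combine the per-asset check types documented above into one
# AssetQualityChecks object. All threshold values here are illustrative.
asset_quality_checks = {
    # Fail if the SVS score drops below 80 for 5 continuous seconds.
    "scoreChecks": [
        {
            "metric": "SVS",
            "threshold": 80,
            "durationSeconds": 5,
            "skipStart": 1.25,
            "skipEnd": 1.25,
            "viewingEnvironmentIndex": 0,
        }
    ],
    # Verify Dolby Vision metadata compliance.
    "metadataChecks": [{"type": "DOLBY_VISION"}],
    # Container compliance checks are enable flags.
    "mxfComplianceCheck": {"enabled": True},
    "mp4ComplianceChecks": {"enabled": True, "duration": True, "durationThreshold": 0.5},
}

print(json.dumps(asset_quality_checks, indent=2))
```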
Audio

Represents configuration for audio measurements and audio specific quality checks.

Object
Example:
{
    "enabled": true,
    "groups": [
        {
            "name": "md:audtrackid:org:bbc.co.uk:123456:main.audio.en.primary.surroundsound",
            "language": "en-GB",
            "soundfieldMapping": {
                "type": "SoundfieldChannelMapping",
                "mapping": [
                    {
                        "name": "myleftsoundfile.mxf",
                        "path": "/path/to/file",
                        "inputTrackIndex": 1,
                        "inputChannelIndex": 1,
                        "outputChannelLocation": "FL"
                    },
                    {
                        "name": "myrightsoundfile.mxf",
                        "path": "/path/to/file",
                        "inputTrackIndex": 1,
                        "inputChannelIndex": 2,
                        "outputChannelLocation": "FR"
                    }
                ]
            },
            "qualityCheckConfig": {
                "loudnessChecks": [
                    {
                        "checkType": "MAX_SHORT_TERM_LOUDNESS",
                        "enabled": true,
                        "duration": 1,
                        "skipStart": 1,
                        "skipEnd": 1,
                        "threshold": 1
                    }
                ]
            },
            "loudnessMeasurements": {
                "algorithm": "ITU_R_BS_1770_1",
                "enabled": true
            }
        }
    ]
}
enabled
boolean

Enable or disable audio processing for the parent asset.

When set to false, no audio quality checks will be raised.

Default:
true
groups
Array of AudioGroup nullable

A collection of audio soundfield groups that exist within the parent asset.

Each audio group entry defines a soundfield group that can have loudness measured as well as quality checks defined.

Min items: 1
Max items: 32
Types: Asset
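To illustrate, a minimal Audio configuration with a single 5.1 group mapped from one track might look like the following sketch. The group name, language, and file details are illustrative:

```python
# Sketch: an Audio configuration with one 5.1 soundfield group mapped from a
# single track of an MXF file. Names and paths are illustrative.
audio = {
    "enabled": True,
    "groups": [
        {
            "name": "main.audio.en.primary.fivepointone",
            "language": "en-GB",
            "soundfieldMapping": {
                "type": "SoundfieldTrackMapping",
                "name": "mysoundfile.mxf",
                "path": "/path/to/file",
                "inputTrackIndex": 1,
                "outputChannelLayout": ["FL", "FR", "FC", "LFE", "SL", "SR"],
            },
            "loudnessMeasurements": {
                "algorithm": "ITU_R_BS_1770_4",
                "enabled": True,
            },
        }
    ],
}

# groups allows between 1 and 32 items.
assert 1 <= len(audio["groups"]) <= 32
```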
AudioAveragePhaseMismatchCheck

Configure the quality check for Average Audio Phase Mismatch. Average phase mismatch measures the entire asset for discrepancies between the selected pair of audio channels.

Audio Phase Mismatch Detection identifies discrepancies in the phase alignment of audio channel pairs:

  • Front Left / Front Right
  • Side Left / Side Right
  • Back Left / Back Right

Pairs of audio channels that are out of phase may weaken the soundwave, resulting in a distorted, thin output.

Object
Example:
{
    "enabled": true,
    "threshold": {
        "type": "DEGREE",
        "value": 90
    },
    "channelPairs": [
        "FL-FR"
    ]
}
enabled
boolean

Enable detection of average audio phase mismatch events.

Default:
false

threshold

The default threshold value is 120 degrees when the type is DEGREE and -0.50 when the type is CORRELATION.

Example:
{
    "type": "DEGREE",
    "value": 90
}
channelPairs
Array of string nullable

Performs the average audio phase mismatch check against the selected channel pairs: “FL-FR”, “BL-BR” and “SL-SR”.

Min items: 1
Max items: 3
Unique items: YES
Example:
["FL-FR", "BL-BR", "SL-SR"]
AudioChannelType

An enum of supported channels that can be used to define a soundfield group.

string
Enumeration:
FL

Front Left

FR

Front Right

FC

Front Center

LFE

Low Frequency Effects

BL

Back Left

BR

Back Right

FLC

Front Left of Center

FRC

Front Right of Center

BC

Back Center

SL

Side Left

SR

Side Right

TC

Top Center

TFL

Top Front Left

TFC

Top Front Center

TFR

Top Front Right

TBL

Top Back Left

TBC

Top Back Center

TBR

Top Back Right

WL

Wide Left

WR

Wide Right

LFE2

Low Frequency Effects 2

TSL

Top Side Left

TSR

Top Side Right

BFC

Bottom Front Center

BFL

Bottom Front Left

BFR

Bottom Front Right

Example:
FL
AudioClicksAndPopsCheck

Configure the quality check for Audio Clicks and Pops. Clicks and pops are caused by a variety of factors, including a poor recording environment, bad equipment, or a misaligned recording.

Object
Example:
{
    "enabled": true,
    "skipStart": 1,
    "skipEnd": 1,
    "sensitivity": 50
}
enabled
boolean

Enable detection of audio click/pop events

skipStart
number

The duration in seconds to ignore at the start of the mixed audio track (i.e. group). This value can be used to skip or ignore some portion at the start of audio in order to eliminate unwanted quality check failures.

Default:
0
skipEnd
number

The duration in seconds to ignore at the end of the mixed audio track (i.e. group). This value can be used to skip or ignore some portion at the end of audio in order to eliminate unwanted quality check failures.

Default:
0
sensitivity
integer

The sensitivity of the check. Higher sensitivity means more detections, but more false positives.

Min: 1
Max: 100
Default:
50
AudioClippingCheck

Configure the quality check for Audio Clipping. Clipping occurs when an audio signal exceeds the maximum limit of a recording or playback system. It typically happens when the volume level of the audio reaches or exceeds the maximum level that can be accurately reproduced, resulting in the waveform being “clipped” or truncated. This introduces unwanted artifacts into the audio signal, leading to a harsh, distorted sound.

Object
Example:
{
    "enabled": true,
    "duration": 0.05,
    "skipStart": 1,
    "skipEnd": 1,
    "sensitivity": 50
}
enabled
boolean

Enable detection of audio clipping events

duration
number

The minimum clipping duration in seconds required for an event to trigger.

Default:
0.05
skipStart
number

The duration in seconds to ignore at the start of the mixed audio track (i.e. group). This value can be used to skip or ignore some portion at the start of audio in order to eliminate unwanted quality check failures.

Default:
0
skipEnd
number

The duration in seconds to ignore at the end of the mixed audio track (i.e. group). This value can be used to skip or ignore some portion at the end of audio in order to eliminate unwanted quality check failures.

Default:
0
sensitivity
integer

The sensitivity of the check. Higher sensitivity means more detections, but more false positives.

Min: 1
Max: 100
Default:
50
AudioContentType

Enumerates the audio channel activity types for an audio segment

string
Enumeration:
SILENCE

All audio channels are silent

ANY_CHANNEL_ACTIVE

At least one audio channel is not silent

ALL_CHANNELS_ACTIVE

All audio channels are active

Example:
ALL_CHANNELS_ACTIVE
AudioGroup

Represents audio measurements and audio specific quality check configuration for a particular audio group.

Object
Examples:
{
    "name": "md:audtrackid:org:bbc.co.uk:123456:main.audio.en.primary.fivepointone",
    "language": "en-CA",
    "description": "A logical audio soundfield group",
    "soundfieldMapping": {
        "type": "SoundfieldTrackMapping",
        "name": "mysoundfile.mxf",
        "path": "/path/to/file",
        "inputTrackIndex": 1,
        "outputChannelLayout": [
            "FL", "FR", "FC", "LFE", "SL", "SR"
        ]
    },
    "qualityCheckConfig": {
        "loudnessChecks": [
            {
                "checkType": "MAX_MOMENTARY_LOUDNESS",
                "enabled": true,
                "duration": 1,
                "skipStart": 1,
                "skipEnd": 1,
                "threshold": 1
            }
        ]
    },
    "loudnessMeasurements": {
        "algorithm": "ITU_R_BS_1770_3",
        "enabled": true
    }
}
{
    "name": "md:audtrackid:org:bbc.co.uk:123456:main.audio.en.primary.surroundsound",
    "language": "en-GB",
    "description": "The logical audio soundfield group from sidecar files",
    "soundfieldMapping": {
        "type": "SoundfieldChannelMapping",
        "mapping": [
            {
                "name": "myleftsoundfile.mxf",
                "path": "/path/to/file",
                "inputTrackIndex": 1,
                "inputChannelIndex": 1,
                "outputChannelLocation": "FL"
            },
            {
                "name": "myrightsoundfile.mxf",
                "path": "/path/to/file",
                "inputTrackIndex": 1,
                "inputChannelIndex": 2,
                "outputChannelLocation": "FR"
            }
        ]
    },
    "qualityCheckConfig": {
        "loudnessChecks": [
            {
                "checkType": "MAX_SHORT_TERM_LOUDNESS",
                "enabled": true,
                "duration": 1,
                "skipStart": 1,
                "skipEnd": 1,
                "threshold": 1
            }
        ]
    },
    "loudnessMeasurements": {
        "algorithm": "ITU_R_BS_1770_1",
        "enabled": true
    }
}
name
string nullable

A unique name for an audio soundfield group. All names within the AudioGroup object must be unique.

Min length: 1
Max length: 200
Example:
md:audtrackid:org:bbc.co.uk:123456:main.audio.en.primary.surroundsound
language
string nullable

The language for this specific audio soundfield group.

Min length: 1
Max length: 200
Example:
en-GB
description
string nullable

The description for this specific audio soundfield group.

Min length: 1
Max length: 1,000
soundfieldMapping
One of nullable

Can be either a SoundfieldChannelMapping or SoundfieldTrackMapping object.

qualityCheckConfig

A collection of audio specific quality checks that will be performed on the audio groups within the parent asset.

loudnessMeasurements

Configuration parameters for audio loudness measurements that are performed on an audio group.

Types: Audio
AudioGroupQualityCheckConfig

A quality check that will be performed on an audio group.

Object
Example:
{
    "loudnessChecks": [
        {
            "checkType": "MAX_LOUDNESS_RANGE",
            "enabled": true,
            "duration": 1,
            "skipStart": 1,
            "skipEnd": 1,
            "threshold": 1
        }
    ],
    "clippingCheck": {
        "enabled": true,
        "duration": 0.05,
        "skipStart": 1,
        "skipEnd": 1,
        "sensitivity": 50
    },
    "clicksAndPopsCheck": {
        "enabled": true,
        "skipStart": 1,
        "skipEnd": 1,
        "sensitivity": 50
    },
    "phaseMismatchCheck": {
        "enabled": false,
        "duration": 1,
        "skipStart": 1,
        "skipEnd": 1,
        "smoothing": 1,
        "threshold": {
            "type": "DEGREE",
            "value": 90
        },
        "channelPairs": [
            "FL-FR"
        ]
    },
    "averagePhaseMismatchCheck": {
        "enabled": false,
        "threshold": {
            "type": "DEGREE",
            "value": 90
        },
        "channelPairs": [
            "FL-FR"
        ]
    }
}
loudnessChecks
Array of AudioLoudnessCheck nullable

Configuration for one or more loudness quality checks.

Min items: 1
Unique items: YES
clippingCheck

Configuration for audio clipping quality check

clicksAndPopsCheck

Configuration for audio clicks/pops quality check

phaseMismatchCheck

Configuration for audio phase mismatch quality check

averagePhaseMismatchCheck

Configuration for average audio phase mismatch quality check

Types: AudioGroup
AudioLoudnessCheck

Used to define a quality check based on audio loudness measurements taken over a window of time.

Quality check failure events are generated when loudness values of the specified type are beyond the threshold limit for at least duration continuous seconds. Certain types of checks are performed over the duration of the asset, where skipStart, skipEnd and duration are not applicable. Those types are indicated within the schema definition below.

Object
Examples:
{
    "checkType": "MAX_TRUE_PEAK_LEVEL",
    "enabled": true,
    "duration": 5,
    "skipStart": 2.5,
    "skipEnd": 1.25,
    "threshold": -2
}
{
    "checkType": "MAX_INTEGRATED_LOUDNESS",
    "enabled": true,
    "threshold": 1
}
enabled
boolean

Enable detection of this particular audio loudness quality check event.

checkType

The type of loudness check to perform.

Example:
MAX_INTEGRATED_LOUDNESS
duration
number

The minimum continuous duration in seconds required for the loudness to exceed threshold for an event to trigger.

Duration can only be specified for the following checkType values:

  • MAX_MOMENTARY_LOUDNESS
  • MAX_SHORT_TERM_LOUDNESS
  • MIN_TRUE_PEAK_LEVEL
  • MAX_TRUE_PEAK_LEVEL
  • SILENCE

For the remaining checkType values not listed above, duration is always the length of the mixed audio track (i.e. group) and cannot be specified explicitly.

For a checkType of MAX_MOMENTARY_LOUDNESS, duration must be greater than 0.4 seconds.
For a checkType of MAX_SHORT_TERM_LOUDNESS, duration must be greater than 3 seconds.

skipStart
number

The duration in seconds to ignore at the start of the mixed audio track (i.e. group). This value can be used to skip or ignore some portion at the start of audio in order to eliminate unwanted quality check failures.

skipStart can only be specified for the following checkType values:

  • MAX_MOMENTARY_LOUDNESS
  • MAX_SHORT_TERM_LOUDNESS
  • MIN_TRUE_PEAK_LEVEL
  • MAX_TRUE_PEAK_LEVEL
  • SILENCE
Default:
0
skipEnd
number

The duration in seconds to ignore at the end of the mixed audio track (i.e. group). This value can be used to skip or ignore some portion at the end of audio in order to eliminate unwanted quality check failures.

skipEnd can only be specified for the following checkType values:

  • MAX_MOMENTARY_LOUDNESS
  • MAX_SHORT_TERM_LOUDNESS
  • MIN_TRUE_PEAK_LEVEL
  • MAX_TRUE_PEAK_LEVEL
  • SILENCE
Default:
0
threshold
number

The upper or lower threshold limit which loudness values must exceed for duration seconds for an event to trigger.

Loudness values less than the threshold for the following type entries will cause an event to trigger:

  • MIN_INTEGRATED_LOUDNESS (in LKFS)
  • MIN_LOUDNESS_RANGE (in LU)
  • MIN_TRUE_PEAK_LEVEL (in dBTP)
  • SILENCE (in dBTP)

Loudness values greater than the threshold for the following type entries will cause an event to trigger:

  • MAX_INTEGRATED_LOUDNESS (in LKFS)
  • MAX_LOUDNESS_RANGE (in LU)
  • MAX_MOMENTARY_LOUDNESS (in LUFS)
  • MAX_SHORT_TERM_LOUDNESS (in LUFS)
  • MAX_TRUE_PEAK_LEVEL (in dBTP)
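The duration/skipStart/skipEnd restrictions above can be enforced client side before submitting an analysis. The following is a sketch of such validation, not an official SDK:

```python
# Client-side validation sketch for AudioLoudnessCheck, based on the
# constraints documented above (this is not an official SDK).
WINDOWED_CHECK_TYPES = {
    "MAX_MOMENTARY_LOUDNESS",
    "MAX_SHORT_TERM_LOUDNESS",
    "MIN_TRUE_PEAK_LEVEL",
    "MAX_TRUE_PEAK_LEVEL",
    "SILENCE",
}
# Minimum durations documented for the windowed loudness checks.
MIN_DURATION = {"MAX_MOMENTARY_LOUDNESS": 0.4, "MAX_SHORT_TERM_LOUDNESS": 3.0}

def validate_loudness_check(check: dict) -> None:
    check_type = check["checkType"]
    if check_type not in WINDOWED_CHECK_TYPES:
        # Whole-asset checks: duration/skipStart/skipEnd may not be specified.
        for field in ("duration", "skipStart", "skipEnd"):
            if field in check:
                raise ValueError(f"{field} is not valid for {check_type}")
    duration = check.get("duration")
    if duration is not None and duration <= MIN_DURATION.get(check_type, 0):
        raise ValueError(f"duration must exceed {MIN_DURATION[check_type]}s for {check_type}")

# Valid: a windowed true-peak check with skip regions.
validate_loudness_check({"checkType": "MAX_TRUE_PEAK_LEVEL", "duration": 5,
                         "skipStart": 2.5, "skipEnd": 1.25, "threshold": -2})
# Valid: a whole-asset check carrying only a threshold.
validate_loudness_check({"checkType": "MAX_INTEGRATED_LOUDNESS", "threshold": 1})
```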
AudioLoudnessCheckType

The type of loudness check to register an AudioLoudnessCheck for.

The techniques used to measure Momentary Loudness, Short-Term Loudness, True Peak Level, and Integrated Loudness are defined by the ITU-R BS.1770 family of specifications (see AudioLoudnessMeasurementAlgorithm).

In addition to the technique defined within the ITU-R BS.1770 specification, the Loudness Range calculation also utilizes a cascaded gating scheme and the statistical distribution of loudness readings when determining the overall loudness range. This is done to minimize the impact of low-level signals, background noise, silence, and short bursts of unusually loud sound (e.g. explosions in a movie) on the loudness range. The loudness range measurement technique is described in more detail in EBU Tech 3342.

string
Enumeration:
MAX_MOMENTARY_LOUDNESS

Maximum Momentary Loudness (in LUFS) measured over an integration period of 400 milliseconds.

MAX_SHORT_TERM_LOUDNESS

Maximum Short-Term Loudness (in LUFS) measured over an integration period of 3 seconds

MIN_TRUE_PEAK_LEVEL

Minimum True Peak Level (in dBTP) for each channel within a group

MAX_TRUE_PEAK_LEVEL

Maximum True Peak Level (in dBTP) for each channel within a group

MIN_INTEGRATED_LOUDNESS

Minimum Integrated Loudness (in LKFS)

MAX_INTEGRATED_LOUDNESS

Maximum Integrated Loudness (in LKFS)

MIN_LOUDNESS_RANGE

Minimum Loudness Range (in LU)

MAX_LOUDNESS_RANGE

Maximum Loudness Range (in LU)

SILENCE

Detect periods of silence, reported for left, right and all channels

Example:
MAX_INTEGRATED_LOUDNESS
AudioLoudnessMeasurementAlgorithm

Represents the algorithm used to measure perceived loudness.

string
Enumeration:
ITU_R_BS_1770_1

ITU-R BS.1770-1 algorithm used to measure audio program loudness and true-peak audio level

https://www.itu.int/rec/R-REC-BS.1770-1-200709-S/en

ITU_R_BS_1770_2

ITU-R BS.1770-2 algorithm used to measure audio program loudness and true-peak audio level

https://www.itu.int/rec/R-REC-BS.1770-2-201103-S/en

ITU_R_BS_1770_3

ITU-R BS.1770-3 algorithm used to measure audio program loudness and true-peak audio level

https://www.itu.int/rec/R-REC-BS.1770-3-201208-S/en

ITU_R_BS_1770_4

ITU-R BS.1770-4 algorithm used to measure audio program loudness and true-peak audio level

https://www.itu.int/rec/R-REC-BS.1770-4-201510-I/en

Example:
ITU_R_BS_1770_1
AudioLoudnessMeasurementParameters

Represents configuration for each of the audio loudness measurements that can be performed.

Object
Example:
{
    "enabled": true,
    "algorithm": "ITU_R_BS_1770_1"
}
enabled
boolean

Controls whether audio loudness measurements are performed. This must be set to true if any audio loudness quality checks are desired for the associated asset.

Default:
true

algorithm

The algorithm to use for loudness (Momentary, Short-Term, Integrated, Loudness Range) and True Peak Level measurements.

Types: AudioGroup
AudioPhaseMismatchCheck

Audio Phase Mismatch Detection identifies discrepancies in the phase alignment of audio channel pairs:

  • Front Left / Front Right
  • Side Left / Side Right
  • Back Left / Back Right

Phase mismatch occurs when audio channels are misaligned, leading to phase cancellation and interference resulting in a distorted, thin output.

Object
Example:
{
    "enabled": true,
    "duration": 1,
    "skipStart": 1,
    "skipEnd": 1,
    "smoothing": 1,
    "threshold": {
        "type": "DEGREE",
        "value": 90
    },
    "channelPairs": [
        "FL-FR"
    ]
}
enabled
boolean

Enable detection of audio phase mismatch events.

Default:
false
duration
number

The number of consecutive seconds required for an event to trigger.

Default:
1
skipStart
number

The duration in seconds to ignore at the start of the mixed audio track (i.e. group). This value can be used to skip or ignore some portion at the start of audio in order to eliminate unwanted quality check failures.

Default:
0
skipEnd
number

The duration in seconds to ignore at the end of the mixed audio track (i.e. group). This value can be used to skip or ignore some portion at the end of audio in order to eliminate unwanted quality check failures.

Default:
0
smoothing
integer nullable

The smoothing factor of the check. Higher values result in more aggressive smoothing and greater attenuation of outliers, while lower values preserve more of the original data.

Min: 1
Max: 5
Default:
1

threshold

The default threshold value is 160 degrees when the type is DEGREE, and -0.94 when the type is CORRELATION.

Example:
{
    "type": "DEGREE",
    "value": 90
}
channelPairs
Array of string nullable

Performs the audio phase mismatch check against the selected channel pairs: “FL-FR”, “BL-BR” and “SL-SR”.

Min items: 1
Max items: 3
Unique items: YES
Default:
["FL-FR"]
Example:
["FL-FR", "BL-BR", "SL-SR"]
AudioPhaseMismatchThresholdParameter

Configure the quality check threshold for (average) audio phase mismatch.

Object
Example:
{
    "type": "DEGREE",
    "value": 90
}
type
string required

The threshold type for audio phase mismatch detection.

Enumeration:
CORRELATION

The audio phase mismatch correlation.

DEGREE

The audio phase mismatch degrees.

Default:
DEGREE
Example:
DEGREE
value
number required

The threshold value for the corresponding type. The value should be between 0 and 180 when the type is DEGREE, and between -1 and 1 when the type is CORRELATION.

Min: -1
Max: 180
Example:
90
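The DEGREE and CORRELATION threshold types appear to be related by correlation = cos(angle): the documented defaults of 120 degrees / -0.50 (average check) and 160 degrees / -0.94 (windowed check) are both consistent with this. The relationship is an inference from those defaults, not something this reference states, so the sketch below should be treated accordingly:

```python
import math

def degrees_to_correlation(degrees: float) -> float:
    """Convert a phase-mismatch angle threshold to its correlation equivalent.

    Inferred relationship (correlation = cos(angle)), consistent with the
    documented default pairs 120 deg / -0.50 and 160 deg / -0.94.
    """
    if not 0 <= degrees <= 180:
        raise ValueError("DEGREE thresholds must be between 0 and 180")
    return round(math.cos(math.radians(degrees)), 2)

print(degrees_to_correlation(120))  # -0.5
print(degrees_to_correlation(160))  # -0.94
```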
AudioSegmentParameters

Captures the supported parameter values for audio silence detection.

Object
Example:
{
    "threshold": -60,
    "duration": 2.5
}
threshold
number

The loudness measurement below which an audio channel is considered silent for the purposes of determining if a segment has all channels active, any channels active, or no channels active. The default value is -60 dBFS.

Default:
-60
Example:
-60
duration
number

The minimum duration in seconds that a channel must remain below the threshold for it to be considered silent. Segments shorter than this duration will be treated as active. The default value is 30 seconds.

Default:
30
Example:
2.5
AudioSilenceParameters

Deprecated.

Important Note: As of version 2.21.0, this schema has been deprecated and it is no longer recommended to configure an analysis level audio silence quality check. Please refer to the asset level audio check configuration Audio to perform silence quality checks in this version and future releases.

Captures the supported parameter values for audio silence detection.

Object
Example:
{
    "threshold": -60,
    "commonParameters": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    }
}
threshold
number

The loudness measurement below which the audio output for the asset is considered to be silent. The default value is -60 dBTP.

Example:
-60
commonParameters

Common quality check configuration parameters.

Example:
{
    "enabled": true,
    "duration": 2.5,
    "skipStart": 1.25,
    "skipEnd": 1.25
}
CacheDetails

Represents the details about a given frame and map cache.

Object
Example:
{
    "cacheId": "533db73-0f9a-4805-9651-c5dcd519dc37",
    "numberOfFiles": 15182,
    "sizeOfFiles": 1073741824,
    "humanReadableSizeOfFiles": "1.0 G"
}
cacheId
string uuid required read-only

The UUID of the file/map cache

Example:
533db73-0f9a-4805-9651-c5dcd519dc37
numberOfFiles
integer int64 required read-only

The number of frame and/or map PNG files stored in the cache

Min: 1
Example:
15182
sizeOfFiles
integer int64 required read-only

The aggregate size of all the PNG files stored in the cache (in bytes)

Min: 1
Example:
1073741824
humanReadableSizeOfFiles
string required read-only

The aggregate size of all the PNG files stored in the cache (in human readable form using KMGTPE units)

Min length: 1
Example:
1.0 G
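The humanReadableSizeOfFiles format can be reproduced as in the sketch below. Binary (1024-based) multiples are an assumption inferred from the 1073741824 bytes to "1.0 G" example above:

```python
# Sketch: reproduce the human-readable size format shown in CacheDetails.
# Binary (1024-based) multiples are assumed from the 1073741824 -> "1.0 G" example.
def human_readable(size_bytes: int) -> str:
    units = ["K", "M", "G", "T", "P", "E"]
    value = float(size_bytes)
    unit = ""
    for u in units:
        if value < 1024:
            break
        value /= 1024
        unit = u
    return f"{value:.1f} {unit}".strip()

print(human_readable(1073741824))  # 1.0 G
```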
CacheDetailsResponse

The response payload of the GET on the cache endpoint, which lists the number of files and overall size of the frame and map files in each cache. Most deployments will have only one file/map cache.

Object
Example:
{
    "caches": [
        {
            "cacheId": "533db73-0f9a-4805-9651-c5dcd519dc37",
            "numberOfFiles": 15182,
            "sizeOfFiles": 1073741824,
            "humanReadableSizeOfFiles": "1.0 G"
        }
    ]
}
caches
Array of CacheDetails required read-only

The list of known file/map caches

Min items: 1
Unique items: YES
CaptureRequest

The request body used when creating all possible captures for a given frame.

Object
Example:
{
    "frameRequest": {
        "type": "FullReferenceFrameRequest",
        "asset": {
            "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
            "path": "/mnt/nas/videos",
            "storageLocation": {
                "type": "S3",
                "name": "/videos"
            }
        },
        "startFrame": {
            "type": "PTS",
            "value": 1400
        },
        "reference": {
            "name": "Big_Buck_Bunny.mp4",
            "path": "/mnt/nas/videos/sources",
            "storageLocation": {
                "type": "S3",
                "name": "/videos"
            }
        },
        "referenceStartFrame": {
            "type": "PTS",
            "value": 1200
        },
        "additionalFrames": 24
    },
    "requestedCaptureType": "FRAME"
}
frameRequest
One of required

The frame request data. Use FrameRequestBody for a single asset and FullReferenceFrameRequestBody to include a reference asset. Note that you must use FullReferenceFrameRequestBody if you specify a requestedCaptureType of QUALITY_MAP or COLOR_DIFFERENCE_MAP.

requestedCaptureType
CaptureType required

The capture type to send back in the response.
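A sketch of submitting a CaptureRequest with Python's standard library follows. The host and the /captures path are assumptions (this section documents only the request body), and the single-asset "type" discriminator is inferred from the FullReferenceFrameRequest example above:

```python
import json
from urllib import request

# Sketch: build and send a CaptureRequest. The endpoint URL is an assumption;
# this section documents only the request body.
body = {
    "frameRequest": {
        # "FrameRequest" is an inferred discriminator for the single-asset form;
        # the documented example shows "FullReferenceFrameRequest".
        "type": "FrameRequest",
        "asset": {"name": "Big_Buck_Bunny_1080p@5000kbps.mp4", "path": "/mnt/nas/videos"},
        "startFrame": {"type": "PTS", "value": 1400},
    },
    "requestedCaptureType": "FRAME",
}
req = request.Request(
    "http://analyzer.example.com/captures",  # hypothetical host and path
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# response = request.urlopen(req)  # uncomment against a live deployment
```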

CaptureType

Represents the different types of frame captures available (i.e. frame, banding map, quality map)

string
Enumeration:
FRAME

Represents the frame’s image content.

BANDING_MAP

Represents binary map with white pixels showing banding presence.

QUALITY_MAP

Represents a gray scale presentation of pixel-level perceptual quality that shows the spatial distribution of impairments within a frame.

COLOR_DIFFERENCE_MAP

Represents a gray scale representation of pixel-level color and skin tone deviation with respect to the reference file.

Example:
FRAME
ConfigType

Captures the type of the configuration.

string
Enumeration:
SERVICE
SECRET
NONSENSITIVE
Example:
SERVICE
Content

Contains metadata about the content contained within an asset. HLS variants that are part of the same presentation should have the same title.

Object
Example:
{
    "title": "Big Buck Bunny"
}
title
string required

The title of the content

Min length: 1
Max length: 500
Example:
Big Buck Bunny
ContentLayoutDetectionConfig

Configuration options for content layout detection

Object
segmentDetectionConfig
activeSegmentConfig
ContentSimilarityParameters

Captures the supported parameter values for the content similarity quality check.

When this check is enabled, the analysis will run in content similarity detection mode and detect content differences arising from frame insertions and deletions between two versions of the same title.

Note that in content similarity detection mode, exactly one test and one reference asset must be provided. Additionally, the usual viewer score metrics will not be generated; instead, both the reference and test will be evaluated in a no-reference mode. Thus, full-reference metrics such as PSNR and CVD cannot be enabled, nor can full-reference metrics be used as the basis for score-based quality checks.

Object
Example:
{
    "enabled": true,
    "sensitivity": 75
}
enabled
boolean

Controls whether the content similarity quality check is enabled

Default:
false
Example:
true
sensitivity
integer

The sensitivity of the content similarity detector, from 1-100.

Larger numbers, or those closer to 100, correspond to a more sensitive detector, meaning more events and potentially more false positives will be detected. Smaller numbers, or those closer to 1, correspond to a less sensitive detector, meaning fewer events will be detected. Lowering the sensitivity will usually result in fewer false positives at the cost of potentially increasing false negatives.

The default value is 50.

Min: 1
Max: 100
Default:
50
Example:
75
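As an illustration, a ContentSimilarityParameters payload for comparing two versions of a title with a more sensitive detector is just two fields:

```python
# Sketch: enable content similarity detection with sensitivity raised above
# the default of 50. Remember: exactly one test and one reference asset must
# be provided, and full-reference metrics (e.g. PSNR, CVD) cannot be enabled
# in this mode.
content_similarity = {"enabled": True, "sensitivity": 75}

# sensitivity must be in 1-100.
assert 1 <= content_similarity["sensitivity"] <= 100
```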
Credentials

Authentication credentials for assets stored in Amazon S3.

Important

In order to support some software features (i.e. frame/map captures), the system needs to persist the access credentials provided in this object into our secure data store. For this reason, it is strongly recommended that you use useAssumedIAMRole or the Add Amazon S3 bucket access endpoint instead, to avoid persisting the IAM access key.

Object
Examples:
{
    "useAssumedIAMRole": true
}
{
    "accessKey": {
        "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
        "accessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
    }
}
accessKey

The AWS IAM access key that grants read permissions to the associated Amazon S3 bucket.

Example:
{
    "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
    "accessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
}
clientId
string

AWS Access Key ID for accessing assets stored in Amazon S3.

Deprecated: use accessKey instead.

Min length: 20
Max length: 20
Pattern: ^[A-Z0-9]{20}$
Example:
AKIAIOSFODNN7EXAMPLE
clientSecret
string

AWS Secret Access Key for accessing assets stored in Amazon S3.

Deprecated: use accessKey instead.

Min length: 40
Max length: 40
Pattern: ^[A-Za-z0-9/+=]{40}$
Example:
wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
useAssumedIAMRole
boolean

Authenticate using the role already assumed by the underlying container

Default:
false
Example:
true
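In code, the recommended form reduces to a single flag; the discouraged access-key form is shown commented out, using the key values from this section's examples:

```python
# Preferred: authenticate via the IAM role already assumed by the underlying
# container, so no key material needs to be persisted by the service.
credentials = {"useAssumedIAMRole": True}

# Discouraged alternative (the service must persist these keys):
# credentials = {
#     "accessKey": {
#         "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
#         "accessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
#     }
# }

assert credentials["useAssumedIAMRole"] is True
```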
Device

A specification of a device for which scores are calculated

Object
Examples:
{
    "name": "oled65c9pua",
    "resolution": {
        "width": 1920,
        "height": 1080
    }
}
{
    "name": "xl2420t"
}
name
string required

The name of the display device

Min length: 1
Max length: 100
Example:
oled65c9pua
resolution

Resolution of the device specified as width and height in pixels

Example:
{
    "width": 1920,
    "height": 1080
}
EMCConfig

Represents the configuration to apply when producing an optimized rendition using the AWS Elemental MediaConvert encoder. The config JSON property below can accept the verbatim config from an EMC invocation. Additionally, this property supports templating and the following variables are available to be used:

Variable            Description
{INPUT_LOCATION}    Represents the assetUri property on the input video included in the optimization
Important

Note that EMC only supports assets stored in S3.

For full examples of optimization using the EMC encoder, please see the examples provided on the /optimizations endpoint.

Object
Example:
{
    "type": "EMCConfig",
    "config": {
        "JobTemplate": "",
        "Queue": "arn:aws:mediaconvert:us-east-1:315835334412:queues/Default",
        "UserMetadata": {},
        "Role": "arn:aws:iam::315835334412:role/mediaconvert-optimizer",
        "Settings": {
            "OutputGroups": [
                {
                    "CustomName": "top-profile-encode",
                    "Name": "CMAF",
                    "Outputs": [
                        {
                            "ContainerSettings": {
                                "Container": "CMFC"
                            },
                            "VideoDescription": {
                                "Width": 1920,
                                "ScalingBehavior": "STRETCH_TO_OUTPUT",
                                "Height": 1080,
                                "TimecodeInsertion": "DISABLED",
                                "AntiAlias": "ENABLED",
                                "Sharpness": 50,
                                "CodecSettings": {
                                    "Codec": "H_264",
                                    "H264Settings": {
                                        "InterlaceMode": "PROGRESSIVE",
                                        "NumberReferenceFrames": 3,
                                        "Syntax": "DEFAULT",
                                        "Softness": 0,
                                        "GopClosedCadence": 1,
                                        "GopSize": 2,
                                        "Slices": 1,
                                        "GopBReference": "ENABLED",
                                        "HrdBufferSize": 16000000,
                                        "MaxBitrate": 8000000,
                                        "EntropyEncoding": "CABAC",
                                        "RateControlMode": "QVBR",
                                        "QvbrSettings": {
                                            "QvbrQualityLevel": 9
                                        },
                                        "CodecProfile": "HIGH",
                                        "MinIInterval": 0,
                                        "AdaptiveQuantization": "AUTO",
                                        "CodecLevel": "AUTO",
                                        "SceneChangeDetect": "ENABLED",
                                        "QualityTuningLevel": "SINGLE_PASS",
                                        "UnregisteredSeiTimecode": "DISABLED",
                                        "GopSizeUnits": "SECONDS",
                                        "ParControl": "INITIALIZE_FROM_SOURCE",
                                        "NumberBFramesBetweenReferenceFrames": 3,
                                        "RepeatPps": "DISABLED",
                                        "DynamicSubGop": "ADAPTIVE"
                                    }
                                }
                            },
                            "NameModifier": "_8Mbps"
                        }
                    ],
                    "OutputGroupSettings": {
                        "Type": "CMAF_GROUP_SETTINGS",
                        "CmafGroupSettings": {
                            "TargetDurationCompatibilityMode": "SPEC_COMPLIANT",
                            "WriteHlsManifest": "ENABLED",
                            "WriteDashManifest": "ENABLED",
                            "SegmentLength": 4,
                            "Destination": "s3://s3-bucket/destination/path/",
                            "FragmentLength": 2,
                            "SegmentControl": "SEGMENTED_FILES",
                            "WriteSegmentTimelineInRepresentation": "ENABLED",
                            "ManifestDurationFormat": "FLOATING_POINT",
                            "StreamInfResolution": "INCLUDE"
                        }
                    }
                }
            ],
            "Inputs": [
                {
                    "AudioSelectors": {
                        "Audio Selector 1": {
                            "DefaultSelection": "DEFAULT"
                        }
                    },
                    "VideoSelector": {
                        "ColorSpace": "FOLLOW",
                        "Rotate": "DEGREE_0",
                        "AlphaBehavior": "DISCARD"
                    },
                    "FilterEnable": "AUTO",
                    "PsiControl": "USE_PSI",
                    "FilterStrength": 0,
                    "DeblockFilter": "DISABLED",
                    "DenoiseFilter": "DISABLED",
                    "TimecodeSource": "ZEROBASED",
                    "FileInput": "s3://s3-bucket/sources/source.mov"
                }
            ]
        },
        "AccelerationSettings": {
            "Mode": "DISABLED"
        },
        "StatusUpdateInterval": "SECONDS_15",
        "Priority": 0,
        "HopDestinations": []
    }
}
type
string required read-only

Must be "EMCConfig".

Min length: 9
Max length: 9
Pattern: ^EMCConfig$
Default:
EMCConfig
Example:
EMCConfig
config
Object required

Represents a complete configuration for an EMC encoding job in JSON format. The content here can be used verbatim as if you were calling the EMC encoder directly.

This configuration supports (optional) templating and the following variables are available to be used:

Variable Description
{INPUT_LOCATION} represents the assetUri property on the input video included in the optimization
Example:
{
    "JobTemplate": "",
    "Queue": "arn:aws:mediaconvert:us-east-1:315835334412:queues/Default",
    "UserMetadata": {},
    "Role": "arn:aws:iam::315835334412:role/mediaconvert-optimizer",
    "Settings": {
        "OutputGroups": [
            {
                "CustomName": "top-profile-encode",
                "Name": "CMAF",
                "Outputs": [
                    {
                        "ContainerSettings": {
                            "Container": "CMFC"
                        },
                        "VideoDescription": {
                            "Width": 1920,
                            "ScalingBehavior": "STRETCH_TO_OUTPUT",
                            "Height": 1080,
                            "TimecodeInsertion": "DISABLED",
                            "AntiAlias": "ENABLED",
                            "Sharpness": 50,
                            "CodecSettings": {
                                "Codec": "H_264",
                                "H264Settings": {
                                    "InterlaceMode": "PROGRESSIVE",
                                    "NumberReferenceFrames": 3,
                                    "Syntax": "DEFAULT",
                                    "Softness": 0,
                                    "GopClosedCadence": 1,
                                    "GopSize": 2,
                                    "Slices": 1,
                                    "GopBReference": "ENABLED",
                                    "HrdBufferSize": 16000000,
                                    "MaxBitrate": 8000000,
                                    "EntropyEncoding": "CABAC",
                                    "RateControlMode": "QVBR",
                                    "QvbrSettings": {
                                        "QvbrQualityLevel": 9
                                    },
                                    "CodecProfile": "HIGH",
                                    "MinIInterval": 0,
                                    "AdaptiveQuantization": "AUTO",
                                    "CodecLevel": "AUTO",
                                    "SceneChangeDetect": "ENABLED",
                                    "QualityTuningLevel": "SINGLE_PASS",
                                    "UnregisteredSeiTimecode": "DISABLED",
                                    "GopSizeUnits": "SECONDS",
                                    "ParControl": "INITIALIZE_FROM_SOURCE",
                                    "NumberBFramesBetweenReferenceFrames": 3,
                                    "RepeatPps": "DISABLED",
                                    "DynamicSubGop": "ADAPTIVE"
                                }
                            }
                        },
                        "NameModifier": "_8Mbps"
                    }
                ],
                "OutputGroupSettings": {
                    "Type": "CMAF_GROUP_SETTINGS",
                    "CmafGroupSettings": {
                        "TargetDurationCompatibilityMode": "SPEC_COMPLIANT",
                        "WriteHlsManifest": "ENABLED",
                        "WriteDashManifest": "ENABLED",
                        "SegmentLength": 4,
                        "Destination": "s3://s3-bucket/destination/path/",
                        "FragmentLength": 2,
                        "SegmentControl": "SEGMENTED_FILES",
                        "WriteSegmentTimelineInRepresentation": "ENABLED",
                        "ManifestDurationFormat": "FLOATING_POINT",
                        "StreamInfResolution": "INCLUDE"
                    }
                }
            }
        ],
        "Inputs": [
            {
                "AudioSelectors": {
                    "Audio Selector 1": {
                        "DefaultSelection": "DEFAULT"
                    }
                },
                "VideoSelector": {
                    "ColorSpace": "FOLLOW",
                    "Rotate": "DEGREE_0",
                    "AlphaBehavior": "DISCARD"
                },
                "FilterEnable": "AUTO",
                "PsiControl": "USE_PSI",
                "FilterStrength": 0,
                "DeblockFilter": "DISABLED",
                "DenoiseFilter": "DISABLED",
                "TimecodeSource": "ZEROBASED",
                "FileInput": "s3://s3-bucket/sources/source.mov"
            }
        ]
    },
    "AccelerationSettings": {
        "Mode": "DISABLED"
    },
    "StatusUpdateInterval": "SECONDS_15",
    "Priority": 0,
    "HopDestinations": []
}
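As a sketch of how the {INPUT_LOCATION} template variable described above could be resolved client-side before submission, consider the following. The variable name comes from the table above; the substitution mechanics (string replacement across the serialized config) are an assumption for illustration, not part of the API.

```python
import json

def expand_template(config: dict, input_location: str) -> dict:
    """Replace every {INPUT_LOCATION} occurrence in the config's string values."""
    text = json.dumps(config)
    text = text.replace("{INPUT_LOCATION}", input_location)
    return json.loads(text)

# Minimal EMC-style fragment using the template variable
config = {"Settings": {"Inputs": [{"FileInput": "{INPUT_LOCATION}"}]}}
expanded = expand_template(config, "s3://s3-bucket/sources/source.mov")
# expanded["Settings"]["Inputs"][0]["FileInput"] == "s3://s3-bucket/sources/source.mov"
```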
encodingOptimizationConfigs
Array nullable

An array of configurations that control how IMAX Stream Smart™ optimizes the encode(s) produced from the config. Note that the entries in this array are applied sequentially to the encodes produced, until either list is exhausted.

Please consult your IMAX representative for more details on the applicability of these objects for your use case(s).

Example:
[
    {
        "key1": "value1"
    },
    {
        "key1": "value1",
        "key2": "value2",
        "key3": "value3"
    }
]
Object
Encode

Represents an intermediate or final encoding of an input video. Encodes are included in the context of an encoder configuration (e.g. FFmpegConfig) when your goal is to use IMAX Stream Smart™ to produce an optimized rendition of a given input video.

Encodes support templating and the following variables are available to be used in the encoding commands:

Variable Description
{INPUT_LOCATION} represents the assetUri property on the input video included in the optimization
{OUTPUT_LOCATION} represents the outputLocation of the encoded video
{TEMP_FILE_1}
.
.
{TEMP_FILE_5}
represents the output location of up to 5 intermediate video/metadata files when performing multi-pass encoding
Object
Example:
{
    "command": [
        "ffmpeg -r 24 -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -crf 23 -maxrate 4500k -bufsize 6000k -an {OUTPUT_LOCATION}"
    ],
    "outputLocation": {
        "assetUri": "s3://videos/example/output/encoded_video.mp4"
    }
}
command
Array required

For single-pass encoding, this array holds the single encoding command used to produce the final encoded video.

For multi-pass encoding, this array holds all the encoding commands used to produce the intermediate video/metadata and, lastly, the final encoded video.

Requirements:

  • The input must be given as -i {INPUT_LOCATION}
  • The output must be given as {OUTPUT_LOCATION}, which must be the last argument. For multi-pass encoding, only the last pass requires this.
  • For CRF commands, please specify -crf (or omit to use the default). Optionally specify -maxrate and/or -bufsize.
  • For VBR commands, please specify -b:v, -maxrate and -bufsize (currently, all three are required).
  • Not all FFmpeg arguments/flags are supported. Unsupported arguments currently include: -ss, -sseof, -t, -to, -fs.
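
The requirements above can be checked client-side before submitting a command. This is a hypothetical pre-flight validator for illustration, not part of the API; the rules are taken from the list above.

```python
# FFmpeg flags the doc lists as unsupported
UNSUPPORTED = {"-ss", "-sseof", "-t", "-to", "-fs"}

def check_command(command: str, is_last_pass: bool = True) -> list:
    """Return a list of problems; an empty list means the command looks valid."""
    problems = []
    tokens = command.split()
    if "-i {INPUT_LOCATION}" not in command:
        problems.append("input must be given as -i {INPUT_LOCATION}")
    if is_last_pass and (not tokens or tokens[-1] != "{OUTPUT_LOCATION}"):
        problems.append("{OUTPUT_LOCATION} must be the last argument")
    problems += [f"unsupported argument: {t}" for t in tokens if t in UNSUPPORTED]
    return problems

cmd = "ffmpeg -i {INPUT_LOCATION} -c:v libx264 -crf 23 -an {OUTPUT_LOCATION}"
# check_command(cmd) == []
```

For multi-pass arrays, every command except the last would be checked with `is_last_pass=False`.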

Encodes support templating and the following variables are available to be used in the encoding commands:

Variable Description
{INPUT_LOCATION} represents the assetUri property on the input video included in the optimization
{OUTPUT_LOCATION} represents the outputLocation of the encoded video
{TEMP_FILE_1}
.
.
{TEMP_FILE_5}
represents the output location of up to 5 intermediate video/metadata files when performing multi-pass encoding
Min items: 1
Max items: 2
Unique items: YES
Examples:
[
    "ffmpeg -r 24 -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -crf 23 -maxrate 4500k -bufsize 6000k -an {OUTPUT_LOCATION}"
]
[
    "ffmpeg -r 24 -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -b:v 5000k -maxrate 6250k -bufsize 10000k -an {OUTPUT_LOCATION}"
]
[
    "ffmpeg -i {INPUT_LOCATION} -passlogfile {TEMP_FILE_1} -profile:v high -preset slow -pass 1 -vcodec libx264 -bf 0 -refs 4 -b:v 4500k -maxrate:v 4500k -bufsize:v 6000k -minrate:v 6000k -x264-params \"rc-lookahead=48:keyint=96:stitchable=1:keyint_min:48\" -copyts -start_at_zero -an -f mp4 /dev/null",
    "ffmpeg -i {INPUT_LOCATION} -passlogfile {TEMP_FILE_1} -profile:v high -preset slow -pass 2 -vcodec libx264 -bf 0 -refs 4 -b:v 4500k -maxrate:v 4500k -bufsize:v 6000k -minrate:v 6000k -x264-params \"rc-lookahead=48:keyint=96:stitchable=1:keyint_min:48\" -copyts -start_at_zero -an -f mp4 {OUTPUT_LOCATION}"
]
string
Min length: 1
outputLocation
OutputLocation required

Represents the output location for the final encoded video. Commands can reference this location using {OUTPUT_LOCATION}.

The Optimization job will fail if there is already a file at the given output location, to prevent overwrites.

optimizationConfig
Object

Additional configuration options that control how IMAX Stream Smart™ optimizes the resulting encode.

Please consult your IMAX representative for more details on the applicability of this object for your use case(s).

Example:
{
    "key1": "value1",
    "key2": "value2"
}
Types: FFmpegConfig
EncoderConfig

Encapsulates the encoder configuration used when producing the encoded video(s). Depending on your choice of encoder, you may have many configuration options available. The goal here is to supply IMAX Stream™ with the same configuration that you would use to produce your encoded version, which will serve as the baseline for the optimization process.

Currently, IMAX Stream™ supports the following encoders:

One of
Examples:
{
    "type": "FFmpegConfig",
    "encodes": [
        {
            "command": [
                "ffmpeg -r 24 -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -crf 23 -maxrate 4500k -bufsize 6000k -an {OUTPUT_LOCATION}"
            ],
            "outputLocation": {
                "assetUri": "s3://videos/example/output/encoded_video.mp4"
            }
        }
    ]
}
{
    "type": "EMCConfig",
    "config": {
        "JobTemplate": "",
        "Queue": "arn:aws:mediaconvert:us-east-1:315835334412:queues/Default",
        "UserMetadata": {},
        "Role": "arn:aws:iam::315835334412:role/mediaconvert-optimizer",
        "Settings": {
            "OutputGroups": [
                {
                    "CustomName": "top-profile-encode",
                    "Name": "CMAF",
                    "Outputs": [
                        {
                            "ContainerSettings": {
                                "Container": "CMFC"
                            },
                            "VideoDescription": {
                                "Width": 1920,
                                "ScalingBehavior": "STRETCH_TO_OUTPUT",
                                "Height": 1080,
                                "TimecodeInsertion": "DISABLED",
                                "AntiAlias": "ENABLED",
                                "Sharpness": 50,
                                "CodecSettings": {
                                    "Codec": "H_264",
                                    "H264Settings": {
                                        "InterlaceMode": "PROGRESSIVE",
                                        "NumberReferenceFrames": 3,
                                        "Syntax": "DEFAULT",
                                        "Softness": 0,
                                        "GopClosedCadence": 1,
                                        "GopSize": 2,
                                        "Slices": 1,
                                        "GopBReference": "ENABLED",
                                        "HrdBufferSize": 16000000,
                                        "MaxBitrate": 8000000,
                                        "EntropyEncoding": "CABAC",
                                        "RateControlMode": "QVBR",
                                        "QvbrSettings": {
                                            "QvbrQualityLevel": 9
                                        },
                                        "CodecProfile": "HIGH",
                                        "MinIInterval": 0,
                                        "AdaptiveQuantization": "AUTO",
                                        "CodecLevel": "AUTO",
                                        "SceneChangeDetect": "ENABLED",
                                        "QualityTuningLevel": "SINGLE_PASS",
                                        "UnregisteredSeiTimecode": "DISABLED",
                                        "GopSizeUnits": "SECONDS",
                                        "ParControl": "INITIALIZE_FROM_SOURCE",
                                        "NumberBFramesBetweenReferenceFrames": 3,
                                        "RepeatPps": "DISABLED",
                                        "DynamicSubGop": "ADAPTIVE"
                                    }
                                }
                            },
                            "NameModifier": "_8Mbps"
                        }
                    ],
                    "OutputGroupSettings": {
                        "Type": "CMAF_GROUP_SETTINGS",
                        "CmafGroupSettings": {
                            "TargetDurationCompatibilityMode": "SPEC_COMPLIANT",
                            "WriteHlsManifest": "ENABLED",
                            "WriteDashManifest": "ENABLED",
                            "SegmentLength": 4,
                            "Destination": "s3://s3-bucket/destination/path/",
                            "FragmentLength": 2,
                            "SegmentControl": "SEGMENTED_FILES",
                            "WriteSegmentTimelineInRepresentation": "ENABLED",
                            "ManifestDurationFormat": "FLOATING_POINT",
                            "StreamInfResolution": "INCLUDE"
                        }
                    }
                }
            ],
            "Inputs": [
                {
                    "AudioSelectors": {
                        "Audio Selector 1": {
                            "DefaultSelection": "DEFAULT"
                        }
                    },
                    "VideoSelector": {
                        "ColorSpace": "FOLLOW",
                        "Rotate": "DEGREE_0",
                        "AlphaBehavior": "DISCARD"
                    },
                    "FilterEnable": "AUTO",
                    "PsiControl": "USE_PSI",
                    "FilterStrength": 0,
                    "DeblockFilter": "DISABLED",
                    "DenoiseFilter": "DISABLED",
                    "TimecodeSource": "ZEROBASED",
                    "FileInput": "s3://s3-bucket/sources/source.mov"
                }
            ]
        },
        "AccelerationSettings": {
            "Mode": "DISABLED"
        },
        "StatusUpdateInterval": "SECONDS_15",
        "Priority": 0,
        "HopDestinations": []
    }
}
Types: Optimization
ErrorResponse

The response payload returned for errors by any of the operations

Object
Examples:
{
    "code": "SS-20000",
    "description": "A Generic StreamSmart error occured"
}
{
    "code": "SA-10000",
    "description": "The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.",
    "details": {
        "type": "BodyProcessorException",
        "message": "[Bad Request] Validation error for body application/json: Input doesn't match one of allowed values of enum: [DOLBY_VISION_METADATA, AUDIO]",
        "causeType": "ValidationExceptionImpl",
        "causeMessage": "Input doesn't match one of allowed values of enum: [DOLBY_VISION_METADATA, AUDIO]",
        "actualContentType": "application/json",
        "errorType": "VALIDATION_ERROR",
        "invalidInputScope": "/subjectAssets/0/sidecars/0/type",
        "invalidInputKeyword": "enum",
        "invalidInput": "XYZ"
    }
}
code
string required

The code for the error

Enumeration:
SA-10000

Generic StreamAware error code

SS-20000

Generic StreamSmart error code

SS-20001

Parsing Error

SS-20002

Invalid optimization specification

SS-20003

Failed to start optimization

SS-20004

Unauthorized encoder

SS-20005

Invalid token provided

Example:
SA-10000
description
string required

A description of the error

Example:
Generic StreamAware error code
details
Object nullable

Additional details for the error

Example:
{
    "type": "BodyProcessorException",
    "message": "[Bad Request] Validation error for body application/json: Input doesn't match one of allowed values of enum: [DOLBY_VISION_METADATA, AUDIO]",
    "causeType": "ValidationExceptionImpl",
    "causeMessage": "Input doesn't match one of allowed values of enum: [DOLBY_VISION_METADATA, AUDIO]",
    "actualContentType": "application/json",
    "errorType": "VALIDATION_ERROR",
    "invalidInputScope": "/subjectAssets/0/sidecars/0/type",
    "invalidInputKeyword": "enum",
    "invalidInput": "XYZ"
}
Responses: 400 403 500 503
FFmpegConfig

Represents the configuration to apply when producing an optimized rendition using the FFmpeg encoder.

IMAX Stream™ supports a number of FFmpeg encoding strategies including:

  • Single-pass constant rate factor (CRF)
  • Single-pass variable bitrate (VBR)
  • Multi-pass variable bitrate (VBR)

For full examples of optimizing the FFmpeg encoder, please see the examples provided on the /optimizations endpoint.

Object
Example:
{
    "type": "FFmpegConfig",
    "encodes": [
        {
            "command": [
                "ffmpeg -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -b:v 4500k -maxrate 4500k -bufsize 6000k -an {OUTPUT_LOCATION}"
            ],
            "outputLocation": {
                "assetUri": "s3://videos/example/output/encoded_video.mp4"
            }
        }
    ]
}
type
string required read-only

Must be "FFmpegConfig".

Min length: 12
Max length: 12
Pattern: ^FFmpegConfig$
Default:
FFmpegConfig
Example:
FFmpegConfig
encodes
Array of Encode required

An array of one or more (FFmpeg) encodes to apply to the input asset. If you are producing a single encoded video, the array would contain a single encode; if you are producing a ladder of encoded videos, it would contain multiple encodes.

Min items: 1
Unique items: YES
Examples:
[
    {
        "command": [
            "ffmpeg -r 24 -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -crf 23 -maxrate 4500k -bufsize 6000k -an {OUTPUT_LOCATION}"
        ],
        "outputLocation": {
            "assetUri": "s3://videos/example/output/encoded_video.mp4"
        }
    }
]
[
    {
        "command": [
            "ffmpeg -r 24 -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -crf 23 -maxrate 4500k -bufsize 6000k -an {OUTPUT_LOCATION}"
        ],
        "outputLocation": {
            "assetUri": "s3://videos/example/output/output1.mp4"
        }
    },
    {
        "command": [
            "ffmpeg -r 24 -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -crf 25 -maxrate 4500k -bufsize 6000k -an {OUTPUT_LOCATION}"
        ],
        "outputLocation": {
            "assetUri": "s3://videos/example/output/output2.mp4"
        }
    },
    {
        "command": [
            "ffmpeg -r 24 -i {INPUT_LOCATION} -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=48:keyint_min=48:scenecut=0\" -profile:v high -level:v 4.1 -preset slow -crf 27 -maxrate 4500k -bufsize 6000k -an {OUTPUT_LOCATION}"
        ],
        "outputLocation": {
            "assetUri": "s3://videos/example/output/output3.mp4"
        }
    }
]
FpsMismatchParameters

Configures the FPS and scan type quality check. When the detected FPS or scan type differs from the probed FPS or scan type, an “fps-mismatch” event is fired. The event is also fired when the stream frame rate (if detected by the demuxer) differs from the measured FPS.

Object
Example:
{
    "allowed": "30i,60p",
    "enablePsfDetection": false,
    "commonParameters": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    }
}
allowed
string

A comma-separated list of allowed FPS and scan type combinations, such as 30i or 60p. If empty/unspecified, everything is allowed. When the detected FPS/scan combination is not one of the allowed ones, an “fps-not-allowed” event is fired.

Example:
30i,60p
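A sketch of how the `allowed` list could be interpreted client-side: a comma-separated set of FPS/scan combinations, where an empty value permits everything. The parsing shown here is an assumption for illustration; the actual check is performed by the analyzer.

```python
def is_allowed(detected: str, allowed: str) -> bool:
    """Return True if the detected FPS/scan combination is permitted."""
    if not allowed.strip():
        return True  # empty/unspecified: everything is allowed
    permitted = {entry.strip() for entry in allowed.split(",")}
    return detected in permitted

# is_allowed("30i", "30i,60p") -> True
# is_allowed("24p", "30i,60p") -> False (would fire "fps-not-allowed")
```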
enablePsfDetection
boolean

Enables detection of “bad” interlaced videos created from PsF sources. This detection requires additional time and processing cycles and may not be 100% accurate.

Default:
false
commonParameters

Common quality check configuration parameters.

Example:
{
    "enabled": true,
    "duration": 2.5,
    "skipStart": 1.25,
    "skipEnd": 1.25
}
FrameIndex

An identifier which can be used to uniquely identify a single frame within a video asset: the frame's index/number within the sequential list of frames that constitute the asset.

Object
Example:
{
    "type": "FrameIndex",
    "value": 1200
}
type
string required read-only

Captures the schema type for use in oneOf semantics.

Min length: 10
Max length: 10
Pattern: ^FrameIndex$
Default:
FrameIndex
Example:
FrameIndex
value
integer int64 required

Captures the frame index value.

Min: 1
Example:
1200
deinterlacingIndex
integer

If the video asset is being deinterlaced by frame (i.e. FrameIndex or FrameTime and not PTS) then this index tells the system whether it should seek to the first or second deinterlaced frame for the desired frame. This value is rarely needed and only useful in the context of a full-reference analysis and under certain scan type and frame rate combinations. Please consult your IMAX contact for more details.

Min: 1
Max: 2
Example:
1
FrameRequest

The request body for any request to create a frame and/or map.

Object
Example:
{
    "type": "FrameRequest",
    "asset": {
        "name": "Big_Buck_Bunny_1080p@5000kbps.m3u8",
        "path": "/mnt/nas/videos",
        "storageLocation": {
            "type": "S3",
            "name": "/videos"
        },
        "streamIdentifier": {
            "type": "HLSVariantIdentifier",
            "bandwidth": 4997885,
            "fallbackStreamIndex": 1
        }
    },
    "startFrame": {
        "type": "FrameIndex",
        "value": 1200
    },
    "additionalFrames": 24
}
type
string required read-only

Captures the schema type for use in oneOf semantics.

Min length: 12
Max length: 12
Pattern: ^FrameRequest$
Default:
FrameRequest
Example:
FrameRequest
asset
Asset required

The video asset for which you want to create the frame capture or map.

startFrame
One of required

The frame at which to start capturing.

Example:
{
    "type": "FrameIndex",
    "value": 1200
}

The frame at which to start capturing.

additionalFrames
integer

The number of additional frames after startFrame for which frame captures (or maps) will be automatically generated and cached. Decoding video, extracting frames and building maps can be expensive operations. Use this value to capture and cache a number of frames following startFrame to support faster subsequent look-ahead request-response exchanges (i.e. useful in scroll forward functionality).

Min: 0
Max: 300
Default:
0
Example:
24
FrameTime

An identifier which can be used to uniquely identify a single frame within a video asset and is structured as a hybrid time-frame format where:

  • HH is two-digit hour (00-23);
  • MM is two-digit minute (00-59);
  • SS is two-digit second (00-59);
  • and FF is frame number within the second; varies depending on asset frames per second (FPS).
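The hybrid time-frame format above can be derived from a zero-based frame number and the asset's FPS. This is a hypothetical helper for illustration; drop-frame timecode and fractional frame rates are out of scope for this sketch.

```python
def frame_time(frame_number: int, fps: int) -> str:
    """Convert a zero-based frame number at an integer FPS to HH:MM:SS:FF."""
    seconds, ff = divmod(frame_number, fps)   # FF = frame within the second
    minutes, ss = divmod(seconds, 60)
    hh, mm = divmod(minutes, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# At 24 fps, frame 49653 falls 21 frames into second 2068 (34 min 28 s):
# frame_time(49653, 24) == "00:34:28:21"
```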
Object
Example:
{
    "type": "FrameTime",
    "value": "00:34:28:21",
    "deinterlacingIndex": 1
}
type
string required read-only

Captures the schema type for use in oneOf semantics.

Min length: 9
Max length: 9
Pattern: ^FrameTime$
Default:
FrameTime
Example:
FrameTime
value
string required

Captures the frame time value.

Min length: 7
Pattern: ^([01]?\d|2[0-3]):([0-5]?\d):([0-5]?\d):(0{1}[1-9]{1}|[1-9]*)$
Examples:
01:56:59:05
00:34:28:21
deinterlacingIndex
integer

If the video asset is being deinterlaced by frame (i.e. FrameIndex or FrameTime and not PTS) then this index tells the system whether it should seek to the first or second deinterlaced frame for the desired frame. This value is rarely needed and only useful in the context of a full-reference analysis and under certain scan type and frame rate combinations. Please consult your IMAX contact for more details.

Min: 1
Max: 2
Example:
1
FreezeFrameParameters

Captures the supported parameter values for the freeze frame quality check.

Object
Example:
{
    "enabled": true,
    "sensitivity": 75,
    "duration": 2.5,
    "skipStart": 1.25,
    "skipEnd": 1.25
}
enabled
boolean

Controls whether the freeze frame quality check is enabled

Default:
true
Example:
true
sensitivity
integer

The sensitivity of the freeze frame detector, from 1-100. Larger values (closer to 100) correspond to a more sensitive detector, meaning more events, and potentially more false positives, will be detected. Smaller values (closer to 1) correspond to a less sensitive detector, meaning fewer events will be detected. Lowering the sensitivity usually results in fewer false positives at the cost of potentially more false negatives (true freeze frame events reported as unimpaired video). The default value is 50.

Min: 1
Max: 100
Default:
50
Example:
75
duration
number

The number of consecutive seconds after which a freeze frame event will be reported as a quality check failure. The default value is 10s.

Example:
2.5
skipStart
number

The number of seconds to ignore at the start of the asset. This value can be used to skip a portion at the start of the asset in order to eliminate unwanted quality check failures. The default value is 0s.

Example:
1.25
skipEnd
number

The number of seconds to ignore at the end of the asset. This value can be used to skip a portion at the end of the asset in order to eliminate unwanted quality check failures. The default value is 0s.

Example:
1.25
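A sketch of how the `duration`, `skipStart`, and `skipEnd` parameters above interact: a freeze event is reported as a failure only if the portion of it that falls inside the analysed window lasts at least `duration` seconds. The event model (start/end pairs in seconds) is an assumption for illustration, not part of the API.

```python
def failing_freezes(events, asset_length, duration=10.0, skip_start=0.0, skip_end=0.0):
    """Return freeze events that would be reported as quality check failures."""
    window_start, window_end = skip_start, asset_length - skip_end
    failures = []
    for start, end in events:
        # Clip the event to the analysed window, then apply the duration threshold
        start, end = max(start, window_start), min(end, window_end)
        if end - start >= duration:
            failures.append((start, end))
    return failures

# With duration=2.5 and skipStart=1.25 on a 60 s asset: a freeze spanning
# 0.0-3.0 is clipped to 1.25-3.0 (1.75 s) and passes; 10.0-13.0 (3.0 s) fails.
```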
FreezeFrameSegmentParameters
Object
Example:
{
    "include": true,
    "duration": 2.5,
    "sensitivity": 75
}
include
boolean

Whether or not to include this segment type in the content layout timeline

Default:
true
duration
number

The minimum duration in seconds for freeze frame segments to be included in the content layout timeline. Freeze frame segments shorter than the specified duration will be treated as motion video.

Example:
2.5
sensitivity
integer

The sensitivity of the freeze frame detector, from 1-100. Larger values (closer to 100) correspond to a more sensitive detector, meaning more events, and potentially more false positives, will be detected. Smaller values (closer to 1) correspond to a less sensitive detector, meaning fewer events will be detected. Lowering the sensitivity usually results in fewer false positives at the cost of potentially more false negatives (true freeze frame events reported as unimpaired video). The default value is 50.

Default:
50
Example:
75
FullReferenceFrameRequest

The request body for any full-reference request to create a frame and/or map. To create a quality map, you must use a full-reference request.

Object
Example:
{
    "type": "FullReferenceFrameRequest",
    "asset": {
        "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
        "path": "/mnt/nas/videos",
        "content": {
            "title": "Big Buck Bunny"
        },
        "storageLocation": {
            "type": "S3",
            "name": "/videos"
        }
    },
    "startFrame": {
        "type": "PTS",
        "value": 1400
    },
    "reference": {
        "name": "Big_Buck_Bunny.mp4",
        "path": "/mnt/nas/videos/sources",
        "storageLocation": {
            "type": "S3",
            "name": "/videos"
        }
    },
    "referenceStartFrame": {
        "type": "PTS",
        "value": 1200
    },
    "additionalFrames": 24
}
type
string required read-only

Captures the schema type for use in oneOf semantics.

Min length: 25
Max length: 25
Pattern: ^FullReferenceFrameRequest$
Default:
FullReferenceFrameRequest
Example:
FullReferenceFrameRequest
asset
Asset required

The subject asset in the context of the full-reference request.

startFrame
One of required

The frame in the subject asset at which to start capturing.

Example:
{
    "type": "PTS",
    "value": 1400
}
reference
Asset required

The reference video asset to be used when creating a full-reference request. Remember that a quality map image for a given asset requires access to the original reference asset in order to calculate and show the spatial distribution of impairments between the two frames.

referenceStartFrame
One of

The reference frame at which to start capturing. This value is only needed when the corresponding frame values differ between reference and subject assets (i.e. there is temporal misalignment).

Example:
{
    "type": "PTS",
    "value": 1200
}

additionalFrames
integer

The number of additional frames after startFrame for which frame captures (or maps) will be automatically generated and cached. Decoding video, extracting frames and building maps can be expensive operations. Use this value to capture and cache a number of frames following startFrame to support faster subsequent look-ahead request-response exchanges (i.e. useful in scroll forward functionality).

Min: 0
Max: 300
Default:
0
Example:
24
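As a rough illustration, the request body above can be assembled programmatically. This is a minimal client-side sketch, not part of the API itself; the helper name and its parameters are illustrative.

```python
# Sketch: build a FullReferenceFrameRequest body as a plain dict, following
# the schema above. `asset` and `reference` stand in for full Asset objects.

def full_reference_frame_request(asset, reference, start_pts,
                                 reference_start_pts=None, additional_frames=0):
    """Assemble the request body described by the FullReferenceFrameRequest schema."""
    if not 0 <= additional_frames <= 300:
        raise ValueError("additionalFrames must be between 0 and 300")
    body = {
        "type": "FullReferenceFrameRequest",
        "asset": asset,
        "startFrame": {"type": "PTS", "value": start_pts},
        "reference": reference,
    }
    # referenceStartFrame is only needed when reference and subject frame
    # values differ (temporal misalignment).
    if reference_start_pts is not None:
        body["referenceStartFrame"] = {"type": "PTS", "value": reference_start_pts}
    if additional_frames:
        body["additionalFrames"] = additional_frames
    return body
```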
HLSVariantIdentifier

Only used if the video asset is HTTP Live Streaming (HLS). This type specifies which variant video stream to use. If not included, all variant streams are used.

Object
Example:
{
    "type": "HLSVariantIdentifier",
    "bandwidth": 4997885,
    "fallbackStreamIndex": 1
}
type
string required read-only

Capture the schema type for use in oneOf semantics.

Min length: 20
Max length: 20
Pattern: ^HLSVariantIdentifier$
Default:
HLSVariantIdentifier
Example:
HLSVariantIdentifier
bandwidth
integer int32 required

The bandwidth of the variant stream to be used as the subject asset (The value of the BANDWIDTH key of the corresponding EXT-X-STREAM-INF tag). If multiple variant streams with the same bandwidth exist, the first is used.

Min: 1
Example:
4997885
fallbackStreamIndex
integer int32

If multiple variant streams with the same bandwidth are found in the master playlist, those after the first are treated as fallback streams for that variant. The second stream with the same bandwidth has fallback index 0.

Example:
1
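The bandwidth/fallback selection rule above can be sketched as follows. This is an illustrative reimplementation, not the platform's code; `variants` stands in for the EXT-X-STREAM-INF entries of a parsed master playlist, in playlist order.

```python
# Sketch: how an HLSVariantIdentifier resolves to a variant stream.

def select_variant(variants, bandwidth, fallback_stream_index=None):
    """Return the variant matched by an HLSVariantIdentifier, or None."""
    matches = [v for v in variants if v["bandwidth"] == bandwidth]
    if not matches:
        return None
    if fallback_stream_index is None:
        return matches[0]  # first stream with that bandwidth is used
    # Streams after the first are fallbacks; the second stream with the
    # same bandwidth has fallback index 0.
    idx = fallback_stream_index + 1
    return matches[idx] if idx < len(matches) else None
```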
IAMAccessKey

Represents an IAM access key which is comprised of two parts:

  1. an access key ID (for example, AKIAIOSFODNN7EXAMPLE) and
  2. a secret access key (for example, wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY)
Object
Example:
{
    "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
    "accessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
}
accessKeyId
string required

The key identifier

Min length: 20
Max length: 20
Pattern: ^[A-Z0-9]{20}$
Example:
AKIAIOSFODNN7EXAMPLE
accessKey
string required

The secret access key value

Min length: 40
Max length: 40
Pattern: ^[A-Za-z0-9/+=]{40}$
Example:
wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
ImageSequenceParameters

Settings for using image sequences as an asset:

  • .png
  • .tif
  • .tiff
  • .dpx
  • .jpg
  • .jpeg
  • .j2c
  • .jp2
  • .jpc
  • .j2k
  • .exr

This property is required when the asset name contains a format string (%d or %0[width]d)

Object
Example:
{
    "fps": 24
}
fps
number required

Frames per second.

Min: 1
Max: 480
Examples:
24
29.7
Types: Asset
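Since ImageSequenceParameters is required exactly when the asset name contains a format string, a client may want to detect that case up front. A small sketch, with an assumed helper name:

```python
import re

# Sketch: decide whether an asset name denotes an image sequence, i.e. whether
# it contains a frame-number format string (%d or %0[width]d) as described above.
_SEQUENCE_PATTERN = re.compile(r"%(?:0\d+)?d")

def requires_image_sequence_parameters(asset_name: str) -> bool:
    return bool(_SEQUENCE_PATTERN.search(asset_name))
```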
LadderRefinementConfig

Configuration options for ladder refinement. When enabled, IMAX Stream™ will intelligently optimize the output ABR ladder of the optimization job. It does this by pruning redundant renditions from the ladder. That is, if two renditions are very close in size, one will be removed to reduce the total size of the ladder. By default, this feature is disabled and the ladder will contain all optimized outputs produced by the job.

Object
Default:
{
    "enabled": true
}
enabled
boolean

Enable ladder refinement for the optimization

Example:
true
Types: Optimization
MetadataCheck

Used to define a quality check event based on metadata validity and correctness.

Object
Examples:
{
    "type": "DOLBY_VISION"
}
{
    "type": "MAXCLL_AND_MAXFALL",
    "tolerance": 100,
    "metadataSources": [
        "CONTAINER"
    ]
}
type

The type of metadata to validate

tolerance
integer nullable

The tolerance (+/-) between the measured and metadata values before a quality check is raised

tolerance can only be specified for the following type values:

  • MAXCLL_AND_MAXFALL

For MAXCLL_AND_MAXFALL quality checks the unit is nits, and the default is 100.

Example:
100
metadataSources
Array nullable

Perform the metadata check against only the selected metadata source. Currently only used by MAXCLL_AND_MAXFALL. By default all metadata sources are checked (if present).

Min items: 1
Unique items: YES
Example:
[
    "PLAYLIST"
]
string
Enumeration:
CONTAINER

Validate the container metadata against the measured. Raise a quality check on mismatch, or if no MaxFALL/MaxCLL container metadata was detected.

DOLBY_VISION_METADATA

Validate the Dolby Vision metadata against the measured. Raise a quality check on mismatch, or if no MaxFALL/MaxCLL Dolby Vision metadata was detected. If the video does not have any Dolby Vision metadata (side car or embedded), then this check is ignored.

PLAYLIST

Validate the IMF CPL metadata against the measured. Raise a quality check on mismatch, or if no MaxFALL/MaxCLL CPL metadata was detected. If the video is not an IMF video (submitted via the CPL XML), then this check is ignored.

MetadataCheckType

The type of metadata to register a quality check definition for

string
Enumeration:
MAXCLL_AND_MAXFALL

Validate that metadata from the container/CPL is consistent and matches measured light level values

Example:
DOLBY_VISION
MP4ComplianceChecks

QuickTime container compliance checks.

Object
Example:
{
    "enabled": true,
    "duration": false,
    "durationThreshold": 0.5,
    "audioDescriptors": false,
    "videoDescriptors": false,
    "timecodeDescriptors": false
}
enabled
boolean

Enable/disable mp4 compliance checks.

Default:
true
Example:
true
duration
boolean

Enable/Disable duration check.

Default:
false
Example:
false
durationThreshold
number

Maximum absolute allowable difference in fractional seconds between the duration calculated from the track properties and the duration calculated from sample timestamps within the track.

Default:
0.5
Example:
0.5
audioDescriptors
boolean

Enable/Disable audio descriptor validation checks.

Default:
false
Example:
false
videoDescriptors
boolean

Enable/Disable video descriptor validation checks.

Default:
false
Example:
false
timecodeDescriptors
boolean

Enable/Disable timecode descriptor validation checks.

Default:
false
Example:
false
MXFComplianceCheck

MXF container compliance checks.

Object
Example:
{
    "enabled": true
}
enabled
boolean

Enable/disable MXF compliance check.

Default:
true
Example:
true
NewAnalysis

Represents the request body used in an analyses POST request to submit a new analysis for processing using the specified assets.

A given analysis can either be full-reference or no-reference. A full-reference analysis requires specifying both reference and subject assets, whereas a no-reference analysis requires only a subject asset. For maximum efficiency, NewAnalysis has been designed to accept multiple reference and subject assets, with each subject asset being compared individually against all reference assets in separate analyses. This flexibility allows you to create a single request to execute anything from an ad-hoc no-reference analysis to multiple encoding ladder comparisons.

The following examples are representations in table format of how the system handles multiple reference and subject assets for common analysis scenarios:


Reference Asset(s) Subject Asset(s)
GOT_S2_EP1.mov GOT_S2_EP1_libx264_1920x1080_50-0.mov
GOT_S2_EP1_libx264_1280x720_50-0.mov

Results in 2 full-reference analyses:

  • GOT_S2_EP1.mov —> GOT_S2_EP1_libx264_1920x1080_50-0.mov
  • GOT_S2_EP1.mov —> GOT_S2_EP1_libx264_1280x720_50-0.mov


Reference Asset(s) Subject Asset(s)
GOT_S2_EP1_libx264_1920x1080_50-0.mov
GOT_S2_EP1_libx264_1280x720_50-0.mov
GOT_S2_EP1_libx264_960x540_50-0.mov

Results in 3 no-reference analyses:

  • GOT_S2_EP1_libx264_1920x1080_50-0.mov
  • GOT_S2_EP1_libx264_1280x720_50-0.mov
  • GOT_S2_EP1_libx264_960x540_50-0.mov


Reference Asset(s) Subject Asset(s)
GOT_S2_EP1.mov GOT_S2_EP1_libx264_1920x1080_50-0.mov
GOT_S2_EP1.mp4 GOT_S2_EP1_libx264_1280x720_50-0.mov

Results in 4 full-reference analyses:

  • GOT_S2_EP1.mov —> GOT_S2_EP1_libx264_1920x1080_50-0.mov
  • GOT_S2_EP1.mov —> GOT_S2_EP1_libx264_1280x720_50-0.mov
  • GOT_S2_EP1.mp4 —> GOT_S2_EP1_libx264_1920x1080_50-0.mov
  • GOT_S2_EP1.mp4 —> GOT_S2_EP1_libx264_1280x720_50-0.mov

For more details on how to structure the requests and responses for the examples above, please consult the POST endpoint on the analyses resource.

Since both no-reference and full-reference analyses require a subject asset, the subjectAssets is a required attribute. For full-reference analyses, the referenceAssets is also required.
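The fan-out rule illustrated by the tables above can be sketched in a few lines. This is an illustration of the documented behavior, not the platform's implementation; asset names stand in for Asset objects.

```python
# Sketch: each subject asset is compared against every reference asset
# (full-reference); with no reference assets, each subject gets its own
# no-reference analysis.

def planned_analyses(reference_assets, subject_assets):
    if not subject_assets:
        raise ValueError("subjectAssets is required and must be non-empty")
    if not reference_assets:
        return [("no-reference", s) for s in subject_assets]
    return [("full-reference", r, s)
            for r in reference_assets for s in subject_assets]
```

For example, two references and two subjects yield the four full-reference analyses shown in the last table.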

Object
Example:
{
    "content": {
        "title": "Big Buck Bunny"
    },
    "referenceAssets": [
        {
            "name": "Big_Buck_Bunny.mp4",
            "path": "/mnt/nas/videos/sources",
            "storageLocation": {
                "name": "/videos",
                "type": "S3"
            }
        }
    ],
    "subjectAssets": [
        {
            "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
            "path": "/mnt/nas/videos",
            "storageLocation": {
                "name": "/videos",
                "type": "S3"
            }
        }
    ],
    "analyzerConfig": {
        "enableComplexityAnalysis": true,
        "enableBandingDetection": true,
        "qualityCheckConfig": {
            "enabled": true,
            "duration": 2.5,
            "skipStart": 1.25,
            "skipEnd": 1.25,
            "freezeFrame": {
                "enabled": true
            },
            "blackFrame": {
                "enabled": true
            }
        },
        "viewingEnvironments": [
            {
                "device": {
                    "name": "oled65c9pua"
                },
                "viewerType": "TYPICAL"
            }
        ],
        "framesToProcess": 240,
        "temporalAlignment": {
            "minSeconds": 5,
            "maxSeconds": 90,
            "maxSecondsHighFPS": 30
        }
    }
}
content

Metadata about the content being analyzed in this analysis. If included, content metadata will automatically be propagated to all assets in this analysis.

Example:
{
    "title": "Big Buck Bunny"
}
description
string

A description of the analysis which can be used for reference, categorization and search/filtering. This field may be deprecated in a future release of the API. As such, you are encouraged to use the content field in place of this field whenever possible as it plays a more prominent/visible role in Insights reporting.

Example:
Capturing results of transcoding.
referenceAssets
Array of Asset nullable

The reference asset against which you will compare a subject asset. This attribute is ONLY used for full-reference (FR) analyses.

Min items: 1
Unique items: YES
Example:
[
    {
        "name": "Big_Buck_Bunny.mp4",
        "path": "/mnt/nas/sources",
        "storageLocation": {
            "name": "/videos",
            "type": "S3"
        }
    }
]
subjectAssets
Array of Asset required

The subject asset(s) are the assets which you will use to compare against the reference asset (for full-reference analysis) or the asset(s) against which you will perform a no-reference analysis.

Min items: 1
Unique items: YES
Example:
[
    {
        "name": "Big_Buck_Bunny_1080p@5000kbps.mp4",
        "path": "/mnt/nas/videos",
        "storageLocation": {
            "name": "/videos",
            "type": "S3"
        }
    },
    {
        "name": "Big_Buck_Bunny_1080p@2000kbps.mp4",
        "path": "/mnt/nas/videos",
        "storageLocation": {
            "name": "/videos",
            "type": "S3"
        }
    },
    {
        "name": "Big_Buck_Bunny_7200p@1000kbps.mp4",
        "path": "/mnt/nas/videos",
        "storageLocation": {
            "name": "/videos",
            "type": "S3"
        }
    }
]
analyzerConfig

Configuration options for use by the analyzer at the analysis level. Configuration options for assets can be specified on the Asset object.

Example:
{
    "enableComplexityAnalysis": false,
    "enableBandingDetection": false,
    "qualityCheckConfig": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25,
        "freezeFrame": {
            "enabled": true,
            "duration": 2.5,
            "skipStart": 1.25,
            "skipEnd": 1.25
        }
    },
    "viewingEnvironments": [
        {
            "device": {
                "name": "oled65c9pua"
            },
            "viewerType": "TYPICAL"
        }
    ],
    "framesToProcess": 240,
    "temporalAlignment": {
        "minSeconds": 5,
        "maxSeconds": 90,
        "maxSecondsHighFPS": 30
    },
    "additionalConfigurationOptions": {
       "bandingDetectionThreshold": 40
    }
}
Optimization

Represents an encoding optimization job. Use this type when creating the body of a POST request sent to the /optimizations endpoint and processing the response.

Important

Note that IMAX Stream™ currently only supports optimizations for assets stored in S3.

Object
Example:
{
    "content": {
        "title": "Big Buck Bunny"
    },
    "input": {
        "assetUri": "s3://videos-bucket/examples/Big_Buck_Bunny.mp4"
    },
    "encoderConfig": {
        "type": "FFmpegConfig",
        "encodes": [
            {
                "command": [
                    "ffmpeg -r 24 -i {INPUT_LOCATION} -pix_fmt yuv420p -color_primaries bt709 -color_trc bt709 -colorspace bt709 -color_range mpeg -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=50:keyint_min=50:scenecut=0:stitchable=1\" -profile:v high -level:v 4.1 -b:v 5000k -maxrate 6250k -bufsize 10000k -r 24 -vf scale=1920x1080 -an {OUTPUT_LOCATION}"
                ],
                "outputLocation": {
                    "assetUri": "s3://videos-bucket/examples/output/encoded_video.mp4"
                }
            }
        ]
    }
}
content
Content required

Metadata about the content contained in this asset.

Example:
{
    "title": "Big Buck Bunny"
}
input

The input video for which to provide the optimized encoding.

If the codec configuration that you choose (i.e. see encoderConfig below) supports templating, the assetUri for this input will be made available as the variable {INPUT_LOCATION} for use in the encoder configuration.

Important

Note that IMAX Stream™ currently only supports optimizations for assets stored in S3.

Note that in an optimization context, the system will ignore any properties on the input that do not apply (e.g. regionOfInterest, qualityCheckConfig, imageSequenceParameters and audio).

Example:
{
    "assetUri": "s3://videos-bucket/examples/Big_Buck_Bunny.mp4"
}

The location to write the job’s output files. Currently, this location must be a path in S3.

This field is only supported for Elemental MediaConvert jobs. When given, we set the MediaConvert job’s Destination field (in OutputGroup/OutputGroupSettings) to this value.

Example:
{
    "assetUri": "s3://videos-bucket/test/outputs/"
}
encoderConfig
EncoderConfig required

The encoder and the configuration used when producing the encoded video(s). Depending on your choice of encoder, you may have many configuration options available. The goal here is to supply IMAX Stream™ with the same configuration that you would use to produce your encoded version, which will serve as the baseline for the optimization process.

ladderRefinement

Configuration options for ladder refinement. Enable ladder refinement to have IMAX Stream™ intelligently optimize the output ABR ladder of this optimization job. This feature is disabled by default.

Example:
{
    "enabled": true
}
allowOverwrites
boolean

By default, the Optimization will overwrite existing files when it writes to the output location. To prevent overwrites, set this flag to false. When false, the Optimization will fail if it tries to overwrite any existing file.

Default:
true
additionalConfigurationOptions
Object nullable

Additional (undocumented) configuration options for use with the optimization algorithms.

Please consult your IMAX representative for more details on the applicability of this object for your use case(s).

Example:
{
    "key1": "value1",
    "key2": "value2"
}
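The `{INPUT_LOCATION}` and `{OUTPUT_LOCATION}` template variables mentioned above might be resolved as sketched below. The variable names come from this document; the substitution helper itself is purely illustrative.

```python
# Sketch: resolve encoder-config template variables against concrete S3 URIs.

def resolve_command(command: str, input_uri: str, output_uri: str) -> str:
    # Per the note above, only assets stored in S3 are currently supported.
    if not input_uri.startswith("s3://"):
        raise ValueError("IMAX Stream currently only supports assets stored in S3")
    return (command
            .replace("{INPUT_LOCATION}", input_uri)
            .replace("{OUTPUT_LOCATION}", output_uri))
```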
OptimizationPatchRequest

The request body used when updating an optimization.

Currently, the system supports only the following update operations:

  1. Cancelling an existing optimization

    Note

    Only optimizations that are currently in progress (i.e. scheduled, estimating, aligning, analyzing) can be cancelled

Object
Example:
{
    "status": "CANCELLED"
}
status
string
Enumeration:
CANCELLED

Cancels a running optimization

OptimizationResponse

Represents the response from a successful POST to the /optimizations endpoint. Use the id on the optimization to fetch the results and information about the rendition’s quality and/or bitrate savings from Insights.

All of
Example:
{
    "id": "04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e",
    "content": {
        "title": "Big Buck Bunny"
    },
    "input": {
        "assetUri": "s3://videos/examples/Big_Buck_Bunny.mp4"
    },
    "encoderConfig": {
        "type": "FFmpegConfig",
        "encodes": [
            {
                "command": [
                    "ffmpeg -r 24 -i {INPUT_LOCATION} -pix_fmt yuv420p -color_primaries bt709 -color_trc bt709 -colorspace bt709 -color_range mpeg -c:v libx264 -x264-params \"ref=3:bframes=3:b_adapt=2:keyint=50:keyint_min=50:scenecut=0:stitchable=1\" -profile:v high -level:v 4.1 -b:v 5000k -maxrate 6250k -bufsize 10000k -r 24 -vf scale=1920x1080 -an {OUTPUT_LOCATION}"
                ],
                "outputLocation": {
                    "assetUri": "s3://videos/examples/output/encoded_video.mp4"
                }
            }
        ]
    },
    "submissionTimestamp": "2018-01-01T14:20:22Z"
}
Object
id
string uuid required

The UUID that represents the analysis that was done as part of the optimization process.

Min length: 36
Max length: 36
Example:
04a2e841-9c9e-4f50-9f9f-4a8b847f5b3e
submissionTimestamp
string date-time required

The UTC timestamp (using ISO-8601 representation) recording when the optimization was successfully submitted for processing. Optimizations that fail to submit correctly will not have a value for this attribute.

Example:
2018-01-01T14:20:22Z
OutputLocation

The location into which the system will save the optimized asset(s). Whether existing files at the output location may be overwritten is controlled by the optimization's allowOverwrites flag.

There are two supported formats. Either specify assetUri or both of name and storageLocation. See examples.

Object
Examples:
{
    "assetUri": "s3://reference-assets/example/output/path/encoded_video.mp4"
}
{
    "name": "example/output/path/encoded_video.mp4",
    "storageLocation": {
        "type": "S3",
        "name": "reference-assets"
    }
}
assetUri
string

A URI describing the location of the video asset, of the form

storageLocationType://storageLocationName/path/name

Either this field or both of name and storageLocation must be provided.

Any special characters, like space or hash, must be percent-encoded. For example, an S3 object with key my video#001.mp4 should be given as s3://my-bucket/mypath/my%20video%23001.mp4.

Examples:
s3://my-bucket-name/test/Big_Buck_Bunny_480p.mp4
s3://my-bucket-name/test/My%20Video.mp4
name
string

To uniquely specify an asset location, either the assetUri field or both of name and storageLocation must be provided.

The full path and/or key for the video asset.

Min length: 1
Example:
example/output/path/encoded_video.mp4
storageLocation

To uniquely specify an asset location, either the assetUri field or both of name and storageLocation must be provided.

The storage location for the video asset.

Example:
{
    "type": "S3",
    "name": "test-bucket"
}
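The two equivalent formats above (assetUri versus name plus storageLocation), together with the percent-encoding rule, can be sketched with standard-library helpers. The helper names are illustrative, not part of the API.

```python
from urllib.parse import quote, unquote, urlparse

# Sketch: convert between the two OutputLocation formats shown above,
# percent-encoding special characters (space, hash) in the assetUri form.

def to_asset_uri(name: str, storage_type: str, storage_name: str) -> str:
    return f"{storage_type.lower()}://{storage_name}/{quote(name)}"

def from_asset_uri(asset_uri: str) -> dict:
    parts = urlparse(asset_uri)
    return {"name": unquote(parts.path.lstrip("/")),
            "storageLocation": {"type": parts.scheme.upper(),
                                "name": parts.netloc}}
```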
PSEHardingTest

Configure Photosensitive Epilepsy Harding Tests

  • Red Flash Detection
  • Luminance Flash Detection
  • Spatial Pattern Detection
  • Extended Failures
Object
Example:
{
    "enabled": true,
    "duration": 2.5,
    "skipStart": 1.25,
    "skipEnd": 1.25,
    "extendedFailure": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    },
    "luminanceFlash": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    },
    "redFlash": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    },
    "spatialPattern": {
        "enabled": true,
        "duration": 2.5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    },
    "standard": "ITU_R_BT_1702_2"
}
enabled
boolean

Enable all PSE Harding Tests. Can be overridden for individual tests.

Default:
false
Example:
true
duration
number

The number of consecutive seconds after which the associated condition is considered to have failed its respective check. The default value is 0s (fail as soon as first detection is raised).

Example:
2.5
skipStart
number

The number of seconds to ignore at the start of the asset. This value can be used to skip or ignore some portion at the start of asset in order to eliminate unwanted quality check failures. The default value is 0s.

Example:
1.25
skipEnd
number

The number of seconds to ignore at the end of the asset. This value can be used to skip or ignore some portion at the end of asset in order to eliminate unwanted quality check failures. The default value is 0s.

Example:
1.25
extendedFailure

Configuration options for extended failure detection.

luminanceFlash

Configuration options for luminance flash detection.

redFlash

Configuration options for red flash detection.

spatialPattern

Configuration options for spatial pattern detection.

standard
string

The standard to use for the Flash and Pattern Analyzer (FPA)

Enumeration:
OFCOM

Ofcom

NAB2006

NAB 2006

ITU_R_BT_1702_1

ITU-R BT.1702-1

ITU_R_BT_1702_2

ITU-R BT.1702-2

JAPAN_HDR

Japan HDR

Default:
ITU_R_BT_1702_2
PTS

An identifier which can be used to uniquely identify a single frame within a video asset. The presentation timestamp (PTS) is the metadata field used to achieve synchronization of an asset’s separate elementary streams when presented to the viewer.

Object
Example:
{
    "type": "PTS",
    "value": 18542
}
type
string required read-only

Capture the schema type for use in oneOf semantics.

Min length: 3
Max length: 3
Pattern: ^PTS$
Default:
PTS
Example:
PTS
value
integer int64 required

Captures the PTS value.

Example:
18542
QualityCheckConfig

Configuration options for supported video quality checks.

Object
Examples:
{
    "enabled": true,
    "duration": 2.5,
    "skipStart": 1.25,
    "skipEnd": 1.25
}
{
    "enabled": false,
    "freezeFrame": {
        "enabled": true,
        "duration": 5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    },
    "blackFrame": {
        "enabled": true,
        "duration": 5,
        "skipStart": 1.25,
        "skipEnd": 1.25
    }
}
enabled
boolean

Enable detection of all video quality check events. Can be overridden for individual detections.

Example:
true
duration
number

The number of consecutive seconds after which all included and enabled video and audio quality checks are considered to have failed their respective checks. Can be overridden for individual detections.

Example:
2.5
skipStart
number

The number of seconds to ignore at the start of the asset. Applies to all included and enabled video and audio quality checks. Can be overridden for individual detections.

Example:
1.25
skipEnd
number

The number of seconds to ignore at the end of the asset. Applies to all included and enabled video and audio quality checks. Can be overridden for individual detections.

Example:
1.25

freezeFrame

Configuration options for freeze frame detection.

blackFrame

Configuration options for black frame detection.

solidColorFrame

Configuration options for solid color frame detection.

colorBarFrame

Configuration options for color bars detection.

missingCaptions

Configuration options for missing captions detection.

Deprecated. Configuration options for audio silence detection.

Important Note: As of version 2.21.0, this schema has been deprecated and it is no longer recommended to configure an analysis level audio silence quality check. Please refer to the asset level audio check configuration Audio to perform silence quality checks in this version and future releases.

Configuration options for bitstream FPS and scan type mismatch detection

multipleCadences
boolean

Enable a quality check for detection of multiple cadence patterns within an asset

Example:
true
brokenCadence
boolean

Enable a quality check for detection of frames with a broken cadence

Example:
true
allowedCadences
Array nullable

Enable a quality check for allowed cadences. Provide a list of cadences that are allowed to be present in the video.

Example:
[
    "2:3",
    "2:2"
]
string
Min length: 3
Pattern: ^[1-9](:[1-9])+$
Examples:
1:2
1:1:1:2
2:2
2:3
2:2:2:4
2:2:3:3
2:2:3:2:3
2:2:2:2:2:2:2:2:2:2:3
3:3
4:4
5:5
6:5
7:8
contentSimilarity

Configuration options for content similarity detection.

Example:
{
    "enabled": true,
    "sensitivity": 75
}
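Entries in allowedCadences must match the schema pattern `^[1-9](:[1-9])+$`. A minimal client-side validation sketch (the helper name is illustrative):

```python
import re

# Sketch: validate allowedCadences entries against the documented pattern
# before submitting a request.
_CADENCE = re.compile(r"^[1-9](:[1-9])+$")

def valid_cadence(value: str) -> bool:
    return bool(_CADENCE.match(value))
```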
QualityCheckParameters

Captures the supported parameter values for any video or audio quality check.

Object
Example:
{
    "enabled": "true",
    "duration": 2.5,
    "skipStart": 1.25,
    "skipEnd": 1.25
}
enabled
boolean required

Controls whether the associated video or audio quality check is enabled.

Default:
true
Example:
true
duration
number

The number of consecutive seconds after which the associated condition is considered to have failed its respective check. For all video quality checks (i.e. black frames, solid color frames, freeze frames and color bar frames) the default value is 10s. For closed captions quality checks (i.e. missing captions) the default value is 60s.

Example:
2.5
skipStart
number

The number of seconds to ignore at the start of the asset. This value can be used to skip or ignore some portion at the start of asset in order to eliminate unwanted quality check failures. The default value is 0s.

Example:
1.25
skipEnd
number

The number of seconds to ignore at the end of the asset. This value can be used to skip or ignore some portion at the end of asset in order to eliminate unwanted quality check failures. The default value is 0s.

Example:
1.25
QualityDelta

An object that can be used to capture various configuration options that apply to the optimization algorithms.

Please consult your IMAX representative for more details on the applicability of this object for your use case(s).

Object
Example:
{
    "x1": 30,
    "x2": 50,
    "y1": 0,
    "y2": 2.5
}
x1
number
Example:
30
x2
number
Example:
50
y1
number
Example:
0
y2
number
Example:
2.5
RawVideoParameters

Settings needed to decode raw video with the following extensions:

  • .yuv,
  • .rgb,
  • .bgr,
  • .v210, or
  • .raw.
Object
Example:
{
    "resolution": {
        "width": 720,
        "height": 576
    },
    "fps": 25,
    "scanType": "P",
    "fieldOrder": "TFF",
    "pixelFormat": "YUV420P"
}
resolution
Resolution required

Resolution of the asset specified as width and height in pixels

Example:
{
    "width": 1920,
    "height": 1080
}
fps
number

Frames per second

Min: 1
Max: 480
Default:
25
scanType
string

Scan Type

Enumeration:
I

Interlaced

P

Progressive

Default:
P
fieldOrder
string

Field Order

Enumeration:
TFF

Top Field First

BFF

Bottom Field First

Default:
TFF
pixelFormat
string

The pixel format

Enumeration:
YUV420P
YUV422P
YUV444P
YUV420P10
YUV422P10
YUV444P10
YUV420P12
YUV422P12
YUV444P12
RGB24
RGB48
BGR24
BGR48
RGB555
RGB565
BGR555
BGR565
UYVY422
V210
Default:
YUV420P
Types: Asset
RegionOfInterest

Specification for the region of interest

Object
Example:
{
    "originX": 20,
    "originY": 0,
    "regionHeight": 300,
    "regionWidth": 400
}
originX
integer required

x coordinate for region of interest origin

originY
integer required

y coordinate for region of interest origin

regionHeight
integer required

height in pixels of the region of interest

Min: 88
Multiple Of: 2
regionWidth
integer required

width in pixels of the region of interest

Min: 88
Multiple Of: 2
Types: Asset
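The constraints above can be checked client-side before submission. This sketch also assumes, though the schema does not state it explicitly, that the region must fit within the asset's Resolution.

```python
# Sketch: validate a RegionOfInterest against the documented constraints
# (min 88 px, even dimensions) and the asset's Resolution (assumed bound).

def validate_roi(roi: dict, resolution: dict) -> bool:
    if roi["regionWidth"] < 88 or roi["regionHeight"] < 88:
        raise ValueError("region dimensions must be at least 88 pixels")
    if roi["regionWidth"] % 2 or roi["regionHeight"] % 2:
        raise ValueError("region dimensions must be multiples of 2")
    if (roi["originX"] + roi["regionWidth"] > resolution["width"]
            or roi["originY"] + roi["regionHeight"] > resolution["height"]):
        raise ValueError("region of interest exceeds asset resolution")
    return True
```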
Resolution

A width and a height in pixels that specify the resolution of an asset

Object
Example:
{
    "width": 1920,
    "height": 1080
}
width
integer required

Width of the video in pixels

Min: 88
Max: 8,192
Example:
1920
height
integer required

Height of the video in pixels

Min: 88
Max: 6,144
Example:
1080
S3BucketCredentials

Credentials for accessing an AWS Amazon S3 bucket

Object
Example:
{
    "bucketName": "mybucket",
    "accessKey": {
        "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
        "accessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
    }
}
bucketName
string required

AWS Amazon S3 bucket name.

Min length: 3
Max length: 63
Pattern: (?!(^xn--|.+-s3alias$))^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$
Example:
mybucket
accessKey
IAMAccessKey required

The AWS IAM access key that grants read permissions to the associated Amazon S3 bucket.

Example:
{
    "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
    "accessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
}
ScoreCheck

Used to define a quality check event based on scores over a window of time. Quality check failure events are generated when scores of the specified type exceed threshold for at least eventDuration continuous seconds.

Important

A viewingEnvironments array in the AnalyzerConfig object must be specified in order to use score-based quality checks on SVS, EPS and SBS metrics.

Object
Example:
{
    "metric": "SVS",
    "threshold": 80,
    "durationSeconds": 5,
    "skipStart": 1.25,
    "skipEnd": 1.25,
    "viewingEnvironmentIndex": 0
}
metric
ScoreCheckType required

The type of score to check for. Several restrictions apply regarding where each can be used:

  • SVS, SBS and LUMINANCE can be applied to both Source (Reference) and Output (Test/Subjects), whereas EPS and CVD are only applicable to Output (Test/Subjects) assets
  • EPS and CVD can only be used in a full-reference analysis
Examples:
SVS
EPS
SBS
CVD
MIN_PIXEL_LUMINANCE
MAX_PIXEL_LUMINANCE
MIN_FRAME_LUMINANCE
MAX_FRAME_LUMINANCE
threshold
integer required

The threshold that scores must exceed for eventDuration seconds for an event to trigger. Scores lower than the threshold for SVS, EPS, MIN_FRAME_LUMINANCE and MIN_PIXEL_LUMINANCE, and higher than the threshold for SBS, CVD, MAX_FRAME_LUMINANCE and MAX_PIXEL_LUMINANCE, will cause an event to trigger. For SVS, EPS, SBS, and CVD the max threshold is 100. For MAX_FRAME_LUMINANCE and MAX_PIXEL_LUMINANCE score checks, the max threshold is 10000.

Min: 0
Max: 10,000
Example:
80
viewingEnvironmentIndex
integer

Specifies the (0-based) index of the viewing environment to use for this quality check. Required for SVS, EPS, and SBS score checks.

Example:
0
durationSeconds
number

The minimum continuous duration in seconds required for the target score to exceed threshold for an event to trigger. Either durationSeconds or durationFrames must be specified, but both can not be specified simultaneously.

Example:
5
durationFrames
integer

The minimum continuous duration in frames required for the target score to exceed threshold for an event to trigger. Either durationSeconds or durationFrames must be specified, but both can not be specified simultaneously.

Min: 1
Example:
1
skipStart
number

The duration in seconds to ignore at the start of the asset. This value can be used to skip or ignore some portion at the start of asset in order to eliminate unwanted quality check failures. The default value is 0s.

Default:
0
Example:
1.25
skipEnd
number

The duration in seconds to ignore at the end of the asset. This value can be used to skip or ignore some portion at the end of asset in order to eliminate unwanted quality check failures. The default value is 0s.

Default:
0
Example:
1.25
ScoreCheckType

The type of score to register a quality check definition for. Several restrictions apply regarding where each can be used:

  • SVS, SBS and LUMINANCE can be applied to both Source (Reference) and Output (Test/Subjects), whereas EPS and CVD are only applicable to Output (Test/Subjects) assets
  • EPS and CVD can only be used in a full-reference analysis
string
Enumeration:
SVS

IMAX VisionScience Viewer Score

EPS

IMAX VisionScience Encoder Performance Score

SBS

IMAX VisionScience Banding Score

CVD

Color Volume Difference Score

MIN_PIXEL_LUMINANCE

Minimum Pixel Luminance Score

MAX_PIXEL_LUMINANCE

Maximum Pixel Luminance Score

MIN_FRAME_LUMINANCE

Minimum Frame Luminance Score

MAX_FRAME_LUMINANCE

Maximum Frame Luminance Score

Example:
SVS
Types: ScoreCheck
SegmentActivityParameters

Parameters for defining what constitutes an active segment for the purposes of constructing an active segment timeline. Allows specifying which segment types should be considered always inactive and under which audio conditions segment types should be considered active.

Object
canBeActive
boolean

If set to false, this content type will always be considered inactive

Default:
false
activeAudioChannelsDefinition

The “least active” audio content type required for a segment of this content type:

  • SILENCE: this content type will be considered active regardless of audio
  • ANY_CHANNEL_ACTIVE: this content type will be considered active if at least one audio channel is active
  • ALL_CHANNELS_ACTIVE: this content type will only be considered active if all audio channels are active
Default:
ANY_CHANNEL_ACTIVE
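The three audio-activity rules above can be sketched as a small Python function (illustrative only; the function name and the per-channel boolean representation are assumptions, not part of the API):

```python
def segment_is_active(can_be_active: bool, definition: str,
                      channel_activity: list) -> bool:
    """Apply the segment activity rules: channel_activity holds one boolean
    per audio channel indicating whether that channel is active."""
    if not can_be_active:
        return False  # this content type is always considered inactive
    if definition == "SILENCE":
        return True  # active regardless of audio
    if definition == "ANY_CHANNEL_ACTIVE":
        return any(channel_activity)
    if definition == "ALL_CHANNELS_ACTIVE":
        return all(channel_activity)
    raise ValueError(f"unknown definition: {definition}")
```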
SegmentDetectionConfig

Configuration options for specifying which content types should be reported in the content layout timeline, and under which conditions. By default, all content type segments will be included with a default minimum duration of 10 seconds.

Object
Example:
{
    "blackFrameSegments": {
        "include": true,
        "duration": 0.5
    },
    "solidColorFrameSegments": {
        "include": false
    },
    "colorBarFrameSegments": {
        "include": true,
        "duration": 1
    },
    "freezeFrameSegments": {
        "include": true,
        "duration": 0.25
    },
    "silenceDetection": {
        "threshold": -80,
        "duration": 5
    }
}
blackFrameSegments

Configuration options for black frame segments.

Example:
{
    "include": true,
    "duration": 0.5
}
solidColorFrameSegments

Configuration options for color frame segments.

Example:
{
    "include": false
}
colorBarFrameSegments

Configuration options for color bar frame segments.

Example:
{
    "include": true,
    "duration": 1
}
freezeFrameSegments

Configuration options for freeze frame segments.

silenceDetection

Configuration options for silence detection in audio segments.

Example:
{
    "threshold": -80,
    "duration": 5
}
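A hedged helper for assembling this configuration, filling in the documented 10-second default minimum duration for any included segment type that omits one (the helper is illustrative, not part of any client library):

```python
def segment_detection_config(**segments) -> dict:
    """Build a SegmentDetectionConfig payload, applying the documented
    default minimum duration of 10 seconds to included segment types
    that do not specify one."""
    config = {}
    for name, opts in segments.items():
        opts = dict(opts)  # avoid mutating the caller's dict
        if opts.get("include") and "duration" not in opts:
            opts["duration"] = 10  # documented default minimum duration
        config[name] = opts
    return config
```

For example, `segment_detection_config(colorBarFrameSegments={"include": True})` yields a colorBarFrameSegments entry with a 10-second minimum duration.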
Sidecar

A text file that accompanies a video asset and is used to provide metadata or supplemental data on the asset.

Object
Examples:
{
    "type": "DOLBY_VISION_METADATA",
    "name": "20161103_SPARKS_DOVI_METADATA_AR_CORRECT.xml"
}
{
    "type": "DOLBY_VISION_METADATA",
    "name": "20161103_SPARKS_DOVI_METADATA_AR_CORRECT.xml",
    "path": "/mnt/videos"
}
type
SidecarType required read-only

The type of the sidecar.

Example:
DOLBY_VISION_METADATA
name
string required read-only

The filename that represents the sidecar.

Min length: 1
Max length: 500
Example:
20161103_SPARKS_DOVI_METADATA_AR_CORRECT.xml
path
string read-only

The path to the sidecar file. If not supplied, the sidecar file must use the same path as its associated Asset.

Min length: 1
Max length: 500
Example:
/mnt/videos
Types: Asset
SidecarType

The type of sidecar file that accompanies the video asset.

string
Enumeration:
DOLBY_VISION_METADATA

Dolby Vision metadata in XML format

AUDIO

Audio file in WAV, BWF, MXF or MP4 container format

Example:
DOLBY_VISION_METADATA
Types: Sidecar
SoundfieldChannelMapping

Contains an ordered array of objects that define a soundfield group for measurement and quality checks. Each object specifies a source asset, track and channel index, as well as the corresponding output channel location. This collection is used to form an audio soundfield group for measurements.

Note that all input tracks in a group must have the same sample rate, sample format, bit depth, bitstream mode, timebase and duration or the analysis will fail.

Object
Example:
{
    "type": "SoundfieldChannelMapping",
    "mapping": [
        {
            "name": "myleftsoundfile.mxf",
            "path": "/path/to/file",
            "inputTrackIndex": 1,
            "inputChannelIndex": 1,
            "outputChannelLocation": "FL"
        },
        {
            "name": "myrightsoundfile.mxf",
            "path": "/path/to/file",
            "inputTrackIndex": 1,
            "inputChannelIndex": 2,
            "outputChannelLocation": "FR"
        }
    ]
}
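The payload above can be built programmatically. A minimal sketch (the helper name and tuple format are invented, not part of any client library):

```python
def soundfield_channel_mapping(channels) -> dict:
    """Build a SoundfieldChannelMapping payload from
    (name, path, track_index, channel_index, location) tuples."""
    return {
        "type": "SoundfieldChannelMapping",
        "mapping": [
            {
                "name": name,
                "path": path,
                "inputTrackIndex": track,
                "inputChannelIndex": channel,
                "outputChannelLocation": location,
            }
            for name, path, track, channel, location in channels
        ],
    }
```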
type
string required

Capture the schema type for use in oneOf semantics.

Min length: 24
Max length: 24
Pattern: ^SoundfieldChannelMapping$
Default:
SoundfieldChannelMapping
Example:
SoundfieldChannelMapping
mapping
Array required
Min items: 1
Max items: 32
Unique items: YES
Object
name
string

The filename that represents the asset containing the audio track that will be used for a specific channel.

Example:
mysoundfile.mxf
path
string

The path to the primary or sidecar asset’s file location.

Example:
/path/to/file
inputTrackIndex
integer

The source track within the asset to be used for the soundfield channel.

Min: 0
Max: 31
Example:
1
inputChannelIndex
integer

The channel index within the source track to be used for the soundfield channel.

Min: 0
Max: 31
Example:
1
outputChannelLocation

The output location that the source channel will be mapped to.

Types: AudioGroup
SoundfieldTrackMapping

A mapping of a single physical audio track, either embedded within an asset or supplied as a separate sidecar file, that will be used to represent an audio soundfield group. The channel layout described for the track can be overridden with a user-defined channel layout that uses a 1-to-1 mapping of input channels to output channels.

Object
Example:
{
    "type": "SoundfieldTrackMapping",
    "name": "mysoundfile.mxf",
    "path": "/path/to/file",
    "inputTrackIndex": 1,
    "outputChannelLayout": [
        "FL", "FR", "FC", "LFE", "SL", "SR"
    ]
}
type
string required

Capture the soundfield mapping type for use in oneOf semantics.

Min length: 22
Max length: 22
Pattern: ^SoundfieldTrackMapping$
Default:
SoundfieldTrackMapping
Example:
SoundfieldTrackMapping
name
string required

The filename that represents the asset containing the audio track that will be used for a specific channel.

Example:
mysoundfile.mxf
path
string required

The path to the asset’s file (and possibly sidecar) location within the associated storage.

Example:
/path/to/file
inputTrackIndex
integer required

The source track of the asset to be used for the soundfield channel.

Min: 0
Max: 31
Example:
1
outputChannelLayout
Array of AudioChannelType nullable

A channel layout that will override the channel layout that is described within the metadata of the track. The order in which the channels appear in the array will be mapped to the channels in the track. The number of entries in the array must equal the number of channels in the track.

Min items: 1
Max items: 32
Unique items: YES
Types: AudioGroup
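The outputChannelLayout constraints (entry count must equal the track's channel count, entries unique) can be checked before submission. A hedged sketch, assuming the track's channel count is obtained by probing the asset:

```python
def validate_output_channel_layout(layout, track_channel_count: int) -> None:
    """Check the documented outputChannelLayout constraints against a
    track's channel count (obtained elsewhere, e.g. by probing the asset)."""
    if layout is None:
        return  # fall back to the channel layout in the track metadata
    if len(layout) != track_channel_count:
        raise ValueError(
            f"outputChannelLayout has {len(layout)} entries but the track "
            f"has {track_channel_count} channels")
    if len(set(layout)) != len(layout):
        raise ValueError("outputChannelLayout entries must be unique")
```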
StorageLocation

Captures the storage location used to house one or more assets. Every asset has a storage location.

Object
Examples:
{
    "type": "S3",
    "name": "test-bucket",
    "credentials": {
        "useAssumedIAMRole": true
    }
}
{
    "type": "PVC",
    "name": "videos"
}
{
    "type": "S3",
    "name": "videos",
    "credentials": {
        "accessKey": {
            "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
            "accessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
        }
    }
}

type
StorageType required

An enumeration to capture the supported storage types.

Example:
S3
name
string

A name required to access the root of the storage location. The root of the storage location will be used with the path and name of the asset to uniquely identify the location of the asset. For Amazon S3, this value would likely be the S3 bucket name and would need to match the pattern (?!(^xn--|.+-s3alias$))^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$. For a persistent volume backed by NFS, this would likely be the volume mount name. For HTTP, this must be the server hostname.

Min length: 1
Max length: 500
Example:
videos
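The bucket-name pattern quoted above can be applied directly as a regular expression. A small sketch (the helper name is invented):

```python
import re

# The S3 bucket-name pattern from the `name` field description: a negative
# lookahead rejects names starting with "xn--" or ending in "-s3alias",
# then 3-63 lowercase alphanumeric/hyphen characters are required.
S3_NAME_PATTERN = re.compile(
    r"(?!(^xn--|.+-s3alias$))^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$")

def is_valid_s3_name(name: str) -> bool:
    return S3_NAME_PATTERN.match(name) is not None
```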
credentials
Credentials nullable

Authentication credentials for assets stored in Amazon S3.

Example:
{
    "useAssumedIAMRole": "true"
}
StorageType

An enumeration to capture the supported storage types.

string
Enumeration:
S3

Amazon S3

PVC

Any persistent volume claim that can be defined/supported in Kubernetes

HTTP

A HTTP/HTTPS server (required for HLS)

Example:
S3
StreamIdentifier

Captures additional information about the video stream(s) within the asset. Specifically, this type can be:

  • used to specify the packet identifier (see VideoPID and VideoPIDHex) for assets with multiple video streams or,
  • in the case of HLS, used to represent the HLS variant (see HLSVariantIdentifier).
One of
Examples:
{
    "type": "VideoPIDHex",
    "identifier": "0x101"
}
{
    "type": "HLSVariantIdentifier",
    "bandwidth": 4997885,
    "fallbackStreamIndex": 1
}
{
    "type": "VideoPID",
    "identifier": 1
}
Types: Asset
SystemService

Represents a (micro)service needed to support some function of the overall system.

Object
Example:
{
    "serviceName": "AnalysesService",
    "serviceId": "f533db73-0f9a-4805-9651-c5dcd519dc37",
    "deploymentId": "d8e89059-c7dd-454e-92ab-f61e4107d33b",
    "status": "READY"
}
serviceName
string required read-only

The name of the service

Min length: 1
Example:
AnalysesService
serviceId
string uuid read-only

The UUID associated with the service

Min length: 36
Max length: 36
Example:
f533db73-0f9a-4805-9651-c5dcd519dc37
deploymentId
string uuid read-only

The UUID associated with the service’s deployment within the system

Min length: 36
Max length: 36
Example:
d8e89059-c7dd-454e-92ab-f61e4107d33b
status
SystemStatusType required read-only

The status of the service

Example:
READY
message
string read-only

Any message (e.g. error or detail) that helps to clarify the state of the service when the status is not READY

Min length: 1
Max length: 255
Example:
Service did not respond quickly enough likely due to system or network load
SystemStatusResponse

The response payload of the GET on readyz, which contains the overall system readiness as well as the readiness of the individual (micro)services that comprise the system.

Object
Example:
{
    "checks": [
        {
            "deploymentId": "d8e89059-c7dd-454e-92ab-f61e4107d33b",
            "serviceId": "f533db73-0f9a-4805-9651-c5dcd519dc37",
            "serviceName": "AnalysesService",
            "status": "READY"
        },
        {
            "deploymentId" : "d2bcd3f6-79c6-43c7-9462-afa614d25176",
            "serviceId" : "eb1e4722-461f-438b-95df-7bf3c6e30989",
            "serviceName" : "AnalysisLifecycleService",
            "status" : "READY"
         }
    ],
    "outcome": "READY"
}
checks
Array of SystemService required read-only

An array of the individual readiness checks performed on the services that comprise the system.

Unique items: YES
outcome
SystemStatusType required read-only

The overall system readiness. All services that comprise the system must be READY or UNLICENSED in order for this value to be READY.

Example:
READY
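The documented aggregation rule (the system is READY only when every service reports READY or UNLICENSED) can be sketched as follows (illustrative helper, not part of the API):

```python
def overall_readiness(checks) -> str:
    """Compute the overall outcome from individual service checks:
    READY only if every service is READY or UNLICENSED."""
    if all(c["status"] in ("READY", "UNLICENSED") for c in checks):
        return "READY"
    return "NOT_READY"
```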
SystemStatusType

An enumeration to capture the supported system/service statuses.

string
Enumeration:
READY

Indicates that the system/service is operational

NOT_READY

Indicates that the system/service is not operational

UNLICENSED

Indicates that the service is not operational due to a missing or invalid license.

This state applies only to an individual service and never the entire system.

Example:
READY
TemporalAlignment

Configuration options for temporal alignment

Object
Example:
{
    "minSeconds": 5,
    "maxSeconds": 120,
    "maxSecondsHighFPS": 120
}
minSeconds
integer

The minimum duration of the misalignment between two videos in seconds

Default:
5
maxSeconds
integer

The maximum duration of the misalignment between two videos in seconds

Min: 1
Default:
90
maxSecondsHighFPS
integer

The maximum duration of the misalignment between two videos in seconds for assets with a framerate of 120 frames per second or higher

Min: 1
Default:
30
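The cap that applies depends on the asset's framerate: assets at 120 fps or higher use maxSecondsHighFPS, everything else uses maxSeconds. A one-line sketch with the documented defaults (function name invented):

```python
def max_misalignment_seconds(frame_rate: float,
                             max_seconds: int = 90,
                             max_seconds_high_fps: int = 30) -> int:
    """Select the applicable misalignment cap based on the asset framerate."""
    return max_seconds_high_fps if frame_rate >= 120 else max_seconds
```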
Version

The system version information for the deployed API.

Object
Example:
{
    "commitBranch": "stream-ondemand/release/3.1.0",
    "commitHash": "facc2ef0a3c8ebc10819dc1218748f8d2cbfafd9",
    "commitTime": "2022-05-02T18:58:44Z",
    "stamped": "true",
    "versionString": "3.1.0-12"
}
commitBranch
string required read-only

The git branch where the release was committed.

Min length: 1
Example:
stream-ondemand/release/3.1.0
commitHash
string required read-only

The hashcode associated with the release’s git commit.

Min length: 1
Example:
facc2ef0a3c8ebc10819dc1218748f8d2cbfafd9
commitTime
string date-time required read-only

The UTC timestamp associated with the release’s git commit.

Example:
2022-05-02T18:58:44Z
stamped
boolean required read-only

Indicates if the version was stamped.

Example:
true
versionString
string required read-only

The alphanumeric system version for the API.

Example:
2.14.2-12
VersionResponse

The response payload of the GET on version which contains the system version information.

Object
Example:
{
    "version": {
        "commitBranch": "stream-ondemand/release/3.1.0",
        "commitHash": "facc2ef0a3c8ebc10819dc1218748f8d2cbfafd9",
        "commitTime": "2022-05-02T18:58:44Z",
        "stamped": "true",
        "versionString": "3.1.0-12"
    }
}
version
Version required read-only

The system version information for the API.

Methods: Get the version
VideoPID

The packet identifier (PID) used to identify the video stream within the asset that you are interested in working with. In the case of a Multiple Program Transport Stream (MPTS), use this value to specify the PID of the video stream to be processed.

Object
Example:
{
    "type": "VideoPID",
    "identifier": 1
}
type
string required read-only

Capture the schema type for use in oneOf semantics.

Min length: 8
Max length: 8
Pattern: ^VideoPID$
Default:
VideoPID
Example:
VideoPID
identifier
integer int32 required

Represents the desired video packet index.

Example:
1
VideoPIDHex

The packet identifier (PID) used to identify the video stream within the asset that you are interested in working with. In the case of a Multiple Program Transport Stream (MPTS), use this value to specify the PID of the video stream to be processed.

Object
Example:
{
    "type": "VideoPIDHex",
    "identifier": "0x101"
}
type
string required read-only

Capture the schema type for use in oneOf semantics.

Min length: 11
Max length: 11
Pattern: ^VideoPIDHex$
Default:
VideoPIDHex
Example:
VideoPIDHex
identifier
string required

Represents the desired video packet index as a hexadecimal value in the form 0x101

Min length: 3
Pattern: ^0[xX][0-9a-fA-F]+$
Example:
0x101
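VideoPID and VideoPIDHex encode the same identifier in decimal and hexadecimal form. A hedged sketch converting between them (helper name invented; the regex assumes standard hexadecimal digits 0-9, a-f):

```python
import re

# Hex PID form: "0x" or "0X" followed by one or more hexadecimal digits.
HEX_PID = re.compile(r"^0[xX][0-9a-fA-F]+$")

def pid_to_hex_identifier(pid: int) -> dict:
    """Convert an integer VideoPID into the equivalent VideoPIDHex payload."""
    identifier = f"0x{pid:X}"
    assert HEX_PID.match(identifier)
    return {"type": "VideoPIDHex", "identifier": identifier}
```

For example, PID 257 corresponds to the hex identifier 0x101.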
VideoSegmentParameters
Object
include
boolean

Whether or not to include this segment type in the content layout timeline

Default:
false
duration
number

The minimum duration in seconds for this segment type to be included in the content layout timeline. Segments of this type shorter than the specified duration will be treated as motion video.

Example:
2.5
ViewerType

The viewer type for which scores will be calculated

string
Enumeration:
TYPICAL

Represents a typical, untrained viewer

EXPERT

Represents a trained viewer skilled at spotting and judging video anomalies.

STUDIO

Represents a studio viewer trained in assessing the impact of video anomalies on the creator’s artistic intent.

Default:
TYPICAL
Example:
TYPICAL
ViewingEnvironment

A specification of the environment under which the content is viewed

Object
Example:
{
    "device": {
        "name": "oled65c9pua",
        "resolution": {
            "width": 1920,
            "height": 1080
        }
    },
    "viewerType": "TYPICAL"
}
device
Device required

The display device

Example:
{
    "name": "oled65c9pua",
    "resolution": {
        "width": 1920,
        "height": 1080
    }
}
viewerType
ViewerType required

The viewer type

Default:
TYPICAL
Responses
400 400
Applied to all operations

The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.

Body
application/json
Examples

A generic parsing error occurred when accessing the /optimizations endpoint

{
    "code": "SS-10000",
    "description": "A generic parsing error occurred"
}

A generic parsing error occurred when accessing the /analyses endpoint

{
    "code": "SA-10000",
    "description": "The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.",
    "details": {
        "type": "BodyProcessorException",
        "message": "[Bad Request] Validation error for body application/json: Input doesn't match one of allowed values of enum: [DOLBY_VISION_METADATA, AUDIO]",
        "causeType": "ValidationExceptionImpl",
        "causeMessage": "Input doesn't match one of allowed values of enum: [DOLBY_VISION_METADATA, AUDIO]",
        "actualContentType": "application/json",
        "errorType": "VALIDATION_ERROR",
        "invalidInputScope": "/subjectAssets/0/sidecars/0/type",
        "invalidInputKeyword": "enum",
        "invalidInput": "XYZ"
    }
}
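A client can use the details object of a validation error to point the caller at the offending field. A minimal sketch (helper name invented; field names follow the example above):

```python
def describe_validation_error(body: dict) -> str:
    """Summarize a 400 validation error body, locating the offending
    field via invalidInputScope and its rejected value via invalidInput."""
    details = body.get("details", {})
    scope = details.get("invalidInputScope", "<unknown field>")
    value = details.get("invalidInput", "<unknown value>")
    return f"{body['code']}: invalid value {value!r} at {scope}"
```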
403 403
Applied to all operations

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

Body
application/json
404 404

The server cannot find the requested resource.

405 405
Applied to all operations

The request HTTP method is known by the server but has been disabled and cannot be used for that resource.

415 415
Applied to all operations

Used when the request asks for a content type that is not supported (e.g. XML when only JSON is supported). The IMAX Stream On-Demand Platform API currently supports only the JSON content type (i.e. application/json).

500 500
Applied to all operations

The server encountered an unexpected condition which prevented it from fulfilling the request.

Body
application/json
503 503
Applied to all operations

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay.

Body
application/json