Changes from all commits
4 changes: 1 addition & 3 deletions .gitignore
@@ -1,6 +1,4 @@
logs
*.log
coverage
node_modules
.idea/
.nyc_output/
build
4 changes: 0 additions & 4 deletions .mocharc.js

This file was deleted.

3 changes: 1 addition & 2 deletions .npmignore
@@ -1,11 +1,10 @@
.gitignore
.gitattributes
.idea
cam_tests
coverage
experiments
node_modules
test/*.map
test/*.js
examples
__test__
src
23 changes: 12 additions & 11 deletions README.md
@@ -2,10 +2,11 @@

[![Coverage Status](https://raw.githubusercontent.com/agsh/onvif/refs/heads/gh-pages-debug/badges/coverage.svg)](https://github.com/agsh/onvif/tree/v1)

ONVIF Client protocol Node.js implementation.
Node.js implementation of the ONVIF client protocol.

> [!TIP]
> This is unstable branch for version 1.x, for stable use version 0.x see [branch v0.x](https://github.com/agsh/onvif/tree/v0.x)
> This is the master branch for the in-development version 1.x. For a stable release, use version 0.x; see
> [branch v0.x](https://github.com/agsh/onvif/tree/v0.x)

This is a wrapper for the ONVIF protocol which allows you to get information about your NVT (network video transmitter)
device, its media sources, control PTZ (pan-tilt-zoom) movements and manage presets, detect devices in your network and
@@ -17,20 +18,19 @@ The library uses Node.js. And works on the server-side.
[![ONVIF](https://www.onvif.org/wp-content/themes/onvif-public/images/logo.png)](http://onvif.org)


This is a new version of the ONVIF library. Previous version was written in JS, and this the typescript library with
interfaces descbring ONVIF data structures. Right now some of the methods from the v.0.8 were implemented,
the list of supported ONVIF commands you can find here: https://github.com/agsh/onvif/blob/v1/CHECKED.md
This is a new version of the ONVIF library. While the previous version was written in JavaScript, this version is
written in TypeScript, including interfaces that describe ONVIF data structures. Currently, a subset of methods from
version 0.8 has been reimplemented, but there are more methods to be added. You can find the list of supported ONVIF
commands here: https://github.com/agsh/onvif/blob/v1/CHECKED.md

The library will be updated because other methods are currently under development.
Development is ongoing and more methods will be added over time.

The documentation for the new library was generated by typedoc and you can find it here:
https://htmlpreview.github.io/?https://github.com/agsh/onvif/blob/v1/docs/index.html

Code that uses the old version of the library (0.8.x) should work through the compatibility class:
https://github.com/agsh/onvif/blob/v1/src/compatibility/cam.ts, where all methods are located. (Currently unsupported)

Thanks a lot for your interest and I will be glad to any questions and comments!

### Interfaces
Interfaces are generated according to the latest version of the [ONVIF specification](https://github.com/onvif/specs).
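
A minimal sketch of how the generated interfaces can be used for type-checking. The import path mirrors the test files in this PR, and `Config` with its `name` field is an assumption based on the ONVIF analytics configuration type, so treat it as illustrative rather than a verified example:

```typescript
// Illustrative sketch: adjust the import path to wherever the library lives in your project.
import { Config } from './src/interfaces/onvif';

// Partial<> keeps the sketch compiling even where the generated schema marks fields as required;
// `name` is assumed to mirror the Name attribute of the ONVIF analytics Config type.
const ruleConfig: Partial<Config> = {
  name : 'MyRuleConfig',
};
```

With the generated interfaces in place, the TypeScript compiler will flag field names that do not exist in the generated type.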

@@ -134,9 +134,10 @@ Here we have required `name` field in the element item. Also parsed unknown for
[xml2js](https://github.com/Leonidas-from-XIV/node-xml2js).

A reasonable question: why? Why do we have this here? The answer is simple. When we need to manipulate the device
and setup different extensions, parameters, etc. we don't event know how to serialize the pretty js-object to a
ugly xml-structure. But we want to work with the data easily and comfortable. So, when we want to change something in
the ONVIF device, in this example it is a `setMetadataConfiguration` method, we just need to follow two simple rules:
and set up different extensions, parameters, etc., especially when configuring vendor-specific extensions, we don't even
know how to serialize the pretty JS object into an ugly XML structure. But we want to work with the data easily and
comfortably. So, when we want to change something on the ONVIF device (in this example via the `setMetadataConfiguration`
method), we just need to follow two simple rules:
* change known fields as is: `elementItem[0].name = 'hello'`
* change anything we don't know about in the specification through the `__any__` field: `elementItem[0].__any__.Param2 = 'hi'`
And this structure will be easily picked up and converted into the appropriate SOAP XML (see the sketch below)!
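
A minimal sketch of those two rules in TypeScript, assuming an already connected `cam` instance (as in the tests of this PR) and a previously fetched metadata configuration. The property path to `elementItem` and the exact location of `setMetadataConfiguration` on the media service are assumptions for illustration, not verified API:

```typescript
// Sketch only: `cam` is an already connected device instance, `metadataConfiguration`
// was fetched beforehand; the path and method location below are assumptions.
async function tweakMetadataConfiguration(cam: any, metadataConfiguration: any): Promise<void> {
  const elementItem = metadataConfiguration.analyticsEngineConfiguration.analyticsModule[0].parameters.elementItem;

  // Rule 1: a field described by the generated interfaces is changed as-is.
  elementItem[0].name = 'hello';

  // Rule 2: anything the specification does not describe goes through the `__any__`
  // field and is handed to the SOAP builder untouched.
  elementItem[0].__any__.Param2 = 'hi';

  // Send the whole modified configuration back to the device.
  await cam.media.setMetadataConfiguration(metadataConfiguration);
}
```

Because the `__any__` subtree is passed to the XML builder as-is, vendor-specific parameters survive the round trip even though they have no corresponding interface.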
164 changes: 163 additions & 1 deletion __tests__/media.test.ts
@@ -476,6 +476,10 @@ describe('Configurations', () => {
],
},
},
subscriptionPolicy : {
__clean__ : true,
name : 'policy',
},
},
analytics : true,
multicast : {
@@ -484,7 +488,12 @@
TTL : 512,
autoStart : false,
},
sessionTimeout : 'PT120S',
sessionTimeout : 'PT120S',
analyticsEngineConfiguration : {
__clean__ : true,
analyticsModule : [],
extension : { __clean__ : true },
},
},
'AudioOutput' : {
name : 'AOName',
@@ -528,4 +537,157 @@
});
});
});

describe('Stream uri', () => {
it('should get stream uri with default options', async () => {
const result = await cam.media.getStreamUri();
expect(result.uri).toBeDefined();
expect(result.invalidAfterConnect).toBe(false);
expect(result.invalidAfterReboot).toBe(false);
expect(result.timeout).toBeDefined();
});

it('should get stream uri with other parameters', async () => {
const result = await cam.media.getStreamUri({
profileToken : 'ProfileToken_2',
streamSetup : {
stream : 'RTP-Unicast',
transport : {
protocol : 'HTTP',
},
},
});
expect(result.uri).toBeDefined();
});
});

describe('Snapshot', () => {
it('should get snapshot with default options', async () => {
const result = await cam.media.getSnapshotUri();
expect(result.uri).toBeDefined();
expect(result.invalidAfterConnect).toBe(false);
expect(result.invalidAfterReboot).toBe(false);
expect(result.timeout).toBeDefined();
});

it('should get snapshot with other parameters', async () => {
const result = await cam.media.getSnapshotUri({
profileToken : 'ProfileToken_2',
});
expect(result.uri).toBeDefined();
});
});

describe('Multicast', () => {
it('should start multicasting with different configurations and breaks on non-existent one', async () => {
await cam.media.startMulticastStreaming();
await cam.media.startMulticastStreaming({
profileToken : 'ProfileToken_2',
});
await expect(cam.media.startMulticastStreaming({
profileToken : 'Unknown',
})).rejects.toThrow('Profile Not Exist');
});

it('should stop multicasting with different configurations and breaks on non-existent one', async () => {
await cam.media.stopMulticastStreaming();
await cam.media.stopMulticastStreaming({
profileToken : 'ProfileToken_2',
});
await expect(cam.media.stopMulticastStreaming({
profileToken : 'Unknown',
})).rejects.toThrow('Profile Not Exist');
});
});

describe('Synchronization Points', () => {
it('should set synchronization points', async () => {
await cam.media.setSynchronizationPoint();
await cam.media.setSynchronizationPoint({
profileToken : 'ProfileToken_2',
});
await expect(cam.media.setSynchronizationPoint({
profileToken : 'Unknown',
})).rejects.toThrow('Profile Not Exist');
});
});

describe('Video source mode', () => {
let videoSourceMode;
it('should get video source modes with given video configuration token', async () => {
const result = await cam.media.getVideoSourceModes();
[videoSourceMode] = result;
expect(videoSourceMode.token).toBeDefined();
expect(videoSourceMode.maxResolution).toBeDefined();
expect(videoSourceMode.maxFramerate).toBeDefined();
expect(videoSourceMode.reboot).toBeDefined();
expect(videoSourceMode.encodings).toBeInstanceOf(Array);
});

it('when getting should throw an error when configuration token does not exist', async () => {
await expect(cam.media.getVideoSourceModes({
videoSourceToken : 'Unknown',
})).rejects.toThrow('The requested video source does not exist');
});

it('should set video source mode to video source', async () => {
const result = await cam.media.setVideoSourceMode({ videoSourceModeToken : videoSourceMode!.token });
expect(result.reboot).toBeDefined();
});
});

describe('OSD', () => {
it('should create a new text OSD', async () => {
const result = await cam.media.createOSD({
position : {
type : 'UpperLeft',
},
type : 'Text',
videoSourceConfigurationToken : 'VideoSourceConfigurationToken_1',
textString : {
plainText : 'zxc',
type : 'Plain',
fontColor : {
transparent : 7,
color : {
z : 9,
y : 8,
x : 8,
colorspace : 'http://www.onvif.org/ver10/colorspace/RGB',
},
},
},
});
expect(typeof result).toBe('string');
});

it('should create a new image OSD', async () => {
const result = await cam.media.createOSD({
position : {
type : 'Custom',
},
type : 'Image',
videoSourceConfigurationToken : 'VideoSourceConfigurationToken_1',
image : {
imgPath : 'http://www.onvif.org/ver10/media/wsdl',
},
});
expect(typeof result).toBe('string');
});

it('should set existing OSD', async () => {
const result = await cam.media.setOSD({
position : {
type : 'Custom',
},
token : 'OSDConfigurationToken_1',
type : 'Image',
// videoSourceConfigurationToken : 'VideoSourceConfigurationToken_1',
image : {
imgPath : 'http://www.onvif.org/ver10/media/wsdl',
},
});
expect(result).toBeDefined();
});
});
});
37 changes: 35 additions & 2 deletions __tests__/utils.test.ts
@@ -1,6 +1,15 @@
import xml2js, { parseStringPromise } from 'xml2js';
import { inspect } from 'node:util';
import { build, guid, linerase, parseSOAPString, struct, toOnvifXMLSchemaObject, xsany } from '../src/utils';
import {
build,
guid,
linerase,
parseSOAPString,
struct,
toOnvifXMLSchemaObject,
upFirstLetter,
xsany,
} from '../src/utils';
import { Config, LensDescription } from '../src/interfaces/onvif';

describe('Linerase function', () => {
@@ -180,7 +189,9 @@ describe('xs:any', () => {
'Parameters' : {
'ElementItem' : [{
'Name' : 'elementItem1',
'Param1' : 'param1',
'Param1' : {
Data : 42,
},
}, {
'Name' : 'elementItem2',
'Param2' : 'param2',
@@ -197,6 +208,28 @@
});
});

describe('upFirstLetter', () => {
it('should leave a plain string value unchanged', () => {
expect(upFirstLetter('hello')).toBe('hello');
});

it('should capitalize the first letter of an array', () => {
expect(upFirstLetter([{ hello : 'world' }, { hello : 'world' }])).toStrictEqual([{ Hello : 'world' }, { Hello : 'world' }]);
});

it('should capitalize the first letter of an object', () => {
expect(upFirstLetter({ hello : 'world' })).toStrictEqual({ Hello : 'world' });
});

it('should capitalize the first letter of a nested object', () => {
expect(upFirstLetter({ hello : { world : 'world' } })).toStrictEqual({ Hello : { World : 'world' } });
});

it('should capitalize the first letter of a nested array', () => {
expect(upFirstLetter({ hello : [{ world : 'world' }] })).toStrictEqual({ Hello : [{ World : 'world' }] });
});
});

export function clean(obj: any): any {
if (Array.isArray(obj)) {
return obj.map(clean);