
Aureus SDK User Guide

Contents:
1: Introduction
2: Video Streams
3: License
4: Main Configuration
  4.1: Minimum Detection Proportion
  4.2: Tokenize Results
  4.3: Render Width
  4.4: Number of Results Images
  4.5: Time to Acquire Results
  4.6: Use Pose Correction
  4.7: Yaw
  4.8: Pitch
  4.9: Use POST
    4.9.1: POSTed XML data
  4.10: POST URL
  4.11: Save Results Images
5: Stream Configuration
6: Aureus Objects
  6.1: CX_IMAGE
  6.2: CX_FACE
  6.3: CX_HEAD_LIST
  6.4: CX_HEAD
  6.5: CX_HEAD_DATA
  6.6: CX_HEAD_DATA_LIST
  6.7: CX_AUREUS_VIDEO_OUTPUT
  6.8: CX_AUREUS_IMAGE
  6.9: CX_ANNOTATION_SET
  6.10: CX_ANNOTATION
  6.11: CX_MESH
  6.12: CX_POLYGON
7: Using the Aureus SDK
8: The SDK Files
9: FAQ

1. Introduction

The Aureus SDK provides the functionality to process two different types of input: (1) video streams and (2) one or more unconstrained images.

For video stream input, Aureus detects and tracks human heads, locates various facial features and outputs pose corrected, tokenized images suitable for facial recognition algorithms in real time. Aureus performs pose correction using the patented Cyberextruder 2D to 3D algorithms. Aureus produces tokenized images in which the face and eyes are consistently positioned so as to be suitable for 2D facial recognition; a passport image is an example of a tokenized image.

For multiple unconstrained image input, Aureus performs the same pose correction whilst also providing access to the reconstructed 3D meshes of the human heads in the input images. Aureus also provides a 3D mesh that is a weighted consolidation of the reconstructions from the multiple input images.

2. Video Streams

Aureus allows three types of video stream to be processed:
(1) USB cameras
(2) IP cameras
(3) Media files

To allow USB camera input you must have a suitable device driver installed; similarly, to process media (video) files you must have a suitable video codec installed. Additionally, Aureus is currently a 64 bit SDK intended to run on a 64 bit Microsoft Windows 7 platform. It has been successfully tested on alternative Microsoft operating systems; however, you may require additional software installation depending on your choice of operating system. For example, running Aureus on Windows Server 2008 is fine as long as the Windows Desktop Experience is enabled.

Aureus provides different levels of integration with your software. You can treat the SDK as a video processing black box, in which you tell the SDK what input configuration you require and where you want the resulting tokenized images to be sent, or you can delve deep into the SDK and extract/display the video streams and resulting processed images yourself. For unconstrained image input Aureus can also be used in a simple black box approach, or you can obtain the intermediate Aureus image, annotation and 3D mesh objects to use in whatever way proves most useful to you.

Two source code examples of using Aureus as a simple black box video stream processor are provided with the SDK installation. These make very few calls to the SDK, namely CX_Initialize(), CX_StartStream() and the calls required to configure the desired input stream. The example code projects are called AureusSDK_TestConsole and AureusSDKlib_TestConsole and can be compiled with Microsoft Visual Studio 2008. The difference between the two projects is that the former loads the DLL explicitly using LoadLibrary() and GetProcAddress() and the latter links to the AureusSDK.lib import library.

Additionally there is a video stream processing source code example in which the SDK is used to display the images from the video streams and the tracked and processed people. This example provides an MFC based GUI which calls the various Aureus SDK functions to display the results of the real time video processing. The example project is called AureusSDK_TestGUI and can be compiled with Microsoft Visual Studio 2008.

There are also two Visual Studio 2008 example projects with source code that demonstrate how to use the SDK for multiple unconstrained image input. The project named AureusSDK_Image_TestConsole shows how to use the SDK from a Windows console to process multiple images simultaneously, whilst the project named AureusSDK_Image_TestGUI demonstrates how to display the SDK input/output objects and adjust the annotation set objects with real time feedback of the 3D mesh reconstruction and tokenized rendering.

3. License

The Aureus SDK must have a valid license key present during processing. To obtain a license key you will need to provide a machine identifier and either the required number of video streams that you want the SDK to process or whether you want to use the SDK to process unconstrained image input. A license can be provided for both video stream and image input; however, processing multiple input images will obviously consume CPU and hence could slow down video stream processing depending on the hardware configuration of the system on which the SDK is running. The machine identifier can be obtained directly from the SDK, or alternatively you can simply provide your PC's MAC address.

Once you have obtained your license key for (for example) 2 video streams you will be able to simultaneously process any two video streams until your license duration expires. Alternatively, if you obtain a license for image input you will be able to process images until the license duration elapses.

You will be provided with a license file called AureusSDK_License.txt which contains one or more license key strings. Please put this file in the installation folder. Aureus will read this file to obtain a valid license key. This facilitates the use of the SDK on many PCs without needing to set up a separate license key for each machine; simply send the MAC addresses (and desired number of streams) for all the PCs and you will receive a single license file which you place in the SDK installation folder of each of the PCs.

How many video streams a given PC can process depends on the speed and quantity of CPU cores in the machine. By CPU cores we mean processing cores: for example, a dual core processor has two cores, and a dual core machine with hyper-threading can be considered a quad core processor. The baseline recommendation for successfully processing video streams using the Aureus SDK is two 3.5GHz cores per video stream. The speed at which the stream processing can be performed is entirely dependent on the number of cores and their speed. The SDK will perform adequately with a lower hardware specification than two 3.5GHz cores; however, the system will slow down and skip frames, especially when many people are being tracked at once. Slowing down the frame rate can also have undesired effects on tracking consistency.

The speed at which Aureus processes multiple image input is also dependent on the number of CPU cores and their speed. For single image input the speed of Aureus is dependent on the CPU speed since the input image is processed on a single core. For multiple image input the speed of Aureus is also dependent on the number of cores. For example, processing 4 images on a 4 core system will take approximately the same time as processing 1 image; however, processing 8 images will take approximately twice as long.

4. Main Configuration

To process a video stream Aureus needs to be told what stream to process and how. This configuration information is stored locally in a plain text file called "AureusSDK.cfg". You can edit this file, but be careful: if you do not know what you are doing you might cause undefined SDK behaviour. A more robust way of configuring Aureus is to call the CX_GetXXX() and CX_SetXXX() functions (where XXX refers to the different configuration parameter names). When you adjust the configuration with a CX_SetXXX() function the information will be automatically written to the "AureusSDK.cfg" file and hence be persistent.

There is one SDK configuration parameter which is not persistent: this flag relates to the type of rendering that the SDK will perform. The SDK defaults to performing hardware accelerated rendering using OpenGL pbuffers for speed. The SDK can be configured to perform off screen rendering to a Device Independent Bitmap (DIB) by calling CX_SetRenderDIBFlag() with the parameter set to 1. Setting the parameter to zero tells the SDK to use off screen hardware accelerated pbuffers. The latter option is significantly faster; however, it is not always available depending on your graphics hardware. A call to CX_GetRenderDIBFlag() will reveal the type of rendering that the SDK will attempt to use. If you wish to set the render type by calling CX_SetRenderDIBFlag() please ensure that you call this function before calling CX_Initialize().

The main configuration contains the following information:

4.1 Minimum Detection Proportion
This is the proportion of the height of the images in the stream below which human heads will not be detected. This represents approximately 1.5 times the height of a human head. This value defaults to 0.2; in other words, detection of heads smaller than 20% of the image height will not be attempted. Setting this to a lower value will allow Aureus to detect smaller heads (further away), however this can significantly affect the speed of the system. This parameter affects both video stream processing and multiple image input.
Call CX_GetMinDetectionProportion() to get this parameter and CX_SetMinDetectionProportion() to set this parameter.

4.2 Tokenize Results
This is a Boolean flag (1 or 0). It defaults to 1, meaning that all output images will be tokenized. A tokenized image is like a portrait passport image in which the head size and eye locations are consistent. Setting this to zero will cause Aureus to output square images in which consistent positioning of the eyes is not attempted. Figure 1 demonstrates a tokenized image and the relative size and location of the head and eyes. This parameter affects both video stream processing and multiple image input.
Call CX_GetTokenizeFlag() to get this parameter and CX_SetTokenizeFlag() to set this parameter.
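As a rough illustration of the configuration pattern, the sketch below chooses the render type before initialization and then adjusts the two parameters described above. The parameter and return types are assumptions (the setters are assumed to return a success flag, in the style of the stream functions shown in section 5); consult AureusSDK.h for the actual signatures.

#include "AureusSDK.h"
#include <iostream>
using namespace std;

void ConfigureAureus()
{
  // Render type is not persistent and must be chosen before CX_Initialize():
  // 1 = off screen DIB rendering, 0 = hardware accelerated pbuffers.
  CX_SetRenderDIBFlag(0);

  if (!CX_Initialize())
  {
    cerr << CX_GetLastError() << "\n";
    return;
  }

  // Detect heads down to 10% of the image height (default is 0.2).
  if (!CX_SetMinDetectionProportion(0.1f))   // float parameter assumed
    cerr << CX_GetLastError() << "\n";

  // Keep tokenized (passport style) output switched on.
  if (!CX_SetTokenizeFlag(1))                // 1/0 flag assumed
    cerr << CX_GetLastError() << "\n";
}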

Figure 1: A tokenized passport style image demonstrating the locations of the eyes and head and the image aspect ratio.

4.3 Render Width
This value defaults to 240 pixels. It represents the width of the output images (whether tokenized or not). For example, if the output images are tokenized and this value is set to 240 then the image height will always be 320 pixels. This parameter affects both video stream processing and multiple image input.
Call CX_GetTokenizedWidth() to get this parameter and CX_SetTokenizedWidth() to set this parameter.

4.4 Number of Results Images
The Aureus SDK will locate and track people in the video streams in real time. It will locate their facial features and produce a pose corrected tokenized image. This can result in a large number of images per person; however, Aureus filters all the output images to select the best n images for facial recognition. This configuration parameter sets the number of images to collect. Its default value is 4. This parameter affects only video stream processing.
Call CX_GetMaxNumOfResultsImages() to get this parameter and CX_SetMaxNumOfResultsImages() to set this parameter.

4.5 Time to Acquire Results (seconds)
Since the Aureus SDK allows a limit to be set on the number of images to output per person, it also allows you to set a time limit for collection of those images. This value defaults to 3 seconds. After a person is located, tracked and processed for this duration the processed images will be available for facial recognition and no more images will be produced (for this person). The person will continue to be tracked until they leave the frame. If you set this value to a large amount (e.g. 60 seconds) then the output images will be provided when the person leaves the frame (assuming they do leave the frame in less than 60 seconds). If you set this value to a negative amount then people will be tracked and processed but the resulting images will never be posted. However, you may still obtain them via the frame call-back function described later. This parameter affects only video stream processing.
Call CX_GetMaxTimeToAcquireResults() to get this parameter and CX_SetMaxTimeToAcquireResults() to set this parameter.

4.6 Use Pose Correction
One of the strengths of Aureus is its ability to perform real time reconstruction of the tracked person's head in 3D. This 3D head mesh is then rendered to 2D with zero pose, essentially ensuring that the resulting image will be a front facing passport style image ready for facial recognition. This flag therefore defaults to 1. If you have no need for pose correction you can set this flag to zero (off), in which case you will still obtain tokenized images but there is no guarantee that they will be front facing. Aureus will, however, attempt to provide you with the best possible images from a facial recognition point of view. It is recommended that you leave this flag switched on. This parameter affects only video stream processing.
Call CX_GetPoseCorrectionFlag() to get this parameter and CX_SetPoseCorrectionFlag() to set this parameter.
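As a rough sketch (parameter types assumed, values purely illustrative), the acquisition behaviour described in 4.4 to 4.6 might be configured like this:

CX_SetMaxNumOfResultsImages(6);       // keep the best 6 images per person
CX_SetMaxTimeToAcquireResults(5.0f);  // collect them over 5 seconds
CX_SetPoseCorrectionFlag(1);          // leave pose correction switched on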

Figure 2: An example of a tokenized image (left) and a pose corrected tokenized image (right).

4.7 Yaw
If pose correction is used this parameter allows you to alter the point at which pose correction is applied. If the absolute yaw of the person's head (turning the head side to side) is less than this value then pose correction will not be performed. The default value is 7.5 degrees. This parameter works in tandem with the following Pitch parameter: if the absolute yaw is less than this parameter and the absolute pitch is less than the pitch parameter then pose correction will not be applied, otherwise it will be applied. This parameter affects only video stream processing.
Call CX_GetPoseCorrectionYaw() to get this parameter and CX_SetPoseCorrectionYaw() to set this parameter.

4.8 Pitch
If pose correction is used this parameter allows you to alter the point at which pose correction is applied. If the absolute pitch of the person's head (nodding the head) is less than this value then pose correction will not be performed. The default value is 10.0 degrees. This parameter affects only video stream processing.
Call CX_GetPoseCorrectionPitch() to get this parameter and CX_SetPoseCorrectionPitch() to set this parameter.
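The combination of the two thresholds described in 4.7 and 4.8 can be summarized by the following sketch (illustrative logic only, not an SDK function):

#include <cmath>

// Pose correction is skipped only when the head is close to frontal in BOTH axes.
bool SkipPoseCorrection(double yaw, double pitch,
                        double yaw_threshold,    /* default 7.5 degrees  */
                        double pitch_threshold)  /* default 10.0 degrees */
{
  return std::fabs(yaw) < yaw_threshold && std::fabs(pitch) < pitch_threshold;
}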

4.9 Use POST
Since the Aureus SDK provides tokenized, pose corrected, passport style images as output, you may obtain these images by setting a call-back function (described later) or you may elect for the SDK to send those images as an XML file via the SOAP protocol to a specific HTTP URL address. If this parameter is switched on the SDK will encode the output images (base 64) and create a SOAP XML document containing the images. It will save this document in a sub folder (called POST) in the installation folder. It will then use the CURL executable to post the XML data to your choice of HTTP URL. The SDK installation will create the POST folder and place an example XML file in the folder. This parameter affects only video stream processing.
Call CX_GetUsePOSTFlag() to get this parameter and CX_SetUsePOSTFlag() to set this parameter.

4.9.1 POSTed XML data
The XML files (the actual extension is "tmp") contain the encoded images and the following information tags:
CID : camera ID, e.g. USB 0 for a USB camera at pin zero
TS : date/time stamp, e.g. 30:4:2012:14:59:37 for the 30th April 2012 at 14:59 and 37 seconds
TID : tracking ID, a number denoting the "nth" person to be tracked in this stream
NIM : number of image pairs to follow
IMAGEPAIR : contains either one or two images. If the image has been pose corrected there will be two images, the pose corrected and the original
IMAGECOUNT : tag denoting how many images are in this "pair"
IMAGE : contains the image data as follows...
PC : flag denoting whether this image has been pose corrected (1 or 0)
IW : the width of the image in pixels
IH : the height of the image in pixels
REX : the x coordinate of the right eye (origin top left)
REY : the y coordinate of the right eye (origin top left)
LEX : the x coordinate of the left eye (origin top left)
LEY : the y coordinate of the left eye (origin top left)
IEPD : the inter-eye pixel distance
YAW : the detected yaw of the person's head (degrees)
PITCH : the detected pitch of the person's head (degrees)
FIT : an Aureus confidence fit measure
DATA : the image data, a JPEG compressed, base 64 encoded image
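To make the tag layout concrete, a hypothetical, hand-written and heavily truncated illustration of how these tags might nest is shown below. The actual element nesting, SOAP envelope and value formats are defined by the example file placed in the POST folder, so treat this purely as an illustration with made-up values:

<CID>USB 0</CID>
<TS>30:4:2012:14:59:37</TS>
<TID>3</TID>
<NIM>1</NIM>
<IMAGEPAIR>
  <IMAGECOUNT>2</IMAGECOUNT>
  <IMAGE>
    <PC>1</PC> <IW>240</IW> <IH>320</IH>
    <REX>96.0</REX> <REY>128.0</REY> <LEX>144.0</LEX> <LEY>128.0</LEY>
    <IEPD>48.0</IEPD> <YAW>12.3</YAW> <PITCH>-4.1</PITCH> <FIT>0.35</FIT>
    <DATA>...base 64 encoded JPEG...</DATA>
  </IMAGE>
  <IMAGE>
    <!-- the original (non pose corrected) image of the pair -->
  </IMAGE>
</IMAGEPAIR>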

4.10 POST URL
This parameter defines the HTTP URL that the XML files containing the processed images will be sent to (assuming the Use POST flag is switched on). If you set a valid URL to a system that is listening for SOAP messages, the information described above will arrive at the target system. This parameter affects only video stream processing.
Call CX_GetPOSTurl() to get this parameter and CX_SetPOSTurl() to set this parameter.

4.11 Save Results Images
This parameter tells the SDK whether the user wants to save the resulting processed images into the POST sub folder of the installation folder.
Call CX_GetSaveResultsImagesFlag() and CX_SetSaveResultsImagesFlag() to get and set the flag respectively.
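A minimal sketch for enabling POSTing of results (parameter and string types assumed; the URL below is a placeholder, not a real endpoint):

CX_SetUsePOSTFlag(1);                                  // switch POSTing on
CX_SetPOSTurl("http://192.168.0.10/aureus/receiver");  // placeholder URL for your SOAP listener
CX_SetSaveResultsImagesFlag(1);                        // also keep copies in the POST sub folder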

5. Stream Configuration
In addition to the main configuration file there are separate configuration files for each video stream, called "VideoStream1.cfg", "VideoStream2.cfg" etc. Each configuration file stores the information about its video stream. These files provide persistent information to inform the SDK what input to use for each stream. Quite simply, the files say what type the stream is (USB, IP or media file) and give the stream connection information. Once again it is best not to edit these files unless you have a clear understanding of what you are doing. Use the CX_GetXXX() and CX_SetXXX() functions instead.
Call CX_GetStreamType() and CX_SetStreamType() to view/edit the stream type.
Call CX_GetStreamInfo() and CX_SetStreamInfo() to view/edit the stream information.

For example, if you wish to process the stream coming from a USB camera you would load and initialize the AureusSDK DLL and then call:

cx_uint stream_index = 0;  // the first video stream
cx_uint stream_type  = 0;  // 0 = USB, 1 = IP, 2 = Media file
CX_SetStreamType(stream_index, stream_type);
CX_SetStreamInfo(stream_index, "0");  // set to USB pin number zero
CX_StartStream(stream_index);         // start the stream

Or if you wished to process a media file:

cx_uint stream_index = 0;  // the first video stream
cx_uint stream_type  = 2;  // 0 = USB, 1 = IP, 2 = Media file
CX_SetStreamType(stream_index, stream_type);
CX_SetStreamInfo(stream_index, "C:\\Videos\\Example.wmv");  // tell Aureus where the file is
CX_StartStream(stream_index);                               // start the stream

More robust code would check for errors, thus:

cx_uint stream_index = 0;  // the first video stream
cx_uint stream_type  = 2;  // 0 = USB, 1 = IP, 2 = Media file
if (!CX_SetStreamType(stream_index, stream_type))
{
  cerr << CX_GetLastError() << "\n";
}
if (!CX_SetStreamInfo(stream_index, "C:\\Videos\\Example.wmv"))  // tell Aureus where the video file is
{
  cerr << CX_GetLastError() << "\n";
}
if (!CX_StartStream(stream_index))  // start the stream
{
  cerr << CX_GetLastError() << "\n";
}

6. Aureus Objects
The SDK is designed to be called from the lowest common denominator, the C language; more specifically, any language that is capable of calling a standard C function can use the SDK. However, this does not mean that Aureus itself uses C. The SDK provides definitions of the objects that Aureus uses. All of these objects are defined as simple void pointers. For example, a CX_IMAGE is defined as a void*. Underlying a CX_IMAGE is an actual Aureus image object residing within, and managed by, the SDK. To obtain information and data from a CX_IMAGE you simply call the associated SDK image function CX_GetImageData(). This provides you with the image size and a pointer to the pixel bytes. You may then display or use that data in any way you see fit. However, please note that all Aureus objects are managed by the SDK, so you must not free the CX_IMAGE; Aureus will do that for you.

For the different Aureus object types (all defined as void pointers) the SDK will perform run-time type checking. So, for example, if you provide a CX_HEAD to CX_GetImageData() the SDK function will return an error code. Upon receiving the error code, if you call CX_GetLastError() you will be provided with an error message thus:
"CX_GetImageData Error! p_image incompatable type"
If you pass a pointer to anything (at all) other than an Aureus object you will obtain an error message from CX_GetLastError() like the following:
"CX_GetImageData Error! p_image was not an Aureus object"
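The usual calling pattern is therefore to check each SDK call and report CX_GetLastError() on failure. The sketch below assumes an out-parameter form of CX_GetImageData() that returns a status and fills in the pixel pointer, width and height; the real prototype is in AureusSDK.h, so treat the parameter list here as an assumption.

#include "AureusSDK.h"
#include <iostream>
using namespace std;

// Hypothetical wrapper illustrating run-time type checking; the signature of
// CX_GetImageData() used here is assumed, not taken from the SDK header.
bool DumpImageSize(CX_IMAGE image)
{
  cx_uint width = 0, height = 0;
  unsigned char* pixels = 0;
  if (!CX_GetImageData(image, &pixels, &width, &height))
  {
    // If "image" was really a CX_HEAD (or not an Aureus object at all)
    // the SDK reports it here rather than crashing.
    cerr << CX_GetLastError() << "\n";
    return false;
  }
  cout << "Image is " << width << " x " << height << " RGBA\n";
  return true;
}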

The purpose of the Aureus objects is to store and process image and 3D data. The following objects are defined:

6.1 CX_IMAGE
This object stores an RGBA image with the origin at the bottom left. The image always has 4 bytes per pixel arranged in row order: the first (bottom) row of RGBA pixel values, followed by the second-from-bottom row, and so on up to the top (last) row. The sketch following section 6.3 below shows how to address an individual pixel in this layout.
To obtain the image data call CX_GetImageData().
To obtain the image dimensions call CX_GetImageSize().
To save the image call CX_SaveImage().

6.2 CX_FACE
This object stores the location of a tracked face. It provides two 2D floating point vectors denoting the bottom left and top right corners of the face box, and an additional 3D floating point vector denoting the RGB colour of the face. Again, all coordinates relate to a bottom left origin.
To obtain the face box position and colour call CX_GetFaceData().

6.3 CX_HEAD_LIST
For each video stream the Aureus SDK will pass a CX_HEAD_LIST into a frame call-back function which you can set. The CX_HEAD_LIST contains zero or more CX_HEAD objects corresponding to the number of people currently being tracked in the video stream. Each CX_HEAD will contain a current tracking image and face box and a list of images that Aureus has labelled as the best "n" images for facial recognition from those that have been processed. You can find out the total number of "best" head images using CX_GetTotalNumHeadImages(), or alternatively you can step through each CX_HEAD and obtain the number of images for that head using CX_GetNumHeadImages().
To obtain the number of items in the list call CX_GetNumOfHeads().
To obtain access to each CX_HEAD in the list call CX_GetHead().
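Because the rows run bottom-up, code that thinks in the usual top-left coordinate system needs to flip the row index. A small sketch, assuming you have already obtained the width, height and pixel pointer from CX_GetImageData() (whose exact signature is not shown here):

#include "AureusSDK.h"  // assumed to define cx_uint

// Returns a pointer to the RGBA bytes of pixel (x, y_from_top), where
// y_from_top is measured from the TOP of the image, given a CX_IMAGE pixel
// buffer whose rows are stored bottom-up with 4 bytes per pixel.
inline unsigned char* PixelAt(unsigned char* pixels,
                              cx_uint width, cx_uint height,
                              cx_uint x, cx_uint y_from_top)
{
  cx_uint row_from_bottom = (height - 1) - y_from_top;
  return pixels + (row_from_bottom * width + x) * 4;  // [0]=R [1]=G [2]=B [3]=A
}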

6.4 CX_HEAD
This object contains the current tracked CX_FACE, a current CX_IMAGE and a list of CX_IMAGEs representing the best "n" images as measured by the Aureus confidence measure. The value of "n" is the maximum number of result images (set by CX_SetMaxNumOfResultsImages()).
To extract the current face call CX_GetCurrentFaceBox().
To extract the current image call CX_GetCurrentHeadImage().
To extract a colour associated with the CX_HEAD call CX_GetHeadColor().
To obtain the number of best images call CX_GetNumHeadImages().
To obtain one of the best images call CX_GetHeadImage().
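A frame call-back might walk the head list roughly as follows. The function names come from sections 6.3 and 6.4, but the argument lists and returned handles are assumptions made for illustration; AureusSDK.h defines the real prototypes.

#include "AureusSDK.h"

// Sketch only: iterate the currently tracked heads and their best images.
void InspectHeadList(CX_HEAD_LIST heads)
{
  cx_uint num_heads = CX_GetNumOfHeads(heads);
  for (cx_uint h = 0; h < num_heads; ++h)
  {
    CX_HEAD head = CX_GetHead(heads, h);
    cx_uint num_best = CX_GetNumHeadImages(head);  // best "n" images so far
    for (cx_uint i = 0; i < num_best; ++i)
    {
      CX_IMAGE img = CX_GetHeadImage(head, i);
      (void)img;  // display or copy the image; do not free it, the SDK owns it
    }
  }
}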

6.5 CX_HEAD_DATA
This object contains a CX_IMAGE, the location of the eyes within the image, the original head pose of the person in the image and an Aureus confidence fit measure.
To obtain the CX_IMAGE from the CX_HEAD_DATA object call CX_GetHeadDataImage().
To obtain the Aureus confidence fit measure call CX_GetHeadDataFit(); the lower the value, the higher the confidence and the better the fit.
To obtain the locations of the eyes in the image call CX_GetHeadDataRightEye() and CX_GetHeadDataLeftEye().
To obtain the pose (head rotation) call CX_GetHeadDataPose().

6.6 CX_HEAD_DATA_LIST
This object contains a list of CX_HEAD_DATA objects as well as a stream source ID string, a date/time stamp string and a tracking ID number.
To obtain the number of items in this list call CX_GetHeadDataListSize().
To obtain the CX_HEAD_DATA objects call CX_GetHeadData().
To obtain a colour associated with the list call CX_GetHeadDataListRGB().
To obtain the stream source ID string call CX_GetHeadDataListCameraID().
To obtain the date/time stamp string call CX_GetHeadDataListDateTime().
To obtain the tracking ID number call CX_GetHeadDataListTrackingID().

6.7 CX_AUREUS_VIDEO_OUTPUT
The Aureus SDK provides access to all the processed images from all the video streams using this object type. It is a list of CX_HEAD_DATA_LIST objects, one for each person being tracked in any stream. A CX_HEAD_DATA_LIST is added to the CX_AUREUS_VIDEO_OUTPUT list when the tracked person leaves the frame, is lost, or the maximum time to acquire results duration has expired. If you have configured the SDK to POST the results via the SOAP protocol, it is at this point that the data contained in the CX_HEAD_DATA_LIST will be posted.
A call to CX_GetTotalNumOutputImages() will provide the total number of images in the CX_AUREUS_VIDEO_OUTPUT object. The CX_AUREUS_VIDEO_OUTPUT object will be passed into the output call-back function, which you set using CX_SetOuputCallBack(). Additionally, the number of items in the list will be passed into the call-back function.
To obtain a CX_HEAD_DATA_LIST object call CX_GetHeadDataList().
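An output call-back might therefore look something like the sketch below. The call-back prototype and all accessor argument lists are assumptions for illustration only; the real prototypes are defined in the SDK header and demonstrated in the example projects.

#include "AureusSDK.h"

// Hypothetical output call-back: walk each finished person's data list.
void OnAureusOutput(CX_AUREUS_VIDEO_OUTPUT output, cx_uint num_lists)
{
  for (cx_uint i = 0; i < num_lists; ++i)
  {
    CX_HEAD_DATA_LIST list = CX_GetHeadDataList(output, i);
    cx_uint n = CX_GetHeadDataListSize(list);
    for (cx_uint j = 0; j < n; ++j)
    {
      CX_HEAD_DATA data  = CX_GetHeadData(list, j);
      CX_IMAGE     image = CX_GetHeadDataImage(data);
      (void)image;  // hand the image to your facial recognition engine;
                    // the SDK owns and frees all of these objects
    }
  }
}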

6.8 CX_AUREUS_IMAGE
The Aureus SDK represents input images with this object. It contains the input image itself (a CX_IMAGE), the fitted annotation set, the reconstructed 3D mesh and a rendered, front facing, pose corrected image obtained by rendering the 3D mesh.
To add an image from a file to Aureus so that it can be processed call CX_AddImage().
To add an image from memory to Aureus so that it can be processed call CX_AddImageFromData().
To replace an image already added to Aureus from a file call CX_ReplaceImage().
To replace an image already added to Aureus from memory call CX_ReplaceImageFromData().
To delete an image from the set of images already added to Aureus call CX_DeleteImage() or CX_DeleteImageFromIndex().
To remove all the images currently added to Aureus call CX_ClearImages().
To get an image from the set of images already added to Aureus call CX_GetImage().
To get an annotation set from the set of images already added to Aureus call CX_GetAnnotationSet().
If you already have the eye coordinates you can tell Aureus where they are by calling CX_SetEyes().

6.9 CX_ANNOTATION_SET
The Aureus SDK determines the location and outline of a human face and represents the face as a set of annotations. An annotation is a list of normalized 2D points with some associated landmark points, an open/closed flag and a descriptive string label. For example, an annotation might describe the outline of the person's nose with a list of 2D points delineating the nose outline and a list of landmark indices into those points denoting anatomically important points such as the inner/outer nostril locations. A CX_ANNOTATION_SET contains the whole set of annotations that Aureus uses to describe a human face.
To save an annotation set to a text file call CX_SaveAnnotationSet().
To load an annotation set into a CX_AUREUS_IMAGE object call CX_LoadAnnotationSet().
To obtain the number of annotations in the set call CX_GetAnnotationSetSize().
To get a single annotation from the set call CX_GetAnnotation().
To enable adjustment of an annotation set there are three functions:
CX_AnnotationSetMove() : Moves all annotations in the set, a single annotation, or a single point or landmark point.
CX_AnnotationSetScale() : Scales all the annotations in the set or a single annotation.
CX_AnnotationSetRotate() : Rotates all the annotations in the set or a single annotation.

6.10 CX_ANNOTATION
This Aureus object contains a list of 2D points, a flag that determines whether the annotation is closed or open, a descriptive string label and a list of indices into the 2D points that determine the landmark points.
To get the number of 2D points in the annotation call CX_GetAnnotationSize().
To get a particular 2D point call CX_GetAnnotationPoint().
To determine whether the annotation is open call CX_GetAnnotationOpen().
To obtain the string label call CX_GetAnnotationLabel().
To get the number of landmark points call CX_GetAnnotationNumOfLandmarks().
To get a particular landmark index call CX_GetAnnotationLandmark().

6.11 CX_MESH
After the Aureus SDK has successfully processed an input image it will produce a reconstructed 3D mesh of the person in the image. The 3D mesh is represented by the Aureus object CX_MESH.
Call CX_GetMesh() to obtain the mesh.
Call CX_SaveMesh() to save the mesh in OBJ format.
Call CX_GetMeshTexMap() to obtain the mesh texture map as a CX_IMAGE object.
Call CX_GetMeshInfo() to obtain the number of vertices, normals, texture coordinates and polygons in the mesh.
Call CX_GetMeshExtents() to obtain the minimum and maximum extents of the 3D vertices in the mesh.
Call CX_GetMeshVertexInfo() to obtain a particular 3D vertex, 3D normal or 2D texture coordinate.
Call CX_GetMeshPolygon() to obtain a particular mesh polygon as a CX_POLYGON object.

6.12 CX_POLYGON
This Aureus object represents a 3D textured polygon. It is essentially a list of polygon vertices, where a polygon vertex consists of a 3D vertex index, a normal index and a texture coordinate index.
Call CX_GetPolygonSize() to obtain the number of vertices in the polygon.
Call CX_GetPolygonVertex() to obtain the indices to the mesh 3D vertices, normals and texture coordinates.
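As an illustration of how these objects fit together, the sketch below fetches the mesh for a processed image, queries its size and writes it out as an OBJ file. The argument lists are assumed for illustration (in particular, which object CX_GetMesh() takes and how CX_GetMeshInfo() returns its counts); see the SDK header and the image example projects for the real prototypes.

#include "AureusSDK.h"
#include <iostream>
using namespace std;

// Sketch only: every signature below is an assumption for illustration.
void ExportMesh(CX_AUREUS_IMAGE aureus_image)
{
  CX_MESH mesh = CX_GetMesh(aureus_image);
  if (!mesh) { cerr << CX_GetLastError() << "\n"; return; }

  cx_uint n_vertices = 0, n_normals = 0, n_texcoords = 0, n_polygons = 0;
  CX_GetMeshInfo(mesh, &n_vertices, &n_normals, &n_texcoords, &n_polygons);
  cout << n_vertices << " vertices, " << n_polygons << " polygons\n";

  if (!CX_SaveMesh(mesh, "reconstructed_head.obj"))  // OBJ output
    cerr << CX_GetLastError() << "\n";
}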

7. Using the Aureus SDK
The simplest way to use the SDK for video stream processing is to load the DLL, initialize it and then start all the streams. The example below explicitly loads the DLL:
#include "AureusSDK.h"

HMODULE aureus = LoadLibrary("AureusSDK.dll");
if (!aureus)
{
  cerr << "Failed to load library!\n";
  return;
}

PCX_GetLastError    CX_GetLastError    = (PCX_GetLastError)GetProcAddress(aureus, "CX_GetLastError");
PCX_GetLicenseInfo  CX_GetLicenseInfo  = (PCX_GetLicenseInfo)GetProcAddress(aureus, "CX_GetLicenseInfo");
PCX_Initialize      CX_Initialize      = (PCX_Initialize)GetProcAddress(aureus, "CX_Initialize");
PCX_StartAllStreams CX_StartAllStreams = (PCX_StartAllStreams)GetProcAddress(aureus, "CX_StartAllStreams");

if (!CX_Initialize())
{
  cerr << CX_GetLastError() << "\n";
  return;
}

// display the license information
cout << CX_GetLicenseInfo() << "\n";

if (!CX_StartAllStreams())
{
  cerr << CX_GetLastError() << "\n";
  return;
}

// loop until stop

FreeLibrary(aureus);

Or, alternatively, set the configuration and start the streams individually. The example below includes AureusSDKlib.h and links to the AureusSDK.lib import library.
#include "AureusSDKlib.h"

if (!CX_Initialize())
{
  cerr << CX_GetLastError() << "\n";
  return;
}

cx_int n_streams = CX_GetNumOfStreams();
if (!n_streams)
{
  cerr << CX_GetLastError() << "\n";
  return;
}

if (n_streams >= 1)
{
  if (!CX_SetStreamType(0, 0))   // set to USB camera
  {
    cerr << CX_GetLastError() << "\n";
    return;
  }
  if (!CX_SetStreamInfo(0, "0")) // set pin number to zero
  {
    cerr << CX_GetLastError() << "\n";
    return;
  }
  if (!CX_StartStream(0))        // start the stream
  {
    cerr << CX_GetLastError() << "\n";
    return;
  }
}

However, much more control can be obtained if you provide a frame and/or an output call-back function. The SDK installation comes with example code showing how to provide the call-back functions and how to access and use the data that the SDK provides.

The simplest way to use the Aureus SDK for unconstrained image input is to initialize the SDK, load the images you wish to process, call CX_ProcessImages() and then get the output using CX_GetConsolidatedRenderedImage(). The test project AureusSDK_Image_TestConsole provides an example of this, whilst the AureusSDK_Image_TestGUI project provides a far more in-depth example of using the SDK with image input.
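A rough end-to-end sketch of that image input workflow is shown below. The argument lists of CX_AddImage(), CX_GetConsolidatedRenderedImage() and CX_SaveImage() are assumptions, and the file paths are placeholders; AureusSDK_Image_TestConsole shows the real calls.

#include "AureusSDKlib.h"
#include <iostream>
using namespace std;

int main()
{
  if (!CX_Initialize()) { cerr << CX_GetLastError() << "\n"; return 1; }

  // Add several unconstrained images of the same person (placeholder paths).
  CX_AddImage("C:\\Images\\person_left.jpg");
  CX_AddImage("C:\\Images\\person_front.jpg");
  CX_AddImage("C:\\Images\\person_right.jpg");

  // Fit, reconstruct in 3D and pose correct all of the added images.
  if (!CX_ProcessImages()) { cerr << CX_GetLastError() << "\n"; return 1; }

  // Fetch the consolidated, front facing render and save it.
  CX_IMAGE result = CX_GetConsolidatedRenderedImage();
  if (!result || !CX_SaveImage(result, "C:\\Images\\consolidated_token.jpg"))
    cerr << CX_GetLastError() << "\n";

  CX_ClearImages();  // the image objects themselves are owned and freed by the SDK
  return 0;
}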

8. The SDK Files
After installing the Aureus SDK there will be the following files/folders in the installation folder:

AureusSDK : A folder containing the SDK header file.

AureusSDK_TestConsole : Example VC++ project for running Aureus as a command line executable with video stream input.
AureusSDKlib_TestConsole : Example console project linking to the Aureus import library with video stream input.
AureusSDK_TestGUI : Example VC++ project for running Aureus in a GUI with video stream input.

AureusSDK_Image_TestConsole : Example VC++ project running Aureus as a command line executable with image input.
AureusSDK_Image_TestGUI : Example VC++ project running Aureus in a GUI with image input.

AureusSDK_Test.sln : The VC++ solution file to load all test projects.
AureusSDK_UserGuide.pdf : This document.
AureusSDKreference.chm : A linked reference help file providing details of all the Aureus SDK functions.

Depending on your operating system you may need to install some Microsoft re-distributable files and/or codecs. The installer will provide the following re-distributable files:
vc2008_feature_pack_redist_x64.exe
vc2008_redist_x64_sp1.exe
windows.7.codec.pack.v4.0.3.setup.exe

The sub-folder x64/Release is the main installation folder; you should place your license file in this folder if you want to use the AureusSDK_Test.sln solution directly without alteration. The solution will build the test project executables and place them in this folder. The folder contains the following files/folders:

AureusSDK_Data : A folder containing the data Aureus requires.
Example Videos : A folder containing one or more example video files.
ExampleMultiImages : A folder containing example sets of images.
POST : A folder in which the resulting XML files will be deposited.

There will additionally be the following files in the installation folder:
AureusSDK.dll
AureusSDK.lib
CE_VideoStream64.dll
cefacedetect.dll
openvc_ffmpeg_64.dll
libcurl.dll
curl.exe
msvcr100.dll
tbb.dll
AureusSDK_TestConsole.exe
AureusSDKlib_TestConsole.exe
AureusSDK_TestGUI.exe
AureusSDK_Image_TestConsole.exe
AureusSDK_Image_TestGUI.exe

To distribute your own application using the Aureus SDK you will need to have the following files and folders installed:
AureusSDK.dll
CE_VideoStream64.dll
cefacedetect.dll
openvc_ffmpeg_64.dll
libcurl.dll
curl.exe
msvcr100.dll
tbb.dll
AureusSDK_Data
POST
Additionally, you will need to place the AureusSDK_License.txt file in your installation when you receive it from Cyberextruder.

9. FAQ

I get a "No license key" error message when I launch any of the test applications.
This means that you do not have an Aureus SDK license in your installation. Please read the section titled License in this document and send the required information to Cyberextruder.

How do I get the MAC address for my PC?
Start a command window by typing cmd.exe into the search field of the Start menu. Type ipconfig -all at the prompt and press enter, then scroll down until you see "Ethernet adaptor Local Area Connection". The item marked "Physical Address" is your Ethernet MAC address; it consists of 6 two-character codes separated by hyphens. Alternatively, if you do not have an Ethernet adaptor you can use the address given under "Wireless LAN adaptor"; the SDK will accept licenses generated from either of these values.

What if I don't have an Ethernet or Wireless MAC address?
You can still obtain your machine ID directly from the SDK. A call to CX_GetMachineID() will provide you with a hardware string that can be used to create a valid license key.

The GUI test application won't run.
You need to select the correct type of input (USB, IP or Media file) and provide the source information by clicking the Edit button. For USB cameras provide a pin number (usually zero). For IP cameras provide the URL, e.g. http://131.111.125.248/axis-cgi/mjpg/video.cgi?fps=30. For media files click the Edit button and browse to the ExampleVideos folder, then select a video.

Aureus runs, but very slowly.
It is recommended that you run the Aureus SDK with two 3.5GHz CPU cores per video stream. Video processing will slow down if you have fewer or slower CPU cores. It is also recommended that you have a good graphics card with 1GB of memory.

I get a Windows "side by side configuration" error.
The SDK is a 64 bit Windows DLL; you will need to install the VC redistributables that come with the SDK. Also, this is highly dependent on your operating system: if you are installing the SDK to a bare bones Windows Server OS then you must enable the Desktop Experience in your operating system.

The Test GUI will run a USB camera but not a video file.
Processing a video file is dependent on your machine having a suitable video codec installed for the video format you wish to process. The SDK comes with a selection of 64 bit video codecs. Please install them and try again. If you still have an error you may need to search on-line for the correct codec.

The Test GUI can play a media file but not a USB camera.
This is mostly caused by not having a valid device driver for your USB camera or not enabling Desktop Experience on a Windows Server OS. Please contact your camera supplier for a valid 64 bit device driver.

The test applications crash.
This is usually caused by installing to a Windows Server OS without Desktop Experience enabled.
