SHREC 16 Track
3D Object Retrieval with Multimodal Views
In this track, we provide 200 query objects and 405 test objects. Each object is represented by 73 RGB images and 73 depth images.
Participants also need to download and fill in the Agreement and Disclaimer Form and send it back to us from their registration email. We will then email instructions for downloading the dataset and the related features (Zernike moments of the RGB images and HoG of the depth images).
For our track, the 605 objects belong to 61 categories, and the number of objects per category ranges from 1 to 20. The 100 real objects and all 100 3D-printed objects are each used as a query once.
In our track, each of these 200 objects is used as a query once, and each object is represented by 73 RGB images and 73 depth images. Participants who have more than one method, and therefore more than one result, should provide a separate folder for each result, with each folder named after the corresponding method.
For example, if the method is named CCFV, the author should create a text file named author-CCFV.txt, where author is replaced by the first author's surname and CCFV is the method's name. Since this track provides 200 query objects and 405 test objects, each file should contain 200 rows and 405 columns. Each row gives the retrieval result for one query: the first column is the query model, and the remaining columns are the retrieved results, separated by spaces.
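The submission format above can be sketched as follows. This is a minimal, hypothetical example of writing a result file in the required layout; the object IDs (Q000, T000, …) and the `rank_test_objects` function are placeholders, not part of the track's toolkit, and a real method such as CCFV would rank the test objects by similarity to the query.

```python
from pathlib import Path

# Placeholder object IDs; the real dataset defines its own object names.
QUERY_IDS = [f"Q{i:03d}" for i in range(200)]  # 200 query objects
TEST_IDS = [f"T{i:03d}" for i in range(405)]   # 405 test objects

def rank_test_objects(query_id, test_ids):
    """Placeholder ranking: a real method would sort the test objects by
    descending similarity to the query; here we just sort alphabetically."""
    return sorted(test_ids)

def write_result_file(path, query_ids, test_ids):
    """Write one row per query: the query ID first, then the ranked test
    objects, all separated by single spaces."""
    lines = []
    for q in query_ids:
        ranked = rank_test_objects(q, test_ids)
        lines.append(" ".join([q] + ranked))
    Path(path).write_text("\n".join(lines) + "\n")

write_result_file("author-CCFV.txt", QUERY_IDS, TEST_IDS)
```

The resulting file has 200 rows; each row begins with the query object followed by all 405 ranked test objects.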
You can download the dataset here. A sample retrieval-result file in the standard format is available here.
Last Update: 07/09/2016