I. Introduction
The phase-shift approach [1], [2] encodes projector pixel coordinates in the phase of projected sinusoidal fringe patterns. When such a pattern is projected onto the scene of interest, it codifies the scene with the phase of the incident sinusoidal signal. A camera captures the scene with the projected patterns, and recovering the phase at every captured point identifies the source projector pixel, so stereo correspondence between camera and projector pixels can be established. However, since the sinusoidal patterns are periodic, the phase value repeats after every $2\pi$ interval (one period); binary-coded patterns [3] are therefore projected in addition, assigning a unique period number to each cycle of the sinusoidal fringe and thereby allowing recovery of the original phase at each point in the captured scene [4]. This eliminates the ambiguity due to the periodicity of the sinusoidal fringes.
This combined approach offers the high accuracy of the phase-shift technique together with the high robustness to noise of binary-coded-pattern techniques [3].
This development work was initiated because, to our knowledge, there is no open-source implementation of this technique that can serve as a starting point for accurate 3D measurements. The authors of [5] and [6] have developed such systems, but [5] provides no facility for system calibration, while [6] lacks documentation and targets only the Windows platform.
Furthermore, the performance of metrology equipment is conventionally measured in terms of its measurement accuracy and precision (or repeatability). Recently there have been several attempts to evaluate the performance of another popular 3D sensor, the Microsoft Kinect [7]-[10]. The same methods could be applied to evaluate the performance of our developed system, but they assess measurement accuracy either against 3D data obtained from a laser scanner [7], which itself requires accurate calibration, or conform to the VDI/VDE 2634 standard, which requires precisely fabricated spheres and hexagonal structures and hence sophisticated fabrication [9]. To avoid such overheads, this paper reports a simple and straightforward method for assessing measurement accuracy. We also report the precision of the developed system.
Section II describes the estimation of stereo correspondence using the coded phase-shift technique, followed by system calibration and finally triangulation. Section III describes the approach used to evaluate measurement accuracy and precision. Section IV concludes the paper.
II. Development
A structured-light 3D scanner based on optical triangulation (shown in Figure 1) needs to determine the correspondence between camera and projector pixels. This goal is achieved by projecting a known pattern onto the object of interest. The pattern (e.g. the light stripes in Figure 1) assigns a unique code to each point on the object's surface. The camera captures a view of the scene and recovers this code at each pixel. This process indirectly relates a projector column (stripe number in Figure 1), a row, or a unique row-column combination to a camera pixel.

Once the correspondence is known, the equations of the optical rays emanating from any corresponding pair of camera and projector pixels need to be known in real-world units, for which system calibration is performed.

Given the stereo correspondence and the system calibration information, optical triangulation can be performed to compute 3D coordinates for each real-world point seen by both camera and projector.
II-A. System setup
In this work we used a Logitech QuickCam Sphere AF webcam at 1600×1200 resolution and a Sharp PG-F200X projector at 1024×768 resolution. Figure 2 shows our system setup.
II-B. System calibration
Camera and projector calibration is the process of estimating their intrinsic geometry (i.e. focal length $(f_x, f_y)$ and principal point $(c_x, c_y)$) and extrinsic geometry (i.e. rotation $R$ and translation $T$) [11]-[13].
In this work we used the OpenCV library's camera calibration and pose estimation algorithms for intrinsic and extrinsic calibration respectively. For projector intrinsic calibration we used VPCLib [14], since we experimentally observed unacceptably low repeatability of the OpenCV calibration algorithm when applied to the projector.
II-B.1 Camera calibration
A checkerboard of known dimensions is shown at different distances and orientations with respect to the camera. Figure 10 plots the views used for camera calibration. This process yields sample 2D image points and the corresponding 3D points, from which the calibration parameters are estimated using the OpenCV method. Table I shows the estimated calibration parameters for the camera.
II-B.2 Projector calibration
A projector can be modeled as an inverse camera [15]. Projector calibration is performed with the camera as a feedback device. First, the camera-to-screen (or any planar board) homography, a projective mapping between two planes, is computed. The projector then projects a checkerboard pattern; the camera captures the projected pattern and detects its inner corners. Using the camera-to-screen homography, world coordinates for the detected checkerboard corners are computed. This yields the set of 2D-3D correspondences required for projector calibration, after which a procedure identical to camera calibration estimates the projector calibration parameters. Figure 11 plots the views used for projector calibration. Table I shows the estimated calibration parameters for the projector.
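The camera-to-screen homography step can be illustrated as below. The homography matrix and detected corner pixels are assumed example values; in practice H would be estimated (e.g. with cv2.findHomography) from the physical board's corners.

```python
import numpy as np

def pixels_to_board(H, pts):
    """Map Nx2 camera-pixel points through homography H to board-plane
    coordinates (homogeneous transform followed by perspective divide)."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = (H @ pts_h.T).T
    return mapped[:, :2] / mapped[:, 2:3]

# Assumed example homography and camera-image detections of the
# projected checkerboard's inner corners:
H = np.array([[0.05, 0.001, -40.0],
              [0.002, 0.05, -30.0],
              [1e-6, 2e-6, 1.0]])
corners = np.array([[810.0, 604.0], [900.0, 606.0], [812.0, 690.0]])

# Board-plane (world, Z = 0) coordinates of the detected corners; paired
# with the projector pixels that produced them, these form the 2D-3D
# correspondences needed for projector calibration.
board_xy = pixels_to_board(H, corners)
```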
II-B.3 Extrinsic camera-projector calibration
To bring the optical ray from a camera pixel and the optical ray emanating from the corresponding projector pixel into a common coordinate system, the relative rotation and translation between the camera and projector coordinate systems need to be known. This is required for optical triangulation, since the computed 3D coordinate is the intersection point of the optical rays from camera and projector, and both rays must lie in a common coordinate system.

Since the intrinsic parameters of both projector and camera are already known, the same procedure as used for camera (and projector) calibration is applied, but with a single view of a physical (and projected) checkerboard, to obtain 2D-3D mappings for camera and projector separately. These mappings are used to estimate the rotation and translation of the camera and projector coordinate systems with respect to the world coordinate system, which are then combined to obtain the projector-to-camera rotation and translation parameters.
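The combination step can be written out explicitly. Assuming a world-to-camera pose (R_c, T_c) and a world-to-projector pose (R_p, T_p) from the single-view pose estimation, the projector-to-camera transform follows by eliminating the world frame; the example poses below are assumed for illustration.

```python
import numpy as np

def relative_pose(R_c, T_c, R_p, T_p):
    """Projector-to-camera rotation/translation from two world-referenced
    poses: X_c = R @ X_p + T for any point X_p in projector coordinates."""
    R = R_c @ R_p.T
    T = T_c - R @ T_p
    return R, T

def rot_z(a):
    """Rotation about the z-axis by angle a (helper for the example)."""
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a), np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

# Assumed example poses (world -> camera and world -> projector):
R_c, T_c = rot_z(0.10), np.array([10.0, 0.0, 500.0])
R_p, T_p = rot_z(-0.05), np.array([-150.0, 5.0, 480.0])
R, T = relative_pose(R_c, T_c, R_p, T_p)

# Consistency check: both routes from a world point give the same
# camera-frame coordinates.
X_w = np.array([30.0, -20.0, 400.0])
X_c_direct = R_c @ X_w + T_c
X_c_via = R @ (R_p @ X_w + T_p) + T
```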
Table I. Estimated calibration parameters (focal lengths and principal points in pixels; $k_1$, $k_2$ radial distortion coefficients).

Parameter | Camera | Projector
$f_x$ | 1362.2 | 2261.7
$f_y$ | 1372.2 | 2262.8
$c_x$ | 803.9 | 522.7
$c_y$ | 590.1 | 713.8
$k_1$ | 0.07 | 0.0
$k_2$ | 0.14 | 0.0
II-C. Stereo correspondence
Once the system's intrinsic and extrinsic geometry is defined, the stereo correspondence, which pairs the camera and projector points viewing a common 3D point, is estimated. The following subsections describe the modules used to estimate the stereo correspondence.
II-C.1 Pattern generation module
This module generates phase-shifted sinusoidal fringes and binary-coded patterns. Equation (1) gives the relation used for generating the phase-shifted sinusoidal fringes:

$$I_1(x,y) = A(x,y) + B(x,y)\cos\big(\varphi(x,y) - 2\pi/3\big)$$
$$I_2(x,y) = A(x,y) + B(x,y)\cos\big(\varphi(x,y)\big)$$
$$I_3(x,y) = A(x,y) + B(x,y)\cos\big(\varphi(x,y) + 2\pi/3\big) \qquad (1)$$

where $I_1$, $I_2$, $I_3$ represent the three sinusoidal signals at any point with phase $\varphi(x,y)$, successively shifted in phase by $2\pi/3$; $A(x,y)$ models a biasing factor and $B(x,y)$ the modulation intensity of the sinusoid. Further, the binary-coded patterns are designed such that the width of a bit-plane equals the width of one fringe (one sinusoidal cycle).
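A minimal generator for these fringes might look as follows, assuming a bias and modulation of 127.5 (8-bit output) and a fringe width of 64 projector pixels; these constants are illustrative, not the paper's settings.

```python
import numpy as np

# Three vertical sinusoidal fringe patterns per equation (1), phase
# shifted by -2*pi/3, 0 and +2*pi/3. A, B and the fringe width N are
# assumed values.
W, H = 1024, 768            # projector resolution used in this work
A, B, N = 127.5, 127.5, 64  # bias, modulation, pixels per fringe (assumed)

x = np.arange(W)
phi = 2.0 * np.pi * x / N   # phase grows by 2*pi across each fringe
patterns = [np.tile(A + B * np.cos(phi + s), (H, 1)).astype(np.uint8)
            for s in (-2.0 * np.pi / 3.0, 0.0, 2.0 * np.pi / 3.0)]
```

The binary-coded patterns would then use bit-planes whose stripe width equals N, so each code value indexes one sinusoidal cycle.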
II-C.2 Pattern projection and capture module
This module sequentially projects and captures the sinusoidal phase-shifted and binary-coded patterns. Figure 3 shows some captured vertical and horizontal fringe and binary-coded pattern images.
II-C.3 Phase wrapping module
As already mentioned, the periodic nature of the sinusoid leads to a recovered phase value that repeats after every period of $2\pi$. This makes the correspondence ambiguous, since multiple points with a common phase value exist. This situation is shown in (2), where the computed phase wraps around after every $2\pi$ interval; hence the computed phase is called the wrapped phase. Figure 4 shows the wrapped phase across the scene of interest.

$$\varphi(x,y) = \arctan\!\left(\frac{\sqrt{3}\,\big(I_1(x,y) - I_3(x,y)\big)}{2 I_2(x,y) - I_1(x,y) - I_3(x,y)}\right) \qquad (2)$$

where $\varphi(x,y) \in (-\pi, \pi]$ is the wrapped phase at pixel $(x,y)$.
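For the standard three-step relation of equation (1), the wrapped phase of equation (2) can be computed with the four-quadrant arctangent; the sketch below checks it on synthetic intensities with a known phase (the constants are assumed).

```python
import numpy as np

def wrap_phase(I1, I2, I3):
    """Wrapped phase per equation (2); arctan2 resolves the quadrant,
    giving values in (-pi, pi]."""
    return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)

# Self-check on synthetic intensities with known phase 0.7 rad:
phi_true = 0.7
I1, I2, I3 = (100.0 + 50.0 * np.cos(phi_true + s)
              for s in (-2.0 * np.pi / 3.0, 0.0, 2.0 * np.pi / 3.0))
phi = wrap_phase(I1, I2, I3)  # recovers 0.7
```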
II-C.4 Phase unwrapping module
The wrapped phase computed in (2) repeats its value after every $2\pi$ interval, making discrimination between pixels from different intervals non-trivial. This problem is resolved by assigning a unique period number to each sinusoidal cycle (interval). As described earlier, binary-coded patterns whose bit-plane width equals the fringe width (one sinusoidal cycle) are used for this purpose. The true phase of the incident signal at any camera pixel can therefore be represented as in equation (3). Since the wrapped phase is unrolled/unwrapped by this process, the computed phase is called the unwrapped phase. Figure 5 shows the corresponding vertical and horizontal unwrapped phase.

$$\Phi(x,y) = \varphi(x,y) + 2\pi\, C(x,y) \qquad (3)$$

where $\Phi(x,y)$ represents the unwrapped phase at pixel $(x,y)$ and $C(x,y)$ represents the decoded binary code (period number) at pixel $(x,y)$.
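The decoding and unwrapping steps can be sketched as follows; the bit-plane and wrapped-phase arrays are assumed toy data, not captures.

```python
import numpy as np

def decode_binary(bitplanes):
    """Decode thresholded bit-plane images (MSB first) into the
    per-pixel period number C(x, y)."""
    C = np.zeros_like(bitplanes[0], dtype=np.int64)
    for bp in bitplanes:
        C = (C << 1) | bp
    return C

def unwrap_phase(wrapped, C):
    """Equation (3): absolute phase = wrapped phase + 2*pi*C."""
    return wrapped + 2.0 * np.pi * C

bitplanes = [np.array([[1, 0], [0, 1]]),   # MSB plane (assumed toy data)
             np.array([[0, 1], [1, 0]])]   # LSB plane (assumed toy data)
C = decode_binary(bitplanes)               # [[2, 1], [1, 2]]
wrapped = np.array([[-2.0, 0.5], [3.0, -1.0]])
absolute = unwrap_phase(wrapped, C)
```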
Typically, thresholding techniques are used to recover the codeword at each pixel from the captured images. The major difficulty in accurate codeword extraction arises at the edges of the stripes in these patterns: in practice this region has a smooth gradient rather than a hard edge, leading to ambiguity in codeword extraction, as noted in [16], [17].
II-C.5 Absolute phase computation
The vertical unwrapped phase gives the projector x-coordinate $x_p$, whereas the horizontal unwrapped phase gives the projector y-coordinate $y_p$ corresponding to a camera pixel, for a fringe width $N$. Combining this information gives the projector pixel coordinates corresponding to the camera coordinates. Equation (4) expresses this camera-to-projector coordinate mapping. Figure 6 shows one example of the estimated stereo correspondence.

$$x_p = \frac{\Phi_v(x,y)}{2\pi}\,N, \qquad y_p = \frac{\Phi_h(x,y)}{2\pi}\,N \qquad (4)$$
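Equation (4) amounts to scaling the unwrapped phase by the fringe width; a minimal sketch, assuming a fringe width of 64 projector pixels:

```python
import numpy as np

N = 64  # assumed fringe width: projector pixels per 2*pi of phase

def phase_to_projector(phi_v, phi_h, N):
    """Equation (4): unwrapped phases to projector pixel coordinates."""
    x_p = phi_v / (2.0 * np.pi) * N   # projector column (vertical fringes)
    y_p = phi_h / (2.0 * np.pi) * N   # projector row (horizontal fringes)
    return x_p, y_p

x_p, y_p = phase_to_projector(np.pi, 4.0 * np.pi, N)  # -> (32.0, 128.0)
```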
II-D. Triangulation
The system calibration parameters and the camera-projector pixel-to-pixel correspondences can be used to compute ray-ray intersections [18]. Solving these equations gives the 3D coordinates of each real-world point with respect to the world coordinate system. Figure 7 shows an example of the 3D reconstructions obtained by solving these equations. However, the non-linear response of the projector to input voltage was found to add waviness to the 3D reconstruction, an effect also observed in [19].
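Since the two back-projected rays rarely intersect exactly in the presence of noise, a common choice (a sketch, not necessarily the exact method of [18]) is the midpoint of the shortest segment between the rays:

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1 + t1*d1 and
    o2 + t2*d2 (camera and projector rays in a common frame)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b = o2 - o1
    a11, a12, a22 = d1 @ d1, d1 @ d2, d2 @ d2
    den = a11 * a22 - a12 ** 2            # zero only for parallel rays
    t1 = (a22 * (b @ d1) - a12 * (b @ d2)) / den
    t2 = (a12 * (b @ d1) - a11 * (b @ d2)) / den
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# Sanity check with two rays that do meet at an assumed 3D point P:
P = np.array([1.0, 2.0, 3.0])
o_cam, o_proj = np.zeros(3), np.array([10.0, 0.0, 0.0])
X = triangulate(o_cam, P - o_cam, o_proj, P - o_proj)  # recovers P
```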
III. Accuracy Evaluation
To evaluate the measurement accuracy of the system, 3D distances between selected feature points on a planar checkerboard were measured and compared against their true values. The distance between the camera-projector baseline and the measurement object was chosen based on the common depth of field of the camera-projector system, such that acceptably sharp detail of the projected patterns could be acquired. In our case it was . Figure 8 shows the system pose used for the measurement experiment.
Four inner corners A, B, C, D (shown in Figure 9) were used to define the lengths AB, BC, CD, DA, AC, and BD. The measurement object was scanned 10 times to reduce the effect of non-systematic errors in the measurements. The average percentage absolute relative error defined in equation (5) was used as the measure of system accuracy.
$$e = \frac{100}{N} \sum_{i=1}^{N} \frac{\left|l_i^t - l_i^e\right|}{l_i^t} \qquad (5)$$

where $|\cdot|$ denotes the absolute value, $l_i^t$ is the true value of the $i$-th length measurement, $l_i^e$ is the corresponding value estimated from the 3D scanner data, and $N$ denotes the total number of measurements. To assess the precision of the system, we determined the average percentage deviation of the 10 samples (6 measurements per sample) with respect to their mean values, as defined in equation (6).
$$p = \frac{100}{MS} \sum_{j=1}^{M} \sum_{k=1}^{S} \frac{\left|l_{jk} - \bar{l}_j\right|}{\bar{l}_j} \qquad (6)$$

where $M$ denotes the total number of length measurements, $S$ denotes the total number of samples per length measurement, and $\bar{l}_j$ and $l_{jk}$ denote the corresponding mean value and the $k$-th sample of the $j$-th length respectively. Table II summarizes the observed measurement accuracy and precision of our system, with the measurement object at a distance of from the camera-projector baseline.
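Both metrics are straightforward to compute; the sketch below uses assumed example lengths and scan results, not the paper's data.

```python
import numpy as np

def accuracy_pct(true_vals, est_vals):
    """Equation (5): average percentage absolute relative error."""
    t, e = np.asarray(true_vals), np.asarray(est_vals)
    return 100.0 * np.mean(np.abs(t - e) / t)

def precision_pct(samples):
    """Equation (6): average percentage deviation of each sample from
    the per-length mean; samples has shape (S scans, M lengths)."""
    s = np.asarray(samples)
    means = s.mean(axis=0)
    return 100.0 * np.mean(np.abs(s - means) / means)

# Assumed example: 3 lengths (mm), 2 scans.
true_lengths = np.array([50.0, 50.0, 70.7])
samples = true_lengths + np.array([[0.2, -0.1, 0.3],
                                   [-0.1, 0.2, -0.2]])
acc = accuracy_pct(true_lengths, samples.mean(axis=0))
prec = precision_pct(samples)
```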
Table II. Observed measurement accuracy and precision.

Metric | Value (in %)
Measurement accuracy | 0.61
Precision | 0.29
IV. Conclusion
We have described a system for 3D scene reconstruction based on the coded phase-shift approach. The measurement accuracy and precision of the system were evaluated and found to be within 1% of the true and mean measurements respectively. The developed system is designed to be experimental in nature, allowing modification of the various structured-light and system calibration parameters. This provides a platform to investigate problems related to system calibration, specifically the effect on accuracy of the relative pose of camera and projector used for calibration (technically, sensor planning). In addition, it will allow us to study the projector-camera system non-linearity that causes waviness in the 3D scan results. Recently, [20], [21] have reported studies on the effect of global illumination and projector defocus on the accuracy of binary-coded and phase-shifting algorithms for stereo correspondence. These works have motivated our ongoing investigation of these issues, in order to establish objective criteria for selecting a particular spatial frequency for binary-coded and phase-shifting patterns. This will reduce the need for manual tweaking of system parameters to obtain optimal 3D measurement accuracy. Further, we plan to extend the system to perform accurate 360-degree scans of objects.
Acknowledgment
The authors would like to thank the technical staff and administration of the Computer Division, Bhabha Atomic Research Centre for providing the support and facilities to pursue this work.
References
 [1] S. Zhang and S. Yau, "High-resolution, real-time 3D absolute coordinate measurement based on a phase shifting method", Opt. Express 14, 2644-2649 (2006).
 [2] S. Zhang, High-resolution, Real-time 3D Shape Measurement, PhD thesis, Stony Brook University (2005).
 [3] Jason Geng, "Structured-light 3D surface imaging: a tutorial", Adv. Opt. Photon. 3, 128-160 (2011).
 [4] Sansoni, G., Carocci, M., and Rodella, R. (1999). "Three-Dimensional Vision Based on a Combination of Gray-Code and Phase-Shift Light Projection: Analysis and Compensation of the Systematic Errors", Appl. Opt. 38, 6565-6573.
 [5] 'structuredlight', Google Code. http://code.google.com/p/structuredlight/. Accessed 5 December 2012.
 [6] 'openlight', Google Code. http://code.google.com/p/openlight/. Accessed 5 December 2012.
 [7] Khoshelham, K., and Elberink, S.O. (2012). Accuracy and Resolution of Kinect Depth Data for Indoor Mapping Applications. Sensors, 12, 1437-1454.
 [8] Molnar, B., Toth, C.K., and Detrekoi, A. (2012). Accuracy Test of Microsoft Kinect for Human Morphologic Measurements, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XXXIX-B3, 543-547, doi:10.5194/isprsarchives-XXXIX-B3-543-2012.
 [9] Boehm, J. (2012). Natural User Interface Sensors for Human Body Measurement, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XXXIX-B3, 531-536, doi:10.5194/isprsarchives-XXXIX-B3-531-2012.
 [10] Chow, J.C.K., Ang, K.D., Lichti, D.D., and Teskey, W.F. (2012). Performance Analysis of a Low Cost Triangulation-Based 3D Camera: Microsoft Kinect System. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., Volume XXXIX-B5, XXII ISPRS Congress.
 [11] Tsai, Roger Y. (1987). A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses, IEEE Journal of Robotics and Automation, Vol. RA-3, No. 4, August 1987, pp. 323-344.
 [12] Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330-1334.
 [13] Camera Calibration Toolbox for Matlab. http://www.vision.caltech.edu/bouguetj/calib_doc/index.html. Accessed 3 December 2012.
 [14] J. Draréni, S. Roy, and P. Sturm, "Methods for Geometrical Video Projector Calibration", Machine Vision and Applications, 2012.
 [15] Song Zhang and Peisen S. Huang, "Novel method for structured light system calibration", Opt. Eng. 45(8), 083601 (August 21, 2006). http://dx.doi.org/10.1117/1.2336196.
 [16] Trobina, Marjan. "Error model of a coded-light range sensor". Technical Report, Communication Technology Laboratory (1995).
 [17] Qican Zhang, Xianyu Su, Liqun Xiang, and Xuezhen Sun, "3-D shape measurement based on complementary Gray code light", Optics and Lasers in Engineering, Volume 50, Issue 4, April 2012, Pages 574-579, ISSN 0143-8166, 10.1016/j.optlaseng.2011.06.024.
 [18] Brett R. Jones (2010). Augmenting Complex Surfaces with Projector-Camera Systems, M.S. Thesis, University of Illinois at Urbana-Champaign.
 [19] T. Hoang, B. Pan, D. Nguyen, and Z. Wang, "Generic gamma correction for accuracy enhancement in fringe projection profilometry", Opt. Lett. 35, 1992-1994 (2010).
 [20] M. Gupta and S. K. Nayar, "Micro Phase Shifting", pp. 813-820, 2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2012.
 [21] Gupta, Mohit, Agrawal, Amit, Veeraraghavan, Ashok, and Narasimhan, Srinivasa G. "A Practical Approach to 3D Scanning in the Presence of Interreflections, Subsurface Scattering and Defocus", International Journal of Computer Vision, Vol. 102, no. 1-3, pp. 33-55, 2013.