
Augmented Reality
AR Library - ARToolKit Tutorials

박종승
Dept. of CSE, Univ. of Incheon
jong@incheon.ac.kr
http://ecl.incheon.ac.kr/

ARToolKit as Starting Point

• ARToolKit - free tracking library
  – Library for vision-based AR applications
  – Open source (C language), multi-platform (SGI IRIX, PC Linux, PC Windows)
  – Overlays 3D virtual objects on real markers
    • Uses a single tracking marker
    • Determines camera pose information (6 DOF)
  – Includes utilities for marker-based interaction
  – ARToolKit website
    • http://www.hitl.washington.edu/artoolkit/
• Limitations
  – Uses a monocular camera setup, so 3D measurement is not possible
  – Can be sensitive to lighting conditions and to the marker material

Installation

• Requirements
  – QuickCam (must support a Video For Windows driver)
  – GLUT OpenGL interface library: GLUT 3.6 or later
    • Free download from http://reality.sgi.com/opengl/glut3/
• Installation
  – Unzip the archive into the desired ARToolKit directory
  – Subdirectories: bin, examples, include, lib, patterns, util
• Preparation
  – Print the patterns/pattXXX.pdf files and attach them to thin, stiff cards
• Running
  – Run bin/simpleTest.exe or bin/simpleTest2.exe
    • Can be sensitive to lighting: adjust the threshold value (default 100)
      within the range 0~255 as needed
    • Press the ESC key to quit (frame rate information is printed)

Introduction

• Steps (input video → thresholded video → virtual overlay)
  – Convert the live video image to a binary image (using a lighting threshold value)
  – Search the binary image for all square regions
  – Capture the pattern inside each square region and match it against the
    pre-trained pattern templates to decide whether it is a real marker
  – Once a marker is found, use the known square size and pattern orientation
    to compute the position of the real video camera (relative to the marker)
  – Once the real camera is computed, virtual objects can be drawn onto the image
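The first step above amounts to a per-pixel comparison against the lighting
threshold. A minimal sketch (plain C, not ARToolKit code; the buffer names
and sizes are hypothetical):

    /* binarize a grayscale frame with the lighting threshold (e.g. 100) */
    void threshold_image(const unsigned char *gray, unsigned char *bin,
                         int width, int height, int thresh)
    {
        int i;
        for (i = 0; i < width * height; i++)
            bin[i] = (gray[i] < thresh) ? 0 : 255;  /* dark pixels become square candidates */
    }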
System Flow

(figure: system flow diagram)

ARToolKit Applications

• Tangible Interaction - ARToolKit supports physically based interaction
• Face-to-face collaboration, remote conferencing

ARToolKit Outline

• 1. Mathematical & Algorithm Background
  – Pose & Position Estimation
  – Rectangle Extraction
• 2. Implementation
  – Camera Calibration
  – Pose Estimation
  – Background Video Stream Display

Pose & Position Estimation

• Coordinate Systems
  (figure: marker, camera, and screen coordinate systems)
Coordinate Transformations (1/3)

• 1. Relation between marker and camera (rotation & translation)

  \[
  \begin{bmatrix} X_C \\ Y_C \\ Z_C \\ 1 \end{bmatrix}
  =
  \begin{bmatrix}
    R_{11} & R_{12} & R_{13} & T_1 \\
    R_{21} & R_{22} & R_{23} & T_2 \\
    R_{31} & R_{32} & R_{33} & T_3 \\
    0 & 0 & 0 & 1
  \end{bmatrix}
  \begin{bmatrix} X_M \\ Y_M \\ Z_M \\ 1 \end{bmatrix}
  = T_{CM} \begin{bmatrix} X_M \\ Y_M \\ Z_M \\ 1 \end{bmatrix}
  \]

• 2. Relation between camera and ideal screen (perspective projection)

  \[
  \begin{bmatrix} h x_I \\ h y_I \\ h \end{bmatrix}
  =
  \begin{bmatrix}
    s f_x & 0 & x_c & 0 \\
    0 & s f_y & y_c & 0 \\
    0 & 0 & 1 & 0
  \end{bmatrix}
  \begin{bmatrix} X_C \\ Y_C \\ Z_C \\ 1 \end{bmatrix}
  = C \begin{bmatrix} X_C \\ Y_C \\ Z_C \\ 1 \end{bmatrix}
  \qquad \text{(C: camera parameters)}
  \]

Coordinate Transformations (2/3)

• 3. Relation between ideal and observed screen coordinates (image distortion parameters)

  \[
  d^2 = (x_I - x_0)^2 + (y_I - y_0)^2, \qquad p = 1 - f d^2
  \]
  \[
  x_O = p\,(x_I - x_0) + x_0, \qquad y_O = p\,(y_I - y_0) + y_0
  \]

  ((x_0, y_0): center coordinates of distortion; f: distortion factor)
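A direct C transcription of the distortion model in step 3 (a sketch, not
library code; the distortion center and factor correspond to entries of the
ARParam dist_factor array introduced later, though the library's exact
parameterization may differ):

    /* ideal -> observed screen coordinates, following the formulas above */
    void ideal2observed(double xI, double yI,
                        double x0, double y0, double f,
                        double *xO, double *yO)
    {
        double d2 = (xI - x0) * (xI - x0) + (yI - y0) * (yI - y0);
        double p  = 1.0 - f * d2;      /* radial scale factor */
        *xO = p * (xI - x0) + x0;
        *yO = p * (yI - y0) + y0;
    }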

Coordinate Transformations (3/3)

• 4. Scaling parameters for size adjustment

(figure: scaling for size adjustment)

Pose & Position Estimation (1/2)

• What is pose & position estimation?
  – 1. Marker coordinates: (X_M, Y_M, Z_M)
    • ↕ Known if T_CM is given
  – 2. Camera coordinates
    • ↕ Known
  – 3. Ideal screen coordinates
    • ↕ Known
  – 4. Observed screen coordinates: (x_O, y_O)
    • Implementation of the image distortion parameters
• How to get T_CM?
Pose & Position Estimation (2/2)

• How to get T_CM? Search for T_CM by minimizing the error (iterative optimization)

  \[
  \begin{bmatrix} h \hat{x}_i \\ h \hat{y}_i \\ h \end{bmatrix}
  = C\, T_{CM}
  \begin{bmatrix} X_{Mi} \\ Y_{Mi} \\ Z_{Mi} \\ 1 \end{bmatrix},
  \quad i = 1, 2, 3, 4
  \]
  \[
  err = \frac{1}{4} \sum_{i=1,2,3,4} \left\{ (x_i - \hat{x}_i)^2 + (y_i - \hat{y}_i)^2 \right\}
  \]

• How to set the initial condition for the optimization process
  – Geometrical calculation based on the coordinates of the 4 vertices
    • Independent in each image frame: good feature
    • Unstable result (jitter occurs): bad feature
  – Use of information from the previous image frame
    • Needs the previous frame information
    • Cannot be used for the first frame
    • Stable results (this does not mean accurate results)
  – [ARToolKit supports both; see 'simpleTest2'.]

Rectangle Extraction

• Steps
  – 1. Thresholding, labeling, feature extraction (area, position)
  – 2. Contour extraction
  – 3. Fitting of four straight lines
    • Little fitting error → rectangle
  – [This method is very simple, so it works very fast.]
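A sketch of the error term being minimized (plain C, not library code):
project the four marker vertices through C and T_CM, then average the
squared distances to the detected vertices. The array names here are
hypothetical.

    double reprojection_err(double C[3][4], double T[3][4],
                            double marker_pts[4][3],  /* (X_Mi, Y_Mi, Z_Mi) */
                            double detected[4][2])    /* detected (x_i, y_i) */
    {
        double err = 0.0;
        int i, r;
        for (i = 0; i < 4; i++) {
            double Xc[3], s[3], xh, yh;
            /* camera coordinates: [Xc Yc Zc]^T = T_CM * [X_Mi Y_Mi Z_Mi 1]^T */
            for (r = 0; r < 3; r++)
                Xc[r] = T[r][0]*marker_pts[i][0] + T[r][1]*marker_pts[i][1]
                      + T[r][2]*marker_pts[i][2] + T[r][3];
            /* ideal screen coordinates: [h*x^ h*y^ h]^T = C * [Xc Yc Zc 1]^T */
            for (r = 0; r < 3; r++)
                s[r] = C[r][0]*Xc[0] + C[r][1]*Xc[1] + C[r][2]*Xc[2] + C[r][3];
            xh = s[0] / s[2];
            yh = s[1] / s[2];
            err += (detected[i][0]-xh)*(detected[i][0]-xh)
                 + (detected[i][1]-yh)*(detected[i][1]-yh);
        }
        return err / 4.0;   /* err = (1/4) * sum of squared residuals */
    }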

Camera Calibration

• Camera parameters
  – Perspective projection matrix
  – Image distortion parameters
• ARToolKit has two methods for camera calibration
  – Accurate two-step method
  – Easy one-step method
• Two-step method
  – Step 1: getting the distortion parameters
  – Step 2: getting the perspective projection parameters

Accurate two-step method (1/3)

• Using a dot pattern and a grid pattern

(figure: calibration dot pattern and grid pattern)
Accurate two-step method (2/3)

• Step 1: getting the distortion parameters: 'calib_dist'
  – Select the dots with the mouse; the distortion parameters are obtained
    by automatic line-fitting
  – Take the pattern pictures as large as possible
  – Slant the pattern in various directions at a large angle
  – 4 times or more

Accurate two-step method (3/3)

• Step 2: getting the perspective projection matrix: 'calib_cparam'
  – Manual line-fitting

Easy one-step method: 'calib_camera2'

• Same operation as 'calib_dist'
• Gets all camera parameters, including the distortion parameters and the
  perspective projection matrix
• Does not require a careful setup
• Accuracy is good enough for image overlay
  – [But not good enough for 3D measurement.]

Camera Parameter Implementation

• Camera parameter structure
    typedef struct {
        int    xsize, ysize;
        double mat[3][4];
        double dist_factor[4];
    } ARParam;
• Adjust the camera parameters for the input image size
    int arParamChangeSize( ARParam *source,
                           int xsize, int ysize, ARParam *newparam );
• Read camera parameters from a file
    int arParamLoad( char *filename, int num, ARParam *param, … );
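A minimal usage sketch of these calls (the file name is the one used by the
application code later in the deck; the 320x240 capture size is an example
value, and arInitCparam is introduced on a later slide):

    ARParam wparam, cparam;
    if( arParamLoad("Data/camera_para.dat", 1, &wparam) < 0 ) exit(0);
    /* scale the stored calibration to the actual capture size, e.g. 320x240 */
    arParamChangeSize( &wparam, 320, 240, &cparam );
    arInitCparam( &cparam );   /* make it the active parameter set (see 'ar.h') */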
Notes on Image Processing (1/2)

• Image size for marker detection
  – AR_IMAGE_PROC_IN_FULL
    • Full-size images are used for marker detection
    • Takes more time, but accuracy is better
    • Not good for interlaced images
  – AR_IMAGE_PROC_IN_HALF
    • Re-sampled half-size images are used for marker detection
    • Takes less time, but accuracy is worse
    • Good for interlaced images
  – External variable: arImageProcMode in 'ar.h'
  – Default value: DEFAULT_IMAGE_PROC_MODE in 'config.h'
• Use of tracking history
  – Marker detection
    • With tracking history: arDetectMarker();
    • Without tracking history: arDetectMarkerLite();
  – How the tracking history is used
    • Error correction of pattern identification
    • Lost marker insertion

Notes on Image Processing (2/2)

• Accuracy vs. speed of pattern identification
  – Pattern normalization takes much time
    • This is a problem when using many markers
  – Normalization process: resolution conversion, then normalization
  – In 'config.h':
      #define AR_PATT_SAMPLE_NUM 64
      #define AR_PATT_SIZE_X     16
      #define AR_PATT_SIZE_Y     16

    pattern size   identification accuracy   speed
    large          good                      slow
    small          bad                       fast
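A sketch combining the two switches above (dataPtr, thresh, marker_info, and
marker_num are the variables of the application code later in the deck):

    arImageProcMode = AR_IMAGE_PROC_IN_HALF;   /* faster; suits interlaced video */
    /* detect without tracking history (no error correction / lost-marker insertion) */
    if( arDetectMarkerLite(dataPtr, thresh, &marker_info, &marker_num) < 0 ) {
        cleanup();
        exit(0);
    }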

Pose and Position Estimation

• Two types of initial condition
  – 1. Geometrical calculation based on the 4 vertices in screen coordinates
      double arGetTransMat( ARMarkerInfo *marker_info,
                            double center[2], double width, double conv[3][4] );
  – 2. Use of information from the previous image frame
      double arGetTransMatCont( ARMarkerInfo *marker_info,
                                double prev_conv[3][4],
                                double center[2], double width, double conv[3][4] );
  – [See 'simpleTest2.c'.]
• Use of the estimation accuracy
  – arGetTransMat() minimizes the 'err' defined earlier:

    \[
    \begin{bmatrix} h \hat{x}_i \\ h \hat{y}_i \\ h \end{bmatrix}
    = C\, T_{CM}
    \begin{bmatrix} X_{Mi} \\ Y_{Mi} \\ Z_{Mi} \\ 1 \end{bmatrix},
    \quad i = 1, 2, 3, 4
    \qquad
    err = \frac{1}{4} \sum_{i=1,2,3,4} \left\{ (x_i - \hat{x}_i)^2 + (y_i - \hat{y}_i)^2 \right\}
    \]

  – It returns this minimized 'err'
  – If 'err' is still big:
    • → a miss-detected marker
    • Cause: use of camera parameters from a bad calibration

Background Video Display

• Texture mapping vs. glDrawPixels()
  – Performance depends on the HW and the OpenGL driver
• Mode
  – External variable: argDrawMode in 'gsub.h'
  – #define DEFAULT_DRAW_MODE in 'config.h'
    • AR_DRAW_BY_GL_DRAW_PIXELS
    • AR_DRAW_BY_TEXTURE_MAPPING
  – Note: glDrawPixels() does not compensate for image distortion
  – [See 'examples/test/graphicsTest.c' and 'modeTest'.]
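Since arGetTransMat() returns the minimized 'err', a caller can reject
doubtful poses. A sketch (the 10.0 cutoff is an arbitrary example value; the
variables are those of the application code later in the deck):

    double err = arGetTransMat( &marker_info[k],
                                patt_center, patt_width, patt_trans );
    if( err > 10.0 ) {        /* large residual: likely a miss-detected marker */
        argSwapBuffers();     /* skip drawing for this frame */
        return;
    }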
ARToolKit Library Functions (1/2)

• ARToolKit library
  – libAR: main functions — marker tracking, calibration, parameter computation, etc.
  – libARMulti: multi-pattern functions; extends libAR
  – libARvideo: video-related functions that capture video frames; uses the
    video capture functions of the Microsoft Vision SDK
  – libARgsub: OpenGL utilities; graphics functions based on the OpenGL and
    GLUT libraries
  – libARgsubUtil: additions to libARgsub
• Library hierarchy

    libARMulti   libARgsubUtil
    libARgsub    libARvideo
    libAR

ARToolKit Library Functions (2/2)

• ARToolKit library
  – libAR
    • arMalloc, arInitCparam, arLoadPatt, arDetectMarker, arDetectMarkerLite,
      arGetTransMat, arGetTransMatCont, arGetTransMat2, arGetTransMat3,
      arGetTransMat4, arGetTransMat5, arFreePatt, arActivatePatt, arDeactivatePatt,
      arSavePatt, arUtilMatInv, arUtilMatMul, arUtilMat2QuatPos, arUtilQuatPos2Mat,
      arUtilTimer, arUtilTimerReset, arUtilSleep, arLabeling, arGetImgFeature,
      arDetectMarker2, arGetMarkerInfo, arGetCode, arGetPatt, arGetLine, arGetContour,
      arModifyMatrix, arGetAngle, arGetRot, arGetNewMatrix, arGetInitRot
  – libARMulti
    • arMultiReadConfigFile, arMultiGetTransMat, arMultiActivate, arMultiDeactivate,
      arMultiFreeConfig
  – libARvideo
    • arVideoDispOption, arVideoOpen, arVideoClose, arVideoCapStart, arVideoCapStop,
      arVideoCapNext, arVideoGetImage, arVideoInqSize, ar2VideoDispOption,
      ar2VideoOpen, ar2VideoClose, ar2VideoCapStart, ar2VideoCapNext,
      ar2VideoCapStop, ar2VideoGetImage, ar2VideoInqSize
  – libARgsub
    • argInit, argLoadHMDparam, argCleanup, argSwapBuffers, argMainLoop,
      argDrawMode2D, argDraw2dLeft, argDraw2dRight, argDrawMode3D, argDraw3dLeft,
      argDraw3dRight, argDraw3dCamera, argConvGlpara, argConvGLcpara, argDispImage,
      argDispHalfImage, argDrawSquare, argLineSeg, argLineSegHMD, argInqSetting
  – libARgsubUtil
    • argUtilCalibHMD
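As one example from the libAR utilities above, a sketch converting a 3x4 pose
matrix (such as patt_trans from the application code later in the deck) into
a quaternion and position; the signature int arUtilMat2QuatPos( double mat[3][4],
double q[4], double pos[3] ) and its 0-on-success return are assumptions:

    double q[4], pos[3];
    if( arUtilMat2QuatPos( patt_trans, q, pos ) == 0 ) {
        /* pos is the camera-relative marker translation */
        printf("position = (%f, %f, %f)\n", pos[0], pos[1], pos[2]);
    }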

Basic Structures

• Information on detected markers is defined in the ARMarkerInfo structure (ar.h)

    typedef struct {
        int    area;
        int    id;            // marker identity number
        int    dir;
        double cf;            // confidence value (0.0~1.0) that the marker
                              // has been correctly identified
        double pos[2];        // center of the marker in ideal screen coords
        double line[4][3];    // line equations for the 4 sides of the marker
                              // in ideal screen coords: the three values
                              // line[X][0/1/2] are a, b, c in ax + by + c = 0
        double vertex[4][2];  // positions of the 4 marker vertices
                              // in ideal screen coords
    } ARMarkerInfo;

Basic Functions in 'ar.h'

• See 'ar.h'!
• Load initial parameters and trained patterns:
    int arInitCparam( ARParam *param );
    int arLoadPatt( char *filename );
• Detect markers and the camera position:
    int arDetectMarker( ARUint8 *dataPtr, int thresh,
                        ARMarkerInfo **marker_info, int *marker_num );
    int arDetectMarkerLite( ARUint8 *dataPtr, int thresh,
                            ARMarkerInfo **marker_info, int *marker_num );
    int arGetTransMat( ARMarkerInfo *marker_info,
                       double pos3d[4][2], double trans[3][4] );
    int arSavePatt( ARUint8 *image,
                    ARMarkerInfo *marker_info, char *filename );
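A small sketch reading the ARMarkerInfo fields above from the list that
arDetectMarker() fills in (variable names as in the application code later
in the deck):

    int j;
    for( j = 0; j < marker_num; j++ ) {
        printf("marker %d: id=%d  cf=%.2f  center=(%.1f, %.1f)\n",
               j, marker_info[j].id, marker_info[j].cf,
               marker_info[j].pos[0], marker_info[j].pos[1]);
    }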
Basic Functions in 'video.h'

• See 'video.h'!
• Commonly used:
    int arVideoOpen( void );
    int arVideoClose( void );
    int arVideoInqSize( int *x, int *y );
    unsigned char *arVideoGetImage( void );

Sample Patterns

• SampPatt1, SampPatt2, hiroPatt, kanjiPatt
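A minimal capture sketch using the 'video.h' calls listed above (note that
these prototypes take no arguments, while the application code later in the
deck passes a configuration string to arVideoOpen):

    int x, y;
    unsigned char *image;
    if( arVideoOpen() < 0 ) exit(0);      /* open the video path */
    arVideoInqSize( &x, &y );             /* query the capture size */
    while( (image = arVideoGetImage()) == NULL )
        arUtilSleep(2);                   /* wait for the first frame */
    /* ... process the x*y image ... */
    arVideoClose();                       /* close the video path */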

Developing an AR Application

• Steps for writing an application
  – 1. Initialize the video path; read the marker pattern file and the camera parameters
  – Repeat steps 2-5:
    – 2. Grab a video input frame
    – 3. Detect the markers in the video input frame and recognize the patterns
    – 4. Calculate the camera transformation relative to the detected pattern
    – 5. Draw a virtual object on the detected pattern
  – 6. Close the video path
• Steps and corresponding functions:
  – 1. Initialize the application: 'init'
  – 2. Grab a video input frame: 'arVideoGetImage'
  – 3. Detect the markers: 'arDetectMarker'
  – 4. Calculate the camera transformation: 'arGetTransMat'
  – 5. Draw the virtual objects: 'draw'
  – 6. Close the video path down: 'cleanup'

AR Application Code: main

#include <windows.h>
#include <stdio.h>
#include <stdlib.h>
#include <GL/gl.h>
#include <GL/glut.h>
#include <AR/gsub.h>
#include <AR/video.h>
#include <AR/param.h>
#include <AR/ar.h>

int main(int argc, char **argv)
{
    init();              // initialize video path, read marker and
                         // camera parameters, set up graphics window
    arVideoCapStart();   // start video image capture
    argMainLoop( NULL, keyEvent, mainLoop );  // start the loop
                         // keyEvent: keyboard event function
                         // mainLoop: main graphics rendering function
                         // prototype defined in lib/Src/Gl/gsub.c
    return (0);
}
AR Application Code: init

int xsize, ysize;
char *vconf = "flipV,showDlg";   // video configuration;
                                 // see video.h for supported params
char *cparam_name = "Data/camera_para.dat";
ARParam cparam;
char *patt_name = "Data/patt.hiro";
int thresh = 100, count = 0, mode = 1, patt_id;
double patt_width = 80.0;        // marker width
double patt_center[2] = {0.0, 0.0}, patt_trans[3][4];

static void init( void )
{
    ARParam wparam;
    /* open the video path */
    if( arVideoOpen( vconf ) < 0 ) exit(0);
    /* find the size of the window */
    if( arVideoInqSize(&xsize, &ysize) < 0 ) exit(0);
    printf("Image size (x,y) = (%d,%d)\n", xsize, ysize);
    /* set the initial camera parameters */
    if( arParamLoad(cparam_name, 1, &wparam) < 0 ) {
        printf("Camera parameter load error !!\n"); exit(0);
    }
    arParamChangeSize( &wparam, xsize, ysize, &cparam );
    arInitCparam( &cparam );
    printf("*** Camera Parameter ***\n");
    arParamDisp( &cparam );
    /* load the trained marker pattern */
    if( (patt_id = arLoadPatt(patt_name)) < 0 ) {
        printf("pattern load error !!\n"); exit(0);
    }
    /* open the graphics window */
    argInit( &cparam, 1.0, 0, 0, 0, 0 );
}

AR Application Code: mainLoop (1/2)

static void mainLoop(void)
{
    static int contF = 0;
    ARUint8      *dataPtr;
    ARMarkerInfo *marker_info;
    int           marker_num;
    int           j, k;

    /* grab a video frame */
    if( (dataPtr = (ARUint8 *)arVideoGetImage()) == NULL ) {
        arUtilSleep(2);
        return;
    }
    if( count == 0 ) arUtilTimerReset();
    count++;
    argDrawMode2D();
    argDispImage( dataPtr, 0, 0 );
    /* detect the markers in the video frame */
    if( arDetectMarker(dataPtr, thresh,
                       &marker_info,  /* a list of marker structures */
                       &marker_num    /* number of detected markers  */
                       ) < 0 ) {
        cleanup();
        exit(0);
    }
    arVideoCapNext();
    /* continued on the next slide */

AR Application Code: mainLoop (2/2)

    /* continued */
    /* pick the marker with the highest confidence value */
    k = -1;
    for( j = 0; j < marker_num; j++ ) {
        if( patt_id == marker_info[j].id ) {
            if( k == -1 ) k = j;
            else if( marker_info[k].cf < marker_info[j].cf ) k = j;
        }
    }
    if( k == -1 ) {
        contF = 0;
        argSwapBuffers();
        return;
    }
    /* get the transformation between the marker and the real camera */
    if( mode == 0 || contF == 0 ) {
        arGetTransMat(&marker_info[k],
                      patt_center, patt_width, patt_trans);
    } else {
        arGetTransMatCont(&marker_info[k],
                          patt_trans, patt_center, patt_width, patt_trans);
    }
    /* the real camera position & orientation relative to marker k
       are now in the 3x4 matrix patt_trans */
    contF = 1;
    draw( patt_trans );
    argSwapBuffers();
}

AR Application Code: draw

static void draw( double trans[3][4] )
{
    double  gl_para[16];
    GLfloat mat_ambient[]     = {0.0, 0.0, 1.0, 1.0};
    GLfloat mat_flash[]       = {0.0, 0.0, 1.0, 1.0};
    GLfloat mat_flash_shiny[] = {50.0};
    GLfloat light_position[]  = {100.0, -200.0, 200.0, 0.0};
    GLfloat ambi[]            = {0.1, 0.1, 0.1, 0.1};
    GLfloat lightZeroColor[]  = {0.9, 0.9, 0.9, 0.1};

    argDrawMode3D();
    argDraw3dCamera( 0, 0 );
    glClearDepth( 1.0 );
    glClear(GL_DEPTH_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    /* load the camera transformation matrix */
    argConvGlpara(trans, gl_para);
    glMatrixMode(GL_MODELVIEW);
    glLoadMatrixd( gl_para );
    /* draw a lit cube sitting on the marker (cube center 25 above the plane) */
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, light_position);
    glLightfv(GL_LIGHT0, GL_AMBIENT, ambi);
    glLightfv(GL_LIGHT0, GL_DIFFUSE, lightZeroColor);
    glMaterialfv(GL_FRONT, GL_SPECULAR, mat_flash);
    glMaterialfv(GL_FRONT, GL_SHININESS, mat_flash_shiny);
    glMaterialfv(GL_FRONT, GL_AMBIENT, mat_ambient);
    glMatrixMode(GL_MODELVIEW);
    glTranslatef( 0.0, 0.0, 25.0 );
    glutSolidCube(50.0);
    glDisable( GL_LIGHTING );
    glDisable( GL_DEPTH_TEST );
}
AR Application Code: keyEvent, mouseEvent, cleanup

static void keyEvent( unsigned char key, int x, int y)
{
    /* quit if the ESC key is pressed */
    if( key == 0x1b ) {
        printf("*** %f (frame/sec)\n", (double)count/arUtilTimer());
        cleanup();
        exit(0);
    }
    /* toggle between one-shot and continuous pose estimation */
    if( key == 'c' ) {
        printf("*** %f (frame/sec)\n", (double)count/arUtilTimer());
        count = 0;
        mode = 1 - mode;
        if( mode ) printf("Continuous mode: Using arGetTransMatCont.\n");
        else       printf("One shot mode: Using arGetTransMat.\n");
    }
}

static void mouseEvent(int button, int state, int x, int y)
{
}

/* cleanup function called when the program exits */
static void cleanup(void)
{
    arVideoCapStop();   /* stop the video processing */
    arVideoClose();     /* close down the video path */
    argCleanup();
}

Recognizing different patterns (1/2)

• Marker objects file
  – Specifies information about the marker objects to be recognized
  – Also specifies the pattern file for each marker
  – Format (with example values):

      Format                           Example
                                       #pattern 1
      Name                             cone
      Pattern recognition file name    Data/hiroPatt
      Width of tracking marker         80.0

• How to create a new pattern
  – 1. Print patterns/blankPatt.gif (a black square with an empty white
    square inside)
  – 2. Create the desired black-and-white or color pattern and place it in
    the inner white square of this image
    • A good pattern is asymmetric and has no fine detail
  – 3. After creating the new pattern, move the file to bin and run
    bin/mk_patt (source code in util/mk_patt.c)
    • Enter the camera parameter filename when asked

Recognizing different patterns (2/2)

  – 4. When mk_patt runs, a video window opens
    • Attach the pattern to be trained to a flat surface, adjust the lighting
      conditions to the desired environment, and bring the pattern into view
    • Point the camera straight down at the pattern from directly above
    • Adjust the camera until a red/green square appears around the pattern
    • Rotate the camera so that the red corner is at the top-left corner of
      the video image
    • When the alignment is done, click the left mouse button
      – Enter the pattern filename (a bitmap image file) when asked
    • You can repeat this for other patterns; click the right mouse button
      when you want to quit the program
  – (Note) possible sample patterns (figure)

Camera Calibration Utility

• Generates a camera parameter file for the specific camera that is being used
  – 1. Print the two patterns:
    • patterns/calib_dist.pdf (a 6x4 dot pattern; it must be printed so that
      the dots are 40mm apart)
    • patterns/calib_cpara.pdf (a grid of lines; it must be printed so that
      the lines are 40mm apart)
  – 2. Run bin/calib_dist to compute the center point and lens distortion of
    the camera image, using the calib_dist.pdf image
  – 3. Run bin/calib_cparam to compute the camera focal length and the other
    camera parameters, using calib_cpara.pdf
Camera Calibration Utility: calib_dist (1/2)

• Computes the lens distortion by measuring the spacing of the 6x4 dots in
  the calib_dist.pdf image
  – 1. Adjust the camera so that all dots are visible, then left-click to
    freeze the video
  – 2. Left-drag the black rectangle onto the position of each dot
    • The dot order must start from the top-left corner dot and follow the
      prescribed order
      – When the black rectangle is moved over a dot, the dot is found and
        its center is marked with a red cross
      – Once all 24 dots have been found, left-click again
        » The dot positions are saved and the video is unfrozen
      – Note: right-clicking discards the input values and unfreezes the video

Camera Calibration Utility: calib_dist (2/2)

  – 3. Repeat step 2 above for 5~10 images
    • Each image should have a different angle and position
    • The more images you process, the more accurate the calibration
    • After 5~10 images, right-click to stop the image capture
      – The center position and the camera distortion values are computed
      – This takes time; the printed results should be written down
  – 4. (Optional) To check that the results are accurate, left-click
    • Red lines passing through each dot are drawn on the first grabbed image
    • Repeated left-clicks show the following grabbed images
  – 5. If the results are satisfactory, right-click to quit, then run calib_cparam

Camera Calibration Utility: calib_cparam (1/3)

• calib_cparam uses the pattern calib_cparam.pdf (7 horizontal lines and 9
  vertical lines) to compute the camera focal length and the other parameters
  – 1. Enter the center coordinates (X,Y) and the distortion ratio computed
    previously; a live video window appears
  – 2. Move the camera so that it looks straight down at the pattern, and
    place it as close as possible while all grid lines are still included
    and appear as large as possible
  – 3. Left-click to grab the image
    • A white horizontal line is displayed over the image
    • Use the up/down arrow keys to move it up/down, and the left/right keys
      to rotate it clockwise/counterclockwise, until the white horizontal
      line lies exactly on the topmost grid line
      – Once aligned, press the enter key
        » The white line turns blue and another white line appears
    • Repeat this for all 7 horizontal lines
    • After all the horizontal lines are done, a vertical line appears in
      white; repeat for all 9 vertical lines, in order from leftmost to rightmost

Camera Calibration Utility: calib_cparam (2/3)

• The 16 lines (7 horizontal, 9 vertical) are handled from top to bottom,
  then from left to right
  – 4. After completing step 3 for one image, move the camera 100mm further
    away from the grid pattern and run step 3 again
    • (Note: the camera must keep looking straight down at the pattern)
    • Repeat step 3 at 100mm increments five times in total (at the fifth,
      the distance to the pattern is 500mm)
  – 5. After the 5 repetitions, the camera parameters are computed automatically
    • Enter a filename under which to save the parameters
  – 6. After saving, right-click to quit
    • This file can be used in a program in the same way as the
      'camera_para.dat' file → the tracking results should improve
Camera Calibration Utility: calib_cparam (3/3)

• Note: the grid line spacing is 40mm, and the pattern was moved 100mm each time
  – 40: the grid line interval; 100: the distance the pattern should be moved
    back from the camera each time
  – These are fixed values in util/calib_cparam/calib_cparam_sub.c
  – To modify them, change the 40 and 100 in the following code:
      inter_coord[k][j][i+7][0] = 40.0*i;
      inter_coord[k][j][i+7][1] = 40.0*j;
      inter_coord[k][j][i+7][2] = 100.0*k;
  – To modify the number of repetitions (5), change the 5 in:
      *loop_num = 5;

Limitations

• The entire marker must be fully visible
  – Does not work if the marker is partially occluded
  – This makes it difficult to insert large virtual objects
• If the pattern is small, it is no longer detected once the camera moves a
  little further away

    pattern size (inches)   usable range (inches)
    2.75                    16
    3.50                    25
    4.25                    34
    7.37                    50

• Patterns with complex shapes are hard to detect
  – Simple patterns made of large black and white regions work best
    • Replacing the 4.25-inch pattern with a complex image reduces the
      tracking range from 34 to 15 inches
• Detection is difficult when the marker is tilted steeply
• Lighting conditions affect detection
  – Light reflections and glare spots on the marker make detection difficult
  – Use non-reflective material (e.g., attach black velvet fabric or paper to
    a white base)

References
• Inside ARToolKit, slides by Hirokazu Kato (Hiroshima City University)
• ARToolKit Manual (version 2.33), November 2000

