Measurement Accuracy of Lidar-based SLAM Systems
Kaarta Report 001
Abstract
We assess the accuracy of a lidar-based mapping system by measuring differences between point cloud measurements. We use two methods to determine accuracy. The first method computes distances between derived points, and the second method compares point clouds after first aligning them. The results show that the lidar-based SLAM
system compares very well with stationary measurements from a high-end lidar.
September 2016
Kaarta
5001 Baum Blvd, Suite 430, Pittsburgh, PA 15213
Table of Contents
Overview
Background
    Accuracy Specifications and Factory Calibration
    Assessing Accuracy
A Comparison of Two Lidar-based Systems
    Acquiring the Point Clouds
    Derived-Point to Derived-Point Measurements
    Comparing Point Clouds
Summary
References
Overview
What is the accuracy of a 3D mapping system? Manufacturers often state that the accuracy of the system equals
that of the sensor used to collect the data, but this is only part of the answer, since subsequent calculations
also affect the result. Even the geometry and surfaces of the local environment can affect the measurements.
Also, accuracy is often confused with precision. They aren’t the same. In this paper, we examine the
problem, define accuracy, and provide some answers.
Background
Real Earth has developed a mobile 3D mapping system that creates a map of its environment even while
moving continuously. It creates the map using only the range measurements it acquires along the way
and a low-cost MEMS IMU. No GPS, prior map, or other environmental
infrastructure is required. The accuracy of our point cloud-based 3D mapping system is determined by both
the accuracy and precision of the sensing device and by the software used to create the map. Since a
global 3D map is computed by incorporating a sequence of local maps, any noise or inaccuracy in the
data propagates into downstream calculations.
Figure 1: The Stencil product from Real Earth is on the left and the FARO Focus 3D X330 on the right. Relative size is not to scale.
The robustness of the process for merging partially overlapping point clouds depends on the presence of strong features such as edges in the environment to establish correspondences between the point clouds. If the sensor is moving while acquiring point clouds, the algorithms use sensor measurements to map the environment and locate the sensor within that map. A benefit of determining the sensor position is determining the outward direction of the surface normal. For the Real Earth products, every point measurement in the cloud is made with the sensor in a different position and orientation. This process for analyzing the range data to build a map and determine localization is known as simultaneous localization and mapping, or SLAM.
Various calculations and assumptions about measurement error are included when creating the resultant map. Conditions affecting data from real-world measurements include lighting, atmospheric effects, environment, surface materials and textures, and, in the case of lidar, laser beam divergence. Divergence forms a larger spot at greater distances. On one hand, the resolution and precision of the instrument may be very good; on the other hand, there is a vast amount of data, hundreds of thousands of points, which are low-pass filtered to mitigate the noise. The accuracy specification of the instrument is only a starting point in understanding the accuracy of the measurement process.
Accuracy Specifications and Factory Calibration
A lidar typically produces data in the format <angle><angle><range>. There may also be reflectance data and, in some cases, a choice of signal or first return. These are usually different. Factory calibration defines the location and orientation of an instrument coordinate system inside of the lidar sensor head. The direction and range to a target in that coordinate system is determined from encoder positions that define the laser beam direction. If the calibration is accurate, scanned flat surfaces will be measured to be flat, and we will see accurate measurements of distances between derived points such as corners, radii of curvature, distances between parallel surfaces, and the angles between flat surfaces, etc. However, range noise can propagate into fitted parameters such as surface orientation or surface curvature. This, in turn, leads to uncertainty in the result. The resolution needed to detect a feature is a separate topic.
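To make the conversion concrete, the Python/NumPy sketch below turns <angle><angle><range> samples into Cartesian points. The azimuth/elevation convention and the omission of per-beam calibration offsets are simplifying assumptions; a real instrument applies its factory calibration at this step.

```python
import numpy as np

def polar_to_cartesian(azimuth, elevation, rng):
    """Convert lidar <angle><angle><range> samples to Cartesian points.

    azimuth and elevation are in radians, rng in meters; axis conventions
    and per-beam calibration offsets are instrument-specific and omitted."""
    x = rng * np.cos(elevation) * np.cos(azimuth)
    y = rng * np.cos(elevation) * np.sin(azimuth)
    z = rng * np.sin(elevation)
    return np.column_stack([x, y, z])
```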
Factory calibration is static, though the sensor may be moved between measurements. Since the Real Earth product processes a stream of range measurements while moving, the accuracy of the resultant map needs to be compared with ground truth in a meaningful way.
To measure the accuracy, we compared outputs of two lidar units. One unit is the Real Earth Stencil product and the other is a FARO unit; we used a FARO Focus 3D X330 lidar to define ground truth. To assess accuracy, we compared derived-point to derived-point measurements, and then compared the point clouds generated by the two systems. Clearly the FARO, while an excellent instrument, is not true ground truth; it too has a measurable accuracy that the manufacturer provides.
Assessing Accuracy
The accuracy of any sensor is determined in comparison to ground truth. Importantly, with a point cloud system, specific feature points on a surface are not measured, but derived. In other words, the laser beam from a laser rangefinder or a ray from a camera pixel does not intersect the surface at a specifically targeted location. Thus, derived points are computed, not directly measured. Examples of derived points include the center of a sphere, which is computed from data points collected from the surface of the sphere, or the location of a corner, which is computed from the intersection of three planes. One way to determine accuracy is to compare derived-point to derived-point measurements.
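As a concrete illustration of a derived point, the short Python/NumPy sketch below estimates a sphere's center and radius from surface samples using an algebraic least-squares fit. It is an illustrative sketch, not the fitting software used in this study; the RMS residual it returns is one way to quantify confidence in the fitted center.

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit.

    points: (N, 3) array of samples on the sphere surface.
    Returns (center, radius, rms), where rms is the RMS radial residual."""
    # |p|^2 = 2 p.c + (r^2 - |c|^2) is linear in c and (r^2 - |c|^2)
    A = np.column_stack([2.0 * points, np.ones(len(points))])
    b = np.sum(points ** 2, axis=1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = w[:3]
    radius = np.sqrt(w[3] + center @ center)
    residuals = np.linalg.norm(points - center, axis=1) - radius
    return center, radius, np.sqrt(np.mean(residuals ** 2))
```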
Another way to assess accuracy is to measure the shape of a surface with known geometry, and then compare the results against a model. Examples include the radius of a sphere or cylinder, or the flatness of a plane, or the relative orientations of two planes, or a CAD model. However, creating large high-accuracy shapes of known geometry is very costly.
Point clouds can be compared with carefully surveyed points acquired using expensive and sophisticated equipment. Often the output from these devices is accepted as a reference standard, but these measurements also have errors and noise associated with them. However, these measurements can be used as ground truth if their accuracy is at least an order of magnitude better than the system under test.
Accuracy is often confused with precision, repeatability, and other terms. It is important to define the terms when having a conversation. Measurement accuracy, by internationally accepted definition, is the closeness of agreement between a measured quantity value and a true quantity value of a measurand (JCGM 200:2012, International vocabulary of metrology: Basic and general concepts and associated terms (VIM). France: BIPM; 2012). The tricky part here is the term 'true quantity' because it assumes ground truth. Precision is the repeatability or reproducibility of the measurement.
In the following discussion, ground truth is defined by a FARO Focus 3D X330, shown in Figure 1, which is also a point cloud system. This lidar system is positioned at fixed locations to scan its environment. As a result, it operates in its calibrated mode at each position it is placed. The high-density point cloud it acquires at one position is then registered and merged with the point cloud acquired at a neighboring position. Registration of the point clouds can specifically take advantage of hand-placed fiducial features in the environment to ensure high accuracy.
A Comparison of Two Lidar-based Systems
To assess accuracy, we measured a natural environment that included man-made structures. We made measurements using two different lidar-based systems and compared the results. The Real Earth Stencil product incorporates a Velodyne VLP-16 lidar system that simultaneously scans 16 rotating lines oriented 1.875° apart while the user manually carries the unit to acquire dense data in 3D. The Stencil is a mobile mapping system that produces 3D maps in real time without post-processing. The Real Earth system needs no GPS, no infrastructure, and no prior maps to build a 3D model.
The second system, a FARO Focus 3D X330, was used as the reference to define ground truth. It acquires data while scanning its environment from a static position using a rotating mechanism. This system is phase-based and, along with similar products from Riegl and Leica, produces a dense point cloud whose accuracy is considered to be of 'survey quality', which is generally defined to be 5 mm or less.

Range noise of the FARO instrument using a low-reflectivity target is 2.2 mm at 25 meters. Measurement accuracy of the scanner used in the Stencil product is 3 cm up to 100 meters. This is roughly an order of magnitude less accurate than the FARO scanner, whose measurements are therefore considered the reference.
Acquiring the Point Clouds
Point cloud data was captured using the FARO device and Real Earth's Stencil system. The FARO system was tripod-mounted at five fixed positions to obtain five independent point clouds. The scans were made by rotating 360° around a vertical axis. The point clouds were registered to each other in a common coordinate system using FARO's proprietary software, with spherical markers placed in the scene for further analysis.
When using the hand-held Stencil system, the operator simply walked around the natural scene while collecting the point cloud. A sweep consists of a set of range data in sixteen planes passing through the origin of the instrument's coordinate system. The complication is that the user is continuously "painting" the scene, both by moving horizontally and by tilting the scanner vertically, to acquire a full scene of 3D data. The algorithms used for this take the motion into account through rapid feature tracking and are assisted by an integrated IMU in this matter. The IMU data is used to unwarp each lidar scan and to provide a notion of relative motion to the scan matching algorithms, and the final accuracy is not based on the accuracy of the IMU itself.
The scanned area is a cabin of approximately 60 square meters (10.5 m x 5.5 m), with a roof and walls. Figure 2 shows the scene generated from the point cloud data. The cabin was on relatively level ground surrounded by trees and brush. A spherical marker was located several meters away from each corner of the cabin. The spherical markers provided a way to compare derived-point to derived-point distances. In addition, since the markers existed in both data sets, they provided a way to align and compare the Real Earth and FARO point clouds. Rapidly and accurately aligning the two point clouds was useful for manually editing both point clouds at the same time to select features of interest.
In our evaluation, we compared the point clouds generated by the hand-held Stencil, which was continuously moving, and a tripod-mounted FARO system, which needed to be relocated to several positions. The scan time to acquire the point clouds was about 10 minutes for the Stencil and 1 hour for the FARO device.
The FARO point cloud is used as a reference. To compare point clouds, we measured distances from the Stencil point cloud to the FARO point cloud. This is analogous to measuring the distance from a point to a surface. Small surface patches are computed for local groups of points within the FARO point cloud, and distances are computed to these patches from individual points in the Stencil point cloud. See Figure 7. Since the location of the Stencil relative to the surface is found, the errors are "signed". Errors outside the surface of the object are positive, while errors inside are negative.
Derived-Point to Derived-Point Measurements
One way to assess accuracy is to compare the distances between derived points such as the centers of spheres or the intersections of three planes. We did this for the spheres present in the scene, and the corners of the cabin at a fixed height. Figure 2 shows the locations of the spheres in the scene.
Figure 2: A view of scans from the Stencil and FARO systems.
We then aligned the point clouds to make the comparison easier. The point clouds from the Stencil and FARO instruments were roughly aligned in CloudCompare using an Iterative Closest Point (ICP) algorithm. ICP minimizes the mean square error. Then, man-made objects in the scene were kept while all of the foliage and the flag on the flagpole were cropped from the scene. We then aligned the remaining point cloud again. The alignment took place without rescaling, since we assume the lidar sensors have been calibrated and range measurements are accurate. Our results and the alignment show this was a good assumption.
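For reference, a minimal point-to-point ICP iteration is sketched below in Python (NumPy and SciPy). It is not the CloudCompare implementation; it only illustrates the rigid, no-rescaling alignment that minimizes the mean square error between nearest-neighbor correspondences.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iterations=50):
    """Minimal point-to-point ICP: estimate the rigid transform (no rescaling)
    that aligns `source` (N, 3) to `target` (M, 3)."""
    src = source.copy()
    tree = cKDTree(target)
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        _, idx = tree.query(src)                 # nearest target point per source point
        matched = target[idx]
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)    # cross-covariance of the pairing
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t                      # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total, src
```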
Figure 3 shows the remaining man-made objects in the scene. We kept several objects including the cabin, the flagpole, bulletin board, sign, flower pots, spheres, and points from the side of a car in the parking lot. The trajectory of the Stencil during scanning is shown as the red trace in the figure. Since only the measured surface coordinate data was provided, we used the closest Stencil data point (nearest neighbor) to each CloudCompare FARO-based "core" point, along with the Stencil position, to determine the sign of the surface normal. The angle between the surface normal and the vector from the surface to the sensor had to be less than 90 degrees; otherwise we flipped the direction of the surface normal.
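That orientation test can be sketched as follows; `sensor_positions` is assumed to hold, for each surface point, the sensor position from which it was observed (here, the Stencil position associated with the nearest Stencil data point).

```python
import numpy as np

def orient_normals(points, normals, sensor_positions):
    """Flip each surface normal so that the angle between the normal and the
    vector from the surface point to its sensor position is less than 90 degrees."""
    to_sensor = sensor_positions - points
    flip = np.einsum('ij,ij->i', normals, to_sensor) < 0.0   # row-wise dot product
    oriented = normals.copy()
    oriented[flip] *= -1.0
    return oriented
```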
Figure 3: Man-made objects in the scene. The trajectory of the Stencil when measurements were taken is shown in red.
The direction of the outward surface normal in Figure 3 is in the general direction of the sensor. As a result, we know the sign of the error when comparing the point clouds. The merged point clouds for these objects were aligned and the resultant mean square distance between the clouds was determined to be less than 1 cm.
After aligning the point clouds, we identified the location of each spherical marker. We fit spheres to the local cloud of data to measure their center and radius. The spheres are lamp globes with a radius of 15.24 cm (6 inches). They are translucent, so there is some uncertainty in their measured centroids and radii. Low-pass filtering of the point cloud also affects these values. A measure of confidence in determining the locations of the sphere centers was the RMS (root mean square) error of the fit. Once the centers of the spheres were located, we computed all of the mutual distances between the spheres.
Figure 4 shows the point cloud data in the vicinity of each sphere for both sets of data. Extraneous points from the ground or tripod, along with instrument noise, were suppressed by iteratively fitting a sphere to the data while removing outliers. That is, we fit a sphere to the data, removed outliers, and fit a sphere to the remaining points. The outliers are the blue points shown in the figures. The red points were included in the fit. A possible alternative is to use a random sample consensus (RANSAC) based algorithm. The result is the location of the center of each sphere, along with its radius. We computed the distances between the spheres using the computed sphere centers.
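The fit-remove-refit loop described above can be sketched as follows; it reuses the algebraic sphere fit from the earlier sketch, and the rejection threshold (2.5 times the RMS residual) and iteration count are illustrative choices, not the values used in this study.

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit (same as the earlier sketch)."""
    A = np.column_stack([2.0 * points, np.ones(len(points))])
    w, *_ = np.linalg.lstsq(A, np.sum(points ** 2, axis=1), rcond=None)
    center = w[:3]
    radius = np.sqrt(w[3] + center @ center)
    rms = np.sqrt(np.mean((np.linalg.norm(points - center, axis=1) - radius) ** 2))
    return center, radius, rms

def fit_sphere_robust(points, n_iter=5, k=2.5):
    """Iteratively fit a sphere, discard points whose radial residual exceeds
    k times the RMS residual, and refit on the remaining inliers."""
    inliers = points
    for _ in range(n_iter):
        center, radius, rms = fit_sphere(inliers)
        residuals = np.abs(np.linalg.norm(inliers - center, axis=1) - radius)
        keep = residuals < k * rms
        if keep.all():
            break
        inliers = inliers[keep]
    return center, radius, rms, inliers
```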
Figure 4: Point cloud data from the spheres (Stencil point cloud sphere data and FARO point cloud sphere data).
Table 1 shows the results. The differences between the Stencil and FARO measurements of the inter-sphere distances are several millimeters.
            Stencil    FARO       Distance Error
            (meters)   (meters)   (millimeters)
Side 1-2    10.323     10.318       5
Side 2-3    15.349     15.351      -2
Side 3-4    14.291     14.277      14
Side 4-1    18.608     18.615      -7

Table 1: Distances between the spheres are shown in meters, and the differences between the Stencil and FARO measurements are shown in millimeters.
The dimensions of the cabin were also derived from the point cloud data. The cabin is a classic structure with a gabled roof and four walls. The four corners of the cabin can be considered derived points. The walls of the cabin are not perfectly flat and they include recessed features such as windows and doors. But these deviations are small relative to the size of the cabin and are measured by both systems. We fit a plane to each of the four walls of the cabin and then estimated the locations of the corners approximately one meter above the ground from both the Stencil and FARO point clouds. Figure 5 shows a rendering of the walls with the Stencil and FARO point clouds overlaid.
Figure 5: A rendering of just the walls of the cabin. We fit planes to the four walls and then computed a horizontal cross section to identify the four corner points of the cabin.
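The corner computation reduces to intersecting three planes: two fitted wall planes and a horizontal plane one meter above the ground. The Python/NumPy sketch below fits planes by PCA and solves the resulting 3x3 system; it is illustrative rather than the exact workflow used here, and the names `wall_a`, `wall_b`, and `ground_z` in the usage comment are hypothetical.

```python
import numpy as np

def fit_plane(points):
    """Fit a plane n.x = d to (N, 3) points by PCA; n is the unit normal."""
    centroid = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - centroid)
    n = Vt[-1]                                   # direction of least variance
    return n, float(n @ centroid)

def intersect_three_planes(planes):
    """Return the point x satisfying n_i . x = d_i for three (n, d) planes."""
    N = np.array([n for n, _ in planes])
    d = np.array([d for _, d in planes])
    return np.linalg.solve(N, d)

# Hypothetical usage: a cabin corner from two fitted wall planes and a
# horizontal plane one meter above the ground elevation ground_z.
# corner = intersect_three_planes(
#     [wall_a, wall_b, (np.array([0.0, 0.0, 1.0]), ground_z + 1.0)])
```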
Figure 6 shows the coordinates of the four corners of the cabin along with the coordinates of the four spheres.
Figure 6: The locations of the four corners of the cabin are shown in the rendering. The wall lengths measured by the FARO are shown in Table 2.
Table 2 shows the estimated wall lengths measured one meter above the ground. The FARO estimates differ by about 0.3% from the Stencil estimates.
          Stencil    FARO       Length Error
          (meters)   (meters)   (cm)
Wall 1    10.414     10.458     -4.4
Wall 2     5.538      5.554     -1.6
Wall 3    10.418     10.463     -4.5
Wall 4     5.553      5.563     -1.0

Table 2: Shown are the estimated differences in the wall lengths of the cabin based on the intersection of two adjacent walls. The third plane needed to define the four corner points is located approximately one meter above the ground.
Comparing Point Clouds
We then computed the average distance between point clouds using the FARO point cloud as the reference. To do this, a small surface patch is fit to local groups of points from the FARO point cloud. The distance from nearby points in the Stencil point cloud to each surface patch is then computed. The distances are signed, since the surface normal points in the general direction of where the sensor was located when the data was acquired. See Figure 7.
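The computation can be sketched as follows in Python (NumPy and SciPy): fit a small planar patch to the FARO neighbors of each Stencil point and take the signed distance along the outward-oriented patch normal. This is a simplification of the M3C2-style comparison performed in CloudCompare; the neighborhood size `k` and the per-point `oriented_normals` array (outward normals for the FARO points) are assumptions of the sketch.

```python
import numpy as np
from scipy.spatial import cKDTree

def signed_cloud_distances(stencil_pts, faro_pts, oriented_normals, k=30):
    """Signed distance from each Stencil point to a local planar patch fit to
    its k nearest FARO points. Positive values lie outside the surface
    (toward the sensor), negative values inside."""
    tree = cKDTree(faro_pts)
    dists = np.empty(len(stencil_pts))
    for i, p in enumerate(stencil_pts):
        _, idx = tree.query(p, k=k)
        patch = faro_pts[idx]
        centroid = patch.mean(axis=0)
        _, _, Vt = np.linalg.svd(patch - centroid)
        n = Vt[-1]                               # patch normal from PCA
        if n @ oriented_normals[idx[0]] < 0:     # keep the outward orientation
            n = -n
        dists[i] = n @ (p - centroid)
    return dists
```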
Figure 7: The graphic shows how distance is measured between the Stencil and FARO point clouds. Source: Cloud
Compare M3C2 documentation.
A noise filter was applied to the point clouds from the scene with man-made structures and the
point clouds were then aligned. The filter suppresses high frequencies associated with the
sensor noise. It also rounds out sharp edges such as corners. Figure 8 shows a histogram of
the errors. The shape of the histogram indicates there is a slight distortion of the Stencil point
cloud relative to the FARO. The point clouds have been aligned to minimize the mean square
error. Ideally, the errors seen in the histogram should be symmetric with an average of zero.
When the point cloud shapes are in agreement the width of the histogram is proportional to the
noise. The magnitude of the average error is approximately 7 mm.
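The statistics quoted with each histogram can be computed as in the sketch below; estimating the FWHM directly from the histogram bins is one reasonable convention, and not necessarily the one used by the plotting tool.

```python
import numpy as np

def error_histogram_stats(errors, bins=200):
    """Mean, standard deviation, and full width at half maximum (FWHM)
    of the signed point cloud errors, with the FWHM read off the histogram."""
    counts, edges = np.histogram(errors, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    half_max = counts.max() / 2.0
    above = centers[counts >= half_max]          # bins at or above half maximum
    fwhm = above.max() - above.min()
    return errors.mean(), errors.std(), fwhm
```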
Figure 8: Histogram of point cloud errors for the man-made objects in the area. The caption shows the average error, the standard deviation, and the full width at half maximum of the histogram.
The point clouds for the cabin alone were then extracted from the data. The histogram of the errors was found to be bimodal, as seen in Figure 9.
Figure 9: Histogram of point cloud errors in the data for the cabin alone. The caption shows the average error, the standard deviation, and the full width at half maximum of the histogram.
To understand the shape of the histogram, we used a feature of CloudCompare to adjust the scale of the data to improve the fit with the FARO point cloud. This scale change is a non-rigid transformation. Allowing the Stencil point cloud to be resized by 0.5% (a factor of 1.005) resulted in the uni-modal histogram shown in Figure 10. This scale change is equivalent to increasing every Stencil range measurement by 0.5%, suggesting a small error in calibration or point cloud alignment. The change of scale needed to improve the fit between the two point clouds is consistent with the differences in the lengths of the walls we computed using the derived corner points of the cabin. This is worth investigating further. Note that the scaling was used only for comparing the histograms. It was not used to improve the error differences between the FARO and Stencil cited earlier.
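Applying such a uniform scale factor is straightforward, as sketched below. Scaling about the cloud centroid is an illustrative choice; the pivot used by CloudCompare's scaled registration may differ. The rescaled Stencil cloud can then be passed back through the signed-distance computation to regenerate the histogram.

```python
import numpy as np

def rescale_cloud(points, scale=1.005, origin=None):
    """Apply a uniform scale factor to a point cloud about a chosen origin
    (default: the cloud centroid). A factor of 1.005 corresponds to the 0.5%
    resize discussed above."""
    points = np.asarray(points, dtype=float)
    if origin is None:
        origin = points.mean(axis=0)
    return origin + scale * (points - origin)
```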
Figure 10: Histogram of point cloud errors in the cabin wall data using FARO as the reference, after alignment of the cabin itself. The caption shows the average error, the standard deviation, and the full width at half maximum of the histogram.
The full width at half max (FWHM) of the histogram is 2.5 cm, or +/- 1.25 cm. This is consistent with the noise specification of the laser scanner used in the Stencil after filtering. If there is an error in the measured shape of the cabin, not just scale, it should show up as a bimodal distribution in this histogram.
Summary
RealEarth’smappingsystemacquiresandprocessespointclouddatainrealtimetoassembleapoint
cloudmapwithouttheuseofaGPS,maps,orinfrastructurewhileitisinmotion.Thesystemwas
comparedagainstaFAROFocus3DX330system,whichacquirespointclouddatawhilestationary.The
resultsshowdifferencesinthepointcloudsgeneratedbythetwosystemsontheorderof1to2cm
overanareaof20squaremeters.
Differences in the point cloud measurements were determined in two ways. The first method involved computing derived points. The scene included spheres and a cabin with corners. Local point clouds associated with the spheres were extracted from the global data, and fitting routines were used to estimate the centers of the spheres. The distances between the sphere centers as measured by the two devices were then compared. Corners of the cabin were determined from the intersection of planar surfaces computed from the cabin point clouds. Corner-to-corner distances were then compared.
The second method used to compare point clouds first aligns the point clouds. The point cloud acquired by the FARO instrument is considered the ground truth, since its precision is within one or two millimeters. We fit a surface patch to small groups of points acquired by the FARO and then compute distances to these patches from the Stencil point cloud. Ideally, the error histogram should be symmetric and the average distance close to zero. The width of the histogram, as measured by its full width at half maximum, is a measure of the random noise of the point cloud. Any break in the symmetry of the histogram indicates a problem merging the point clouds that are used in creating the global point cloud.
In the example, point clouds were acquired in a natural environment with grass, trees, and a flag moving in the wind, in addition to man-made objects. The results show that a mobile mapping solution can provide accurate information for architectural and engineering use. In summary:
• We collected point cloud data of a natural environment that included man-made structures. We used a FARO Focus 3D X330 from fixed locations and Real Earth's Stencil system, which acquires point cloud data while moving.
• Point clouds acquired by the two systems were aligned using the open source program CloudCompare for comparison. Man-made objects including lamp globes, a cabin, and a vehicle were extracted from the scene to compare their point clouds.
• We processed the point clouds to compare distances between the spheres and the dimensions of the cabin.
• Analyses of point cloud errors indicate a small amount of distortion in the global point cloud. After local re-alignment, error histograms are uni-modal and symmetric around zero error. Differences were within 0.2-1.4 cm in comparison with a stationary lidar for derived points at sphere centers. Differences were within 1-4.5 cm for distances between derived points using wall intersections.
The overall accuracy of the resultant data was excellent, and the mobile system can provide information that can be used to model structures in CAD and to provide direct measurements of buildings and structures.
References
[1] Velodyne VLP-16 Scanning Laser Range Finder Specification. www.velodynelidar.com.
[2] "Single-plane versus Three-plane Methods for Relative Range Error Evaluation of Medium-range 3D Imaging Systems", David K. MacKinnon, National Research Council Canada, SPIE Proceedings, June 2015.
[3] "Towards the Development of a Documentary Standard for Derived-Point to Derived-Point Distance Performance Evaluation of Spherical Coordinate 3D Imaging Systems", Bala Muralikrishnan et al., NIST. Procedia Manufacturing, V. XXX, 2015, pp. 1-12.
[4] "SLAM for Dummies", Søren Riisgaard and Morten Rufus Blas. MIT ocw.mit.edu 16-412 online course.
[5] "LOAM: Lidar Odometry and Mapping in Real-time", Zhang and Singh. Robotics: Science and Systems Conference, July 2014.
[6] CloudCompare v2.6.1 - User manual. http://www.danielgm.net/cc/documentation.html
[7] "Accurate 3D comparison of complex topography with terrestrial laser scanner: application to the Rangitikei canyon (N-Z)", Dimitri Lague, Nicolas Brodu, Jérôme Leroux.
[8] FARO Technology White Paper, Large-Volume 3D Laser Scanning Technology.
[9] "Accuracy assessment of the FARO Focus3D and Leica HDS6100 panoramic type terrestrial laser scanners through point-based and plane-based user self-calibration", Chow, Lichti, & Teskey, Canada. Laser Scanners II, 5865. Rome, May 6-10, 2012.
[10] "Assessing the performance of 3D-imaging systems", David K. MacKinnon. SPIE Newsroom. DOI: 10.1117/2.1201102.003393
Copyright Kaarta 2016