📡 Deep Dive

LiDAR Scanner
Technical Guide

Every aspect of the LiDAR Scanner Xcode project, from how laser depth sensing physically works to the Swift code that captures meshes and exports 3D files. iPad Pro M4 · ARKit · RealityKit · ModelIO · SwiftUI.

📋 What This Guide Covers

This guide walks through the complete LiDAR scanning pipeline: from the physics of time-of-flight sensing, to ARKit's scene reconstruction pipeline, to the Swift code that converts raw GPU mesh buffers into exportable 3D files in USDZ, OBJ, and PLY formats.

Read front to back for a full understanding, or jump directly to any chapter. The code walkthrough in Chapters 5–7 is self-contained if you already understand the theory.

Language: Swift 5.9
UI Framework: SwiftUI
3D Framework: RealityKit
AR Framework: ARKit Scene Reconstruction
3D Export: ModelIO
Target Device: iPad Pro (M1+), iOS 17.0+

Chapter 01 What is LiDAR?

1.1 The Physics of Laser Depth Sensing

LiDAR stands for Light Detection and Ranging. The sensor fires invisible infrared laser pulses at a scene and measures the exact time each pulse takes to bounce back. Because light travels at a known speed (~300,000 km/s), the sensor calculates the distance to every point it hits.

⚡ Time-of-Flight (ToF)

Distance = (Speed of Light × Round-trip Time) ÷ 2

The iPad Pro uses direct ToF: each laser pulse gets its own timer. This gives accurate depth even in motion.
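To make the formula concrete, here is a tiny Swift sketch. The numbers are illustrative, not actual sensor specs:

```swift
import Foundation

/// Speed of light in a vacuum, metres per second.
let speedOfLight = 299_792_458.0

/// Distance from a direct time-of-flight measurement.
/// The pulse travels out to the surface and back, hence the divide-by-two.
func distance(roundTripSeconds: Double) -> Double {
    speedOfLight * roundTripSeconds / 2
}

// A surface ~1.5 m away returns the pulse in about 10 nanoseconds:
let d = distance(roundTripSeconds: 10.0e-9)
print(String(format: "%.3f m", d))  // ≈ 1.499 m
```

The nanosecond scale of those round trips is why direct ToF needs dedicated timing hardware per pulse.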

๐Ÿ“ Point Cloud Output

Each pulse → one 3D point (X, Y, Z). Thousands of pulses per second → a dense point cloud. ARKit fuses these into a continuous mesh surface automatically.

1.2 iPad Pro LiDAR Specs

| Property | Detail |
|---|---|
| Range | Up to ~5 metres indoors; best under 2 m for objects |
| Pulse rate | Thousands of depth points per second |
| Latency | Low; ARKit can reconstruct in real time |
| Works in the dark? | Yes; the laser is infrared and needs no visible light |
| Works outdoors? | Yes, but IR interference from sunlight reduces accuracy |
| Vs camera depth | Far more accurate than stereo camera depth estimation |
💡 Key insight: The LiDAR sensor gives ARKit raw depth data. ARKit then uses the RGB camera alongside this to build a textured understanding of the scene. Our app rides on top of this.

Chapter 02 How ARKit Uses the Sensor

2.1 Scene Reconstruction Pipeline

ARKit runs a multi-stage pipeline to turn raw LiDAR depth data into the clean 3D mesh your app receives:

① Depth Frame Capture: LiDAR fires pulses. Each frame produces a depth map (every pixel has a distance value).
② RGB + Depth Fusion: The RGB camera image is aligned with the depth map using intrinsic/extrinsic calibration baked into the device.
③ TSDF Integration: ARKit uses a Truncated Signed Distance Function, a 3D volumetric grid. Each new depth frame "votes" to update the grid.
④ Mesh Extraction: A marching-cubes-style algorithm extracts a triangle mesh from the TSDF volume and creates ARMeshAnchor objects.
⑤ Classification: A machine-learning model classifies each triangle as floor, wall, ceiling, table, seat, window, door, or none.
⑥ Delegate Callbacks: Your ARSessionDelegate receives didAdd / didUpdate / didRemove anchors; that's where ScanManager.swift plugs in.
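The TSDF "voting" in step ③ can be illustrated with a minimal single-voxel sketch. A real TSDF is a GPU-resident 3D grid; the type and field names here are invented for illustration:

```swift
/// One cell of a hypothetical TSDF grid: a running weighted average
/// of truncated signed distances to the nearest observed surface.
struct Voxel {
    var sdf: Float = 0      // signed distance estimate, metres
    var weight: Float = 0   // how many observations have voted so far

    /// Fold in one new depth observation for this cell.
    mutating func integrate(observedSDF: Float, truncation: Float = 0.05) {
        // Clamp to the truncation band so far-away surfaces don't dominate.
        let clamped = max(-truncation, min(truncation, observedSDF))
        sdf = (sdf * weight + clamped) / (weight + 1)
        weight += 1
    }
}

var v = Voxel()
v.integrate(observedSDF: 0.04)  // first noisy observation
v.integrate(observedSDF: 0.02)  // second vote pulls the average down
// v.sdf is now 0.03: the estimate converges as more frames vote
```

Mesh extraction (step ④) then finds the surface where the averaged sdf values cross zero.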

2.2 ARMeshAnchor โ€” The Data Container

Each ARMeshAnchor represents a region of the reconstructed mesh. It carries a transform (the region's position in world space) and a geometry property (ARMeshGeometry) holding the vertex, normal, face-index, and per-face classification buffers.

⚠️ Coordinate system: ARKit uses a right-handed coordinate system with Y up. Blender uses Z up (its front view looks along +Y), so an OBJ exported from ARKit may need a rotation on import.
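If you prefer to convert the axes in code rather than rotating in Blender, the swap is a +90° rotation about X. A sketch (verify against your importer's axis settings, since OBJ importers often let you pick the up/forward axes themselves):

```swift
/// ARKit: X right, Y up, -Z forward. Blender: X right, Z up, +Y into the scene.
/// The conversion is a +90° rotation about X, i.e. (x, y, z) -> (x, -z, y).
func arkitToBlender(_ p: (x: Float, y: Float, z: Float)) -> (x: Float, y: Float, z: Float) {
    (p.x, -p.z, p.y)
}

// A point 1 m in front of the ARKit camera (z = -1) lands 1 m into
// Blender's scene along +Y, and an ARKit "up" point becomes Blender's +Z:
let forward = arkitToBlender((x: 0, y: 0, z: -1))  // (0, 1, 0)
let up = arkitToBlender((x: 0, y: 1, z: 0))        // (0, 0, 1)
```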

Chapter 03 Project Setup (Xcode + Sideload)

3.1 What You Need

| Requirement | Detail |
|---|---|
| Mac | macOS 14 Sonoma or later recommended |
| Xcode | Version 15 or later (free from the Mac App Store) |
| Apple ID | A free account works (7-day sideload certificate); a paid $99/yr account removes the expiry |
| iPad Pro | Any iPad Pro with LiDAR (2020 onward); the M4 is ideal |
| USB cable | USB-C to USB-C, or USB-C with an adapter for your Mac's port |
| iOS | 17.0 minimum on the iPad |

3.2 Step-by-Step: Create the Xcode Project

🔑 Signing is required: Without signing, Xcode cannot install to your physical iPad. Use any Apple ID; it is free. Go to Signing & Capabilities → set Team → change the Bundle Identifier to something unique, e.g. com.YOURNAME.lidarscanner.

🔄 Free sideload expiry: Free Apple ID certificates last 7 days. After that, just plug in the iPad and press ⌘R again in Xcode. The app data is NOT deleted.

Chapter 04 App Architecture & File Map

4.1 File Structure

| File | Responsibility |
|---|---|
| LiDARScannerApp.swift | Entry point: the @main struct that launches the SwiftUI window. |
| ContentView.swift | All visible UI: buttons, stats, share-sheet trigger. Reads from ScanManager. |
| ARViewContainer.swift | Thin bridge between SwiftUI and UIKit/RealityKit. Wraps ARView. |
| ScanManager.swift | The brain: runs the AR session, collects mesh anchors, exports 3D files. |
| ShareSheet.swift | Wraps UIActivityViewController so SwiftUI can present the iOS share sheet. |
| Info.plist | Camera permission text and the LiDAR device-capability requirement. |
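For reference, the two Info.plist entries mentioned above might look like this (the usage string is your own wording; arkit is a documented UIRequiredDeviceCapabilities value, though it does not by itself guarantee LiDAR, which is why ScanManager also checks supportsSceneReconstruction at runtime):

```xml
<key>NSCameraUsageDescription</key>
<string>The camera is used to scan your surroundings with LiDAR.</string>
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>arkit</string>
</array>
```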

4.2 Data Flow

📡 LiDAR Sensor: hardware depth pulses
🍎 ARKit (OS level): ARWorldTrackingConfiguration + sceneReconstruction
🔷 ARSessionDelegate callbacks: didAdd / didUpdate / didRemove ARMeshAnchor
🧠 ScanManager: stores anchors in a [UUID: ARMeshAnchor] dictionary, recalculates stats
📦 Export (ModelIO): converts ARMeshGeometry → MDLMesh → file on disk
📤 ShareSheet: UIActivityViewController → AirDrop / Files / Cloud

Chapter 05 Full Code Walkthrough

5.1 App Entry Point โ€” LiDARScannerApp.swift

The simplest file. Swift's @main attribute marks this as the application entry point. SwiftUI's App protocol launches a WindowGroup containing ContentView.

import SwiftUI

@main
struct LiDARScannerApp: App {
  var body: some Scene {
    WindowGroup {
      ContentView() // Root view
    }
  }
}

5.2 ARView Bridge โ€” ARViewContainer.swift

RealityKit's ARView is a UIKit component (a UIView subclass). SwiftUI cannot host UIKit views directly; you need a UIViewRepresentable wrapper. This struct is that bridge.

struct ARViewContainer: UIViewRepresentable {
  @ObservedObject var scanManager: ScanManager

  // Called ONCE: create the ARView
  func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero)
    scanManager.setup(arView: arView) // hand off to ScanManager
    return arView
  }

  // Called on SwiftUI re-renders: nothing to update
  func updateUIView(_ uiView: ARView, context: Context) {}
}

💡 Why @ObservedObject here? ARViewContainer observes ScanManager so SwiftUI knows the two are linked, but we never actually read its published values here. The real work happens inside ScanManager.

Chapter 06 Mesh Capture Logic (ScanManager)

6.1 Class Declaration & State

ScanManager subclasses NSObject (required by ARSessionDelegate) and conforms to ObservableObject (so SwiftUI can react to changes) and ARSessionDelegate (to receive mesh updates from ARKit).

@MainActor
class ScanManager: NSObject, ObservableObject, ARSessionDelegate {
  // Published = SwiftUI watches these automatically
  @Published var meshAnchorCount: Int = 0
  @Published var totalVertexCount: Int = 0
  @Published var totalFaceCount: Int = 0
  @Published var statusMessage: String = "Move iPad to start scanning"
  @Published var showMesh: Bool = true

  // Internal storage: one entry per mesh region
  private var meshAnchors: [UUID: ARMeshAnchor] = [:]
  private weak var arView: ARView?
}

🧩 @MainActor: This annotation ensures all of ScanManager's code runs on the main actor (thread). Required because ARKit delegate callbacks arrive on background threads, but UI updates must happen on main.

6.2 Starting the AR Session

func setup(arView: ARView) {
  self.arView = arView
  arView.session.delegate = self

  guard ARWorldTrackingConfiguration
    .supportsSceneReconstruction(.meshWithClassification) else {
    statusMessage = "LiDAR not available"
    return
  }
  startSession()
}

private func startSession() {
  let config = ARWorldTrackingConfiguration()
  config.sceneReconstruction = .meshWithClassification
  config.environmentTexturing = .automatic
  config.planeDetection = [.horizontal, .vertical]
  arView?.session.run(config, options: [.resetTracking, .removeExistingAnchors])
  arView?.debugOptions = showMesh ? [.showSceneUnderstanding] : []
}

6.3 Receiving Mesh Delegate Callbacks

nonisolated func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
  let meshes = anchors.compactMap { $0 as? ARMeshAnchor }
  guard !meshes.isEmpty else { return }
  Task { @MainActor in self.addMeshAnchors(meshes) }
}

nonisolated func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
  let meshes = anchors.compactMap { $0 as? ARMeshAnchor }
  guard !meshes.isEmpty else { return }
  Task { @MainActor in self.updateMeshAnchors(meshes) }
}

nonisolated func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {
  let ids = anchors.compactMap { $0 as? ARMeshAnchor }.map { $0.identifier }
  guard !ids.isEmpty else { return }
  Task { @MainActor in self.removeMeshAnchors(ids) }
}
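The @MainActor helpers these callbacks hop to are straightforward dictionary bookkeeping. This sketch models the same logic with a plain stand-in type so it is visible without ARKit; MeshRegion and MeshStore are illustrative names, not the project's actual types:

```swift
import Foundation

/// Stand-in for ARMeshAnchor: an identifier plus its geometry counts.
struct MeshRegion {
    let id: UUID
    var vertexCount: Int
    var faceCount: Int
}

final class MeshStore {
    private var regions: [UUID: MeshRegion] = [:]
    private(set) var totalVertexCount = 0
    private(set) var totalFaceCount = 0

    /// Covers both didAdd and didUpdate: the dictionary key makes
    /// an update simply overwrite the previous version of the region.
    func upsert(_ region: MeshRegion) {
        regions[region.id] = region
        recalculate()
    }

    /// Covers didRemove.
    func remove(_ id: UUID) {
        regions[id] = nil
        recalculate()
    }

    private func recalculate() {
        totalVertexCount = regions.values.reduce(0) { $0 + $1.vertexCount }
        totalFaceCount = regions.values.reduce(0) { $0 + $1.faceCount }
    }
}

let store = MeshStore()
let a = UUID(), b = UUID()
store.upsert(MeshRegion(id: a, vertexCount: 100, faceCount: 60))
store.upsert(MeshRegion(id: b, vertexCount: 50, faceCount: 30))
store.upsert(MeshRegion(id: a, vertexCount: 120, faceCount: 70))  // update replaces
store.remove(b)
```

In the real ScanManager the recalculated totals land in the @Published properties, which is what makes the stat badges refresh.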

Chapter 07 Export Pipeline (USDZ / OBJ / PLY)

7.1 Overview: ARKit Geometry โ†’ ModelIO โ†’ File

| Type | Role |
|---|---|
| ARMeshGeometry | Raw GPU buffers: vertices and face indices stored as Metal MTLBuffers |
| MDLMesh | ModelIO mesh object; understands vertex descriptors, materials, transforms |
| MDLAsset | Container for one or more MDLMesh objects; has the export(to:) method |
| MTKMeshBufferAllocator | Bridges Metal GPU buffers and ModelIO mesh buffers |
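The conversion in the table above can be sketched roughly as follows. Treat this as a hedged sketch rather than the project's verbatim code: it assumes tightly packed float3 vertices and 32-bit face indices (check geometry.vertices.stride and geometry.faces.bytesPerIndex), ignores the per-anchor world transform that matters when combining several anchors, and elides error handling.

```swift
import ARKit
import MetalKit
import ModelIO

/// Convert one ARMeshAnchor's GPU geometry into a ModelIO mesh.
func mdlMesh(from anchor: ARMeshAnchor, device: MTLDevice) -> MDLMesh {
    let geometry = anchor.geometry
    let allocator = MTKMeshBufferAllocator(device: device)

    // Copy the vertex buffer out of GPU memory for ModelIO.
    let vertexData = Data(
        bytes: geometry.vertices.buffer.contents().advanced(by: geometry.vertices.offset),
        count: geometry.vertices.stride * geometry.vertices.count)
    let vertexBuffer = allocator.newBuffer(with: vertexData, type: .vertex)

    // Same for the face indices: three indices per triangle.
    let indexCount = geometry.faces.count * geometry.faces.indexCountPerPrimitive
    let indexData = Data(
        bytes: geometry.faces.buffer.contents(),
        count: indexCount * geometry.faces.bytesPerIndex)
    let indexBuffer = allocator.newBuffer(with: indexData, type: .index)

    let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                             indexCount: indexCount,
                             indexType: .uInt32,       // assumes bytesPerIndex == 4
                             geometryType: .triangles,
                             material: nil)

    // Describe the layout: one float3 position attribute per vertex.
    let descriptor = MDLVertexDescriptor()
    descriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                  format: .float3,
                                                  offset: 0,
                                                  bufferIndex: 0)
    descriptor.layouts[0] = MDLVertexBufferLayout(stride: geometry.vertices.stride)

    return MDLMesh(vertexBuffer: vertexBuffer,
                   vertexCount: geometry.vertices.count,
                   descriptor: descriptor,
                   submeshes: [submesh])
}

// Collect every anchor's mesh into an asset, then write it out;
// the file extension of `url` selects the format (.obj / .usdz / .ply):
//   let asset = MDLAsset(bufferAllocator: MTKMeshBufferAllocator(device: device))
//   meshes.forEach { asset.add($0) }
//   try asset.export(to: url)
```

Because MDLAsset.export(to:) keys off the file extension, the three export formats in 7.2 share this one code path.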

7.2 Export Formats Compared

| Format | Description |
|---|---|
| USDZ | Apple's preferred format: a self-contained ZIP of USD plus textures. Opens natively in iOS Files, Quick Look, and AR Quick Look. Best for sharing on Apple devices. |
| OBJ | Universal 3D format. Opens in Blender, Maya, Cinema 4D, 3ds Max. Geometry only in our export (no texture). Good for further editing. |
| PLY | Polygon file format, widely used in point-cloud research. Opens in MeshLab, CloudCompare, Open3D. Good for scientific/analysis workflows. |
📱 USDZ tip: AirDrop a USDZ file to another iPhone or iPad and it opens in AR Quick Look immediately; the scanned object appears life-size in the room. No app needed.

Chapter 08 UI Layer (SwiftUI)

8.1 Layout Structure

The UI is a ZStack, a layered stack: ARViewContainer fills the background (full-screen camera feed plus mesh), and all controls float on top as overlays.

ZStack {
  ARViewContainer(scanManager: scanManager)
    .ignoresSafeArea()
  VStack {
    // Top bar: title, status, mesh toggle button
    HStack { ... }.background(gradient: black → clear)
    Spacer()
    // Middle: vertex/face stats badges
    HStack { StatBadge(...) ... }
    // Bottom: Reset + Export buttons
    HStack { ... }.background(gradient: clear → black)
  }
}

8.2 Reactive State

ContentView creates ScanManager as a @StateObject. Any change to its @Published properties automatically re-renders the relevant parts of the UI.

@StateObject private var scanManager = ScanManager()

// These auto-update the StatBadges:
scanManager.meshAnchorCount       // → "Anchors: 12"
scanManager.vertexCountFormatted  // → "Vertices: 48.2K"
scanManager.faceCountFormatted    // → "Faces: 31.7K"
scanManager.statusMessage         // → "Scanning... 12 mesh regions"
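vertexCountFormatted and faceCountFormatted are not shown elsewhere in this guide; a plausible implementation is a small count abbreviator like this one (the function name and thresholds are illustrative):

```swift
import Foundation

/// Abbreviate a count for a stats badge: 950 -> "950", 48_200 -> "48.2K".
func abbreviated(_ count: Int) -> String {
    switch count {
    case ..<1_000:
        return "\(count)"
    case ..<1_000_000:
        return String(format: "%.1fK", Double(count) / 1_000)
    default:
        return String(format: "%.1fM", Double(count) / 1_000_000)
    }
}

// In ScanManager these would back computed properties, e.g.:
// var vertexCountFormatted: String { abbreviated(totalVertexCount) }
```

Because the computed properties read @Published counters, SwiftUI re-evaluates them automatically whenever the counts change.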

Chapter 09 Known Limitations

⚠️ This is real; know it before you start: ARKit's scene reconstruction is tuned for room-scale spatial computing, not object scanning. The mesh will look great for furniture and walls, and blocky for small objects.
| Issue | Effect |
|---|---|
| Small objects (<20 cm) | Very low detail; figurines and mugs look blocky. Use Polycam instead. |
| Shiny/reflective surfaces | LiDAR IR pulses scatter, leaving gaps and holes in the mesh |
| Glass/transparent objects | Near-invisible to LiDAR; large holes guaranteed |
| Fine detail (hair, fabric) | Smoothed out by the marching-cubes mesh extraction |
| Texture / color | The raw export has no texture; you must project the camera image onto the mesh separately |
| Underside of objects | Anything the camera cannot see becomes missing geometry |

Chapter 10 Next Steps & Upgrades

10.1 Texture Baking

The biggest missing feature: color on the exported mesh. Capture ARFrame.capturedImage alongside the depth, project camera pixels onto mesh triangles using the camera's intrinsic matrix, and bake a texture atlas into the OBJ export as an MTL material file. Apple's Object Capture API (on macOS) can do this automatically from a video scan.
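The pinhole projection at the heart of that step is compact. This sketch maps a camera-space point to pixel coordinates given the intrinsic values (fx, fy, cx, cy, as found in ARCamera.intrinsics); ARKit's exact camera-space sign conventions are glossed over here, so treat it as the idea rather than drop-in code:

```swift
/// Pinhole camera intrinsics: focal lengths and principal point, in pixels.
struct Intrinsics {
    var fx: Double, fy: Double   // focal lengths
    var cx: Double, cy: Double   // principal point
}

/// Project a camera-space point (z = depth along the optical axis, metres)
/// onto the image plane, returning pixel coordinates.
func project(x: Double, y: Double, z: Double, k: Intrinsics) -> (u: Double, v: Double) {
    (u: k.fx * x / z + k.cx,
     v: k.fy * y / z + k.cy)
}

// A point 10 cm right and 5 cm up of the optical axis at 1 m depth,
// with a 1000 px focal length and a 1920x1080 image centred at (960, 540):
let k = Intrinsics(fx: 1000, fy: 1000, cx: 960, cy: 540)
let px = project(x: 0.1, y: 0.05, z: 1.0, k: k)
// px == (u: 1060.0, v: 590.0)
```

Running this per mesh vertex tells you which captured-image pixel to sample when baking a texture atlas.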

10.2 Mesh Simplification

Raw ARKit meshes can be very dense. Adding a decimation step would shrink file sizes massively. ModelIO does not provide mesh simplification out of the box, so use a third-party library such as meshoptimizer via Swift Package Manager.

10.3 Object Isolation

Use ARKit's mesh classification to filter and keep only specific surface types during export:

// Filter to only table-top geometry during export.
// Note: classificationOf(face:) is a small helper built on top of the
// geometry.classification buffer (see Apple's scene-semantics sample code),
// not a built-in ARMeshGeometry method.
let classification = anchor.geometry.classificationOf(face: faceIndex)
if classification == .table || classification == .none {
  // include this face
}

10.4 LiDAR → AI Texture Upgrade

01
Scan with LiDAR
Use this app to export OBJ mesh
02
Import to Python on Mac
Load OBJ into processing script
03
NeRF / Gaussian Splatting
Use Instant-NGP or nerfstudio to refine geometry from video frames
04
AI Texture Upscale
Use Real-ESRGAN to upscale textures
05
Result
Photorealistic 3D object: fully local, no cloud needed
🚀 The bigger picture: Your iPad M4 is the LiDAR scanner. Your Mac handles the AI refinement step. Together they form a fully local 3D scanning + AI pipeline; no cloud needed.