CASE STUDY: MR/AR PROJECT FOR GRATXRAY

GratXray needed a reliable way to demonstrate its next-generation breast CT scanner without transporting hardware to hospitals and medical conferences. The goal was simple but critical: let doctors see the device at full size, understand its motion paths, and evaluate clinical workflow — without guesswork, lag, or visual gimmicks. We built a Mixed Reality solution for Meta Quest 3 and a synchronized companion tablet app so clinicians can walk around the system, observe real-world occlusion, and watch CT and tomosynthesis animations triggered in real time by a presenter. The system had to work offline, run at 90–120 FPS, hold pose accuracy across multiple headsets, and operate in kiosk-safe environments. Result: a stable, hospital-ready MR platform that communicates GratXray’s technology clearly and reduces the travel and setup effort of shipping physical devices.

Case Study — MR & AR App for GratXray (CONFIDENTIAL)
Breast CT visualized in MR on Meta Quest 3

Executive Summary

We built a Mixed Reality application for Meta Quest 3 with AR support that allows GratXray to accurately place and demonstrate a floor‑anchored breast CT system at hospitals and congresses. Observers can freely walk around the virtual device in passthrough MR without interacting, while a presenter uses a computer to trigger predefined animations and stream a headset’s view. The solution supports multi‑headset co‑location, kiosk mode, offline operation, and sustained performance at 90–120 FPS—without cables.

Meta Quest 3 · Passthrough MR · Shared Anchors · PC Trigger & Casting · Kiosk Mode · Tablet Companion

Client

GratXray AG — Dr. Simon Spindler
Rütistrasse 14, CH‑8952 Schlieren, Switzerland

Keep contact details within confidential circulation.

90–120 FPS target
60+ min stable demo
1–5 goggles synced

Objectives

  • Accurate floor‑constrained placement of the breast CT anywhere in a room.
  • High‑fidelity passthrough visualization; observers can walk around freely.
  • External computer triggers predefined animations; presenter can view headset cast.
  • Multi‑user co‑location so all viewers see one device in one place with synchronized motion.
  • Hands‑off observer experience; kiosk mode prevents accidental exits.

Key Constraints

  • No cables during use; app continues when headsets are donned/doffed.
  • Stable 90–120 FPS; ≥ 60 minutes continuous runtime.
  • Model budget ≤ 200,000 triangles for the CT asset.
  • No guardian boundary; safe free‑movement visualization.
  • Operable via local router without internet access.

Key Technical Challenges & Our Solutions

Critical engineering problems solved for medical‑grade demonstration reliability.

Meta Quest Sync — No Internet

Challenge: Keep 1–5 Quest headsets perfectly synced without cloud or WAN.

  • Local router only — no external servers
  • Shared anchor resolution + deterministic state machine
  • UDP multicast + keyframe confirmations
  • Auto resync on headset wake / join

Solution: Local timebase + anchor exchange + timeline sync.
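The sync scheme above can be sketched as follows (a minimal Python illustration, not the shipped headset code; the packet layout, multicast address, and class names are assumptions). The presenter multicasts numbered timeline keyframes stamped with its own clock; each headset derives a clock offset from the shared timebase, extrapolates the animation timeline between keyframes, and rejects stale or duplicate datagrams — which is also how a late-joining or woken headset resyncs from the first keyframe it receives.

```python
import struct

# Hypothetical local-router multicast endpoint (no WAN, no cloud).
MULTICAST_GROUP = ("239.0.0.42", 9050)

# Keyframe packet: sequence number, animation id, timeline position (s),
# presenter clock (s). Network byte order.
FMT = "!IHdd"

def encode_keyframe(seq, anim_id, t_pos, clock):
    return struct.pack(FMT, seq, anim_id, t_pos, clock)

def decode_keyframe(data):
    seq, anim_id, t_pos, clock = struct.unpack(FMT, data)
    return {"seq": seq, "anim": anim_id, "t": t_pos, "clock": clock}

class HeadsetTimeline:
    """Deterministic per-headset state: last confirmed keyframe plus the
    presenter-vs-local clock offset. Applying any newer keyframe fully
    resynchronizes the timeline (auto resync on wake / late join)."""
    def __init__(self):
        self.last_seq = -1
        self.anim = None
        self.t0 = 0.0            # timeline position at the keyframe
        self.presenter_t0 = 0.0  # presenter clock at the keyframe
        self.offset = 0.0        # presenter clock minus local clock

    def apply(self, pkt, local_clock):
        if pkt["seq"] <= self.last_seq:
            return False         # stale or duplicate multicast datagram
        self.last_seq = pkt["seq"]
        self.anim = pkt["anim"]
        self.t0 = pkt["t"]
        self.presenter_t0 = pkt["clock"]
        self.offset = pkt["clock"] - local_clock
        return True

    def position(self, local_clock):
        # Extrapolate the timeline between keyframes on the shared timebase.
        return self.t0 + (local_clock + self.offset) - self.presenter_t0
```

In practice the keyframe confirmations travel back to the presenter over the same local network, but the deterministic part — every headset computing the same timeline position from the same keyframe — is what keeps 1–5 devices in lockstep.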

Human Occlusion & Depth Accuracy

Challenge: Virtual CT must hide behind real people + objects in MR.

  • Quest depth API + confidence masks
  • Stencil + depth pre‑pass for clean edges
  • Dynamic fallback to silhouette occlusion

Solution: Real‑time depth‑aware compositing pipeline.
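The compositing idea can be illustrated with a per-pixel sketch (assumed NumPy toy model, not the actual GPU stencil/depth pre-pass): the virtual CT pixel is shown only where it is closer to the camera than the real scene, and where the sensor's depth confidence is low the pipeline falls back to showing the virtual pixel (the real system would apply silhouette occlusion there instead).

```python
import numpy as np

def composite(virtual_rgb, virtual_depth, passthrough_depth, confidence,
              conf_threshold=0.6):
    """Depth-aware occlusion mask.

    virtual_rgb       -- (H, W, 3) rendered CT model
    virtual_depth     -- (H, W) depth of the virtual model
    passthrough_depth -- (H, W) sensor depth of the real scene
    confidence        -- (H, W) sensor confidence in [0, 1]
    """
    closer = virtual_depth < passthrough_depth   # virtual in front of real
    trusted = confidence >= conf_threshold       # where sensor depth is usable
    show_virtual = closer | ~trusted             # low confidence: fallback path
    mask = show_virtual[..., None].astype(virtual_rgb.dtype)
    # Occluded pixels go to 0, so passthrough video shows through there.
    return virtual_rgb * mask
```

The confidence fallback is the key trick: a hard depth test on noisy sensor data makes edges flicker, so untrusted pixels are routed to a cheaper, stable occlusion strategy.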

AR ↔ MR Shared Pose System

Challenge: Tablet AR and Quest MR must show the device in the same physical spot.

  • Shared anchor framework
  • Pose gateway + smoothing filter
  • Universal animation events + timestamps

Solution: Common world anchor sync + cross‑device pose fusion.
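The smoothing filter mentioned above can be sketched like this (an assumed one-pole filter; class and function names are illustrative, not from the shipped code). Anchor re-resolutions arrive as slightly different pose estimates, so raw updates would make the CT model visibly jump; blending each update into the current pose keeps it visually pinned to one spot across tablet AR and Quest MR.

```python
import math

def nlerp(q0, q1, t):
    """Normalized linear interpolation between quaternions (w, x, y, z);
    adequate for the small per-frame corrections a pose filter sees."""
    if sum(a * b for a, b in zip(q0, q1)) < 0:   # take the shorter arc
        q1 = tuple(-c for c in q1)
    q = tuple((1 - t) * a + t * b for a, b in zip(q0, q1))
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

class PoseFilter:
    """One-pole smoothing of anchor-pose updates from the pose gateway."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # blend weight of each incoming update
        self.pos = None
        self.rot = None

    def update(self, pos, rot):
        if self.pos is None:             # first update snaps directly
            self.pos, self.rot = pos, rot
        else:
            a = self.alpha
            self.pos = tuple((1 - a) * p + a * q
                             for p, q in zip(self.pos, pos))
            self.rot = nlerp(self.rot, rot, a)
        return self.pos, self.rot
```

Lower `alpha` means steadier placement but slower convergence after a genuine anchor correction; tuning it is a trade-off between visual stability and alignment accuracy.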

Performance at Medical Demo Quality

Challenge: Sustain 90–120 FPS in headset while casting and syncing.

  • URP mobile tuning + GPU instancing
  • 200k tri limit, LODs, baked materials
  • Network ticks decoupled from render
  • Adaptive resolution + v‑sync policies

Solution: Render budget discipline + asynchronous network layer.
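Decoupling network ticks from the render loop — one of the bullets above — can be sketched with a fixed-timestep accumulator (an assumed minimal pattern, not the production implementation). The render loop runs at whatever rate the headset sustains (90 or 120 Hz), while network state is stepped at a fixed, much lower rate, so sync traffic never scales with frame rate and never blocks a frame.

```python
class NetworkTicker:
    """Fixed-rate network tick driven by a variable-rate render loop."""
    def __init__(self, tick_hz=20):
        self.dt = 1.0 / tick_hz   # fixed network timestep
        self.acc = 0.0            # unconsumed frame time
        self.ticks = 0

    def on_frame(self, frame_dt, send_fn):
        """Called once per rendered frame with that frame's delta time.
        Fires zero or more fixed-size network ticks, catching up if a
        long frame spanned several tick intervals."""
        self.acc += frame_dt
        while self.acc >= self.dt:
            self.acc -= self.dt
            self.ticks += 1
            send_fn(self.ticks)   # e.g. broadcast an animation keyframe
```

The same accumulator pattern also keeps the tick count identical across headsets regardless of their individual frame rates, which matters for the deterministic sync state machine.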

Impact & Results

Clinical clarity, exhibition reliability, and future‑proof asset pipeline.

≤ 2 cm anchor alignment accuracy
90–120 stable FPS in headset
< 1 s sync recovery for late join

Clinical Impact

  • Doctors see true scale + motion paths
  • Hands‑free guided viewing for clinicians
  • Occlusion ensures realism + trust

Operational Impact

  • Offline‑safe for conferences + hospitals
  • Kiosk mode stops accidental exits
  • Future CT revisions plug‑and‑play

LET’S WORK TOGETHER
