XR Developer (Unity / OpenXR) - Aethernova - 18455
Contract: Permanent, Salary: £30.00 per hour, Closing Date: Friday 24 October
Employer:
Aethernova is an Edinburgh-based startup in Digital Health Technology with global ambitions, building an XR-AI platform for neurodevelopmental assessment (initially autism/ADHD). We combine immersive experiences with multimodal signals (eye-tracking, speech prosody, motion, HRV/EDA, etc.) to create fairer, faster, culturally aware assessments centred on human values and real-world utility.
Environment:
- Small, mission-driven team with high ownership and clear outcomes
- Hybrid/remote flexibility; periodic on-site testing days
- We value neurodiversity, lived experience, and respectful collaboration.
What might a day in this role look like?
- Prototyping and shipping Unity (C#) XR scenes using OpenXR and the XR Interaction Toolkit for Meta Quest and comparable headsets
- Designing and implementing accessible, low-cognitive-load interaction patterns suitable for autistic and ADHD users; iterating with clinicians, researchers, and lived-experience advisors
- Integrating device/SDK inputs (e.g., hand/eye tracking, controllers, external sensors via Bluetooth/USB) and time-synchronising data streams for analysis
- Building clean, testable gameplay/UX loops, in-app tasks, and stimulus presentation with precise timing and event logging (a logging sketch follows this list)
- Collaborating with our AI team to expose Unity events/telemetry via local APIs or messaging (e.g., gRPC/REST/WebSocket) to Python/ONNX/PyTorch runtimes (see the second sketch after this list)
- Optimising performance for standalone headsets: batching, occlusion culling, shader and memory budgets, frame-time profiling
- Writing technical docs, contributing to lightweight QA, and taking part in pilot sessions with partner sites (NHS/academia/schools)
- Upholding security, privacy, and clinical safety practices appropriate for health tech (e.g., GDPR, data minimisation, DCB0129/0160 pathways).
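To make the timing and event-logging responsibility concrete, here is a minimal sketch of on-device logging, assuming JSON-lines output under Application.persistentDataPath and a monotonic Stopwatch clock. The names StimulusEventLogger and LogEvent are illustrative, not from an existing codebase:

```csharp
// Minimal sketch: append-only, timestamped event logging on a standalone headset.
// One JSON object per line keeps the file recoverable if the app is killed mid-session.
using System;
using System.Diagnostics;
using System.IO;
using UnityEngine;

public class StimulusEventLogger : MonoBehaviour
{
    private StreamWriter _writer;
    private readonly Stopwatch _clock = Stopwatch.StartNew(); // monotonic, not wall clock

    private void Awake()
    {
        // persistentDataPath is writable on standalone headsets such as Quest.
        var path = Path.Combine(Application.persistentDataPath, "session_events.jsonl");
        _writer = new StreamWriter(path, append: true);
        LogEvent("session_start", DateTimeOffset.UtcNow.ToString("o")); // anchor to wall clock once
    }

    public void LogEvent(string name, string detail = "")
    {
        // Milliseconds since session start; a real logger would also escape `detail`.
        double tMs = _clock.Elapsed.TotalMilliseconds;
        _writer.WriteLine($"{{\"t_ms\":{tMs:F3},\"event\":\"{name}\",\"detail\":\"{detail}\"}}");
    }

    private void OnDestroy()
    {
        LogEvent("session_end");
        _writer?.Flush();
        _writer?.Dispose();
    }
}
```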
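And for exposing telemetry to AI/analytics runtimes, a sketch of a WebSocket bridge using .NET's System.Net.WebSockets.ClientWebSocket, assuming a local Python consumer at ws://127.0.0.1:8765 (an illustrative endpoint; ClientWebSocket availability should be verified on the target headset runtime):

```csharp
// Sketch: push JSON telemetry to a local analytics service over WebSocket.
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using UnityEngine;

public class TelemetryBridge : MonoBehaviour
{
    private ClientWebSocket _socket;

    private async void Start()
    {
        _socket = new ClientWebSocket();
        await _socket.ConnectAsync(new Uri("ws://127.0.0.1:8765"), CancellationToken.None);
    }

    public async Task SendAsync(string json)
    {
        if (_socket?.State != WebSocketState.Open) return; // drop silently if disconnected
        var bytes = Encoding.UTF8.GetBytes(json);
        await _socket.SendAsync(new ArraySegment<byte>(bytes),
                                WebSocketMessageType.Text, true, CancellationToken.None);
    }

    private async void OnDestroy()
    {
        if (_socket?.State == WebSocketState.Open)
            await _socket.CloseAsync(WebSocketCloseStatus.NormalClosure, "done",
                                     CancellationToken.None);
        _socket?.Dispose();
    }
}
```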
What will I learn?
- End-to-end development of a regulated digital health product pathway (discovery → pilots → evaluation → evidence)
- Inclusive/XR accessibility design for neurodivergent users and participatory co-design methods
- Multimodal signal capture, synchronisation, and AI-assisted analytics
- Practical exposure to medical device and software lifecycle standards (e.g., IEC 62304), clinical safety case thinking, and user research in sensitive settings
- You’ll strengthen: XR engineering, accessible UX, data capture/synchronisation, optimisation, documentation, and interdisciplinary teamwork (engineering–clinical–UX).
What qualifications/skills are required?
- Degree in Computer Science, Games/XR, HCI, or equivalent real-world experience
- Portfolio/GitHub or shipped titles showing relevant XR/Unity work (essential)
- Formal training in accessibility/HCI, human factors, or clinical UX is a plus.
Essential skills and experience:
- Unity & C#: scene architecture, ScriptableObjects, Addressables, input systems, editor tooling
- XR foundations: OpenXR, Unity XR Interaction Toolkit, deployment to standalone headsets (e.g., Meta Quest)
- Interaction & UX: accessible, low-friction interaction patterns; attention to sensory load, contrast, typography, pacing, error recovery
- Performance & tooling: Unity Profiler, GPU/CPU frame analysis, build pipelines (CI/CD basics), crash/telemetry handling
- Data & timing: event logging, timestamping, JSON/CSV export, reliable file I/O on device, clock sync strategies (a round-trip offset sketch follows this list)
- APIs & interop: experience exposing/consuming local or remote APIs (REST/gRPC/WebSocket) for AI/analytics services
- Collaboration: version control (Git), code reviews, tickets/specs, writing clear documentation, communicating with non-technical stakeholders
- Mindset: user-safety first, curiosity, bias-aware design, and disciplined execution in small teams.
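On clock sync strategies, one common baseline is a round-trip (NTP-style) offset estimate against an external sensor. The sketch below assumes a caller-supplied requestSensorTimestampMs callback (hypothetical) that queries the sensor's clock; averaging several estimates and discarding those with long round trips improves robustness:

```csharp
// Sketch: estimate the offset between an external sensor's clock and the local
// monotonic clock, assuming roughly symmetric request/reply latency.
using System;
using System.Diagnostics;
using System.Threading.Tasks;

public static class ClockSync
{
    // Shared monotonic clock; logged events and this estimate use the same base.
    private static readonly Stopwatch LocalClock = Stopwatch.StartNew();

    public static double NowMs => LocalClock.Elapsed.TotalMilliseconds;

    // Returns the estimated offset (sensorClock - localClock) in milliseconds.
    public static async Task<double> EstimateOffsetMsAsync(
        Func<Task<double>> requestSensorTimestampMs)
    {
        double t0 = NowMs;                                    // local send time
        double sensorTime = await requestSensorTimestampMs(); // sensor clock at reply
        double t1 = NowMs;                                    // local receive time
        double midpoint = (t0 + t1) / 2.0;                    // assume reply stamped mid-flight
        return sensorTime - midpoint;
    }
}
```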
Desirable (nice-to-have) experience:
- Eye/hand tracking: Meta Quest/Pro, Tobii/Pupil Labs, XR gaze APIs; calibration flows and fixation/saccade eventing (see the fixation-detection sketch after this list)
- Audio & speech: microphone capture, latency handling; experience triggering/recording speech tasks
- Motion & biosignals: IMU/pose data, HRV/EDA (BLE sensors such as Polar/Empatica); basic signal-processing concepts
- Shaders & graphics: URP basics, simple shaders/material optimisation; 3D asset pipelines
- AI/ML interop: ONNX/PyTorch runtimes, feature extraction pipelines; sending/receiving model inferences
- Security & privacy: sandboxed storage, encryption at rest/in transit, data minimisation on device
- Quality & safety: familiarity with IEC 62304, ISO 13485 environments, NHS DSPT, and DCB0129/0160 clinical safety files (or willingness to learn)
- Testing with users: running pilot sessions, capturing consent, safeguarding awareness, and empathetic communication.
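For fixation/saccade eventing, a dispersion-threshold (I-DT) detector is a standard starting point. This is a sketch only: the GazeSample type and the default thresholds are assumptions, and vendor gaze APIs differ in how they deliver samples:

```csharp
// Sketch: dispersion-threshold (I-DT) fixation detection over buffered gaze samples.
using System.Collections.Generic;
using UnityEngine;

public struct GazeSample { public float TimeMs; public Vector2 Point; }

public static class FixationDetector
{
    // Emits (startMs, endMs) windows whose samples stay within maxDispersion
    // (same units as Point, e.g. degrees) for at least minDurationMs.
    public static IEnumerable<(float startMs, float endMs)> Detect(
        IReadOnlyList<GazeSample> samples,
        float maxDispersion = 1.0f, float minDurationMs = 100f)
    {
        int start = 0;
        for (int end = start + 1; end < samples.Count; end++)
        {
            if (Dispersion(samples, start, end) > maxDispersion)
            {
                // Window just broke; emit it if it lasted long enough.
                if (samples[end - 1].TimeMs - samples[start].TimeMs >= minDurationMs)
                    yield return (samples[start].TimeMs, samples[end - 1].TimeMs);
                start = end;
            }
        }
    }

    private static float Dispersion(IReadOnlyList<GazeSample> s, int start, int end)
    {
        // I-DT dispersion: (max_x - min_x) + (max_y - min_y) over the window.
        float minX = float.MaxValue, maxX = float.MinValue;
        float minY = float.MaxValue, maxY = float.MinValue;
        for (int i = start; i <= end; i++)
        {
            var p = s[i].Point;
            minX = Mathf.Min(minX, p.x); maxX = Mathf.Max(maxX, p.x);
            minY = Mathf.Min(minY, p.y); maxY = Mathf.Max(maxY, p.y);
        }
        return (maxX - minX) + (maxY - minY);
    }
}
```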
This post is subject to funding from the Scottish Government and Edinburgh’s Employer Recruitment Incentive. Candidates must meet one of the following criteria:
- Lone parents
- Parents with disabilities
- Young parents aged 25 years or under
- Minority ethnic families
- Families with a disabled child
- Families with 3 or more children
- Families where the youngest child is under 1 year old.
If you are unsure whether you meet these criteria, please feel free to make further enquiries before applying.