How It Works

Concept Overview

Crowd energy to forest behavior

Data Forest is a live visual system where human activity becomes ecological behavior. Camera-based metrics are not simply displayed as numbers; they are translated into growth, particle density, wind motion, weather, and atmosphere inside a Unity forest.

The core idea is simple: more people, more life. A quiet building produces a calmer forest. A busy building produces denser vegetation, more motion, and more environmental activity.

Wide cinematic render of the Data Forest simulation

Render Reactivity

Input signals to visual behaviors
Growth stage 1: oak stump
01 Stump
Growth stage 2: leafless oak
02 Bare
Growth stage 3: branching oak with early leaves
03 Branching
Growth stage 4: fully leafed oak
04 Full
People Count -> Biomass

Tree and Ground Growth

Detected people counts are smoothed into biomass values. Biomass controls how dense each micro-zone becomes, including the custom four-stage SpeedTree oak, grass coverage, and growth/decay timing.
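The count-to-biomass smoothing described above can be sketched as a simple exponential blend. This is an illustrative sketch, not the project's actual code; the names `SMOOTHING` and `BIOMASS_CAP` and their values are assumptions.

```python
# Sketch of smoothing a raw people count into a 0..1 biomass value.
# SMOOTHING and BIOMASS_CAP are illustrative assumptions, not
# the project's actual tuning parameters.

SMOOTHING = 0.1    # per-update blend factor (0..1); lower = slower growth/decay
BIOMASS_CAP = 12   # people count at which a micro-zone saturates


def update_biomass(current_biomass: float, people_count: int) -> float:
    """Ease the zone's biomass toward the target implied by the count."""
    target = min(people_count / BIOMASS_CAP, 1.0)
    return current_biomass + (target - current_biomass) * SMOOTHING
```

Because the blend factor governs both growth and decay, a crowd leaving the zone produces a gradual wilt rather than an instant reset.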

Noise implementation and particle system development
Noise -> Particles

Fireflies and Atmospheric Detail

Noise values connect directly to the particle system, driving firefly-like particles, drifting points, and ambient visual texture. These effects make the forest feel alive even when growth is low.
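One way to express the noise-to-particle link is a mapping from a normalized noise value to an emission rate with a small always-on baseline, which is what keeps the forest feeling alive at low activity. The thresholds, `MAX_RATE`, and the baseline fraction below are assumptions for the sketch, not the project's real values.

```python
# Illustrative mapping from a normalized 0..1 noise value to a
# firefly emission rate. NOISE_FLOOR, MAX_RATE, and the 5% baseline
# are assumed values for this sketch.

NOISE_FLOOR = 0.1   # noise below this only yields the baseline
MAX_RATE = 200.0    # particles per second at full noise


def firefly_emission_rate(noise: float) -> float:
    """Map noise to an emission rate, keeping a faint baseline so
    particles never fully disappear."""
    noise = max(0.0, min(noise, 1.0))
    baseline = 0.05 * MAX_RATE
    active = max(noise - NOISE_FLOOR, 0.0) / (1.0 - NOISE_FLOOR)
    return baseline + (MAX_RATE - baseline) * active
```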

Data Forest scene showing vegetation and environmental motion
Movement -> Wind

Motion in the Canopy

Movement metrics are mapped to wind intensity, giving the scene a physical response to crowd energy. The grass and tree assets share the same wind rate and direction, so motion reads as one coherent environment.
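The shared-wind idea can be sketched as a single wind state that every asset reads, eased toward the movement metric. `WindState` and the blend constant are illustrative assumptions, not the project's C# controllers.

```python
# Sketch of a single wind state shared by grass and trees, eased toward
# the normalized movement metric. The class and blend factor are
# illustrative, not the project's actual Unity code.

from dataclasses import dataclass


@dataclass
class WindState:
    strength: float = 0.0    # read by both grass and tree shaders
    direction: float = 0.0   # degrees; shared so motion stays coherent


def apply_movement(wind: WindState, movement: float, blend: float = 0.2) -> WindState:
    """Ease wind strength toward the clamped movement metric."""
    target = max(0.0, min(movement, 1.0))
    wind.strength += (target - wind.strength) * blend
    return wind
```

Keeping one state object, rather than per-asset wind values, is what makes the canopy and grass move as a single environment.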

Secondary Interaction

Pose detection and rain

In addition to passive building-wide data, Data Forest includes a more direct interactive layer. A camera near the installation can detect viewer poses and trigger real-time responses within the render, including rain and immersive audio. The rain system works with a reactive sky asset, turning the audience from a passive data source into an active participant in the scene's weather.

Camera spoofing and pose detection test setup

Pose detection / installation-camera media slot.

Short rain demo showing a single zone orbit with the weather system active.

Media Gallery To Add

Final documentation slots

People Count -> Biomass Video

Show counts rising and vegetation growing.

Noise -> Fireflies Video

Show particle behavior responding to noise.

Movement -> Wind Video

Show canopy or grass motion responding to movement.

Pose Input Footage

Add camera-side footage showing the gesture that triggers the rain response.

Real Installation Photos

Add final physical setup and expo documentation.

System Diagram

Add final camera/server/Unity architecture diagram.

Technical Architecture

Full system path
01

Camera Inputs

Webcams observe selected zones and provide frame data to the detection server.

02

Detection Server

Python, OpenCV, and YOLOv8 process camera frames and compute people-count metrics.
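The per-frame counting step can be sketched with the ultralytics YOLOv8 API: run one inference pass, then count detections in COCO's person class. The model file name and camera index below are illustrative, and this is a sketch of the approach, not the project's server code.

```python
# Sketch of per-frame people counting (assumes the ultralytics YOLOv8
# API; model file and camera index are illustrative).

PERSON_CLASS = 0  # COCO class id for "person"


def count_people(class_ids) -> int:
    """Count detections whose class id is the COCO person class."""
    return sum(1 for c in class_ids if int(c) == PERSON_CLASS)


def people_in_frame(model, frame) -> int:
    """Run one YOLOv8 inference pass and count person detections."""
    result = model(frame, verbose=False)[0]
    return count_people(result.boxes.cls.tolist())


if __name__ == "__main__":
    import cv2
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")              # small pretrained COCO model
    ok, frame = cv2.VideoCapture(0).read()  # one webcam frame
    if ok:
        print("people:", people_in_frame(model, frame))
```

Counting boxes rather than tracking identities is what lets the system produce useful metrics without storing any identity information.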

03

Metrics API

FastAPI exposes per-camera values, aggregate counts, smoothed values, and health state.

04

Unity Polling

Unity polls the HTTP endpoint and converts the current metrics into normalized control values.
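Unity performs this step in C#; the following is a Python sketch of the same poll-and-normalize logic. The endpoint URL, field name, and normalization scale are illustrative assumptions.

```python
# Python sketch of the poll-and-normalize step Unity performs in C#.
# The URL, "smoothed" field, and COUNT_SCALE are illustrative assumptions.
import json
from urllib.request import urlopen

METRICS_URL = "http://localhost:8000/metrics"  # hypothetical endpoint
COUNT_SCALE = 20.0                             # count that maps to 1.0


def normalize(count: float, scale: float = COUNT_SCALE) -> float:
    """Clamp a raw count into the 0..1 control range the render uses."""
    return max(0.0, min(count / scale, 1.0))


def poll_once() -> float:
    """Fetch current metrics and return one normalized control value."""
    with urlopen(METRICS_URL, timeout=2) as resp:
        data = json.load(resp)
    return normalize(data["smoothed"])
```

Normalizing at the boundary means every downstream controller (biomass, particles, wind) works in the same 0..1 range regardless of how busy the building actually is.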

05

Render Systems

Micro-zone controllers drive four-stage tree growth, grass coverage, noise particles, synchronized wind, weather, sky state, camera behavior, and ambience.

Tech Stack Notes

Implementation detail
Computer Vision

Python, OpenCV, YOLOv8

The vision layer captures webcam frames, detects people, and produces simple metrics that the render system can use without storing identity information.

Server

FastAPI Metrics Endpoint

The server exposes the current system state through HTTP. This makes the data path inspectable in a browser and straightforward for Unity to poll.

Render

Unity and C# Controllers

Unity owns the live 3D world. C# scripts map incoming metrics into biomass, particles, wind, rain, camera movement, and other environmental behaviors.

Assets

SpeedTree and Custom Scene Work

Custom SpeedTree oak tree assets, Asset Store grass and rain systems, architectural geometry, particles, sky behavior, and scene lighting combine to make the data feel like a living terrarium rather than a dashboard.