Overview
This was a team project that extended a base Arkanoid implementation with richer mechanics, a refactored OOP architecture, and new features divided among the group. The team added improved collision detection, multiple brick types, a power-up system, and level progression. My responsibility was the sound system: designing and implementing event-driven audio feedback across the full game.
Adding sound to a game requires hooking into the event model without creating tight coupling between audio logic and game logic. I designed the sound layer to respond to game events (brick destroyed, ball bounced, power-up collected, level complete, game over) through a clean interface, so the audio system could be developed and tested independently.
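The decoupling idea above can be sketched in a few lines. All names here (`GameEvent`, `GameEventListener`, `CollisionSystem`) are illustrative assumptions, not the project's actual identifiers:

```java
// Sketch of the decoupling idea: game logic reports named events through a
// small interface and never touches the audio implementation directly.
enum GameEvent {
    BRICK_DESTROYED, BALL_BOUNCED, POWER_UP_COLLECTED, LEVEL_COMPLETE, GAME_OVER
}

interface GameEventListener {
    void onEvent(GameEvent event);
}

// Hypothetical game-logic component: it only knows the listener interface.
class CollisionSystem {
    private final GameEventListener audio;

    CollisionSystem(GameEventListener audio) {
        this.audio = audio;
    }

    void destroyBrick() {
        // ...collision and scoring logic would run here...
        audio.onEvent(GameEvent.BRICK_DESTROYED); // one line, no audio knowledge
    }
}
```

Because the game side depends only on the interface, the audio system can be swapped for a stub in tests or developed in parallel.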
Project Context
Team project built on a base Arkanoid clone. Each member owned a feature area; the team integrated at the end.
My Contribution
Full sound system - event-driven audio for every game interaction, loaded and played using Java's audio API.
Challenge
Integrating audio without coupling sound logic to game logic, so teammates' code didn't need to know about the audio system.
Tech Stack
Java, with the `javax.sound.sampled` API for audio loading and playback.
My Contribution: Sound System
- Event-driven audio architecture: designed a `SoundManager` class that other components could call without knowing how audio worked, keeping sound logic decoupled from game logic.
- Audio loading and playback: used Java's `javax.sound.sampled` API to load WAV files into `Clip` objects at startup, avoiding disk reads during gameplay that would cause audio lag.
- Game event coverage: wired sound effects to every meaningful game event: ball hitting a brick, ball hitting the paddle, ball hitting a wall, brick destroyed, power-up collected, level complete, life lost, and game over.
- Overlapping sounds: handled the case where the same sound fires multiple times in rapid succession (e.g. the ball destroying several bricks at once) by opening a new `Clip` instance per play rather than reusing a single shared instance.
- Team integration: coordinated with teammates so the sound API was simple enough to call from the collision and game-state code they were writing, without requiring them to understand the audio implementation.
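The bullets above combine into one pattern: preload each WAV into memory at startup, then open a fresh `Clip` from the cached bytes on every play. A minimal sketch, with assumed names (the project's actual `SoundManager` may differ):

```java
import javax.sound.sampled.*;
import java.io.*;
import java.util.*;

// Preload-then-replay sketch: WAVs are read fully into memory once at
// startup; every play opens a fresh Clip so rapid repeats can overlap.
class SoundManager {
    private final Map<String, byte[]> cache = new HashMap<>();

    // Called once at startup: no disk reads during gameplay afterwards.
    void load(String name, InputStream wavData) {
        try {
            cache.put(name, wavData.readAllBytes());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    boolean isLoaded(String name) {
        return cache.containsKey(name);
    }

    // A new Clip per play means overlapping triggers don't cut each other off.
    void play(String name) {
        byte[] data = cache.get(name);
        if (data == null) return;
        try (AudioInputStream in =
                 AudioSystem.getAudioInputStream(new ByteArrayInputStream(data))) {
            Clip clip = AudioSystem.getClip();
            clip.open(in);
            clip.addLineListener(e -> {   // release the line when playback ends
                if (e.getType() == LineEvent.Type.STOP) clip.close();
            });
            clip.start();
        } catch (Exception e) {
            // Audio failure should degrade silently, never crash gameplay.
        }
    }
}
```

Closing each clip on its `STOP` event matters: without it, every overlapping play would leak an open audio line.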
What the Team Built
Context for the full project my sound system was integrated into:
- Improved collision detection: side-specific bounce logic replacing the original naive axis flip, fixing corner-clipping bugs.
- Brick class hierarchy: base `Brick` class with subclasses for standard, tough (two-hit), and exploding bricks.
- Power-up system: falling power-ups (wide paddle, extra ball, slow ball) implemented as individual classes behind a shared interface.
- Game state and level progression: lives system, score tracker, win/lose conditions, and level loading from 2D brick-type arrays.
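The brick hierarchy and array-based level loading might combine roughly as follows. This is a hypothetical sketch of my teammates' design, with assumed names and type codes, not their actual code:

```java
import java.util.*;

// Each non-zero cell of a 2D type array becomes a brick of the matching
// subclass. Names and codes here are illustrative assumptions.
abstract class Brick {
    final int row, col;
    Brick(int row, int col) { this.row = row; this.col = col; }
    abstract int hitsToDestroy();
}

class StandardBrick extends Brick {
    StandardBrick(int r, int c) { super(r, c); }
    int hitsToDestroy() { return 1; }
}

class ToughBrick extends Brick {
    ToughBrick(int r, int c) { super(r, c); }
    int hitsToDestroy() { return 2; } // two-hit brick
}

class LevelLoader {
    // Illustrative type codes: 0 = empty, 1 = standard, 2 = tough.
    static List<Brick> load(int[][] layout) {
        List<Brick> bricks = new ArrayList<>();
        for (int r = 0; r < layout.length; r++) {
            for (int c = 0; c < layout[r].length; c++) {
                if (layout[r][c] == 1) bricks.add(new StandardBrick(r, c));
                else if (layout[r][c] == 2) bricks.add(new ToughBrick(r, c));
            }
        }
        return bricks;
    }
}
```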
Challenges & What I Learned
- Audio lag from loading on demand: my first attempt loaded audio files from disk each time a sound fired, which caused a noticeable delay on the first play. Pre-loading all clips at startup eliminated the lag entirely.
- Concurrent sound playback: a single `Clip` can only play once at a time; restarting it cuts off the previous play. The fix was opening a fresh clip instance per event rather than sharing one.
- Designing a clean API for teammates: the sound system had to be simple enough that teammates could add a one-line call from their code without understanding `javax.sound.sampled`. Writing for other developers is a different skill from writing for yourself.
- Integration in a team project: working against code written by others, rather than code I wrote, required reading and understanding their event model before designing how audio would hook into it.