Bob Pritchard (B.Mus, Mus.M, DMA) is a Canadian composer living in British Columbia. His works are performed and broadcast internationally and his research includes interactive performance, gesture tracking, and gesture-controlled speech synthesis. He creates video, software and music for his interactive works and produces short art films. He teaches music technology in the UBC School of Music, co-directs the Laptop Orchestra, and is a researcher with the UBC Institute for Computing, Information and Cognitive Systems (ICICS).
Dr. Pritchard completed degrees at the University of Toronto and UBC. Prior to teaching at the UBC School of Music he taught at Brock University, the UBC Department of Physics, and Douglas College.
|I gratefully acknowledge that UBC-Vancouver activities take place on the traditional, ancestral, and unceded territory of the Coast Salish people, including the xwməθkwəy̓əm (Musqueam), Skwxwú7mesh (Squamish), Stó:lō, and Səl̓ílwətaʔ/Selilwitulh (Tsleil-Waututh) Nations.|
|Sept 17: 10:00 a.m. PST. The Zagreb Biennale will be showing Turning Point video projects, including Synapses|
|Sept 15: 6:00 - 6:45 CITR Radio's program Research Review: interview regarding my research|
|Sept 8: New interactive work for two dancers and two musicians commissioned from Athena Loredo by the TASTE project.|
|New choreographic work started for second spine. Danielle Lee, dance; Emmalena Fredriksson, choreography.|
|August: Beginning work on new piece for 3D printed violin with sensors, interactive sensor suit with lighting, Kinect-Controlled Artistic Sensing System (KiCASS), and Max/MSP/Jitter. Daniel Tsui, violin; Danielle Lee, dance; Emmalena Fredriksson, choreography; Alaia Hamer, costume design.|
|July 22: Video recording of Doshite? for piano, Sleeve-Hand Responsive User Garment (SHRUG), video clips, and interactive electronics (Max/MSP/Jitter). Commissioned and recorded by Megumi Masaki at the UBC Telus Theatre. Video project by Collide Entertainment.|
|July 3: Latest tests with Cheap Loops combining colour and gesture tracking for control of audio track volumes and the triggering of samples through gesture recognition.|
|June 11: Daniel Tsui and Jack Griffiths have refined the Hidden Markov Modelling for tracking dance movements. This will be used to control sample triggering and scrubbing, lighting presets, and lighting changes.|
|June 2: assisting with recording/processing for Emmalena Fredriksson's dance film Soft Palete|
|May 3: adding machine learning to KiCASS for gesture tracking libraries with multiple dancers|
|Apr 22: Synapses accepted for presentation at the New York City Electroacoustic Music Festival 2021|
© Copyright 2020 Bob Pritchard. All Rights Reserved.