Possible? PS4 RockBand - Reading Lightbar data
Posted: Sun Feb 14, 2021 5:45 am
Howdy!
I've been looking for years for a way to do this project, and while I have a makeshift solution... it seems the Titan Two MIGHT just be the magic box I need.
Background:
When playing RockBand, all the songs have lighting cues... just like a real stage performance. (When to dim / brighten what lights, what colours, etc).
RockBand 4 for PS4 uses those same cues to change the colours on the DualShock Lightbars.
Currently my lighting software (LightJams) uses a camera to read the colours off the controllers. However, this obviously introduces lag, and the colours aren't totally accurate. (Although it's only about a quarter of a second of lag, so not bad.)
Question:
Is the following possible?
1) Can the Titan Two read the lightbar data from the PS4 DualShock controllers?
(From reading, this seems totally possible. However, it's only two controllers per device, correct?)
2) Is there a way to export that data live to LightJams?
LightJams can read OSC, MIDI, Art-Net, and sACN.
Or heck, LightJams can even read screen data. For example, if the lightbar's RGB value were converted to a coloured square on screen, LightJams could sample that part of the screen and get the RGB value. While this is the same concept as using the camera, it's obviously much quicker and more accurate to read an RGB value straight off the screen than from a camera feed.
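For what it's worth, the OSC route looks like the simplest one to script: OSC is just a small binary format sent over UDP, so once something (the Titan Two, or a helper script on a PC) has the lightbar's RGB values, forwarding them is a few lines of code. Here's a minimal sketch in Python using only the standard library. The address pattern `/lightbar/1/rgb`, the host, and the port are all made up for illustration; they'd need to match whatever the LightJams OSC input is configured to listen for.

```python
import socket
import struct

def osc_message(address, *ints):
    """Encode a minimal OSC message carrying int32 arguments.

    OSC layout: address string, then a type-tag string (',' plus one
    'i' per int), then the big-endian int32 values. Strings are
    null-terminated and padded to a 4-byte boundary.
    """
    def pad(b):
        b += b"\x00"                       # null terminator
        return b + b"\x00" * (-len(b) % 4)  # pad to multiple of 4

    msg = pad(address.encode("ascii"))
    msg += pad(("," + "i" * len(ints)).encode("ascii"))
    for v in ints:
        msg += struct.pack(">i", v)
    return msg

def send_rgb(sock, host, port, r, g, b):
    """Forward one lightbar colour reading to an OSC listener over UDP."""
    sock.sendto(osc_message("/lightbar/1/rgb", r, g, b), (host, port))

# Example (hypothetical host/port — whatever LightJams is set to):
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   send_rgb(sock, "127.0.0.1", 9000, 255, 0, 64)
```

The same encoder would work for the bonus-question triggers too: a song-start event could just be another OSC address with a single int argument.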
Bonus Question:
3) It also seems the computer vision software would even be able to tell when you're on the song selection screen / score screen / starting a song...?
(Same idea as above: is there any way to export a trigger from this via any of the protocols listed?)
Thanks in advance! I know this is a really weird edge case...
P.S. If this whole crazy idea would work but a plug-in or script would need to be written, I'd totally be willing to fork over some money for it.
P.P.S. lastly... you haven’t REALLY played RockBand until you’ve rocked out with a fogger and full set of stage lighting ;)