
Hydra will expose a computer's peripherals as individual resources that can be controlled by different devices in the smart space, provided those devices have the proper drivers. Currently, the aim is to offer Mouse, Keyboard, Camera and Screen resources to the smart space. So far, the mouse and keyboard drivers are already finished, and I'm studying how to write the Camera and Screen driver code.
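Hydra's real driver API isn't shown in this post, so here's just a rough, hypothetical sketch of the idea: each peripheral gets a driver object, and the smart space is a registry that routes commands from remote devices to the right driver. All class and method names below (`ResourceDriver`, `MouseDriver`, `SmartSpace`, `offer`, `control`) are my own illustration, not Hydra's actual code.

```python
from abc import ABC, abstractmethod


class ResourceDriver(ABC):
    """A driver exposes one peripheral as a controllable resource."""

    @abstractmethod
    def handle(self, command: str, **params) -> str:
        ...


class MouseDriver(ResourceDriver):
    """Toy mouse driver: tracks a virtual pointer position instead of
    moving the real OS cursor."""

    def __init__(self):
        self.x, self.y = 0, 0

    def handle(self, command, **params):
        if command == "move":
            self.x, self.y = params["x"], params["y"]
            return f"pointer at ({self.x}, {self.y})"
        raise ValueError(f"unknown command: {command}")


class SmartSpace:
    """Registry mapping resource names to the drivers that serve them."""

    def __init__(self):
        self.resources = {}

    def offer(self, name, driver):
        # A machine "offers" one of its peripherals to the space.
        self.resources[name] = driver

    def control(self, name, command, **params):
        # Another device in the space sends a command to that resource.
        return self.resources[name].handle(command, **params)


space = SmartSpace()
space.offer("mouse", MouseDriver())
print(space.control("mouse", "move", x=100, y=200))  # pointer at (100, 200)
```

A real driver would of course talk to the operating system (and the commands would arrive over the network), but the split is the same: the resource is offered once, and any authorized device can then send it commands.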
To illustrate Hydra's usefulness, I'll present two scenarios.
Let's suppose there's a meeting going on. While Madison presents some slides, Ethan doesn't understand part of the presentation and requests control over the mouse to better explain his doubts. Madison uses Hydra to offer control over her computer's mouse pointer, and Ethan then controls it using his own cellphone or computer.
A teacher wants to show her students how to configure some software. She offers her screen as a resource, and the students use Hydra to capture it and display it on their own monitors. Then everyone can closely watch every step involved.
That's it. The plan is to finish development and testing by the end of November. Let's see how things will roll!