There is a standard that all the normal desktop userspace apps already use: V4L2, and in particular the single /dev/video# device use case, with all the high-level controls exposed directly on that device.
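To make that concrete, here is a minimal sketch of what a desktop app does today. /dev/video0 and the brightness control are just placeholders for whatever your system exposes:

```c
/* Minimal sketch: query a high-level control (brightness) on a plain
 * /dev/video# node, the way desktop apps expect to find it. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) {
        perror("open /dev/video0");
        return 1;
    }

    struct v4l2_queryctrl qc;
    memset(&qc, 0, sizeof(qc));
    qc.id = V4L2_CID_BRIGHTNESS;

    /* On a UVC webcam this typically succeeds; on phone hardware the
     * control is often missing here and lives on a subdevice instead. */
    if (ioctl(fd, VIDIOC_QUERYCTRL, &qc) == 0)
        printf("%s: range %d..%d, default %d\n",
               (char *)qc.name, qc.minimum, qc.maximum, qc.default_value);
    else
        perror("VIDIOC_QUERYCTRL (control not exposed here)");

    close(fd);
    return 0;
}
```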
For the likes of the Librem and Pinephone, the high-level controls either don't exist at the HW level, or they do exist but are not exposed on the video device itself; instead they live on the various V4L2 subdevices that form the video pipeline.
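On such hardware the very same control ioctls do work, just against a subdevice node rather than the video node. A hedged sketch follows; the subdev path and the choice of V4L2_CID_EXPOSURE are illustrative, as which subdev owns which control depends on the pipeline (`media-ctl -p` shows the actual topology):

```c
/* Sketch: setting a control directly on a v4l2 subdevice node. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    /* Illustrative path - find the real sensor subdev via media-ctl. */
    int fd = open("/dev/v4l-subdev3", O_RDWR);
    if (fd < 0) {
        perror("open subdev");
        return 1;
    }

    struct v4l2_control ctrl = {
        .id    = V4L2_CID_EXPOSURE,
        .value = 500,            /* sensor-specific units */
    };

    /* Plain VIDIOC_S_CTRL works on subdev nodes too, but no desktop
     * app knows to look here - that's exactly the gap. */
    if (ioctl(fd, VIDIOC_S_CTRL, &ctrl) < 0)
        perror("VIDIOC_S_CTRL");

    close(fd);
    return 0;
}
```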
One way to support all the existing apps would be to implement what they already expect (see above): that is, to make the video device controllable by the usual means they already possess. Instead of extending every app to use libcamera and leaving the rest behind, we could simply proxy the video controls from the ioctls where apps expect them over to a userspace daemon, which would then configure the complex HW-specific media pipeline behind the scenes (basically all the media subsystem subdevices for the sensor, sensor interface, ISP, etc.).
In other words, to implement in a userspace daemon what the USB microcontroller in a UVC webcam does in firmware, while keeping the existing userspace interface expectations for simple camera usage. A very rough sketch of the idea follows.
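All names and the routing table below are hypothetical, and the hard part, actually intercepting the app's ioctl on the video node (via a kernel shim, CUSE, or similar), is hand-waved here; this only shows the re-targeting step:

```c
/* Hypothetical sketch of the proxy daemon's core: take a control the
 * app set on the plain video node and forward it to whichever
 * subdevice actually implements it. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

/* Hypothetical routing table: control id -> subdev node that owns it. */
struct ctrl_route {
    unsigned int id;
    const char *subdev;
};

static const struct ctrl_route routes[] = {
    { V4L2_CID_EXPOSURE,   "/dev/v4l-subdev3" },  /* sensor */
    { V4L2_CID_BRIGHTNESS, "/dev/v4l-subdev5" },  /* ISP    */
};

static int forward_ctrl(unsigned int id, int value)
{
    for (size_t i = 0; i < sizeof(routes) / sizeof(routes[0]); i++) {
        if (routes[i].id != id)
            continue;
        int fd = open(routes[i].subdev, O_RDWR);
        if (fd < 0)
            return -1;
        struct v4l2_control ctrl = { .id = id, .value = value };
        int ret = ioctl(fd, VIDIOC_S_CTRL, &ctrl);
        close(fd);
        return ret;
    }
    return -1;  /* no subdev owns this control */
}

int main(void)
{
    /* Pretend an app just issued VIDIOC_S_CTRL(V4L2_CID_EXPOSURE, 500)
     * against /dev/video0; the daemon re-targets it. */
    if (forward_ctrl(V4L2_CID_EXPOSURE, 500) < 0)
        perror("forward_ctrl");
    return 0;
}
```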
This is kinda orthogonal to the libcamera effort, I guess. Just wanted to say that there already is a standard. :)
It's not orthogonal. In fact, it's a very good observation, and even libcamera itself recognizes this by providing V4L2 emulation.
It could be a viable way to cover the basic use case of video streaming where special controls are not needed. It's worth considering, although it then makes sense to leverage the work already in libcamera to implement that extra layer.
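For reference, the emulation mentioned above is libcamera's V4L2 compatibility layer: an LD_PRELOAD shim (v4l2-compat.so, wrapped by the libcamerify helper script in recent releases, paths varying by distribution) that intercepts an unmodified app's open()/ioctl() calls on /dev/video# and services them through libcamera, e.g. `libcamerify cheese`. That is essentially the proxy idea, with the translation done in-process rather than in a separate daemon.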