Saturday, February 8, 2014

Video server

Well, it's been a while since I last posted on this blog.
My current activities:
- OpenEmbedded/Yocto on a Raspberry Pi
- An RTP/RTSP/RTCP video server

An interesting link for OpenEmbedded/Yocto:

Thank you, Mr. Ficheux, and thank you as well for your very interesting training session at the end of January.

On the video server side:


Very interesting tools for building your own video server.
Goal: install it on a Raspberry Pi.

An interesting point for processing a stream:

The "test*Streamer" test programs read from a file. Can I modify them so that they take input from a H.264 or MPEG encoder instead, so I can stream live (rather than prerecorded) video and/or audio?

Yes. The easiest way to do this is to change the appropriate "test*Streamer.cpp" file to read from "stdin" (instead of "test.*"), and then pipe the output of your encoder to (your modified) "test*Streamer" application. (Even simpler, if your operating system represents the encoder device as a file, then you can just use the name of this file (instead of "test.*").) Alternatively, if your encoder presents you with a sequence of frames (or 'NAL units'), rather than a sequence of bytes, then a more efficient solution would be to write your own "FramedSource" subclass that encapsulates your encoder, and delivers audio or video frames directly to the appropriate "*RTPSink" object. This avoids the need for an intermediate 'framer' filter that parses the input byte stream. (If, however, you are streaming H.264, or MPEG-4 (or MPEG-2 video with "B" frames), then you should insert the appropriate "*DiscreteFramer" filter between your source object and your "*RTPSink" object.)
For a model of how to do that, see "liveMedia/DeviceSource.cpp" (and "liveMedia/include/DeviceSource.hh"). You will need to fill in parts of this code to do the actual reading from your encoder.
