android--libstreaming usage notes------an RTSP server that can still be used on Android in 2025----only usable for viewing the phone's camera
https://github.com/fyhertz/libstreaming
What it does
libstreaming is an API that allows you, with only a few lines of code, to stream the camera and/or microphone of an Android-powered device using RTP over UDP.
- Android 4.0 or more recent is required.
- Supported encoders include H.264, H.263, AAC and AMR.
The first step needed to start a streaming session with a peer is called 'signaling'. During this step, you contact the receiver and send a description of the incoming streams. There are three ways to do that with libstreaming.
The first step: I think what this means is that you need hole punching, i.e., a way to get through the firewall.
- With the RTSP client: if you want to stream to a Wowza Media Server, this is the way to go. Example 3 illustrates that use case.
- With the RTSP server: in that case the phone acts as an RTSP server and waits for an RTSP client to request a stream. Example 1 illustrates this use case (see the sketch below).
- Or you can use libstreaming without the RTSP protocol at all, and signal the session using SDP over any protocol you like. Example 2 illustrates that use case.
The three examples above only perform the signaling; no media content has actually been transmitted yet.
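To make the RTSP-server case concrete, here is a minimal sketch in the spirit of example 1, following the SessionBuilder/RtspServer pattern shown in the libstreaming README; the Activity name, layout, and view ID are illustrative assumptions:

```java
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;

import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.rtsp.RtspServer;

public class CameraStreamActivity extends Activity {

    private SurfaceView mSurfaceView; // libstreaming's own SurfaceView subclass

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_stream);               // hypothetical layout
        mSurfaceView = (SurfaceView) findViewById(R.id.surface); // hypothetical ID

        // Describe the session that will be handed to connecting RTSP clients.
        SessionBuilder.getInstance()
                .setSurfaceView(mSurfaceView)
                .setContext(getApplicationContext())
                .setAudioEncoder(SessionBuilder.AUDIO_AAC)
                .setVideoEncoder(SessionBuilder.VIDEO_H264);

        // Start the RTSP server; it waits for clients (port 8086 by default).
        startService(new Intent(this, RtspServer.class));
    }
}
```

The RtspServer must also be declared as a service in the manifest; a player such as VLC can then open rtsp://<phone-ip>:8086 to view the camera.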
The full Javadoc documentation of the API is available online, linked from the project's README.
How does it work? You should really read this, it's important!
There are three ways on Android to get encoded data from the peripherals:
- With the MediaRecorder API and a simple hack.
- With the MediaCodec API and the buffer-to-buffer method which requires Android 4.1.
- With the MediaCodec API and the surface-to-buffer method which requires Android 4.3.
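For readers unfamiliar with the buffer-to-buffer method, the following sketch shows how an H.264 encoder is typically created and fed through the plain Android MediaCodec API; this is the general technique, not libstreaming's exact code, and the resolution, bitrate, and color-format values are illustrative:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

import java.io.IOException;

public final class AvcEncoderSketch {

    // Creates and starts an H.264 encoder that consumes raw YUV frames through
    // its input buffers (the buffer-to-buffer method, Android 4.1 / API 16+).
    public static MediaCodec createEncoder() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 500_000);   // illustrative
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 20);      // illustrative
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        // Camera preview frames are then queued with queueInputBuffer(), and
        // encoded NAL units are drained with dequeueOutputBuffer() and handed
        // to the RTP packetizer.
        return encoder;
    }
}
```

In the surface-to-buffer variant (API 18+), createInputSurface() replaces the input buffers: the camera renders directly into the encoder's input Surface, avoiding the color-format conversions that complicate the buffer-to-buffer path.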
Encoding with the MediaRecorder API
The MediaRecorder API was not intended for streaming applications but can be used to retrieve encoded data from the peripherals of the phone. The trick is to configure a MediaRecorder instance to write to a LocalSocket instead of a regular file (see MediaStream.java).
Edit: as of Android Lollipop, using a LocalSocket is not possible anymore for security reasons, but using a ParcelFileDescriptor does the trick. More details can be found in MediaStream.java.
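A hedged sketch of the trick follows, using the ParcelFileDescriptor pipe variant that still works after Lollipop; it uses only the standard MediaRecorder API, and the helper class and method names are hypothetical (libstreaming's real implementation in MediaStream.java handles more cases):

```java
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

import java.io.IOException;
import java.io.InputStream;

public final class RecorderPipeSketch {

    // Routes MediaRecorder output into an in-memory pipe instead of a file.
    // The read end yields a 3GPP byte stream that still has to be parsed to
    // extract the H.264 NAL units before RTP packetization.
    public static InputStream startRecorder(Camera camera) throws IOException {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();

        MediaRecorder recorder = new MediaRecorder();
        recorder.setCamera(camera); // caller must have called camera.unlock()
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        // The trick: hand MediaRecorder the write end of the pipe instead of
        // a seekable file.
        recorder.setOutputFile(pipe[1].getFileDescriptor());
        recorder.prepare();
        recorder.start();

        // The encoded stream is consumed from the read end of the pipe.
        return new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
    }
}
```

Because MediaRecorder cannot seek back in a pipe to finalize the container header, the consumer has to skip the MP4/3GPP header and locate the encoded frames inside the stream, which is exactly the parsing work libstreaming does for you.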