I'm trying to take a snapshot from a running GStreamer pipeline on iOS. I use a button to take the snapshot.
I have the following pipeline:
"udpsrc auto-multicast=true address=224.1.1.1 port=5004"
+ " ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0JAH6aAQAYZAA\\,aM4wpIAA\", payload=(int)96, ssrc=(uint)19088743, timestamp-offset=(uint)0, seqnum-offset=(uint)0"
+ " ! rtpjitterbuffer latency=400"
+ " ! rtph264depay ! avdec_h264 ! videoconvert"
+ " ! tee name=snapshot snapshot. ! queue ! valve name=snap drop=true ! jpegenc ! filesink name=filepath location=screenshot.jpg async=false snapshot. ! queue"
+ " ! autovideosink
So I use the following code in my button handler to toggle the valve:
GstElement *element = gst_bin_get_by_name (GST_BIN (pipeline), "snap");

if (strcmp ("drop", "drop") == 0)
{
    gboolean prop_val = FALSE;

    /* If the property value is currently true, send an EOS first so the
       JPEG already in flight gets flushed to the filesink. */
    if (strcmp ("false", "true") == 0)
    {
        gst_element_send_event (element, gst_event_new_eos ());
        prop_val = TRUE;
    } else {
        prop_val = FALSE;
    }

    g_object_set (element, "drop", prop_val, NULL);
}

gst_object_unref (element); /* gst_bin_get_by_name() returns a new reference */
But with this code I can only take one screenshot, and I can't set the filename of the image.
How can I save the image to the Documents folder with a custom name every time the button is clicked, without blocking the stream?
After a long search and struggle I found a solution, so I will answer my own question.
I've removed the tee from my pipeline and now read the last sample from the video sink instead.
My pipeline:
"udpsrc auto-multicast=true address=224.1.1.1 port=5004"
+ " ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0JAH6aAQAYZAA\\,aM4wpIAA\", payload=(int)96, ssrc=(uint)19088743, timestamp-offset=(uint)0, seqnum-offset=(uint)0"
+ " ! rtpjitterbuffer latency=400"
+ " ! rtph264depay ! avdec_h264 ! videoconvert"
+ " ! autovideosink
This is my Objective-C code:
-(UIImage*) takeSnapshot
{
    GstSample *videobuffer = NULL;
    GstCaps *caps;
    gint width, height;
    GstMapInfo map;
    UIImage *uiImage = NULL;

    /* Grab the most recently rendered frame; this does not interrupt playback. */
    g_object_get(G_OBJECT(video_sink), "last-sample", &videobuffer, NULL);
    if (!videobuffer)
        return NULL;

    caps = gst_sample_get_caps(videobuffer);
    if (!caps) {
        gst_sample_unref(videobuffer);
        return NULL;
    }

    /* we need the negotiated caps on the sample to get the frame size */
    GstStructure *s = gst_caps_get_structure(caps, 0);
    gboolean res = gst_structure_get_int(s, "width", &width);
    res = res && gst_structure_get_int(s, "height", &height);
    if (!res) {
        gst_sample_unref(videobuffer);
        return NULL;
    }

    GstBuffer *snapbuffer = gst_sample_get_buffer(videobuffer);
    if (snapbuffer && gst_buffer_map(snapbuffer, &map, GST_MAP_READ))
    {
        /* Copy the pixel data so the buffer can be unmapped and the sample
           released while the UIImage stays valid (4 bytes per pixel assumed). */
        CFDataRef pixelData = CFDataCreate(NULL, map.data, height * width * 4);
        CGDataProviderRef provider = CGDataProviderCreateWithCFData(pixelData);
        CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
        CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
        CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
        CGImageRef imageRef = CGImageCreate(width,
                                            height,
                                            8,          /* bits per component */
                                            4 * 8,      /* bits per pixel */
                                            width * 4,  /* bytes per row */
                                            colorSpaceRef,
                                            bitmapInfo,
                                            provider,
                                            NULL,
                                            NO,
                                            renderingIntent);
        uiImage = [UIImage imageWithCGImage:imageRef];

        CGImageRelease(imageRef);
        CGColorSpaceRelease(colorSpaceRef);
        CGDataProviderRelease(provider);
        CFRelease(pixelData);
        gst_buffer_unmap(snapbuffer, &map);
    }

    gst_sample_unref(videobuffer);
    return uiImage;
}
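To save the returned UIImage to the Documents folder with a custom name on every click (what the question asked for), here is a minimal sketch; saveSnapshot: and the timestamp-based filename are illustrative choices, not part of the original code:

-(void) saveSnapshot:(UIImage *)image
{
    if (!image)
        return;

    /* Build a unique, timestamp-based filename in the app's Documents directory. */
    NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                               NSUserDomainMask, YES) firstObject];
    NSString *name = [NSString stringWithFormat:@"snapshot-%.0f.jpg",
                                                [[NSDate date] timeIntervalSince1970]];
    NSString *path = [documents stringByAppendingPathComponent:name];

    /* Encode and write the image outside the pipeline, so the stream is never blocked. */
    NSData *jpeg = UIImageJPEGRepresentation(image, 0.9);
    [jpeg writeToFile:path atomically:YES];
}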