I have been working on this problem for hours now. I have to create a program in which the user selects videos from a child window that browses the hard disk drives, and I need to get the frame rate and other properties from each video.
Here's sample code showing how I'm getting the videos and some of their properties.
SelectDirectoryWindow selectDirectoryWindow = (sender as SelectDirectoryWindow);
if (selectDirectoryWindow.DialogResult.GetValueOrDefault(false))
{
    foreach (System.IO.FileInfo fileInfo in selectDirectoryWindow.VideoFiles)
    {
        VideoFileInfo videoFileInfo = new VideoFileInfo();
        videoFileInfo.FileName = fileInfo.Name;
        videoFileInfo.Path = fileInfo.FullName;
        videoFileInfo.Extension = fileInfo.Extension;
        videoFileInfo.FileSize = fileInfo.Length;
        // Map the file extension to the matching VideoFileType constant.
        switch (videoFileInfo.Extension.ToUpper())
        {
            case ".WMV":
                videoFileInfo.VideoFileType = Constants.VideoFileType.Wmv;
                break;
            case ".MOV":
                videoFileInfo.VideoFileType = Constants.VideoFileType.ProResHq;
                break;
            case ".MPG":
                videoFileInfo.VideoFileType = Constants.VideoFileType.Mpeg2;
                break;
            case ".ISM":
                videoFileInfo.VideoFileType = Constants.VideoFileType.SmoothStreaming;
                break;
            case ".MP4":
                videoFileInfo.VideoFileType = Constants.VideoFileType.iPad;
                break;
            default:
                break;
        }
    }
}
Is there any way I can also get the frame rate, video duration, and bit rate from this? Thanks in advance.
I have found the answer. There is a ShellFile class in Microsoft.WindowsAPICodePack.Shell. It exposes the properties of the video; just give it the path of the source file and you can read any of the properties from there.
Here's how I got the frame rate.
ShellFile shellFile = ShellFile.FromFilePath(sourceFile);
return (shellFile.Properties.System.Video.FrameRate.Value / 1000).ToString();
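Duration and bit rate can be read from the same property store. A minimal sketch, assuming the usual Windows API Code Pack property names (Duration is reported in 100-nanosecond units, EncodingBitrate in bits per second):
ShellFile shellFile = ShellFile.FromFilePath(sourceFile);

// Duration is exposed in 100-nanosecond units; convert to a TimeSpan.
ulong durationTicks = shellFile.Properties.System.Media.Duration.Value ?? 0;
TimeSpan duration = TimeSpan.FromTicks((long)durationTicks);

// Encoding bit rates are reported in bits per second.
uint? videoBitRate = shellFile.Properties.System.Video.EncodingBitrate.Value;
uint? audioBitRate = shellFile.Properties.System.Audio.EncodingBitrate.Value;

// Frame rate is stored as frames per second multiplied by 1000.
uint? frameRate = shellFile.Properties.System.Video.FrameRate.Value;
double fps = (frameRate ?? 0) / 1000.0;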
When using
getResources().getConfiguration().uiMode & Configuration.UI_MODE_NIGHT_MASK
to check what mode the app is currently in:
int currentNightMode = getResources().getConfiguration().uiMode
        & Configuration.UI_MODE_NIGHT_MASK;
switch (currentNightMode) {
    case Configuration.UI_MODE_NIGHT_NO:
        // Night mode is not active, we're in day time
        break;
    case Configuration.UI_MODE_NIGHT_YES:
        // Night mode is active, we're at night!
        break;
    case Configuration.UI_MODE_NIGHT_UNDEFINED:
        // We don't know what mode we're in, assume notnight
        break;
}
If this was called earlier with AppCompatDelegate.MODE_NIGHT_YES:
AppCompatDelegate.setDefaultNightMode(AppCompatDelegate.MODE_NIGHT_YES);
will currentNightMode then be Configuration.UI_MODE_NIGHT_YES?
And what would it return when AppCompatDelegate.MODE_NIGHT_FOLLOW_SYSTEM was set before:
AppCompatDelegate.setDefaultNightMode(AppCompatDelegate.MODE_NIGHT_FOLLOW_SYSTEM);
and the device has since changed from light to dark (or from dark to light)?
context.resources.configuration.uiMode and Configuration.UI_MODE_NIGHT_MASK == Configuration.UI_MODE_NIGHT_YES
tells you what mode the app is currently in.
When
AppCompatDelegate.setDefaultNightMode(AppCompatDelegate.MODE_NIGHT_FOLLOW_SYSTEM)
is set and you change the system theme in Settings (on Android Q),
configuration.uiMode will reflect the change.
The same applies with
AppCompatDelegate.setDefaultNightMode(AppCompatDelegate.MODE_NIGHT_YES)
or
AppCompatDelegate.setDefaultNightMode(AppCompatDelegate.MODE_NIGHT_NO).
Note: the configuration.uiMode change will trigger a configuration change and may cause the activity to be recreated.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
themeSystem.setVisibility(View.VISIBLE);
} else {
themeSystem.setVisibility(View.GONE);
}
switch (sharePref.getTheme()) {
case THEME_LIGHT:
themeLight.setChecked(true);
break;
case THEME_DARK:
themeDark.setChecked(true);
break;
case THEME_SYSTEM:
themeSystem.setChecked(true);
break;
default:
switch (getResources().getConfiguration().uiMode & Configuration.UI_MODE_NIGHT_MASK) {
case Configuration.UI_MODE_NIGHT_NO:
themeLight.setChecked(true);
break;
case Configuration.UI_MODE_NIGHT_YES:
themeDark.setChecked(true);
break;
case Configuration.UI_MODE_NIGHT_UNDEFINED:
themeLight.setChecked(true);
break;
}
}
themeGroup.setOnCheckedChangeListener(new RadioGroup.OnCheckedChangeListener() {
@Override
public void onCheckedChanged(RadioGroup group, int checkedId) {
switch (checkedId) {
case R.id.themeLight:
setTheme(AppCompatDelegate.MODE_NIGHT_NO, THEME_LIGHT);
break;
case R.id.themeDark:
setTheme(AppCompatDelegate.MODE_NIGHT_YES, THEME_DARK);
break;
case R.id.themeSystem:
setTheme(AppCompatDelegate.MODE_NIGHT_FOLLOW_SYSTEM, THEME_SYSTEM);
break;
}
}
});
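The setTheme(int, int) helper called in the listener above isn't a framework method; a minimal sketch of what it could look like, assuming sharePref is the same preference wrapper used earlier to read the saved theme:
private void setTheme(int nightMode, int themePref) {
    // Persist the user's choice (sharePref.setTheme is assumed to mirror sharePref.getTheme()).
    sharePref.setTheme(themePref);
    // Apply the mode; this may trigger a configuration change and recreate the activity.
    AppCompatDelegate.setDefaultNightMode(nightMode);
}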
Similar to other ping questions here, we need to ping many IP addresses at the same time. We thought we had it running great after following many responses and examples from here; however, we are experiencing failed pings due to timeouts when we attempt to ping the devices at the same time, or even close together.
We tested our list of IPs in another ping-monitoring program and the timeouts do not happen there. The issue is specific to our application.
In our application, every "device" has a RouterProperties class that holds its name, IP address, etc. We ping from an observable collection of these devices, getting their IP addresses from their RouterProperties.
Tested separating pings by 10 ms / 20 ms using await Task.Delay.
Tested separating pings by 10 ms / 20 ms using a DispatcherTimer.
Tested pinging all at once with no delay between them, which caused the largest number of timeouts.
Tested converting the IP address strings to IPAddress objects before pinging; no visible change.
Tested with the ping timeout at 500 ms, 1000 ms, 5000 ms, etc.
The current test is for 143 devices, but the program needs to be able to handle more.
//OUR MAIN PING SCHEDULER
private async void PingAllDevices(object sender, EventArgs e)
{
var allPingTasks = new List<Task>();
int numOfDevices = 1;
//Assign a task to each device.
foreach (RouterProperties device in devices)
{
await Task.Delay(10);
Console.WriteLine("Pinging device #" + numOfDevices + " : " + device.RouterIP);
numOfDevices++;
allPingTasks.Add(AsyncPingDevice(device));
}
//Block here for all created tasks.
await Task.WhenAll(allPingTasks);
}
//OUR PING TASK
async Task AsyncPingDevice(RouterProperties device)
{
// Get device IP address to ping.
string deviceIP = device.RouterIP;
try
{
Ping pingSender = new Ping();
PingOptions options = new PingOptions();
byte[] buffer = new byte[32]; // payload for the echo request
var reply = await pingSender.SendPingAsync(deviceIP, int.Parse(Properties.Settings.Default.PingTimeout), buffer, options);
Console.WriteLine(deviceIP + " has responded.");
if (reply.Status == IPStatus.Success)
{
device.PingCounter = 0;
}
else
{
if (device.PingCounter <= 3)
{
device.PingCounter++;
}
}
}
catch
{
if (device.PingCounter <= 3)
{
device.PingCounter++;
}
}
await Dispatcher.BeginInvoke(DispatcherPriority.Normal, new Action( () =>
{
if (device.IsDeactivated != true)
{
switch (device.PingCounter)
{
case 0:
device.Color = "#FF8AA587";
break;
case 1:
device.Color = "#FF9AB999";
break;
case 2:
device.Color = "#FFFCCEAA";
break;
case 3:
device.Color = "#FFF4837D";
break;
case 4:
device.Color = "#FFEB4960";
break;
}
}
}));
}
When we run our code, with and without the Task.Delay inside the ping scheduler to separate the pings, we end up with inconsistent results from the devices being pinged.
Without the delay = 75% of all devices time out.
With the delay = an inconsistently random 10% of devices time out per cycle.
A similar ping program to ours = 100% consistent results without any timeouts.
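We are also considering capping the number of in-flight pings with a SemaphoreSlim instead of inserting a fixed delay between launches; a rough sketch of that idea (the limit of 20 concurrent pings is an arbitrary placeholder):
private static readonly SemaphoreSlim pingThrottle = new SemaphoreSlim(20);

private async Task ThrottledPingDevice(RouterProperties device)
{
    // Wait for a free slot so that at most 20 pings are outstanding at once.
    await pingThrottle.WaitAsync();
    try
    {
        await AsyncPingDevice(device);
    }
    finally
    {
        pingThrottle.Release();
    }
}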
I am trying to use the includeAllBurstAssets option of PHFetchOptions.
In my camera roll, there are about 5 burst assets (each with about 10 photos).
To enumerate the assets in the Camera Roll I'm doing the following:
PHFetchOptions *fetchOptions = [PHFetchOptions new];
fetchOptions.includeAllBurstAssets = YES;
PHFetchResult *fetchResult = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary options:fetchOptions];
PHFetchResult *assetFetch = [PHAsset fetchAssetsInAssetCollection:fetchResult[0] options:fetchOptions];
NSLog(@"Found assets %lu",(unsigned long)assetFetch.count);
No matter whether I set the includeAllBurstAssets property to NO or YES, I get the exact same count of assets.
I expected the number to be higher if includeAllBurstAssets is set to YES.
Is this a bug, or am I interpreting includeAllBurstAssets the wrong way?
There is a special method to query for all images of a burst sequence.
[PHAsset fetchAssetsWithBurstIdentifier:options:]
In your case you need to iterate over your assetFetch object and check if the PHAsset represents a burst.
PHAsset defines the property BOOL representsBurst
If that returns YES, fetch all assets for that burst sequence.
Here is a code snippet that might help illustrate this:
if (asset.representsBurst) {
PHFetchOptions *fetchOptions = [PHFetchOptions new];
fetchOptions.includeAllBurstAssets = YES;
PHFetchResult *burstSequence = [PHAsset fetchAssetsWithBurstIdentifier:asset.burstIdentifier options:fetchOptions];
PHAsset *preferredAsset = nil;
for (PHAsset *asset_ in burstSequence) {
if (PHAssetBurstSelectionTypeUserPick == asset_.burstSelectionTypes || PHAssetBurstSelectionTypeAutoPick == asset_.burstSelectionTypes) {
asset = preferredAsset = asset_;
}
}
if (!preferredAsset) {
asset = burstSequence.firstObject;
}
}
As you can see, the burstSelectionTypes are not always set; sometimes they are PHAssetBurstSelectionTypeNone for all assets of a burst sequence.
I am new to DirectShow. I, like many others, am trying to create a socket-based P2P streaming solution for a WPF-based card game. I want each player to be able to see each other via small video windows.
My questions are two-fold. The first is: how do I lower the frame rate and resolution? I believe 320x200 at 15 to 20 fps should be fine. I am using the SampleGrabber callback to grab frame data and send it over the socket, which is actually working with no compression at 640x480.
My second problem is that, since each uncompressed frame is 921,600 bytes, this really bogs down and I get very slow rendering even across my local Wi-Fi LAN. I added simple MJPEG compression (wanting to switch to H.264 later) and the frame size dropped to around 330-360 KB. Not a bad improvement.
My second question is: on the receiving end, do I need to create a custom DirectShow source pin in order to serve up the bytes received from the socket so I can attach a decoder and render them in a window?
I just wanted to ask this first since it seems like a lot of work to create a new COM object (haven't done that in about 15 years!), register it, and use/debug it.
Is there perhaps another way?
Also, if that is the way to go, should I use a SampleGrabber on the receiving end and create a BitmapSource from the decompressed bytes, or should I let DirectShow create a child window? The thing is, I want to have more than one other player, and I set an extra byte in the socket data to indicate which table position each stream belongs to. How do I render each position in turn?
For those that are interested, here is how you set the resolution and add an encoder/compressor:
// Create a graph builder
int hr = captureGraphBuilder.SetFiltergraph(graphBuilder);
// Find a capture device (WebCam) and attach it to the graph
sourceFilter = FindCaptureDevice();
hr = graphBuilder.AddFilter(sourceFilter, "Video Capture");
// Get the source output Pin
IPin sourcePin = DsFindPin.ByDirection((IBaseFilter)sourceFilter, PinDirection.Output, 0);
IAMStreamConfig sc = (IAMStreamConfig)sourcePin;
int count;
int size;
sc.GetNumberOfCapabilities(out count, out size);
VideoInfoHeader v;
AMMediaType media2 = null;
IntPtr memPtr = Marshal.AllocCoTaskMem(size);
for (int i = 0; i < count; ++i)
{
sc.GetStreamCaps(i, out media2, memPtr);
v = (VideoInfoHeader)Marshal.PtrToStructure(media2.formatPtr, typeof(VideoInfoHeader));
// Break when width is 160
if (v.BmiHeader.Width == 160)
break;
}
// Set the new media format to 160 x 120
hr = sc.SetFormat(media2);
Marshal.FreeCoTaskMem(memPtr);
DsUtils.FreeAMMediaType(media2);
// Create a SampleGrabber
IBaseFilter grabberF = (IBaseFilter)new SampleGrabber();
ISampleGrabber grabber = (ISampleGrabber)grabberF;
// Set the media type
var media = new AMMediaType
{
majorType = MediaType.Video,
subType = MediaSubType.MJPG
//subType = MediaSubType.RGB24
};
// The media sub type will be MJPG
hr = grabber.SetMediaType(media);
DsUtils.FreeAMMediaType(media);
hr = grabber.SetCallback(this, 1);
hr = graphBuilder.AddFilter(grabberF, "Sample Grabber");
IPin grabberPin = DsFindPin.ByDirection(grabberF, PinDirection.Input, 0);
// Get the MPEG compressor
Guid iid = typeof(IBaseFilter).GUID;
object compressor = null;
foreach (DsDevice device in DsDevice.GetDevicesOfCat(FilterCategory.VideoCompressorCategory))//.MediaEncoderCategory))
{
if (device.Name == "MJPEG Compressor")
{
device.Mon.BindToObject(null, null, ref iid, out compressor);
hr = graphBuilder.AddFilter((IBaseFilter)compressor, "Compressor");
break;
}
string name = device.Name;
}
// This also works!
//IBaseFilter enc = (IBaseFilter)new MJPGEnc();
//graphBuilder.AddFilter(enc, "MJPEG Encoder");
// Get the input and out pins of the compressor
IBaseFilter enc = (IBaseFilter)compressor;
IPin encPinIn = DsFindPin.ByDirection(enc, PinDirection.Input, 0);
IPin encPinOut = DsFindPin.ByDirection(enc, PinDirection.Output, 0);
// Attach the pins: source to input, output to grabber
hr = graphBuilder.Connect(sourcePin, encPinIn);
hr = graphBuilder.Connect(encPinOut, grabberPin);
// Free the pin resources
Marshal.ReleaseComObject(sourcePin);
Marshal.ReleaseComObject(enc);
Marshal.ReleaseComObject(encPinIn);
Marshal.ReleaseComObject(encPinOut);
Marshal.ReleaseComObject(grabberPin);
// Create a render stream
hr = captureGraphBuilder.RenderStream(PinCategory.Preview, MediaType.Video, sourceFilter, null, grabberF);
Marshal.ReleaseComObject(sourceFilter);
Configure(grabber);
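The grabber.SetCallback(this, 1) call above implies the hosting class implements ISampleGrabberCB with the buffer callback selected (the 1). A minimal sketch of that callback, with the socket send left as a hypothetical SendFrame helper:
public int SampleCB(double sampleTime, IMediaSample pSample)
{
    return 0; // unused; SetCallback(..., 1) routes samples to BufferCB instead
}

public int BufferCB(double sampleTime, IntPtr pBuffer, int bufferLen)
{
    // Copy the compressed frame out of DirectShow's buffer...
    byte[] frame = new byte[bufferLen];
    Marshal.Copy(pBuffer, frame, 0, bufferLen);
    // ...and hand it to the socket layer (SendFrame is this app's own method).
    SendFrame(frame);
    return 0;
}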
I'm working on a cross-platform mobile app using PhoneGap and I need to retrieve the IMSI code.
Here's the question: Is there any way to do this via PhoneGap?
I appreciate your comments.
Hi. For Android we can get this via native code; for PhoneGap we need to write a plugin. The Java code is given below:
public String findDeviceID() {
    String deviceID = null;
    String serviceName = Context.TELEPHONY_SERVICE;
    TelephonyManager m_telephonyManager = (TelephonyManager) getSystemService(serviceName);
    int deviceType = m_telephonyManager.getPhoneType();
    switch (deviceType) {
        case (TelephonyManager.PHONE_TYPE_GSM):
            // GSM phone: getDeviceId() returns the IMEI
            break;
        case (TelephonyManager.PHONE_TYPE_CDMA):
            // CDMA phone: getDeviceId() returns the MEID/ESN
            break;
        case (TelephonyManager.PHONE_TYPE_NONE):
            // No telephony radio; getDeviceId() may return null
            break;
        default:
            break;
    }
    // Requires the READ_PHONE_STATE permission.
    deviceID = m_telephonyManager.getDeviceId();
    return deviceID;
}
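Note that getDeviceId() returns the IMEI/MEID, not the IMSI. If the IMSI itself is what is needed, TelephonyManager.getSubscriberId() should return it (it also requires the READ_PHONE_STATE permission); a minimal sketch:
public String findIMSI() {
    TelephonyManager tm = (TelephonyManager) getSystemService(Context.TELEPHONY_SERVICE);
    // getSubscriberId() returns the IMSI on GSM devices; requires READ_PHONE_STATE.
    return tm.getSubscriberId();
}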
For creating a PhoneGap plugin, check this: http://docs.phonegap.com/en/edge/guide_platforms_android_plugin.md.html#Android%20Plugins