The resolution of the Samsung Galaxy Note II is 1280x720, but I don't know what its devicePixelRatio or its size in device-independent pixels is.
Some say it is 320 dp wide, and others say 360 dp. What is the exact value?
This is my first answer; I hope it gives you what you're looking for.
By the Wikipedia definition (calculation of monitor PPI):

Dp = sqrt(Wp*Wp + Hp*Hp)
PPI = Dp / Di

where

Dp is the diagonal resolution in pixels
Wp is the width resolution in pixels (720 px on the Note 2)
Hp is the height resolution in pixels (1280 px on the Note 2)
Di is the diagonal size in inches (the number advertised as the size of the display; 5.55 in on the Note 2)
So you get Dp = sqrt(720*720 + 1280*1280) ≈ 1468.60.
Your PPI (not always the same as DPI, but in this situation you can treat them as the same) is PPI = 1468.60 / 5.55 ≈ 264 (physically), which would place the device around the hdpi bucket...
The ratio is then 264/160 = 1.65, so the device-independent width is 720/1.65 ≈ 436.36 dip (and the height ≈ 775.76 dip).
You mentioned that some say the dips are 320px; I think they mean the DPI is 320 or 360... In the formula above, the physical DPI is 264.
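As a sanity check, here is a minimal sketch of the arithmetic above (in C#, for consistency with the rest of this page; the pixel and inch values are the Note 2 figures from the question, and 160 is Android's mdpi baseline density):

using System;

double wp = 720, hp = 1280;   // Note 2 resolution in pixels
double di = 5.55;             // advertised diagonal size in inches
double baseline = 160;        // Android baseline density (mdpi)

double diagPx = Math.Sqrt(wp * wp + hp * hp);  // diagonal in pixels, ≈ 1468.60
double ppi = diagPx / di;                      // physical PPI, ≈ 264
double ratio = ppi / baseline;                 // scale factor, ≈ 1.65

Console.WriteLine($"PPI ≈ {ppi:F0}, width ≈ {wp / ratio:F2} dip, height ≈ {hp / ratio:F2} dip");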
What I saw on XDA-Developers said:
DPI is the pixel density of your screen that can be soft-modded to make your display "look" different. There are many apps that can actually change your DPI. Note 2's DPI is 320 by default. 240 DPI is the best one IMO for this type of screen size.
Hope this answer can help you...
I have a device with these physical characteristics:
Display:
8.4 in.: 21.3 cm; 720 nits
10.1 in.: 25.7 cm; 540 nits
Supports up to 2560x1600;
Corning Gorilla Glass; daylight viewable
Dimensions:
8.4 in. tablet: 9 in. W x 5.9 in. H x 0.5 in. D/
228 mm W x 150 mm H x 12.7 mm D
10.1 in. tablet: 10.6 in. W x 7.1 in. H x 0.5 in. D/
269 mm W x 181 mm H x 12.7 mm D
I want to set MinHeight and MinWidth to the actual height and width of the screen of this device.
What are the correct MinHeight and MinWidth? I don't know how to calculate them.
Determining the WPF width and height of the display requires knowing the DPI of the display. The actual DPI is one thing; the DPI that WPF believes the display has is another.
Taking the 8.4 tablet as an example:
Unit     Width    Height
mm       228      150
inch     ~8.98    ~5.91
pixels   2560     1600
DPI      ~285     ~271
You convert mm to inches by dividing by 25.4, and you get DPI by dividing the pixels by the inches.
Most likely the pixels are square, so the horizontal and vertical DPI are the same. The height in mm is probably greater than the actual display area, which means it includes a bezel, and the same is probably true for the width.
So a guess is that the "logical" DPI of the display is 288. This happens to be three times 96 and WPF units are 1/96 inch.
If this is true you have to divide the actual pixels (2560 and 1600) by 3 to determine the WPF units to use:
MinWidth = 2560.0 / 3;  // ≈ 853.33 device-independent units (use a double to avoid integer division)
MinHeight = 1600.0 / 3; // ≈ 533.33 device-independent units
However, as you can see there is some guessing involved in this. What if WPF sees the display as having 96 DPI and not 288 DPI? Then you should not divide by 3 but instead by 1 which is the same as not dividing. The best way to figure this out is to actually use WPF with the display.
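To take some of the guessing out, here is a minimal sketch of that last suggestion: ask WPF at runtime what scale factor it applies to this window, then convert the physical pixel counts. This assumes .NET Framework 4.6.2 or later (for VisualTreeHelper.GetDpi) and reuses the 2560x1600 figures from the question:

using System.Windows;
using System.Windows.Media;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        Loaded += (s, e) =>
        {
            // Scale factors relative to 96 DPI, as WPF sees this window.
            DpiScale dpi = VisualTreeHelper.GetDpi(this);
            const double pixelWidth = 2560, pixelHeight = 1600; // physical pixels
            // Convert physical pixels to device-independent units.
            MinWidth = pixelWidth / dpi.DpiScaleX;
            MinHeight = pixelHeight / dpi.DpiScaleY;
        };
    }
}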
You could go through all the trouble of getting the screen height in physical pixels, converting to WPF's "device-independent pixels" unit, accounting for Taskbar size and placement, accounting for reserved space at the borders of the screen, etc.
You could do all that, or you could just use the very simple method from this answer:
Just set the WindowState to Maximized, and the WindowStyle to None.
This totally fills the entire screen, but leaves all the above details to Windows to figure out.
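For what it's worth, a minimal sketch of that approach in code (the same two properties can equally be set in XAML):

// A borderless, maximized window: Windows figures out all the sizing details.
var window = new System.Windows.Window
{
    WindowState = System.Windows.WindowState.Maximized, // fill the screen
    WindowStyle = System.Windows.WindowStyle.None       // no title bar or borders
};
window.Show();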
I set my screen to 72 DPI and designed a small PNG image that is 100x100 at 72 DPI, which means 72 pixels represent 1 inch.
Next I changed my screen to 120 DPI and designed the same PNG image at 100x100 at 120 DPI, which means 120 pixels represent 1 inch.
Then I changed my screen to 96 DPI, which means 96 pixels represent 1 inch.
Then I created a WPF application and added the two images (from steps 1 and 2); I set the Stretch mode to None.
I am not sure I have understood the concept properly. I expected the 72 DPI image to look smaller at 96 DPI: if 72 pixels represented 1 inch for that image, then in the new configuration 96 pixels represent 1 inch. But that is not the case; in fact, it was just the opposite: the 72 DPI image looks bigger at 96 DPI. Why? Is it that WPF always defaults to 96 DPI when it comes to images?
Update
Why is it that, even at 120 DPI (with the system DPI set to 120), only the 96 DPI image fits a 200x200 box perfectly?
You are making some odd assumptions about the device-independent nature of WPF's graphical units.
A device-independent pixel in the WPF world is worth 1/96th of an inch regardless of the screen settings. This is why only the 96 DPI experiment is correct.
Secondly, your monitor's native DPI has an impact:
The second scale factor, the “DPI setting”, is what we will vary in our tests. WPF doesn’t independently know what your monitor’s actual physical DPI value is. Instead WPF uses the current setting of this second scale factor, the “DPI setting”. If the “DPI setting” does not match the true physical DPI, then WPF’s “resolution independence” will appear to break — although it really doesn’t.
http://www.wpflearningexperience.com/?p=41
An image that has 72 dots per inch (DPI) will have 72 dots per inch, whereas an image that has 120 DPI will have 120 dots per inch. Therefore, if we display an image that is an inch by an inch, each side will have 72 dots for the 72 DPI image and 120 dots for the 120 DPI image.
Therefore, each dot in the 72 DPI image is larger than each dot in the 120 image, so the whole 72 DPI image is larger than the 120 DPI image.
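To make the arithmetic concrete, here is a small sketch; it mirrors how WPF derives a bitmap's layout size from its pixel size and embedded DPI (96 is WPF's units-per-inch constant):

using System;

// Layout size in WPF units = pixel size * 96 / image DPI.
double LayoutSize(double pixels, double imageDpi) => pixels * 96.0 / imageDpi;

Console.WriteLine(LayoutSize(100, 72));  // ≈ 133.3 units: renders larger than 100x100
Console.WriteLine(LayoutSize(100, 96));  // 100 units: exactly 100x100
Console.WriteLine(LayoutSize(100, 120)); // 80 units: renders smaller than 100x100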
For further reading, you might like to view the DPI setting and resolution of WPF application page on the Mindfire Solutions website.
WPF documentation and tutorials state that WPF is resolution independent, which I understood to mean that a window appears at the same size under different resolutions (1600x1200, the native one, versus 1024x768) and/or DPI settings. However, when I tried a sample app at different resolutions, the sizes were different. On the net I found http://www.wpflearningexperience.com/?p=41, which uses the "native resolution" in order to see the same window size on different computers, but I could not understand the underlying concept.
Why is the native resolution vital for an LCD, and why is the term "resolution independence" used instead of "DPI independence"? Probably I do not know or use the terminology well, but I need a clarification in order to understand this issue.
I did not want to answer my own question, but I think I got the point. Sorry for this premature question; after a while I noticed the problem myself.
As far as I understand, WPF uses the System DPI (which you set through the Windows desktop settings) as a scale factor. For example, in the tutorial above one of the computers has a native resolution of 1600x1200 and 96 DPI (94 actually, as stated in the tutorial). Everything is fine because the System DPI (96) is quite close to the real DPI (94), and WPF can use this information to scale your window.
As you know, a device-independent unit is 1/96 inch, and with the numbers above the real pixel size (a physical pixel on your screen) is multiplied by (1/96) * 96, which is equal to 1. So if you have a window that is 300 DIU wide, you will see 300/96 ≈ 3.13 inches on your screen.
However, when you change your resolution without changing the System DPI, here comes the problem that confused me. Say you set the screen resolution to 1024x768 without changing the System DPI (still 96), run the application again, and see a bigger window. This is the result of a wrong System DPI and, naturally, a wrong scale factor for WPF. WPF does not know much about your real DPI; it only uses the information you give it, which is the System DPI.

Let's recalculate our window size with these new settings. First we need the scale factor, which is (1/96) * 96 = 1, so 1 logical pixel is still 1 physical pixel on screen. However, we changed the resolution, and 300 pixels is no longer the same length as at the previous resolution. Previously we had 1600 pixels across roughly 17 inches (treating the horizontal pixel count against the diagonal size as a rough approximation); now we have 1024 pixels for the same 17 inches. 300 pixels is almost 5 inches at the new resolution (1024x768), but only about 3.2 inches at the previous one (1600x1200). Therefore WPF cannot be resolution independent, due to the wrong DPI value, and draws a larger window at the new resolution.
So here is how I fixed the problem (in my words; I do not claim this is an absolute solution). When I change the resolution, I also change the DPI value to the one that is true for the new resolution. For example, my own monitor's diagonal is 17 inches; at a resolution of 1024x768 I use the formula 1024 pixels / 17 inches and find that my new true DPI is almost 60. I set the System DPI to 60 (through the desktop settings, of course) and it works. Not perfectly, due to rounding errors in the calculations, but in practice the values can be considered equal.
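A sketch of that estimate. Note it divides the horizontal pixel count by the diagonal inches, as the answer does; a stricter calculation would use the diagonal pixel count, as in the first answer on this page:

using System;

// Rough DPI estimate from the answer: horizontal pixels over diagonal inches.
double roughDpi = 1024.0 / 17.0;                                   // ≈ 60 DPI
// Stricter: diagonal pixels over diagonal inches.
double strictDpi = Math.Sqrt(1024.0 * 1024 + 768.0 * 768) / 17.0;  // ≈ 75 DPI
Console.WriteLine($"rough ≈ {roughDpi:F0}, strict ≈ {strictDpi:F0}");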
WPF uses the System DPI in order to be resolution independent, so you need to set the actual DPI (the real DPI value for the current resolution) to help WPF achieve that. Finally (and I think crucially), you have to do one more thing, at least on Windows XP: you need to restart for the new DPI setting to take effect. If you do not (as I initially did not), WPF still does the calculations with the old value and draws at a different size.
This is my understanding, along with the results of the tests I did, but I cannot claim it is absolute. I just want to share it with people who may find this information useful. Please comment on or edit my post if you find any errors or think it needs improvement.
What is the meaning of the statement below:
My system resolution is 1024 x 768 at 96 DPI
I am not able to understand the internal math: when we increase the DPI at a fixed resolution, a user interface developed in VC++/MFC or a C#/WinForms application expands (looks larger than it does at 96 DPI).
For example, we develop a user interface at 96 DPI, which means 96 dots per inch. When we increase the DPI, we are increasing the dots per inch, so the user interface should look compressed instead of enlarged.
I am doing this on a Windows 7 machine.
Please help!
My system resolution is 1024 x 768 at 96 DPI
This means that your computer thinks that your monitor has 96 dots (pixels) per inch (at this resolution). When a program does graphical calculations, it uses this setting to convert between real lengths (in inches or centimetres) and pixels.
This will work out correctly if the 96 DPI setting matches your monitor (i.e. the display area is 1024/96=10.67 by 768/96=8 inches).
Why do things get larger when you increase this setting? Let's say we want to make a button 1 inch high, and your monitor's real DPI is 96, but you have set it to 150. One inch times 150 dots per inch gives us 150 pixels, so we will draw our button 150 pixels high. But our monitor's real DPI is 96, so this appears as 150 pixels / 96 dpi = 1.56 inches high.
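A tiny sketch of that arithmetic (the 150 and 96 values are the ones from the example above):

using System;

double intendedInches = 1.0;   // how tall we want the button
double configuredDpi = 150.0;  // what Windows is told the monitor is
double actualDpi = 96.0;       // what the monitor really is

double pixelsDrawn = intendedInches * configuredDpi;  // 150 px are drawn
double apparentInches = pixelsDrawn / actualDpi;      // ≈ 1.56 in on screen
Console.WriteLine($"{pixelsDrawn} px appears as {apparentInches:F2} in");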
there is no "DPI"-setting for your monitor, this device only knows about pixels. DPI = either printers or preformatted documents which need to be viewed in special devices. You CAN, however, calculate how many pixels would be needed to display something with the physical size (hence DPI) of X ... which is a rather unprecise calculation, by the way.
If you're calculating physical sizes you're either developing computer-games, writing your own printing-driver or need to fulfill extraordinary project-tasks
Does anyone know what measurement units are used by Silverlight/WPF? For example, if I create a new button and set its height to 150, is that 150 pixels? Points? Millimetres?
I design all of my applications in Adobe Illustrator before proceeding to code, and although I try and set everything to the dimensions in my Illustrator file, the Silverlight application is usually larger.
Although in theory, 1 unit in WPF is 1/96th of an inch, that's frequently not the case in practice.
It's usually true when printing. But it's rarely true on screen. The reason for this is that Windows almost always knows the true resolution of a printer, but almost never knows the true resolution of a screen.
For example, I have three screens attached to my computer. Windows thinks that they all have a resolution of 96 pixels per inch. Actually they don't. Two of them have a resolution of 101 pixels per inch, and one has a resolution of 94 pixels per inch. (Why? Because Windows has no way of working the true resolutions out for itself, and I haven't told it. The fiction that they all have the same pixel size is close to the truth, and turns out to be a convenient fiction.)
So when I create, say, a Rectangle in WPF with Width and Height both set to 96, the size of the Rectangle actually depends on which screen it appears on. Windows thinks that all 3 screens have a resolution of 96 pixels per inch, and so it'll render the rectangle as being 96 pixels tall and wide no matter which screen it appears on. That'll make it appear 0.95 inches tall on two of the screens, and 1.02 inches tall on the third.
So in practice, that means that units in WPF on my computer here are either 1/101st of an inch or 1/94th of an inch. (I.e., in practice, the size of 1 unit in WPF is exactly the size of 1 pixel on my particular setup, no matter how big the pixels happen to be.)
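A small sketch of that relationship, using the 101 and 94 pixels-per-inch figures quoted above as the assumed real densities:

using System;

// WPF units -> device pixels (via the DPI Windows reports) -> physical inches
// (via the screen's real pixel density).
double ApparentInches(double wpfUnits, double reportedDpi, double realPpi)
    => wpfUnits * (reportedDpi / 96.0) / realPpi;

Console.WriteLine(ApparentInches(96, 96, 101)); // ≈ 0.95 in on the 101 ppi screens
Console.WriteLine(ApparentInches(96, 96, 94));  // ≈ 1.02 in on the 94 ppi screen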
I could change that. I could reconfigure Windows - I could tell it the actual resolution of all 3 screens, in which case the nominal and actual WPF unit sizes would coincide. Or I could lie - I could claim that I have 200 pixel per inch screens, in which case everything would be massive...
The basic problem here is that there is no standard way for the computer to discover the true size of the physical pixels on the screen, and very few people bother to set it up by hand. (And in fact you can cause problems by configuring it 'correctly', because a lot of software doesn't behave correctly when you do.) So the majority of Windows computers don't report physical pixel sizes correctly to WPF - they can't because they don't know.
Consequently, there's no reliable answer to the question - 1 unit in WPF could be pretty much anything on screen. (In practice, most of the time, it turns out to be 1 pixel, simply because if you don't tell Windows anything else, it defaults to assuming that your screens have pixels that are 1/96th of an inch tall, which is the same as 1 WPF unit. And for most desktop screens, that's actually quite likely to be a good guess. But this isn't universal. On systems configured with what used to be called 'large fonts', for example, you'll find a different nominal screen resolution, and 1 WPF unit will correspond to slightly more than 1 physical pixel - 1.25, in fact, at the 120 DPI that 'large fonts' implies.)
With printers, it's all much more predictable. Printers are invariably able to report their resolutions correctly. So if you print something that's 96 WPF units high, you can be confident that it will be 1 inch high.
MSDN's documentation states that the FrameworkElement.Height property (for Silverlight) refers to:
The height, in pixels, of the object
However, for WPF it refers to:
a device-independent unit (1/96th inch) measurement
So, to answer your question... pixels for Silverlight, device-independent units for WPF.
The documentation refers to pixels; however, these are pixels at a notional 96 pixels per inch. A line of Width 96, when displayed on a 120 DPI display, will be 120 actual device pixels. Similarly, such a line drawn on printer output at 600 DPI will be 600 pixels long.
They are Device Independent Units.
You can find more detailed explanations here.