Graphics.DrawString with high resolution bitmaps == LARGE TEXT - vb.net

I have an app that creates a large bitmap and later the user can add some labels. Everything works great as long as the base bitmap is the default 96x96 resolution. If I bump it up to 300 for instance, then the text applied with Graphics.DrawString is much too large - a petite size 8 or 10 font displays like it is 20.
On the one hand, it makes sense given the resolution increase, but on the other, you'd think the Fonts would scale. MeasureString returns a larger size when measured on a 300 vs 96 dpi bitmap, which wasn't really what I expected.
I've tried tricking it by creating a small bitmap of the appropriate size, printing to it, then pasting that onto the master image. But when pasted onto the high-res image, the pasted bitmap gets enlarged too.
The only other thing I can think of is to create a high res temp bitmap, print to it, then shrink it before pasting to the main image. That seems like a long way to go. Is there a compositing or overlay type setting that allows this? Are font sizes only true for a 96 dpi canvas?
Thanks for any hints/advice!

The size of a font is expressed in points, and a point is a physical unit: 1/72 inch. So if you draw into a bitmap that has 300 dots per inch, then your font is going to use a lot more dots for the requested number of points. Display that bitmap on a 300 dpi device and you'll get back the physical size you asked for.
Problem is, you are not displaying it on a 300 dpi device, you are displaying it on a 96 dpi device. So it looks much bigger.
Clearly you don't really want a 300 dpi bitmap. Or you want to draw it three times smaller. Take your pick.
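If the 300 dpi bitmap itself is a requirement and you take the second option, one way to "draw it smaller" is to scale the requested point size by 96 divided by the bitmap's dpi. A minimal sketch, assuming a Bitmap named bmp whose resolution has been set to 300 dpi; the font name, size and position are placeholders:

```vb
' Scale the requested point size so the glyphs cover the same number of
' pixels they would on a 96 dpi canvas.
Using g As Graphics = Graphics.FromImage(bmp)
    Dim requestedPoints As Single = 10.0F
    Dim scaledPoints As Single = requestedPoints * 96.0F / bmp.HorizontalResolution
    Using f As New Font("Arial", scaledPoints)   ' still sized in points
        g.DrawString("Label text", f, Brushes.Black, 50, 50)
    End Using
End Using
```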

If you want a consistent size in pixels, specify GraphicsUnit.Pixel (UnitPixel) when creating your Font object.
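For instance, a minimal sketch (the font name, size and coordinates are placeholders; g is assumed to be the Graphics for the high-resolution bitmap):

```vb
' A font sized in pixels stays the same number of pixels tall no matter
' what DPI the target bitmap reports.
Using f As New Font("Arial", 13.0F, FontStyle.Regular, GraphicsUnit.Pixel)
    g.DrawString("Label text", f, Brushes.Black, 50, 50)
End Using
```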

Related

Re-sizing visual image while maintaining image dimensions

I'm working with documents, so maintaining the original image dimensions and the resulting dpi is important.
The aspect ratio is always maintained, so the automatic fill modes and the like don't seem to have any effect.
Say I have a 300 dpi document and the user wants to clear an inch border around the image. So I need an inch cropped from the image, but the result needs to keep the original image dimensions (2550x3300).
I have been able to achieve this effect with...
...&crop=300,300,-300,-300&margin=300,300,300,300
This works, but seems more than a little clunky. I've tried a lot of other combinations but they all seem to enlarge or reduce the image size which is undesirable in my case.
So does someone know a simpler syntax to achieve the desired result, or do I need to resize the image and then calculate and fill a margin as I'm doing now?
Thanks
It turns out that my example requests the image at its full size, which is a special case. When I introduce a width or height into the command, things don't work very well, since the crop size is relative to the original image dimensions while the margin size is relative to the result image.
Thinking about it more, I abandoned the crop approach. What I really needed was a way to introduce a clipping region into the result bitmap, so I built an extension to do just that. It works well because it doesn't interfere with any of Resizer's layout calculations, and the size of the returned image is whatever width or height was specified, which is just what I needed. The Faces plugin has an example of introducing a clipping region.
Karlton
Cropping and re-adding 300px on each edge is best accomplished exactly the way you're doing it:
&crop=300,300,-300,-300&margin=300
What kind of improved syntax would you expect? This isn't a common operation.

Is there any way to optimize image upload?

I have tried the following methods:
normal image upload
encoding and decoding
Both of these take a long time to upload the image.
Any suggestions?
There are some simple ways:
Reduce the dimensions of the image, e.g. from 1000x1000 to 500x500.
Reduce the bits per pixel. For example, instead of an RGBA representation (32 bits per pixel), use RGB_565 (16 bits per pixel) or even a grayscale image (8 bits).
Reduce the quality of the image by saving it as a .jpg. This makes the image much smaller, and you can play with the JPEG quality parameter: 100% means very high quality and large files, while 1% gives extremely tiny files (~40 times smaller) at the cost of most of the detail. (A sketch combining this with the resize step follows this list.)
Save the image in JPEG 2000 format, which reduces the size even further. Not every browser supports this format, though, so you might need to convert it back to regular JPEG.
Use an image pyramid. For example, starting from a 1000x1000 image, reduce its size by 2 to get 500x500, then reduce again and again until you have four images: 1000x1000, 500x500, 250x250, and 125x125. Upload all four, starting from the smallest. The smallest image uploads very quickly and can be displayed right away (though at lower resolution); each time a larger image arrives, you update the display and the resolution improves. The effect is that a basic image appears almost immediately and sharpens over time. Transferring all four images takes only about 30% more time than the original alone, but the first one arrives roughly 64 times faster.
These are the basic solutions. If they are not what you need, please refine the question.
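For reference, here is a minimal sketch of the first and third suggestions using System.Drawing in VB.NET (the file names and the 60% quality value are placeholders, not from the question):

```vb
Imports System.Drawing
Imports System.Drawing.Imaging
Imports System.Linq

Module ShrinkForUpload
    Sub Main()
        ' Halve the dimensions, then save as JPEG at roughly 60% quality.
        Using original As New Bitmap("input.png")
            Using smaller As New Bitmap(original, original.Width \ 2, original.Height \ 2)
                Dim jpegCodec As ImageCodecInfo =
                    ImageCodecInfo.GetImageEncoders().
                        First(Function(c) c.FormatID = ImageFormat.Jpeg.Guid)
                Using quality As New EncoderParameters(1)
                    quality.Param(0) = New EncoderParameter(Encoder.Quality, 60L)
                    smaller.Save("upload.jpg", jpegCodec, quality)
                End Using
            End Using
        End Using
    End Sub
End Module
```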

What exactly is UIFont's point size?

I am struggling to understand exactly what the point size in UIFont means. It's not pixels, and it doesn't appear to be the standard definition of a point, which is 1/72 of an inch.
I worked out the pixel size using -[NSString sizeWithFont:] of fonts at various sizes and got the following:
| Point Size | Pixel Size |
| ---------- | ---------- |
| 10.0 | 13.0 |
| 20.0 | 24.0 |
| 30.0 | 36.0 |
| 40.0 | 47.0 |
| 50.0 | 59.0 |
| 72.0 | 84.0 |
| 99.0 | 115.0 |
| 100.0 | 116.0 |
(I did [#"A" sizeWithFont:[UIFont systemFontOfSize:theSize]])
And looking at the 72.0 point size, that is not 1-inch since this is on a device with a DPI of 163, so 1-inch would be 163.0 pixels, right?
Can anyone explain what a "point" in UIFont terms is, then? I.e., is my method above wrong, and if I measured some other way would I find that the font really is 163 pixels tall at 72 points? Or is a point defined relative to something else entirely?
A font has an internal coordinate system; think of it as a unit square, within which a glyph's vector coordinates are specified at whatever arbitrary size accommodates all the glyphs in the font, plus or minus any margin the font designer chooses.
At 72.0 points the font's unit square is one inch. Glyph x of font y has an arbitrary size in relation to this inch square. Thus a font designer can make a font that appears large or small in relation to other fonts. This is part of the font's 'character'.
So, drawing an 'A' at 72 points tells you that it will be twice as high as an 'A' drawn at 36 points in the same font - and absolutely nothing else about what the actual bitmap size will be.
I.e., for a given font, the only way to determine the relationship between point size and pixels is to measure it.
I am not sure how -[NSString sizeWithFont:] measures the height. Does it use line height or the difference between the peaks of the beziers? What text did you use?
I believe -[UIFont lineHeight] would be better to measure the height.
Edit:
Also, note that none of the measurement methods returns the size in pixels. It returns the size in points. You have to multiply the result by [UIScreen mainScreen].scale.
Note the difference between typographic points used when constructing the font and points from iOS default logical coordinate space. Unfortunately, the difference is not explained very clearly in the documentation.
I agree this is very confusing. I'm trying to give you some basic explanation here to make the things clearer.
First, the DPI (dots-per-inch) measure comes from printing on physical paper, and so do fonts. The point was invented to describe the physical printed size of text, because the inch is too large a unit for usual text sizes; one point is 1/72 inch (a value that actually evolved over history). So yes, if you are writing a document in Word or other word-processing software for printing, you will get text exactly one inch high if you use a 72 pt font.
Second, the theoretical text height is usually different from the rendered strokes you can actually see with your eyes. The original text-height idea came from the physical type used for printing: all letters were cast on blocks that share the same height, and that block height is what the point size measures. Depending on the letters and the font design, the actual visible part of the text may be a little bit shorter than the theoretical height. Helvetica Neue is actually very standard: if you measure from the top of a letter "k" to the bottom of a letter "p", it will match the font height.
Third, computer displays screwed up DPI, and the definition of the point along with it. The resolution of a computer display is described by its native pixels, such as 1024 x 768 or 1920 x 1080. Software doesn't really care about the physical size of your monitor, because everything would be very fuzzy if it scaled screen content the way printing on paper does; the physical resolution just isn't high enough to keep everything smooth and legible. So software takes a dead-simple approach: a fixed DPI for whatever monitor you use. On Windows it's 96 DPI; on Mac it's 72 DPI. That is to say, no matter how many pixels actually make up an inch on your monitor, software ignores it. When the operating system renders text at 72 pt, it is always 96 px high on Windows and 72 px high on Mac. (That's why Microsoft Word documents always look smaller on a Mac and you usually need to zoom to 125%.)
Finally, iOS is very similar. No matter whether it's an iPhone, iPod touch, iPad or Apple Watch, iOS uses a fixed 72 DPI for non-retina screens, 144 DPI for @2x retina displays, and 216 DPI for the @3x retina display used on the iPhone 6 Plus.
Forget about the real inch. It only exists in actual printing, not on displays. For software drawing text on your screen, a point is just an artificial ratio to physical pixels.
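Put as a formula, the model described above is simply pixels = points * logicalDpi / 72. A tiny sketch (written in VB.NET only to match the language of the first question; the DPI values are the ones quoted in this answer):

```vb
' Points are 1/72 inch; the platform pretends the screen has logicalDpi pixels per inch.
Function PointsToPixels(points As Double, logicalDpi As Double) As Double
    Return points * logicalDpi / 72.0
End Function

' 72 pt text:  Windows (96 dpi logical)  -> PointsToPixels(72, 96)  = 96 px
'              Mac / iOS @1x (72 dpi)    -> PointsToPixels(72, 72)  = 72 px
'              iOS @2x (144 dpi)         -> PointsToPixels(72, 144) = 144 px
'              iOS @3x (216 dpi)         -> PointsToPixels(72, 216) = 216 px
```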
I first wondered if this had something to do with the way CSS pixels are defined at 96 per "inch" while UI layout points are defined at 72 per "inch". (Where, of course, an "inch" has nothing to do with a physical inch.) Why would web standards factor into UIKit business? Well, you may note when examining stack traces in the debugger or crash reports that there's some WebKit code underlying a lot of UIKit, even when you're not using UIWebView. Actually, though, it's simpler than that.
First, the font size is measured from the lowest descender to the highest ascender in regular Latin text -- e.g. from the bottom of the "j" to the top of the "k", or for convenient measure in a single character, the height of "ƒ". (That's U+0192 "LATIN SMALL LETTER F WITH HOOK", easily typed with option-F on a US Mac keyboard. People used it to abbreviate "folder" way back when.) You'll notice that when measured with that scheme, the height in pixels (on a 1x display) matches the specified font size -- e.g. with [UIFont systemFontOfSize:14], "ƒ" will be 14 pixels tall. (Measuring the capital "A" only accounts for an arbitrary portion of the space measured in the font size. This portion may change at smaller font sizes; when rendering font vectors to pixels, "hinting" modifies the results to produce more legible onscreen text.)
However, fonts contain all sorts of glyphs that don't fit into the space defined by that metric. There are letters with diacritics above an ascender in eastern European languages, and all kinds of punctuation marks and special characters that fit in a "layout box" much larger. (See the Math Symbols section in Mac OS X's Special Characters window for plenty of examples.)
In the CGSize returned by -[NSString sizeWithFont:], the width accounts for the specific characters in the string, but the height only reflects the number of lines. Line height is a metric specified by the font, and related to the "layout box" encompassing the font's largest characters.
The truth, as far as I have been able to ascertain, is that UIFont lies. All of UIKit takes liberties with fonts. If you want the truth you need to use CoreText, but in a lot of cases it will be slower! (So in the case of your pixel-height table, I think UIKit is applying some sort of a + b*x factor, where x is the point size.)
So why does it do this? Speed! UIKit rounds stuff up and fiddles with spacing so that it can cache bitmaps. Or at least that was my takeaway!

NSImageRep wrong resolution?

Mac OS X 10.7.4 comes with new icons that have image reps at 144 DPI. The bad thing is that when I load one of these icons into an NSImage, I only get reps with a size of 512px. That is: I load a 1024px/144dpi .icns file into an NSImage and then ask every image rep for its size, and no rep reports a size of 1024px; I only get sizes up to 512px (no matter whether a rep has a resolution of 72dpi or 144dpi; in fact the new icons in 10.7.4, like TextEdit or Automator, have reps at both resolutions for each size except 1024px, which exists in a single rep at 144dpi).
Why does NSImageRep seem not to understand its real resolution? And why do I get this issue only for 1024px/144dpi and not, for example, for 512px/144dpi?
If I read the TIFFRepresentation of the NSImage and I write it back to a file I get a correct 1024px/144dpi TIFF file, while if I write the same NSImage going through CGImageSource/CGImageDestination as kUTTypeTIFF I get a 1024px/72dpi file.
All these things are leaving me very confused.
Thanks a lot
The docs for -[NSImageRep size] say:
The size of the image representation, measured in points in the user coordinate space.
(Emphasis added.)
This is not a measurement in pixels. It's a measurement in points, so an image that is 1024 pixels at 144 dpi measures 512 points when the points are 72 dpi.
You want to query the -pixelsWide and -pixelsHigh methods (if, indeed, you care about the pixel dimensions; often you should not).

What is the printing resolution of a XAML file?

I'm a designer and I like having a little control over the dimensions...
I am styling a XAML file that is meant to be printed.
Since dimensions are in pixels, which resolution should I use to convert lengths in cm?
Thank you!
According to Charles, Silverlight is fixed at 96 DPI:
As you know, a Silverlight program normally sizes graphical objects
and controls entirely in units of pixels. However, when the printer is
involved, coordinates and sizes are in device-independent units of
1/96th inch. Regardless of the actual resolution of the printer, from
a Silverlight program the printer always appears to be a 96 DPI
device.
...
PrintPageEventArgs has two handy get-only properties that also report
sizes in units of 1/96th inch: PrintableArea of type Size provides the
dimensions of the printable area of the page, and
PageMargins of type Thickness is the width of the left, top, right and
bottom of the unprintable edges. Add these two together (in the right
way) and you get the full size of the paper.
I did some quick searching, but couldn't turn up this info in the documentation. Leave it to Charles to know this sort of information.
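So for the original question: treat the printed page as a fixed 96 device-independent units per inch and convert physical lengths from there. A small sketch (the A4 example is an illustration, not from the quote):

```vb
' XAML/Silverlight printing units are 1/96 inch, so a length in centimeters
' converts as cm / 2.54 (inches) * 96 (units per inch).
Function CentimetersToUnits(cm As Double) As Double
    Return cm / 2.54 * 96.0
End Function

' e.g. a 21 cm A4 page width is CentimetersToUnits(21), roughly 794 units.
```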